The Michigan Chronicle

Judge Dismisses Lawsuit Over Wrongful Arrest of Pregnant Detroit Woman Misidentified by Facial Recognition

Facial recognition technology has long faced criticism for its failure to correctly identify Black faces, and that failure has real consequences when it is used in policing. In Detroit, those consequences fell on Porcha Woodruff, a Black woman who was eight months pregnant when she was arrested in February 2023 for a carjacking she did not commit. She was taken from her home while preparing her children for school, spent 10 hours in jail, and was later cleared when the charges were dropped. Her ordeal highlighted the risks that come when a tool marketed as objective is allowed to influence investigations in communities already subject to heavy surveillance.

Woodruff challenged her arrest in court, filing a civil rights lawsuit. U.S. District Judge Judith Levy acknowledged in her August 5 decision that Woodruff’s experience “is troubling for many reasons,” but dismissed the case against the officer who pursued the arrest warrant. Levy ruled that Woodruff’s attorney had not shown the officer lacked probable cause at the time, and noted the officer was not immediately aware of any “exculpatory evidence” that would have eliminated her as a suspect. The ruling ended this chapter of Woodruff’s legal fight, though her attorney confirmed an appeal is underway.

That attorney, Ivan Land, argued that investigators relied too heavily on the match produced by facial recognition and did not gather enough supporting evidence. “We’re just shocked by the decision,” Land said, adding that the city had offered to settle before the ruling but no agreement was reached.

The path that led to Woodruff’s arrest illustrates how these systems can steer cases. Investigators ran surveillance footage from a gas station through facial recognition software. The program flagged Woodruff’s image, which was placed into a photo lineup. The carjacking victim then selected her. That pairing—the technology’s suggestion and a victim’s confirmation—was treated as sufficient to move forward. For Woodruff, it was enough to upend her life.

Woodruff’s attorneys allege in the suit that while in jail, she “was not able to consume the foods or beverages offered because of her diagnosis” of gestational diabetes. She told Insider that police had no bottled water to offer her, only faucet water from the station, which she did not trust to drink. She added that they also offered her sugary concentrated lemonade, which she could not drink because of her diabetes, and a ham sandwich that she said looked 5–10 days old.

Woodruff was eventually released on a $100,000 personal bond, and her fiancé took her to the hospital, where she was treated for dehydration and contractions from stress, the lawsuit says. A few weeks later, the charges against Woodruff were dismissed for lack of evidence, court documents included in the lawsuit show. Her lawyer said she ended up not having to pay the $100,000 bond.

Woodruff said the incident took a “huge toll” on her and her family. “I’m still stressed. My anxiety is through the roof, especially now,” she told Insider, adding that she is still dealing with postpartum depression from the experience. “My kids, they have anxiety. They’re stressed out. They see police officers, and they’re scared.” Attorneys for Woodruff, 32, filed the lawsuit against the City of Detroit and Detroit Police Detective LaShauntia Oliver on August 3, 2023, alleging that Woodruff was falsely arrested and accused of carjacking and robbery. She was seeking $25 million in damages at that time.

Detroit is not unfamiliar with such cases. Years earlier, Robert Williams was wrongfully accused of shoplifting after a similar misidentification. His case was settled for $300,000, a public acknowledgment that the process had failed. Together, these incidents shaped the city’s approach to the technology. Detroit police have since changed their policy, stating they will not make arrests based solely on facial recognition results or on photo lineups created from them.

These adjustments matter, but they also reveal the limits of technology when placed in contexts already marked by inequity. Studies have consistently shown that facial recognition is less accurate for Black and Brown individuals. A 2019 analysis by the National Institute of Standards and Technology found error rates for Black women were far higher than those for white men. When such tools are used in communities of color, the potential for harm grows, as lives can be disrupted by nothing more than an algorithmic suggestion.

Woodruff’s arrest underscores what those statistics mean in human terms. She was pregnant, caring for her children, and suddenly faced with the trauma of being handcuffed and detained. For her children, witnessing the arrest left its own mark. These moments extend beyond the walls of a courtroom or jail cell. They ripple through families and communities, shaping how trust in public institutions is understood.

Detroit’s new limits on the use of facial recognition show that lessons have been learned, but questions remain about whether safeguards will be enough. The experiences of residents like Woodruff and Williams point to the risks of letting efficiency or speed outweigh accuracy and care. Communities cannot afford errors of this magnitude, particularly when the same populations are repeatedly affected.

Other cities have taken different approaches. San Francisco and New Orleans have banned or paused police use of facial recognition altogether, citing concerns about bias and accuracy. Detroit has opted to regulate rather than ban, reflecting the balance local leaders are trying to strike between adopting new technologies and protecting civil rights. The path forward will likely be shaped not only by policy but by the voices of residents who have lived through the consequences.

For now, the dismissal of Woodruff’s lawsuit does not resolve the larger issues her case brings to light. It leaves her story as a reminder of how fragile justice can become when tools not designed with fairness in mind are allowed to influence decisions. It also leaves open the question of what accountability should look like when lives are disrupted, even if courts do not find clear fault.

Woodruff’s case is now headed for appeal, but its impact is already visible in Detroit’s evolving policies and in the conversations happening across the country about the role of technology in law enforcement. Her story illustrates both the dangers of overreliance on imperfect tools and the resilience of those who challenge their misuse. It calls attention to the need for transparency, oversight, and community-centered reform.

The questions raised go beyond one arrest or one lawsuit. They reach into the broader matter of how technology interacts with systems of justice and whose lives are most at risk when those systems fall short. For Detroit, the path forward will not be measured only in policies written but in the trust rebuilt—or lost—along the way.
