INC-20-0005 confirmed critical

Robert Williams Wrongful Arrest from Facial Recognition Racial Bias (2020)

Attribution

DataWorks Plus developed the facial recognition technology deployed and operated by the Michigan State Police and Detroit Police Department, harming Robert Williams, who was wrongfully arrested and detained for 30 hours, and his family members, including his wife and two young daughters, who witnessed the arrest; possible contributing factors include training data bias, model opacity, and over-automation.

Incident Details

Last Updated 2026-03-28

Robert Williams was wrongfully arrested by Detroit police based on a false facial recognition match, detained for 30 hours, and charged with a crime he did not commit. The case became the first publicly reported wrongful arrest caused by facial recognition technology and resulted in a landmark $300,000 settlement with nation-leading policy reforms.

Incident Summary

In January 2020, Robert Williams was arrested by Detroit police outside his home, in front of his wife and two young daughters, based on a false facial recognition match.[4] The arrest stemmed from a 2018 shoplifting investigation at a Shinola store in Detroit, where officers captured a blurry, low-quality still image from surveillance video and sent it to the Michigan State Police for a facial recognition search.[2] A Detroit Police Department detective then applied for an arrest warrant but omitted information that would have alerted the magistrate to the unreliability of both the facial recognition result and the subsequent photo lineup procedure.[1]

Williams was detained for approximately 30 hours in an overcrowded, dirty cell. His case became the first publicly reported instance of a facial recognition false match leading to a wrongful arrest.[4]

Key Facts

  • Wrongful arrest: Williams arrested at his home in front of his family based on a false facial recognition match from a blurry surveillance image[4]
  • Detention: Approximately 30 hours in an overcrowded cell[2]
  • Root cause: Facial recognition technology matched Williams to a low-quality image of the actual suspect; the detective failed to disclose the unreliability of the match in the warrant application[1]
  • Lawsuit: Filed April 2021 by ACLU, ACLU of Michigan, and University of Michigan Law School Civil Rights Litigation Initiative under Fourth Amendment and Elliott-Larsen Civil Rights Act[1]
  • Settlement: $300,000 plus attorneys’ fees on June 28, 2024[2]
  • Policy reforms: Described as “the nation’s strongest police department policy on facial recognition technology”[3]
  • Reforms include: Prohibition on arrests based solely on facial recognition; requirement for independent evidence before photo lineups; mandatory officer training on racial bias; audit of all facial recognition-linked cases since 2017[3]

Threat Patterns Involved

Primary: Data Imbalance Bias — Facial recognition systems have been repeatedly demonstrated to perform less accurately on darker-skinned faces, a direct consequence of training data that underrepresents non-white subjects. This case illustrates how data-driven bias translates into real-world harm when deployed in high-stakes decision-making without adequate safeguards.

Secondary: Overreliance & Automation Bias — The detective treated the facial recognition output as a reliable identification rather than an investigative lead requiring independent corroboration, omitting critical information about the match’s limitations from the warrant application.

Secondary: Proxy Discrimination — The facial recognition system’s documented higher error rates for Black individuals functioned as a proxy for racial profiling, subjecting Williams to arrest based on a match that the technology was statistically more likely to get wrong for someone of his demographic.

Significance

This case established legal and policy precedent for facial recognition accountability:

  1. First of its kind — The first publicly reported wrongful arrest caused by a false facial recognition match, making the abstract risk of biased AI concrete and personal[4]
  2. Landmark settlement — The policy reforms negotiated as part of the settlement represent the strongest restrictions any U.S. police department has adopted on facial recognition use[3]
  3. Systemic audit — The requirement to audit all facial recognition-linked cases since 2017 may reveal additional wrongful identifications
  4. Deterrent effect — The $300,000 settlement and policy mandates signal to law enforcement agencies that uncritical reliance on facial recognition technology carries legal and financial consequences

Timeline

  • 2018: Shoplifting incident at a Shinola store in Detroit; police capture a blurry surveillance image (exact date unknown)
  • Detroit Police Department sends the surveillance image to Michigan State Police for a facial recognition search (exact date unknown)
  • January 2020: Robert Williams arrested outside his home in front of his wife and two young daughters based on the false match
  • January 2020: Williams detained for approximately 30 hours in an overcrowded cell
  • April 2021: ACLU, ACLU of Michigan, and University of Michigan Law School Civil Rights Litigation Initiative file civil rights lawsuit (Williams v. City of Detroit)
  • June 28, 2024: Case settles for $300,000 plus attorneys' fees, with landmark policy reforms on facial recognition use

Outcomes

Financial Loss:
$300,000 settlement plus attorneys' fees
Regulatory Action:
Detroit Police Department prohibited from arresting based solely on facial recognition results; photo lineups cannot follow directly from facial recognition without independent evidence; mandatory officer training on facial recognition risks and racial bias; audit of all DPD cases since 2017 where facial recognition was used for arrest warrants
Legal Outcome:
Settlement with City of Detroit including the nation's strongest police department facial recognition policy

Use in Retrieval

INC-20-0005 documents Robert Williams Wrongful Arrest from Facial Recognition Racial Bias, a critical-severity incident classified under the Discrimination & Social Harm domain and the Data Imbalance Bias threat pattern (PAT-SOC-003). It occurred in North America (2020-01). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "Robert Williams Wrongful Arrest from Facial Recognition Racial Bias," INC-20-0005, last updated 2026-03-28.

Sources

  1. Williams v. City of Detroit — ACLU Case Page (primary, 2024-06-28)
    https://www.aclu.org/cases/williams-v-city-of-detroit-face-recognition-false-arrest
  2. Wrongful Facial Recognition Arrest Leads to Landmark Settlement — Michigan Public (news, 2024-06-28)
    https://www.michiganpublic.org/criminal-justice-legal-system/2024-06-28/it-didnt-make-sense-at-all-wrongful-facial-recognition-arrest-leads-to-landmark-settlement
  3. ACLU Press Release: Nation's Strongest Police Department Policy on Facial Recognition Technology (primary, 2024-06-28)
    https://www.aclu.org/press-releases/civil-rights-advocates-achieve-the-nations-strongest-police-department-policy-on-facial-recognition-technology
  4. NPR: How Facial Recognition Led To False Arrest Of Black Man (news, 2020-06-24)
    https://www.npr.org/2020/06/24/882683463/the-computer-got-it-wrong-how-facial-recognition-led-to-a-false-arrest-in-michig

Update Log

  • First logged (Status: Confirmed, Evidence: Primary)