INC-16-0003 (Confirmed, Critical): COMPAS Recidivism Algorithm Racial Bias (2016)
Northpointe (now Equivant) developed, and U.S. state and county courts deployed, a decision-automation system that harmed Black and other minority defendants in the U.S. criminal justice system; contributing factors included training data bias, model opacity, an accountability vacuum, and a regulatory gap.
Incident Details
| Date Occurred | 2016-05 | Severity | Critical |
| Evidence Level | Primary | Impact Level | Institution |
| Domain | Discrimination & Social Harm | | |
| Primary Pattern | PAT-SOC-004 Proxy Discrimination | | |
| Secondary Patterns | PAT-CTL-002 Implicit Authority Transfer | | |
| Regions | North America | | |
| Sectors | Government, Law Enforcement | | |
| Affected Groups | Vulnerable Communities, General Public | | |
| Exposure Pathways | Algorithmic Decision Impact | | |
| Causal Factors | Training Data Bias, Model Opacity, Accountability Vacuum, Regulatory Gap | | |
| Assets & Technologies | Decision Automation | | |
| Entities | Northpointe (now Equivant) (developer); U.S. state and county courts (deployer) | | |
| Harm Types | Rights violation, psychological | | |
ProPublica's investigation revealed that the COMPAS recidivism prediction algorithm used in U.S. courts produced racially biased risk scores, with Black defendants nearly twice as likely as white defendants to be falsely flagged as high risk.
Incident Summary
In May 2016, ProPublica published an investigation titled “Machine Bias” analyzing the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) algorithm, a proprietary risk assessment tool developed by Northpointe Inc. (now Equivant) and used widely across the United States criminal justice system to predict the likelihood of recidivism.[1] ProPublica examined COMPAS risk scores assigned to more than 7,000 defendants arrested in Broward County, Florida, between 2013 and 2014, and compared those scores against actual recidivism outcomes over a two-year period.
The analysis found significant racial disparities in the algorithm’s predictions. Black defendants were nearly twice as likely as white defendants to be incorrectly flagged as being at high risk of reoffending (false positives), while white defendants were significantly more likely to be incorrectly labeled as low risk when they did go on to reoffend (false negatives).[1] Northpointe disputed ProPublica’s conclusions, arguing that the algorithm achieved comparable predictive accuracy across racial groups when measured by a different statistical metric.[2]
In July 2016, the Wisconsin Supreme Court addressed the use of COMPAS in its ruling in State v. Loomis, holding that the algorithm could be used as one factor in sentencing decisions but that courts must be informed of its limitations, including the fact that the tool was not designed for use in determining the severity of a sentence and that it may exhibit group-level bias.[3]
Key Facts
- Algorithm: COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), developed by Northpointe Inc. (now Equivant)
- Use: Pretrial risk assessment and sentencing recommendations in U.S. criminal courts
- Investigation scope: Over 7,000 defendants in Broward County, Florida
- Finding: Black defendants nearly twice as likely as white defendants to be falsely flagged as high risk (see the error-rate sketch after this list)
- Corollary finding: White defendants more likely to be falsely labeled low risk
- Developer response: Northpointe disputed findings, citing alternative fairness metrics
- Legal ruling: Wisconsin Supreme Court allowed continued use with mandatory disclosure of limitations
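The finding and its corollary are statements about group-level error rates, not overall accuracy. A minimal Python sketch shows how the two rates are computed from a per-group confusion matrix; the counts below are hypothetical, chosen only so the resulting rates roughly track the direction and magnitude of ProPublica's published findings, and are not the actual Broward County data.

```python
# Hedged sketch: computing per-group error rates from confusion counts.
# Counts are illustrative, not the Broward County dataset.

def error_rates(tp, fp, fn, tn):
    """Return (false positive rate, false negative rate)."""
    fpr = fp / (fp + tn)  # non-reoffenders wrongly flagged high risk
    fnr = fn / (fn + tp)  # reoffenders wrongly labeled low risk
    return fpr, fnr

# group -> (tp, fp, fn, tn), hypothetical counts only
groups = {
    "Black defendants": (720, 450, 280, 550),
    "white defendants": (520, 235, 480, 765),
}

for name, counts in groups.items():
    fpr, fnr = error_rates(*counts)
    print(f"{name}: FPR = {fpr:.1%}, FNR = {fnr:.1%}")
```

Run as-is, the sketch prints an FPR of 45.0% versus 23.5% and an FNR of 28.0% versus 48.0%, mirroring the asymmetry described in the findings above: one group bears the false positives, the other the false negatives.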
Threat Patterns Involved
Primary: Proxy Discrimination — Although COMPAS does not use race as an explicit input variable, ProPublica’s analysis found that the algorithm’s predictions correlated with race in a manner that produced disparate error rates, suggesting that the model relied on features that served as proxies for race.
Secondary: Implicit Authority Transfer — The widespread use of COMPAS scores in sentencing and pretrial decisions effectively transferred significant decision-making authority from judges to an opaque proprietary algorithm, with courts relying on risk scores they could not independently verify or fully understand.
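As a concrete illustration of the primary pattern, the toy simulation below shows how a model with no race input can still produce disparate false positive rates when an input feature absorbs group differences in enforcement exposure. Everything here is invented for illustration: the feature (prior arrest count), its distributions, the base rate, and the threshold are hypothetical and are not taken from COMPAS.

```python
# Toy proxy-discrimination simulation. Group membership is never an
# input; one group's prior arrest count is shifted upward to mimic
# uneven enforcement exposure. Both groups reoffend at the same true
# rate, yet false positive rates diverge. All parameters hypothetical.
import random

random.seed(0)

def false_positive_rate(arrest_shift, n=10_000, threshold=3):
    """FPR among true non-reoffenders when risk score = prior arrests."""
    false_pos = negatives = 0
    for _ in range(n):
        reoffends = random.random() < 0.4              # identical base rate
        arrests = random.randint(0, 4) + arrest_shift  # proxy feature
        if not reoffends:
            negatives += 1
            if arrests > threshold:                    # flagged high risk
                false_pos += 1
    return false_pos / negatives

print(f"Group A FPR: {false_positive_rate(arrest_shift=0):.1%}")  # ~20%
print(f"Group B FPR: {false_positive_rate(arrest_shift=2):.1%}")  # ~60%
```

The design point: because the proxy feature is correlated with group membership rather than with actual reoffending, removing the protected attribute from the inputs does nothing to equalize the error rates.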
Significance
- Foundational case in algorithmic fairness. The COMPAS investigation became the most widely cited real-world example of racial bias in algorithmic decision-making and catalyzed an entire field of research into competing definitions of algorithmic fairness.[1]
- Impossibility of simultaneous fairness metrics. The subsequent academic debate demonstrated that certain statistical definitions of fairness are mutually exclusive, meaning that an algorithm cannot simultaneously satisfy all reasonable fairness criteria when base rates differ between groups, a finding with profound implications for the use of predictive algorithms in high-stakes decisions (a worked identity appears after this list).
- Opacity in high-stakes decisions. COMPAS is a proprietary system whose internal workings are not publicly disclosed, raising fundamental questions about due process when individuals’ liberty is influenced by algorithms they cannot examine or challenge.[3]
- Continued use despite documented bias. Despite the investigation’s findings and ongoing legal challenges, COMPAS and similar risk assessment tools remain in use across U.S. courts, illustrating the gap between the identification of algorithmic harms and institutional response.
Timeline
COMPAS algorithm widely adopted across U.S. courts for pretrial and sentencing risk assessment
ProPublica publishes 'Machine Bias' investigation analyzing COMPAS scores for over 7,000 defendants in Broward County, Florida
Northpointe (now Equivant) publishes response disputing ProPublica's methodology and conclusions
Wisconsin Supreme Court rules in State v. Loomis that COMPAS may be used in sentencing but must include warnings about its limitations
Academic debate intensifies over competing statistical definitions of algorithmic fairness, prompted by the COMPAS controversy
Outcomes
- Financial Loss:
- Not quantifiable; impact measured in unjust pretrial and sentencing outcomes
- Arrests:
- Not applicable
- Recovery:
- COMPAS remains in use in many jurisdictions; some courts have adopted disclosure requirements
- Regulatory Action:
- Wisconsin Supreme Court imposed disclosure requirements; no federal regulation enacted
Glossary Terms
Use in Retrieval
INC-16-0003 documents compas recidivism algorithm racial bias, a critical-severity incident classified under the Discrimination & Social Harm domain and the Proxy Discrimination threat pattern (PAT-SOC-004). It occurred in north america (2016-05). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "COMPAS Recidivism Algorithm Racial Bias," INC-16-0003, last updated 2026-02-15.
Sources
- ProPublica: Machine Bias — There's software used across the country to predict future criminals. And it's biased against blacks. (primary, 2016-05)
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing (opens in new tab) - Northpointe Inc.: Response to ProPublica — Demonstrating accuracy equity and predictive parity (primary, 2016-07)
https://www.equivant.com/response-to-propublica-demonstrating-accuracy-equity-and-predictive-parity/ (opens in new tab) - State v. Loomis, 881 N.W.2d 749 (Wis. 2016) (primary, 2016-07)
https://scholar.google.com/scholar_case?case=12268852475956079706 (opens in new tab)
Update Log
- — First logged (Status: Confirmed, Evidence: Primary)