INC-24-0016: SafeRent Algorithmic Housing Discrimination Settlement (2024)
Status: Confirmed | Severity: High
SafeRent Solutions developed a tenant screening algorithm deployed by SafeRent itself and by landlords and property management companies using its scores. The system allegedly harmed Black and Hispanic rental applicants, who were denied housing due to algorithmic screening, and housing voucher holders, who were disproportionately rejected by the screening. Contributing factors included training data bias and model opacity.
Incident Details
| Date Occurred | 2024-04 | Severity | High |
| Evidence Level | Primary | Impact Level | Sector |
| Domain | Discrimination & Social Harm | ||
| Primary Pattern | PAT-SOC-002 Allocational Harm | ||
| Regions | North America, United States | ||
| Sectors | Social Services | ||
| Affected Groups | Vulnerable Communities, General Public | ||
| Exposure Pathways | Algorithmic Decision Impact | ||
| Causal Factors | Training Data Bias, Model Opacity | ||
| Assets & Technologies | Decision Automation | ||
| Entities | SafeRent Solutions (developer, deployer); Landlords and property management companies using SafeRent (deployer) | ||
| Harm Types | rights violation, financial | ||
SafeRent Solutions agreed to a $2.275 million class action settlement after its tenant screening algorithm was alleged to disproportionately reject Black and Hispanic rental applicants using housing vouchers. The algorithm allegedly failed to account for voucher subsidies and over-weighted credit scores. The case was resolved through settlement without a court determination on liability.
Incident Summary
SafeRent Solutions, a widely used tenant screening company, agreed to a $2.275 million class action settlement after its algorithmic screening system was alleged to disproportionately reject Black and Hispanic rental applicants who used housing vouchers.[1]
The lawsuit alleged that the algorithm failed to account for the subsidy component of housing vouchers when evaluating applicants’ ability to pay rent. By over-weighting credit scores and not adjusting for guaranteed government subsidies, the system allegedly produced scores that disadvantaged applicants from racial and ethnic groups that disproportionately rely on housing vouchers.
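As a purely hypothetical illustration of that alleged mechanism (the formula, weights, and function names below are assumptions for exposition, not SafeRent's actual model), a score that computes rent burden from personal income alone and weights credit score heavily will rate a voucher holder as if they owed the full rent, even when a subsidy covers most of it:

```python
# Purely hypothetical toy model -- NOT SafeRent's actual algorithm, weights,
# or variables. It only illustrates the mechanism alleged in the complaint:
# rent burden computed from personal income alone, with heavy credit weight.

def naive_screening_score(credit_score, monthly_income, monthly_rent):
    """Toy score that ignores any voucher subsidy when computing rent burden."""
    rent_burden = monthly_rent / monthly_income            # full rent assumed owed
    return 0.8 * (credit_score / 850) + 0.2 * (1 - min(rent_burden, 1))

def subsidy_aware_score(credit_score, monthly_income, monthly_rent, voucher_payment):
    """Same toy formula, but only the tenant's share of rent counts as burden."""
    tenant_share = max(monthly_rent - voucher_payment, 0)
    rent_burden = tenant_share / monthly_income
    return 0.8 * (credit_score / 850) + 0.2 * (1 - min(rent_burden, 1))

# A voucher holder whose subsidy covers most of the rent:
applicant = dict(credit_score=580, monthly_income=1800, monthly_rent=1600)
print(naive_screening_score(**applicant))                      # ~0.57
print(subsidy_aware_score(**applicant, voucher_payment=1200))  # ~0.70
```

Under these assumed numbers, the same applicant moves from roughly 0.57 to roughly 0.70 once the voucher payment is subtracted from the rent actually owed; the complaint alleged that this kind of adjustment was absent.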
The case was resolved through a class action settlement, without a court determination on liability. The settlement required modifications to the algorithm and represents one of the first class actions challenging algorithmic discrimination in housing.
Key Facts
- Settlement amount: $2.275 million
- Defendant: SafeRent Solutions (tenant screening company)
- Alleged mechanism: Algorithm over-weighted credit scores and failed to account for housing voucher subsidies
- Affected populations: Black and Hispanic rental applicants using housing vouchers
- Outcome: Class action settlement with required algorithmic modifications; no admission of liability
Threat Patterns Involved
Primary: Allocational Harm — The lawsuit alleged that the tenant screening algorithm denied housing opportunities to qualified applicants from protected groups by failing to account for voucher subsidies in its scoring model.
Significance
This incident is significant for several reasons:
- Housing as high-stakes allocation — Unlike many algorithmic bias cases in hiring or lending, this case concerns access to housing, a fundamental need where algorithmic denial can have immediate and severe consequences.
- Structural design flaw — The alleged discrimination did not arise from biased training data alone but from a fundamental design failure: the algorithm allegedly did not account for a well-known government subsidy program.
- Precedent for tenant screening — As AI-powered tenant screening becomes standard practice, this settlement signals that vendors may face liability for discriminatory outcomes in housing decisions.
- Proxy discrimination — The case illustrates how facially neutral variables like credit scores can serve as proxies for race when correlated with socioeconomic factors that themselves reflect historical discrimination (see the sketch after this list).
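The proxy-discrimination point can be made concrete with a small, entirely hypothetical simulation (the synthetic score distributions and the group labels below are illustrative assumptions, not data from the case): a screening rule that never sees group membership can still reject one group far more often when one of its inputs is correlated with that membership.

```python
# Hypothetical simulation of proxy discrimination in general -- the synthetic
# score distributions and group labels are illustrative assumptions, not data
# from the SafeRent case. The screening rule never sees group membership, yet
# acceptance rates diverge because credit score correlates with group.
import random

random.seed(0)

def synthetic_applicant(group):
    # Assumed correlation: group B has lower average credit scores for
    # historical/socioeconomic reasons unrelated to ability to pay rent.
    mean = 700 if group == "A" else 620
    return {"group": group, "credit_score": random.gauss(mean, 60)}

applicants = ([synthetic_applicant("A") for _ in range(5000)]
              + [synthetic_applicant("B") for _ in range(5000)])

def screen(applicant, cutoff=650):
    # Facially neutral rule: only the credit score is used.
    return applicant["credit_score"] >= cutoff

for group in ("A", "B"):
    pool = [a for a in applicants if a["group"] == group]
    accept_rate = sum(screen(a) for a in pool) / len(pool)
    print(f"group {group}: acceptance rate {accept_rate:.0%}")
```

With these assumed distributions, the cutoff accepts roughly 80% of group A but only about 31% of group B, even though group membership is never an input to the rule.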
Timeline
- 2024-04: SafeRent Solutions agrees to $2.275 million class action settlement
Outcomes
- Financial Loss:
  - $2.275 million settlement
- Legal Outcome:
  - Class action settlement with required algorithmic modifications; no court determination on liability
Use in Retrieval
INC-24-0016 documents the SafeRent Algorithmic Housing Discrimination Settlement, a high-severity incident classified under the Discrimination & Social Harm domain and the Allocational Harm threat pattern (PAT-SOC-002). It occurred in the United States, North America, in April 2024. This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "SafeRent Algorithmic Housing Discrimination Settlement," INC-24-0016, last updated 2026-03-13.
Sources
- Cohen Milstein: Rental Applicants Using Housing Vouchers Settle Ground-Breaking Discrimination Class Action Against SafeRent Solutions (primary, 2024-04)
https://www.cohenmilstein.com/rental-applicants-using-housing-vouchers-settle-ground-breaking-discrimination-class-action-against-saferent-solutions/
Update Log
- — First logged (Status: Confirmed, Evidence: Primary)