INC-24-0014 (Confirmed, High Severity): Workday AI Hiring Tool Discrimination Class Action (2024)
Workday developed, and together with unspecified employers using its platform deployed, AI-powered applicant screening tools that allegedly harmed job applicants screened out by algorithmic bias; contributing factors included training data bias, insufficient safety testing, and model opacity.
Incident Details
| Date Occurred | 2024-07 | Severity | high |
| Evidence Level | corroborated | Impact Level | Sector |
| Domain | Discrimination & Social Harm | ||
| Primary Pattern | PAT-SOC-002 Allocational Harm | ||
| Regions | North America, United States | ||
| Sectors | Technology, Employment | ||
| Affected Groups | Workers, Vulnerable Communities | ||
| Exposure Pathways | Algorithmic Decision Impact | ||
| Causal Factors | Training Data Bias, Insufficient Safety Testing, Model Opacity | ||
| Assets & Technologies | Decision Automation | ||
| Entities | Workday (developer, deployer), Unspecified employers using Workday platform (deployer), Job applicants (victim) | ||
| Harm Type | rights violation | ||
Derek Mobley, a Black man over 40 with disclosed disabilities, filed a class action lawsuit in U.S. federal court against Workday after being rejected from over 100 jobs at employers that used its AI-powered applicant screening tools. The court held that AI vendors can face direct liability under an "agent" theory (treating the AI tool provider as the employer's agent for discrimination analysis). The class was certified in May 2025; the case remains ongoing.
Incident Summary
In July 2024, a U.S. federal court allowed a class action lawsuit against Workday, Inc. to proceed. The plaintiff, Derek Mobley, a Black man over 40 with disclosed disabilities, alleged he was rejected from more than 100 positions at employers that used Workday's AI-powered applicant screening tools.[1]
The lawsuit alleges that Workday’s algorithmic screening system discriminated against applicants on the basis of race, age, and disability. A U.S. federal court in this case held that AI vendors can be held directly liable for employment discrimination under an “agent” theory — treating the AI tool provider as the employer’s agent for discrimination analysis. The court found that when employers delegate hiring decisions to an AI tool, the tool’s provider may bear legal responsibility for discriminatory outcomes.
The class was certified in May 2025, allowing the case to proceed on behalf of a broader group of applicants who were subjected to Workday’s screening algorithms. The underlying allegations of discrimination have not been adjudicated; the case remains ongoing.
Key Facts
- Plaintiff: Derek Mobley, a Black man over 40 with disclosed disabilities
- Defendant: Workday, Inc., provider of AI-powered hiring tools
- Claim: Alleged systematic discrimination on the basis of race, age, and disability
- Legal theory: AI vendor liability under “agent” theory (treating the AI tool provider as the employer’s agent for discrimination analysis)
- Rejections: Over 100 job applications using Workday’s screening tools
- Class certified: May 2025; case ongoing
Threat Patterns Involved
Primary: Allocational Harm — The lawsuit alleges that AI-powered hiring tools screened out qualified applicants based on protected characteristics, denying employment opportunities through algorithmic gatekeeping.
Significance
This incident is significant for several reasons:
- AI vendor liability precedent — The court’s holding that AI tool providers can face direct liability for discriminatory outcomes represents a potential shift in how legal responsibility is assigned in AI-mediated decisions.
- Scale of algorithmic gatekeeping — Workday’s hiring platform is widely used by employers, meaning a single biased algorithm could potentially affect very large applicant pools.
- Intersectional discrimination — The case highlights how AI systems may compound discrimination across multiple protected characteristics simultaneously, as illustrated by Mobley’s overlapping claims of race, age, and disability discrimination.
- Shift in enforcement landscape — The ruling signals that plaintiffs may pursue AI vendors directly, not only the employers who deploy their tools.
Timeline
- Derek Mobley files a class action lawsuit in U.S. federal court against Workday alleging AI hiring discrimination
- 2024-07: Court holds that AI vendors can face direct liability under an "agent" theory of discrimination
- 2025-05: Class certified, allowing the suit to proceed on behalf of affected applicants
Outcomes
- Legal Outcome: Class certified May 2025; case ongoing in U.S. federal court
Use in Retrieval
INC-24-0014 documents the Workday AI Hiring Tool Discrimination Class Action, a high-severity incident classified under the Discrimination & Social Harm domain and the Allocational Harm threat pattern (PAT-SOC-002). It occurred in the United States, North America (2024-07). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "Workday AI Hiring Tool Discrimination Class Action," INC-24-0014, last updated 2026-03-13.
Sources
- Fisher Phillips: Discrimination Lawsuit Over Workday's AI Hiring Tools Can Proceed as Class Action (legal, 2025-05)
https://www.fisherphillips.com/en/insights/insights/discrimination-lawsuit-over-workdays-ai-hiring-tools-can-proceed-as-class-action-6-things
Update Log
- First logged (Status: Confirmed, Evidence: Corroborated)