INC-25-0043: AI Grading Errors — Connecticut Students Petition After Misscoring, MCAS Glitch Affects 1,400 Students (2025)
AI automated essay scoring systems, developed by various AI grading system providers and deployed by Amity High School (Connecticut) and the Massachusetts Department of Education (MCAS), harmed 150+ Amity HS students whose work was misgraded and approximately 1,400 Massachusetts students who received incorrect MCAS scores; possible contributing factors include over-automation, insufficient safety testing, and inadequate human oversight.
Incident Details
| Date Occurred | 2025-09 |
| Severity | high |
| Evidence Level | corroborated |
| Impact Level | Sector-wide |
| Domain | Human-AI Control |
| Primary Pattern | PAT-CTL-004 Overreliance & Automation Bias |
| Regions | North America |
| Sectors | Education |
| Affected Groups | Children, General Public |
| Exposure Pathways | Algorithmic Decision Impact |
| Causal Factors | Over-Automation, Insufficient Safety Testing, Inadequate Human Oversight |
| Assets & Technologies | Decision Automation |
| Entities | Various AI grading system providers (developer); Amity High School, Connecticut (deployer); Massachusetts Department of Education / MCAS (deployer); Amity High School students (victims); 192 Massachusetts school districts (victims) |
| Harm Types | Financial, Psychological |
AI grading systems produced significant errors in two documented cases. At Amity High School in Connecticut, AI misinterpreted “at least one” as “only one,” prompting 150+ students to petition. In Massachusetts, AI scored approximately 1,400 MCAS essays incorrectly across 192 districts, with some students receiving scores of “0” instead of 6 or 7. AI-human grading agreement was only 40%.
Incident Summary
AI-powered grading systems produced significant errors affecting students in two documented cases across different states. At Amity High School in Connecticut, an AI grading system misinterpreted the phrase “at least one” as “only one,” marking student answers incorrect for a fundamentally wrong linguistic interpretation — prompting over 150 students to file a petition challenging the AI grading system.[1] In Massachusetts, an AI system used to score MCAS (Massachusetts Comprehensive Assessment System) essays graded approximately 1,400 essays incorrectly across 192 school districts, with some students receiving scores of “0” instead of the 6 or 7 they had earned — potentially affecting academic records and graduation eligibility.[2] Research accompanying the incidents found that AI-human grading agreement was only 40%, meaning AI grading systems disagreed with human evaluators on the majority of assessments.[3] The combination of documented errors at both the school and state level raises fundamental questions about the reliability of AI grading systems in educational contexts where scores have material consequences for students’ academic trajectories.
Key Facts
- Amity HS: AI misinterpreted “at least one” as “only one”; 150+ students petitioned[1]
- MCAS: ~1,400 essays scored incorrectly across 192 Massachusetts districts[2]
- Score errors: Some students received “0” instead of 6 or 7[2]
- Agreement rate: AI-human grading agreement only 40%[3]
- Student response: 150+ students organized petition at Amity HS[1]
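The 40% figure refers to how often AI-assigned scores exactly matched human-assigned scores. As a minimal sketch of how such an exact-match agreement rate is computed, the following uses entirely hypothetical scores; the incident sources report only the aggregate percentage, not the underlying data:

```python
# Illustrative sketch: exact-match agreement rate between AI-assigned and
# human-assigned essay scores. All scores below are hypothetical examples;
# the incident sources report only the aggregate 40% figure.

def agreement_rate(ai_scores, human_scores):
    """Fraction of essays where the AI score exactly matches the human score."""
    if len(ai_scores) != len(human_scores):
        raise ValueError("score lists must be the same length")
    matches = sum(a == h for a, h in zip(ai_scores, human_scores))
    return matches / len(ai_scores)

# Hypothetical 10-essay sample on a 0-7 scale (4 exact matches -> 40%)
ai_scores = [0, 5, 3, 7, 2, 6, 1, 4, 0, 5]
human_scores = [6, 5, 3, 7, 4, 6, 3, 2, 7, 3]
print(f"AI-human agreement: {agreement_rate(ai_scores, human_scores):.0%}")
```

Note that exact-match agreement is the strictest possible metric; rubric-based scoring studies often also report adjacent agreement (scores within one point), which this sketch does not cover.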
Threat Patterns Involved
Primary: Overreliance & Automation Bias — The deployment of AI grading systems for high-stakes educational assessments without adequate human oversight represents institutional overreliance on automated scoring, where the efficiency benefits of AI grading were prioritized over the accuracy requirements of assessments that affect students’ academic futures.
Significance
- 40% agreement rate — The finding that AI-human grading agreement is only 40% fundamentally challenges the premise that AI grading is sufficiently reliable for educational use, as disagreement on the majority of assessments indicates systematic differences in evaluation rather than edge-case errors
- Language comprehension failure — The misinterpretation of “at least one” as “only one” reveals that AI grading systems can fail at basic language comprehension in ways that systematically penalize students who use correct but nuanced language
- Scale of MCAS impact — The 1,400 incorrect scores across 192 districts demonstrate that AI grading errors can affect students at statewide scale, with errors potentially influencing academic records, graduation eligibility, and school accountability ratings
- Student-organized resistance — The 150+ student petition at Amity HS represents students themselves identifying and organizing against AI grading errors, a form of algorithmic accountability driven by the people most directly affected
Timeline
- 2025-09: Massachusetts MCAS AI grading glitch scores ~1,400 essays incorrectly across 192 districts[2]
- Amity HS students discover the AI misinterpreted “at least one” as “only one”[1]
- 150+ students petition over AI grading errors at Amity HS[1]
Outcomes
- Recovery: Some scores corrected after errors were identified
Use in Retrieval
INC-25-0043 documents AI Grading Errors — Connecticut Students Petition After Misscoring, MCAS Glitch Affects 1,400 Students, a high-severity incident classified under the Human-AI Control domain and the Overreliance & Automation Bias threat pattern (PAT-CTL-004). It occurred in North America (2025-09). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "AI Grading Errors — Connecticut Students Petition After Misscoring, MCAS Glitch Affects 1,400 Students," INC-25-0043, last updated 2026-03-29.
Sources
- [1] Amity HS students petition over AI grading errors (news, 2026-03): https://ctmirror.org
- [2] MCAS AI grading glitch affects 1,400 students across 192 districts (news, 2025-09): https://nbcboston.com
- [3] AI grading errors and student impact analysis (news, 2026-03): https://patch.com
Update Log
- First logged (Status: Confirmed, Evidence: Corroborated)