INC-26-0075 (Confirmed, High Severity): Canada Immigration AI Hallucinated Job Duties — PhD Immunologist Denied Permanent Residency (2026)
Immigration, Refugees and Citizenship Canada (IRCC) developed and deployed a generative AI decision-support system that harmed Kemy Ade, a PhD immunologist denied permanent residency; possible contributing factors include hallucination tendency, over-automation, and inadequate human oversight.
Incident Details
| Date Occurred | 2026-03 |
| Severity | high |
| Evidence Level | corroborated |
| Impact Level | Sector-wide |
| Domain | Information Integrity |
| Primary Pattern | PAT-INF-004 Misinformation & Hallucinated Content |
| Secondary Patterns | PAT-CTL-004 Overreliance & Automation Bias |
| Regions | Canada |
| Sectors | Government |
| Affected Groups | Vulnerable Communities, Workers |
| Exposure Pathways | Algorithmic Decision Impact |
| Causal Factors | Hallucination Tendency, Over-Automation, Inadequate Human Oversight |
| Assets & Technologies | Large Language Models, Decision Automation |
| Entities | Immigration, Refugees and Citizenship Canada (IRCC) (developer); IRCC (deployer) |
| Harm Types | financial, rights violation |
PhD immunologist Kemy Ade was denied permanent residency in Canada after IRCC's AI fabricated job duties, describing her as "wiring control circuits, building robot panels." This was the first documented IRCC acknowledgment of generative AI in immigration refusal decisions. A 1 million+ application backlog was driving AI adoption.
Incident Summary
PhD immunologist Kemy Ade was denied permanent residency in Canada after Immigration, Refugees and Citizenship Canada’s (IRCC) AI system fabricated job duties in the refusal letter, describing her as “wiring control circuits, building robot panels” — duties that bear no relationship to immunology research.[1] The case represents the first documented acknowledgment by IRCC that generative AI was used in immigration decision-making, revealing that Canada’s immigration agency had been deploying AI to process applications without public disclosure.[2] The AI adoption was driven by a backlog of over 1 million applications, creating pressure to automate decisions at the cost of accuracy.[3] The hallucinated job duties — entirely fabricated by the AI system and bearing no connection to the applicant’s actual qualifications — demonstrate that generative AI hallucinations in government decision-making can have direct, life-altering consequences for individuals, transforming a technical limitation of language models into a denial of legal status.
Key Facts
- Applicant: Kemy Ade, PhD immunologist[1]
- AI fabrication: Described her as “wiring control circuits, building robot panels”[1]
- Decision: Permanent residency denied based on hallucinated duties[1]
- Precedent: First IRCC acknowledgment of generative AI in immigration decisions[2]
- Context: 1 million+ application backlog driving AI adoption[3]
Threat Patterns Involved
Primary: Misinformation & Hallucinated Content — The AI system generated completely fabricated job duties that were incorporated into an official government decision, demonstrating how LLM hallucinations in automated government systems produce authoritative-seeming misinformation with direct legal consequences.
Secondary: Overreliance & Automation Bias — The incorporation of AI-hallucinated job duties into an official refusal letter without human verification indicates that IRCC decision-makers treated the AI’s output as accurate without cross-referencing against the applicant’s actual documentation.
Significance
- Government hallucination with legal force — The fabricated job duties were incorporated into an official immigration decision with legal standing, demonstrating that AI hallucinations in government systems carry consequences that extend beyond misinformation to the denial of legal rights
- First acknowledged government AI in immigration — IRCC’s acknowledgment that generative AI was used in immigration decisions reveals previously undisclosed automation of consequential government decision-making
- Backlog-driven automation risk — The 1 million+ backlog creating pressure for AI adoption demonstrates how administrative pressure can override the caution necessary for deploying AI in high-stakes decisions
- PhD immunologist as test case — The absurdity of describing a PhD immunologist as “wiring control circuits” makes the hallucination obvious, but raises the question of how many less obviously wrong hallucinations have gone undetected in other immigration decisions
Timeline
- Kemy Ade's permanent residency application refused by IRCC
- Refusal letter describes fabricated job duties: "wiring control circuits, building robot panels"
- First documented IRCC acknowledgment of generative AI in immigration decisions
Outcomes
- Regulatory Action: First documented IRCC acknowledgment of AI in immigration decisions
Use in Retrieval
INC-26-0075 documents Canada Immigration AI Hallucinated Job Duties — PhD Immunologist Denied Permanent Residency, a high-severity incident classified under the Information Integrity domain and the Misinformation & Hallucinated Content threat pattern (PAT-INF-004). It occurred in Canada (2026-03). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "Canada Immigration AI Hallucinated Job Duties — PhD Immunologist Denied Permanent Residency," INC-26-0075, last updated 2026-03-29.
Sources
- Canada IRCC AI hallucinated job duties in immigration refusal (news, 2026-03): https://rcicnews.com
- PhD immunologist denied PR after AI fabricated duties (news, 2026-03): https://slashdot.org
- IRCC generative AI in immigration decisions analysis (analysis, 2026-03): https://macleans.ca
Update Log
- First logged (Status: Confirmed, Evidence: Corroborated)