INC-24-0005 | Confirmed | Medium | Air Canada Chatbot Hallucinated Refund Policy — Tribunal Ruling (2022)
An unknown chatbot vendor developed, and Air Canada deployed, the large language model chatbot involved, harming Jake Moffatt (a passenger) and Air Canada customers; contributing factors included hallucination tendency and misconfigured deployment.
Incident Details
| Date Occurred | 2022-11 | Severity | Medium |
| Evidence Level | Primary | Impact Level | Organization |
| Domain | Agentic Systems | | |
| Primary Pattern | PAT-AGT-002 Cascading Hallucinations | | |
| Secondary Patterns | PAT-CTL-004 Overreliance & Automation Bias | | |
| Regions | North America | | |
| Sectors | Transportation | | |
| Affected Groups | General Public, Business Organizations | | |
| Exposure Pathways | Direct Interaction | | |
| Causal Factors | Hallucination Tendency, Misconfigured Deployment | | |
| Assets & Technologies | Large Language Models | | |
| Entities | Unknown chatbot vendor (developer), Air Canada (deployer) | | |
| Harm Type | Financial | | |
Air Canada was held legally liable for its customer service chatbot's hallucinated bereavement fare policy after the chatbot fabricated a retroactive-discount provision that did not exist and a passenger relied on it when booking flights.
Incident Summary
In November 2022, Jake Moffatt, a Canadian passenger, used Air Canada’s website chatbot to inquire about bereavement travel fares following the death of a family member.[1] The chatbot advised Moffatt that he could book a full-fare flight and then apply retroactively for a bereavement discount within 90 days of the ticket being issued. Relying on this information, Moffatt purchased full-fare flights to and from Toronto.
When Moffatt subsequently submitted a claim for the bereavement fare adjustment, Air Canada denied the request, stating that its bereavement policy did not permit retroactive applications and that the chatbot had provided inaccurate information.[4] Air Canada argued that the chatbot was “a separate legal entity that is responsible for its own actions” and that the company could not be held liable for information provided by the bot.[3]
Moffatt filed a claim with the Civil Resolution Tribunal of British Columbia. In a decision published on February 14, 2024, Tribunal Member Christopher Rivers ruled in Moffatt’s favor, finding that Air Canada was responsible for all information published on its website, including information provided by its chatbot.[1] The tribunal ordered Air Canada to pay $650.88 in damages (the difference between the full fare and the bereavement fare) plus $36.14 in pre-judgment interest and $125.00 in tribunal fees, totaling $812.02 CAD.[1]
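For reference, the components of the award stated in the decision sum to the ordered total:

$$
\$650.88\ (\text{fare difference}) + \$36.14\ (\text{pre-judgment interest}) + \$125.00\ (\text{tribunal fees}) = \$812.02\ \text{CAD}
$$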
Key Facts
- Chatbot error: Air Canada’s chatbot fabricated a bereavement fare policy that did not exist, advising retroactive discount applications
- Corporate defense: Air Canada attempted to disclaim liability, arguing the chatbot was a “separate legal entity”
- Tribunal ruling: Civil Resolution Tribunal found Air Canada liable for its chatbot’s statements
- Damages awarded: $812.02 CAD total (fare difference, interest, and tribunal fees)
- Legal principle established: Companies are responsible for information provided by their AI customer service tools
Threat Patterns Involved
Primary: Cascading Hallucinations — Air Canada’s chatbot generated a fabricated bereavement fare policy that did not correspond to the airline’s actual terms and conditions. The hallucinated information then cascaded into a real-world decision by the customer, who booked flights based on the false guidance.
Secondary: Overreliance and Automation Bias — Deploying the chatbot as a customer-facing information source without adequate verification mechanisms reflected organizational overreliance on automated systems for authoritative guidance on company policy (see the sketch below).
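As a purely illustrative sketch (not a description of Air Canada's actual system), one common verification mechanism is to answer policy questions only with excerpts retrieved from vetted policy text, escalating to a human agent when no vetted excerpt applies, rather than letting a model generate policy claims freely. All names and policy strings below are hypothetical:

```python
# Illustrative sketch only: NOT Air Canada's system. It demonstrates one
# verification pattern: policy answers come exclusively from approved,
# legally reviewed excerpts, and the assistant declines rather than
# improvising when no excerpt applies. All names/strings are hypothetical.

VERIFIED_POLICY = {
    # topic keyword -> approved, legally reviewed wording
    "bereavement": (
        "Bereavement fares must be requested before travel and "
        "cannot be applied retroactively after ticketing."
    ),
    "baggage": "Each passenger may check one bag up to 23 kg.",
}

def answer_policy_question(question: str) -> str:
    """Return only vetted policy text, never a generated paraphrase."""
    q = question.lower()
    for topic, excerpt in VERIFIED_POLICY.items():
        if topic in q:
            # Quote the approved wording verbatim, with attribution.
            return f"Per our published policy: {excerpt}"
    # No vetted answer available: decline and escalate, don't improvise.
    return "I can't confirm that policy here; please contact a live agent."

if __name__ == "__main__":
    print(answer_policy_question(
        "Can I get a bereavement discount after my flight?"
    ))
```

The key design choice is that the answering layer either quotes approved wording verbatim or declines; it never paraphrases policy, which closes the channel through which a fabricated retroactive-discount claim could reach a customer.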
Significance
- Legal liability for AI-generated statements. The CRT ruling established that a company cannot disclaim responsibility for information provided by its own AI-powered chatbot, rejecting Air Canada’s argument that the bot was a “separate legal entity.”[1]
- Hallucination in customer-facing deployments. The incident demonstrated the tangible consumer harm that can result when AI systems generate fabricated information in contexts where customers reasonably rely on its accuracy.[4]
- Duty of care for AI tools. The ruling implied that organizations deploying AI chatbots bear a duty to ensure the accuracy of information those tools provide, or at minimum to ensure that customers are not misled by AI-generated errors.[3]
- Precedent for AI consumer protection. The case has been widely cited as an early legal precedent addressing corporate accountability for AI hallucinations in consumer-facing applications, informing ongoing policy discussions about AI liability frameworks.
Timeline
- November 2022: Jake Moffatt contacts Air Canada's website chatbot to inquire about bereavement travel fares following the death of a family member
- The chatbot advises Moffatt that he can book a full-fare ticket and apply retroactively for a bereavement discount within 90 days of the ticket being issued
- Moffatt books a full-fare flight to and from Toronto based on the chatbot's guidance
- Moffatt submits a request to Air Canada for the bereavement fare discount as the chatbot instructed
- Air Canada denies the refund request, stating that bereavement fares cannot be applied retroactively and that the chatbot's information was incorrect
- Moffatt files a claim with the Civil Resolution Tribunal of British Columbia
- February 14, 2024: CRT Tribunal Member Christopher Rivers rules in favor of Moffatt, ordering Air Canada to pay $812.02 in damages, interest, and tribunal fees
Outcomes
- Financial Loss: $812.02 CAD (damages and fees awarded to claimant)
- Arrests: None
- Recovery: Partial — tribunal awarded difference between full fare and bereavement fare
- Regulatory Action: Civil Resolution Tribunal ruling establishing corporate liability for chatbot statements
Use in Retrieval
INC-24-0005 documents "Air Canada Chatbot Hallucinated Refund Policy — Tribunal Ruling," a medium-severity incident classified under the Agentic Systems domain and the Cascading Hallucinations threat pattern (PAT-AGT-002). It occurred in North America (2022-11). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "Air Canada Chatbot Hallucinated Refund Policy — Tribunal Ruling," INC-24-0005, last updated 2026-02-15.
Sources
- [1] Civil Resolution Tribunal of British Columbia: Moffatt v. Air Canada, 2024 BCCRT 149 (primary, 2024-02)
  https://decisions.civilresolutionbc.ca/crt/crtd/en/item/521673/index.do
- [2] BBC News: Air Canada must honour refund policy invented by airline's chatbot (news, 2024-02)
  https://www.bbc.com/travel/article/20240222-air-canada-chatbot-
- [3] The Guardian: Air Canada ordered to pay customer who was misled by airline's chatbot (news, 2024-02)
  https://www.theguardian.com/world/2024/feb/16/air-canada-chatbot-lawsuit
- [4] Ars Technica: Air Canada must honor refund policy its chatbot made up (news, 2024-02)
  https://arstechnica.com/tech-policy/2024/02/air-canada-must-honor-refund-policy-its-chatbot-made-up/
Update Log
- First logged (Status: Confirmed, Evidence: Primary)