Technical Attack

Grandparent Scam

A social engineering fraud using AI voice cloning to impersonate a grandchild and convince older adults to send money.

Definition

The grandparent scam is a social engineering fraud in which perpetrators contact older adults while impersonating a grandchild or other close relative, typically claiming to be in urgent distress such as an arrest, accident, or medical emergency. The victim is pressured to send money immediately, often through untraceable methods such as wire transfers, gift cards, or cryptocurrency. AI voice cloning has transformed this scheme by enabling perpetrators to produce convincing replicas of a relative’s voice from brief audio samples obtained through social media or voicemail. This technological enhancement has significantly increased the plausibility and success rate of the scam compared to earlier variants.

How It Relates to AI Threats

The grandparent scam connects to AI threats within the Information Integrity and Security & Cyber domains, specifically through deepfake identity hijacking. The integration of AI voice cloning represents a qualitative shift in the threat: whereas traditional grandparent scams relied on generic impersonation and the victim's willingness to believe the caller, AI-enhanced variants produce audio that closely matches the actual voice of the purported relative. This undermines the verification strategies that older adults have historically relied upon, such as recognising a family member's voice. The scam also illustrates how consumer-accessible AI tools can be repurposed for targeted fraud.

Why It Occurs

  • AI voice cloning tools can generate convincing voice replicas from as little as three to five seconds of sample audio
  • Publicly available audio on social media, voicemail greetings, and video platforms provides source material
  • Older adults may be less aware that voice cloning technology exists or is accessible
  • The emotional urgency of the scenario (a grandchild in distress) inhibits rational evaluation
  • Payment methods such as gift cards and wire transfers are difficult to reverse or trace

Real-World Context

The Newfoundland voice cloning grandparent scam (INC-23-0004) documented a case in which perpetrators used AI-generated voice cloning to impersonate a family member and extract funds from an older adult. Law enforcement agencies across North America have issued warnings about the increasing prevalence of AI-enhanced grandparent scams. The FBI and the Canadian Anti-Fraud Centre have noted that voice cloning has made these schemes more difficult for victims to detect, and have recommended that families establish verification protocols such as code words to authenticate identity during unexpected calls.

Last updated: 2026-02-14