Harm Mechanism

Elder Fraud

Financial crimes targeting older adults, increasingly enabled by AI voice cloning, deepfakes, and automated robocalls.

Definition

Elder fraud encompasses financial crimes that specifically target older adults, exploiting factors such as social isolation, reduced digital literacy, cognitive vulnerabilities, and accumulated savings. In the context of AI-enabled threats, elder fraud has been amplified by voice cloning technology, deepfake impersonation, and automated robocall systems that can reach large numbers of potential victims at minimal cost. These AI tools allow perpetrators to convincingly impersonate family members, authority figures, or financial institutions, increasing the effectiveness and scale of fraudulent schemes. Elder fraud represents a significant and growing category of harm where technological sophistication meets demographic vulnerability.

How It Relates to AI Threats

Elder fraud intersects with AI threats across the Information Integrity and Security & Cyber domains. AI voice cloning enables highly convincing impersonation of family members in grandparent scams, while deepfake video can be used in investment fraud or romance scams targeting older adults. Automated AI systems facilitate high-volume vishing campaigns and robocalls that screen for vulnerable individuals. The combination of AI-generated content with social engineering techniques has increased both the scale and success rate of elder-targeted fraud.

Why It Occurs

  • AI voice cloning requires only seconds of sample audio to produce convincing impersonations
  • Older adults may be less familiar with AI capabilities and less likely to suspect synthetic media
  • Social isolation reduces opportunities for victims to verify suspicious communications
  • Automated calling systems enable mass targeting at negligible marginal cost
  • Financial institutions and law enforcement lack real-time detection mechanisms for AI-enhanced fraud
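The "negligible marginal cost" dynamic above can be made concrete with a toy expected-value model. All figures below are illustrative assumptions chosen for the sketch, not measured fraud statistics:

```python
# Back-of-envelope model of automated robocall fraud economics.
# Every parameter value here is a hypothetical assumption.

def campaign_expected_return(calls, cost_per_call, answer_rate,
                             victim_rate, avg_loss):
    """Expected fraud proceeds minus calling costs for one campaign."""
    victims = calls * answer_rate * victim_rate
    proceeds = victims * avg_loss
    cost = calls * cost_per_call
    return proceeds - cost

# Assumed parameters: 100,000 calls at $0.01 each, 10% answered,
# 0.1% of answered calls yielding an average loss of $1,000.
net = campaign_expected_return(
    calls=100_000, cost_per_call=0.01,
    answer_rate=0.10, victim_rate=0.001, avg_loss=1_000.0)
print(f"Expected net return: ${net:,.0f}")
```

Even with a success rate of one in a thousand answered calls, automation keeps the cost side so low that the campaign remains profitable, which is why per-call countermeasures alone do little to change the attacker's incentives.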

Real-World Context

The FBI’s 2024 Elder Fraud Report documented billions of dollars in losses attributable to fraud targeting individuals over 60, with AI-enabled schemes representing a growing proportion of reported incidents. Voice cloning technology has been specifically identified in grandparent scam variants where perpetrators use brief audio samples taken from social media to replicate the voice of a victim’s relative. Authorities in multiple jurisdictions have noted that AI-enhanced elder fraud schemes achieve higher success rates and larger per-victim losses than traditional methods.

Last updated: 2026-02-14