INC-24-0002 | Status: Confirmed | Severity: High

AI-Generated Biden Robocall in New Hampshire Primary (2024)

Alleged: voice-synthesis tools from an unknown developer (the voice was reportedly generated via ElevenLabs) were deployed by political consultant Steve Kramer, harming New Hampshire Democratic primary voters and the U.S. democratic process; contributing factors included intentional fraud and social engineering.

Incident Details

Last Updated 2026-02-15

An AI-generated robocall impersonating President Biden's voice was sent to New Hampshire voters before the 2024 primary election, urging them not to vote, in what authorities determined was an illegal voter suppression attempt.

Incident Summary

In January 2024, days before the New Hampshire presidential primary, an estimated 5,000 to 25,000 registered voters in New Hampshire received robocalls featuring an AI-generated voice designed to sound like President Joe Biden.[2] The synthetic voice urged recipients not to vote in the upcoming primary, stating that voting in the primary would only serve to help Republicans and that Democrats should “save their vote” for the November general election.

The New Hampshire Attorney General’s office immediately opened an investigation, identifying the calls as an apparent voter suppression effort. The investigation traced the robocalls to political consultant Steve Kramer, who admitted to commissioning the calls. Kramer stated that he intended the operation as a demonstration of the dangers of AI in politics, though investigators and prosecutors treated the act as a genuine voter suppression attempt.

The incident prompted swift regulatory action. On February 8, 2024, the Federal Communications Commission issued a declaratory ruling classifying AI-generated voices used in robocalls as “artificial” under the Telephone Consumer Protection Act, making such calls explicitly illegal without prior express consent.[1] The FCC also imposed a $1 million fine on the telecommunications company that transmitted the calls. The incident is considered a landmark case at the intersection of AI-generated content and election integrity.

Key Facts

  • Method: AI-generated voice clone of President Biden used in automated robocalls
  • Target: Registered voters in New Hampshire ahead of the presidential primary
  • Scale: Estimated 5,000 to 25,000 voters received the calls
  • Message: Urged voters not to participate in the primary election
  • Perpetrator: Political consultant Steve Kramer, who was subsequently charged
  • Regulatory response: FCC ruling making AI-generated voice robocalls illegal

Threat Patterns Involved

Primary: Disinformation Campaigns — The robocalls constituted a deliberate campaign to spread false information designed to suppress voter participation in a democratic election.

Secondary: Deepfake Identity Hijacking — The use of AI voice-cloning technology to impersonate the President of the United States in order to lend false authority to a voter suppression message.

Significance

  1. First major AI deepfake election interference in the United States. The incident represents one of the first documented cases of AI-generated audio being used to interfere with a U.S. election, establishing a concrete precedent for the threat.
  2. Rapid regulatory response. The FCC’s ruling within weeks of the incident demonstrated that existing legal frameworks could be adapted to address AI-generated voice content, without requiring new legislation.
  3. Accessibility of voice-cloning technology. The incident demonstrated that AI voice-cloning tools have become sufficiently accessible and convincing to be deployed in political operations by individuals without advanced technical expertise.
  4. Election integrity implications. The case highlighted the vulnerability of democratic processes to AI-generated disinformation, particularly in contexts where voters have limited ability to verify the authenticity of communications.

Timeline

  • January 2024: Robocalls using an AI-generated voice mimicking President Biden are sent to voters in New Hampshire ahead of the state's presidential primary
  • January 2024: New Hampshire Attorney General's office announces an investigation into the robocalls
  • January 2024: New Hampshire holds its presidential primary election
  • February 2024: New Hampshire Attorney General identifies political consultant Steve Kramer as the source of the robocalls
  • February 8, 2024: FCC issues a declaratory ruling that AI-generated voices in robocalls are “artificial” under the Telephone Consumer Protection Act and therefore illegal without prior express consent
  • May 2024: Steve Kramer is charged; telecommunications company Life Corporation is fined $1 million by the FCC

Outcomes

  • Financial Loss: Not applicable
  • Arrests: Political consultant Steve Kramer charged with voter suppression
  • Recovery: Not applicable
  • Regulatory Action: FCC declared AI-generated voice robocalls illegal; political consultant charged

Use in Retrieval

INC-24-0002 documents "AI-Generated Biden Robocall in New Hampshire Primary," a high-severity incident classified under the Information Integrity domain and the Disinformation Campaigns threat pattern (PAT-INF-003). It occurred in North America (January 2024). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "AI-Generated Biden Robocall in New Hampshire Primary," INC-24-0002, last updated 2026-02-15.

Sources

  1. FCC: FCC Makes AI-Generated Voices in Robocalls Illegal (primary, 2024-02)
    https://www.fcc.gov/document/fcc-makes-ai-generated-voices-robocalls-illegal
  2. New Hampshire DOJ: Voter Suppression AI Robocall Investigation Update (primary, 2024-02)
    https://www.doj.nh.gov/news-and-media/voter-suppression-ai-robocall-investigation-update
  3. NPR: A Political Consultant Faces Charges and Fines for Biden Deepfake Robocalls (news, 2024-05)
    https://www.npr.org/2024/05/23/nx-s1-4977582/fcc-ai-deepfake-robocall-biden-new-hampshire-political-operative

Update Log

  • — Updated NH DOJ source URL (site restructured); added NPR coverage as supplementary source
  • — First logged (Status: Confirmed, Evidence: Primary)