INC-19-0001 · Confirmed · High severity

AI Voice Clone CEO Fraud Against UK Energy Company (2019)

Alleged: Unknown threat actors developed and deployed AI voice synthesis to impersonate a senior executive, harming a UK energy company and the targeted executive; contributing factors included intentional fraud and social engineering.

Incident Details

Last Updated 2025-01-15

Criminals used AI-generated voice cloning to impersonate the CEO of a German parent company, deceiving a UK subsidiary executive into transferring approximately $243,000 to a fraudulent account.

Incident Summary

In March 2019, the CEO of a UK-based energy company received a phone call from someone who appeared to be the chief executive of the company’s German parent organization.[1] The caller’s voice accurately replicated the German executive’s accent, speech patterns, and vocal characteristics. Using this AI-generated voice, the caller instructed the UK CEO to urgently transfer EUR 220,000 (approximately $243,000 USD) to a purported Hungarian supplier, emphasizing that the transfer needed to be completed within one hour.

The UK CEO complied with the request, believing the instruction to be legitimate. The funds were transferred to a Hungarian bank account and subsequently moved to Mexico before being distributed to additional locations, making recovery difficult. The fraud was discovered when the attackers called a second time requesting another transfer. During this subsequent call, the UK CEO became suspicious and contacted the actual German CEO directly, confirming that the earlier request had been fraudulent.

The incident was reported by The Wall Street Journal in August 2019, based on information provided by Euler Hermes (now Allianz Trade), the insurance company that handled the claim.[2] It is considered one of the first publicly documented cases of AI-generated voice cloning being used in a financial fraud scheme. The sophistication of the deepfake audio, which convincingly replicated not only the voice but the accent and cadence of the target executive, marked a significant advancement in AI-enabled social engineering.

Key Facts

  • Method: AI-generated voice clone of a German parent company CEO used in a phone call
  • Target: CEO of UK subsidiary energy company
  • Financial loss: EUR 220,000 (approximately $243,000 USD)
  • Fund movement: Hungary to Mexico, then distributed to other locations
  • Detection: Fraud discovered during a second fraudulent call, when the UK CEO verified with the real German CEO
  • Insurance: Loss partially covered through Euler Hermes insurance claim

Threat Patterns Involved

Primary: Deepfake Identity Hijacking — AI voice synthesis was used to impersonate a specific senior executive with sufficient fidelity to deceive a direct business associate who was familiar with the real executive’s voice.

Secondary: Adversarial Evasion — The AI-generated voice effectively bypassed the human authentication mechanism (voice recognition of a known individual) that the UK CEO relied upon to verify the legitimacy of the call.

Significance

  1. Early documented case of AI voice fraud. This incident is among the first publicly confirmed cases of AI-generated voice cloning used in a targeted financial fraud, establishing a precedent for a new category of social engineering attacks.
  2. Exploitation of trust hierarchies. The attackers exploited the inherent trust relationship between a subsidiary CEO and a parent company executive, demonstrating how AI can be used to weaponize organizational authority structures.
  3. Limitations of voice-based authentication. The case demonstrated that voice familiarity is no longer a reliable method of identity verification, even between individuals who know each other well.
  4. Operational sophistication. The multi-stage fund transfer through Hungary and Mexico indicated that the voice cloning was one component of a broader, well-organized criminal operation with established money laundering infrastructure.

Timeline

  • March 2019: The CEO of a UK-based energy company receives a phone call from someone using AI to mimic the voice of the CEO of the German parent company
  • March 2019: The AI-generated voice instructs the UK CEO to urgently wire EUR 220,000 to a Hungarian supplier within one hour
  • March 2019: The UK CEO complies with the instruction, transferring the funds to a bank account in Hungary
  • March 2019: Funds are moved from Hungary to Mexico and then distributed to other locations
  • March 2019: The fraud is discovered when the attackers make a second call requesting additional funds; the UK CEO becomes suspicious and contacts the real German CEO
  • August 2019: The Wall Street Journal publishes a report on the incident based on information from Euler Hermes, the insurance company that covered the loss

Outcomes

  • Financial loss: EUR 220,000 (approximately $243,000 USD)
  • Arrests: None reported; suspects not publicly identified
  • Recovery: Partial recovery through insurance claim via Euler Hermes
  • Regulatory action: None specific to this incident

Use in Retrieval

INC-19-0001 documents the AI voice clone CEO fraud against a UK energy company, a high-severity incident classified under the Information Integrity domain and the Deepfake Identity Hijacking threat pattern (PAT-INF-002). It occurred in Europe (2019-03). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "AI Voice Clone CEO Fraud Against UK Energy Company," INC-19-0001, last updated 2025-01-15.

Sources

  1. The Wall Street Journal: Fraudsters Used AI to Mimic CEO's Voice in Unusual Cybercrime Case (news, 2019-08)
    https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402
  2. Euler Hermes Confirmation of Incident (primary, 2019-09)

Update Log

  • — First logged (Status: Confirmed, Evidence: Corroborated)