Business Email Compromise
Targeted fraud in which attackers impersonate executives or trusted contacts to trick employees into authorising fraudulent transactions.
Definition
Business email compromise (BEC) is a form of targeted social engineering fraud in which attackers impersonate executives, vendors, or trusted business contacts — typically via email — to deceive employees into transferring funds, disclosing sensitive information, or modifying financial instructions. BEC attacks rely on identity deception rather than technical exploitation, often leveraging publicly available information about organisational hierarchies and business relationships. With the advent of AI-generated synthetic media, BEC has expanded beyond email to include deepfake video calls and AI-cloned voice messages, significantly increasing the persuasiveness and sophistication of these attacks.
How It Relates to AI Threats
Business email compromise intersects primarily with the Security & Cyber and Information Integrity threat domains. AI has transformed BEC from a text-based social engineering technique into a multi-modal attack vector. Deepfake identity hijacking — in which attackers generate convincing video or audio of trusted individuals — enables real-time impersonation during video conferences or phone calls, removing many of the traditional indicators of fraud (e.g., unusual writing style, email domain inconsistencies). The combination of AI-generated content with established BEC tactics represents a qualitative escalation in the threat, as verification measures designed for text-based communication are insufficient against synthetic media.
Why It Occurs
- Organisational hierarchies create pressure to comply with requests from apparent senior leadership without extensive verification
- AI voice cloning and deepfake video tools can produce convincing impersonations from limited source material
- Financial transaction workflows often rely on verbal or email authorisation without multi-factor verification
- Publicly available information (social media, corporate websites, press releases) provides material for targeted impersonation
- The financial incentive is substantial — the FBI reported over $2.9 billion in BEC losses in a single year in the United States alone
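One traditional BEC indicator mentioned above is the use of lookalike sender domains (e.g. a vendor's domain with one character swapped). A minimal, illustrative detection sketch is below; the trusted-domain list, distance threshold, and function names are assumptions for the example, not part of any specific product or standard.

```python
# Illustrative sketch: flag sender domains that closely resemble, but do
# not exactly match, a known-good domain list -- a classic BEC indicator.
# trusted_domains and max_distance are hypothetical example values.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via standard dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def is_lookalike(sender_domain: str, trusted_domains: list[str],
                 max_distance: int = 2) -> bool:
    """True if the domain is *near* a trusted domain but not identical."""
    sender_domain = sender_domain.lower()
    for trusted in trusted_domains:
        d = edit_distance(sender_domain, trusted.lower())
        if 0 < d <= max_distance:
            return True
    return False

trusted = ["example.com", "vendor-corp.com"]
print(is_lookalike("examp1e.com", trusted))   # "1" for "l" -> True
print(is_lookalike("example.com", trusted))   # exact match -> False
```

Note that, as the section above argues, checks like this address only the email channel; they offer no protection against AI-cloned voice or deepfake video impersonation, which is why out-of-band verification of payment instructions remains necessary.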
Real-World Context
The AI-augmented BEC landscape is illustrated by incidents in which deepfake technology has been used to impersonate executives during live video calls. In the Hong Kong deepfake CFO fraud (INC-24-0001), attackers used AI-generated video of multiple company executives in a conference call to deceive an employee into authorising transfers totalling approximately $25 million. BEC attacks leveraging AI voice cloning have also been documented (INC-23-0006), in which cloned executive voices were used in phone calls to pressure employees into authorising wire transfers. These incidents demonstrate the progression of BEC from email-only deception to AI-enabled multi-channel fraud.
Related Threat Patterns
Last updated: 2026-02-14