TechCrunch

Microsoft Copilot Exposed Emails

about 12 hours ago

Microsoft has acknowledged a significant privacy lapse involving its Copilot AI chatbot. A software bug allowed the system to access and summarize private emails belonging to paying customers, bypassing the data-protection protocols designed to keep sensitive information secure.

The revelation highlights the risks of integrating generative AI deeply into enterprise productivity suites. As Microsoft moves to patch the vulnerability, the incident raises urgent questions about data isolation and the reliability of AI guardrails for handling confidential corporate communications.
