Copilot chatbot read and summarized confidential emails, says Microsoft

This is the second time in a matter of months that the AI tool has come under scrutiny. Last September, it was criticized after a glitch produced false audit logs.

Microsoft has admitted that an internal glitch allowed its flagship AI assistant, Copilot, to read and summarize confidential email messages in some users’ draft and sent folders.

The firm has insisted there was no data leakage to external users, and that it “did not provide anyone access to information
