Varonis found a “Reprompt” attack that let a single link hijack Microsoft Copilot Personal sessions and exfiltrate data; Microsoft patched it in January 2026.
Microsoft has fixed a vulnerability in its Copilot AI assistant that allowed hackers to pluck a host of sensitive user data ...
Reprompt impacted Microsoft Copilot Personal and, according to the Varonis team, gave "threat actors an invisible entry point to perform a data-exfiltration chain that bypasses enterprise security controls ..."
Researchers identified an attack method dubbed "Reprompt" that could allow attackers to infiltrate a user's Microsoft Copilot session and issue commands to exfiltrate sensitive data.