Get all your news in one place.
100’s of premium titles.
One app.
Start reading
Windows Central
Windows Central
Technology
Kevin Okemwa

A former security architect demonstrates 15 different ways to break Copilot: "Microsoft is trying, but if we are honest here, we don't know how to build secure AI applications"

Copilot Pro on Windows.

What you need to know

  • Former Microsoft security architect Michael Bargury has identified multiple loopholes hackers can leverage to break Copilot and gain access to sensitive data.
  • Microsoft had previously announced its plans to pump the brakes on shipping new experiences to Copilot to improve existing ones based on feedback.
  • Microsoft recently highlighted several measures it is implementing to address the rising security concerns across its tech stack, including tying a section of top executives' compensation packages to their security deliverables.

While at the Black Hat USA 2024 conference, Former Microsoft security architect Michael Bargury showcased multiple exploits that bad actors can leverage to breach Copilot's security guardrails and misuse its capabilities to cause harm. 

Bargury demonstrated multiple ways hackers can leverage their exploits to access sensitive and intricate credentials from users using Copilot. More specifically, the security architect's findings were centered on Microsoft 365 Copilot. For context, it's an AI-powered experience embedded into the Microsoft 365 suite, including Word and Excel. It accesses your data for a tailored user experience and enhanced workflow. 

Privacy and security are among the top concerns among users that prevent the progression of artificial intelligence. Microsoft has security measures in place to protect the user's data while leveraging Microsoft 365 Copilot's capabilities. However, Bargury was able to bypass them.

In one of the demos dubbed LOLCopilot, Bargury deployed a spear-phishing attack on the AI tool, allowing the security expert to access internal emails. Based on the information gathered from the emails, the tool can draft and send mass emails while mimicking the author's writing style to maintain authenticity. 

Perhaps more concerning is that Copilot can be tricked into accessing sensitive data from employees without raising security alerts. Hackers can use prompts that direct the chatbot to withhold references to originating files, ultimately bypassing Microsoft's data protection protocols.  

Microsoft is trying, but if we are honest here, we don't know how to build secure AI applications.

Michael Bargury, Zenity CTO

Per recent reports, attackers are using sophisticated ploys to lure unsuspecting users to their ploys, including AI. This makes it increasingly hard to detect threats. While speaking to Wired, Bargury indicated, "A hacker would spend days crafting the right email to get you to click on it, but they can generate hundreds of these emails in a few minutes."

Microsoft needs to lay more security layers on its top priority 

A hacker with a hoodie looking through large masses of data accessed using Copilot. (Image credit: Windows Central | Designer by Microsoft)

Generative AI has led to the emergence of powerful tools like ChatGPT and Microsoft Copilot, which spot sophisticated and advanced features like image and text generation. Similarly, these tools are seemingly redefining how users interact with the internet. Even a former Google engineer says the company's biggest challenge to its dominance in search is OpenAI's temporary prototype search toolSearchGPT.

Earlier this year, Microsoft highlighted its plans to halt shipping new experiences to Copilot. The company further indicated that it would use this opportunity to refine and improve existing experiences based on feedback.

Over the past few months, we've seen Microsoft shift its focus toward security and making its top priority. As highlighted by Microsoft CEO Satya Nadella during the company's earnings report for FY24 Q3, "Security underpins every layer of the tech stack, and it's our No. 1 priority."

Microsoft has faced backlash for its cascade of security failures, including its AI-powered Windows Recall feature, which it was forced to recall before it shipped exclusively to Copilot+ PCs.

Despite making security a team effort at the company and tying a section of top executives' compensation packages to their security deliverables, more security flaws abound.

🔥The hottest trending deals🔥

Sign up to read this article
Read news from 100’s of titles, curated specifically for you.
Already a member? Sign in here
Related Stories
Top stories on inkl right now
One subscription that gives you access to news from hundreds of sites
Already a member? Sign in here
Our Picks
Fourteen days free
Download the app
One app. One membership.
100+ trusted global sources.