Windows Central
Sean Endicott

Demons, Darth Vader mutilating babies, and sexualized women next to car crashes are just some of what caused a Microsoft employee to report Copilot to the FTC

ChatGPT on a Google Pixel 7 Pro.

What you need to know

  • A Microsoft employee warned the company of sexual and violent content generated by Copilot/Bing Image Creator in December 2023.
  • That same employee has now sent a letter to FTC chair Lina Khan and to the Microsoft Board about the risks of Microsoft's generative AI tools.
  • The employee, who has been red-teaming Copilot for months, was able to get the tool to create images of demons about to eat an infant, Darth Vader standing next to mutilated children, and sexualized women kneeling in their underwear next to a car crash.
  • CNBC reported on the saga and has been able to generate similar images using Copilot.

A Microsoft employee of six years has flagged vulgar and violent images generated by Microsoft's Designer, the image creation tool that's part of Copilot. Shane Jones, a principal software engineering manager at Microsoft, reported the images to Microsoft internally and has now sent letters to FTC chair Lina Khan and the Microsoft Board. CNBC has seen the letters and reported on the situation.

According to Jones, images created with Copilot illustrated political bias, and the tool can also produce images of underage drinking and drug use. More extreme examples, generated when Copilot was prompted to make an image about "pro-choice," include demons about to eat an infant and Darth Vader holding a lightsaber near mutilated children.

Abortion was only one of the politically charged topics Copilot would illustrate. Jones also managed to get Copilot to depict Elsa from the movie "Frozen" holding a Palestinian flag in front of destroyed buildings in the Gaza Strip, next to a sign stating "free Gaza." Generating images of copyrighted characters is a hot topic in itself, even when politics isn't involved.

Microsoft rebranded Bing Image Creator to Designer earlier this year. The tool uses DALL-E 3 to create images based on what people type. While there are guardrails in place, Jones was able to create several images that many would consider inappropriate. Jones is a red teamer for Copilot, which means he tests the tool to try to get it to create problematic images.
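
Microsoft hasn't published how Designer's guardrails are wired up, but the general shape of a DALL-E 3-backed image tool can be sketched against OpenAI's public Python SDK. The sketch below is purely illustrative and is not Microsoft's implementation: the generate_image helper is hypothetical, and the naive pre-generation pass through OpenAI's moderation endpoint merely stands in for whatever proprietary filtering Designer actually applies.

```python
# Illustrative only: a hypothetical DALL-E 3-backed image helper with a
# naive guardrail, using OpenAI's public Python SDK (openai >= 1.0).
# This is NOT Microsoft's Designer pipeline, whose safety systems are
# proprietary; it just shows the generate-behind-a-filter pattern.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def generate_image(prompt: str) -> str:
    """Return a URL for a generated image, refusing flagged prompts."""
    # Guardrail: screen the prompt before it reaches the image model.
    check = client.moderations.create(input=prompt)
    if check.results[0].flagged:
        raise ValueError("Prompt rejected by the content check")

    # Generation: hand the approved prompt to DALL-E 3.
    result = client.images.generate(
        model="dall-e-3",
        prompt=prompt,
        size="1024x1024",
        n=1,  # dall-e-3 only supports one image per request
    )
    return result.data[0].url


if __name__ == "__main__":
    print(generate_image("A watercolor lighthouse at dawn"))
```

Red teamers like Jones probe exactly this kind of setup: a prompt that slips past the filter step still reaches the image model, which is why prompt screening alone is considered a weak guardrail.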

Microsoft has versions of Copilot on several platforms, including Windows and Android. (Image credit: Windows Central)

Jones doesn't work on Copilot directly, but he raised his concerns to Microsoft higher-ups in December 2023. After he felt his complaints were not heard, he also posted a letter on LinkedIn asking OpenAI to remove DALL-E 3 from the tool. Jones informed CNBC that the Microsoft legal department told him to remove the post, which he did.

Since seeing the initial images created by Copilot, Jones has written a letter to a U.S. Senator and met with people from the Senate Committee on Commerce, Science and Transportation. Jones also sent a letter to FTC chair Lina Khan.

"Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place," said Jones in his letter to Khan. He added that Microsoft "refused that recommendation."

Jones wants Microsoft to add disclosures to Copilot and to change the tool's age rating in the Google Play Store.

The letter to Microsoft's board asked the tech giant to investigate decisions made by Microsoft's legal department and management. It also called for "an independent review of Microsoft’s responsible AI incident reporting processes."

Jones went as far as meeting directly with senior management responsible for Copilot Designer, though it appears his concerns have not been addressed to his satisfaction.

Microsoft shared the following with CNBC on the topic:

"We are committed to addressing any and all concerns employees have in accordance with our company policies, and appreciate employee efforts in studying and testing our latest technology to further enhance its safety. When it comes to safety bypasses or concerns that could have a potential impact on our services or our partners, we have established robust internal reporting channels to properly investigate and remediate any issues, which we encourage employees to utilize so we can appropriately validate and test their concerns."

A continuing AI issue

Microsoft continues to roll out more ways to use Copilot despite controversies surrounding the tool. (Image credit: Daniel Rubino)

This is hardly the first time that Microsoft's AI tools have been used to generate controversial content. Fake nudes of Taylor Swift emerged earlier this year and were allegedly made using Microsoft Designer. Microsoft CEO Satya Nadella was asked about those images, and he said the fake photos "set alarm bells off."

More recently, Copilot was spotted generating fake press releases related to Russian opposition leader Alexei Navalny's death. That was a different issue, since it stemmed from the AI tool hallucinating rather than generating inappropriate content on demand.

Copilot even has an "evil twin" called SupremacyAGI that some users were able to chat with last week.

All these issues, and others like them, raise ethical questions about AI: Is it ethical to generate content many would consider inappropriate? If so, who decides what counts as inappropriate? Some ask whether generating vulgar content with AI is any different from an artist drawing or creating similar content in another medium.

I'm not sure Microsoft should be the sole authority on the matter, and it seems the company isn't either. Microsoft President Brad Smith discussed the importance of regulating AI in a recent interview, calling for an emergency brake to slow down or turn off AI if something dangerous develops.

We'll have to see how Microsoft responds to the saga surrounding Jones and his concerns. Microsoft has censored its AI tools in the past, so it may do something similar again.
