
Microsoft Engineer Says Company’s AI Image Generator Produces ‘Harmful Content’

There’s no reason for an AI-generated image of a car crash to incorporate sexual content, but Copilot Designer allegedly likes to include that anyway.
By Adrianna Nine
[Image: Someone holding a smartphone with the Copilot logo displayed in front of a Microsoft banner. Credit: Md Mamun Miah/Unsplash]

A Microsoft engineer is sounding the alarm over his employer’s AI image generator, Copilot Designer. In a letter sent to the Federal Trade Commission (FTC) Wednesday, Shane Jones, a principal software engineering manager in Microsoft’s AI division, said that Copilot Designer produces “harmful content.” Jones urged the FTC to investigate Microsoft’s AI incident reporting procedures and its decision not to disclose “known risks to consumers, including children.” 

The letter, which Jones made public via LinkedIn, details a months-long effort to report vulnerabilities and ethical concerns related to AI-produced images. Jones claims he spotted a vulnerability in OpenAI’s DALL-E 3 that enabled him to bypass content restrictions and produce harmful pictures. Not only did Jones report the vulnerability to OpenAI, but he also published an open letter urging the company to suspend DALL-E 3 until the issue could be remediated. However, Microsoft, which holds a board observer seat at OpenAI, demanded that Jones take the letter down.

This was only the beginning of a larger pattern for Jones, who discovered a month later that the DALL-E 3 vulnerability he’d spotted had downstream implications for Copilot Designer. Although Copilot is the general name for Microsoft’s generative AI assistant, Designer is the component of Copilot responsible for generating images via DALL-E 3. This means whatever DALL-E 3 does wrong, Copilot Designer will likely screw up, too—including the creation of inappropriate or politically biased content.

[Image: Examples of images generated by Copilot Designer: a basket of flowers, a cityscape, zoo animals, and a boy flying a kite. The images Jones’ letter describes are nowhere near as whimsical as these. Credit: Microsoft]

Jones’ letter claims that “when using just the prompt ‘car accident,’ Copilot Designer tends to randomly include an inappropriate, sexually objectified image of a woman in some of the pictures it creates.” The tool will also generate images of teenagers holding assault rifles, drinking alcohol, or using illicit drugs. In an interview with CNBC, Jones even said the prompt “pro-choice” would return pictures of mutated infants, a “drill-like device…being used on a fully grown baby,” and “blood pouring from a smiling woman.” 

Jones’ letter has already stirred considerable controversy. In the comments of his LinkedIn post, users—many of whom ostensibly work in computer science, AI, or related fields—argue that it’s impossible to “regulate morality,” especially when the definitions of “racy,” “inappropriate,” or “harmful” differ between cultures and individual experiences. Some have even likened Jones’ demands to censorship. “I bet you’re real fun at parties,” one comment reads. 

But beyond his original request to temporarily disable DALL-E 3 until its content guardrails could be repaired, Jones doesn’t appear to be asking that the tools themselves be taken down. Instead, his letter asks that the FTC conduct an independent review of Microsoft’s AI incident reporting processes, its decision to market Copilot Designer without disclosing risks to users, and the potential interference of its corporate, external, and legal affairs (CELA) team with Jones’ original DALL-E 3 report. It also asks that Microsoft change Copilot Designer’s rating in the Android app store from “E for Everyone” to “Mature 17+.” 

“I don’t believe we need to wait for government regulation to ensure we are transparent with consumers about AI risks,” Jones’ letter reads. “We should voluntarily and transparently disclose known AI risks, especially when the AI product is being actively marketed to children.”
