
New Tool Protects Artists’ Work From AI Image Generators

Nightshade makes generators’ training data—which consists of pretty much any image on the internet—look like something it’s not.
By Adrianna Nine
3D digital art depicting cubes floating in space.
Credit: Steve Johnson/Unsplash

A new tool developed at the University of Chicago promises to protect artists’ images from the generative AI models that might use them as training data. It’s the latest step in a large-scale generative AI resistance born from alleged plagiarism, copyright infringement, and other intellectual property offenses.

The tool is called Nightshade, and it's currently under peer review for the USENIX computer security conference. A paper shared on the arXiv preprint server describes Nightshade as a data poisoning attack: it manipulates generative AI models' training data to produce undesirable effects.

Artists who want to use Nightshade simply upload their images and wait for them to undergo a quick data poisoning routine. Nightshade subtly alters an image's pixels so that a machine learning model reads it as something else entirely (in the researchers' example, a cat photo can be manipulated to register as a dog photo) while, to the human eye, the image appears unchanged. This means artists can continue to show off their work via their websites, social media, image repositories, and other online spaces without sacrificing its integrity.

A graph showing the ways in which Nightshade manipulates image definitions, resulting in incorrect outputs.
Credit: Shan et al./arXiv:2310.13828
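Conceptually, the poisoning step resembles an adversarial perturbation: nudge an image's pixels, within an invisibly small budget, until a feature extractor files it under the wrong concept. The PyTorch sketch below illustrates that idea only; `encoder`, `target_embedding`, and every parameter are illustrative assumptions, not Nightshade's actual algorithm.

```python
import torch
import torch.nn.functional as F

def poison_image(image, encoder, target_embedding, eps=8 / 255, steps=200, lr=0.01):
    """Nudge `image` (a [3, H, W] float tensor in [0, 1]) so that `encoder`
    embeds it near `target_embedding` (say, the feature vector for "dog"),
    while capping each pixel's change at `eps` so the edit stays invisible.
    All names and numbers here are illustrative assumptions.
    """
    delta = torch.zeros_like(image, requires_grad=True)  # hidden perturbation
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        poisoned = (image + delta).clamp(0, 1)
        embedding = encoder(poisoned.unsqueeze(0))  # shape [1, D]
        # Pull the image's features toward the decoy concept.
        loss = 1 - F.cosine_similarity(embedding, target_embedding).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change imperceptible to humans
    return (image + delta).detach().clamp(0, 1)
```

Given a CLIP-style `encoder` and the embedding of "dog" as `target_embedding`, the returned image still looks like a cat to a person but reads as dog-like to a model that scrapes it for training.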

Image generators like DALL-E, Stable Diffusion, and Midjourney rely on the internet's endless visual assets. When a user requests an image of a cat, the model draws on patterns it learned from thousands of online cat images during training to produce the desired asset. But when Nightshade redefines cat images as dog images, the training data on which AI image generators depend becomes unreliable. The generator might spit out what it believes to be a decent cat drawing, but to the human viewer, the output features grotesquely assembled elements of multiple different animal types.
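To get a feel for why poisoned labels hurt, consider a toy sketch (not Nightshade's actual math): if a model's notion of "cat" were simply the average feature of everything labeled "cat," a 10% slice of dog-like poison would drag that notion measurably toward "dog."

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy one-dimensional "feature space": cat images cluster near +1.0,
# dog images near -1.0. The numbers are purely illustrative.
clean_cats = rng.normal(loc=+1.0, scale=0.1, size=900)
poison = rng.normal(loc=-1.0, scale=0.1, size=100)  # dog features labeled "cat"

# A model that learns "cat" as the average of everything labeled "cat":
clean_concept = clean_cats.mean()
poisoned_concept = np.concatenate([clean_cats, poison]).mean()

print(f"'cat' learned from clean data:  {clean_concept:+.2f}")     # ~ +1.00
print(f"'cat' learned with 10% poison:  {poisoned_concept:+.2f}")  # ~ +0.80
```

Scale that drift across a real model's training run and its rendering of "cat" starts absorbing dog features, producing the mangled hybrids described above.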

If this concept sounds familiar, it's because the same team of researchers created Glaze earlier this year. Glaze is a style mimicry disruptor that prevents AI image generators from ripping off a particular artist's technique, but it works similarly to Nightshade in that it "cloaks" an image's machine-readable data before the image is shared online.

The University of Chicago team and many artists hope tools like Nightshade and Glaze will add friction to the AI image generation process. “It is going to make [AI companies] think twice, because they have the possibility of destroying their entire model by taking our work without our consent,” one illustrator told MIT Technology Review following an exclusive Nightshade demonstration.
