New anti-AI tool allows artists to “poison” training models

A new anti-AI tool promises to let artists fight back against generative AI algorithms that scrape their art.

Nightshade, as the new tool is called, works by “poisoning” the training data of any image-generating AI model that scrapes art protected by it. Models trained on that art become effectively useless, producing incorrect images: asking for dogs generates cats, asking for hats generates cakes, and so on.

Ben Zhao, a professor at the University of Chicago, led the team that created Nightshade. He believes the tool will be a powerful deterrent against AI companies that train their models without permission from artists, swinging the balance of power back toward the artists.

Zhao’s team previously built the anti-scraping tool Glaze, which lets artists “mask” their personal style to prevent AI models from learning it. Glaze works by changing an image’s pixels in subtle ways: while the changes are invisible to the human eye, they manipulate machine-learning models into interpreting the image as something different from what it actually portrays.
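To make that idea concrete, here is a minimal sketch in PyTorch of the general class of technique the article describes: optimizing a small, bounded pixel perturbation so that an image encoder “sees” a decoy style. The encoder, the decoy features, and every function name here are assumptions made for illustration; this is not Glaze’s actual algorithm.

```python
import torch

def cloak(image, encoder, decoy_features, epsilon=0.03, steps=50, lr=0.01):
    """Hypothetical sketch: optimize a perturbation so `encoder` reads
    `image` as the decoy style, while keeping every pixel within
    +/- epsilon of the original (small enough to look unchanged)."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        features = encoder(image + delta)
        # Pull the model's interpretation of the image toward the decoy.
        loss = torch.nn.functional.mse_loss(features, decoy_features)
        loss.backward()
        optimizer.step()
        # Keep the change within the imperceptibility budget.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)
    return (image + delta).detach().clamp(0.0, 1.0)
```

The key design point is the epsilon bound: the perturbation is large enough to move the image in the model’s feature space, but small enough that a human viewer sees the original artwork.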

Nightshade, on the other hand, exploits a vulnerability that stems from the fact that generative AI models are trained on vast amounts of data: thousands, or even millions, of images taken from the internet. Nightshade alters those images in a way that tricks the models during training.

The poisoned images manipulate AI models into learning that certain things are in fact other things: dogs become cats, cars become cows. Worse, the poisoned data is very difficult to remove, since AI companies would have to painstakingly find and delete each corrupted sample in their training data.
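As a rough illustration of why the poison is hard to root out, consider a hypothetical poisoning pass over a scraped dataset. The captions stay truthful, so the corrupted samples look entirely normal to anyone inspecting the data; only the learned features are wrong. Every name below is invented for illustration and is not Nightshade’s real interface.

```python
def poison_dataset(clean_pairs, cloak_fn, decoy_features, concept="dog"):
    """Hypothetical sketch: return (image, caption) pairs where images
    captioned with the target concept carry decoy features."""
    poisoned = []
    for image, caption in clean_pairs:
        if concept in caption:
            # The caption still honestly says "dog", so the sample
            # can't be flagged by eyeballing a text/image mismatch;
            # the poison lives in subtle pixel changes.
            image = cloak_fn(image, decoy_features)
        poisoned.append((image, caption))
    return poisoned
```

Once enough such pairs are scraped into a training set, the model links the prompt “dog” to cat-like features, which is why prompts for dogs start producing cats.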

The team has so far tested Nightshade on Stable Diffusion’s latest models and on an AI model they trained from scratch. With the former, just 50 poisoned images of dogs were enough to prompt strange output: creatures with too many limbs and cartoonish faces.

Zhao’s team plans to ship its anti-AI tools together by integrating Nightshade into Glaze. Artists who want to share their work online without it being scraped by AI models can run it through Glaze, with an option to add Nightshade’s protection as well.

The team is also making Nightshade open-source, allowing other developers to tinker with it and build their own versions. According to the team, the more versions of Nightshade that exist, and the more poisoned images that get uploaded, the greater the damage the technique can do to AI models that scrape them.
