Artists can now poison their images to prevent misuse of AI

Boffins at the University of Chicago released Nightshade 1.0 this week, a tool designed to punish unscrupulous developers of machine learning models who train their systems on data without first getting permission.

Nightshade is an offensive data poisoning tool, a companion to a defensive style protection tool called Glaze, which The Register covered in February last year.

Nightshade poisons image files to cause indigestion for models that consume data without permission. The goal is to ensure that image-based models are trained in a way that respects content creators’ wishes about how their work is used.

“Nightshade is computed as a multi-objective optimization that minimizes visible changes to the original image,” said the team responsible for the project.

“For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass.”
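
That description amounts to optimizing a small perturbation with two competing objectives: pull the image’s features toward an unrelated “anchor” concept, while a perceptual bound keeps the change effectively invisible to humans. The following is a minimal, hypothetical PyTorch sketch of that idea, not Nightshade’s actual implementation; `extractor` (the target model’s image feature encoder), `lpips_loss` (a perceptual distance such as LPIPS), and all constants are illustrative assumptions.

```python
import torch

def nightshade_style_perturbation(image, anchor_image, extractor, lpips_loss,
                                  budget=0.07, steps=200, lr=0.01):
    """Hypothetical sketch of a Nightshade-style perturbation (simplified).

    Optimizes a small delta so the poisoned image's features (as the
    model's encoder sees them) drift toward an 'anchor' image of another
    concept, while a perceptual loss keeps the edit nearly invisible.
    All names and constants here are illustrative assumptions.
    """
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_feats = extractor(anchor_image)  # e.g. features of "a leather purse"

    for _ in range(steps):
        poisoned = (image + delta).clamp(0.0, 1.0)
        # Objective 1: move the features toward the anchor concept.
        feat_loss = torch.nn.functional.mse_loss(extractor(poisoned), target_feats)
        # Objective 2: penalize perceptible change beyond a small budget.
        visibility = torch.relu(lpips_loss(poisoned, image) - budget)
        loss = feat_loss + 100.0 * visibility
        opt.zero_grad()
        loss.backward()
        opt.step()

    return (image + delta).detach().clamp(0.0, 1.0)
```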

Nightshade was developed by University of Chicago doctoral students Shawn Shan, Wenxin Ding, and Josephine Passananti, and professors Heather Zheng and Ben Zhao, some of whom also contributed to Glaze.

Described in a research paper in October 2023, Nightshade is a prompt-specific poisoning attack. Poisoning an image involves choosing a label (eg, a cat) that describes what is actually depicted, in order to blur the boundaries of that concept when the image is ingested for model training.
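
To make that concrete, here is a purely illustrative sketch of assembling such poisoned training pairs. `load_images` and `perturb_toward` are hypothetical stand-ins (the latter in the spirit of the optimization sketched above), not part of Nightshade’s actual interface:

```python
# Purely illustrative: 'load_images' and 'perturb_toward' are hypothetical
# stand-ins, not Nightshade's actual API.
poison_pairs = []
for cat_image in load_images("my_artwork/cats"):  # images that genuinely depict cats
    # Nudge the pixels so the model's encoder reads them as another
    # concept (say, "dog"), while humans still see a cat.
    poisoned = perturb_toward(cat_image, target_concept="dog")
    # The caption stays truthful ("cat"), so a scraper ingesting the pair
    # teaches its model that "cat" looks like a dog in feature space,
    # blurring the boundary of the "cat" concept.
    poison_pairs.append((poisoned, "a photo of a cat"))
```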

So a user of a model trained on Nightshade-poisoned images might submit a prompt for a cat and receive the image of a dog or a fish instead. Unpredictability of this sort makes text-to-image models significantly less useful, which gives model makers an incentive to ensure they train only on data that has been offered freely.

“Nightshade can provide content owners with a powerful tool to protect their intellectual property against model trainers who ignore copyright notices, no-scraping/crawling instructions, and opt-out lists,” the authors state in their paper.

Failure to consider the wishes of artwork creators and owners led to a lawsuit filed last year, part of a broader pushback against the unauthorized harvesting of data for the benefit of AI businesses. The infringement claims, brought by several artists against Stability AI, DeviantArt, and Midjourney, allege that the Stable Diffusion model used by the defendant firms incorporates the artists’ work without permission. The case, amended in November 2023 to add a new defendant, Runway AI, is pending.

The authors caution that Nightshade has some limitations. In particular, images processed with the software can look noticeably different from the original, especially artwork that uses flat colors and smooth backgrounds. They also observe that techniques to defeat Nightshade may be developed, although they believe they can adapt their software to keep pace with countermeasures.

Matthew Guzdial, assistant professor of computer science at the University of Alberta, said in a social media post: “This is cool and timely work! But I worry that it’s being over-hyped as a solution. It only works with CLIP-based models and, according to the authors, would require 8 million images to be ‘poisoned’ in order to have a significant effect on generating similar images for LAION models.”

Glaze, which reached its 1.0 release last June and is now at version 1.1.1, with a web version also available, alters images to prevent models trained on those images from imitating the artist’s visual style.

Style mimicry – available through closed text-to-image services such as Midjourney and through open-source models such as Stable Diffusion – is possible simply by prompting a text-to-image model to produce an image in a particular artist’s style.

The team believes that artists should have a way to prevent the capture and reproduction of their visual style.

“Style mimicry produces a number of harmful outcomes that may not be obvious at first glance,” the boffins explain. “For artists whose styles are intentionally copied, not only do they lose commissions and basic income, but low-quality synthetic copies scattered online dilute their brand and reputation. Most importantly, artists associate their styles with their very identity.”

The team likens style mimicry to identity theft and says it discourages aspiring artists from creating new work.

The team recommends that artists use both Nightshade and Glaze. Currently the two tools must be downloaded and installed separately, but a combined version is being developed. ®
