Researchers try to ‘poison’ art so AI platforms can’t copy it

Artists and computer scientists are trying a new way to prevent artificial intelligence from ripping off copyrighted images: “poisoning” the AI models that ingest them.

A tool called Nightshade, released in January by researchers at the University of Chicago, alters images in ways that are almost invisible to the human eye but look dramatically different to the AI platforms that digest them. Artists like Carla Ortiz are now “nightshading” their artworks to prevent them from being scanned and copied by text-to-image programs such as DeviantArt’s DreamUp, Stability AI’s Stable Diffusion and others.

“It struck me that a lot of it is basically the whole of my work, the whole of my colleagues’ work, the whole of almost every artist I know that has work,” said Ortiz, a concept artist and illustrator whose portfolio has earned her jobs doing visual design for film, TV and video game projects such as “Star Wars,” “Black Panther” and “Final Fantasy XVI.”

“And everything was done without anyone’s consent — no credit, no compensation, nothing,” she said.

Nightshade takes advantage of the fact that AI models don’t “see” the way people do, said Shawn Shan, who led the research.

An original photo by NBC News’ Brian Cheung, left, is barely distinguishable from the Nightshade-“poisoned” version, right, intended to trip AI tools. (Nate Congleton / NBC)

“Machines, they just see a big array of numbers, right? It’s pixel values from zero to 255, and to the model, that’s all they see,” he said. So Nightshade changes thousands of pixels, a drop in the bucket for standard images that contain millions of pixels, but enough to trick the model into seeing “something that is different,” said Shan, a fourth-year Ph.D. student at the University of Chicago. In a paper to be presented in May, the team explains how Nightshade automatically chooses a concept it intends to confuse an AI program about when it responds to a given prompt: embedding images of “dogs,” for example, with an array of pixel distortions that the model reads as “cat.”
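To make the scale Shan describes concrete, here is a toy Python sketch. It is not Nightshade’s actual algorithm, which optimizes its perturbations to change how a model interprets an image; this sketch only nudges a few thousand randomly chosen pixel values to show what a small fraction of a multimillion-pixel photo that touches. The file name photo.jpg, the 5,000-pixel budget and the size of the nudge are assumptions made for illustration.

```python
# Toy illustration only (not Nightshade's method): randomly nudge a few
# thousand pixel values to show how small that change is relative to the
# millions of pixels in a typical photo. Nightshade's real perturbations
# are optimized to shift a model's reading of the image; this sketch only
# demonstrates the scale involved.
import numpy as np
from PIL import Image

img = np.array(Image.open("photo.jpg").convert("RGB"), dtype=np.int16)  # each value is 0-255
height, width, _ = img.shape
total_pixels = height * width

rng = np.random.default_rng(0)
n_changed = 5_000  # assumed budget: a few thousand pixels
ys = rng.integers(0, height, n_changed)
xs = rng.integers(0, width, n_changed)

# Shift each selected pixel by a small amount per channel, clamped to 0-255.
noise = rng.integers(-8, 9, size=(n_changed, 3))
img[ys, xs] = np.clip(img[ys, xs] + noise, 0, 255)

print(f"Altered {n_changed / total_pixels:.3%} of the image's pixels")
Image.fromarray(img.astype(np.uint8)).save("photo_perturbed.jpg")
```

On a 12-megapixel photo, 5,000 altered pixels is roughly 0.04% of the picture, which is why edits on that scale can be nearly invisible to a person even though the array of numbers a model sees has changed.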

After researchers fed 1,000 subtly “poisoned” dog photos to a text-to-image AI tool and requested a dog photo, the model produced something decidedly un-canine.

Stable Diffusion XL produced these images after being fed Brian Cheung’s “nightshaded” image. (Nightshade / University of Chicago)

The decoy Nightshade plants is not always a cat, however. The program decides on a case-by-case basis which alternative concept it wants its AI targets to “see.” In some instances, Shan said, it takes only 30 nightshaded images to poison a model this way.

Ben Zhao, the computer science professor who runs the University of Chicago lab that developed Nightshade, doesn’t expect use of the tool to near a level where it threatens to displace AI image generators. Instead, he described it as a “spear” that could render some narrow applications unusable and force companies to pay when they scrape artists’ work.

“If you’re any kind of creator, for example, if you take photos, and you don’t necessarily want your photos, or your children’s likenesses, or your own likeness, to be fed into a training model, then Nightshade is something you can consider,” Zhao said.

The tool is free to use, and Zhao said he plans to keep it that way.

Models like Stable Diffusion already offer “opt-outs” so artists can tell datasets not to use their content. But many copyright holders have complained that the proliferation of AI tools is outpacing their efforts to protect their work.

The debate over intellectual property protection adds to a broader set of ethical concerns about AI, including the spread of deepfakes and questions about the limits of watermarking to prevent these and other abuses. While there is growing recognition in the AI industry that more security measures are needed, the rapid growth of the technology — including new text-to-video tools like OpenAI’s Sora — worries some experts.

“I don’t know that it’s going to do much, because there’s going to be a technical solution that’s going to be a counter-reaction to this attack,” Sonja Schumer-Gallander, a professor of AI and ethics at the University of Florida, said of Nightshade.

While the Nightshade project and others like it represent a welcome “rebellion” against AI models in the absence of meaningful regulation, Schumer-Gallander said, AI developers are likely to prepare their programs to defend against such countermeasures.

The University of Chicago researchers acknowledge that AI platforms could mount a “potential defense” against image poisoning, for example by updating their filters to weed out data and images suspected of being abnormally altered.

Zhao believes it’s unfair to put the burden on individuals to keep AI models away from their photos.

“How many companies do you have to go to as an individual to not infringe on your rights?” he said. “You don’t say, ‘Yeah, you really should sign a form every time you cross the street that says, “Please don’t hit me!” to every oncoming driver.’”

In the meantime, Ortiz said, she sees Nightshade as a helpful “attack” that gives her work some protection while she pursues stronger remedies in court.

“Nightshade is just saying, ‘Hey, if you take this without my consent, there could be consequences,’” said Ortiz, who in January 2023 filed a proposed class-action lawsuit against Stability AI, Midjourney and DeviantArt alleging copyright infringement.

A court late last year rejected some of the plaintiffs’ arguments but left the door open for them to file an amended suit, which they did in November, adding Runway AI as a defendant. In a motion to dismiss earlier this year, Stability AI argued that “mere imitation of an aesthetic style does not infringe the copyright of any work.”

Stability AI has not commented. Midjourney, DeviantArt and Runway AI did not respond to requests for comment.
