The new tools aim to protect artists’ work from AI scraping.


Making a living as an artist is hard. According to the online magazine Contemporary Art Issue, an estimated 85 percent of visual artists earn less than $25,000 a year. And now, with artists already underpaid and their work undervalued, there’s a new threat to their income: generative artificial intelligence.

To train AI models, the companies behind them typically copy art found online and use the images for free. Copyright infringement lawsuits are pending, but in the meantime there are two new free tools artists can use to fight back: Glaze, which protects an artist’s style from being mimicked, and Nightshade, which goes further and poisons any AI program unfortunate enough to use the treated art as training data.

Ben Zhao is a professor of computer science at the University of Chicago and the leader of the Glaze and Nightshade projects. He joined “Marketplace” host Kai Ryssdal to explain how these tools work and what effect he hopes they will have on the future of the creative industries. Below is an edited transcript of their conversation.

Kai Ryssdal: In lay terms, if you can, these two tools that I mentioned in the introduction, Glaze and Nightshade. Again, very generally, very basically: what do they do? How do they work?

Ben Zhao: Oh, wow, that’s a lot in one question. So let me break it down. OK, so, two tools, right? They do different things. What Glaze does, when you run it over images of your art, is change them in very subtle ways, so subtle that when we humans look at them, we can barely see any difference. But when AI models look at them, they perceive something very, very different. And if you do that to protect your art, someone trying to copy your style using one of those models will likely end up with a style that mismatches, one that looks completely different from what they wanted.
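
For the technically curious, here is a minimal sketch, in Python with PyTorch, of the general idea Zhao describes: nudging an image’s pixels within a tiny, nearly invisible budget so that a model’s feature extractor “sees” something else. To be clear, this is not the actual Glaze algorithm; the pretrained ResNet-18 standing in for a feature extractor, the file names and every parameter below are illustrative assumptions.

```python
# Illustrative sketch only: NOT the Glaze algorithm, just a generic
# feature-space perturbation in the same spirit.
# Assumptions: a pretrained ResNet-18 stands in for the style/feature
# extractor; "art.png" and "decoy.png" are hypothetical files.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Feature extractor: ResNet-18 with the final classification layer removed.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
features = torch.nn.Sequential(*list(backbone.children())[:-1]).to(device).eval()

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def embed(x):
    # Returns a (batch, 512) feature vector per image.
    return features(x).flatten(1)

art = preprocess(Image.open("art.png").convert("RGB")).unsqueeze(0).to(device)
decoy = preprocess(Image.open("decoy.png").convert("RGB")).unsqueeze(0).to(device)

with torch.no_grad():
    target = embed(decoy)  # features the model should "see" instead

eps = 0.03  # perceptual budget: max per-pixel change, barely visible to humans
delta = torch.zeros_like(art, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.005)

for step in range(200):
    cloaked = (art + delta).clamp(0, 1)
    # Pull the cloaked image's features toward the decoy's features.
    loss = torch.nn.functional.mse_loss(embed(cloaked), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Keep the change imperceptible: clamp the perturbation to the budget.
    with torch.no_grad():
        delta.clamp_(-eps, eps)

result = (art + delta).clamp(0, 1).detach().squeeze(0).cpu()
T.ToPILImage()(result).save("cloaked.png")
```

The key property is the tiny `eps` budget: the saved image looks essentially unchanged to a person, while its feature-space representation has drifted toward the decoy.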

What Nightshade does is something different. Nightshade targets these base models, the larger models that companies are training. In training, what they do is take millions and millions of images from online. And of course, there are still a lot of open questions in the legal system about legality, licensing agreements, lack of consent and so on. But Nightshade is basically designed to make that process more expensive by introducing a tiny poison pill inside each piece of art, so that when these models take in too many of these images, the poison starts to accumulate, and the models get really confused about what a cat actually looks like, what a dog actually looks like. And when you ask one for a cow, it will probably give you a 1960s Ford Bronco. And really, the goal isn’t necessarily to break these models. The goal is to make unlicensed training data, the kind you get by just scraping the internet, expensive enough that these companies actually start thinking about licensing content from artists to train their models on.
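
Nightshade’s actual perturbations also live in pixel space, but the accumulating-poison dynamic Zhao describes is easier to see with a simpler stand-in technique: plain label-flipping, where some fraction of scraped image-caption training pairs carry a decoy caption. The sketch below is purely illustrative; the dataset, captions and poison rate are hypothetical, and this is not how Nightshade itself works.

```python
# Illustrative sketch only: NOT the Nightshade algorithm. Nightshade perturbs
# pixels so an image still looks normal to humans; this toy instead flips
# captions, the simplest form of training-data poisoning, to show how a
# concept ("cow") can drift toward a decoy as poisoned samples accumulate.
import random

# A scraped text-to-image training set: (image_file, caption) pairs.
dataset = [
    ("cow_001.jpg", "a cow grazing in a field"),
    ("cow_002.jpg", "a brown cow near a fence"),
    ("dog_001.jpg", "a dog catching a frisbee"),
    # ... millions more in a real scrape
]

POISON_TARGET = "cow"  # concept to corrupt
DECOY_CAPTION = "a 1960s Ford Bronco parked on a dirt road"
POISON_RATE = 0.5      # fraction of target samples to poison

def poison(pairs):
    """Relabel a fraction of target-concept samples with the decoy caption.
    A model trained on enough of these pairs starts associating 'cow'
    prompts with Bronco-like images."""
    out = []
    for image, caption in pairs:
        if POISON_TARGET in caption and random.random() < POISON_RATE:
            out.append((image, DECOY_CAPTION))
        else:
            out.append((image, caption))
    return out

for pair in poison(dataset):
    print(pair)
```

The economics Zhao points to follow from this: a scraper cannot cheaply tell which samples are poisoned, so cleaning the data, or licensing it instead, becomes the rational move.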

Ryssdal: Well, as you say, it’s the subject of at least one, and probably many, many more lawsuits to come. It does seem a bit, though, like a black hat hacker versus white hat hacker thing, where the people writing these AI models will eventually take your tools apart and figure out how to get around them. And so we have something like an AI arms race.

Zhao: Sure, that may well happen. And actually, you know, most security systems are like that. It’s an ever-evolving kind of system. But that doesn’t mean it has no effect. The point of security is to raise the cost for the other side, whether you’re talking about defenses or attacks. So in this case, all we’re trying to do is increase the cost of unauthorized training on scraped data.

Ryssdal: Have you heard from your computer science colleagues who work on Stable Diffusion or Midjourney? Are they like, “Ben, what are you doing, man? You’re killing us”?

Zhao: No, actually, surprisingly, the feedback has been great. Sure, some people at some companies are like, “Yeah, you know, that seems a little aggressive. Maybe we can talk, maybe we can, you know, negotiate.” But I think what it does is try to balance the playing field, because, if you think about it, creatives, whether they’re individual artists or actual companies that own IP [intellectual property], can’t do anything against AI model trainers right now. Nothing. The best they can do is sign up for an opt-out list somewhere and hope the AI trainers are kind enough to honor it, because there’s no way to enforce it, no way to verify it. And that’s for the companies that actually care. Smaller companies, of course, will just do what they want, and there will be no consequences.

Ryssdal: You know, when ChatGPT and all the others first came on the scene, and those of us who aren’t in the field became familiar with it, it seemed, and I’m exaggerating here, but it seemed a little like magic. And maybe this is a silly question, but is this hard computer science?

Zhao: Hard? I mean, it took a few Ph.D.s to come up with the idea. It’s reasonably interesting and unusual in the sense that, if it were easy, someone would have done it a long time ago. So yes, I think it’s hard. But you know, if research were easy, it wouldn’t be interesting.

Ryssdal: Yeah. So, I was going to let you go, but what you said piqued my interest. You said it was unusual. As you see it, as a guy in this field, deep in this field, what’s at stake here?

Zhao: Ah, boy. In terms of stakes, if I step back and look at what’s happening today with generative AI and human creators, I think, “Boy, everything is at stake.” Because what we’re seeing are tools being deployed without regulation, without ethical guardrails, without regard for the real harm to human creators in these industries. And if you push it forward, it gets worse. These models rely on human creativity to fuel them, to help train them, to help them improve. If there is no more training data, they won’t get much better. And so even AI people who are very excited about these models and the future have to be aware of that, and aware of the fact that you need to support human creators in that future. Because otherwise, these models will just shrivel up and die.


