The Senate bill would allow individuals to sue over AI that uses their content without permission.


A bipartisan group of senators introduced a bill this week aimed at protecting artists and journalists from having their work used to train AI models, or from having AI content produced from it, without their consent.

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would require the National Institute of Standards and Technology (NIST) to create guidelines and standards for provenance details, such as watermarks indicating where AI content comes from, and would make it illegal to remove, disable, or tamper with that "content origin" information.

The bill would also allow individuals to sue for violations and give the FTC and state attorneys general the authority to enforce its requirements.

"Artificial intelligence has given bad actors the ability to deepfake anyone, including members of the creative community, copy their likenesses without their consent, and profit from fake content," said co-sponsor Sen. Marsha Blackburn, a Tennessee Republican. "The COPIED Act takes an important step toward better defending common targets like artists and performers against deepfakes and other inauthentic content."

The COPIED Act "will provide much-needed transparency around AI-generated content [and] put creators, including local journalists, artists, and musicians, back in control of their content with a provenance and watermarking process that I think is sorely needed," said Senate Commerce Committee Chair Maria Cantwell, a Washington Democrat.

Artists, writers, and publishers are already waging legal battles against AI companies, most notably The New York Times' lawsuit against OpenAI and Microsoft. Last year, Google was also sued for scouring the internet for data to train its Gemini AI.

Some publishers have struck deals granting AI companies permission to use their data, which can be a more lucrative option than suing multibillion-dollar tech companies. In the music world, YouTube is reportedly in talks with record labels about using specific artists' catalogs to train its upcoming AI tools.

Deepfakes have also been used to exploit artists' likenesses without permission to promote shady crypto schemes and even porn.


Several groups have endorsed the COPIED Act, including the Recording Industry Association of America (RIAA). "Leading tech companies refuse to share basic data about the creation and training of their models because they can copy and use copyrighted material without a license to create synthetic recordings and benefit from it," says Mitch Glazier, chairman and CEO of the RIAA.

Glazier added that the bill would “provide much-needed visibility into AI development and pave the way for more ethical innovation and fair and transparent competition in the digital marketplace.”

Other supporters include SAG-AFTRA, Nashville Songwriters Association International, Recording Academy, National Music Publishers' Association, News/Media Alliance, National Newspaper Association, America's Newspapers, Rebuild Local News, Seattle Times, National Association of Broadcasters, Artist Rights Alliance, Human Artistry Campaign, Public Citizen, The Society of Composers and Lyricists, Songwriters Guild of America, and Music Creators North America.
