More than 200 music artists, including stars such as Nicki Minaj, Katy Perry, Billie Eilish, Stevie Wonder, J Balvin and Jon Bon Jovi, have signed an open letter warning against the “predatory use of AI” in the music industry.
On Monday, the artists’ group released a letter that acknowledged AI’s “tremendous potential to advance human creativity” but also warned that powerful companies could use their original work to train artificial intelligence models and eventually replace human musicians.
“We must protect against the predatory use of AI to steal the voices and likenesses of professional artists, violate the rights of creators, and destroy the music ecosystem,” the letter said.
It calls on tech companies, AI developers and digital music services to commit not to develop or use AI-powered technology that harms songwriters and artists or deprives them of adequate compensation for their art.
While the letter certainly makes a statement, things get a little more complicated when it comes to companies complying, says Michael Huppe, president and CEO of SoundExchange, which collects and distributes digital performance royalties. Huppe is also a professor of music law at Georgetown University.
Streaming platforms and tech companies “can’t turn a blind eye to these kinds of concerns from the creative community,” he tells CNBC Make It. “But, unfortunately, if they feel they can do something without getting the proper license or without the proper permission, some of them will do it and some won’t.”
In a blog post on March 29, ChatGPT creator OpenAI revealed a new AI tool capable of creating a realistic clone of someone’s voice from a 15-second audio clip. The company noted the risks voice-cloning technology presents and said the tool has not been released to the public at this time.
“We believe that any broad deployment of synthetic voice technology should be accompanied by voice authentication experiences that verify that the original speaker is knowingly adding their voice to the service, and a no-go voice list that detects and prevents the creation of voices that are too similar to prominent figures,” OpenAI said in the blog post.
Although US federal copyright laws offer artists and music labels protections against blatant infringement of their work, those laws can be difficult to apply to AI-generated content that merely imitates an artist’s voice or general style without directly copying their recordings or music.
Huppe says the laws currently on the books could use an update to keep up with the rapid growth of AI technology and prevent it from being used irresponsibly.
“An artist’s voice is their brand,” he says. “If you’re using someone’s voice or likeness without permission, that’s beyond copyright law because it’s stealing their brand for commercial gain.”
To that point, Tennessee, known as a music and entertainment hub, recently became the first state to pass legislation protecting musicians from AI voice-cloning technology. The Ensuring Likeness Voice and Image Security (ELVIS) Act, which takes effect July 1, expands existing state laws that protect artists’ names, images and likenesses from misuse to also cover their voices.
Huppe says AI innovation isn’t all bad news for the music industry: the technology has potential for collaboration if used responsibly.
One way to do this, he says, is to secure artists’ consent before using their voices in AI-generated music, pay them appropriately, and give them proper credit.
“As long as you have consent, credit and compensation, many artists and creators will likely happily collaborate with AI,” he says. “It’s a way of doing things.”