The Recording Industry Association of America (RIAA) now considers AI voice cloning a potential copyright infringement threat and wants the US government to include it in its piracy watchdog list.
In a submission to the US Trade Representative (USTR), the RIAA asked the government to include the category of AI voice cloning in its annual list of entities that reportedly promote piracy or counterfeiting. The Review of Notorious Markets for Counterfeiting and Piracy often includes specific companies, websites, or territories to warn users of potential copyright infringement. For example, AliExpress, WeChat, and The Pirate Bay have been on the list for years.
The RIAA called out only one company in the space, Voicify.AI, which provides voice models of famous music artists like Ariana Grande, Taylor Swift, and Kanye West.
The organization believes the website lets users rip YouTube videos and copy an a cappella track before using an AI voice model to modify it. The RIAA said, “This unauthorized activity infringes copyright as well as infringing the sound recording artist’s right to publicity.”
“The year 2023 saw an eruption of unauthorized AI vocal clone services that infringe not only the rights of the artists whose voices are being cloned but also the rights of those that own the sound recordings in each underlying musical track,” the RIAA said in its comment letter.
It added that AI voice cloning has led to “an explosion of unauthorized derivative works.”
The Verge reached out to both the RIAA and Voicify for comment.
AI voice cloning blew up in the past year, especially after a song using AI versions of Drake and The Weeknd’s voices went viral in April, raising significant questions about its copyright status. Even though YouTube took down the video with the song after a sternly worded letter from Drake’s music label, the copyright claim was not over the unauthorized use of Drake’s or The Weeknd’s voices but over a music sample.
Several other voice cloning companies have sprung up since then, with some platforms actively marketing toward musicians or podcasters. Some music artists have also embraced AI voice platforms. Grimes and Holly Herndon created AI versions of their voices that other people can use, though both retain control over how those voices are distributed. Record labels like Universal Music Group (Drake’s label, by the way) have reportedly been in talks with Google to license their artists’ voices for AI models.
While the RIAA said it is concerned that websites like Voicify could encourage more unauthorized use of someone else’s voice, it isn’t yet clear how the law will treat the argument that AI voice cloning violates the right of publicity. Protection for someone’s likeness, such as their face, name, or voice, differs from state to state.
Even though some believe rights holders should go after AI tools that infringe on likeness protections, aggressively pursuing that legal action could also upset the balance the music industry has struck with tribute acts and cover bands.
The RIAA also named several stream-ripping sites, unauthorized music download platforms, and BitTorrent indexing sites in its comment letter.
The USTR typically releases its final review of notorious markets at the start of the following year. The office takes recommendations from industry groups on which sites to include and may not add AI voice cloning as a category in the 2023 review.