Spotify Is Filling Up With AI-Generated Music to Scam Revenue From Real Bands


And Spotify doesn’t seem to care.

Phony Tonk

The age of artificial intelligence-generated music is upon us — and already, hucksters are gaming Spotify to profit from their shitty wares.

As Slate reports, a group of rabble-rousing country music fans discovered what’s essentially a stream-stealing scheme that involves AI covers of songs being placed in otherwise legit music playlists to rack up millions of listens.

With generic-sounding names like “Highway Outlaws” and “Waterfront Wranglers,” these almost certainly bogus bands all followed the same pattern: tens or hundreds of thousands of streams, zero original songs, bios that sounded an awful lot like ChatGPT wrote them, and no social media footprint.

As one intrepid user on the r/CountryMusic subreddit wrote, this “artist-robbing scam” was uncovered after one of the forum’s moderators discovered one such band and went down the rabbit hole of “similar to” artists, all of whom seemed to be equally fake.

“When [the moderator] looked at ‘similar to’ artists, he discovered a huge cluster of identical AI ‘bands’ with massive monthly [listeners],” u/calibuildr wrote. “They were all on playlists like ‘summer country vibes’ and clearly there’s some kind of inauthentic engagement going on here.”

Label Mate

To get to the bottom of this bizarre debacle, Slate reached out to 11A, a label that claims to represent the apparent scam bands in question.

A representative for the alleged label, which has an expired domain name and a 117-follower Facebook page that’s been inactive for at least three years, insisted that it has documents showing the participation of human artists in the making of the covers. When pressed to provide more information, however, the spokesperson didn’t reply.

Strangely enough, the alleged AI covers disappeared during Slate’s reporting, and Spotify insists it didn’t take them down.

“Spotify does not have a policy against artists creating content using autotune or AI tools, as long as the content does not violate our other policies, including our deceptive content policy, which prohibits impersonation,” a Spotify representative told the website. “In this instance, the content was removed by the content providers.”

Obviously, this problem isn’t exclusive to country music covers. As calibuildr told Slate, “this has been going on for several years, with ambient music and with electronic music and jazz.” The blog Metal Sucks also uncovered similar scammy AI renditions of metalcore songs that similarly seemed to “hijack” legit bands.

With Spotify seeming to have no problem with AI music, it’s up to the labels of the bands whose work is being covered by computers to get the fakes taken down — or, as in the case of the phony country covers, for the “content providers” to do it themselves.

More on AI music: Major Record Labels Sue Music AI Startups for Copyright Infringement
