If a song is created by artificial intelligence and listened to by a bot, was it even heard at all? It’s a problem music-streaming companies now face as generative AI is rapidly making it easier for anyone to churn out songs with a few clicks, and then send bots to stream them for cash.
“It’s a floodgate,” says Tony Rigg, a lecturer in music industry management at the University of Central Lancashire in the UK. He’s talking about the arrival of AI-generated music. And that torrent of new music amplifies the issue of fake listening, giving people a simple way to get streams on low-quality tracks.
Artificial streaming, or bot listening, isn’t new. Some artists and uploaders turn to third-party services that promise to boost streams, which then enlist bot accounts to play the same playlists on repeat. It’s a problem because streaming companies divide royalty payments from a limited pool of cash: the more a song is played, the more its creator earns. More money for songs streamed by bots can mean less for artists with human fans. Human artists have already been caught up in artificial streaming scandals, but AI is adding a new element.
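The pro-rata model described above can be sketched in a few lines of code. This is an illustrative toy, not any platform's actual accounting: the artist names, stream counts, and pool size are all hypothetical, and real payout formulas are far more complex and private.

```python
# Illustrative sketch of a pro-rata streaming royalty pool.
# All names and figures are hypothetical.

def payouts(stream_counts, pool):
    """Split a fixed royalty pool in proportion to stream counts."""
    total = sum(stream_counts.values())
    return {artist: pool * n / total for artist, n in stream_counts.items()}

POOL = 1000.00  # fixed pot of royalty money, in dollars

# Without bot activity: two human artists split the whole pool.
honest = payouts({"artist_a": 600, "artist_b": 400}, POOL)
# → artist_a gets $600, artist_b gets $400

# With bot activity: a botted track's streams come out of the
# same fixed pool, halving what the human artists earn.
with_bots = payouts({"artist_a": 600, "artist_b": 400, "botted": 1000}, POOL)
# → artist_a gets $300, artist_b gets $200, the botted track gets $500
```

Because the pool is fixed, every fraudulent stream directly dilutes the per-stream value for everyone else, which is why platforms treat fake listening as theft from legitimate artists rather than a victimless inflation of numbers.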
The first major test case came last week, when Spotify reportedly removed tens of thousands of songs created with the AI music generator Boomy and uploaded to the platform. These made up a small percentage of Boomy’s total output, but they included songs suspected of being streamed by bots, according to the Financial Times. Spotify did not respond to a request for comment to confirm the removal, but the platform does have policies against fake streaming.
Boomy uploads to Spotify were paused last week, but resumed May 6. Alex Mitchell, Boomy’s CEO and founder, says that the company is “categorically against any type of manipulation or artificial streaming.” Mitchell also says Boomy has a system in place to respond to suspicious streaming notices sent by streaming companies, and can move to freeze payments and block users who may be manipulating the system for gain. But a large volume of artificial streams still got through before being caught, showing just how prolific these scams have become.
Fake streaming is an industry-wide problem that goes beyond AI-made music. A study from France’s Centre National de la Musique, a public-private organization focused on the French music industry, found that between 1 and 3 percent of all music streamed on popular platforms in France in 2021 was played by bots, an estimated 1 billion to 3 billion fake streams.
Competition for listeners is fierce. There are more than 100 million songs on Apple Music, Amazon Music, and Spotify, and many are rarely, if ever, played. With rapid advancements in generative AI, a deluge of new songs is expected to hit streaming platforms.
And, as with so much in generative AI, it’s happening fast. Last month, a song faking the voices of Drake and The Weeknd went viral, and was booted from Spotify. As the technology becomes more convincing, the impersonations are duping more and more people. Scammers allegedly used AI to generate new tracks with Frank Ocean’s voice, and sold them to fans for thousands of dollars, luring people with a promise of leaked songs.
Some artists are embracing AI. Last week, Grimes announced she would let people make music using an AI version of her voice in exchange for half the royalties, and singer Holly Herndon has deepfaked her voice, allowing her to “sing” in languages she hasn’t learned.
Universal Music Group, home to some of the world’s biggest musicians, has pushed back aggressively against AI training on its artists’ work, arguing that such training could infringe on copyright. But artists themselves could begin manipulating their own voices with AI, reducing the labor and financial investment needed to make new music, says Albert Soler, an attorney and owner of a music and entertainment management firm. “AI creates an opportunity for a flood of income that doesn’t exist today, with the artist having to do absolutely nothing to get there,” he says. The tech could help artists make money in the recording and streaming industries that often exploit them, but it also may add to the problem of derivative songs.
But all artists are up against artificial tracks like the Frank Ocean dupes and artificial streaming bots, both of which erode trust in the music industry. With limited time and endless options, people may select songs because they come from favorite artists, or because they’re popular on streaming services. But if those streams are fraudulent or the music doesn’t belong to the performer it mimics, that pulls attention away from real artists. Fake streams may also distort the algorithms that determine popularity and recommend music to listeners, Rigg says.
When bots are listening to music made largely by bots, they undermine an entire creative industry. Artists have been pushed to major streaming platforms to make paltry royalties off their recorded work. But they’re struggling to keep up with the way generative AI is uprooting the music industry. Even Spotify’s CEO Daniel Ek acknowledged last month that he has yet to see “anything moving as fast as the development of AI.” If Spotify doesn’t catch up, it’s human artists who will be left behind.