
Spotify’s Attempt to Fight AI Slop Falls on Its Face

It’s been clear for a while that a deluge of AI slop is drowning out real music and human artists on Spotify.

The platform has become overrun by bots and AI-spun trickery, which have been actively siphoning revenue away from real bands.

Earlier this year, a self-proclaimed “indie rock band” called The Velvet Sundown racked up millions of streams on the streaming service using AI-generated songs. Weeks later, the company was caught populating the profiles of long-dead artists with new AI-generated songs that have nothing to do with them.

Now Spotify has finally acknowledged the problem, announcing new policies to protect artists against “spam, impersonation, and deception.”

“At its worst, AI can be used by bad actors and content farms to confuse or deceive listeners, push ‘slop’ into the ecosystem, and interfere with authentic artists working to build their careers,” the company wrote. “That kind of harmful AI content degrades the user experience for listeners and often attempts to divert royalties to bad actors.”

Spotify head of marketing and policy Sam Duboff told reporters at a press briefing that 15 record labels and music distributors had committed to the changes already, The Verge reported.

The company is also planning to roll out a new spam filter that can detect common tactics used by spammers to game Spotify’s royalties system.

“Left unchecked, these behaviors can dilute the royalty pool and impact attention for artists playing by the rules,” the company wrote in its press release.

But just one day later, a new AI scandal on Spotify showed the magnitude of the undertaking.

The issue arose when an acclaimed and long-dormant side project by Bon Iver frontman Justin Vernon, called Volcano Choir, unexpectedly uploaded a new single called “Silkymoon Light” after being on hiatus for more than a decade.

The problem was that the track clearly wasn’t a real Volcano Choir song — and bore obvious hallmarks of AI generation, like robotic vocals and a glitchy album cover.

“We’re aware of this issue and have removed the content,” Spotify told us in a statement. “That change should be reflected on the platform soon. Across the music industry, AI is accelerating many of the problems that already exist — e.g. spam, fraud, and deceptive content — which is why we recently announced new policies aimed at making Spotify a more transparent, fair, and trustworthy platform for artists and listeners.”

“Because music flows through a complex supply chain of labels and distributors, bad actors can sometimes exploit gaps in the system to push incorrect content onto artist profiles,” it continued. “We’re testing new prevention tactics with leading artist distributors to equip them to better stop these attacks at the source, and investing more resources into our content mismatch process, reducing the wait time for review, and enabling artists to report ‘mismatch’ even in the pre-release state.”

In other words, Spotify may be on board to clean up its platform, but the technical hurdles are clearly immense.

The use of AI in the music industry has become a major point of contention, especially when it comes to impersonation. We’ve seen countless tracks featuring the cloned vocals of famous music artists go viral online, a trend that has already resulted in prolonged legal battles.

Under Spotify’s impersonation policy, the company will “remove music” found to be replicating “another artist’s voice without that artist’s permission,” even when it’s labeled as an “‘AI version’ of the artist.”

“Some artists may choose to license their voices to AI projects — and that’s their choice to make,” the company’s press release reads. “Our job is to do what we can to ensure that the choice stays in their hands.”

The company will be working with the Digital Data Exchange (DDEX), a not-for-profit dedicated to the creation of digital music value chain standards, to establish common “AI disclosures in music credits.”

“As this information is submitted through labels, distributors, and music partners, we’ll begin displaying it across the app,” the statement reads.

It remains to be seen whether Spotify’s new policies will stem the tidal wave of AI slop proliferating on its platform, let alone whether they’ll be meaningfully enforced.

And it’s not clear how Spotify will ferret out artists that don’t cop to their use of AI. Initially, The Velvet Sundown denied it was using the tech, but it eventually updated its bio on Spotify, referring to itself as an “ongoing artistic provocation” that made considerable use of AI.

Even after all that drama, The Velvet Sundown’s music can still be streamed on Spotify. Some of its lazily generated songs have amassed over three million listens to date, generating royalties that could’ve easily gone to human musicians instead.

And, though a Spotify spokesperson acknowledged the dodgy Volcano Choir song on a call, it currently remains live on the band’s page.

More on AI music: Startup Using AI to Bring Whitney Houston Back on Tour 13 Years After Her Death
