AI

Spotify Allows AI-Generated Songs To Be Published Under Dead Artists' Names

Spotify is under fire once again, this time for allowing AI-generated songs that replicate the voices and styles of deceased artists to be uploaded without any meaningful oversight. The revelation has sparked widespread backlash from fans, musicians, and ethics experts, who warn that this trend could spiral into a dangerous digital-resurrection free-for-all.

No Filters, No Permission, No Boundaries

According to Blunt Magazine, Spotify currently has no automated system or official policy in place to vet or restrict the publication of AI-generated songs, even when they impersonate dead artists. This means virtually any user or label using generative AI can release songs under the guise of musical legends, from Tupac to Amy Winehouse, without the consent of their estates or families.

The implications are unsettling. These artificially created tracks can easily deceive listeners, blur the line between real and fake, and exploit the legacy of deceased icons for commercial gain.

The Deepfake Music Era Has Arrived

The rise of AI voice cloning technology has made it shockingly easy to replicate a singer’s voice, emotional tone, and signature delivery. Multiple viral TikTok accounts and underground labels have been releasing eerie “new” music from artists who’ve been dead for decades, all generated by machine learning models trained on their original recordings.

Fans describe the experience as “haunting” and “disturbing,” while music ethicists are calling for immediate industry-wide reforms.

“This is a clear violation of legacy and intellectual identity,” said Dr. Alina Riaz, a digital rights expert. “Just because an artist has passed doesn’t mean their voice is public domain. It’s exploitation masquerading as innovation.”

Spotify’s Silence Raises More Alarms

Despite increasing scrutiny, Spotify has not taken a clear stance on how it plans to address AI-generated content that mimics real people, living or dead. Unlike YouTube, which at least requires creators to disclose AI-generated impersonations of real voices, Spotify continues to quietly host and profit from synthetic tracks without labels or restrictions.

Critics argue this silence encourages unethical behavior and opens the floodgates to “deepfake music” designed to manipulate nostalgia and maximize streams.

The AI-music boom is colliding with a lack of regulation and platform accountability. With no laws currently governing digital voice ownership after death, dead artists could be turned into perpetual content farms, performing songs they never agreed to sing, in genres they never endorsed, for audiences they never imagined.