The landscape of music creation is being rapidly reshaped by Artificial Intelligence, with a surge in powerful tools that can generate original compositions in mere moments. This technology offers unprecedented opportunities for creators but has also opened the door to a new wave of sophisticated fraud, leaving artists and streaming platforms struggling to keep up.

A New Generation of Creative Tools

A wide array of AI music generators is now available, each offering distinct features tailored to different creative needs. For content creators and filmmakers on tight deadlines, platforms like AIVA, Amper Music, and Ecrett Music provide user-friendly interfaces to produce high-quality, royalty-free background scores simply by selecting parameters such as genre and mood. Others, like Soundraw, offer deeper customisation, allowing users to fine-tune the AI-generated tracks to their specific requirements.

The application of this technology extends deep into the music industry. Tools such as Melodrive are generating interest among game developers for their ability to create adaptive music that reacts in real time to a user’s actions. For aspiring musicians, Boomy offers a pathway to not only create original songs but also distribute them on major streaming platforms, potentially monetising their work. Meanwhile, platforms like Endlesss are fostering a new form of remote collaboration, enabling artists to build tracks together in real time. Songwriters are also turning to AI, with tools like Amadeus Code assisting in the creative process by generating new melodic ideas. Leading the charge in 2025, innovators like Suno.ai and Mubert are pushing boundaries further with advanced features, including text-to-music conversion and real-time streaming.

Artists Targeted by AI-Generated Fakes

While the creative potential of AI is clear, a darker side has emerged. A recent string of incidents has seen AI-generated songs fraudulently uploaded to major streaming services, including Spotify, falsely credited to established, living musicians. As reported by the BBC, this issue has impacted numerous artists, with folk singer Emily Portman and singer-songwriter Josh Kaufman among the most prominent recent examples.

The case of Emily Portman unfolded a month and a half ago when she began receiving messages from fans praising her new album, “Orca.” The only problem was that she hadn’t released any new music this year. She soon discovered the entirely AI-generated album was listed on all major streaming sites under her name, falsely identifying her as the artist, writer, and even the copyright holder. Portman was then left with the arduous task of contacting each platform individually to have the fraudulent music removed.

A Slow Response from Streaming Giants

While some services acted swiftly, Portman noted that Spotify took a full three weeks to remove the fake album. More than a month after she first reported the issue, the album still appeared under her name on the platform, and she had yet to regain full control of her official artist profile.

This experience highlights a concerning vulnerability for independent artists. Portman suggests that fraudsters may be deliberately targeting less famous musicians, assuming they lack the influence and resources of major stars who can pressure platforms to remove fraudulent content more quickly. This view is shared by Tatiana Cirisano, a media and technology analyst at MIDIA Research. “I would think that the AI counterfeiters are targeting lesser-known artists in the hopes that their schemes will fly under the radar, compared to a superstar who could get Spotify on the phone right away,” Cirisano commented, while also noting that platforms are gradually improving their detection capabilities.

Calls for Greater Protection

New York-based musician and producer Josh Kaufman faced a similar situation when friends and fans contacted him about new tracks that were not his. He remarked that it was obvious to most listeners “that it was someone using my artist profile to release weird, clearly computer-generated music.”

Kaufman criticised how simple it is for fraudsters to upload music under a false name, leaving legitimate artists to deal with the fallout. Although Spotify stated that the tracks in question had been removed, it remains unclear in these cases who is profiting financially from the streams the fake songs generated. The incidents have prompted urgent calls from the music community for streaming services to implement more robust verification and protection measures to prevent artists’ identities from being hijacked.