Musician Charged in $10 Million Royalty Scheme Built on AI-Generated Music and Streaming Bots

  • Musician Michael Smith charged with fraud and money laundering conspiracy after earning $10 million in royalties. 
  • Smith used AI to create music and deployed bots to stream his tracks on platforms like Spotify, Apple Music, and Amazon Music. 
  • Over seven years, his bots streamed his music billions of times, generating significant revenue.
  • Smith avoided detection by producing many tracks, ensuring no single song was streamed excessively. 
  • His songs had unconventional titles and artist names to obscure the fraud. 
  • Frustrated by the low income from legitimate music streaming, Smith turned to AI-generated content and bots in 2018. 
  • His fraudulent activity generated up to $1.2 million annually, and his income reached $110,000 monthly in 2019. 
  • By 2024, Smith’s tracks were streamed around 4 billion times, producing $12 million in royalties.
  • Despite warnings from a distribution company in 2018, Smith denied any wrongdoing. 
  • He now faces a potential 20-year prison sentence for his role in the scheme.

Main AI News:

Michael Smith, a musician from North Carolina, has been charged with fraud and money laundering conspiracy after allegedly amassing $10 million in royalties through an intricate scheme involving AI-generated music and bots. Smith, 52, initially pursued a legitimate music career but eventually shifted to a fraudulent operation, producing music with artificial intelligence and using bots to stream it on platforms like Spotify, Apple Music, and Amazon Music.

According to prosecutors, Smith orchestrated the scheme by acquiring numerous email addresses, which he used to create a network of bots that streamed his tracks billions of times over seven years. He carefully avoided detection by ensuring no single track was streamed excessively. His songs featured unusual titles, such as “Zygotic Washstands” and “Zyme Bedewing,” and his artist names, like “Calvinistic Dust” and “Callous Post,” mimicked the names of indie bands, further obscuring his actions.

Smith’s shift to this fraudulent model began after his attempts to generate income through legitimate streaming yielded minimal results. In 2018, he developed an extensive catalog of songs after collaborating with an AI music company. His use of bots to stream these tracks quickly generated significant revenue, with annual earnings reaching up to $1.2 million. By 2019, his monthly income surged to $110,000, which he shared with his co-conspirators.

Investigators uncovered evidence that Smith had carefully calculated his potential earnings. In a 2017 email, he estimated that his tracks were being streamed 661,440 times daily, which would translate to over $3,000 in revenue per day. By 2024, Smith had calculated that his AI-generated tracks had been streamed around 4 billion times, generating approximately $12 million in royalties.
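Those figures hang together under a plausible per-stream payout rate. As a rough illustration (not part of the indictment; the per-stream rate below is an assumption in the commonly cited range, while the stream counts come from the article), here is a back-of-the-envelope sketch in Python:

```python
# Back-of-the-envelope check of the figures cited in the case.
# The per-stream payout rate is an assumption (roughly $0.003-$0.005 is a
# commonly cited range); the stream counts are taken from the article.

DAILY_STREAMS = 661_440          # Smith's 2017 estimate of daily bot streams
ASSUMED_RATE_PER_STREAM = 0.005  # hypothetical blended payout per stream (USD)

daily_revenue = DAILY_STREAMS * ASSUMED_RATE_PER_STREAM
annual_revenue = daily_revenue * 365

print(f"Implied daily revenue:  ${daily_revenue:,.0f}")   # ~ $3,300 per day
print(f"Implied annual revenue: ${annual_revenue:,.0f}")  # ~ $1.2 million per year

# The long-run totals point to a rate in the same ballpark:
# 4 billion streams yielding ~$12 million implies about $0.003 per stream.
implied_rate = 12_000_000 / 4_000_000_000
print(f"Rate implied by lifetime totals: ${implied_rate:.4f} per stream")
```

Under that assumed rate, the 2017 daily estimate works out to roughly $3,300 per day and about $1.2 million per year, consistent with the annual earnings prosecutors describe.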

Smith’s activities drew scrutiny from the streaming industry, particularly in 2018 when a distribution company questioned possible “streaming abuse.” However, Smith denied any wrongdoing. He now faces a possible 20-year prison sentence for his elaborate scheme, which diverted millions in royalties from legitimate artists and songwriters entitled to the earnings.

Conclusion:

This case highlights significant vulnerabilities in the digital music streaming market, where AI-generated content and automated streaming manipulation can exploit platform payout systems for profit. As streaming becomes an increasingly dominant revenue source in the music industry, platforms will likely need to invest in more sophisticated fraud detection technologies. The rise of AI-generated content also presents ethical and regulatory challenges, as it blurs the line between legitimate artistry and manipulation. For the market as a whole, the case underscores the need for tighter controls and greater transparency to maintain trust among legitimate creators and platforms.
