33% of Apple Music Uploads, 44% of Deezer Uploads Are AI-Generated. But is Anyone Listening?
- Michele
- 11 May 2026, Monday
According to recent data from Deezer and Apple Music, the number of AI-generated tracks uploaded to the platforms continues to climb steadily. At the same time, the number of people who actually listen to these releases is low. What conclusions can we draw?
AI Music on Streaming Platforms: No Signs of a Decline in Uploads
This week, Deezer released new data on the share of AI-generated music that reaches its platform. Around the same time, Oliver Schusser from Apple Music, speaking on a podcast, provided exclusive insight into the proportion of synthetic tracks on the platform. Taken together, the data suggests what many already suspected: there is no sign that AI-generated tracks will decline now or anytime soon.
Deezer: 44% of Tracks Are AI-Generated
Let’s look at the new numbers. According to Deezer's new press release, the platform is now receiving “almost 75,000 AI-generated tracks per day, representing roughly 44% of daily uploads. This amounts to more than 2 million AI-generated tracks uploaded per month.” The number has risen since January 2026, when synthetic tracks made up around 39% of all Deezer uploads. However, fully AI-generated music still accounts for only 1–3% of streams on Deezer, and 85% of those streams are flagged as fraudulent.
Deezer CEO Alexis Lanternier comments: “AI-generated music is now far from a marginal phenomenon, and as daily deliveries keep increasing, we hope the whole music ecosystem will join us in taking action to help safeguard artists’ rights and promote transparency for fans.”
Moreover, he addresses the role of its detection technology, which is integral to Deezer’s AI music policy: “Thanks to our technology and the proactive measures we put in place more than a year ago, we have shown that it’s possible to reduce AI-related fraud and payment dilution in streaming to a minimum. Since January, we have made our detection technology available for licensing, and we’re looking forward to seeing industry peers of all kinds join us in the fight for fairness in the age of AI.”
Apple Music: One Third of Tracks Are Synthetic but Receive No Attention
Deezer is not the only platform experiencing a significant surge in AI-generated music. Apple Music’s VP Oliver Schusser recently revealed in an interview on Billboard’s ‘On The Record’ podcast that more than a third of the music delivered to the platform each month is fully AI-generated. However, this music attracts less than 0.5% of listeners, a tiny fraction of Apple Music’s user base.
Similar to Deezer, Apple Music has reportedly developed its own technology for detecting AI-generated music. The system exists alongside Apple’s recently introduced Transparency Tags, which allow labels and distributors to voluntarily disclose when content was created with AI assistance. The difference between the two lies in who provides the transparency. The Tags rely on self-reporting from rights holders, while Apple’s internal detection system acts as an independent layer of verification.
In this way, Apple is taking a different approach from Spotify, which recently announced that it will rely on artists and labels to disclose the use of AI in tracks uploaded to its catalog. By integrating both systems into its platform, Apple signals that it is taking a much more systematic approach to cataloguing and monitoring AI content than it has previously let on.
AI Streaming Fraud Remains an Issue, But One That Seems Under Control
Apple also addressed the role AI-generated music plays in streaming fraud. Because synthetic tracks can be produced and uploaded at scale for almost no cost, they create obvious incentives for artificial streaming and royalty manipulation. Deezer’s recent findings underline how closely the two issues are connected: according to the platform, up to 85% of streams tied to fully AI-generated tracks were detected as fraudulent and subsequently demonetized.
According to Oliver Schusser, however, Apple’s own systems are already showing results: “The good news is our fraud penalty works incredibly well. We've seen a 60% reduction sort of over time in fraud, just because of the penalty.” For artists, that matters because reducing fraudulent streams means more revenue stays within the legitimate royalty pool.
What Can Artists Take Away From These Numbers?
What does all of this mean for artists? First, it confirms something many already assumed: listeners still respond far more strongly to human-made music than to synthetic output. In other words, personality, context, and cultural connection matter more, not less, even as the number of AI-generated tracks grows rapidly across major streaming services. So while the role of AI in music is not something to ignore, there is no need for excessive concern, at least not for now.
At the same time, platforms are beginning to build more structured approaches around AI music. From Apple’s disclosure tags to internal detection tools, the industry appears to be heading toward a setup where AI-related content is more closely monitored and labelled. For artists using AI in their workflow, this also means increasing expectations around transparency in how it is used.
For artists, this also has financial implications. Better fraud detection could mean fairer royalty distribution, with less of the streaming pool being diluted by artificial activity and more of it reaching real listeners and the artists behind the music.
Michele is a Berlin-based writer passionate about music in its many forms, from soulful house, groovy techno, and breaky jungle/drum & bass to alternative rock, dark wave, and beyond. With experience in production, journalism, and DJing, they engage with the culture of sound from multiple perspectives. Their current topics of interest include club culture, music discovery & curation, dance, and the ways music affects perception & feeling. Michele writes in English.