Deezer Says ~39% of New Uploads Are AI—So Why Are Your Favorite Artists Still Getting Blamed for ‘AI Slop’ (and Who’s Actually Cashing the Checks)?
Deezer’s “39%” is about what hits the intake pipe—not what listeners play. The real story is the gap between uploads, streams, and a fraud economy targeting royalties.

Key Points
- Separate the stats: Deezer’s 39% refers to daily AI uploads, while fully AI music accounts for about 0.5% of streams.
- Follow the money: Deezer links AI catalogs to fraud, saying it detects and demonetizes up to 85% of AI-track streams.
- Watch discovery: Deezer says it won’t recommend fully AI tracks—so catalogs can explode without changing what most listeners hear.
A number—39%—has been ricocheting around the music internet like a fire alarm.
It sounds like a verdict: AI has arrived, swallowed streaming whole, and is now what everyone is hearing. Except that’s not what the number means, and it’s not what listeners are doing.
In early 2026, Deezer reported that deliveries of AI-generated music averaged around 60,000 tracks per day in January 2026, representing about 39% of all music delivered to the platform each day. That statistic is real, but it describes uploads, not listening. You can flood a warehouse without changing what people buy.
The more revealing detail sits elsewhere: Deezer has said fully AI-generated music accounts for about 0.5% of total streams on the service. And within that small slice, Deezer and press reporting say fraud is rampant—so rampant that the “AI boom” looks less like a consumer trend than an attempted raid on the payout system.
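The gap between those two figures is easy to quantify. A back-of-envelope sketch using Deezer’s stated numbers (the platform-wide total is implied by the percentages, not something Deezer has published directly):

```python
# Back-of-envelope arithmetic from Deezer's publicly stated figures.
# The implied totals below are estimates, not published numbers.

AI_UPLOADS_PER_DAY = 60_000   # AI tracks delivered daily (Jan 2026, per Deezer)
AI_UPLOAD_SHARE = 0.39        # ~39% of all daily deliveries
AI_STREAM_SHARE = 0.005       # ~0.5% of total streams, per Deezer

# Implied total daily deliveries across the platform
total_daily_deliveries = AI_UPLOADS_PER_DAY / AI_UPLOAD_SHARE
print(f"Implied total deliveries/day: ~{total_daily_deliveries:,.0f}")

# Supply share vs. demand share: the mismatch the article is about
gap = AI_UPLOAD_SHARE / AI_STREAM_SHARE
print(f"Upload share is ~{gap:.0f}x the stream share")
```

The arithmetic implies roughly 150,000 total deliveries per day and an upload share nearly eighty times the listening share—exactly the divergence that points to supply-side automation rather than demand.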
“The headline isn’t that listeners switched to AI music. The headline is that upload pipes can be exploited at industrial scale.”
— TheMurrow Editorial
What Deezer actually said—and what the 39% figure is (and isn’t)
A quick look at Deezer’s earlier public numbers shows how fast the supply curve has moved. In June 2025, Deezer announced what it called the first AI tagging system for a streaming service and began speaking more bluntly about volume and fraud. By September 11, 2025, industry coverage cited Deezer saying 28% of new uploads were AI-generated. By January 2026, Deezer’s figure rose again to about 39%—a jump that signals a structural change in how music is being delivered, not necessarily how it’s being enjoyed.
The interpretive mistake is a common one: readers see “39%” and assume “39% of what I’m hearing.” That assumption collapses two different marketplaces into one. Uploads measure friction in the delivery pipeline; streams measure demand. When those metrics diverge sharply, the story is rarely aesthetic. It’s usually economic.
The metric that matters: streams, not submissions
A platform can be inundated while its users remain largely indifferent. That doesn’t make the flood harmless. It just changes what the flood is: not a takeover of culture, but a stress test of infrastructure.
The upload flood vs. what people listen to: why these numbers can both be true
Deezer’s “about 0.5% of total streams” claim suggests that—so far—listeners haven’t embraced fully AI-generated catalogs at anything like the rate those catalogs are being delivered. A sensible reading is that much of the new material is not being made to be heard by humans in the first place. It’s being made to be processed by systems: ingestion tools, recommendation engines, payout formulas.
That distinction matters because the moral panic tends to focus on taste: “AI songs are replacing artists.” The platform reality often points to something less romantic and more mechanical: if you can upload at scale, you can also probe a service’s weaknesses at scale.
Discovery is a choke point—and Deezer is tightening it
From Deezer’s perspective, this is both consumer protection and market hygiene. From an AI-music creator’s perspective, it’s a hard limit on reach—especially when the product depends on the platform doing the marketing.
The practical implication for listeners is subtle: you may not “see” much of what’s being uploaded, even if it’s pouring in by the tens of thousands of tracks per day. The platform can be packed and still feel normal.
“A catalog can explode in size without changing what the average listener hears—if discovery stays closed.”
— TheMurrow Editorial
The fraud engine hiding inside “AI music”: the money problem Deezer keeps pointing to
In June 2025, reporting cited Deezer estimates suggesting up to about 70% of streams of AI-generated music were fraudulent—bot or farm activity rather than human listening. By January 2026, Deezer’s own communications sharpened the claim: it said it detects and demonetizes up to 85% of streams on AI-generated music as fraudulent, removing them from the royalty pool.
Those are staggering numbers, not because they suggest AI music is popular, but because they suggest a portion of AI music is being used as a vehicle for extracting money—attempting to generate royalties through automated listening rather than winning attention.
What “demonetization” actually changes
The public is trained to read platforms culturally. Deezer is inviting readers to view the problem financially. A flood of synthetic tracks looks very different when you treat it as an attack on a payment system rather than a shift in taste.
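The payment-system framing is easiest to see in a simplified pro-rata model—a sketch with invented numbers (the pool size and stream counts are hypothetical), not Deezer’s actual royalty formula. In a pro-rata split, fraudulent streams dilute every artist’s per-stream payout until they are removed from the pool:

```python
def per_stream_payout(pool_eur: float, counted_streams: int) -> float:
    """Simplified pro-rata model: a fixed royalty pool split across counted streams."""
    return pool_eur / counted_streams

POOL = 1_000_000.0          # hypothetical monthly royalty pool (EUR)
HUMAN_STREAMS = 90_000_000  # hypothetical legitimate streams
BOT_STREAMS = 10_000_000    # hypothetical fraudulent streams on AI tracks

# If bot streams are counted, they siphon a share of the fixed pool:
naive_rate = per_stream_payout(POOL, HUMAN_STREAMS + BOT_STREAMS)
siphoned = naive_rate * BOT_STREAMS
print(f"Counted: {naive_rate:.6f} EUR/stream; {siphoned:,.0f} EUR diverted to bots")

# Demonetization removes them before the split, restoring the payout:
clean_rate = per_stream_payout(POOL, HUMAN_STREAMS)
print(f"Demonetized: {clean_rate:.6f} EUR/stream; 0 EUR diverted")
```

Under these toy numbers, 10% fraudulent streams divert 10% of the pool. The pool doesn’t grow when bots stream; the money is redirected from legitimate artists—which is why demonetization changes incentives rather than just statistics.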
A case study in incentives: why upload volume is such a tempting lever
No single statistic proves motive. But Deezer’s repeated emphasis on fraud—70% fraudulent streams (as reported in 2025) and up to 85% demonetized (per Deezer’s 2026 statement)—makes the platform’s interpretation clear: much of the “AI flood” is less art movement than monetization attempt.
“When a platform says it’s demonetizing 85% of AI-track streams as fraud, the genre isn’t the story. The business model is.”
— TheMurrow Editorial
Deezer’s response: labeling, recommendation limits, and a new kind of gatekeeping
The labeling matters for a simple reason: without it, listeners fill the void with suspicion. A song sounds slightly uncanny, a vocal line seems too perfect, a cover art style feels templated—and suddenly social feeds are full of allegations. Labels don’t solve the argument about whether AI music belongs on streaming. But they reduce the chaos of guessing.
The second intervention is quieter: don’t recommend it
Recommendation systems are the largest distributor of attention in the modern music economy. Saying “we won’t amplify this category” doesn’t ban it. It simply denies it oxygen.
Platforms have always shaped taste by shaping exposure. Deezer’s move makes that power explicit—and positions it as a defense of listeners and artists rather than an editorial preference.
The third intervention is economic: remove fraud from payouts
If Deezer’s fraud estimates are even close, then demonetization is not a marginal fix. It’s a structural response to a structural exploitation attempt.
Why fans blame artists anyway: misattribution, impersonation, and “AI slop” as a catch-all
In everyday usage, “AI slop” lumps together at least three distinct things:
- Fully AI-generated tracks uploaded at scale (often anonymous, high-volume, low-identity catalogs)
- Human-led music using AI tools (mastering, stems, vocal processing, writing assistance)
- Unauthorized voice cloning and impersonation (deepfake-style releases)
Deezer’s labeling is specifically about tracks it classifies as “100% AI-generated,” as noted in coverage. That’s a narrower category than what most people mean when they say “AI slop.”
The confusion is not merely semantic. It fuels accusations—especially when tracks appear under the wrong name. Spotify has publicly emphasized that a major abuse vector is impersonation and fraudulent delivery to the wrong artist profile, a problem that can make it look like a real artist released material they never touched.
Listeners can’t reliably tell—and Deezer says so
The result is a kind of digital witch hunt where the visible target is the artist, while the underlying mechanism may be distribution abuse. When labeling is inconsistent across platforms—and there is no universal standard requiring it—guesswork becomes the norm.
The missing standard: why the industry can’t agree on what to disclose
Deezer’s bet is that labeling and demotion can preserve trust. Other platforms may prioritize openness, fearing that labeling becomes a value judgment or a moderation nightmare. Some creators argue that tool-based creation is a spectrum and that rigid labels punish legitimate experimentation.
All of those concerns can be true at once. But the absence of standards has a cost: it externalizes the burden of detection to listeners, who are least equipped to do it, and to artists, who suffer when impersonation or spam lands on their pages.
Expert perspective: Deezer’s own framing
That framing may read as self-serving, especially because Deezer has also described licensing its detection technology as a B2B opportunity. Yet the existence of a commercial angle doesn’t invalidate the underlying issue. Fraud is real whether or not someone sells a solution.
Who benefits—and what to watch next
The upload spike suggests at least one group benefits: entities capable of generating and delivering music at enormous scale. If Deezer is right that a high percentage of AI-track streaming is fraudulent—up to 85% detected and demonetized, per Deezer—then part of the ecosystem is attempting to turn streaming payouts into a programmable income stream.
Deezer’s countermeasures—labeling, recommendation suppression, and demonetization—aim to break that loop. The open question is whether those defenses become a template across the industry or a differentiator that leaves other services more exposed.
What readers should track (practical implications)
- Discovery policy: If a platform demotes AI-generated tracks from recommendations, the cultural impact may stay limited even as catalog volume explodes.
- Fraud enforcement: Demonetization changes incentives. Weak enforcement invites industrial abuse.
- Attribution integrity: Impersonation and profile-hijacking are reputational landmines, not just platform housekeeping.
The larger lesson is uncomfortable: streaming has always been vulnerable to scale. AI didn’t create that vulnerability; it lowered the cost of exploiting it.
What to watch when you see an “AI is X%” headline
- ✓ Ask: Is the number uploads/deliveries or streams/listening?
- ✓ Check: Does the platform recommend this content or keep it out of discovery?
- ✓ Look: Are there stated fraud rates and demonetization policies?
- ✓ Verify: Could it be impersonation/misdelivery to the wrong artist profile?
The real story behind 39%: an attention economy meets an automation economy
The upload figure measures what automation can produce. The listening metric—about 0.5% of streams for fully AI-generated music, per Deezer—suggests culture is not moving as fast as automation. That gap between what can be produced and what can be loved is where the next phase of streaming economics will be fought.
If Deezer is correct that the majority of AI-track streaming it sees is fraudulent—reported as up to ~70% in 2025 coverage, and described as up to 85% demonetized in a 2026 Deezer statement—then the industry’s challenge isn’t a battle over authenticity. It’s a battle over incentives.
The future of music on streaming may hinge less on whether AI can write a chorus and more on whether platforms can prevent their royalty systems from being treated like a slot machine.
Frequently Asked Questions
Is 39% of the music on Deezer AI-generated now?
No. Deezer’s ~39% figure refers to daily deliveries/uploads, not the share of the overall catalog and not the share of listening. Deezer reported about 60,000 AI-generated tracks delivered per day in January 2026, which it framed as ~39% of all daily deliveries. Upload volume can rise dramatically without becoming what most users play.
How much of Deezer listening is actually fully AI-generated?
Deezer has said fully AI-generated music represents about 0.5% of total streams on the platform. That’s a small fraction compared with the upload share. The mismatch suggests a major supply-side surge without a corresponding demand-side takeover.
Why does Deezer say AI music is linked to streaming fraud?
Because Deezer and press reporting have described extremely high fraud levels in this category. In June 2025 reporting, Deezer estimates suggested up to ~70% of streams of AI-generated music were fraudulent. In a January 2026 statement, Deezer said it detects and demonetizes up to 85% of streams on AI-generated music as fraudulent, removing them from the royalty pool.
What is Deezer doing about AI-generated tracks?
Deezer has taken three major steps described in its communications and coverage: labeling fully AI-generated tracks on albums, not promoting AI-generated tracks in editorial playlists or algorithmic recommendations, and demonetizing streams it detects as fraudulent—removing those streams from royalty calculations.
Why do AI tracks sometimes appear under real artists’ names?
One major reason is impersonation and misdelivery—fraudulent uploads that land on the wrong artist profile. Spotify has highlighted impersonation and incorrect delivery as an abuse vector, which helps explain why fans sometimes think a favorite artist “released” an AI track when it may be distribution abuse rather than authorship.
Can listeners reliably tell whether a song is AI-generated?
Not consistently. Deezer has claimed research indicating many listeners cannot reliably distinguish fully AI-generated music from human-made tracks. That uncertainty fuels misattribution and online panic—especially when platforms don’t have clear, consistent labeling.