TheMurrow

Deezer Says 39% of New Uploads Are AI—So Why Are Humans Still Getting Paid Like It’s 2015? The Royalty Math That’s About to Flip Streaming

Deezer’s “39% of daily intake” isn’t about what people listen to—it’s about what floods the pipeline. And Deezer says most AI-track streams are fraud, forcing streaming’s payout logic into a rewrite.

By TheMurrow Editorial
April 4, 2026

Key Points

  • Deezer says 60,000 AI tracks/day now hit its catalog—39% of daily intake—a supply shock that reshapes discovery and payouts.
  • Track the distinction: uploads ≠ streams ≠ revenue. AI may flood inventory while remaining a small share of actual listening.
  • Follow the money: Deezer claims up to 85% of streams on fully AI-generated tracks are fraudulent—and says it demonetizes them.

A new song arrives on a streaming service every second. Then another. Then ten more.

For years, that flood has been a familiar industry problem: too much music, too little human attention. But in early 2026, Deezer put a number on what many labels, distributors, and independent artists had been whispering about—and the figure is hard to ignore.

60,000
On January 29, 2026, Deezer said “60,000 AI-tracks are now uploaded per day,” amounting to “roughly 39% of daily intake.”

The claim is about uploads, not about what listeners are actually playing. Still, the intake number matters because it describes the raw material that recommendation systems sift through, moderators review, and royalty systems must defend.
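
Taken at face value, the two published figures also imply a total intake number. A back-of-envelope sketch (purely illustrative arithmetic on Deezer's stated figures, not an official statistic):

```python
# Back-of-envelope check on Deezer's January 2026 figures (illustrative only).
ai_uploads_per_day = 60_000   # Deezer's stated AI-track uploads per day
ai_share_of_intake = 0.39     # "roughly 39% of daily intake"

# Implied total daily intake, if both figures are taken at face value
total_daily_intake = ai_uploads_per_day / ai_share_of_intake
print(round(total_daily_intake))               # ≈ 153,846 tracks/day
print(round(total_daily_intake / 86_400, 1))   # ≈ 1.8 tracks/second
```

That implied rate of roughly one to two tracks per second is consistent with the "new song every second" framing above.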

“The new music problem isn’t only how much gets made. It’s how much gets delivered.”


What Deezer actually said—and what “39%” does (and doesn’t) mean

Deezer’s 39% figure comes from a press release dated January 29, 2026, marking one year since the platform launched its AI-music detection tool. The company framed the result as a measurement of incoming supply: daily deliveries to Deezer, not user engagement.

That distinction is not pedantic. “Uploads” are the music entering the catalog through distributors and rights holders. “Streams” are what audiences choose (or are tricked into) playing. “Revenue” is what rights holders receive after platforms apply their rules, fraud filters, and payout formulas.

Deezer’s own public messaging underscores that point. Multiple reports echo Deezer’s stance that fully AI-generated tracks remain a small share of listening in many contexts—even as the volume of AI music arriving at the gates accelerates. TechRadar, for example, has noted the mismatch between the surge in AI supply and the much smaller share of streams those tracks typically capture.

Why the number is platform-specific—and why that still matters

Deezer is not claiming that roughly four in ten songs you hear are AI. The company is reporting that roughly four in ten new tracks delivered to Deezer each day are identified as AI-generated.

The statistic is also platform-specific. Competing services generally do not publish equivalent intake metrics, which is why Deezer's number now anchors so much public debate. French outlet Le Monde has reported that more than 39% of daily uploads to Deezer are AI-generated, noting broader industry uncertainty about how close other platforms are to the same ratio.

The best way to read “39%” is as a stress test for the streaming pipeline: discovery, trust, and payments all depend on what enters the system—even if listeners never consciously seek it out.

“Uploads aren’t culture yet. They’re inventory—and inventory shapes everything downstream.”


From 18% to 39%: the trendline that changed the conversation

The January 2026 disclosure did not appear out of thin air. Deezer had already signaled the direction of travel.

In an April 2025 release, Deezer reported that 18% of all new music uploaded was “fully AI-generated.” Nine months later, Deezer’s estimate for AI tracks in daily intake had climbed to roughly 39%, alongside the headline number of 60,000 AI tracks per day.

Two caveats are worth keeping in view. First, these are statements made at different times, potentially using slightly different operational definitions and detection improvements. Second, Deezer’s AI detection tool itself likely became better at recognizing what earlier might have slipped through.

Even with those cautions, the direction is unmistakable: more AI music is being shipped to platforms, in greater volume and at a pace that human catalog growth cannot reasonably match.

Why “daily intake” is the number that spooks platforms

Streaming services live and die by four quiet systems most listeners never think about:

- Catalog management: what gets accepted, how it’s categorized, and whether it’s legitimate
- Discovery: how algorithms decide what to surface
- Trust and safety: how manipulation is detected and removed
- Royalties: how money is allocated when plays are counted

A rapidly rising intake of AI-generated tracks strains every layer at once. Even if only a fraction becomes popular, the platform still has to ingest the files, store them, scan them, potentially label them, and decide whether to recommend them.

Deezer’s statistic functions like a weather report. It doesn’t tell you which neighborhood will flood, but it tells you the storm is getting heavier.
18% → 39%
Deezer reported 18% AI among new uploads in April 2025, then roughly 39% of daily intake by January 2026—a sharp acceleration.

The accelerant is fraud: why Deezer links AI uploads to royalties

The most consequential part of Deezer’s communications is not the 39%. It’s the motive Deezer attaches to the surge.

Deezer’s central allegation is direct: a large portion of the AI-generated upload wave is tied to streaming fraud—actors who upload huge amounts of low-cost content, then attempt to generate artificial listening to siphon money from the royalty pool.

In its January 29, 2026 release, Deezer said up to 85% of streams of fully AI-generated tracks are fraudulent, and that those streams are demonetized—removed from the royalty pool. Mid-2025 reporting, including coverage in The Guardian, had cited an earlier Deezer estimate that up to 70% of streams of AI-generated music on the service were fraudulent.

That change in figure (from ~70% to up to 85%) matters less as a precise measurement than as a sign of how intensely Deezer is framing the problem: not primarily as an artistic debate, but as an economic and integrity threat.

“The fight over AI music on streaming isn’t only about taste. It’s about theft—real or attempted—at industrial scale.”


How the math of streaming makes fraud everyone’s problem

Most streaming payouts still rely on a pro‑rata model: the total subscription and ad revenue becomes a pool, then rights holders are paid according to their share of total streams.

In such a system, fraudulent plays can do two things at once:

- Inflate the denominator (total streams), diluting legitimate shares
- Redirect payouts toward the fraudster if fake streams are counted as real

Deezer’s demonetization claim is an attempt to reassure artists and rights holders: if fraudulent plays are removed, the pool is protected. The harder question is operational—how accurately can any platform detect fraud and AI generation at scale, and how quickly can it respond as tactics change?
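
The dilution effect described above can be sketched numerically. A minimal pro-rata toy model (all numbers are invented for illustration; this is not Deezer's actual payout formula):

```python
# Toy pro-rata royalty model. All figures are invented for illustration;
# this is NOT Deezer's actual payout formula.

def pro_rata_payouts(pool: float, streams: dict) -> dict:
    """Split a revenue pool by each rights holder's share of total streams."""
    total = sum(streams.values())
    return {name: pool * n / total for name, n in streams.items()}

pool = 100_000.0  # hypothetical monthly revenue pool

# Legitimate listening only
honest = {"artist_a": 600_000, "artist_b": 400_000}

# Same listening plus 250,000 fraudulent plays on a mass-uploaded track
with_fraud = dict(honest, fraud_farm=250_000)

print(pro_rata_payouts(pool, honest))      # artist_a: 60,000.0, artist_b: 40,000.0
print(pro_rata_payouts(pool, with_fraud))  # artist_a: 48,000.0, fraud_farm: 20,000.0
```

In the toy model the fraudster both dilutes every legitimate share (artist_a drops from 60,000 to 48,000) and captures a payout directly. Demonetization amounts to computing the honest split again: removing flagged streams before the division restores both the denominator and each legitimate share.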
Up to 85%
Deezer says up to 85% of streams of fully AI-generated tracks are fraudulent—and that those streams are demonetized.

Detection, labeling, and exclusion: Deezer’s three-part response

Deezer’s public response has been unusually concrete for a major streamer, moving beyond general statements about “responsible AI” to shipping product changes.

Labeling AI-generated tracks for listeners

In June 2025, TechCrunch reported that Deezer began labeling AI-generated music in its product, positioning the labels as part of a broader push to tackle streaming fraud and increase transparency.

Labels are a deceptively serious intervention. They change how listeners interpret what they hear, how journalists report on charts, and how rights holders evaluate the platform. Labels also invite debate: some listeners want full disclosure; others worry that labeling becomes stigma, or that detection errors could misidentify human work.

Excluding fully AI-generated tracks from recommendations

Deezer has also described excluding fully AI-generated tracks from algorithmic recommendations, an approach TechCrunch connected to Deezer’s effort to reduce incentives and clutter.

That move acknowledges an uncomfortable truth about streaming: discovery is oxygen. If recommendation slots can be gamed with cheap, mass-produced tracks, the most valuable real estate on the platform becomes vulnerable. Excluding fully AI-generated tracks from recommendations isn’t a moral verdict; it’s an incentive redesign.

Selling the detection tool to other platforms

The January 2026 release went further: Deezer announced plans to license/sell its AI detection technology.

That step turns a defensive measure into a potential industry standard—if others adopt it. It also invites scrutiny. Any detection tool must answer the hardest questions transparently: false positives, false negatives, model updates, appeals, and who gets to define “fully AI-generated” in the first place.

Key Insight

Deezer’s response isn’t just rhetoric: it ties AI detection to labeling, recommendation controls, and royalty protection—a full pipeline strategy.

The listening reality: why AI can be 39% of uploads but a small share of streams

A key piece of context often lost in social-media retellings is that a surge in supply does not automatically translate into a surge in consumption.

Deezer and multiple media reports have emphasized that fully AI-generated tracks remain a small share of actual listening in many contexts—sometimes characterized as low single-digit percentages of streams. The exact number varies by reporting context, but the direction is clear: audiences are not yet streaming AI music in proportion to its arrival rate.

That gap has several plausible explanations—none of which require guessing beyond the evidence Deezer has made public.

Discovery is a bottleneck, not a blank check

Even in the age of infinite shelf space, attention is finite. If a large portion of AI uploads is undifferentiated functional audio—soundalikes, generic mood tracks, mass-produced instrumentals—then listeners may not seek it out organically.

Recommendation systems also act as gatekeepers. If Deezer is excluding fully AI-generated tracks from algorithmic recommendations, that alone would suppress their listen share relative to upload share.

Fraud tries to substitute for fandom

Deezer’s fraud narrative suggests another reason for the mismatch. If a significant portion of AI track streaming is fraudulent—and then demonetized—those “plays” never reflect audience interest in the first place. They reflect an attempt to manufacture it.

The public tension here is straightforward: stream counts have become a proxy for popularity, and popularity is a proxy for money and opportunity. Fraud exploits the proxy.

Editor’s Note

“Uploads” describe supply entering the catalog; “streams” describe consumption; “revenue” reflects platform rules, fraud filtering, and payout formulas.

What this means for artists, labels, and listeners—practical takeaways

The AI music debate can drift into abstraction: creativity, authorship, the future of art. Deezer’s disclosures pull the conversation back to operations—what happens when a platform has to process tens of thousands of AI tracks daily and defend a royalty pool at the same time.

Here are the practical implications, grounded in what Deezer has said and what its policies suggest.

For artists and songwriters: visibility becomes the scarce resource

When intake balloons, discovery becomes more competitive, even if your audience remains loyal. If platforms respond by tightening recommendation criteria and scrutinizing suspicious activity, legitimate artists may see:

- More friction in distribution and verification
- Greater reliance on trusted channels (editorial playlists, direct fan engagement)
- Higher stakes for metadata accuracy and rights clarity

Deezer’s demonetization stance is designed to protect artists, but it also signals a more policed environment.

For labels and distributors: intake quality becomes reputational

The 39% figure raises uncomfortable questions for the supply chain. If mass AI content is flooding ingestion systems, platforms will look upstream. Distributors may face pressure to:

- Improve screening and provenance checks
- Separate AI content clearly in delivery metadata
- Cooperate with platform detection and labeling standards

Deezer’s choice to commercialize detection tech hints at a future where proof and labeling become routine parts of distribution, not optional add-ons.

For listeners: transparency is coming, whether you want it or not

Labels, exclusions from recommendations, and more explicit policy statements reshape the user experience. Some listeners will welcome clarity. Others will resent the friction, or worry that legitimate experimental music could be misclassified.

The deeper point is that streaming services are reasserting editorial power—algorithmic and policy-driven—after years of presenting themselves as neutral pipes.

Practical implications Deezer’s disclosures point toward

  • Tighter distribution and verification for legitimate artists
  • Greater importance of metadata accuracy and rights clarity
  • Upstream pressure on distributors to screen and label AI content
  • More visible user-facing transparency (labels) and reduced algorithmic exposure for fully AI tracks

The industry split: cultural anxiety vs. system integrity

Public arguments about AI music often collapse into two camps: defenders of human creativity versus champions of new tools. Deezer’s framing suggests a third axis that cuts across both: system integrity.

A listener might enjoy an AI-generated ambient track and still agree that streaming fraud should be stamped out. A label might experiment with AI in production workflows and still oppose mass upload schemes designed to manipulate royalties.

TechCrunch’s coverage connects Deezer’s labeling initiative explicitly to fraud. Le Monde describes an industry “at a loss,” highlighting the uncertainty platforms face when supply grows faster than their governance.

None of these perspectives require treating AI music as inherently illegitimate. They require treating scale and incentives as decisive.

A real-world case study in policy, not philosophy: Deezer’s demonetization stance

Deezer has made one of the clearest public commitments in the market: fraudulent streams tied to fully AI-generated tracks are demonetized to protect royalties for “human artists, songwriters and other rights owners,” as the company stated in its January 2026 release.

That policy is a case study in how streaming platforms may respond more broadly:

- Detect AI-generated content at ingestion
- Monitor its streaming behavior for fraud signals
- Remove fraudulent activity from payouts
- Reduce discovery incentives by excluding content from recommendations
- Add labeling to shape listener expectations

Whether these measures become industry norms depends on whether other platforms follow Deezer into public disclosure—and whether rights holders demand comparable transparency elsewhere.
39%
Deezer’s figure is about intake (uploads)—inventory entering the system—not a claim about culture, popularity, or listener preference.

Conclusion: 39% is not a verdict—it’s a warning light

Deezer’s “roughly 39% of daily intake” number is easy to misuse. It does not mean four in ten streams are AI. It does not mean audiences have embraced AI music at that rate. It does not prove that all AI music is fraudulent.

The number does something more valuable: it tells you where the pressure is building.

A streaming ecosystem designed for human-scale creativity is now absorbing machine-scale production. Deezer’s response—detection, labeling, recommendation exclusion, and demonetization—signals that platforms are preparing for a long fight over trust, incentives, and money.

The next phase won’t be decided by philosophy alone. It will be decided by operational definitions, audit trails, and whether the industry can agree on a simple principle: if streaming is the marketplace, then the marketplace has to be defendable.
About the Author
TheMurrow Editorial is a writer for TheMurrow covering entertainment.

Frequently Asked Questions

Did Deezer say 39% of streams are AI-generated?

No. Deezer’s January 29, 2026 press release says AI-generated tracks are roughly 39% of daily intake, meaning uploads/deliveries to the platform. Deezer and media coverage have separately emphasized that fully AI-generated tracks are still a small share of listening in many contexts, even as uploads surge.

What exactly did Deezer report in January 2026?

Deezer said “60,000 AI-tracks are now uploaded per day,” representing “roughly 39% of daily intake.” Deezer presented the figure as a one-year milestone for its AI-music detection tool, and linked the surge to efforts to combat streaming fraud and protect royalties.

How does the January 2026 number relate to Deezer’s earlier 18% figure?

In April 2025, Deezer said 18% of all new music uploaded was “fully AI-generated.” In January 2026, Deezer reported roughly 39% of daily intake as AI tracks. The two datapoints suggest rapid acceleration, though they come from different timeframes and could reflect evolving definitions or improved detection.

Why is Deezer focusing so much on fraud?

Deezer argues that a significant portion of AI-music activity is tied to streaming manipulation designed to siphon royalties. In January 2026, Deezer said up to 85% of streams of fully AI-generated tracks are fraudulent and that those streams are demonetized—removed from the royalty pool to protect legitimate rights holders.

What does “demonetized” mean in this context?

“Demonetized” means Deezer says it removes identified fraudulent streams from royalty calculations, so they do not earn payouts. The goal, according to Deezer’s January 2026 statement, is to ensure royalties for human artists, songwriters, and other rights owners are not affected by fraudulent activity.

What steps has Deezer taken against AI-generated music?

Deezer’s publicly described measures include:
- Detecting fully AI-generated tracks (and marking the one-year milestone in January 2026)
- Labeling AI-generated music for users (reported by TechCrunch in June 2025)
- Excluding fully AI-generated tracks from algorithmic recommendations (reported by TechCrunch)
- Commercializing its detection tech by offering to license/sell it (January 2026 press release)
