A Dead Actor Just Got Cast via AI—Here’s the Legal Loophole That Could Decide Who Owns Your Voice on Streaming in 2026
Val Kilmer’s estate-approved AI performance looks like a “clean” case—yet it still set off alarms. The real battleground isn’t just replicas; it’s the quiet contract clauses that grant AI training and future reuse.

Key Points
1. Track the loophole: replica rights control what audiences see, but training rights decide whether your past recordings can replace your future work.
2. Read streaming contracts like labor agreements: perpetual, worldwide, transferable AI-training clauses can turn a one-time session fee into indefinite synthetic reuse.
3. Expect more “clean” posthumous castings: even with estate consent and pay, the unresolved question is what counts as a performance—and who owns it.
On March 18, 2026, the Associated Press reported a detail that would have sounded like science fiction not long ago: First Line Films announced an indie production, As Deep as the Grave, featuring an AI-rendered, posthumous performance by Val Kilmer. According to the AP, Kilmer’s estate approved the digital replication and is being compensated. Producers framed the move as finishing a role Kilmer had accepted while alive but could not complete because of his health.
The industry quickly treated the project as a “best-case scenario” for synthetic performance: permission, pay, and a narrative of artistic continuity. Yet the backlash wasn’t limited to moral unease. The deeper dispute is legal. The question isn’t only whether an estate can authorize a digital replica, but what exactly is being licensed when a “performance” is synthesized—voice, face, underlying footage, or something newer and harder to name.
The most important fight now sits in the fine print: training rights. Contracts can grant permission to feed recorded performances into AI systems, building models that later generate “new” output without copying any single clip. Even where “digital replica” rights exist on paper, training clauses can quietly determine whether a studio can replace tomorrow’s work using yesterday’s recordings.
Even the ‘ethical’ version of AI resurrection forces the industry to define what a performance is—and who owns it.
— TheMurrow Editorial
The Val Kilmer flashpoint: why a “clean” case still shook Hollywood
Legality and legitimacy, however, are not synonyms. Estate approval answers one question—who may authorize use of a deceased performer’s identity?—but leaves others unresolved. What counts as “use”? A digital double that looks like Kilmer? A voice model that sounds like him? Or a composite performance assembled by prompting a system trained on earlier recordings?
“Completing the role” vs. creating a new performance
The uncomfortable counterpoint is that synthetic completion still creates a performance that never occurred. Viewers aren’t seeing Kilmer act in the traditional sense; they are seeing an engineered approximation delivered under his name. Even when done with care, the act of authorship shifts. The performance becomes a collaboration between the estate, the studio, and the toolmakers shaping the model.
The legal marker hiding in plain sight
Two rights, one confusion: replica rights vs. training rights
Digital replica rights: the obvious part of the argument
Replica disputes often hinge on:
- Whether consent was granted (by the performer while alive, or by an estate)
- How the work is marketed (implying endorsement vs. portraying a character)
- Whether compensation reflects the value of the identity being used
Training rights: the quiet clause that changes everything
Training rights are especially powerful because they can be negotiated once and leveraged repeatedly. A performer might grant a platform permission to use recordings “for model improvement,” only to discover later that the model enables replacement work at scale.
A digital replica is what you see. Training rights are what make replacement affordable.
— TheMurrow Editorial
Replica rights vs. training rights (why the loophole appears)
- Digital replica rights — govern the visible imitation; disputes center on consent, marketing, and pay
- Training rights — govern ingestion of recordings to build models; disputes center on scope, duration, and ownership of outputs
Streaming contracts as the battlefield: the German voice actor boycott
In early 2026, German voice actors reportedly refused dubbing work over a Netflix contract clause that allowed their recorded performances to be used for AI training. The dispute is instructive for two reasons. First, it shows where power actually sits: not in public statements about “ethical AI,” but in contract language most audiences never see. Second, it highlights a feature of AI adoption that rarely gets said aloud—once a system can reproduce voices convincingly, the incentives shift toward minimizing future labor.
Why dubbing and localization are especially exposed
Reports about the German boycott underscore how these clauses can be framed as routine, even benign—tucked into “standard” terms rather than presented as a separate licensing negotiation.
What performers should look for in 2026-era clauses
- Perpetual (no end date)
- Worldwide (no geographic limits)
- Transferable (can be sold or assigned)
- Written to cover “machine learning,” “model improvement,” or “AI training”
- Written to declare that outputs/results are owned by the company
Those terms are not automatically abusive, but they change bargaining power. A one-time fee can become the price of indefinite reuse.
The contract you sign for today’s session can decide whether you have work next year.
— TheMurrow Editorial
The U.S. legal reality in 2026: a patchwork built for another era
Why patchwork law encourages aggressive experimentation
The result is a legal environment where edge cases proliferate. A studio may avoid calling something a “replica” while still producing something that the audience experiences as one. If enforcement triggers depend on narrow definitions, the practical protection can erode without any dramatic courtroom loss.
The core question: identity rights vs. authored work
In that sense, Kilmer’s case is symbolic: even with estate consent and compensation, the industry has no stable consensus on what is being bought and sold—identity, labor, or a new category that mixes both.
What, exactly, is being licensed when AI “casts” someone?
The pieces that get bundled together
- Image rights (face, body, recognizable features)
- Voice rights (tone, cadence, accent, vocal “signature”)
- Underlying recordings (past films, ADR takes, interviews)
- Copyright interests (in the film and sometimes in recordings)
- Union-covered labor (when a performance is arguably being “performed,” even if synthesized)
- Training rights (permission to use recordings to build models)
Even when an estate approves a digital replica, questions remain about the training materials. Were those recordings licensed for training? Were they captured under contracts that anticipated machine learning? If a model is trained on decades of work, the scope of what has been “licensed” can balloon beyond what anyone understood at the time of recording.
Estates can consent—audiences still decide legitimacy
The Kilmer case is likely to become a template for how studios seek legitimacy: estate approval plus compensation plus a narrative of honoring the performer’s intention. The industry should not confuse that template with a settled ethical standard.
The labor question: when “replacement” is a business model
Why training rights are labor rights
Performers and their representatives increasingly frame training clauses as a labor issue: a recording made for one job can seed the system that replaces the next. Studios and platforms, by contrast, argue that AI can reduce costs, accelerate localization, and help productions survive tight budgets. Those arguments aren’t frivolous; many productions are financially strained. The problem is that cost-saving claims often skip the distribution question: who benefits from those savings, and who absorbs the loss of bargaining power?
A practical test for “ethical AI” claims
- Is consent opt-in or buried in defaults?
- Is compensation ongoing or one-time?
- Is the use limited to a project, or open-ended?
- Is there auditability—can performers verify whether training occurred?
- Is the output labeled clearly for audiences?
Ethics without enforceable boundaries tends to become branding. Contracts are where boundaries live.
Practical takeaways for readers: what to watch next
For performers (and their representatives)
- Avoid “perpetual, worldwide, transferable” grants unless the pay matches the scope.
- Ask whether the company claims ownership of outputs/results and what that means in practice.
- Insist on project-specific limitations where possible and clear disclosure requirements.
For producers and studios
- If training data is drawn from legacy recordings, clarify whether those recordings were licensed for that purpose. Ambiguity invites backlash and litigation.
- Consider that the “best-case” narrative can become a worst-case reputational crisis if audiences feel deceived.
For audiences
- When controversies erupt, look beyond the headline question—“Did they have permission?”—and ask the quieter one: “What rights did they take for the future?”
Conclusion: the real fight isn’t resurrection—it’s ownership of the future tense
The confusion—and the opportunity for exploitation—sits between replica rights and training rights. Replica disputes are visible, dramatic, and easy to understand. Training rights are contractual, quiet, and structurally more consequential. They determine whether yesterday’s recordings can be converted into tomorrow’s labor without tomorrow’s pay.
Entertainment has always been built on negotiated rights. AI doesn’t change that principle; it changes the stakes. The most valuable performance in the next decade may not be the one an actor gives on set. It may be the one a contract allows a company to generate forever.
A studio can do everything “right” on consent—and still end up rewriting what it means to own a performance.
— TheMurrow Editorial
Frequently Asked Questions
Was Val Kilmer’s AI casting legal?
The Associated Press reported that First Line Films said Kilmer’s estate granted permission and is being compensated for the AI-rendered performance in As Deep as the Grave (announced March 18, 2026). That supports a strong claim of legality on consent grounds. Legal risk can still exist around what materials were used to train or build the performance, depending on underlying rights and contracts.
What’s the difference between a “digital replica” and AI training?
A digital replica is the end product: a synthetic voice or likeness used in a film. AI training rights govern whether a company can use recorded performances to build models that later generate new output. Training rights can be broader and longer-lasting than replica permissions, making them a critical—and often overlooked—part of contract negotiations.
Why did German voice actors reportedly boycott Netflix?
Reports in early 2026 described German voice actors objecting to a contract clause allowing their recorded performances to be used for AI training. The concern was that training could enable future voice replacement without adequate consent or compensation. The dispute illustrates how AI conflicts often arise from contract terms rather than overt “resurrection” headlines.
Do U.S. performers have federal protection for voice and likeness?
Protection still largely comes from state right-of-publicity laws, which create a patchwork that can be hard to enforce consistently across jurisdictions, as noted by the Congressional Research Service. That uncertainty can encourage aggressive experimentation and careful forum selection by companies. Performers typically need strong contract language in addition to relying on state law.
If an estate consents, is AI resurrection automatically ethical?
Consent and compensation are meaningful, but they aren’t the whole ethical picture. Audiences often care about transparency, artistic intent, and whether the work feels like tribute or exploitation. The Kilmer case shows that even a consent-forward approach can raise questions about authorship, disclosure, and how far “permission” should extend.
What contract terms should performers watch most closely?
Pay special attention to clauses granting perpetual, worldwide, transferable rights for “machine learning,” “AI training,” or “model improvement.” Also scrutinize language stating that the company owns all “results” or “outputs.” Those terms can turn a single session into indefinite leverage for synthetic reuse without future payment.