Who Owns the Voice of the Dead? AI Resurrections, Publicity Rights, and Legal Gaps

📌 TL;DR — As AI recreates the likeness and voice of the dead, the legal question is no longer “Can we?” but “Should we — and who decides?”

From sentimental videos of a parent speaking again to eerily lifelike chatbots mimicking celebrities, AI-powered recreations of the deceased are no longer science fiction. But alongside the emotional awe comes a legal and ethical fog:

  • Can AI use someone’s face, voice, or personality after they die?
  • Does the deceased have any rights left?
  • What if the AI says something false or damaging — who is held accountable?

This article breaks down the emerging legal debate around postmortem publicity rights, AI-generated personas, and the growing need for digital consent frameworks.


👤 What Is “AI Resurrection”?

Also known as deadbots or digital resurrections, this refers to AI technologies that simulate the personalities of deceased individuals using:

  • Audio and video archives
  • Writing or chat logs
  • Public data (interviews, social media, etc.)

These tools can recreate someone’s voice, expressions, mannerisms — and even generate responses “in their style.”


🧪 Why It’s Trending Now

Recent real-world examples include:

  • Alexis Ohanian (Reddit co-founder) recreating his late mother through an animated AI video
  • “Talking gravestones” or AI-enhanced memorials growing in Japan and South Korea
  • Increasing use of AI voice synthesis to simulate deceased performers

But the backlash has been just as intense:

“She would never say that.”
“This feels like emotional manipulation.”
“Did anyone get permission from the deceased?”

These reactions aren’t just emotional — they point to a legal vacuum.


⚖️ What the Law Currently Says (or Doesn’t)

🔹 In Most Jurisdictions (Including Japan and Many U.S. States):

  • Defamation and privacy laws only apply to living individuals
  • Once someone dies, their reputation and likeness often lose direct legal protection
  • Families can sometimes sue for emotional distress, but not on behalf of the deceased’s own reputation

This means that if an AI “version” of your late grandfather starts saying false things, there is no clear legal structure to stop it.

🧑‍⚖️ Enter: The “Postmortem Right of Publicity”

This is a proposed extension of existing publicity rights — the right to control the commercial use of one’s name, image, voice, and likeness.

🔸 What Experts Are Arguing:

🧠 Professor Jennifer Rothman (Loyola Law School)

  • Suggests publicity rights should survive death
  • Calls for laws allowing families to demand removal, correction, or preemptive limits on how AI uses the deceased’s data
  • Recommends viewing postmortem rights as hybrid legal tools: part property, part personality

📚 S.L. Klein (Hofstra Law Review)

  • Proposes a structured framework:
    • Time limits (e.g., protection for 50 years after death)
    • Allowing heirs or designated persons to approve/deny AI usage
    • Mandatory disclosure when AI is simulating a deceased person
    • Possible royalties or compensation mechanisms

🧩 Key idea: “Recreating someone’s personality is not just artistic expression — it’s the use of a deeply personal asset.”

💥 What Could Go Wrong? Real Risks of AI Recreating the Dead

While AI resurrection can seem touching or poetic, the legal and reputational risks are substantial — especially when simulations speak publicly or interactively.

⚠️ Scenario 1: False Statements or Defamation

  • An AI clone of a deceased figure says, “I hated my son.”
  • It claims the person had affairs, illegal behavior, or political stances they never actually held.
  • These statements are broadcast on social media, triggering reputational fallout for the family or estate.

→ The public might not know it’s AI.
→ The damage to reputation becomes real — even if the person is no longer alive.


⚠️ Scenario 2: Manipulated Legacy

  • The AI expresses views the person never held, or even opposed.
  • A company or platform uses the AI’s persona to endorse products, causes, or ideologies.
  • This creates a “false version” of the deceased that lives on, unchallenged.

As Professor Rothman warns:

“Even dead people have interests in how they are remembered. That memory can be manipulated.”


🧭 So… Who Is Liable?

Here’s where the legal fog thickens. Responsibility may fall on:

  • AI itself: not a legal person, so it cannot be sued
  • Developer/provider: may be liable if the harmful output was foreseeable or reckless
  • Family/uploader: could be responsible if they knowingly created harmful content
  • Platform/host: under laws like Section 230 in the U.S. or Japan’s Provider Liability Act, liability is limited unless takedown requests are ignored

In practice, there’s no consistent framework — most cases would become complex civil disputes.


🛠️ What Legal Fixes Are Being Proposed?

Based on academic and legal proposals, these are the leading suggestions:

✅ 1. Digital Wills

  • Individuals could specify whether and how they can be recreated posthumously
  • Include limits on voice, image, and interactive use
  • Potentially linked to national ID or data systems

✅ 2. Mandatory Disclosure

  • AI simulations of the deceased must include a visible label or disclaimer
    • e.g., “This is an AI-generated likeness based on publicly available data”
  • Helps prevent confusion or manipulation

✅ 3. Designated Rights Holders

  • Similar to music or literary estates, authorized family or agents could:
    • Approve or reject use
    • Request corrections or takedowns
    • Set time-limited permissions (e.g., “no use until 10 years after death”)

✅ 4. Time Limits (Sunset Clauses)

  • Publicity rights might last 50–70 years post-death, depending on the jurisdiction
  • After that, persona may enter public domain (with limits)
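Taken together, the four proposals amount to a kind of consent record attached to a person’s digital legacy. As a purely illustrative sketch (none of these names or rules exist in any law or system today), the mechanisms might combine like this:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DigitalWill:
    """Hypothetical consent record combining the four proposals above."""
    person: str
    death_date: date
    rights_holder: str                               # designated heir or agent (proposal 3)
    allowed_uses: set = field(default_factory=set)   # e.g. {"voice", "image"} (proposal 1)
    sunset_years: int = 50                           # protection window after death (proposal 4)

    def check_use(self, use: str, on: date) -> tuple[bool, str]:
        """Return (permitted?, required disclosure label) for a proposed AI use."""
        years_since_death = (on - self.death_date).days / 365.25
        if years_since_death > self.sunset_years:
            # Persona enters the public domain, but disclosure still applies (proposal 2)
            return True, f"AI-generated likeness of {self.person} (rights expired)"
        if use not in self.allowed_uses:
            return False, ""
        return True, f"AI-generated likeness of {self.person}, approved by {self.rights_holder}"

will = DigitalWill("A. Example", date(2020, 1, 1), "Estate of A. Example", {"voice"})
print(will.check_use("voice", date(2030, 1, 1)))   # permitted, with a disclosure label
print(will.check_use("image", date(2030, 1, 1)))   # not permitted: image use was never granted
```

The point of the sketch is only that the proposals are interdependent: a sunset clause changes what the rights holder can deny, and the disclosure label applies regardless of who approved the use.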

🧠 Final Thought: You Can Simulate a Voice — But Not a Life

AI gives us the illusion of bringing someone back.
But there’s a difference between remembering a person and reconstructing them without consent.

When an AI speaks in someone’s voice:

  • Is it them?
  • Or is it just a collection of patterns, curated for effect?

That’s why legal frameworks matter:
Not to stop us from grieving, but to protect the dignity of memory — and the limits of simulation.


✅ Summary: Key Takeaways

  • Current laws don’t clearly protect the deceased’s voice or likeness from AI misuse
  • Experts propose a new layer of rights: postmortem publicity rights
  • Tools like digital wills, disclosure rules, and designated heirs could become standard
  • The line between simulation and exploitation will define future AI ethics and media law


🔗 Sources

  • Rothman, J.E. Postmortem Publicity Rights at the Property–Personality Interface (SSRN)
  • Klein, S.L. The Post-Mortem Right of Publicity (Hofstra Law Review)
  • The Guardian: “Digital recreations of dead people need urgent regulation”