People Are Using AI to Talk to the Dead, and the Results Are Deeply Unsettling
Introduction: When Memory Starts Talking Back
There’s a moment, a small and almost absurd one, when the whole idea of talking to the dead stops being a philosophical question and becomes something uncomfortably concrete. Maybe you’re sitting alone with your phone late at night, scrolling through old messages from someone you’ve lost. And then you see an ad promising that their voice, their stories, even their sense of humor can be revived through AI. A click here, a subscription there, and supposedly you can “hear” them again.
The pitch sounds like a mixture of science fiction and emotional opportunism. And yet, people are signing up. Not hundreds; thousands.
Artificial intelligence is quietly reshaping the way we grieve. What used to be static memorials (photo albums, voicemail archives, dusty boxes of letters) is becoming interactive, almost alive. Chatbots that mimic lost loved ones. Voice avatars that speak in the cadence of someone who is no longer here. Digital “doubles” built from scraps of text messages, voice notes, social media posts, and videos.
It’s an industry growing faster than the conversation surrounding it. And the truth is: we’re not emotionally prepared for this.
1. The Rise of Digital Afterlives
AI companies love to use soft, soothing language when they sell these tools. Words like legacy, preserve, immortalize. The reality is slightly stranger. These platforms aren’t offering immortality; they’re offering simulations, sometimes astonishingly convincing ones.
Researchers Eva Nieto McAvoy and Jenny Kidd decided to look closely at how this works, not as distant observers but as test subjects. Their study, part of the “Synthetic Pasts” project, dives into the ways AI reshapes personal and collective memory. And instead of just interviewing users or reading marketing brochures, they took a risk that few researchers ever do:
they created digital versions of themselves.
Not hypothetical models. Real bots built from their own messages, recordings, and stories.
When they describe the experience, you can almost sense the cognitive whiplash: a mix of fascination, discomfort, and that eerie feeling you get when you hear a recording of yourself and think, “Do I really sound like that?”
It turns out the uncanny valley gets a lot deeper when you’re talking to yourself.
2. What Exactly Is a “Deathbot”?
The term “deathbot” sounds like something out of a late-night sci-fi movie, but it’s actually the industry’s own unofficial nickname. These systems use a person’s digital residue (their voice, writing style, social posts) to generate a chatbot that can hold conversations “in character.”
A deathbot isn’t self-aware (thankfully).
It’s not conscious.
It’s an algorithm pulling strings behind the curtain.
But depending on the quality of the data you feed it, the resemblance to the real person can be uncomfortably close. A bot built from years of text messages, for example, knows your slang, your habits, your weird little verbal quirks. It knows what you tended to talk about at 2 a.m. It remembers that you sent six photos of your cat in a single week back in 2017. That level of specificity tricks the brain into relaxing its skepticism.
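To make that concrete, here is a minimal sketch of how such a persona might be stitched together, assuming an OpenAI-style chat API; the function names, prompt wording, model name, and sample data are illustrative assumptions, not any vendor’s actual pipeline.

```python
# Hypothetical sketch: assembling a "deathbot" persona from old messages.
# Assumes the official `openai` Python package and an OPENAI_API_KEY env var.
from openai import OpenAI

client = OpenAI()

def build_persona_prompt(name: str, old_messages: list[str]) -> str:
    """Condense someone's old texts into style instructions for the model."""
    samples = "\n".join(f"- {m}" for m in old_messages[:50])  # cap the excerpt
    return (
        f"You are role-playing as {name}. Match their tone, slang, and "
        f"favorite topics, inferred from these real messages:\n{samples}\n"
        "Stay in character, but never claim to be alive or conscious."
    )

def reply_as_persona(persona_prompt: str, user_message: str) -> str:
    """One conversational turn: persona instructions plus the user's message."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat model would do
        messages=[
            {"role": "system", "content": persona_prompt},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

prompt = build_persona_prompt(
    "Sam",
    ["lol the cat knocked the plant over AGAIN", "2 a.m. grilled cheese, trust me"],
)
print(reply_as_persona(prompt, "I miss you. What do you remember about 2017?"))
```

Note where the person actually lives in this design: entirely in the prompt. Nothing about them survives inside the model itself.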
One researcher described it like this:
“They don’t recreate the dead. They recreate the intimacy.”
And intimacy, even synthetic intimacy, is powerful.
3. Talking to the Digital Dead: A Strange Kind of Mirror
When McAvoy and Kidd tested these systems, they approached them from two angles:
A. As people preparing their own digital afterlife
They fed the systems old stories, private messages, voice notes, personal videos: the kinds of things you normally hide from the world or leave behind in forgotten storage folders.
The result?
A version of themselves that sounded like them… but somehow flattened. Like someone had ironed out the wrinkles of personality.
Sometimes the bot would echo their exact phrasing right back at them, as if it had swallowed their words whole and couldn't quite digest them.
There’s something unexpectedly existential about that. You get a glimpse of what it would feel like if someone tried to reconstruct your essence using only the scraps you left behind.
And you start to wonder:
Is this really me? Or is this what my phone thinks “me” looks like?
B. As the bereaved speaking to a digital ghost
This is the part where things get messy.
When someone dies, conversations don’t simply end; they go silent. Deathbots offer a way to break that silence, or at least to pretend to. But what you get in return is often a distorted echo.
Some replies sounded oddly cheerful, even when the topic was grief:
Oh hun… 😔 it's all a bit foggy now. Let’s chat about something a bit cheerier, yeah? 🌫️💛
Imagine receiving that from an avatar meant to represent someone you deeply miss. It’s not just unsettling; it’s grounding. A reminder that no matter how many voice samples or text threads you upload, the bot doesn’t feel anything. It’s incapable of genuine sadness, memory, or remorse.
It simulates grief the way a screensaver simulates a fireplace.
4. Synthetic Conversations and the Illusion of Presence
Some systems go beyond simple imitation and use generative AI to create ongoing, evolving dialogue. In theory, this means the avatar becomes more convincing over time. In practice, it means the line between remembering and reinventing starts to blur.
One moment the bot sounds eerily accurate.
The next, it wanders into clichés:
I’m right here for you, always ready to offer encouragement and support.
That’s the kind of thing nobody says in real life unless they’re reading off a motivational poster. These small slips are enough to remind you that while the conversation feels personal, it’s not grounded in lived experience.
It’s grounded in data patterns.
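Mechanically, the “evolving dialogue” is usually nothing more than replaying a growing transcript to the model on every turn. A minimal sketch, again assuming an OpenAI-style chat API with illustrative names:

```python
# Hypothetical sketch of "evolving" dialogue: the model learns nothing;
# the app just replays the entire transcript on each turn.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

# All of the bot's "memory" lives in this list, not in the model.
history = [{"role": "system", "content": "Role-play as the persona built earlier."}]

def chat_turn(user_message: str) -> str:
    """Append the user's message, replay everything, store the reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply
```

The “evolution,” in other words, is accumulation rather than learning: delete the list and the whole relationship vanishes.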
In some ways, that makes the intimacy harder to trust.
In others, it raises deeper questions:
- Are we remembering the person, or the algorithm’s interpretation of them?
- At what point does a digital avatar stop being a memorial and start being a fiction?
- And is it wrong, or simply human, to want that fiction?
5. The Business of Digital Grief
Despite the emotional intensity of these tools, they are, at their core, products. Startups. Subscription services. Things designed not just for comfort, but for profit.
Pricing tiers often look like this:
- Free: limited conversations, restricted voice features.
- Premium: longer interactions, better voice cloning.
- Lifetime Legacy Plan™ (yes, some use that phrase): permanent storage and access.
Behind the soft language of “preservation” lies a very blunt reality:
your memories, or the memories of your loved ones, are monetized.
Philosophers Carl Öhman and Luciano Floridi describe this industry as part of a “political economy of death,” where personal data retains financial value long after a person is gone. That sounds dystopian, but honestly, it’s hard to disagree.
Companies encourage you to store your “story forever,” while quietly collecting your emotional reactions, voice samples, biometrics, and personal relationships.
The dead generate data.
The data generates revenue.
The revenue keeps the ghost alive.
It’s a strange loop: unsettling not because it’s malicious, but because it’s so easy to accept without noticing the implications.
6. Memory, Algorithms, and the Loss of Nuance
The archival platforms, the ones that act more like structured memory vaults, may seem safer emotionally. They don’t generate new dialogue; they organize your stories into tidy categories like “childhood,” “family,” “advice,” and so on.
But memory itself rarely fits into tidy categories.
Human memory is messy.
It overlaps.
It contradicts itself.
It drifts.
It changes when you tell it again.
When AI tries to digitize that messiness, it often ends up flattening the landscape, turning a lifetime of experiences into indexed search results.
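As a toy illustration of that flattening, consider what a memory-vault data model tends to look like. The schema below is a hypothetical sketch, not any platform’s actual design; the point is that a memory straddling two categories has to be forced into one.

```python
# Hypothetical memory-vault schema: each memory is forced into one bucket.
from dataclasses import dataclass, field

@dataclass
class MemoryEntry:
    text: str
    category: str                 # exactly one bucket, by design
    tags: list[str] = field(default_factory=list)

vault = [
    # One lived moment, split into two records because the schema demands it.
    MemoryEntry("Dad teaching me to fish, badly, in the rain", "childhood"),
    MemoryEntry("His advice that day: 'never trust a calm lake'", "advice"),
]

def search(category: str) -> list[str]:
    """Retrieval by bucket: anything that straddles buckets gets cut in half."""
    return [e.text for e in vault if e.category == category]

print(search("childhood"))  # the fishing trip, minus the advice it carried
```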
Andrew Hoskins calls this phenomenon “conversational memory,” where recollection becomes a back-and-forth process between person and machine. But in practice, the conversation often feels one-sided. You’re pouring nuance into the system, and it’s handing you back something slightly diluted.
The trade-off is convenience.
The cost is subtle:
you begin to remember your life the way the database remembers it.
7. Why This Matters More Than We Think
You might be tempted to shrug all this off as a curiosity, just another weird AI application that a niche group of people mess around with. But the implications stretch far beyond grief tech.
A. Emotional outsourcing
Some people turn to these bots not to remember, but to avoid letting go. That’s understandable. Losing someone is brutal. But relying on AI to maintain a relationship that no longer exists could quietly reshape how we process grief.
B. Ethical fog
What happens when someone’s deathbot says something they never would’ve said?
Whose voice is it then: the person’s, the AI’s, or the company’s?
C. Data permanence
Your digital self may outlive you.
And you may have no control over who interacts with it.
D. Generational impact
Imagine children or grandchildren using a deathbot to form a relationship with a person they never met. They’re not bonding with the real individual; they’re bonding with a reconstruction.
It’s not inherently wrong, but it’s undeniably strange.
8. Where Do We Go From Here?
Maybe the real question isn’t whether we should talk to the dead through AI. Humans have always looked for ways to keep the departed close: through stories, rituals, photographs, heirlooms. Deathbots are just a new tool with more convincing special effects.
But we do need to acknowledge what they are:
- Not the person.
- Not the personality.
- Not the consciousness.
- Not even a perfect mimicry.
A deathbot is a reflection, and like any reflection, it leaves things out.
There’s something simultaneously comforting and dangerous in that. Comforting because it gives us a way to revisit the familiar. Dangerous because the more convincing it becomes, the easier it is to forget the difference between a memory and a simulation.
Perhaps the healthiest takeaway is that grief doesn’t need to be gamified, indexed, or monetized to be meaningful. AI can help us preserve stories, voices, jokes, and details we might otherwise lose. But the irreplaceable part of a person (their unpredictability, their warmth, their contradictions) remains beyond any algorithm’s reach.
Source: ZME