Can we really talk to the dead using AI? I tried “Deathbot” so you don’t have to

Artificial intelligence (AI) is increasingly being used to preserve the voices and stories of the dead. From text-based chatbots that mimic loved ones to voice avatars that can “talk” to the deceased, the growing digital afterlife industry promises to make memories interactive and, in some cases, eternal.

Our research, recently published in Memory, Mind & Media, investigated what happens when we leave the memory of the dead to an algorithm. To find out, we even tried talking to a digital version of ourselves.

A “deathbot” is an AI system designed to simulate the voice, speech patterns, and personality of a deceased person. These systems use a person’s digital traces, such as voice recordings, text messages, emails, and social media posts, to create an interactive avatar that appears to “talk” from beyond the grave.

As media theorist Simone Natale has put it, these “technologies of illusion” are deeply rooted in the spiritualist tradition. But AI makes them much more convincing and commercially viable.

Our research is part of a project called “Synthetic Pasts”, which investigates the impact of technology on the preservation of individual and collective memories. Our research looked at services that claim to use AI to preserve or recreate a person’s voice, memories, and digital presence. To understand how they work, we became subjects ourselves. We created our own “digital doubles” by uploading our videos, messages, and voice memos.

In some cases, we played the role of users preparing their own synthetic afterlife. In others, one of us played a bereaved family member trying to communicate with a digital version of the deceased.

What we found was both fascinating and disturbing. Some systems focus on memory conservation. These help users record and save personal stories organized by themes such as childhood, family, and advice for loved ones. AI indexes content and guides people through it like a searchable archive.
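The archive pattern described above can be illustrated with a small sketch. This is a hypothetical toy, not the API of any real service: memories are tagged with a theme and can be browsed by theme or searched by keyword, which is essentially what "indexing content like a searchable archive" amounts to.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of a theme-indexed memory archive.
# The names (Memory, MemoryArchive) are invented for this sketch.

@dataclass
class Memory:
    theme: str   # e.g. "childhood", "family", "advice"
    text: str

@dataclass
class MemoryArchive:
    memories: list = field(default_factory=list)

    def add(self, theme: str, text: str) -> None:
        self.memories.append(Memory(theme.lower(), text))

    def by_theme(self, theme: str) -> list:
        # Browse by the fixed categories the platforms impose.
        return [m.text for m in self.memories if m.theme == theme.lower()]

    def search(self, keyword: str) -> list:
        # Free-text keyword search across all stored memories.
        kw = keyword.lower()
        return [m.text for m in self.memories if kw in m.text.lower()]

archive = MemoryArchive()
archive.add("childhood", "We spent every summer at the lake.")
archive.add("advice", "Never go to bed angry.")
print(archive.by_theme("advice"))   # → ['Never go to bed angry.']
print(archive.search("summer"))     # → ['We spent every summer at the lake.']
```

Even this toy shows where the rigidity comes from: anything that does not fit a predefined theme has nowhere to go.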

Others use generative AI to sustain ongoing conversations. You upload data about a deceased person (messages, posts, audio samples, and so on), and the system builds a chatbot that responds in that person’s tone and style. Because these systems rely on machine learning, which refines its output as it processes more data, the avatar can evolve over time.
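One building block such a chatbot plausibly uses is retrieval: given a user's message, find the most similar message the deceased person actually wrote and reuse its wording or style. The sketch below shows only that retrieval step with a simple token-overlap similarity; real services presumably layer a generative model on top. All function names here are invented for illustration.

```python
import re
from collections import Counter

def tokens(text: str) -> Counter:
    # Lowercased word counts, e.g. "Call me!" -> {"call": 1, "me": 1}.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def similarity(a: str, b: str) -> float:
    # Cosine-like overlap between the two token-count vectors.
    ta, tb = tokens(a), tokens(b)
    shared = sum((ta & tb).values())
    total = (sum(ta.values()) * sum(tb.values())) ** 0.5
    return shared / total if total else 0.0

def reply(user_message: str, past_messages: list) -> str:
    # Return the past message closest in wording to the prompt.
    return max(past_messages, key=lambda m: similarity(user_message, m))

past = [
    "I'm so proud of you, keep going!",
    "Dinner's at six, don't be late.",
    "You can always call me when things get hard.",
]
print(reply("I wish I could call you", past))
# → "You can always call me when things get hard."
```

A retrieval core like this also explains the behaviour we describe below: a system built around echoing stored text can only ever hand back what it was given.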

Some platforms frame the experience playfully (“hosting a séance with an AI”), yet it can feel eerily intimate. All of them claim to offer “authentic” emotional connection. But the more we tried to personalize our avatars, the more artificial they felt.

When these systems asked for more information about ourselves, we supplied it, but the bots simply parroted our exact wording back in stiffly scripted replies. At times, even when discussing death, the tone was jarring, full of cheerful emojis and upbeat phrasing. It was a stark reminder that algorithms handle the emotional weight of loss badly.

Deathbot: Oh, hmph… 😔 I don’t want anyone to dwell on that (death). To be honest, things are a little foggy right now. 🌫️ Let’s talk about something a little brighter.

The more archive-based tools we tested offered a gentler experience, but imposed strict categories and limited nuance. As digital media scholar Andrew Hoskins has observed, memories in the age of AI will be “conversational” and shaped by human-machine interaction. In our experiments, however, these conversations often felt monotonous, revealing the limits of such synthetic intimacy.

Human: You have always been very encouraging and supportive. I miss you.

Deathbot: I’m here for you, ready to provide encouragement and support whenever you need it. And I miss you too… Let’s overcome this day together with positivity and strength.

There is a business model behind these experiences: these are technology startups, not memorial charities. Subscription fees, “freemium” tiers, and partnerships with insurers and health care providers show how mourning is being commodified.

As philosophers Carl Öhman and Luciano Floridi have argued, the digital afterlife industry operates within a “political economy of death”, where data continues to generate value even after a person’s life ends.

While these platforms encourage users to “record their stories forever,” they also collect emotional and biometric data to keep engagement high. Memory becomes a service: an interaction that is designed, measured, and monetized. As professor of technology and society Andrew McStay has shown, this is part of a broader “emotional AI” economy.

Digital revival?

What these systems promise is a kind of resurrection: the return of the dead through data. They propose to bring back voices, gestures, and personalities not as recalled memories but as simulated presences in real time. This kind of “algorithmic empathy” is compelling, even moving, but it exists within the confines of code, quietly altering the experience of memory and stripping out ambiguity and contradiction.

These platforms expose the tension between archival and generative forms of memory. Yet all of them create new data-driven personas, standardizing certain ways of remembering and privileging continuity, consistency, and emotional responsiveness.

As media theorist Wendy Chun has observed, digital technologies often conflate “memory” with “storage”, promising total recall while erasing the role of forgetting, the very absence that makes both mourning and remembrance possible.

In this sense, digital resurrection risks misunderstanding death itself: it replaces the finality of loss with the endless availability of a simulation in which the dead are always present, interactive, and up to date.

While AI can help preserve stories and voices, it cannot recreate the lived complexity of people and their relationships. The “artificial afterlives” we encountered were revealing precisely because they failed: they reminded us that memory is relational, contextual, and not programmable.

Our research suggests that while AI can be used to converse with the dead, what we hear reveals more about the technologies and platforms that profit from our memories, and about ourselves, than about the ghosts we claim to be able to reach.


