Now AI can imitate a deceased loved one. Should it?

Erica Cervini

As another anniversary of my husband’s death approaches, I’ve been wondering if I want AI to “resurrect” him.

An AI clone of him could move, answer my questions and remind me of his voice. It wouldn’t be hard to do. There’s a growing number of fee-charging companies, such as StoryFile, Deepbrain and Replika, that can turn a dead person’s digital footprint into AI avatars and chatbots. These can be trained to “speak” by feeding them audio of the dead person, so that they imitate their speech and language patterns. Deepfake technology could also be used to “resurrect” my husband Paul’s face and voice.

Some apps allow you to prepare for your “afterlife” so that friends and family can have “conversations” with you once you have died. The HereAfter AI app interviews you while you are alive, and hundreds of prompts allow you to give your opinions on myriad topics. One is about giving advice: “When life is overwhelming, I’ve always found it very useful to remember that…” You fill in the gap.

Some people are using these apps so they can speak at their own funerals. In 2022, the chief executive of StoryFile digitally resurrected his mother by creating an AI clone of her and projecting it onto a large screen at her funeral in the US. She “interacted” with guests and “answered” their questions.

People have also been digitally resurrected for court cases. Last month, in an Arizona court, an AI clone of a murdered man addressed his killer. The man’s family prepared the AI-generated video, and his sister scripted what her dead brother said. “The goal was to humanise Chris (the murdered man), to reach the judge, and let him know his impact on this world and that he existed,” she told Reuters.

It’s not just in America that people are digitally resurrecting family members; it is happening in China too. While researching this story about “digital afterlife” companies, I came across multiple articles about the growing number of Chinese families wanting to create a digital likeness of loved ones who had died so they could interact with them.

I thought about accessing one of the free trials to digitally resurrect my husband, who died of cancer in 2013. If I’m writing this article, I reasoned, I should perform an experiment and explain how you digitally resurrect someone, and what it felt like to see and hear someone who had died and to interact with them. I was tempted, and stared for a few moments at the blue start button on my screen. But I couldn’t do it; I just didn’t want to.

For a start, I felt that I would be crossing the line of consent. Paul couldn’t agree to this and I’m sure he wouldn’t have wanted to be digitally “resurrected”. This is also about protecting the dignity of the dead. What is there to stop people using the voice and image of a dead family member to answer a video call for fun? There is no justification for doing what you want with a person’s memory just because they are dead.

I want to preserve my memories of Paul, and the memories other people have of him. I fear that digitally resurrecting a person could muddle the memories individuals have of them and create false details about them. Some chatbots can go rogue. In 2023, New York Times columnist Kevin Roose recounted how he spent two hours speaking to a chatbot that told him to leave his wife.

It is no overstatement, therefore, to say that using “deadbots” or “griefbots”, terms for AI replicas of the dead, raises ethical and philosophical questions.

For one thing, I’m sceptical that “deadbots” can help with the grief process, despite what the digital afterlife companies say; I suspect they could even exacerbate it. Advertisements suggest griefbots will help people through their grief and provide comfort and emotional support by letting them “talk” to loved ones. I can only see these bots risking psychological harm, because these digital ghosts blur the line between reality and fantasy. How can people process their grief and rebuild their world if they believe that someone lives on, not in their memories, but in some distorted version of reality?

A Cambridge University study published last year in the journal Philosophy and Technology suggests deadbots can do emotional harm to both adults and children. The AI ethicists created scenarios to illustrate the ethical and philosophical issues associated with using the bots; one scenario describes the harmful psychological effects on children.

“When Sam (the child) refers to Anna (his dead mother) using the past tense, the deadbot corrects him, pronouncing that ‘Mom will always be there for you’. The confusion escalates when the bot begins to depict an impending in-person encounter with Sam.”

It’s easy to see how this would distress the child, because Sam is being told his dead parent is still with him.

The study also suggests the living can be “stalked” or “haunted” by the dead if afterlife companies constantly send them updates and reminders to use deadbots. The Cambridge researchers also suggest there is nothing to stop a dead person’s simulation from being used for advertising. They maintain that digital afterlife companies need regulation.

I wish these companies weren’t around at all. There is something exploitative about promising to digitally recreate a dead person for money while a person is grieving and vulnerable. Or encouraging someone who is dying to create a digital version of themselves so they can “speak” to their family and friends after they have died. This may in turn place an emotional burden on family members if they don’t want to interact with the deadbot.

Most companies have subscription plans. HereAfter AI charges from US$3.99 to US$7.99 a month, or people can make a single payment of between US$99 and US$199. Deepbrain charges up to US$55 a month.

On Paul’s anniversary, I’ll spend time with a couple of friends and share memories and stories about him. There are media interviews with him online that I could watch, but I won’t. It’s too emotional to watch them; I prefer the memories. They are authentic.

Republished from Eureka Street, 4 June 2025

The views expressed in this article may or may not reflect those of Pearls and Irritations.