Want to chat with the dead? AI grief bots change future of mourning
Many companies are developing AI chatbots based on the digital footprints of the dead for us to interact with. (Photo generated by Bing AI)

AI-powered grief bots are changing how we mourn, allowing conversations with digital simulations of deceased loved ones while raising ethical and psychological concerns



If you could talk to the dead, would you? What if the "dead" were simulations generated by artificial intelligence (AI) from the digital ashes the deceased left behind? As the digital afterlife industry evolves and expands, more questions arise.

It sounds haunting and a bit weird, but the future is already here. The idea of using AI to digitally resurrect a dead loved one certainly sounds like a plot from "Black Mirror," and in a sense it was: in the episode "Be Right Back," Martha texts an AI chatbot built from her dead boyfriend's digital footprint, and their exchanges grow ever more lifelike. These "grief bots," "dead bots" or "death bots" have slowly but surely become a reality, with several firms now offering the service.

I am currently researching how generative AI could affect the grieving process. Already, people are using AI chatbots to engage with synthetic versions of deceased loved ones.

Many companies are now building AI chatbots from the digital footprints of the dead for us to interact with. These grief bots create a simulation of a lost loved one: built on large language models, or LLMs, they imitate how the deceased person talked by drawing on their emails, text messages, voice recordings and more. The technology is meant to help the bereaved deal with grief by letting them chat with the bot as if they were talking to the person. But does it help?
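To make the idea concrete, here is a minimal, hypothetical sketch of how such a bot might be wired together: a "persona" prompt assembled from the deceased person's saved messages, which would then be passed to a text-generation model as context for each reply. The function names and the `generate_reply` stub are illustrative assumptions, not any vendor's actual product or API.

```python
# Hypothetical sketch of a grief-bot "persona prompt".
# generate_reply is a placeholder for whatever LLM service a vendor
# would actually call; it is not a real API.

from typing import List


def build_persona_prompt(name: str, sample_messages: List[str]) -> str:
    """Assemble a prompt asking the model to imitate how `name` wrote,
    based on saved messages (emails, texts, voice transcripts, etc.)."""
    examples = "\n".join(f"- {m}" for m in sample_messages)
    return (
        f"You are imitating the writing style of {name}.\n"
        f"Here are examples of how {name} wrote:\n{examples}\n"
        "Reply to the user in that style."
    )


def generate_reply(persona_prompt: str, user_message: str) -> str:
    # Placeholder: a real system would send the persona prompt plus the
    # user's message to a large language model and return its completion.
    raise NotImplementedError


if __name__ == "__main__":
    prompt = build_persona_prompt(
        "Alex",
        ["Running late, save me a seat!", "Miss you already. Call me tonight?"],
    )
    print(prompt)
```

However such a system is implemented, the key point is the same: the bot does not retrieve anything the person actually said in the new conversation; it generates fresh text in their style.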

Humans have used technology to cope with feelings of loss for more than a century. Post-mortem photographs, for example, gave 19th-century Victorians a likeness of their dead to remember them by when they could not afford a painted portrait. Recent studies have shown that having a drawing or picture as a keepsake helps some survivors to grieve. More recently, people have listened to old voicemails and watched videos of the departed, turning these replays into rituals that maintain a connection with the person they lost.

Some health and technology experts warn about grief bots. They believe the bots can trap mourners in endless online conversations, making it difficult for them to move on with their lives, and that users may form strong emotional ties to virtual personas and grow dependent on them for emotional support.

Given the uncertainty about how such chatbots will affect the grieving process, social scientists need to study how people actually respond to this new technology.

So why is it considered healthy to look at pictures or videos of a loved one, but frowned upon to chat with a grief bot? The answer, I think, is clear: reality. Let's compare past technologies with grief bots.

Old photographs and videos are genuine records of a moment in time; they are real traces the person left behind. Grief bots, by contrast, create a synthetic version of the deceased based on that digital footprint.

Images and videos show a specific moment in the past, like that summer you went to the beach together. Grief bots use past data to generate new content; the output may feel more lifelike, but it is fundamentally not real and never will be.

Another criticism of grief bots is that they prevent closure. Continuing a chat with the deceased may prolong grieving, blurring the line between connection and closure. Dr. Pauline Boss, a professor emerita in the Department of Family Social Science at the University of Minnesota, coined the term "ambiguous loss" in the 1970s. She explained that people we love can be physically gone but psychologically present, or the opposite. Ambiguity complicates our ability to achieve closure and move forward in life.

Even if a dead bot initially serves as a therapeutic aid, things can go wrong. Ethicists at Cambridge's Leverhulme Centre for the Future of Intelligence examined hypothetical scenarios likely to emerge in the fast-growing digital afterlife industry. They warn that the bots could be used to advertise products from beyond the grave and could distress children by insisting that a dead parent is still "with you." The bots could even spam surviving family and friends with reminders and updates about the services they offer, a scenario the researchers describe as being "stalked by the dead."

So far, there isn't much evidence or research on grief bots, and it's uncertain how they will affect the way we deal with loss. Usage data is not publicly available, and for now few mourners use the technology. Very little is known about the potential short-term and long-term consequences of using digital simulations of the dead.

Still, whatever comfort a bereaved person may or may not find in a grief bot, it is worth remembering that when you speak to a chatbot, you are talking to an AI predicting the next word, not to your loved one.