This article first appeared in Volume 54, Issue 3 of our print edition of Index on Censorship, titled Truth, trust and tricksters: Free expression in the age of AI, published on 30 September 2025.
“Don’t speak ill of the dead” is an aphorism that dates back centuries, but what if the dead speak ill of you? Over the past few years there has been a rise in chatbots trained on the social media posts and other data of the deceased. These griefbots are deepfakes designed to simulate the likeness and personality of someone after their death, as though they have been brought back as ghosts.
The concept of the griefbot is not new. Our narratives around AI span centuries, and stories about creating an artificial version of a lost loved one can be found in Greek mythology: Laodameia, for example, distraught at losing her husband Protesilaus during the Battle of Troy, commissioned an exact likeness of him. (It did not end well: she was caught in bed with it. Her father, fearing she was prolonging her grief, burned the wax replica of her husband, and Laodameia killed herself to be with Protesilaus.)
Elsewhere, as US academic Alexis Elder has explored, there are precursors to griefbots in classical Chinese philosophy. The Confucian philosopher Xunzi, writing in the third century BCE, described a ritual in which the deceased was deliberately impersonated through roleplay, allowing loved ones the chance to engage with them once more.
These days, sci-fi likes to surface our contemporary fears, and TV shows have featured notable storylines warning of the pitfalls of resurrecting our loved ones via technology. In the 2013 Black Mirror episode Be Right Back, a grieving woman uses an AI service to talk with her recently deceased partner, desperate for communication that is ultimately doomed to be illusory.
Grief tech hit the headlines in 2020 when US rapper Kanye West gave his then-wife, Kim Kardashian, a birthday hologram of her dead father.
“Kanye got me the most thoughtful gift of a lifetime,” she wrote on social media. “It is so lifelike and we watched it over and over.”
West likely steered the script, as might have been obvious when the hologram told Kardashian she had married “the most, most, most, most, most genius man in the whole world – Kanye West”.
While the broader public perception of griefbots is often one of distaste and concern, those who have engaged with the digital echoes of a lost loved one have been surprisingly positive. When we lose someone we love, we do what we can to fix in place our concept of them. We remember and we memorialise: keepsakes and pictures, speaking their names and telling their stories. Having them with us again through technology is compelling. A Guardian newspaper article in 2023 reported users’ sense of comfort and closure when engaging with chatbots of their dead relatives.
“It’s like a friend bringing me comfort,” said one user.
With a potentially huge new market – grief is universal, after all – come the start-ups. Alongside general tools like ChatGPT are dedicated software products. The US-based HereAfterAI, which bills itself as a “memory app”, allows users to record their thoughts, upload photos and grant their loved ones access to that content. South Korean company DeepBrain AI claims it can build you an avatar of your dead loved one from just a single photo and a 10-second recording of their voice.
Current technology offers us the “could we?” – but what about the “should we?” In their 2023 paper Governing Ghostbots, Edina Harbinja, Lilian Edwards and Marisa McVey flagged a major problem: that of consent.
“In addition to the harms of emotional dependence, abusive communications and deception for commercial purposes, it is worth considering if there is potential harm to the deceased’s antemortem persona,” they wrote.
If we have some ownership of our data when alive, then should we have similar rights after our death? Creating an avatar of someone who is no longer around to approve it means we are literally putting words in their mouth. Those words might be based on sentences they’ve typed and videos they’ve made, but these have been mediated through machine learning, generating only an approximation of an existence.
There is, of course, the potential that a desire for a sanitised reminder of the deceased means their words are only permitted to be palatable. AI chatbots are subject to the same content moderation – and potential censorship – as the large language models (LLMs) that drive them. Views could be watered down and ideologies reconfigured. There is no true freedom of speech in the literal sense, and no way to object to its absence. The dead have no redress.
Conversely, what if posthumous avatars are built for political influence? In India in 2024, a deepfake avatar of a woman who had died more than a decade previously – the daughter of the founder of the Tamil Tigers – appeared in a video urging Tamils to fight for freedom. And in the USA, the parents of Joaquin Oliver, killed in a school shooting in Florida in 2018, created an AI version of their son to speak to journalists and members of Congress to push for gun reform. In both cases, the griefbot technology did not exist when these people died; they could not have known this would happen, let alone consented to it.
Whether we like it or not, most of us will live on digitally when we die. Our presence is already out there in the form of data – all the social media we’ve ever posted, all the photos and videos of us online, our transaction histories, our digital footprints. Right now, there is a lack of clear governance. Digital rights vary dramatically from jurisdiction to jurisdiction, and AI regulation is in its infancy. Only the EU and China currently have explicit AI legislation in place; moves are afoot in other countries, including the USA and UK, but nothing is yet in statute. Amidst all of this, global tech companies get to set the agenda. For now, all we have is the hope that we can set our own personal boundaries for posthumous expression before our grief becomes someone else’s commodity.