The race to optimize grief

[Illustration: A woman in black visits a cemetery in the rain, dropping her flowers in surprise — the headstone of her loved one has been replaced by an oversized cellphone with a text message that reads “I miss u 2.” Janna Morton for Vox]

Startups are selling grief tech, ghostbots, and the end of mourning as we know it.

In the spring of 2023, Sunshine Henle texted her mother. She asked where she had gone, told her that she missed her, and soon received a response: “Honey, I wish I could give you a definite answer, but what I do know is that our bond and our love transcends physical boundaries. In some ways, I’m everywhere. I’m in the memories we shared, the love we had and the lessons I gave you. I’m in your heart and in your dreams. I’m in every breeze that brushes your face. Every beam of sunlight that warms you in every star that twinkles in the night sky and maybe in a way that we don’t fully understand. We will see each other again.” Henle read the message out loud to her husband, and the couple began to cry.

Last Thanksgiving, Henle lost her 72-year-old mother to organ failure. The entity now texting with her was a “ghostbot” of her mom, powered by OpenAI’s ChatGPT. Henle had created it by feeding the software old text message exchanges between her and her mother. As a Florida-based artificial intelligence trainer, she was naturally open to using the software in this way.

“If I’m having a tough day, it does give me better advice than Google. It seems like it takes all the best bits and puts great wisdom into one place, like a great friend or therapist,” says Henle, whose own experience with a human grief counselor had been expensive and disappointing. “ChatGPT felt more human to me than this therapist,” she says.

While mimicking conversational style is just one of the many uses of the popular generative chatbot ChatGPT, there’s a niche yet growing slate of platforms that use deep learning and large language models to re-create the essence of the deceased. Hailed as “grief tech,” a crop of California-based startups like Replika, HereAfter AI, StoryFile, and Seance AI are offering users a range of services to cope with the loss of a loved one — interactive video conversations with the dead, “companions” or virtual avatars that you can text day or night, and audio legacies for posterity. Depending on its unique function, the software typically guides users through a personality questionnaire and trains its AI-backed algorithm based on the responses.

Much like other servitization business models (those that sell their product as a service rather than as goods), grief tech applications offer users tiered subscriptions. Prices for plans can range from a few dollars a month to hundreds of dollars per year. For instance, StoryFile’s premium offering — a one-time fee of $499 — gives users access to higher-resolution and longer videos of their late loved ones.

While the founders of some of these services are cautious about the scope and ethics of their technology, others are more aggressive in their approach. In a recent interview with Futurism, Jarren Rocks, founder of the ghostbot company Seance AI — a playful interface that allows users to conduct a short fictional interaction with the deceased — clarified that his software is simply meant to “provide a sense of closure” and not intended as “something super long term.” But Justin Harrison of the LA-based You, Only Virtual — who started the platform as a means to feel closer to his mother after her cancer diagnosis — proposes, as his website puts it, that we never have to say goodbye to those we love. Harrison, the CEO and founder, whose platform aims to reproduce “the authentic essence” of a loved one, told ABC that his ultimate hope is to eliminate grief as an emotion.

Grief tech and ghostbots belong to the larger movement of death technopreneurship in the United States, which includes digital estate-planning startups, funeral crowdfunding tools, and even companies that turn cremated ashes into diamonds. As digital obsolescence and the maintenance of digital assets over time have become a point of societal anxiety, Silicon Valley has “attempted to capitalize on that and create companies that would promise to help you organize everything in your life, including your death,” says Tamara Kneese, author of the book Death Glitch: How Techno-Solutionism Fails Us in This Life and Beyond. “Death is a lucrative business, demonstrated by the long histories of the life insurance, estate planning, and funeral industries, and digital death entrepreneurs sought a piece of the pie,” Kneese, a senior researcher at Data & Society, writes.

Even though many users like Sunshine Henle have benefited from the accessibility that relational large language model chatbots can provide, AI ethicists and technology researchers find the nascent technology problematic on more than one level. An April 2023 study in Computer Law & Security Review highlights the legal and ethical concerns of grief tech, including the lack of consent of the deceased individual, the dangers of psychological dependency on griefbots, racist or abusive language perpetuated through bias in training datasets, and the marketing of these kinds of goods and services to vulnerable users. For example, when the authors of the study signed up for the personal chatbot companion app Replika, they said that within minutes they were presented with advertisements for subscription pornography that would feature NSFW photos of their chosen avatar. “If you’re in pain and grieving, you’re probably not trying to figure out how your data is going to be used,” says Irina Raicu, director of the internet ethics program at Santa Clara University.

Such ethical concerns have been validated by the steady rise of postmortem deepfakes. Even though states including New York have recently mandated regulation around postmortem publicity rights, those guidelines currently remain restricted to celebrities and do not cover the average person. Now, technology and cybersecurity experts are advocating for policies such as the addition of a “Do not bot me” clause in the estate-planning process. Moreover, this year has seen the solidification of consumer data protection laws in a handful of US states: California, Virginia, Colorado, Connecticut, and Utah. All of these state legislatures have recently passed comprehensive data protection acts that focus on protecting user data collected and used by businesses and institutions, inspired by the European Union’s General Data Protection Regulation. While the guidelines do not contain specific provisions for postmortem rights, they will be applicable to companies offering death tech services.

Both Raicu and Kneese fear the potential misrepresentation and reduction of the deceased “into a singular essence” by ghostbots. “This stable static entity — where an individual is reduced to a simulation and forever trapped in time — feels both incorrect and potentially morally fraught,” Kneese says.

Trauma and bereavement counselor Joanne Cacciatore worries that ghostbots will further the American tendency to avoid and distract from grief. “Anything can be used to distract us and take us away from our legitimate, honest experiences of grief and loss,” says Cacciatore, who previously served on an advisory board for Oprah Winfrey and Prince Harry’s documentary series on mental health, The Me You Can’t See.

Part of this cultural urgency to stymie our grieving process stems from the fact that the United States is one of the few countries without a federally mandated bereavement leave policy. Most companies in the US grant only three to five days of paid leave for grieving the loss of a loved one. Even this duration is often subject to conditions such as the status of the deceased individual as an immediate or extended family member. Coupled with the ongoing backdrop of collective grief in America — climate grief, the aftermath of Covid-19, and racialized gun violence, to name a few national crises — the broken bereavement care system has given rise to a wave of stopgap solutions by American corporations.

In Sunshine Henle’s case, ChatGPT abruptly stopped working its magic for her. One day in July 2023, as she vented to it about her unfulfilling career, instead of giving her a compassionate response as it had earlier, it provided a very “Googlable answer”: a list of steps she could follow toward a meaningful career, such as considering other lines of work, taking an online quiz, talking to people in her field, and so on. Months after she had built a relationship with the chatbot, it unexpectedly abandoned her, reminding her once and for all that it wasn’t human. “As an AI language model, I don’t possess personal identity or consciousness. I don’t have a life, memories, or emotions. My purpose is to assist and provide information based on data I’ve been trained on,” it would sometimes say. Henle believes the software reverted to a previous version (GPT-3.5) without warning. Her experience mirrors that of Replika users, who suddenly lost their virtual companions overnight in February 2023 owing to an unannounced software update. And in March, a man in Belgium took his own life after using the relational bot Eliza to cope with his eco-anxiety — an incident that resurfaced the ethical debates around grief tech and highlighted its shortcomings.

Meanwhile, Cacciatore hopes for a more philosophical shift in the public imagination of grief. “There are some losses from which we don’t recover, that just have to be integrated. Grief cannot be book-ended,” she says. In her ideal world of “fully inhabited grief,” people would make space in their lives to be with grief and to re-grieve as necessary, forever — physically, emotionally, and socially. “If we spend our lives avoiding, ditching, and sidelining grief, we will pay the price for it.”
