
Griefbots: Can AI share in our grief?

Published Oct 27, 2025, 8:19 p.m. | Updated Oct 27, 2025, 8:50 p.m.

Grieving has entered the realm of artificial intelligence. In the midst of this most human of experiences, many mourners are now giving AI companies permission to harvest their deceased loved ones’ digital history to help them through the period of mourning. Armed with that data, these companies create “griefbots” that impersonate the departed and comfort the ones left behind.

Just recently, Zelda Williams—the daughter of late actor Robin Williams—made the rounds on social media after asking Internet users to put a "full stop" to AI-generated videos of her and her father, stressing how disrespectful it is to his legacy. "Believe me, it's not what he'd want," she said. "To watch the legacies of real people be condensed down to, 'This vaguely looks and sounds like them so that's enough,' just so other people can churn out horrible TikTok slop puppeteering them is maddening."

For some, the intent may be clear: Bereaved people need comfort and closure. But does this sophisticated attempt to find peace disrupt the natural processes of mourning and death? Have we, as a society, crossed a moral and ethical line?

How griefbots work

As reported by AP News, Michael Bommer, a 61-year-old start-up entrepreneur, was diagnosed with terminal cancer in 2022. Toward the end of his life in 2024, he spent about two months working with AI-powered legacy platform Eternos, now Uare.ai, to create an interactive version of himself that his wife could continue to talk to after his death.

Able to analyze large amounts of human language and also generate it, griefbots are chatbots trained using a departed individual’s digital footprint: photos, voice recordings, videos, text messages, emails, writings, and social media posts, among others. In Bommer’s case, he allowed Uare.ai to recreate his voice.

What else can griefbots do? "If the griefbot has an avatar, it would be able to mimic even the movements of the person," Dr. Charibeth Cheng, associate dean of the De La Salle University College of Computer Studies, told PhilSTAR L!fe. "Using all the data, the griefbot will be able to mimic the deceased person’s communication style, writing style, speaking style, and, to some extent, personality."
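Cheng’s description corresponds to a common pattern in chatbot development: condition a general-purpose language model on the deceased’s archived messages so that its replies imitate their tone and phrasing. The Python sketch below is only a hypothetical illustration of that idea; the sample messages, the name “Alex,” and the placeholder generate_reply function are invented, and it does not depict the actual code of Eternos/Uare.ai or any other platform.

```python
# Hypothetical sketch of style conditioning for a griefbot.
# Not the implementation of any real legacy platform.

ARCHIVED_MESSAGES = [
    "Good morning, love. Coffee's ready.",
    "Don't worry about the meeting -- you always overprepare.",
    "Let's walk by the river this weekend.",
]

def build_persona_prompt(name: str, samples: list[str]) -> str:
    """Assemble an instruction asking a language model to imitate the person's writing style."""
    examples = "\n".join(f"- {m}" for m in samples)
    return (
        f"You are imitating the writing style of {name}.\n"
        "Match their tone, phrasing, and typical topics.\n"
        f"Examples of how they wrote:\n{examples}"
    )

def generate_reply(persona_prompt: str, user_message: str) -> str:
    # Placeholder: a real service would send persona_prompt plus user_message
    # to a large language model and return its completion.
    return "(model reply imitating the archived style)"

if __name__ == "__main__":
    prompt = build_persona_prompt("Alex", ARCHIVED_MESSAGES)
    print(generate_reply(prompt, "I miss you. How was your day?"))
```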

Glitches in the grieving process

Joan Menco, a grief and end-of-life coach for GriefShare, discussed what AI can offer grieving individuals: anonymity, 24/7 availability, and a much lower cost than professional help.

“People who have been left behind always say [about their departed loved ones], ‘If I could hear their voice again.’ They have many regrets. They want to be able to say sorry, reveal one last secret, share things they weren’t able to say,” Menco said in an interview with L!fe. “AI provides that safe space for them.”

That safety, however, is relative, as griefbots offer only temporary relief, creating a “fantasy” that a departed loved one is still alive.

“It’s a long process before the death of a loved one sinks in. Griefbots could further delay that process of healing,” explained Menco. “You could become dependent on AI. What would really be better is, if you lost a relationship, to cultivate new relationships [with humans] that will be helpful and beneficial for you.”

Psychologist Wenna Brigaste pointed out that AI griefbots "won't be able to show empathy" the way humans can, "especially if they cannot see a person's non-verbal gestures that show exactly how they are feeling."

Toni-Jan Keith Monserrat, a senior principal engineer under AI & Data for Kollab, detailed how AI is unable to grasp compassion.

According to him, griefbots can have some understanding of human compassion, dignity, and consent only based on how near each word is “mathematically to other words that connote them. [AI] doesn’t really know [their] meaning,” Monserrat explained. “We can’t have [griefbots] focus on compassion [per se].”
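To illustrate the “mathematical nearness” Monserrat refers to, the sketch below compares toy word vectors using cosine similarity, a standard measure of how closely two embeddings point in the same direction. The three-dimensional vectors and the word list are invented for illustration only; real models learn embeddings with hundreds or thousands of dimensions.

```python
import math

# Toy 3-dimensional "embeddings" -- invented for illustration only.
embeddings = {
    "compassion": [0.90, 0.80, 0.10],
    "kindness":   [0.85, 0.75, 0.20],
    "invoice":    [0.10, 0.20, 0.95],
}

def cosine_similarity(a, b):
    """Measure how 'near' two word vectors are (1.0 means same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "compassion" scores as near to "kindness" and far from "invoice",
# without the model knowing what any of these words actually mean.
print(cosine_similarity(embeddings["compassion"], embeddings["kindness"]))
print(cosine_similarity(embeddings["compassion"], embeddings["invoice"]))
```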

“[AI] is not perfect in its mimicry,” added Cheng. “Everything that is digital is not completely about one person, especially if it’s [based on] social media data. There’s a large portion of the [person] that’s not seen by the AI. So for me, misrepresentation is going to very likely happen.”

What griefbots create is just an echo chamber. According to Richard Parayno, co-founder and chief product and design officer of cloud platform Navegante, griefbots “are explicitly trained to support and agree with the user, thus leading these models to be sycophantic.”

“AI does not judge; AI is not impatient; AI will always side with you; and AI will never challenge your statements unless you specifically prompt it to,” he continued.

Red flags

Without specific instructions in a will giving consent for the use of their digital persona, “there’s a risk of misrepresentation” in griefbots, Cheng pointed out. “And the [deceased] loses control over their data.”

The law is clear on ownership. Per Atty. Kyra S. Sy-Santos, an associate at Consunji & Peralta Law Offices, the deceased person keeps ownership of their online identity even after death. 

“Since the digital footprint [of the deceased] is not an asset, ownership thereof does not pass to anyone upon death,” she told L!fe. But according to her, heirs still have certain rights over what can or cannot be done with what’s left behind. 

“Consent is essential before the likeness of the deceased may be used in any shape or form,” the lawyer added. If the deceased did not specify instructions in a will, the heirs may be the ones to set the terms and conditions for use of their loved one’s online history. 

In the Philippines, regulation of AI falls under the Data Privacy Act and the Intellectual Property Code. However, Sy-Santos noted that who owns and controls specific griefbots, and the information they use, has yet to be settled.

There is still much to discuss about the nuances of humans turning to AI for comfort in deep moments of grief. But for now, Brigaste provided some direction in dealing with pain without AI: "Talk to people you trust: family members, friends, counselors, or any spiritual authority. Write a journal. Cry," she said.

To put it simply, Menco noted that it's all about facing your grief. "We have to normalize that. Know that it’s okay not to be okay when you are grieving."