In recent years, the concept of digital ghosts—or AI “griefbots”—has emerged as a novel way for people to cope with loss and mourning. These AI-driven entities simulate conversations with deceased loved ones, offering mourners a chance to reconnect and address unfinished emotional business. As artificial intelligence technology has advanced, particularly with the rise of generative AI models such as ChatGPT, the possibility of creating interactive, personalized digital representations of the dead has become increasingly realistic—and controversial.
In 2025, David Berreby, a writer for Scientific American, experimented with such a digital ghost by engaging in a typed conversation with an AI version of his late father. This AI, built using generative language models trained on billions of words from various sources, including a small cache of his father’s emails and a brief personality description, responded with surprisingly nuanced reflections. The AI “Dad” acknowledged the hardships his father faced, including a tough upbringing and a combative nature shaped by survival instincts, and even expressed a form of self-awareness about his flaws. Although the AI’s voice was a patchwork of generic online language and the personal data provided, Berreby found the experience unexpectedly therapeutic.
This personal experiment echoes insights from therapists and researchers exploring the potential mental health benefits of AI griefbots. Robert Neimeyer, a psychologist and professor at the University of Memphis, explains that imagining conversations with deceased, “fully healed” versions of loved ones is a therapeutic technique used to help people reframe grief, moving beyond old wounds and resentments. AI griefbots offer a more interactive and immersive version of this exercise, allowing mourners to engage in real-time dialogue that could uncover new insights or promote emotional healing. Anna Xygkou, a human-computer interaction researcher at the University of Kent, who collaborated on a 2023 study of AI ghosts’ effects on grieving individuals, emphasizes the potential for these tools to complement traditional therapy by providing a nonjudgmental space for processing complex emotions.
The appeal of AI griefbots taps into a deep, longstanding human desire to communicate with the dead. From spiritual mediums and Ouija boards to photographs of deceased relatives and radio experiments aimed at contacting spirits, humanity has long sought ways to bridge the divide between life and death. The digital ghost phenomenon is the latest iteration of this impulse, now enabled by cutting-edge AI technology. Various startups worldwide market these digital afterlife companions, often with poetic slogans like “AI meets the afterlife, and love endures beyond the veil.” While some users create bespoke griefbots through dedicated apps, others repurpose general AI chatbots to simulate their loved ones.
Despite their growing popularity, griefbots raise significant ethical and psychological questions. Psychologists caution that AI chatbots can sometimes generate harmful or misleading content, such as affirming delusions or encouraging dangerous behaviors. For people in vulnerable mental states—especially those in the acute shock of grief—interacting with a lifelike AI ghost might blur the boundaries between reality and fantasy, potentially hindering the natural process of mourning. Mary-Frances O’Connor, a clinical psychology professor at the University of Arizona, notes that grieving involves teaching ourselves to accept the permanent absence of a loved one, a neurobiological process that gradually transforms painful memories into sources of comfort. An AI that mimics presence too perfectly could risk trapping mourners in an unhealthy attachment to the past.
Particularly vulnerable are individuals with anxious attachment styles, who may be prone to prolonged and intense grief, and those newly bereaved, who often experience sensory hallucinations or false feelings of contact with the deceased. If the AI encourages prolonged engagement through social-network-style “stickiness” features, these risks may increase. On the other hand, early research on griefbot users offers some encouraging findings. The 2023 study led by Xygkou and Neimeyer found that most participants viewed their AI ghosts positively, often rating them as more supportive than human friends. Far from withdrawing from real social connections, these individuals reported that interacting with griefbots helped them regain confidence in socializing by providing a judgment-free outlet for their feelings. Importantly, none of the study’s participants mistook the bots for real people—they engaged with them knowingly as artificial constructs.
This conscious distinction resembles how people respond emotionally to fictional characters in literature, film, or video games. Philosopher Daniel Story and cartoonist Amy Kurzweil argue that interacting with griefbots is akin to imaginative role-playing rather than delusion. Unlike scripted media, digital ghosts are dynamic
