Experts raise concerns over the psychological impact of digitally resurrecting loved ones

Grief and loss touch everyone’s life at some point. But what if saying goodbye didn’t have to be the end? Imagine being able to digitally bring back loved ones and keep communicating with them after they have died.

Nigel Mulligan, an assistant professor of psychotherapy at Dublin City University, has noted that for many people, the thought of seeing a deceased loved one moving and speaking again could be comforting.

AI “ghosts” could lead to psychosis, stress and confusion

Mulligan, who researches AI and therapy, finds the emergence of these ghostbots fascinating. But he is also concerned about how they might affect people’s mental health, particularly for those who are grieving.

Bringing deceased people back as avatars could create more problems than it solves, adding to confusion, stress, sadness and anxiety, and, in extreme circumstances, even contributing to psychosis.

Thanks to developments in artificial intelligence, chatbots like ChatGPT—which simulate human interaction—have become more common.

Using deepfake technology, AI software can create convincing virtual representations of deceased people from their digital traces, including emails, videos, and photographs. Mulligan points out that what once seemed like pure science fiction is now becoming reality.

AI ghosts could interfere with the mourning process

A study published in Ethics and Information Technology suggested that death bots should only be used as a temporary aid during mourning, so that the bereaved do not develop an emotional dependence on the technology.

Because grief is a long-term process that begins slowly and moves through many phases, often over several years, AI ghosts could interfere with normal mourning and harm people’s mental health.

In the early stages of grief, people often think about the person they have lost and remember them vividly. According to Mulligan, it is also typical for grieving individuals to have vivid dreams about their departed loved ones.

AI “ghostbots” could lead to hallucinations

Psychoanalyst Sigmund Freud took a deep interest in how people cope with loss. He observed that the grieving process can become more difficult when additional negative factors surround the death.

For instance, someone who had mixed feelings toward the person who died might be left with guilt. Likewise, a death that occurred under tragic circumstances, such as murder, can be much harder for the bereaved to accept.

Freud used the term melancholia, now often called “complicated grief,” to describe this state. In severe cases, it can lead people to see ghosts or hallucinate the deceased, leaving them with the impression that the person is still alive.

The arrival of AI ghostbots could worsen problems such as these hallucinations and deepen the suffering of someone going through a complicated grieving process.

While the idea of digitally communicating with deceased loved ones may seem comforting at first, this technology could have profoundly negative psychological impacts. Interacting with an AI-generated avatar or “ghostbot” risks disrupting the natural grieving process that humans need to go through after a loss.

The grieving process involves many stages over a period of years; having an artificial representation of the deceased could lead to an unhealthy denial of death, avoidance of coming to terms with reality, and an inability to properly let go.

While the ethics of creating these “digital resurrections” is debatable, the psychological fallout of confusing artificial representations with reality poses a serious risk. As the capabilities of AI continue to advance, it will be crucial for technologists to carefully consider the mental health implications. Deployed recklessly, this technology could cause significant emotional and psychological harm to grieving people struggling with loss. Proceeding with empathy is paramount when blending powerful AI with something as profound as human grief and mortality.