AI ‘ghosts’ could pose serious threat to mental health, expert warns


We all experience loss and grief. But imagine never having to say goodbye to your loved ones: imagine being able to recreate them virtually, so you could keep talking to them and ask how they are feeling.

For Kim Kardashian’s fortieth birthday, her then-husband Kanye West gave her a hologram of her late father, Robert Kardashian. Kim Kardashian reportedly reacted with disbelief and joy to her father’s virtual appearance at her birthday party. Seeing a long-dead, dearly missed loved one move and talk again can bring comfort to those left behind.

Bringing a deceased loved one back to life may seem miraculous, and possibly more than a little scary, but what about the impact on our mental health? Are AI ghosts helping or hindering the grieving process?

As a psychotherapist who researches how AI technology can be used to enhance therapeutic interventions, I am intrigued by the advent of ghostbots. But I am also somewhat concerned about the potential impact of this technology on the mental health of its users, especially those who are grieving.

Reviving dead people as avatars has the potential to do more harm than good, creating further confusion, stress, depression, paranoia and, in some cases, psychosis.

Recent advances in artificial intelligence (AI) have led to the creation of ChatGPT and other chatbots that allow users to have sophisticated, human-like conversations.

Using deepfake technology, AI software can create an interactive virtual representation of a deceased person using digital content such as photos, emails and videos.

Only a few years ago, some of these creations were mere themes of science fiction fantasy; now they are a scientific reality.

Help or hindrance?

Digital ghosts can bring comfort to the bereaved by helping them reconnect with lost loved ones. They can give users the chance to say things or ask questions they never had the opportunity to while the deceased was alive.

But ghostbots’ uncanny resemblance to a lost loved one may not be as positive as it seems. Research suggests that deathbots should be used only as temporary bereavement aids, to avoid a potentially harmful emotional dependence on the technology.


AI ghosts can be harmful to people’s mental health by interfering with the grieving process.

Grief takes time, and it has many stages that can unfold over many years. When newly bereaved, people often find thoughts of the deceased occupying their minds. Nostalgic memories surface, and it is common for a grieving person to dream more intensely about the lost loved one.

Psychoanalyst Sigmund Freud was concerned with how humans respond to the experience of loss. He pointed to potential additional difficulties for the bereaved when negativity surrounds a death.

For example, if a person had conflicted feelings towards someone who then died, they may be left with feelings of guilt. Or if the person died in horrific circumstances, such as murder, it can be harder for the grieving person to come to terms with the loss.

Freud called this “melancholia,” but it is also known as “complicated grief.” In some extreme cases, a person may experience apparitions and become delusional, seeing the dead person and believing they are still alive. AI ghostbots could further traumatize someone experiencing complicated grief and exacerbate associated problems such as hallucinations.


Chatbot Horror

There is also a risk that these ghostbots could say harmful things or give bad advice to someone in mourning. Similar generative software, such as ChatGPT-style chatbots, is already widely criticized for misinforming users.

Imagine if the AI technology went rogue and started making inappropriate comments to the user, a situation journalist Kevin Roose experienced in 2023 when the Bing chatbot tried to persuade him to leave his wife. It would be very painful for a son or daughter who recreated a deceased father as an AI ghost to hear that they were not loved or liked, or that they were not their father’s favorite.

Or, in a more extreme scenario, the ghostbot could advise the user to join the deceased in death, or tell them to kill or harm someone. This may sound like the plot of a horror movie, but it is not so far-fetched. In 2023, the UK’s Labour Party outlined a law to prevent the training of AI to incite violence.

The proposal came in response to the case of a man who attempted to assassinate Queen Elizabeth II in 2021 and who was allegedly encouraged by his chatbot girlfriend, with whom he had an “emotional and sexual” relationship.

The creators of ChatGPT currently acknowledge that the software makes mistakes and is not yet completely reliable, as it can fabricate information. Who knows how a person’s texts, emails or videos will be interpreted, and what content this AI technology might produce?

In any case, it appears that no matter how far this technology advances, considerable monitoring and human supervision will be required.

Forgetting is healthy

This latest technology speaks volumes about our digital culture of limitless possibilities.

Data can be stored in the cloud indefinitely; everything is recoverable, and nothing is ever truly deleted or destroyed. Forgetting is an important element of healthy grief, but it requires people to find new and meaningful ways to remember the deceased.

Anniversaries play a key role in helping the grieving not only to remember lost loved ones but also to represent loss in new ways. Rituals and symbols mark endings, allowing people to properly remember in order to properly forget.

Nigel Mulligan, Assistant Professor in Psychotherapy, School of Nursing, Psychotherapy and Community Health, Dublin City University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

