
Ghostbots: AI versions of deceased loved ones could be a serious threat to mental health

Dublin, March 15 (The Conversation) We all experience grief when we lose someone. However, imagine that you never had to say goodbye to your loved one. You can bring them back to life, talk to them and find out how they are feeling.

On Kim Kardashian’s fortieth birthday, her then-husband Kanye West gifted her a hologram of her deceased father, Robert Kardashian. Kardashian reportedly expressed disbelief and joy at her father’s virtual presence at her birthday party. Seeing a long-dead, much-missed loved one walking and talking again can bring comfort to a family member grieving their loss.

Reviving a deceased loved one may seem miraculous – and perhaps even a little scary – but what effect does it have on our mental health? Is the reappearance of loved ones through AI a help or a hindrance to the grieving process?

As a psychiatrist researching how AI technology can be used to enhance therapeutic interventions, I am intrigued by the advent of ghostbots. But I am also concerned about the potential effects of this technology on the mental health of those who use it, especially those who are grieving.

Reviving dead people as avatars is likely to do more harm than good, potentially causing confusion, stress, depression, paranoia and, in some cases, psychosis.

Recent developments in artificial intelligence (AI) have led to the creation of ChatGPT and other chatbots that interact with users like humans.

Using deep fake technology, AI software can create an interactive virtual version of a deceased person using digital content such as photographs, emails and videos.

Creations like these were the stuff of science fiction fantasy only a few years ago, but now they are a scientific reality.

Helper or hindrance?

Digital ghosts of deceased relatives can console bereaved people by helping them reconnect with their lost loved one. They may give the user the opportunity to say things, or ask questions, that they never had the chance to while the deceased person was alive.

But a ghostbot’s uncanny resemblance to a lost loved one may not be as positive as it seems. Research suggests that deathbots should only be used as a temporary aid in mourning, to avoid a potentially harmful emotional dependence on the technology.

AI ghosts can be harmful to people’s mental health by interfering with the grieving process.

Grieving a death takes time and involves many different stages that may unfold over many years. Immediately after the death of a loved one, people grieve intensely and think of the deceased person frequently. They revisit old memories, and it is quite common for a grieving person to see their lost loved one again and again in their dreams.

Psychoanalyst Sigmund Freud was concerned with how humans respond to the experience of loss. He explained that there could be potential additional difficulties for mourners if there is negativity surrounding the death.

For example, if a person had ambivalent feelings toward someone and they died, that person may be left with feelings of guilt. Or if someone died under terrible circumstances, such as murder, it may be more difficult for a grieving person to accept.

Freud called this ‘melancholia’, but it can also be called ‘complicated grief’. In some extreme cases, a person may feel haunted and hallucinate that they see the dead person, and may come to believe that he or she is still alive.

AI ghostbots could further traumatise someone experiencing complicated grief and worsen related problems such as hallucinations.

Chatbot horror

There is also a risk that these digital ghosts may say hurtful things or give bad advice to someone who is grieving. Generative AI chatbots such as ChatGPT are already widely criticized for providing false information to users.

Imagine if the AI technology went haywire and started making inappropriate comments to the user – a situation journalist Kevin Roose experienced in 2023, when a Bing chatbot tried to convince him to leave his wife. Consider how distressing it would be if a son or daughter conjured up their dead father as an AI ghost, only to hear that their father did not love them or that they were not his favourite.

Worse still, a ghostbot might suggest to the user that they should join the deceased in death, or that they should kill or harm someone. This may sound like the plot of a horror movie, but it is not that far from reality. In 2023, the UK Labour Party outlined a law to prevent AI from being trained to incite violence.

This was a response to the attempted assassination of the Queen in 2021 by a man who was encouraged by his chatbot girlfriend, with whom he was having an ’emotional and sexual’ relationship.

The creators of ChatGPT currently admit that the software makes mistakes and is still not completely reliable, because it fabricates information. Who knows how this AI technology might interpret a person’s texts, emails or videos, and what content it might generate from them?

In any event, it appears that no matter how far this technology advances, it will require a great deal of monitoring and human supervision.

It’s good to forget

This latest technology says much about our digital culture, in which the possibilities seem limitless.

Data can be stored indefinitely in the cloud, where everything is recoverable and nothing is ever truly deleted or destroyed. Forgetting is an important element of healthy grief, but it requires people to find new and meaningful ways to remember the deceased person.

Anniversaries play an important role in helping those who are grieving not only to remember lost loved ones, but also to process their grief in new ways. Rituals and symbols mark the end of something, and they help people remember properly in order, eventually, to forget.
