
AI that makes the dead speak: ‘For the sake of mourning, it’s really not a good idea’


New tools that use artificial intelligence to simulate conversations with the dead could be detrimental to mourning a loved one.

At least this is the opinion shared by psychologist Paul Langevin in an interview with TVA Nouvelles.

“I asked myself a lot of questions and called some colleagues. We all agree that, for the sake of mourning, it’s really not a good idea,” says Mr. Langevin.

Software such as Forever Voices already allows users to simulate conversations with deceased people, Steve Jobs among them.

The prospect of downloading the voice of a loved one who has died troubles the psychologist.

“In the grieving process, there is a letting go that needs to happen,” he explains. “If you believe you are still in contact, that you are truly talking to someone, because you are in pain and vulnerable, there will certainly be difficulties, because that person will find it hard to move on to the next stage and get on with their life.”

Psychologists already observe this phenomenon when a person repeatedly watches videos of someone who has died.

He continues, “We have to tell them to put it away, or to ask a loved one to hold on to the DVD or USB key until they are in a better state to watch or listen to it.”

While it is acceptable to speak to a deceased person, it is the response that AI makes possible that becomes problematic.

“In the grieving process, it’s healthy to talk to people who have died, but they don’t respond, so it’s a connection with oneself,” says the psychologist. “Here we’re talking about something else. The person responds, and what they respond with can be made to say anything.”


Malicious actors make the psychologist fear the worst.

“We are vulnerable […] So what scares me are people with bad intentions who could make anyone say anything with AI. That’s the danger.”

Other uses of AI, including simulating romantic relationships, also worry Langevin.

“If someone is not able to live their life normally and ends up falling in love with a robot, or, taken to the extreme, a likeness of an ex-husband is put on the robot, then it will prevent them from moving forward,” he says.
