AI simulations of dead people risk “unwanted digital hauntings”, researchers have warned.
A new study by ethicists at Cambridge University found that AI chatbots – known as deadbots – capable of simulating the personalities of people who have passed away should require safety protocols in order to protect surviving friends and relatives. Some chatbot companies already offer customers the option to simulate the language and personality traits of a deceased loved one using artificial intelligence. Ethicists from Cambridge’s Leverhulme Centre for the Future of Intelligence say such ventures are “high risk” due to the psychological impact they can have on people.
“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating.” The findings were published in the journal Philosophy and Technology in a study titled ‘Griefbots, Deadbots, Postmortem Avatars: on Responsible Applications of Generative AI in the Digital Afterlife Industry’.
originally posted by: andy06shake
It might bring some form of happiness to those left behind by the departed.
I don't know how deadbots can imitate dead people ...
I don’t think this would be helpful at all; quite the opposite, actually.
Technology taken to its final degree is indistinguishable from magic.
We are finite: not meant to last forever, not meant to interact forever, not meant to be compared to everyone alive, not meant to be compared to the dead.
Just my hot take on this F’d planet.