DUBLIN: "You look like you’ve seen a ghost." The expression has typically been restricted to addressing someone who comes across as scared or shocked.
But advances in artificial intelligence (AI) technology could give it a wider use: People face the prospect of being "digitally reincarnated" as chatbots or in so-called deepfakes after they die, even if they never agreed to this while alive.
Such "ghostbots" could "attempt to recreate the appearance, voice and/or personality of dead people," either as chatbots, holograms or in deepfake videos, according to law academics at Queen’s University Belfast (QUB), Aston Law School and Newcastle University Law School.
"While it is not thought that ghostbots could cause physical harm, the likelihood is that they could cause emotional distress and economic harm, particularly impacting upon the deceased’s loved ones and heirs," the legal scholars said.
The experts are now calling for a "do not bot me" clause to be added to wills to help make sure people don't end up haunting family and friends as AI-generated zombies.
Long before the recent advances in AI, such as ChatGPT, genealogy site MyHeritage in 2021 launched a service called DeepNostalgia, which allows users to animate photographs of late relatives, the researchers pointed out.
They warn that the digital trails people leave behind while alive - on social media in particular - could make them easy posthumous pickings for more advanced AI-driven reincarnation, not least because the law does not look likely to catch up with the technology anytime soon.
While a proposed European Union code for AI appears to cover ghostbots, for now they "lie at the intersection of many different areas of law, such as privacy and property," according to Marisa McVey of QUB.
"There remains a lack of protection for the deceased’s personality, privacy, or dignity after death," McVey said, warning that "in the absence of specific legislation in the UK and further afield, it’s unclear who might have the power to bring back our digital persona after we die." – dpa