Belfast Live
Claudia Savage

Ghostbots' danger needs to be addressed, Queen's researchers warn

Belfast-based researchers have warned of the dangers of people’s digital personas being brought back to life as ‘ghostbots’ after they have died.

A lack of data protection laws in the UK could lead to a growing problem of AI ‘ghostbots’ created from people’s digital footprints, new research from Queen’s University Belfast has suggested.

‘Ghostbot’ is a term used to describe what happens when artificial intelligence is used to create digital reincarnations of the dead.


The technologies used to create ghostbots include chatbots, deepfakes and holograms that can replicate the voice, face and even personality of a dead person using data from social media.

As the concept of digital reincarnation moves into the mainstream, celebrities have begun to showcase the capabilities of such technology. In 2020, for example, Kanye West gifted Kim Kardashian a hologram of her late father, Robert Kardashian, created using deepfake technology, which used his likeness and spoke in his voice.

A research study titled Governing Ghostbots, from Queen’s University Belfast, Aston Law School and Newcastle University Law School, has suggested that greater societal awareness of ‘ghostbots’ and a ‘do not bot me’ clause in wills and other contracts could prevent people from being digitally reincarnated without permission.

The research looked at potential legal avenues to protect privacy (including post-mortem privacy), property, personal data, and reputation.

Dr Marisa McVey from the School of Law at Queen’s University Belfast said there was a lack of protection for people’s privacy or dignity after death.

“Ghostbots lie at the intersection of many different areas of law, such as privacy and property, and yet there remains a lack of protection for the deceased’s personality, privacy, or dignity after death,” she said.

“Furthermore, in the UK, privacy and data protection laws do not extend to heirs after death. While it is not thought that ‘ghostbots’ could cause physical harm, the likelihood is that they could cause emotional distress and economic harm, particularly impacting upon the deceased’s loved ones and heirs.

“Currently, in the absence of specific legislation in the UK and further afield, it’s unclear who might have the power to bring back our digital persona after we die.”

In the US and EU there is increasing momentum to legislate on who has ownership of a person’s digital identity; the EU AI Act, for example, requires greater transparency around deepfakes and chatbots.

Dr McVey has suggested that in addition to more formal legislation, an increased understanding of the phenomenon of ‘ghostbots’ could help people to protect their data.

“In the absence of legislation in the UK, one way to protect our post-mortem selves might be through the drafting of a legally binding ‘do not bot me’ clause that could be inserted into wills and other contracts while people are still alive,” she said.

“This, combined with a global searchable database of such requests, may prove a useful solution to some of the concerns raised by ‘ghostbots’.

“We also suggest that in addition to legal protections, greater societal awareness of the phenomenon of ‘ghostbots’, education on digital legacies and cohesive protection across different jurisdictions is crucial to ensure that this does not happen without our permission.”

The research was part of the Leverhulme Trust-funded project Modern Technologies, Privacy Law and the Dead.
