The Guardian - UK
Dan Milmo, Global technology editor

UK families call for easier access to deceased children’s social media history

In September, a coroner ruled that Molly Russell ‘died from an act of self-harm while suffering from depression and the negative effects of online content’. Photograph: Family handout/PA

Bereaved families are calling for easier access to the social media histories of deceased children, supporting amendments to the online safety bill.

The changes have been proposed by Beeban Kidron, a crossbench peer, as the bill returns to parliament on Monday. They are supported by the family of Molly Russell, a 14-year-old who took her own life in 2017 after months of viewing harmful online content related to suicide, depression, self-harm and anxiety.

Molly’s family spent years seeking access to information about their daughter’s social media accounts, including Instagram. Instagram’s owner, Meta, released more than 1,200 posts that Molly had engaged with on the platform – including some of the most distressing videos and posts that she interacted with – less than a month before the inquest started.

“The experience of living through Molly’s prolonged inquest is something that no family should have to endure,” said Ian Russell, Molly’s father. “There is a dire need for managing this process to make it more straightforward, more compassionate and more efficient. We can no longer leave bereaved families and coroners at the mercy of social media companies.”

In September, a coroner ruled that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content”, in a ruling described by campaigners as a global first and a “big tobacco moment” for social media.

The amendments proposed by Kidron, which would also require changes to the Coroners and Justice Act 2009, would put a duty on Ofcom, the communications regulator, to act as a point of contact between a bereaved family and a tech company. They would also require coroners to consider whether a tech platform holds information about the circumstances in which a child died. A further amendment would require tech firms to preserve information from the moment a notice is served and to send a senior manager to any inquest when ordered to testify.

Kidron said families suffered “agony” trying to uncover what their children had been looking at in the days and weeks leading up to their deaths. The amendments will be tabled when the bill, which imposes a duty of care on tech firms to protect children from harmful content, enters the House of Lords.

She added: “These amendments would create a swift, humane route for families and coroners to access data. For the sake of bereaved families now and in the future, I urge the government to adopt them. Denying them this right is simply inhumane.”

Alongside the Russell family, the changes are supported by the family of Frankie Thomas, a 15-year-old who killed herself after months of viewing graphic content about suicide and self-harm; the family of Olly Stephens, 13, who was murdered after a dispute on social media; the mother of Sophie Parkinson, 13, who took her own life after viewing harmful material online; and Lorin LaFave, whose 14-year-old son, Breck Bednar, was groomed and murdered by someone he met online.

The Department for Digital, Culture, Media and Sport is expected to consider Kidron’s proposed changes.

The bill returned to the House of Commons on Monday, with Labour warning that the removal of its provisions on “legal but harmful” content – offensive material that does not constitute a criminal offence – could lead to the proliferation of the type of content posted on Twitter by the US rapper Ye, formerly known as Kanye West. At the weekend Ye was suspended from Twitter for tweeting an image of a swastika blended with a Star of David.

Commenting on Ye’s tweet, the shadow culture minister, Alex Davies-Jones, said: “It is absolutely abhorrent and should never be online. But sadly that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the government’s swathes of changes to this bill, meaning that will be allowed to be seen by everybody.”

Under changes to the bill unveiled last week, platforms will be required to enforce their own terms and conditions for users. If those terms explicitly prohibit content that falls below the threshold of criminality, such as some forms of abuse, Ofcom will have the power to ensure they are enforced adequately.

Paul Scully, the culture minister, said the bill was “not a silver bullet” for dealing with online harm.

“This has to be worked through, with government acting, with media platforms acting, with social media acting, with parents having their role in terms of children within this as well,” he added.

“And it will evolve but first of all we need to, as I say, get back to the fundamental thing that social media platforms are not geared up, frankly, to enforce their own terms and conditions.”
