The Guardian - UK
Technology
Shanti Das, Home affairs correspondent

Ofcom accused of ‘excluding’ bereaved parents from online safety consultation

Floral tributes to 13-year-old Oliver Stephens, who was murdered after a dispute on social media. Photograph: Geoff Swaine/REX/Shutterstock

Bereaved parents and abuse survivors who have endured years of “preventable, life-changing harm” linked to social media say they have been denied a voice in official discussions about holding tech firms to account.

Mariano Janin, whose daughter Mia, 14, killed herself after online bullying, and the parents of Oliver Stephens, 13, who was murdered after a dispute on social media, are among those who have accused Ofcom of excluding them from a consultation process for tackling online harms.

Ian Russell, whose daughter Molly, 14, killed herself after viewing self-harm content on Instagram and Pinterest, and parents who believe their children’s deaths were linked to viral social media challenges, also signed an open letter criticising the regulator, backed by 20 campaigners with lived experience.

The families, together with survivors of online grooming and abuse, say that Ofcom has so far failed to properly engage with them, despite them having “deeply valuable insights” into the “devastating effects of online harm”.

Their frustration is focused on a consultation process over Ofcom’s approach to protecting people from illegal harms online, which they say included more than 2,000 pages of “incredibly technical and inaccessible material”.

While people could respond individually, they say there was “little to no proactive effort nor means provided for those with lived experience to understand or respond to the proposals, even after raising our concerns to Ofcom”, which they said had “the effect of being exclusionary”.

In a letter sent last week to the regulator’s chief executive, Melanie Dawes, they said this had led Ofcom to draw up a draft set of proposals that were “too weak to address the scale and magnitude of the online harms facing children”.

They believe the focus is too much on “merely requiring platforms to test the systems” they “already know first-hand do not work”, on “isolated parts of a user journey rather than overarching user experience”, and on taking down illegal material after it is detected – rather than on more preventive measures.

The criticism of Ofcom comes before another consultation linked to online safety, launching this month, which will examine the growing trend of children as young as five engaging with the internet. Under the Online Safety Act, Ofcom is responsible for overseeing tech companies and has the power to fine those breaking the law.

Ofcom denied excluding people from its consultations and said the lived experiences of bereaved families and survivors of online abuse were “invaluable in shaping our online safety policy work”. It said it had heard from a “wide range of voices” including victims’ groups and bereaved families, and spoken to 15,000 children and 7,000 parents through a research programme. It offered to meet the signatories of the letter during consultation over its next proposals. “We strongly agree that ongoing dialogue and a collaborative approach is in everyone’s best interests,” a spokesperson said.

The letter’s coordinator, Frida, an online abuse survivor, said meetings and research were welcome but that so far the approach to engaging survivors had been “piecemeal”. She said: “I believe that many people with lived experience of online harm have been shut out.”

The signatories – including Lisa Kenevan, who believes her son Isaac died after taking part in a “choke challenge” on social media, and Liam Walsh, who has campaigned for TikTok to allow him access to his late daughter’s social media activity after she viewed self-harm videos – are calling for Ofcom to reopen the consultation on illegal harms, which ended in February, and to launch “targeted engagement with victims and survivors”. They also want a named person at Ofcom responsible for engaging with them.

Janin, whose daughter Mia killed herself in 2021 after being bullied – including by pupils who shared one of her TikTok posts to a Snapchat group chat, where they made fun of her – questioned whether Ofcom was “fit for purpose”.

He said: “If they really want to change things, I think they need to survey all the information that they can gather from families, schools. They need all of that information. We need to be involved.”

He called for tougher action to ensure that social media companies were held to account over failings that had affected him and the other bereaved families. “My case is irreversible. Nothing will bring Mia back. But we see a new case every week. Why are we treating this in such a polite way?

“Before you get your driving licence, you need to prove you’ll be a good, safe driver and not harm anyone else. Why do normal citizens have all these rules when social media companies don’t?”

Janin concluded: “It’s the wrong way around: we should put the bar high and they should have to prove they can run their platforms without risk.”
