Marie Claire
Mischa Anouk Smith

Is the Online Safety Act enough to keep children safe?

Ian Russell, the father of Molly Russell, who took her own life at just 14 after viewing graphic posts about suicide and self-harm on social media, has warned the next government that the Online Safety Act is not a “job done”.

Speaking to the PA news agency, Russell agreed that the act has “really important” foundations but said more work is needed to “keep on top” of the ever-changing tech landscape.

Just two weeks after Russell received an MBE from Prince William at Windsor Castle for his online safety campaigning, he called on the next government to uphold ongoing internet safety regulations and commit to tackling the graphic content children are still accessing. The bereaved father has warned that not staying abreast of online safety would be a “disaster”.

It is an opinion shared by Esther Ghey, whose daughter Brianna was murdered by two teenagers who had been accessing graphic videos of violence and torture on the dark web.

According to research from the Online Safety Data Initiative, 80% of children (aged 12-15) have had potentially harmful experiences online. Russell and Ghey met earlier this year and jointly called on regulators to “have courage” and “step further and faster forward” to hold tech companies accountable for the content on their platforms.

Ghey launched her own petition calling for tougher laws that would make phone companies more responsible for children’s online welfare.

The Online Safety Act passed into law last October and requires tech companies like Meta, TikTok, Snap, X, and Discord to protect children from explicit content. Ofcom was also given extra enforcement powers as part of the law, but many child safety campaigners have argued that the act isn’t enough to protect children from harmful content.

Speaking to PA, Russell explained that the act should be seen as “a constantly evolving thing.” He has implored the next government to see the Online Safety Act as something that will never be complete because of the nature of technology and the speed at which it moves. “This isn’t finished. They need to complete the work and need to work out how to keep on top of it,” he proclaimed.

What is the Online Safety Act?

The Online Safety Act became law on 26th October 2023 and, in essence, makes Big Tech more responsible for the content on its platforms.

Social media platforms have to prove they’re committed to removing illegal content, including:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • illegal immigration and people smuggling
  • promoting or facilitating suicide
  • promoting self-harm
  • animal cruelty
  • selling illegal drugs or weapons
  • terrorism

Platforms are also now responsible for shielding children from harmful but legal content.

Failure to do so can result in significant fines and potential penalties.

Cyber-flashing and sharing “deepfake” pornography are also offences under the act.

The act also introduced measures that allow bereaved parents to access information about their children from tech firms.

Is the Online Safety Act enough to protect children?

Many, including Emily Clarkson, who campaigned for greater online protections, have cautioned against resting on the act’s “laurels”. Speaking to Marie Claire UK earlier this year, Clarkson said, “This is a much more powerful place to continue being vigilant and continue regulating and continue the hard work in this space”. It’s an opinion shared by Esther Ghey, whose petition, which Marie Claire UK backed, aims to make phone companies even more accountable for the safety of users under 16.

Tech is always evolving, and keeping ahead of changes requires constant work. New research shared with us by Vodafone UK reveals that AI algorithms are turning harmless searches into harmful content, which Nicki Lyons, Vodafone UK’s Chief Corporate Affairs & Sustainability Officer, says is gradually desensitising Britain’s tween and teen boys to the negative views they are seeing. Key findings show that, on average, boys aged 11-14 are exposed to harmful content within 30 minutes of being online, and one in 10 are seeing it in as little as 60 seconds.

More shockingly, 59% of boys are accessing this content through innocent and unrelated searches, which, in turn, unintentionally fuels online algorithms. As is so often the case, young girls and women are being disproportionately affected, with 81% of teachers interviewed saying that the rise of the “aggro-rithm” is negatively impacting female students.

Despite the act, WhatsApp, which previously threatened to withdraw from the UK if the act was passed, has lowered its minimum age limit in the UK from 16 to 13.

“I think the Online Safety Act is going to go a long way in keeping children safe online,” says Jess Chalmers, an online child safety expert, social media professional, and self-described “mum to two tech-loving children.” Despite the act, Chalmers says she’s “passionate that parents have a duty of care to educate themselves about the dangers and harms of being online so that they can guide their children to have a safer and more positive experience online.” Research from TalkTalk reveals that 45% of parents see family safety online as their biggest concern.

As online violence, damaging ideologies and explicit content become increasingly widespread and rampant, young women and girls are at ever-greater risk.

Findings from Communia, a female-founded social network, show that over a third (35%) of women surveyed have been victims of online abuse. The same report reveals that 3 in 10 (30%) respondents felt failed by tech platforms’ responses to the abuse they reported.

The act might at last be law, but the action needed to protect our children is far from over.
