The Guardian - UK
Politics
Sally Weale Education correspondent

Almost half of children in England have seen harmful content online – survey

Children are being exposed to pornography, sexualised and violent imagery and anonymous trolling, as well as content featuring self-harm and suicide, according to the survey. Photograph: Dominic Lipinski/PA

The children’s commissioner for England has said she fears there could be a repeat of the Molly Russell tragedy, after research showed almost half of children have seen harmful content online, including material promoting self-harm and suicide.

The research published on Thursday by Dame Rachel de Souza found that 45% of children aged eight to 17 have come across material they felt was inappropriate or made them worried or upset, though half of them did not report it.

As well as content featuring self-harm and suicide, the survey found children are being exposed to pornography, sexualised and violent imagery, anonymous trolling, and content featuring images of diet restriction.

A coroner is expected to deliver his findings shortly on the death of Molly Russell, 14, from Harrow in north-west London, who killed herself after viewing extensive amounts of online content related to suicide, depression, self-harm and anxiety.

“This content shouldn’t be available to children of this age and the tech companies should be making sure it’s taken down,” De Souza told the Telegraph. Asked if there was still a risk of another child dying like Molly, she said: “I think it absolutely could and can happen.”

In the foreword to the report, Digital Childhoods, De Souza said self-regulation by tech companies had failed. “Girls as young as nine told my team about strategies they employ when strangers ask for their home address online. In a room of 15- and 16-year-olds, three-quarters had been sent a video of a beheading.”

She said children rarely sought out this content. “It is promoted and offered up to them by highly complex recommendation algorithms, which are designed to capture and retain their attention. When harmful content is reported to platforms, children tell me that little is done in response.”

Children were most likely to report having experienced anonymous trolling, which was most prevalent on Twitter and Facebook. Sexualised and violent or gory content was the next most frequently seen, “occurring with highest prevalence on TikTok and YouTube respectively”, the report said.

The survey, which polled 2,005 children aged eight to 17 and their parents, found that children eligible for free school meals were significantly more likely to see every type of harmful content (54% v 40% of children not eligible). Overall, boys were more likely to see it than girls.

Only half of children who had seen harmful content reported it to the platform they saw it on. Of those who did not report it, 40% said they did not feel there was any point, while of those who did report harmful content, a quarter saw no action taken.

The report also confirmed that a large number of children are active on social media and messaging platforms before the recommended age, with two-thirds of eight- to 12-year-olds using social media. Seven out of 10 children who took part in the survey agreed that social media platforms should enforce minimum age requirements, a view also held by 90% of their parents.

The survey looked into parents’ anxieties about their children’s lives. Two-thirds (67%) said they were concerned about the content their children were viewing online, with parents of younger children more likely to be worried. Three-quarters (75%) of parents who took part said they used tools to monitor and restrict what their children see and do online.

De Souza said the inquest into Molly’s death was “a harrowing reminder” of the impact of harmful content on young minds.

“Self-regulation by tech companies has failed; the experiences of so many children are testament to that,” she said. “Yet we have an enormous opportunity to right these wrongs through the online safety bill and careful regulation by Ofcom. It is vital that the bill keeps children’s safety at its heart.”

  • In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is at 800-273-8255 or chat for support. You can also text HOME to 741741 to connect with a crisis text line counsellor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org.
