The Guardian - UK
Business
Dan Milmo

Young people must report harmful online content, says UK watchdog

The online safety bill is expected to become law by the end of the year. Photograph: Ian Allenden/Alamy

Young people should report harmful online content, the communications watchdog has said, after finding that two-thirds have encountered potential harms on social media but only one in six report it.

Ofcom found that 67% of people aged between 13 and 24 had seen potentially harmful content online, although only 17% had reported it. The regulator is charged with enforcing measures in the forthcoming online safety bill, which will require social media companies to protect children and adults from online harms.

The most common potential harm encountered online was offensive or bad language (28%), according to respondents in Ofcom’s Online Nation 2022 report, followed by: misinformation (23%); scams, fraud and phishing (22%); unwelcome friend or follow requests (21%); and trolling (17%). A further 14% had experienced bullying, abusive behaviour and threats online.

Ofcom is launching a campaign with TikTok influencer Lewis Leigh, who rose to fame during lockdown by posting videos of himself teaching dance moves to his grandmother. The “Only Nans” campaign will encourage young people to report harmful content they see on social media.

The campaign is also supported by Jo Hemmings, a behavioural psychologist. She said: “People react very differently when they see something harmful in real life – reporting it to the police or asking for help from a friend, parent or guardian – but often take very little action when they see the same thing in the virtual world.”

TikTok removed more than 85m pieces of content in the final three months of last year, with nearly 5% of that total flagged by users. Instagram removed more than 43m pieces of content over the same period, of which more than 6% came from users reporting or flagging content.

Anna-Sophie Harling, online safety principal at Ofcom, said: “Our campaign is designed to empower young people to report harmful content when they see it, and we stand ready to hold tech firms to account on how effectively they respond.”

The online safety bill is expected to become law by the end of the year. Ofcom will have the power to impose fines of £18m or 10% of a company’s global turnover for breaches of the act, which imposes a duty of care on tech firms to protect people from harmful user-generated content. One of the specific mandates in the bill is ensuring that children are not exposed to harmful or inappropriate content.

Andy Burrows, head of child safety online policy at the NSPCC, which has called for a strengthening of the bill, said: “This report lays bare how young people are at increased risk of coming across harmful content but feel unsupported on social media and either do not know how to report it or feel platforms simply won’t take action when they do.”
