Dangerous suicide and self-harm content will remain online despite the introduction of the Online Safety Act because of “big gaps around high risk content”, Samaritans has said.
The charity’s policy and influencing lead said online safety regulator Ofcom is currently choosing to “ignore” advice from safety groups around how it decides which websites are in scope of the new rules.
Jacqui Morrissey told the PA news agency that Ofcom has advised the Government to use the number of site users as the criterion for subjecting platforms to the strictest measures of the Act – which would require them to hide or remove such content even for users aged over 18 – but said this approach would leave a number of “small, but very high risk platforms” outside these toughest rules.
The charity has raised its concerns to coincide with Safer Internet Day (February 11).
“The Online Safety Act includes really important provisions within it that enable government to subject small but very high risk platforms to the strictest measures,” Ms Morrissey told PA.
“We are extremely disappointed that Ofcom have chosen to ignore our advice and the advice of many organisations in making the criteria for the strictest measures focused on the number of users.
“We know the Secretary of State has chosen to accept Ofcom’s advice, so basically, this means dangerous suicide and self-harm content is going to continue to be accessible to anyone over the age of 18.
“Turning 18 doesn’t make you any less vulnerable to this content. Lots of research shows the part that the internet has played in suicides.”
Ms Morrissey highlighted studies which had found suicide-related internet use in 26% of deaths among those under 20 and in 13% of deaths among 20-to-24-year-olds, and warned that the Government and Ofcom were “missing this vital opportunity”.
“These are all people who are over the age of 18 who are going to continue to come into contact with this very dangerous, very harmful content, which isn’t going to be covered by the Act at the moment,” she said.
“So we’re missing this really vital opportunity – Government and Ofcom have both said they recognise the concerns around small but very high risk services and they’ve said the decision on which services have to meet these highest measures needs to be based on evidence.
“Our concern is, why are they not listening to this evidence?
“We’ve heard all too often the role that online forums can play in someone taking their own life. We know coroners have clearly linked deaths to a particular site.
“So how much more evidence is needed to recognise this level of harm being caused on a site, and to ensure that it is subject to the strongest measures within the Act?
“We’re saying it is crucial that Government and Ofcom fully harness the power of the Online Safety Act to ensure people are protected from this dangerous content and to make sure that implementation is going to be effective as soon as possible.”
Ms Morrissey added that Samaritans believes that the level of risk – for example, if either Ofcom or a coroner “reasonably link one or more deaths” to a particular site – could be a suitable approach for deciding whether a platform should be placed in the highest category of the rules.
An Ofcom spokesperson said: “From next month, all platforms in scope of the Online Safety Act – including small but risky services – will have to start taking action to protect people of all ages from illegal content, including illegal suicide and self-harm material.
“We won’t hesitate to use our robust enforcement powers against sites that don’t comply with these duties – which could include applying to a court to block them in the UK in serious cases.
“Additional duties such as consistently applying terms of service will be a powerful tool in making larger platforms safer in due course.
“But they would do little to tackle the harm done by smaller, riskier sites – as many of them permit harmful content that is legal.”
A Government spokesperson said: “Suicide devastates families and social media companies have a clear responsibility to keep people safe on their platforms.
“We are determined to keep people safe online by swiftly implementing the Online Safety Act which will tightly regulate smaller platforms that spread hate and suicide content.
“Platforms will need to proactively remove illegal suicide material and, if accessed by children, protect them from harmful material – regardless of whether or not they are a categorised service.
“We expect Ofcom to use the full range of its powers – including fining and seeking court approval to block access to sites – if these platforms fail to comply.”
Ms Morrissey added that the charity was also “concerned” by the recent “rolling back” of certain moderation and fact-checking tools by some platforms, most notably Meta, which last month said it would replace third-party fact checkers in the US with user-generated community notes and would also loosen content moderation on some topics.
“We would hope that what the Online Safety Act does is create a minimum standard, but we would hope that platforms go beyond this and implement best practice,” she said.
“We know the Act isn’t going to cover lots of kinds of dangerous content for over-18s, but platforms can choose to do that – they can choose to make their platforms as safe as possible for their users.
“So we are concerned that we’re perhaps seeing some rolling back of progress with platforms recently around keeping people safe online. That puts us in, I think, a more precarious situation than we were a year ago.
“The internet can be such a positive place. It can be a really important space for people who are feeling suicidal to access really helpful information, to be able to talk to people who might be experiencing similar things in a safe and supportive environment.
“So tech platforms can be at the forefront of creating these safe spaces to enable safe conversations to be happening.”
Samaritans is available on 116 123 or at www.samaritans.org/how-we-can-help/contact-samaritan/