
The Metropolitan Police recorded 2,778 child sexual abuse image offences in London last year — more than seven every day.
NSPCC chief executive Chris Sherwood said the figures were “deeply alarming” and called on the government to toughen up safeguarding measures in the Online Safety Act.
Home Office figures showed 38,685 child sexual abuse image offences were logged by police forces across the UK last year — more than 100 a day on average.
In a separate FoI request, the NSPCC found that of the offences where police forces recorded the platform used by perpetrators, exactly half took place on Snapchat.
Around 11% of incidents were recorded on Instagram, 7% on Facebook and 6% on WhatsApp.
Snapchat, in particular, can be a dangerous platform for children because its messages and images disappear after a short time, which may encourage the exchange of inappropriate content.
Data from police forces indicates private messaging sites are involved in more crimes than any other type of platform.
Perpetrators exploit the secrecy offered by these spaces to harm children and go undetected.

Ofcom is in charge of implementing the new Online Safety Act, but charities like the NSPCC have said its recent code of practice contains a loophole.
“Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place,” Mr Sherwood explained.
“This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act.”
Mr Sherwood said it was an “outrage” that in 2025, there is still a “blatant disregard from tech companies to prevent this illegal content from proliferating on their sites.”
He called on the government and social media platforms to take a stronger stance against child sexual abuse, reiterating “there can be no excuse for inaction or delay.”
Kellie Ann Fitzgerald, NSPCC Assistant Director for London and the South East, said: “It is very disturbing that thousands of child sexual abuse image crimes are being recorded by the Metropolitan Police in London, knowing as we do the harm and distress these offences cause for our children.
“Much of this illegal material is repeatedly shared and viewed online, and it is outrageous that in 2025 we are still seeing tech companies blatantly disregard taking action to prevent this illegal content from flourishing on their sites.
“Tech bosses were let off the hook from creating robust child protection measures by having separate rules for private messaging services. This has enabled crimes to proliferate on online platforms in spite of the 2023 Online Safety Act.
“The Government must set out, without delay, how they will take a bold stand against abuse on private messaging services and hold tech companies accountable for keeping children in London and across the country safe, even if it requires changes to the platform’s design.”
Last year, Childline delivered 903 counselling sessions to children and young people relating to blackmail or threats to expose or share sexual images online. This was a 7% increase compared to the previous year.
One 13-year-old girl said: “I sent nude pics and videos to a stranger I met on Snapchat. I think he’s in his thirties. I don’t know what to do next. I told him I didn’t want to send him any more pictures and he started threatening me, telling me that he’ll post the pictures online.
“I’m feeling really angry with myself and lonely. I would like support from my friends, but I don’t want to talk to them about it as I’m worried about being judged.”
A spokesperson for Snapchat said: “The sexual exploitation of any member of our community is illegal and horrific. If we are made aware of such content, whether through our proactive detection efforts or confidential in-app reporting, we remove it, lock the violating account, and report to authorities.
“Snapchat is designed to make it difficult for predators to find and interact with young people and has extra safeguards in place to help prevent strangers from connecting with teens.
“Our Family Centre also allows parents to see who their teens are friends with and talking to on Snapchat. We work with expert NGOs and industry peers to jointly attack these problems and don’t believe the methodology used in this report reflects the seriousness of our collective commitment and efforts.”