The Guardian - UK
Politics
Dan Milmo

One in five child abuse images found online last year were category A – report

One content analyst at the IWF said child sexual abuse content was being treated as a ‘commodity’ on some sites. Photograph: Dominic Lipinski/PA

The most extreme form of child sexual abuse material accounted for a fifth of such content found online last year, according to a report.

Category A abuse represented 20% of illegal images discovered online last year by the Internet Watch Foundation (IWF), a UK-based body that monitors the distribution of child sexual abuse material (CSAM). It found more than 51,000 instances of such content, which covers the most severe imagery, including rape, sadism and bestiality.

The IWF’s annual report said the 2022 total for category A imagery was double the 2020 figure, an increase driven partly by criminal sites selling videos and images of such abuse.

“We have seen criminals looking to exploit more and more insidious ways to profit from the abuse of children,” said Susie Hargreaves, the chief executive of the IWF. “I don’t think I can overstate the harm being done here. These are real children, and the suffering inflicted on them is unimaginable.

“They are being raped, and subjected to sexual torture, and criminals are making money off the back of that. It is truly appalling.”

The IWF said the number of webpages dedicated to making money off CSAM had more than doubled since 2020, with nearly 29,000 such pages identified last year. One content analyst at the IWF said child sexual abuse content was being treated as a “commodity” on some sites.

The IWF said it took action over more than 250,000 webpages last year, an increase of 1% on 2021, with three-quarters of them containing self-generated imagery, where the victim is manipulated into recording their own abuse before it is shared online.

The NSPCC, a child protection charity, said the figures were “incredibly concerning”. It said the government must also update the online safety bill to ensure senior managers were held liable for the presence of CSAM on their sites.

Under the bill, tech executives face the threat of a two-year jail term for failing to protect children from online harm. However, the provision covers content such as material promoting self-harm and eating disorders, and does not extend to CSAM.

The security minister, Tom Tugendhat, said companies using heavily encrypted services – which mean only the sender and recipient can see the content – must build in safety features that help detect abuse.

“Companies need to ensure that features such as end-to-end encryption have the necessary safety features built in so that they do not blind themselves to abuse occurring on their platforms,” he said.

The encrypted messaging service WhatsApp said last month that it would leave the UK rather than accept weakened encryption. WhatsApp and other encrypted services are concerned by provisions in the bill that could force them to apply content moderation policies that would amount to circumventing end-to-end encryption.

  • The NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331.
