Paedophiles are using artificial intelligence (AI) to create and sell photorealistic images of child sexual abuse online on an “industrial scale”, a new report has revealed.
People are buying the material by paying subscriptions to accounts on mainstream content-sharing website Patreon, according to the BBC.
Those behind the illicit market in child sexual abuse content are using software called Stable Diffusion to create the images. The tool generates images from simple text instructions, having been trained on vast numbers of pictures found online.
Stable Diffusion is among several newcomers to the AI scene that have been embraced by the AI art community for their ease of use and laxer controls compared with rival programs.
Another text-to-image tool, Midjourney, was also allegedly being used to create a large volume of sexualised content depicting children, a separate report found.
Content created with Stable Diffusion includes life-like images of child sexual abuse, among them depictions of the rape of babies and toddlers.
It is illegal to take, create, share or possess indecent images and pseudo-photographs of people under 18 in the UK. The Government defines a pseudo-photograph as an image, made by computer graphics or otherwise, which appears to be a photograph.
The images in question were reportedly promoted on Pixiv, a popular Japanese social media platform for anime and manga art, before being sold on the US-based platform Patreon.
Accounts on Patreon allegedly offered AI-generated obscene images of children for sale, with pricing tiers based on the type of material requested. In one case, the images were being sold for the equivalent of £6.50 per month.
Patreon said it had a “zero-tolerance” policy banning content that depicts sexual themes involving minors. It added that AI-generated harmful material was on the rise online, and that it was increasingly identifying and removing it.
The company behind the software, Stability AI, said that several versions of the tool had been released with restrictions written into the code that prohibit certain types of content.
It added that it “prohibits any misuse for illegal or immoral purposes across our platforms, and our policies are clear that this includes CSAM (child sexual abuse material)”.
Pixiv said it had banned all photorealistic depictions of sexual content involving minors last month, and had reinforced its monitoring systems.