The Guardian - AU
National
Josh Taylor

AFP calls on public to donate childhood photos in bid to combat child abuse with AI

The My Pictures Matter project needs thousands of images from adults of themselves as children to train an AI system. Photograph: Image Source/Alamy

The Australian federal police want the public to donate their childhood photos to an artificial intelligence project aimed at helping save children from abuse.

The project, run by AFP and Monash University, will help detect child abuse material on the dark web, or on devices that have been seized during criminal investigations.

The system is trained to recognise potential images of children. It is then combined with algorithms trained to detect sexual or violent material, and any matches flagged by the combined system can be reviewed further.

The My Pictures Matter project needs at least 10,000 images from adults of themselves as children – such as school portraits – to train the system, but so far only about 1,000 have been donated.

Dr Nina Lewis, the head of the project and an ethics specialist, said there were strict controls over the use of the images, and people can withdraw consent at any time.

“The dataset itself is not a police dataset. It is owned, stored and managed by Monash University,” she said.

There are measures in place to ensure the photos aren’t used for anything beyond the project, and police must seek permission before using them for any other purpose.

Lewis said she understood people’s reluctance, but was hopeful the campaign would encourage those who do feel comfortable to share their images.

“[Child sexual abuse is] something that people don’t want to hear about. They don’t want to talk about it. And that’s part of the problem. It lets predators hide and get away with things.”

Lewis said there was no possibility the AFP could acquire the dataset once the project was completed, but the agency could end up using the algorithms developed by the project in future operations.

An AFP spokesperson said My Pictures Matter was not related to any automated identification projects, such as that of the US technology company Clearview AI.

“The use of this AI program will be targeted,” they said. “It cannot do widespread scraping of the internet or dark web. The technology being developed does not directly access data, it analyses what it is given.”

Lizzie O’Shea, chair of Digital Rights Watch, said it was good to see an approach to training an AI tool that doesn’t involve stolen data, but said she would want to be assured there were extremely robust safeguards on data security and processes to protect against unauthorised access.

“I would be very cautious about the possibility of such technology being used proactively in surveillance, rather than for evidence gathering purposes in legal processes that require a warrant,” she said.

“There would need to be limits to avoid over-reliance on automated systems that can have the capacity to erode the presumption of innocence.”

The push comes as the AFP continues to face scrutiny over its use of AI. In July, Crikey reported AFP officials secretly met with Clearview AI employees just months after the privacy commissioner took the agency to task for breaching Australians’ privacy in using the technology without conducting a privacy impact assessment.

Unlike the My Pictures Matter project, Clearview AI did not seek the consent of the people whose images make up the 30bn items it scraped from the internet for its system.

The extent to which the AFP is already using AI more broadly is unclear, and the agency has rebuffed freedom of information requests related to its use of the technology, arguing disclosure would reveal its methodology.

The AFP recently cracked the case of the man accused of being Australia’s worst serial paedophile by identifying bedsheets in abuse images and linking them to the childcare centre where the alleged perpetrator worked.

The AFP spokesperson said the AFP does not use Clearview, and “has not made any recommendations to the Commonwealth to allow the use of the technology”.

  • In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800; adult survivors can seek help at Blue Knot Foundation on 1300 657 380. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. Other sources of help can be found at Child Helplines International
