Evening Standard
Laura Sharman

Apple to scan iPhones for child sex abuse images

File photo (Picture: AFP via Getty Images)

Apple has announced details of a new safety tool designed to identify child sexual abuse material (CSAM) on users’ phones.

The new technology will allow Apple to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.

Apple will also launch a new feature in the Messages app which will warn children and their parents using linked family accounts when sexually explicit photos are sent or received.

The tool will warn a child when they are sent a sexually explicit photo, blurring the image, reassuring them that it is OK if they do not want to view it, and presenting them with helpful resources.

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user’s photo album.

Instead, the system will look for matches, securely on the device, based on a database of “hashes” - a type of digital fingerprint - of known CSAM images provided by child safety organisations.

This matching will only take place when a user attempts to upload an image to their iCloud Photo Library.

Apple said that if the threshold for harmful content was exceeded, the user would be reported to law enforcement agencies.
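To illustrate the general idea of threshold-based hash matching, here is a minimal sketch in Python. It is purely illustrative: the names (compute_hash, KNOWN_CSAM_HASHES, MATCH_THRESHOLD) and the threshold value are hypothetical, a cryptographic hash stands in for Apple's perceptual NeuralHash, and none of this reflects the company's actual on-device implementation.

```python
import hashlib

# Hypothetical database of "hashes" (digital fingerprints) of known CSAM images,
# as would be supplied by child safety organisations. Placeholder values only.
KNOWN_CSAM_HASHES = {
    "3f786850e387550fdab836ed7e6dc881de23001b",
    "89e6c98d92887913cadf06b2adb97f26cde4849b",
}

# Example reporting threshold; the real value is not stated in this article.
MATCH_THRESHOLD = 30


def compute_hash(image_bytes: bytes) -> str:
    """Stand-in for the on-device fingerprinting step.

    An ordinary cryptographic hash is used only to keep the sketch runnable;
    a real system would use a perceptual hash that tolerates resizing or re-encoding.
    """
    return hashlib.sha1(image_bytes).hexdigest()


def count_matches(pending_uploads: list[bytes]) -> int:
    """Count how many images queued for iCloud Photos match the known-hash database."""
    return sum(compute_hash(img) in KNOWN_CSAM_HASHES for img in pending_uploads)


def should_report(total_matches: int) -> bool:
    """Only accounts whose cumulative matches exceed the threshold are flagged."""
    return total_matches >= MATCH_THRESHOLD
```

In the system Apple describes, this matching happens on the device when a photo is uploaded, and only the aggregate result, not the photos themselves, determines whether an account crosses the reporting threshold.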

The new tools are set to be introduced later this year as part of the iOS 15 and iPadOS 15 software updates due in the autumn. They will initially be available in the US only, with plans to expand further over time.

The company reiterated that the new CSAM detection tools would only apply to those using iCloud Photos and would not allow the firm or anyone else to scan the images on a user’s camera roll.

The scanning of phones for CSAM has been welcomed by child protection groups, but privacy campaigners are concerned the software could be abused by authoritarian governments to spy on citizens.

Matthew Green, a security researcher at Johns Hopkins University, told the BBC: “Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.

“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”
