Australia’s online safety regulator is not swayed by Apple’s justification for abandoning the development of technology to scan its cloud storage services for images of child abuse, saying a standard to force the company to develop it is on the way.
Last week, Wired reported on a letter Apple had sent to a child safety group called Heat Initiative in response to the group’s calls for Apple to scan its iCloud photos to combat child abuse. The letter outlined in detail, for the first time, Apple’s concerns about how the technology could be misused.
The technology would have allowed Apple to scan images before they were uploaded to its iCloud service and check them against a database of known child abuse imagery. If there was a match, the image would have been reviewed by Apple staff before being passed on to authorities and the user’s account disabled.
Meta, Microsoft and Google already operate similar scanning systems, but Apple ultimately decided against the technology in December, focusing instead on on-device warnings for children attempting to send or receive explicit images.
The decision was widely criticised by child safety groups and regulators, including Australia’s eSafety commissioner, Julie Inman Grant, who described it at the time as “a major setback”.
In the letter published by Wired, Erik Neuenschwander, Apple’s director of user privacy and child safety, explained the company’s reasons for abandoning the project, saying it had concluded “it was not practically possible to implement [the technology] without ultimately imperiling the security and privacy of our users”.
“Scanning every user’s privately stored iCloud content would in our estimation pose serious unintended consequences for our users,” he said.
He said it would create new threat vectors for data thieves to find and exploit, and would open the door for mass surveillance.
“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories,” he said.
“How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution? Tools of mass surveillance have widespread negative implications for freedom of speech and, by extension, democracy as a whole.”
In May, Inman Grant declined to register two industry codes, including one covering cloud storage services, on the grounds that they contained no requirement to detect and prevent the distribution of child abuse or pro-terror material. Her office is now developing a mandatory standard that will apply to Apple and other cloud storage providers operating in Australia, and which will likely require technology similar to that which Apple had been developing.
Asked this week whether Apple’s letter would have any bearing on the standards going ahead, a spokesperson for the eSafety commissioner indicated they were still in development.
“Like all online platforms, once the industry codes and standards are in place, Apple will be required to take appropriate measures to address the risk of class 1 material including child sexual abuse material on their services in Australia,” the spokesperson said.
Draft standards will be published for public consultation. The spokesperson did not confirm whether Apple had made similar representations to eSafety, but said the commissioner had been consulting industry throughout the process.
Apple was contacted for comment.