Australia has unveiled regulations requiring internet search engines to crack down on child sexual abuse material created by artificial intelligence.
The online safety code announced on Friday will require services such as Google, Bing, DuckDuckGo and Yahoo to take “appropriate steps” to prevent the spread of child exploitation material, including “synthetic” images created by AI.
The announcement comes after the eSafety commissioner delayed the implementation of an earlier version of the code in June after Microsoft and Google introduced AI functionality for their internet search engine services.
“The use of generative AI has grown so quickly that I think it’s caught the whole world off guard to a certain degree,” eSafety Commissioner Julie Inman Grant said in a statement.
Inman Grant said the code was a “great example” of how regulators and tech firms could work together to make the internet safer.
“When the biggest players in the industry announced they would integrate generative AI into their search functions we had a draft code that was clearly no longer fit for purpose and could not deliver the community protections we required and expected,” she said.
“We asked the industry to have another go at drafting the code to meet those expectations and I want to commend them for delivering a code that will protect the safety of all Australians who use their products.”
Earlier this year, the BBC reported that paedophiles were using the AI image generator Stable Diffusion to create and sell lifelike child sexual abuse material on content-sharing sites such as Patreon.
Australia’s eSafety commissioner is currently working on drafting two new codes to regulate online storage services such as iCloud and OneDrive, and private messaging services, respectively.
Efforts by governments to increase oversight of cloud and messaging services to combat child exploitation have prompted pushback from the tech industry and privacy advocates.
WhatsApp and Signal have threatened to pull out of the United Kingdom if it passes the Online Safety Bill, which would require platforms to scan for child sexual abuse material.
Tech firms and civil libertarians say the law would compel platforms to scrap end-to-end encryption, putting the privacy of all users at risk.
In June, Australia’s eSafety commissioner announced codes to regulate social media, internet carriage services, app distribution services, hosting services and equipment providers.
Breaches of the codes are subject to civil penalties.