Easily accessible image-generation phone apps and websites have little to no safeguards to stop users creating fake porn without the consent of the people pictured, and little to no age verification of users.
Such platforms allow users to upload photos of any person and either superimpose their faces on pornographic images and videos or generate raunchy alterations of a supplied picture.
Some applications block graphic content, while others offer explicit filters to alter bodies, adding latex outfits and lingerie or removing clothes completely.
The internet can be a dangerous place, especially for children, "where they may be targeted by predators and bullies and exposed to horrendous abuse and graphic violence through social media", eSafety Commissioner Julie Inman Grant said.
A quarter of reports about image-based abuse, which includes actions such as revenge porn, were from people younger than 18.
There has been an almost 250 per cent increase in complaints about intimate photos and videos being shared without consent in the past year, compared with the same period four years ago.
Reports of "sextortion", in which sexual images or nudes are used to extort a vulnerable person, tripled between 2017 and 2023.
Men younger than 24 were particularly targeted, the commissioner said.
"We've all seen and heard of the tragic stories of how quickly sexual extortion can start in a teenager's bedroom for instance, through an initial contact, and escalate to a child taking their life within a four-hour period," Ms Inman Grant said.
"That's how distressing it is."
But the commissioner is powerless in many instances because "we currently don't have any specific tools in the Online Safety Act to deal with sexual extortion", Ms Inman Grant said.
"Platforms have a huge role to play here and my view is that they're not doing enough," she said.
There was also "a huge disincentive" for app stores to try to shut down graphic content, she said, as owners usually received a fee for each transaction.
"Think about the force multiplier of de-platforming an app and (what) that would mean to their revenue," Ms Inman Grant said.
New regulations are set to designate services that use AI to create porn or "nudify" photos as "high-risk generative AI services" subject to requirements such as having effective terms of service in place.
This includes guardrails to stop the generation of some content, including child porn, eSafety's regulation general manager Toby Dagg said.
Financial penalties will be attached.
However, independent senator David Pocock questioned what was being done to prevent harm in the short term.
"There's a huge sense of urgency," he said.
A person subjected to AI-generated image abuse can refer it to the eSafety commissioner, who can issue a take-down order backed by the threat of civil penalties, Mr Dagg said.
Porn will be the primary focus of an online age verification trial, but the technology won't be deployed on actual platforms.
It would instead include an "independent assessment of their technology" to determine what was effective, assistant secretary of the communications department's online safety branch Bridget Gannon said.
The trial has no time limit attached, and the verification technology to be assessed has not yet been selected.
It was important the trial was not rushed, Ms Gannon said, so it did not produce technology that failed to work or gave parents a false sense of security.
There were also privacy and data security issues, especially regarding children, and concerns the technology could push people into less regulated spaces such as the dark web.
Lifeline 13 11 14
beyondblue 1300 22 4636
Kids Helpline 1800 55 1800 (for people aged 5 to 25)
Fullstop Australia 1800 385 578