A broadcaster's inappropriate use of AI-altered imagery of a member of parliament has been labelled a wake-up call, raising ethical questions for news organisations.
Victorian Animal Justice Party MP Georgie Purcell was shocked to see a story on Nine News on Monday about the state's decision to keep allowing duck hunting using a photo of her with enlarged breasts and her dress turned into a crop top.
Nine apologised and blamed the imagery on an automation function in Adobe's Photoshop program during the resizing of a photo that was sourced online.
The program has an artificial intelligence (AI) function that can realistically fill, edit or remove elements from an image.
However, Adobe said humans would have been involved in the process.
"Changes to this specific image would have required human intervention and approval," Adobe said in a statement.
Ms Purcell said the incident raised questions about the use of AI.
"It's quite confronting seeing your own body altered on the nightly news after one of the worst days of work that you have ever had, and I'm used to the ... sexist mistreatment of women in politics," Ms Purcell told ABC News Breakfast on Wednesday.
She said she accepted Nine's apology because she was satisfied with how the broadcaster had otherwise handled the matter.
"(But) I'm not sure I buy the AI excuse," Ms Purcell said.
"I have deep concerns that AI is moving at a pace that our laws can't keep up with and I think we need to seriously consider looking at them."
Industry and Science Minister Ed Husic told AAP the federal government was working towards standards on the safe and responsible use of AI.
"This is a wake-up call for the way that businesses may use AI," he said.
"The expectation is there, business needs to do better.
"A lot of businesses use AI well, this is a case where it hasn't."
Earlier in January, Mr Husic revealed the government's interim response to generative AI, which flagged voluntary safeguards for high-risk critical industries such as water and electricity, health and law enforcement.
Journalism produced without human oversight raises ethical concerns, and the rapid development of the technology has challenged news organisations, according to senior RMIT University media lecturer TJ Thomson.
He noted the Media, Entertainment & Arts Alliance's code of ethics states journalists must present accurate pictures and disclose any manipulation likely to mislead.
"News organisations around the world are scrambling to develop policies to respond to generative AI and are actively seeking best-practice guidance on how to use these tools," Dr Thomson said.
He said publishers and audiences need to have confidence in news and that responsible use of the technology is particularly important in 2024 because many countries will hold elections.
It's not the first time the way media companies use machine-generated images has been in the spotlight.
The Australian Financial Review purposely used AI-generated portraits in September, with mixed results: actor Margot Robbie was pictured with deformed hands.
Fake sexually explicit images of celebrities are also increasingly appearing online.
Social media company X this week temporarily banned users from searching for images of Taylor Swift due to the spread of deepfake pornography.
Altering images is not a new concept; however, in the past "airbrushing" was more closely associated with models, Monash University data scientist Geoffrey Webb explained.
"Older technologies have been used to put people's heads on other people's bodies and all kinds of things, it's just that this technology can do it more effectively," Professor Webb said.