The legal battle between Australia’s online safety regulator and Elon Musk’s X is shaping up as the first true test of the power governments can wield over tech platforms, not just within Australia but globally.
There have been skirmishes between Musk and the eSafety commissioner, Julie Inman Grant, before. But tensions rose in the aftermath of the stabbing of bishop Mar Mari Emmanuel on 15 April while he was giving a livestreamed service at the Assyrian Christ the Good Shepherd church in the Sydney suburb of Wakeley.
The following day, X was ordered to remove 65 tweets containing the video of the stabbing attack. When it chose to hide the posts from Australian users only, the commissioner launched an urgent court case seeking an injunction. The federal court then ordered X to hide posts from users globally, pending a hearing on 10 May.
If the order is upheld, X could face fines upwards of A$15m – up to $782,500 for each day the tweets have been kept online since the order was made. But the case represents a bigger test of the regulator’s ability to force multinational tech companies to adhere to Australian law.
“This is really the first real test of the powers of the eSafety commissioner,” the spokesperson for the Australian Lawyers Alliance, Greg Barns, says. “The powers are quite substantial in the sense that she can order the taking down of material, with the capacity for fairly substantial daily fines.
“However, I think also the limitation is going to be tested because the difficulty is in making orders which purport to have a global effect requires the cooperation of other countries.”
‘X poses a challenge’
The idea of an eSafety commissioner was first floated in 2013, when the conservative Coalition government came into power, as a way to tackle the online bullying of children.
Not long after its launch in 2015, eSafety’s powers were boosted to include image-based abuse and, with the passage of the Online Safety Act in 2021, adult bullying.
At the time, critics warned that giving eSafety additional powers to regulate content according to Australia’s classification code – something that was written in the time of VCRs and is still under review – could have wide ramifications for freedom of speech and what people can view online.
But, for the most part, eSafety’s use of power has not raised much controversy.
Of the 33,000 reports the office received regarding potentially illegal content in the last financial year, the eSafety commissioner passed on about half as informal requests for the platforms to remove the URLs. According to the commissioner’s office, 99% of those related to child sexual abuse material. Only three formal notices relating to violent content were issued that year.
The regulator has also used its more graduated powers to have URLs removed from search results, but it has never used its power to have an app removed from app stores.
Most of the time, the platforms comply or eSafety does not pursue the matter further. At least, that was the case until Musk took over X at the end of 2022.
The chair of Digital Rights Watch, Lizzie O’Shea, says eSafety was largely dependent upon cooperation with tech platforms, but that has shifted.
“The recent dispute with X poses a challenge as the company has not only resisted cooperation but is now challenging the basis for these laws, demonstrating that they appear to have little concern for maintaining a social licence to operate, at least insofar as regulators are concerned,” she says.
“The eSafety commissioner has a difficult and important job, but her powers are also limited and have to be balanced against other concerns, whether they be practical or human rights ones.”
eSafety and X are now involved in at least three legal cases in Australia over notices issued to the company. The case over the stabbing attack notice is slated to be heard first, later this month.
‘A legitimate debate’
While the eSafety commissioner’s office argues its use of powers in this case is about “ensuring Australians are protected where possible from extreme violent and other Class 1 material”, setting the video of the stabbing attack up as its hill to die on has raised a few eyebrows.
Emmanuel himself has argued the video should stay online. And ABC’s Media Watch and News Corp columnist Andrew Bolt found a rare moment of unity in arguing the video is not as bad as many others still available online. The opposition leader, Peter Dutton, has argued that seeking a global ban goes too far.
The Australian federal police, in an affidavit to the federal court, said the video could be used to recruit people to terrorist groups or undertake terrorist acts.
Alastair MacGibbon, who was Julie Inman Grant’s predecessor in the role and is now chief strategy officer for CyberCX, says he understands both sides of the argument, but the police describing it as a terrorist act shifts the impact of the video.
“It’s become a legitimate debate about whether or not a potentially inflammatory piece of moving picture should be not circulated because it could well drive others to commit acts of violence against certain parts of our community,” he says.
“That’s what community is about. It is actually about curtailing broader rights sometimes to prevent [violence] against others … They deserve a freedom from violence, just as much as I deserve a freedom to watch a video.”
Barns says the office has to be very judicious in ordering material to be taken down, and it’s likely this particular video came to eSafety’s attention because of the wide media coverage. He points to the WikiLeaks “collateral murder” video as something the world needed to see, along with the footage coming out of Gaza, which has galvanised public opinion.
“It’s a difficult exercise in terms of the exercise of freedom of speech and the right to know on the one hand and, on the other hand, gratuitous acts of violence which can serve no useful purpose – and in fact can become a negative force in society.”
Julia Powles, an associate professor of law and technology at the University of Western Australia, says “the regulator is too focused on take downs of singular pieces of content”.
“To regain public confidence, it needs to take a victim-informed, systemic approach to the structural drivers of online hate and abuse.”
Technical solutions, social problems
On Monday the communications minister, Michelle Rowland, announced the government was seeking feedback on whether the enforcement powers and penalties of the office are fit for purpose, as part of the review of the Online Safety Act.
Two days later she announced $6.5m in additional funding for the eSafety commissioner to trial age-assurance technology for social media and adult websites.
Both O’Shea and Powles say they are concerned about eSafety’s focus on private communications – particularly when it comes to proposed standards for detecting child sexual abuse material and terrorist material in end-to-end encrypted communications.
“There is a heavy focus by government on technical solutions to these problems, when they are often social in nature, and require political leadership,” O’Shea says.
Barns says the government should wait for the outcome of the X case before making any changes to the powers held by the eSafety commissioner.
“It potentially provides an opportunity for the courts to explore the scope of the legislation and the way in which the powers can be used and not used,” he says. “And it might be prudent to wait for a decision which may explore those areas before rushing into further legislation.”