In Scott Morrison’s preferred virtual reality Australia would be emerging from a restorative summer of freedom ready to embrace his government’s efforts to tame online trolls by holding big tech to account.
In this parallel metaverse the only damaging character assessments in our gaze would be the ones delivered over social media platforms, where vulnerable users have become cannon fodder for a business model that privileges outrage over civility.
The government’s determination to keep women and children safe online would be emerging as a political game changer, part of a well-crafted election pitch to keep Australia safe: safe from the trolls, safe from the virus (#missionaccomplished) and safe from China (#dontrisklabor).
The hastily convened parliamentary inquiry into online safety held over the summer would have driven the sleepy holiday news cycle with vivid retellings of online pile-ons, orchestrated harassment and platform nonchalance and neglect.
While not a partisan battle, a war with an industry that is consistently ranked less trusted than even our politicians would have given the prime minister a forward agenda, directly appealing to anxious parents using the same micro-targeting tools that have given the platforms their power.
Strapping on the Oculus, the summer would have been loaded up with front-page empathy bombs as sympathetic victims (girls with eating disorders, religious minorities, female public figures) were pitched against a weakened Facebook, whose market share is plummeting in the face of falling user numbers, tighter regulation and a collapse in social licence.
The Omicron outbreak crashed this program, but this week’s Guardian Essential report shows there is broad support for imposing and enforcing more rules on the way social media platforms operate.
The radically libertarian values underpinning social media – the brazen assertion they are neutral platforms allowing for the free exchange of ideas – have been mugged by reality. On key issues being canvassed in the inquiry relating to platform responsibility and accountability, three-quarters of voters are on board for greater government intervention.
Where there are differences of sentiment it is not in the usual partisan affiliation but between the generations, with older voters significantly more receptive to regulatory intervention and younger people happier to let it all rip.
This is politically relevant for a Coalition whose base skews older: in a separate question the issue of online safety emerges as one more likely to hold current voters and, critically, entice people who voted elsewhere in 2019.
Our PM, the retail politician par excellence, has been circling online safety for some time, influenced by focus groups picking up the community sentiment that “something must be done” and by his own experience with his family.
Online safety was to be his global signature, with a landmark speech to be the centrepiece of his G20 appearance last year before French president Emmanuel Macron’s free character assessment derailed the trip.
But like all leaders, Morrison’s strengths are also his weaknesses; when it comes to big tech the government has been all RAM and no ROM.
An objective look at the Coalition’s record with big tech would find a scattergun of high-profile political interventions: the creation of an e-safety commissioner to deal with sexting and bullying; emergency takedown laws in response to the Christchurch massacre; and the news media bargaining code at the behest of the major media companies.
Wearing my other hat as director of the Australia Institute’s Centre for Responsible Technology, I am a big fan of regulating the platforms. I leaned in behind the Morrison government when it introduced the news media bargaining code in the hope it was the first step in the comprehensive reform agenda that the Australian Competition and Consumer Commission had laid out in its digital platforms inquiry.
But as I argued to the committee, piecemeal “get tough” regulation might be popular but will not bring about the systemic changes required to place public interest parameters around big tech.
Like focusing on the skidmarks of a car that continually careers out of control, we focus too much on the point of impact and not enough on the engine design that keeps blowing up in our faces. We need more than tough action on specific activity: we need systemic reform to change how these companies operate.
As we argued in our submission to the inquiry, much of the thinking has already been done: the 23 other recommendations from the ACCC’s digital platforms inquiry are waiting for action from the government, an equally groundbreaking Human Rights Commission report on the impact of AI is gathering dust on the attorney general’s desk, and reforms to privacy law move at a snail’s pace.
As the Labor party’s Tim Watts points out in the Burning Platforms podcast that the Centre for Responsible Technology hosts, there are nine discrete regulatory inquiries into big tech currently under way, scattered across different portfolios. There are five separate initiatives for age verification currently in train. As Watts says: “This is no way to run a railway.”
Here’s the frustration: the public wants change, and the detailed policy work has been done. The roadmap to better regulation is set: anti-monopoly intervention, safety by design, privacy reform, platform accountability. Any one of these initiatives would be globally significant; developed in a coordinated and sequenced way, they could be transformative.
“Anti-trolling” is a politically convenient catchall that justifies any intervention – trolls are naturally bad people, and bullying behaviour is abhorrent. But it creates an environment where bad behaviour is personalised rather than understood as the natural consequence of current business models that deliberately nudge users to be their worst selves.
The idea of taming big tech by wielding the big legislative stick to crack down on trolls makes a great political meme, but the real work of understanding the models, building corporate accountability and imagining alternative models of public infrastructure may be less likely to go viral but is much more important.
Like the child trapped in a wormhole, consuming unverifiable content, bouncing from darkened discussion boards to narcissistic performative platforms, this government lacks the focus to bring these threads together.
• Peter Lewis will discuss the findings of the week’s Guardian Essential Report at 1pm on Tuesday 8 February – free registration here.