Crikey
Lizzie O'Shea

Let’s get this right and avoid knee-jerk decisions on misogyny and men’s violence against women

The federal government has announced a range of measures for tackling online harms as they relate to misogyny and men’s violence against women. Some of these are sensible in principle, such as the ban on using artificial intelligence to create and share sexually explicit material without consent. A few of the others, however, warrant further scrutiny.

In particular, age assurance regimes have been a source of significant concern for digital rights advocates. There are a number of technological approaches to age assurance, but all require some level of privacy invasion. Some approaches require users to upload government-issued ID; others use biometric data, such as a photograph or a face scan, to infer someone’s age.

Given the sensitive nature of accessing pornography, people would reasonably expect such systems to ensure that data never falls into the wrong hands — a data breach revealing ID documents is bad; a breach linking ID documents to pornography habits would be much worse. It is unclear if there is a provider in the market that can give such robust digital security assurances. Just today, Clubs NSW is contending with a data breach that includes information from its facial verification system.

But of course, the right to privacy is not absolute and this may be a risk worth taking, given it is clear that many people encounter pornography at a young age and often unintentionally. Using a human rights approach, it is possible to conclude that such an imposition may be justified if it is a reasonable, necessary and proportionate means for a legitimate purpose.

Unfortunately, we have not seen compelling evidence that this is the case.

The reality is that young people often access adult content on social media sites, rather than dedicated pornography sites. Mandating age assurance technology for pornography sites therefore misses the mark, but requiring mandatory age verification for all social media sites would be an overreach, risking limits on freedom of expression and association, among other concerns. And for young people who intentionally seek out adult content, the evidence from other jurisdictions (such as several US states where age verification regimes have been established) is that these systems push children and adults alike toward darker, more extreme sites that lack such barriers.

Moreover, the research, as the eSafety Commissioner has repeatedly pointed out, is complex and at times conflicting when it comes to connecting mainstream pornography with gender-based violence. Much more research needs to be done to understand the link, if any, and making assumptions can do more harm than good.

Where we end up, therefore, is reflected in eSafety’s public perceptions research: a broad base of approval for the technology, but low confidence in the successful operationalisation of it, including concerns about privacy and security. And this is in fact where the government landed last August when it elected not to impose a requirement for the use of age assurance technology, including because the market for such technology was immature. In our view, this was a highly sensible decision.

It is not entirely clear what has changed in the intervening eight months, other than an increasingly dicey political climate in a year that has been dominated by horrific headlines of intimate partner violence and violence against women. The desire to do something is no doubt real, but right now it is more important than ever to get it right.

For many years now, we have supported the review of the Privacy Act, and there is much to be gained in this space from the introduction of individual data rights and an update to the definition of personal information. Australian privacy laws are decades out of date, and we agree with the government that reforms of this nature would assist with preventing intimate partner violence, including by imposing limits on consumer tech being used as stalkerware and giving people more control over their personal information.

There is sometimes a tendency among policymakers to rely on regulator-mandated content regulation to address broader cultural problems. The eSafety Commissioner already has extensive powers to mandate industry codes and compel the removal of content (though the scope of these powers is currently being very publicly tested). While there is some utility in responding to emerging technology with an update to these powers, it is also important to acknowledge the limitations of this approach. The commissioner’s job is a difficult and important one, but constantly expanding her mandate cannot be our answer to everything.

We remain concerned about knee-jerk responses to complex problems, which are all too common when it comes to technology. We defer to experts in the field, but note evidence that access to comprehensive sex education makes young people less likely to use porn to educate themselves. Young people in Australia generally do not support national-level filters of pornography, according to researchers, and would prefer school-based or national pornography education campaigns. Restrictions will have a disproportionate and even harmful impact on LGBTQIA+ young people, who are often excluded from the sex education that is on offer. If we want to start a conversation with young people about preventing gendered violence, we might want to start by listening.
