In late November, Australia’s federal parliament passed landmark legislation banning under-16s from accessing social media.
Details remain vague: we don’t have a complete list of which platforms will fall under the legislation, nor a clear picture of how the ban will work in practice. However, the government has signalled that trials of age assurance technologies will be central to its enforcement approach.
Video games and online game platforms are not currently included in Australia’s social media ban. But we can anticipate how enforcing an online ban might (not) work by looking at China’s large-scale use of age verification technologies to restrict young people’s video game consumption.
In China, strict regulations limit children under 18 to just one hour of online gaming, and only on specified days. China’s experience highlights significant challenges in scaling and enforcing such rules, from ensuring compliance to safeguarding privacy.
‘Spiritual opium’: video games in China
China is home to a large video game industry. Its tech giants, like Tencent, are increasingly shaping the global gaming landscape. However, young people’s consumption of video games is a much thornier issue within the country.
The country has a deep cultural and social history of associating video games with addiction and harm, often referring to them as “spiritual opium”. This narrative frames gaming as a potential threat to the physical, mental and social wellbeing of young people.
For many Chinese parents, this perception shapes how they view their children’s play. They often see video games as a disruptive force that undermines academic success and social development.
Parental anxiety like this has paved the way for China to implement strict regulations on children’s online gaming. This approach has received widespread parental support.
In 2019, China introduced a law limiting gaming for under-18s to 90 minutes per day on weekdays and three hours per day on weekends. A “curfew” also prohibited gameplay between 10pm and 8am.
A 2021 amendment restricted playtime further, to a single hour between 8pm and 9pm on Fridays, Saturdays, Sundays and public holidays.
In 2023, China expanded this regulatory framework beyond online gaming to cover livestreaming platforms, video-sharing sites and social media, requiring these platforms to establish “systems for preventing addiction”.
How is it enforced?
Leading game companies in China have implemented various compliance mechanisms to ensure adherence to these regulations. Some games have incorporated age-verification systems, requiring players to provide their real name and ID to confirm their age.
Some have even introduced facial recognition to ensure minors’ compliance, an approach that has sparked privacy concerns.
In parallel, mobile device manufacturers, app stores and app developers have introduced “minor modes”. These features limit access to mobile games and apps once a designated time limit has been reached (with an exception for apps pre-approved by parents).
A November 2022 report by the China Game Industry Research Institute – a state-affiliated organisation – declared success. Over 75% of minors reportedly spent fewer than three hours a week gaming, and officials claimed to have curbed “internet addiction”.
Yet these policies still face significant enforcement challenges, and highlight a wider set of ethical issues.
Does it work?
Despite China’s strict rules, many young players find ways around them. A recent study revealed more than 77% of the minors surveyed evaded real-name verification by registering accounts under the names of older relatives or friends.
Additionally, a growing black market for game accounts has emerged on Chinese e-commerce platforms, where minors can rent or buy accounts to sidestep the restrictions.
Reports of minors successfully outsmarting facial recognition mechanisms – such as by using photos of older individuals – underscore the limits of tech-based enforcement.
The regulation has also introduced unintended risks for minors, including falling victim to scams involving game account sellers. In one reported case, nearly 3,000 minors were collectively scammed out of more than 86,000 yuan (approximately A$18,500) while attempting to bypass the restrictions.
What can Australia learn from China?
The Chinese context shows that a failure to engage meaningfully with young people’s motivations to consume media can end up driving them to circumvent restrictions.
A similar dynamic could easily emerge in Australia, undermining the impact of the government’s social media ban.
In the lead-up to the law being introduced, we and many colleagues argued that outright bans enforced through technological measures of questionable efficacy risk being both invasive and ineffective. They may also increase online risks for young people.
Instead, Australian researchers and policymakers should work with platforms to build safer online environments. This can be done by using tools such as age-appropriate content filters, parental controls and screen time management features, alongside broader safety-by-design approaches.
These measures empower families while enabling young people to maintain digital social connections and engage in play, activities increasingly recognised as vital to children’s development.
Crucially, a more nuanced approach fosters healthier online habits without compromising young people’s privacy or freedom.
Ben Egliston is a recipient of funding from the Australian Research Council (DE240101275). He has previously received funding from Meta and TikTok.
Marcus Carter is a recipient of an Australian Research Council Future Fellowship (#220100076) on 'The Monetisation of Children in the Digital Games Industry'. He has previously received funding from Meta, TikTok and Snapchat, and has consulted for Telstra. He is a current board member, and former president, of the Digital Games Research Association of Australia.
Tianyi Zhangshao does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
This article was originally published on The Conversation. Read the original article.