The Guardian - AU
National
Tory Shepherd

Online roulette: the popular chat sites that are drawing in children and horrifying parents

A girl adds a face mask to her character in a Roblox game
Parents trying to protect their children are faced with a growing number of online worlds including Omegle and Roblox (pictured) with varying parental controls – and loopholes. Photograph: Phil Noble/Reuters

“In less than 30 seconds I saw a man clearly masturbating on camera,” Kirra Pendergast says, describing a visit to the website Omegle.

Omegle is a type of video and chat roulette, with the tag line: “Talk to strangers!” There’s no need to register, or log in, and you don’t need an app. Just go to the website, tick a box saying you’re 18 or above and you’re away – even if you’re not 18.

Parents tell Guardian Australia that “playing” on Omegle is something kids do at parties, at sleepovers. It just takes one of the group to have a screen with internet access and before long they are chatting to strangers all over the world.

Pendergast says it’s like a prank phone call – an illicit thrill, but this one’s dangerous.

When Guardian Australia tried it, it was a whirlwind of man after man in darkened rooms, waiting. You clock the stranger, then chat or click and move on. Click, another man, click, another man. Click, three young girls sitting on a bed in their pyjamas.

Pendergast, the founder and chief executive officer of cybersafety program provider Safeonsocial.com, was horrified when she tried it.

“I felt like I had to bleach my eyeballs,” she says.

“It’s been around since 2009 – I’ve always referred to it as the cockroach of the internet because it refuses to die.”

The Australian Centre to Counter Child Exploitation (ACCCE) has recorded a dramatic increase in reports of online child sexual exploitation, from 17,400 in 2018 to 33,114 in 2021. The eSafety commissioner has warned that reports about technology being “weaponised to abuse children” surged since the start of the pandemic.

The Australian Federal Police said in a statement that anonymous chat functions provided a platform for offenders who often create fake accounts to target children and young people.

eSafety’s Mind the Gap research found about two in three teenagers have been exposed to violent sexual images and self-harm content online, while almost half of children have been treated in a nasty or hurtful way online.

“In the worst cases, children may come into contact with predators, whom we know exploit platforms popular with them,” the acting eSafety commissioner, Toby Dagg, says.

While online experiences can be “fun and enriching”, there can be downsides, Dagg says.

“Many of these interactions are likely to be positive and new friends made online can be a welcome addition to young people’s social lives,” he says.

“However, online interactions also carry risks, including bullying, sexual victimisation and the misuse of personal images shared online, otherwise known as image-based abuse.”

There has been a recent focus on the internet giants and what they are doing to combat abuse, but Omegle flies a little more under the radar. As does Roblox, a digital platform where people can make and share games and interact with other users’ avatars.

‘It’s the first stages of online grooming’

A report from Online Guardians – using data from about 3,000 New South Wales students under the age of 13 – found Roblox had four times more bullying than any other platform, with 327 students bullied online on Roblox. Next up was 85 on Fortnite, followed by 67 on Discord and 50 on TikTok.

The eSafety commissioner has demanded that Twitter, TikTok, Google, Apple, Meta, Microsoft, Snap and Omegle say how they are tackling child sexual abuse on their platforms.

Omegle says it prohibits anyone under the age of 18 – but users just have to tick a box saying they’re an adult. It moderates parts of the site using artificial intelligence, backed up with “human review”, and monitors text chats for “certain patterns”.

Roblox hasn’t had a notice from eSafety, but the commission’s website directs people towards its parental and privacy controls. Roblox itself says it has a program to detect “inappropriate attire”, a process to report inappropriate material and remove inappropriate content, customisable safety features, and strict chat filters. Parents can restrict or block the chat feature entirely, and users can mute or block other members if they feel uncomfortable.

“We are committed to ensuring people of all ages have a positive and safe experience on Roblox. We have a zero-tolerance policy for inappropriate content and behaviour, including endangering children in any way, and strong systems and protocols designed to swiftly catch anyone attempting to circumvent our strict rules,” a spokesperson said in a statement.

Pendergast says neither platform is doing a good enough job.

When she’s running programs in schools – for grade 3 and up – she asks children if they’ve ever been asked to be someone’s boyfriend or girlfriend on Roblox. “Most of the room puts their hands up,” she says. “Then I ask who’s been asked to follow people on another platform … Most of the room will put their hands up. I ask who’s been asked to play the doctor or the nurse. Again, hands up.

“I ask who’s been offered free Robux (the internal currency) for their avatar to lie down next to another avatar and 10 to 15% put their hands up.

“They’re all giggling, not knowing it’s the first stages of online grooming.”

‘Understand the world they live in’

Part of the difficulty for parents is the sheer number of online worlds, all of which have varying levels of parental controls, and varying types of loopholes that predators can exploit. The eSafety commission site lists dozens, but Pendergast says the government just can’t keep up.

The AFP said there were certain “red flags” to look out for, including unsolicited friend requests from strangers, strangers asking personal questions or sexual questions, or asking to be friends on a different app.

Parents should also look out for young people distancing themselves from friends and family, developing fixations on “conspiracy theories or contentious social issues”, displaying extreme reactions to certain news or politics, or spending more time on “fringe forums”.

Pendergast says the kids seeing and hearing exploitative material online don’t know how to process it, but they also don’t want to tell the grownups – they worry they’ll get in trouble, or will be banned from the internet or the “game”.

She wants a shift in language: from “playing” to “visiting”. “Playing” diminishes the seriousness of the engagement, while “visiting” more fully captures the complexity of virtual reality, of the metaverse. She says when she talks to children about “going to a Roblox world”, they realise it’s real, not fictitious.

Communication is also key, she says.

“Parents have to stop banning kids from these games, and get down and understand them,” she says.

“Understand the world they live in.”

Guardian Australia has contacted Omegle for more information on what is being done to protect children.

To find out more about online child exploitation or to make a report, visit ACCCE.

  • In Australia, crisis support can be reached 24 hours a day at Lifeline 13 11 14; Suicide Call Back Service 1300 659 467; Kids Helpline 1800 55 1800; MensLine Australia 1300 78 99 78 and Beyond Blue 1300 22 4636
