
"The hatred of women is everywhere and dictates the way men behave online," one young Gen Z person said in a recent poll when asked about misogyny on social media, while a young man said, "it isn’t that deep. It’s all a laugh".
According to the Amnesty International survey released this week, 73 per cent of Gen Z social media users (those aged 13-28) in the UK have witnessed misogynistic content online, with half encountering it every week.
The poll comes as social media has been placed under an even brighter spotlight following the release of the Netflix drama Adolescence, which follows the fictional story of the social media-fuelled murder of a schoolgirl by a 13-year-old boy.
The show’s co-creator Jack Thorne has called on the UK government to ban smartphones for those under 16, following a similar move made by Australia to ban social media for the same age range.
However, experts warn that a ban does not get to the root of the problem.
There is no current data to suggest that banning social media for under 16s works.
The most recent study published in Lancet Regional Health Europe - and the first of its kind - looked at the impact of smartphone bans in schools across England and found that they made no difference to mental well-being, sleep, or educational outcomes.
The study was limited in that it only evaluated bans during school hours and not an outright prohibition, as some countries have suggested.
However, restricting smartphone use until the age of 16 may not work either, as by then it could be too late to educate children about the harms of social media.
"Currently, it does look like the age of 16 is the age at which a child is allowed full access to social media. So you've got to think, what happens when a child is 15 years and 364 days old," said Drew Benvie, the founder of the social media campaign group Raise, which works with children to educate them about the harms of social media.
Another issue, he told Euronews Next, is that "children always find a way" to get around bans and parental restrictions.
"I think banning social media for an age group has inherent risks. It's going to be incredibly difficult to enforce, and even if it’s possible, users will find a way to bypass it," he said, pointing out that children are using VPNs to get around the TikTok ban in the United States.
He argued that bans would likely see messaging apps replacing social media and that there is a potential risk for them to be more harmful because the content "goes dark and can't be seen by others," such as parents.
The most prolific misogynistic influencer is Andrew Tate, who blends get-rich-quick tips with misogyny, saying that women belong at home and that rape victims must "bear responsibility for their attacks".
In the last month, Gen Z men in the UK reported seeing content from Elon Musk (57 per cent), US President Donald Trump (55 per cent), and Andrew Tate (41 per cent), according to the Amnesty International report.
"The influence of toxic masculinity on young boys and how it's affecting their behaviour towards young girls. I think that does scare me the most about all of this," said Benvie, who added that education is just part of the solution.

What are social media companies doing to protect children?
For their part, social media companies have taken some steps to make their platforms safer for children.
TikTok introduced a new mindfulness tool that automatically turns on for teenagers under 16 after 10 pm and interrupts the "For You" feed with a full-screen takeover, playing calming music with a blue-toned screen.
It also has new controls that allow parents to block teenagers from TikTok during specific times.
Other platforms allow parents to block their children’s social media use in the evening, and Meta has also introduced teen accounts for Instagram that give parents greater control.
The mindfulness tool by TikTok "can help a little bit, but really only a little bit. If teens use it maybe they will turn off for a bit and get a better night’s sleep, or they will use it when they see something," said Sonia Livingstone, professor of social psychology, department of media and communications at the London School of Economics.
"The very fact of providing it will communicate something about the value of being more mindful and being more, as psychologists would say, kind of self-regulating, taking a bit more control of their own engagement instead of just feeling taken over by the app," she told Euronews Next.
As for parental controls, she said her research has found that they work best when agreed through family consensus, not simply imposed by the parent.
"Anything imposed high-handed by the parent tends to get rejected or resented by the child, or they find a workaround," Livingstone said, adding that replacing the word control would also be a good move so that children feel less dictated to.
Although these measures give parents greater control, social media platforms can amplify misogynistic content. The longer the user stays on a video and the more popular it becomes, the more similar types of videos are promoted by the algorithm.
A study last year found a four-fold increase in the level of misogynistic content suggested by TikTok when researchers monitored the platform over five days.
The algorithm showed more extreme videos, which, for the most part, focused on anger and blame directed at women.
The study by University College London and the University of Kent in the UK focused solely on TikTok, but the researchers said the findings were likely similar to what other social media platforms offer.
The researchers also said that a total restriction of phones or social media was "likely to be ineffective," and instead called for a "healthy digital diet" and for the tech companies to look into their algorithms.
Taking down harmful content
The other way social media companies can act is to be quicker at removing content when it is reported as offensive, said Livingstone.
"Children say to me a lot that they want a much better way of reporting and taking down the content," said Livingstone.
"I've had children say we report things such as beheadings and it took TikTok two hours to take it down," she said, adding that two hours is fast for a social media platform, but in that period, the content has already been shared on other platforms and screenshotted many times.
When it comes to misogyny, "it needs to be caught much faster than it currently is," said Benvie.
Illegal content on social media is classified, for instance, as child sexual exploitation or graphic content, but misogyny can easily go undetected for a long period.
"It could be dressed up as attractive to impressionable minds, and it needs to be detected faster. It needs to be removed faster," said Benvie.
"And those pushing it need to face sanctions as well as the social networks. And that's only part of the solution," he added.
While controversial influencer Tate has been banned from Meta, TikTok, and YouTube, his content continued to be posted on these platforms by others for quite some time.
Another point Benvie makes is that misogynistic posts from profiles that are not as well known as Tate's may go undetected for a while, as children will not bring up these posts or profiles with their parents or teachers.
"That's when it will make an impact, and it will affect a child in quite a harmful way. I think if it's [misogyny] better understood, it can be more quickly dealt with as an issue," he said.
Benvie argues this is where education needs to come in: talking to children about the issues of misogyny, teaching digital literacy, and addressing disinformation and fake news.
He said that parents can also help with education and that the best way to do so is for parents to use the apps themselves to understand them and talk to their children about worrisome content they may have seen, as children will not bring up the content themselves.
However, Benvie said that education alone cannot fill the gap and that action needs to be taken by regulators and social media companies.
Regulation, education, and enforcement of social media safety needs to happen faster, and the social networks know they need to change or else they face a ban, he said.
"It's a shame that facing a ban is bringing about such quick action. But at least we're going in the right direction, because I don't think outright bans are practicable and therefore they won't work," he said.
"They will help, but they're not the solution because at some point a child comes online and we need to equip them with the skills for life online, not prevent them from accessing the reality of life".