Today marks two years since a deluge of online racist abuse was directed at Bukayo Saka, Marcus Rashford and Jadon Sancho after their missed penalties in the Euro 2020 Final.
The abuse was a watershed moment in the collective effort to fight racism in English football, which, in the 730 days since, has progressed in some ways and regressed in others.
“It was depressing enough to be losing a final, but we knew for a fact what was coming”, recalls Sanjay Bhandari, chair of the anti-discrimination charity Kick It Out. “On my way back to my car to drive home, I had a look on my phone and the abuse was already starting on social media.”
A UK Football Policing Unit investigation conducted within weeks of England’s defeat by Italy at Wembley amassed 600 reports. But 123 of the 207 posts deemed illegal came from accounts based outside the UK, and only 11 arrests were made in this country. Justin Lee Price, 19, was jailed for six weeks for a racist tweet he sent to Rashford.
Although Kick It Out face an uphill battle lobbying Parliament and the social media companies, there has been progress.
Three months after the abuse of the England trio, Bhandari, Rio Ferdinand and the FA’s former diversity advisor Edleen John gave evidence to a select committee on what they wanted to be included in the Online Safety Bill.
The bill, first proposed two months before the Euro 2020 Final, will empower the national communications regulator Ofcom to ensure all platforms follow guidelines that improve internet safety in the UK.
“Most of the things we’ve asked for are in the Online Safety Bill. Football has been pretty successful”, says Bhandari.
The so-called ‘Triple Shield’ part of the bill will force platforms to remove illegal content, remove content banned under their own terms and conditions, and provide filters that allow adults to avoid seeing harmful content.
While those filters would be on by default for children online, Kick It Out is lobbying for them to be automatically activated for adult users too.
“We deserve the right not to see harmful content”, Bhandari says.
“We’ve been absolutely all over this for years. Our last fight is to get the filters on by default.”
The Online Safety Bill is passing through the House of Lords, but it could be September before it becomes law. Ofcom must then implement the new law, which is expected to take another 12 months.
The FA have taken their own steps to tackle abuse. FA chief executive Mark Bullingham tells Standard Sport: “We’ve introduced point deductions for grassroots teams from the start of the 2023-24 season for repeat offenders of serious misconduct — a landmark step in our efforts to tackle discrimination.”
But in other areas, progress has been harder to come by.
“Social media companies often say they are a portal rather than a publisher and therefore not responsible for content. That isn’t good enough”, says Bullingham. “They just need to step up and take responsibility to prevent discrimination. It’s a shame it will take legislation for them to take the action required.”
Bhandari is similarly frustrated.
“70 per cent of the abuse comes from overseas”, he says. “For football banning orders to have an effect, they have to apply to people in this country who regularly attend football matches. Of the 30 per cent from the UK, it’s really difficult to say what percentage attend football matches. We don’t have very good data on who the perpetrators are.
“I have been asking for three and a half years for that sort of data from the social media companies. They have the capability. It’s never happened.”
While Instagram has improved in assisting prosecutions and Twitter now warns users before they send potentially abusive content, the social media companies remain largely averse to restrictive measures.
“Over the last year when they’ve hit financial difficulties and disinvested in content moderation, our interaction with them has been less productive”, Bhandari says. “So many people we used to work with are no longer at Twitter or Instagram.
“Facebook is not as much of a vector of hate. The worst places are Twitter and ‘DMs’ on Instagram. Twitter is the worst one. Whether because of their teams shrinking or Elon Musk being more libertarian, there was a rise in hate after he took over.
“And racism doesn’t just exist online, it exists in the real world. In the two years after the Euros, we’ve seen a large increase in discrimination in grassroots football with language we thought had gone away. People are seeing this behaviour online, that there are no consequences, and doing it to someone’s face.”
While the number of abuse reports has risen in the last few years, it is unclear whether that reflects a growing willingness to report or racist abuse itself becoming more commonplace.
“Online abuse hasn’t really abated, particularly after key events,” says Bhandari. “We saw it with the Coventry player [Fankaty Dabo] who missed his penalty in the Championship Play-Off final. Suddenly there was a torrent of abuse. It’s going to be the Online Safety Bill that is key to us solving that challenge.
“But I’ve felt throughout everything that we’ve had the whole of football behind us. Club chairmen phone me up to seek advice.
“We do fan education courses rather than life bans. We’ve done 140 of those over the last few years, and only one person that we know of has reoffended.”
Despite the obstacles, Bhandari believes meaningful progress has been made since the Euro 2020 Final. But he is resolute in pushing for more.
“It was a turning point. It was a terrible, awful thing, but it probably reinvigorated Parliamentarians to actually do something about this. It was a moment that changed the national political debate about how we tackle racism.
“I take the view of Gramsci, who said you should be pessimistic of mind but optimistic of heart. Things won’t change unless we change them.”