Meta has been accused in a lawsuit of letting posts that inflamed the war in Tigray flourish on Facebook, after an Observer investigation in February revealed repeated inaction on posts that incited violence.
The lawsuit, filed in the high court of Kenya, where Meta’s sub-Saharan African operations are based, alleges that Facebook’s recommendation systems amplified hateful and violent posts in the context of the war in northern Ethiopia, which raged for two years until a ceasefire was agreed in early November. The lawsuit seeks the creation of a $1.6bn (£1.3bn) fund for victims of hate speech.
One of the petitioners said his father, an Ethiopian academic, was targeted with racist messages before his murder in November 2021, and that Facebook did not remove the posts despite complaints.
“If Facebook had just stopped the spread of hate and moderated posts properly, my father would still be alive,” said Abrham Meareg, who is ethnic Tigrayan and an academic like his father.
“I’m taking Facebook to court so no one ever suffers as my family has again. I’m seeking justice for millions of my fellow Africans hurt by Facebook’s profiteering – and an apology for my father’s murder.”
The case asks for a compensation fund of 200bn Kenyan shillings (£1.3bn) to be established for victims of hate and violence on Facebook.
In February an analysis by the Bureau of Investigative Journalism (TBIJ) and the Observer found that Facebook was letting users post content inciting violence through hate and misinformation, despite being aware that such content was helping to directly fuel tensions in Tigray, where thousands have died and millions have been displaced since war broke out in late 2020.
The research found that one post from a local influencer, calling for people to “cleanse” the area of supporters of Tigrayan forces, stayed up for four months after it was reported to the company. The family of Gebremichael Teweldemedhin, a Tigrayan jeweller abducted last December, believe that post and others like it led to many attacks on Tigrayans in Gondar, a city in the Amhara region.
Amnesty International is one of seven organisations supporting the lawsuit. “The spread of dangerous content on Facebook lies at the heart of Meta’s pursuit of profit, as its systems are designed to keep people engaged,” Amnesty’s deputy regional director, Flavia Mwangovya, said. “This legal action is a significant step in holding Meta to account for its harmful business model.”
One of Amnesty’s own staffers, Fisseha Tekle, is a petitioner in the case. “In Ethiopia, the people rely on social media for news and information,” he said. “Because of the hate and disinformation on Facebook, human rights defenders have also become targets of threats and vitriol. I saw first hand how the dynamics on Facebook harmed my own human rights work and hope this case will redress the imbalance.”
Facebook spokesperson Ben Walters told the Associated Press that the company could not comment on the lawsuit because it had not yet received it. He shared a general statement saying: “We have strict rules which outline what is and isn’t allowed on Facebook and Instagram. Hate speech and incitement to violence are against these rules and we invest heavily in teams and technology to help us find and remove this content.” Facebook continues to develop its capabilities to catch violating content in Ethiopia’s most widely spoken languages, the statement added.
Facebook has repeatedly come under fire for expanding into countries with low media literacy, rapidly growing to carry a large share of local internet traffic, and failing to devote sufficient resources to moderation in local languages. The same pattern played out in Myanmar, where the site faces compensation claims worth more than £150bn after legal action was launched in the UK and US last December.
Facebook admitted in 2018 that it had not done enough to prevent the incitement of violence and hate speech against the Rohingya, the Muslim minority in Myanmar. An independent report commissioned by the company found that “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence”. The company was also criticised by the UN for its “leading role” in the possible genocide.