A group of Rohingya refugees has brought a class-action lawsuit against Facebook parent company Meta for allegedly failing to remove hate speech that incited violence. The Muslim ethnic group has faced extreme violence in the last few years in Myanmar, to the extent that it has been labeled as genocide by multiple human rights advocacy groups.
The lawsuit, which was filed this week in California, claims that Facebook’s arrival in Myanmar allowed misinformation and hate speech to spread swiftly through the country. This, the lawsuit concludes, led directly to an uptick in violence that “amounted to a substantial cause, and eventual perpetuation of, the Rohingya genocide.”
A similar lawsuit is poised to drop in the U.K. in the near future. The combined value of the two lawsuits is more than $150 billion.
Facebook is, at this point, intimately familiar with the allegation that its platforms allow hate speech to thrive. That’s never seemed to faze the company much before. Can a $150 billion lawsuit about genocide change the company’s ways?
Facebook knows it, too —
Myanmar has been a particularly troublesome market for Facebook in recent years. Despite the company’s policies forbidding “violent or dehumanizing” speech — policies that explicitly call out attacks on ethnic groups — exactly that kind of content has been allowed to thrive on Facebook.
A 2018 Reuters investigation found more than 1,000 examples of posts, comments, and pornographic images attacking the Rohingya and other ethnic groups on Facebook. Mark Zuckerberg had announced earlier that year that the company would hire more experts to review hate speech in the region.
Those efforts largely failed to stem the well-documented problem. The U.N.’s human rights investigators found that Facebook had, indeed, played a significant role in spreading anti-Rohingya sentiment in Myanmar. And Facebook itself even admitted it had failed to prevent its platform from being used to incite violence against ethnic groups in the country.
This isn’t getting better —
Despite acknowledging this problem — which, just as a reminder, has literally fueled genocide — Facebook has largely failed since to stop similar issues from festering on its platforms. The effects of this failure have certainly been felt in the United States, but it’s overseas where Facebook’s moderation shortcomings have been truly catastrophic.
The platform’s ability to moderate content in countries where English is not the primary language has improved little since 2018. This is one of the main issues raised by now-famous whistleblower Frances Haugen: Facebook dedicates such a small share of its moderation resources to non-English-speaking countries that they are left floundering more often than not.
Meta’s response to these allegations has ranged from outright denial to tentative policy change. In this case, Meta appears confident it will be shielded by Section 230, the U.S. law that protects platforms from liability for user-generated content — but the Rohingya refugees are hoping the case will be tried under Burmese law instead.