
Russia recently banned Facebook and Instagram, while sparing Meta's third offering, WhatsApp. Facebook and Instagram face a double ban: the most recent, by a court that found the platforms guilty of extremist activity, came on top of a pre-existing ban by executive decision. The stated ground was that Facebook and Instagram had allowed users in Ukraine to post messages urging violence against President Putin of Russia and against Russian troops invading Ukraine. Facebook has since clarified that it does not sanction calls for violence against a head of state and does not condone violence against Russians in general.
Earlier, Facebook gained notoriety when people used the platform to organize violent attacks on the Rohingya in Myanmar. It is plausible that Facebook had not deployed local-language resources on a large enough scale to spot and remove messages inciting proximate violence. Such laxity stems from the tradition of exempting social media platforms from liability for the so-called user-generated content they carry. In the United States, Section 230 of the Communications Decency Act provided a safe harbour for online services, including intermediaries, relieving them of liability for third-party content on their platforms.
Things have evolved from those early, innocent days. Today, social media platforms are not passive conduits for their users' messages; they actively moderate content. Facebook has a global council in place to serve as an ombudsman, vetting its decisions on what qualifies to be carried on the platform and what needs to be removed. Further, most social media platforms also deploy algorithms to sort user-generated content and feed particular groups of users the kinds of content that past patterns suggest they would prefer. Such editorial intervention is what creates closed echo chambers that do not admit any fact or opinion challenging their defining prejudices.
This editorial intervention in what is displayed to particular users removes a major ground for not holding social media platforms accountable for what they carry.
Another consideration is cultural sensibility. Most social media platforms are American in their development and ownership, and their content moderation sensibilities also tend to be American. What is dismissed as an instance of bad taste in the US might be a source of much offence, and of violent reaction, in another jurisdiction. In 2010, a video featuring a maverick US pastor's call to burn the Quran was posted on YouTube. The pastor was not deterred by the possibility that this could inflame passions in the Middle East and endanger the lives of US troops deployed there. YouTube, for its part, was guided by its own cultural sensibilities and by the legal liabilities attendant on burning the Quran in its home jurisdiction, not by the sensibilities of jurisdictions where the act would give grave offence.
On both considerations, that of relieving social media companies of the responsibility to determine what is or is not appropriate for users to see on their platforms, and that of giving salience to local cultural and legal norms, it is far better to apply local regulation to social media, on a par with how such regulation applies to mainstream media.
Mainstream media would not, or should not, carry messages inciting violence. If any media outlet did so, it would fall foul of the editorial norms applicable to it, and the editor and the publisher would be held accountable. That ought to be the case with social media as well. This would neither curtail freedom of speech (freedom of speech was not invented by social media, after all, and existed long before its arrival) nor make it difficult for social media to function.
On the contrary, having norms to conform to, laid down by the government of the jurisdiction in which they operate, would liberate social media companies from the obligation to identify locally relevant norms on their own. The only casualty of such outsourcing of content moderation standards to the laws and rules of each jurisdiction would be the global conscience-keepers assembled by social media platforms in their oversight bodies. It would be okay to shed a tear or two over such exalted unemployment, but the social media platforms, and societies themselves, would be the gainers.