Eugene Volokh

When May Law Require Social Media Platforms to Disclose Basis for Moderation Decisions?

From the majority in Moody v. NetChoice, LLC:

The laws, from Florida and Texas, restrict the ability of social-media platforms to control whether and how third-party posts are presented to other users … [including by] requir[ing] a platform to provide an individualized explanation to a user if it removes or alters her posts….

Analyzing whether these requirements are sound, the majority held, "means asking," as to each kind of content-moderation decision, "whether the required disclosures unduly burden" the platforms' own expression:

[R]equirements of that kind violate the First Amendment if they unduly burden expressive activity. See Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio (1985). So our explanation of why Facebook and YouTube are engaged in expression when they make content-moderation choices in their main feeds should inform the courts' further consideration of that issue.

For more on that "main feeds" question, and on the Court's not deciding the First Amendment questions raised by any of the platforms' other functions, see this post. As to the Zauderer "unduly burden expressive activity" standard, especially as applied outside the original Zauderer context of commercial advertising, see NIFLA v. Becerra (2018).

All this suggests that the individualized-explanation requirements are more likely to be invalid as to decisions about what to include in the "main feeds," and more likely to be valid as to decisions about whether to delete a post outright or ban a user outright. But even that is not entirely clear. For a thoughtful, detailed treatment of the laws' practical effects (which is what the majority seems to be calling for), see Daphne Keller's Platform Transparency and the First Amendment article.

Justice Thomas, writing alone, argued in favor of greater protection against speech compulsions generally:

I think we should reconsider Zauderer and its progeny. "I am skeptical of the premise on which Zauderer rests—that, in the commercial speech context, the First Amendment interests implicated by disclosure requirements are substantially weaker than those at stake when speech is actually suppressed."

But he also joined Justice Alito's concurrence in the judgment (which was also joined by Justice Gorsuch), which took a less platform-friendly approach. An excerpt:

NetChoice argues in passing that it cannot tell us how its members moderate content because doing so would embolden "malicious actors" and divulge "proprietary and closely held" information. But these harms are far from inevitable. Various platforms already make similar disclosures—both voluntarily and to comply with the European Union's Digital Services Act—yet the sky has not fallen. And on remand, NetChoice will have the opportunity to contest whether particular disclosures are necessary and whether any relevant materials should be filed under seal. Various NetChoice members already disclose in broad strokes how they use algorithms to curate content….

Just as NetChoice failed to make the showing necessary to demonstrate that the States' content-moderation provisions are facially unconstitutional, NetChoice's facial attacks on the individual-disclosure provisions also fell short. Those provisions require platforms to explain to affected users the basis of each content-censorship decision. Because these regulations provide for the disclosure of "purely factual and uncontroversial information," they must be reviewed under Zauderer's framework, which requires only that such laws be "reasonably related to the State's interest in preventing deception of consumers" and not "unduly burde[n]" speech.

For Zauderer purposes, a law is "unduly burdensome" if it threatens to "chil[l] protected commercial speech." Here, NetChoice claims that these disclosures have that effect and lead platforms to "conclude that the safe course is to … not exercis[e] editorial discretion at all" rather than explain why they remove "millions of posts per day." …

In the lower courts, NetChoice did not even try to show how these disclosure provisions chill each platform's speech. Instead, NetChoice merely identified one subset of one platform's content that would be affected by these laws: billions of nonconforming comments that YouTube removes each year. But if YouTube uses automated processes to flag and remove these comments, it is not clear why having to disclose the bases of those processes would chill YouTube's speech. And even if having to explain each removal decision would unduly burden YouTube's First Amendment rights, the same does not necessarily follow with regard to all of NetChoice's members.

NetChoice's failure to make this broader showing is especially problematic since NetChoice does not dispute the States' assertion that many platforms already provide a notice-and-appeal process for their removal decisions. In fact, some have even advocated for such disclosure requirements. Before its change in ownership, the previous Chief Executive Officer of the platform now known as X went as far as to say that "all companies" should be required to explain censorship decisions and "provide a straightforward process to appeal decisions made by humans or algorithms." Moreover, as mentioned, many platforms are already providing similar disclosures pursuant to the European Union's Digital Services Act. Yet complying with that law does not appear to have unduly burdened each platform's speech in those countries. On remand, the courts might consider whether compliance with EU law chilled the platforms' speech….

