The hotly debated Section 230 is once again before the Supreme Court, which recently heard oral arguments in Gonzalez v. Google LLC. The plaintiff, Reynaldo Gonzalez, whose daughter was killed in a 2015 Paris terrorist attack, is seeking to hold Google’s YouTube liable for its algorithms allegedly recommending ISIS recruitment videos. This lawsuit could damage the internet, potentially opening platforms to liability for any content their algorithms recommend to users.
If the court rules against Google, future online innovation would be significantly hampered, as weakened protections under Section 230 would make entrepreneurs less likely to create services that host third-party content.
The question the Supreme Court is exploring is: “Does section 230(c)(1) immunize interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?” Put simply, do interactive computer services forfeit their liability protections when they use algorithms that recommend content?
If the court rules that companies do forfeit their protections when they employ such algorithms, many interactive computer services will become liable for the third-party material on their platforms. For example, as Google points out in its brief to the court, even companies like TripAdvisor might be sued for conspicuously displaying bad reviews. A company like Amazon might be sued for promoting eating disorders when recommending diet-focused books. Lawsuits relating to content could become common.
As written, Section 230 shields interactive computer services — social media companies, discussion forums, etc. — from liability for third-party content posted on their sites while also allowing providers to moderate content at their discretion. This law is largely responsible for the growth of sites such as YouTube, Google, Facebook, Twitter and more. Without these protections, entrepreneurs would have been less willing to take on the legal risk of hosting diverse third-party content.
The diverse internet landscape we enjoy depends on Section 230’s protections. Without it, individuals would have fewer opportunities to express themselves and engage with others online.
Corbin Barthold, TechFreedom’s internet policy counsel, points out that “if ISIS had not uploaded videos to YouTube, YouTube would have had no terrorist content to serve up. This is a tell that the plaintiffs’ suit is really about the user-generated content and is a loser under Section 230.” The issue remains the content itself: since platforms are not liable for third-party content, they should not be held accountable for their algorithms’ ordering of that content either.
If the court rules for Gonzalez, online platforms will be forced to engage in far more content moderation than they do today to avoid lawsuits, crushing online discourse. The result would be far less third-party content and, thus, fewer opportunities for people to engage in important discussions, create meaningful content and participate in social communities online. And without algorithms ordering content, interactive service providers’ platforms would be hopelessly disorganized and hard to navigate. If the court damages the model on which modern platforms are built, it could drastically change the internet landscape for the worse.
Innovators would be less likely to create platforms that host third-party content, and current players would be inclined to restrict third-party activity. If the Supreme Court rules against Google, the internet could take a step back from the remarkable innovation it has shown since its inception.