February 24, 2023 - 4:53pm

This week, the Supreme Court of the United States heard arguments in Gonzalez v. Google LLC, a case that could have far wider repercussions for speech on the internet than the ongoing release of the “Twitter Files”.

The facts of the case are tragic. In November 2015, Nohemi Gonzalez, an American exchange student, was eating at a bistro in Paris when she was shot and killed during a terrorist attack — allegedly carried out at the behest of ISIS — that also claimed the lives of 19 other people. Shortly afterwards, the Israeli legal nonprofit Shurat HaDin (“letter of the law” in Hebrew) approached Nohemi’s parents and asked if they wanted to sue Google for violating the US Anti-Terrorism Act because its YouTube site not only hosts ISIS content but, via its recommendation algorithm, automatically suggests ISIS and ISIS-related videos to users who have evinced an interest in that content.

Lately Shurat HaDin, which has won billions of dollars in judgments for victims of terrorist acts, has been pursuing even richer prizes, unsuccessfully attempting to sue tech companies for hosting terrorist content. Now, despite earlier losses in the lower courts, the case has finally reached the Supreme Court, where it stands a chance of prevailing in a potentially paradigm-shifting decision.

The question of law in the Gonzalez case is whether Section 230(c)(1) of the Communications Decency Act immunises interactive computer service providers when they make targeted recommendations of information provided by another content provider (in this case, ISIS videos hosted on YouTube). Google contends that it does, since that section was a specific grant of immunity for internet service providers in an act otherwise intended to regulate minors’ access to pornography. This immunity has freed actual providers of internet access, as well as content-hosting and social media platforms like Facebook and YouTube, from the threat of frivolous litigation, allowing them to grow into major industries over the past three decades.

Eric Schnapper, the lawyer representing the Gonzalez family, argued that this limitation of liability should only apply when a service provider is engaging in traditional editorial functions, such as explicitly deciding whether to display or withdraw content, in a manner consistent with its capabilities at the time of the act’s passage. 

This case, like all others involving emergent technologies, poses a major challenge for the Supreme Court. Although the court no longer has any octogenarian members, only two of the nine sitting justices, recent appointees Amy Coney Barrett and Ketanji Brown Jackson, were born in 1970 or later. Speaking during oral argument about the Supreme Court’s possible involvement in rolling back the immunity afforded by Section 230, Justice Elena Kagan, who is 62, remarked that “we really don’t know about these things”, joking that “these are not like the nine greatest experts on the internet” and echoing comments she made nearly a decade ago about the justices’ inability to use e-mail.

Fortunately, the justices seemed sceptical of Schnapper’s attempt to articulate a repeatable, bright-line test for when companies like Google could be held liable for sorting and recommending third-party content. This scepticism is warranted, because a muddled or botched decision in favour of Gonzalez could greatly chill the production of useful content. YouTube’s ability to surface content via its algorithm can indeed seem insidious, but the alternative is far worse: an ever-growing litigation bill that may eventually force these providers to stop hosting even mildly controversial material altogether.

Congress has considered revising Section 230 in the past, but has yet to take any tangible steps towards doing so. If legislators eventually withdraw or modify the immunity for providers, one model for a content-hosting platform that complies with the changed state of affairs would involve hosting all content without censorship whilst complying with every government request to remove illegal content and identify the offender. All government communications that lead to content removal would be posted in full in a searchable public archive. Such a platform would be extremely cheap to run, essentially hands-off once a single script is deployed, and it would pass all legal liability on to content creators themselves.
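To make the proposal concrete, here is a minimal sketch, in Python, of what that single script might look like: a hypothetical handler that deletes the named content and then publishes the government’s full request to the searchable archive. Every name in it (process_takedown, remove_content, the record layout) is an illustrative assumption, not a description of any existing system.

```python
import json
import pathlib
from datetime import datetime, timezone

# Hypothetical public, searchable archive of government takedown requests.
ARCHIVE_DIR = pathlib.Path("takedown_archive")


def remove_content(content_id: str) -> None:
    """Platform-specific deletion hook; stubbed here for illustration."""
    print(f"removed {content_id}")


def process_takedown(request_id: str, agency: str, content_id: str,
                     legal_basis: str, correspondence: str) -> None:
    """Comply with a government removal request, then publish it in full."""
    remove_content(content_id)
    record = {
        "request_id": request_id,
        "agency": agency,
        "content_id": content_id,
        "legal_basis": legal_basis,
        "correspondence": correspondence,  # verbatim text of the request
        "removed_at": datetime.now(timezone.utc).isoformat(),
    }
    ARCHIVE_DIR.mkdir(exist_ok=True)
    (ARCHIVE_DIR / f"{request_id}.json").write_text(json.dumps(record, indent=2))


if __name__ == "__main__":
    # Illustrative invocation; all values are placeholders.
    process_takedown("2023-0001", "Example Agency", "video-12345",
                     "cited statute (assumed)", "Full text of the request...")
```

On this design, the platform’s only editorial act is legal compliance, and the public archive makes every removal auditable after the fact.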

In any case, the hosting of content on these platforms constitutes a quasi-public service of considerable social value, and burdening providers with additional costs will benefit plaintiffs’ lawyers at the expense of everyone else.


Oliver Bateman is a historian and journalist based in Pittsburgh. He blogs, vlogs, and podcasts at his Substack, Oliver Bateman Does the Work.
