Algorithmic Recommendations as an Act of Creation: How the Supreme Court’s Ruling in Gonzalez v. Google, LLC Could Completely Change the Internet

*Anastasia Couch

I. The Twenty-Six Words that Built the Internet

The internet as we know it rests on a single sentence written in 1996.[1] Section 230 of the Communications Decency Act states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[2] For nearly three decades, the internet has boomed under this broad immunity, which has remained largely unaltered.[3] Now, however, § 230 will come before the Supreme Court for the first time in the pending case Gonzalez v. Google, LLC.[4] At issue is the liability of any site operator that uses an algorithm to curate or recommend third-party content to users.[5] Gonzalez challenges the sweeping immunity that has allowed internet giants like Google and Facebook to flourish.[6] The internet that emerges after the Gonzalez decision could be barely recognizable.[7]

Facially, the distinction drawn by § 230 is clear-cut. A site is a service provider if it “passively displays content” created by third parties, often in the form of user contributions.[8] Alternatively, a site is a content provider if it develops content itself or is responsible, in whole or in part, for the content’s creation.[9] Under § 230, a site is liable only for the content it creates or develops itself, not for republishing or hosting third-party content.[10] Sites are often both service providers and content developers.[11] Typically, the line between the two roles has been straightforward: a site faces liability only if it creates the material itself.[12] The plaintiff in Gonzalez, however, asks a far more consequential question: Does § 230 also immunize a service when it makes “targeted recommendations of information” provided by a third party? In other words, is the use of an algorithm an act of creation?[13]

II. Gonzalez v. Google, LLC.

Plaintiff Reynaldo Gonzalez is the father of Nohemi Gonzalez, a student who lost her life in the November 13, 2015 Paris Attacks.[14] Three ISIS fighters opened fire on the crowd at a café where Nohemi was dining.[15] One of the fighters in the café shooting had appeared in a 2014 ISIS YouTube video delivering a recruitment monologue.[16] Following the attack, two of the fighters posted links on several media outlets to their ISIS recruitment videos hosted on YouTube.[17]

Gonzalez alleges that Google (which owns YouTube) is liable for the ISIS content on its site specifically because of the website’s algorithmic recommendation and suggestion powers.[18] He argues that this is an act of content development because Google’s algorithm intentionally selects which users will be shown ISIS videos, targeting “users whose characteristics indicated that they would be interested in ISIS videos[.]”[19] In doing so, Google has allegedly furthered the ISIS mission because it “‘recommended ISIS videos to users’ and enabled users to ‘locate other videos and accounts related to ISIS[.]’”[20] Gonzalez further alleges that Google knows of the illicit content in the ISIS videos that YouTube hosts and recommends, and that the platform has become “essential and integral” to ISIS’ global terror efforts.[21]

Despite these allegations, the circuit court applied the broad protection of § 230, holding that “a website’s use of content-neutral algorithms, without more, does not expose it to liability for content posted by a third party. Under our existing case law, § 230 requires this result.”[22] Gonzalez counters that § 230 should not be read to eliminate all liability for sites that nudge users toward harmful material like inflammatory ISIS recruitment videos.[23]

III. Algorithms: Neutral Tool or Act of Creation?

In deciding Gonzalez, the Supreme Court will focus on how sites suggest third-party content.[24] The circuit court held that an algorithm functions like a search engine,[25] which carries classic immunity under the basic principle that the site provides content in response to user input.[26] In the court’s view, an algorithm works the same way in theory: the user inputs data about their preferences, and the service responds in turn.[27] As lower courts have noted, § 230’s legislative history supports this general conclusion.[28] Courts have previously applied a “neutral tool” analysis to determine whether a site’s function is providing content or services.[29] A function has been deemed a neutral tool if it “does nothing more than provide options,” which users “may adopt or reject at their discretion.”[30] Algorithmic recommendations have previously fallen within this category as “ordinary, neutral functions” of a site.[31]

Unfortunately for Gonzalez, the traditional interpretation of this neutrality is broad enough to cover instances where a website’s developers or owners know that third-party users are using the allegedly neutral tools to create illegal content.[32] Providing neutral tools that enable illicit activities “does not amount to ‘development’” of the content.[33] Prior decisions have likewise construed a website’s liability narrowly under these circumstances: only if the site elicits the illicit content and makes “aggressive use of it in conducting its business” can it be held liable for that third-party content.[34] Gonzalez will have to upend years of precedent to shift the long-standing application of § 230.[35]

IV. Conclusion

The current Court has already shown a willingness to disregard precedent;[36] if it chooses to do so in Gonzalez, the internet landscape will be fundamentally changed.[37] If the Supreme Court instead upholds existing precedent, it will cement broad, sweeping immunity from liability not just for tech giants but for any website that uses algorithmic recommendations. But there is hope for Gonzalez’s argument. The seemingly uniform conclusions of prior courts are, in reality, rife with close decisions and fiery dissents.[38] Dissenting judges note how algorithms nudge “susceptible souls . . . down dark paths”[39] and how “the benign aspects of Google/YouTube, Facebook, and Twitter have been transformed into a chillingly effective propaganda device.”[40] The issue is far from resolved and is easily swayed by political interests.[41] Moreover, the urgency of the decision has reached a fever pitch as we slip deeper into the age of the algorithm, where seemingly all internet content is curated “just for you.”[42] Now the Court must decide: is the algorithm a mirror, merely reflecting back whatever the user inputs, or is it something more powerful?[43]

*Anastasia Couch is a second-year J.D. candidate at the University of Baltimore School of Law, where she is a Staff Editor for Law Review and a member of the Royal Graham Shannonhouse III Honor Society. She is currently interning with Citizens for Responsibility and Ethics in Washington (CREW) and interned with the Maryland State Ethics Commission in Summer 2022. Anastasia pursues her passion for government oversight (specifically in warfare and technology) by volunteering with Women for Weapons Trade Transparency.


[1] Christopher Cox, The Origins and Original Intent of Section 230 of the Communications Decency Act, Rich. J.L. & Tech. ¶ 1 (Aug. 27, 2020), https://jolt.richmond.edu/2020/08/27/the-origins-and-original-intent-of-section-230-of-the-communications-decency-act/. Cox, a former House Representative, was one of the primary authors of § 230. Id.; Michael Hiltzik, Michael Hiltzik: The Supreme Court Holds the Internet’s Fate in Its Hands, and You Should Be Terrified, Waco Tribune-Herald (Oct. 14, 2022), https://wacotrib.com/opinion/columnists/michael-hiltzik-the-supreme-court-holds-the-internets-fate-in-its-hands-and-you-should/article_89050c17-94a9-5831-8ea2-2f82f2ab754a.html. When Section 230 was written, Google did not exist; neither did Facebook, Twitter, or YouTube. Id.

          [2] 47 U.S.C. § 230(c)(1). 

[3] Ian Millhiser, A New Supreme Court Case Could Fundamentally Change the Internet, Vox (Oct. 6, 2022), https://www.vox.com/policy-and-politics/2022/10/6/23389028/supreme-court-section-230-google-gonzalez-youtube-twitter-facebook-harry-styles; CDA 230: The Most Important Law Protecting Internet Speech, Elec. Frontier Found., https://www.eff.org/issues/cda230 (last visited Oct. 19, 2022).

          [4] Rebecca Kern, SCOTUS to Hear Challenge to Section 230 Protections, Politico (Oct. 3, 2022), https://www.politico.com/news/2022/10/03/scotus-section-230-google-twitter-youtube-00060007.

[5] Petition for Writ of Certiorari, Gonzalez v. Google, LLC (No. 21-1333) at 3–4.

          [6] See Cox, supra note 1, at ¶ 60.

          [7] Kern, supra note 4.

[8] Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1162 (9th Cir. 2008) [hereinafter Roommates]. For example, a restaurant owner cannot sue Yelp if a customer posts a defamatory review on the website. See Millhiser, supra note 3.

[9] Roommates, 521 F.3d at 1162; 47 U.S.C. § 230(f)(3).

[10] Roommates, 521 F.3d at 1162; Elec. Frontier Found., supra note 3.

          [11] Roommates, 521 F.3d at 1162.

          [12] See discussion infra Section III.

[13] Petition for Writ of Certiorari, Gonzalez v. Google, LLC (No. 21-1333) at 8.

          [14] Gonzalez v. Google LLC, 2 F.4th 871, 881 (9th Cir. 2021).

          [15] Id.

          [16] Id.

          [17] Id.

          [18] Id.

[19] Petition for Writ of Certiorari, Gonzalez v. Google, LLC (No. 21-1333) at 10.

          [20] Gonzalez, 2 F.4th at 881.

          [21] Id. at 882–83.

          [22] Gonzalez, 2 F.4th at 896 (emphasis added).

[23] Petition for Writ of Certiorari, Gonzalez v. Google, LLC (No. 21-1333) at 8; Gonzalez, 2 F.4th at 881.

          [24] See id. at 895; Roommates, 521 F.3d at 1175.

[25] Gonzalez, 2 F.4th at 895.

[26] Roommates, 521 F.3d at 1175.

[27] Gonzalez, 2 F.4th at 895.

[28] See Dyroff v. Ultimate Software Group, Inc., No. 17-CV-05359, 2017 WL 5665670, at *9 (N.D. Cal. Nov. 26, 2017) (holding that the site “merely provided content-neutral social-network functionalities—recommendations and notifications about posts”).

          [29] Dyroff, 2017 WL 5665670 at *8; Roommates, 521 F.3d at 1167 (“Reading the exception for co-developers as applying only to content that originates entirely with the website . . . ignores the words ‘development . . . in part’ in the statutory passage.”).

[30] Goddard v. Google, Inc., 640 F. Supp. 2d 1193, 1198 (N.D. Cal. 2009).

          [31] Dyroff, 2017 WL 5665670 at *8.

[32] Goddard, 640 F. Supp. 2d at 1196.

[33] Roommates, 521 F.3d at 1168–69; see also Force v. Facebook, Inc., 934 F.3d 53, 70 (2d Cir. 2019) (holding that although Facebook’s algorithms may have made content more visible or available, this did not amount to developing the underlying information).

[34] Roommates, 521 F.3d at 1172. A site develops content if it “contributes materially to the alleged illegality of the conduct.” Id. at 1168.

          [35] Dyroff, 2017 WL 5665670 at *7 (“In similar cases, courts have rejected plaintiffs’ attempts to plead around immunity by basing liability on a website’s tools”).

          [36] David Cole & Rotimi Adeoye, A Radical Supreme Court Term in Review, American Civil Liberties Union (July 7, 2022), https://www.aclu.org/news/civil-liberties/a-radical-supreme-court-term-in-review.

          [37] Evan Gerstmann, Supreme Court To Decide Whether YouTube Can Be Sued For Abetting Terrorism, Forbes (Oct. 3, 2022), https://www.forbes.com/sites/evangerstmann/2022/10/03/supreme-court-to-decide-whether-youtube-can-be-sued-for-abetting-terrorism/?sh=4049a8d10162.

[38] See, e.g., Roommates, 521 F.3d at 1189 (McKeown, J., concurring in part, dissenting in part) (“Because the statute itself is cumbersome to interpret in light of today’s Internet architecture, and because the decision today will ripple through the billions of web pages already online, and the countless pages to come in the future, I would take a cautious, careful, and precise approach to the restriction of immunity, not the broad swath cut by the majority.”).

[39] Gonzalez v. Google LLC, 2 F.4th 871, 950–51 (9th Cir. 2021) (Katzmann, C.J., concurring in part, dissenting in part).

          [40] Gonzalez, 2 F.4th at 921 (Gould, J., concurring in part, dissenting in part).

          [41] See Gerstmann, supra note 37; Hiltzik, supra note 1.

[42] See generally Kyle Chayka, The Age of Algorithmic Anxiety, New Yorker (July 25, 2022), https://www.newyorker.com/culture/infinite-scroll/the-age-of-algorithmic-anxiety (describing the omnipresence of the algorithm).

[43] See Eleanor Cummins, The Creepy TikTok Algorithm Doesn’t Know You, Wired (Jan. 3, 2022), https://www.wired.com/story/tiktok-algorithm-mental-health-psychology/.
