I. The Twenty-Six Words that Built the Internet
The internet as we know it rests on a single sentence written in 1996. Section 230 of the Communications Decency Act states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” For over a quarter century, the internet has boomed under this blanket immunity, which has never been altered. Now, however, § 230 will stand before the Supreme Court for the first time in the pending case Gonzalez v. Google, LLC. At issue is the liability of any site operator that uses an algorithm to curate or recommend third-party content to users. Gonzalez challenges the sweeping immunity that has allowed internet giants like Google and Facebook to flourish. The internet that emerges after the Gonzalez decision could be barely recognizable.
Facially, the distinction made by § 230 is clear-cut. A site is a service provider if it “passively displays content” created by third parties, often in the form of user contributions. Alternatively, a site is a content provider if it creates content itself or is responsible, in whole or in part, for the content’s development. Under § 230, a site is liable only for the content it creates or develops itself, not for republishing or hosting third-party content. Sites are often both service providers and content developers. Typically, the distinction between the two roles has been straightforward—the site faces liability only if it creates the material itself. However, the plaintiff in Gonzalez asks a radically significant question: Does § 230 also immunize a service when it makes “targeted recommendations of information” made by a third party? In other words, is the use of an algorithm an act of creation?
II. Gonzalez v. Google, LLC.
Plaintiff Reynaldo Gonzalez is the father of Nohemi Gonzalez, a student who lost her life in the November 13, 2015 Paris Attacks. Three ISIS fighters opened fire on the crowd at a café where Nohemi was dining. One of the fighters in the café shooting had appeared in a 2014 ISIS YouTube video delivering a recruitment monologue. Following the attack, two of the fighters’ ISIS recruitment videos, hosted on YouTube, were linked on several media outlets.
Gonzalez alleges that Google (which owns YouTube) is liable for the ISIS content on its site specifically due to the website’s algorithmic recommendation and suggestion powers. He argues that this is an act of content development because Google’s algorithm intentionally selects users to show ISIS videos to and targets “users whose characteristics indicated that they would be interested in ISIS videos[.]” In doing so, Google has allegedly furthered the ISIS mission because it “‘recommended ISIS videos to users’ and enabled users to ‘locate other videos and accounts related to ISIS[.]’” Gonzalez alleges that Google is aware of the illicit content in ISIS’ videos hosted on YouTube and recommended by YouTube and that YouTube has become “essential and integral” to ISIS’ global terror efforts.
Nevertheless, the circuit court applied the broad protection of § 230, holding that “a website’s use of content-neutral algorithms, without more, does not expose it to liability for content posted by a third party. Under our existing case law, § 230 requires this result.” Gonzalez counters that this reading of § 230 eliminates all liability for sites that nudge users toward harmful materials like inflammatory ISIS recruitment videos.
III. Algorithms: Neutral Tool or Act of Creation?
In deciding Gonzalez, the Supreme Court will focus on how sites suggest third-party content. The circuit court held that an algorithm functions like a search engine, which carries classic immunity under the basic principle that the site provides content in response to user input. In the court’s view, an algorithm works the same way in theory—a user inputs data about their preferences, and the service responds in turn. As the lower court noted, § 230’s legislative history supports this general conclusion. Courts have previously applied a “neutral tool” analysis to determine whether a site’s function is providing content or services. A function has been deemed a neutral tool if it “does nothing more than provide options,” which users “may adopt or reject at their discretion.” Algorithmic recommendations have previously fallen within this category as “ordinary, neutral functions” of a site.
Unfortunately for Gonzalez, the traditional interpretation of this neutrality is broad enough to encompass instances where the website’s developers or owners know that third-party users are using the allegedly neutral tools to create illegal content. Providing neutral tools that enable illicit activities “does not amount to ‘development’” of the content. Likewise, prior decisions have given the website company’s liability a narrow scope under these circumstances—only if the site elicits the illicit content and makes “aggressive use of it in conducting its business” can it be found liable for that third-party content. Gonzalez will have to upend years of precedent to shift the long-standing application of § 230.
The current Court has already shown a willingness to disregard precedent; if it chooses to do likewise in Gonzalez, the internet landscape will be fundamentally changed. If the Supreme Court upholds case precedent, it will cement broad, sweeping immunity from liability not just for tech giants but for any website that uses algorithmic recommendations. But there is hope for Gonzalez’s argument. The seemingly uniform conclusions of prior courts are, in reality, rife with close decisions and fiery dissents. Dissenting judges note how algorithms nudge “susceptible souls . . . down dark paths” and how “the benign aspects of Google/YouTube, Facebook, and Twitter have been transformed into a chillingly effective propaganda device.” This issue is far from resolved and is easily swayed by political interests. Moreover, the decision arrives at a moment of particular urgency, as we slip ever deeper into the age of the algorithm, where seemingly all internet content is curated “just for you.” Now the Court must decide—is the algorithm a mirror, merely reflecting back whatever the user inputs, or is it something more powerful?
*Anastasia Couch is a second-year J.D. candidate at the University of Baltimore School of Law, where she is a Staff Editor for Law Review and a member of the Royal Graham Shannonhouse III Honor Society. She is currently interning with Citizens for Responsibility and Ethics in Washington (CREW), and interned with the Maryland State Ethics Commission in Summer 2022. Anastasia pursues her passion for government oversight (specifically in warfare and technology) by volunteering with Women for Weapons Trade Transparency.
 Christopher Cox, The Origins and Original Intent of Section 230 of the Communications Decency Act, Univ. Rich. J. L. Tech., ¶ 1 (Aug. 27, 2020), https://jolt.richmond.edu/2020/08/27/the-origins-and-original-intent-of-section-230-of-the-communications-decency-act/. As a former House Representative, Cox was one of the primary authors of § 230. Id.; Michael Hiltzik, Michael Hiltzik: The Supreme Court Holds the Internet’s Fate in Its Hands, and You Should Be Terrified, Waco Tribune-Herald (Oct. 14, 2022), https://wacotrib.com/opinion/columnists/michael-hiltzik-the-supreme-court-holds-the-internets-fate-in-its-hands-and-you-should/article_89050c17-94a9-5831-8ea2-2f82f2ab754a.html. When Section 230 was written, Google did not exist; neither did Facebook, Twitter, or YouTube. Id.
 47 U.S.C. § 230(c)(1).
 Ian Millhiser, A New Supreme Court Case Could Fundamentally Change the Internet, Vox (Oct. 6, 2022), https://www.vox.com/policy-and-politics/2022/10/6/23389028/supreme-court-section-230-google-gonzalez-youtube-twitter-facebook-harry-styles; CDA 230: The Most Important Law Protecting Internet Speech, Elec. Frontier Found., https://www.eff.org/issues/cda230 (last visited Oct. 19, 2022).
 Rebecca Kern, SCOTUS to Hear Challenge to Section 230 Protections, Politico (Oct. 3, 2022), https://www.politico.com/news/2022/10/03/scotus-section-230-google-twitter-youtube-00060007.
 Petition for Writ of Certiorari, Gonzalez v. Google, LLC., (No. 21-1333) at 3–4.
 See Cox, supra note 1, at ¶ 60.
 Kern, supra note 4.
 Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1162 (9th Cir. 2008) [hereinafter Roommates]. For example, a restaurant owner cannot sue Yelp if a customer posts a defamatory review on the website. See Millhiser, supra note 3.
 Roommates, 521 F.3d at 1162; 47 U.S.C. § 230(f)(3).
 Roommates, 521 F.3d at 1162; Elec. Frontier Found., supra note 3.
 Roommates, 521 F.3d at 1162.
 See discussion infra Section III.
 Petition for Writ of Certiorari, Gonzalez v. Google, LLC., (No. 21-1333) at 8.
 Gonzalez v. Google LLC, 2 F.4th 871, 881 (9th Cir. 2021).
 Petition for Writ of Certiorari, Gonzalez v. Google, LLC., (No. 21-1333) at 10.
 Gonzalez, 2 F.4th at 881.
 Id. at 882–83.
 Gonzalez, 2 F.4th at 896 (emphasis added).
Petition for Writ of Certiorari, Gonzalez v. Google, LLC., (No. 21-1333) at 8; Gonzalez, 2 F.4th at 881.
 See id. at 895; Roommates, 521 F.3d at 1175.
 Gonzalez, 2 F.4th at 895.
Roommates, 521 F.3d at 1175.
 Gonzalez, 2 F.4th at 895.
 See Dyroff v. Ultimate Software Grp., Inc., No. 17-CV-05359, 2017 WL 5665670, at *9 (N.D. Cal. Nov. 26, 2017) (holding that the site “merely provided content-neutral social-network functionalities—recommendations and notifications about posts”).
 Dyroff, 2017 WL 5665670 at *8; Roommates, 521 F.3d at 1167 (“Reading the exception for co-developers as applying only to content that originates entirely with the website . . . ignores the words ‘development . . . in part’ in the statutory passage.”).
 Goddard v. Google, Inc., 640 F. Supp. 2d 1193, 1198 (N.D. Cal. 2009).
 Dyroff, 2017 WL 5665670 at *8.
 Goddard, 640 F.Supp.2d at 1196.
 Roommates, 521 F.3d at 1168–69; see also Force v. Facebook, Inc., 934 F.3d 53, 70 (2d Cir. 2019) (holding that although Facebook’s algorithms may have made content more visible or available, this did not amount to developing the underlying information).
Roommates, 521 F.3d at 1172. Development of content is determined if a site “contributes materially to the alleged illegality of the conduct.” Id. at 1168.
 Dyroff, 2017 WL 5665670 at *7 (“In similar cases, courts have rejected plaintiffs’ attempts to plead around immunity by basing liability on a website’s tools”).
 David Cole & Rotimi Adeoye, A Radical Supreme Court Term in Review, American Civil Liberties Union (July 7, 2022), https://www.aclu.org/news/civil-liberties/a-radical-supreme-court-term-in-review.
 Evan Gerstmann, Supreme Court To Decide Whether YouTube Can Be Sued For Abetting Terrorism, Forbes (Oct. 3, 2022), https://www.forbes.com/sites/evangerstmann/2022/10/03/supreme-court-to-decide-whether-youtube-can-be-sued-for-abetting-terrorism/?sh=4049a8d10162.
 See, e.g., Roommates, 521 F.3d at 1189 (McKeown, J., concurring in part, dissenting in part) (“Because the statute itself is cumbersome to interpret in light of today’s Internet architecture, and because the decision today will ripple through the billions of web pages already online, and the countless pages to come in the future, I would take a cautious, careful, and precise approach to the restriction of immunity, not the broad swath cut by the majority.”).
 Gonzalez v. Google LLC, 2 F.4th 871, 950–51 (9th Cir. 2021) (Katzmann, C.J., concurring in part, dissenting in part).
 Gonzalez, 2 F.4th at 921 (Gould, J., concurring in part, dissenting in part).
 See Gerstmann, supra note 37; Hiltzik, supra note 1.
 See generally, Kyle Chayka, The Age of Algorithmic Anxiety, New Yorker (July 25, 2022), https://www.newyorker.com/culture/infinite-scroll/the-age-of-algorithmic-anxiety (describing the omnipresence of the algorithm).
 See Eleanor Cummins, The Creepy TikTok Algorithm Doesn’t Know You¸ Wired (Jan. 3, 2022), https://www.wired.com/story/tiktok-algorithm-mental-health-psychology/.