February 2023 | Volume 14, Issue 6


Find the full video and article on ABC News.

According to the article, seven years after Islamic State extremists murdered their daughter, the family of Nohemi Gonzalez, the only American killed in the 2015 Paris terror attacks, heads to the U.S. Supreme Court next month seeking to pin some responsibility for the tragedy on social media giant YouTube. “If some changes can be done to prevent these terrorist people [from] keeping killing human beings, then that is a big thing,” Beatrice Gonzalez, Nohemi Gonzalez’s mother, told the media in the family’s first interview about the case.

Beatrice Gonzalez alleges that Google’s YouTube algorithms – a series of proprietary software instructions that recommend video content to users – effectively amplified Islamic State-produced materials in support of the extremists who killed her daughter, a 23-year-old college student who had been studying in France. The family wants to bring a case against the company under the Anti-Terrorism Act but has been blocked from doing so by a landmark federal law that has given sweeping legal immunity to social media companies for more than 25 years.

Section 230 of the Communications Decency Act of 1996 states that internet companies, including social media platforms, cannot be sued over third-party content uploaded by users – such as photos, videos, and commentary – or for decisions site operators make to moderate, or filter, what appears online. Oral arguments in Gonzalez v. Google, set for February 21 at the Supreme Court, will focus on the scope of that immunity: whether it covers algorithms and whether Gonzalez should be able to pursue her claims against YouTube’s parent company in court. “Hopefully this will change the laws and it’ll be for the good by being more careful about the social media, so [other parents] never have the pain that we’re feeling,” said Nohemi Gonzalez’s stepfather, Jose Hernandez.

The company has expressed sympathy to the Gonzalez family but strongly denies any connection to the attack. YouTube says it bans terrorist content across its platform and that its algorithms help catch and remove violent extremist videos, noting that 95 percent of those removed last year were automatically detected – most before receiving 10 views. “Undercutting Section 230 would make it harder for websites to do this work,” YouTube spokesperson Ivy Choi told the media. “Websites would either over-filter any conceivably controversial materials and creators, or shut their eyes to objectionable content like scams, fraud, harassment and obscenity to avoid liability – making services far less useful, less open and less safe.”

Lower courts, siding with Google, have held that Section 230 shields algorithms from liability claims. For years, members of Congress from both parties have debated changes to Section 230 that would promote greater transparency and accountability for internet companies.

President Joe Biden in a recent Wall Street Journal op-ed called for “fundamental reform” to the law, but there is not a political consensus on the way forward. The Gonzalez case is the first time the nation’s highest court will consider limits to immunity for internet companies. “There are enormous amounts of money at stake if the platforms were to be held liable for every time a terrorist attack could in any way be tangentially traced to material that the platforms carried,” said Michael Karanicolas, executive director of the Institute for Technology Law & Policy at UCLA.

Section 230 was passed by bipartisan majorities in Congress and has long been considered a cornerstone of the modern internet, protecting online platforms as spaces for creativity, innovation, and open public debate. The crucial 26 words in the statute say: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Internet companies “get to decide what to carry. They get to decide what not to carry,” said Karanicolas. “And they get to decide how to design their algorithms – to amplify certain types of content or to de-emphasize other types of content.” Subjecting those decisions to legal scrutiny could have major implications for how the internet functions, experts say. “Large companies can maybe throw a battalion of lawyers at a problem and litigate their way forward, but new startups will simply not be able to get over that [financial burden],” said Matthew Schruers, president of the Computer and Communications Industry Association. “No digital service wants their products to be used by bad actors. But to try to use liability here is going to produce a contrary result,” Schruers added.

Advocates for overhauling Section 230 say the legal protection far exceeds what Congress intended, much earlier in the development of the modern internet, and insulates companies from accountability. “When this statute was enacted in 1996, it was for the express purpose of protecting kids from seeing obscene material online and protecting companies who take obscene material offline to protect kids. And it’s been turned on its head,” said Matthew Bergman, an attorney and founder of the Social Media Victims Law Center, who represents hundreds of plaintiffs alleging harm from social media use. Frances Haugen, the former Facebook insider who has warned Congress about the harms of internet companies’ algorithms, said setting new limits on legal immunity could incentivize companies to improve their products. “We have the tools, but all these things decrease usage. They make the companies a little less money,” Haugen said. “So, in a world where our business models are fueled by clicking on ads, there aren’t independent market incentives for making products that help people be healthy and happy.” Haugen believes Section 230 immunity does not have to be all or nothing but says regulators need to update the law to reflect current internet use and the proliferation of documented psychological harms.

“The Supreme Court isn’t really the right actor for dealing with this issue. You know, they can come in and make a very blunt judgment. They can’t, for example, set up a new regulatory framework that might be a more effective way to govern the internet,” Haugen said. The tech industry agrees that lawmakers, not the high court, should be the final arbiters of internet policy and that changes to immunity protection are not in Americans’ best interest.

But Bergman, the attorney for social media users claiming harm, and the Gonzalez family argue that the justices need to act under a plain reading of the law and permit the Gonzalez family to move forward with their suit against YouTube’s parent company. “It will certainly provide a more sensible opportunity for families to hold companies accountable,” Bergman said. “All it will do is allow them to seek discovery and prove their case. Everyone is entitled to a defense, as are the social media companies, but it will simply kind of open the courthouse door.”

Beatrice Gonzalez said she did not bring the case seeking financial compensation from Google and is instead seeking to enact a small change to the system in her daughter’s memory. “We want justice, but we’re not angry,” she said. “If we can do a little change in our community by knowing that it can be a bigger change in the world is what brings me peace in my heart.”

Discussion Questions

  1. Explain Section 230 of the Communications Decency Act of 1996.
    Section 230 is a section of Title 47 of the United States Code that was enacted as part of the United States Communications Decency Act and generally provides immunity for website platforms regarding third-party content. Section 230(c)(1) provides immunity from liability for providers and users of an “interactive computer service” who publish information provided by third-party users, stating that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

    Section 230(c)(2) further provides “Good Samaritan” protection from civil liability for operators of interactive computer services in the good faith removal or moderation of third-party material they deem “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

    Section 230 was developed in response to a pair of lawsuits against online service providers in the early and mid-1990s – Cubby, Inc. v. CompuServe Inc. (1991) and Stratton Oakmont, Inc. v. Prodigy Services Co. (1995) – that resulted in different interpretations of whether service providers should be treated as publishers or, alternatively, as distributors of content created by their users. It was enacted as part of the Communications Decency Act (CDA) of 1996 (a common name for Title V of the Telecommunications Act of 1996) and formally codified as part of the Communications Act of 1934. After passage of the Telecommunications Act, the CDA was challenged in the courts, and in Reno v. American Civil Liberties Union (1997) the Supreme Court ruled its anti-indecency provisions unconstitutional; Section 230, however, was determined to be severable from the rest of the legislation and remained in place. Since then, several legal challenges have upheld the constitutionality of Section 230.

    Section 230’s protections are not limitless: the statute does not shield providers from federal criminal law or from intellectual property claims such as copyright infringement. In 2018, Section 230 was amended by the Stop Enabling Sex Traffickers Act to require the removal of material violating federal and state sex trafficking laws. In the years since, Section 230’s protections have come under greater scrutiny over issues such as hate speech and alleged ideological bias, given the power technology companies hold over political discussion, and the law became a major issue during the 2020 United States presidential election.

    Passed when Internet use in the United States was just starting to expand in both the breadth of services and the range of consumers, Section 230 has frequently been referred to as a key law that allowed the Internet to develop.
  2. Summarize the plaintiffs’ argument(s) in Gonzalez v. Google.
    As indicated in the article, the plaintiffs contend that Google’s YouTube algorithms – a series of proprietary software instructions that recommend video content to users – effectively amplified Islamic State-produced materials in support of the extremists who killed their daughter, a 23-year-old college student who had been studying in France. The plaintiffs claim shared responsibility for their daughter’s death among the extremists who killed her; those who posted the content on YouTube; and, most important for the lawsuit, YouTube itself, for providing a forum in which this information was shared.
  3. In your reasoned opinion, should social media companies be subject to liability for postings on their platforms? Why or why not?
    This is an opinion question, so student responses may vary. Your author is inclined to favor some degree of liability for social media companies for postings on their platforms, particularly given their global reach and the fact that consumers of social media increasingly rely on such platforms as their “community.”