Section 230 of the Communications Decency Act, codified at 47 U.S.C. § 230, has long been a cornerstone of the law governing the internet, providing crucial liability protections for online platforms. It is now under intense scrutiny as the law faces significant re-examination and potential reform. Proposed legislation would sunset Section 230 at the end of 2025, effectively setting a timer on Congressional action and giving lawmakers roughly eighteen months to establish a new liability standard. The move seeks to address mounting concerns about the role of online platforms in hosting and moderating user-generated content. At the crux of the issue is whether online platforms can be held responsible for the content their users post.

Advocates of change argue that many online platforms have become breeding grounds for hate speech, incendiary rhetoric, and misinformation. They contend that Section 230's broad immunity provisions allow companies to dodge responsibility for the potentially harmful content on their sites. Conversely, opponents worry that altering or removing Section 230 could lead to excessive censorship, stifle free speech, and lead to a flood of litigation against online platforms. They emphasize that the current protections are necessary for maintaining a free and open internet.

Section 230 Immunity

Section 230 provides, at subsection (c)(1), that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Subsections (c)(2)(A)-(B) provide further immunity for good-faith actions that interactive computer services take to restrict access to objectionable material. An interactive computer service is defined by the statute as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.”

In other words, Section 230 immunizes online platforms for the content others post on their websites, applications, or similar internet-based systems, and also protects those platforms when they decide to screen certain content.

Congressional Findings and Policy Underlying Section 230

The congressional findings outlined in the statute emphasize the transformative impact of the internet and interactive computer services, noting their role in advancing educational and informational resources. The findings also emphasize that the internet offers “a forum for true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.” The statute further outlines policies to promote the use and development of the internet and “to preserve the vibrant and competitive free market that presently exists,” as well as to encourage the development of technology, to permit parents and service providers to block and filter content, and to enable enforcement of Federal criminal laws.

Section 230, therefore, struck a balance between these various, and at times competing, goals. Specifically, allowing service providers to block and filter content, while otherwise recognizing that they should not be liable for third-party content, promotes the development and use of the internet, allows the free market to determine which sites parents and others choose to use, and ensures that liability for illegal content rests with the individual who actually posts it.

Significant Court Decisions Regarding Section 230

In May 2023, the Supreme Court of the United States declined to address Section 230 in Gonzalez v. Google LLC, 143 S. Ct. 1191 (2023), instructing lower courts to look to its companion decision in Twitter, Inc. v. Taamneh, 143 S. Ct. 1206 (2023), for guidance. In the Twitter decision, the Court held that Twitter was not legally responsible for aiding and abetting a terrorist organization where it was accused of distributing ISIS material as recommended content for users.

The practical effect was to leave the broad immunity provided by Section 230 intact: treating platforms as the publisher or speaker of user-generated content would undermine the statute's fundamental aims of promoting free speech and innovation on the internet. Because the factual background of Gonzalez was analogous, involving a family's suit accusing Google of allowing YouTube to recommend an ISIS recruitment video, the Court in Gonzalez determined that the Twitter decision already covered the relevant question and remanded for reconsideration in light of that decision.

The rulings in both cases underscore the comprehensive protections currently afforded to online platforms under Section 230, even when algorithms recommend potentially harmful third-party content. They reinforce the legal framework that allows interactive computer service providers to operate without liability for most user-generated content.

The Debate Over Section 230

The debate over Section 230 centers on weighing the need to protect free speech against the harms that can arise from unregulated online content. Advocates for reform argue that it is often impractical or impossible to hold third parties liable for illegal content, so there must be a way to hold platforms accountable for enabling or facilitating its distribution. They believe that a revised Section 230 could foster a safer online environment, and by extension a safer society, by encouraging platforms to take more proactive measures against dangerous and misleading content.

On the other hand, defenders of Section 230 caution that changes could lead to excessive censorship, limit free speech, stifle innovation, and invite a flood of litigation against online service providers. They argue that the broad protections currently in place are essential for small platforms and startups in particular, which are more susceptible to being shuttered by claims of derivative liability for the content of often insolvent third parties.

Moving Forward

As the proposed legislation to sunset Section 230 progresses, the conversation will likely intensify. One of the ideas behind the proposed sunset is to force “Big Tech” to come to the bargaining table with lawmakers on a new version of the statute. Lawmakers face the challenging task of writing new standards that address the concerns of both sides. The outcome will have profound implications for the future of the internet, free speech, and the responsibility of online platforms. Congress and industry leaders must navigate these complex issues to develop a balanced and effective approach to internet regulation. And, if the immunity is ultimately eliminated or significantly curtailed, online service providers will need to take steps to protect themselves from liability for the content posted by their users.

The attorneys at Wood Smith Henning & Berman are keeping abreast of this issue and will provide updates as new developments occur. In the interim, should you have any questions, please do not hesitate to reach out to a member of our team.