
Section 230 of the Communications Decency Act Explained


In 1996, the United States Congress passed the Communications Decency Act, a law that aimed to regulate the Internet and protect children from indecent online content. However, a key provision of the law, Section 230, has become one of the most controversial pieces of legislation in the tech industry.

Learn more about Section 230, the controversy surrounding it and how changes to this provision could affect the future of the Internet for both users and platforms.

What Is Section 230? 

Section 230, part of the 1996 Communications Decency Act (CDA), is a provision that shields Internet platforms from legal liability for user-generated content, stating that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

This means that under Section 230, Internet platforms and big tech companies like Facebook, Twitter and YouTube cannot be held legally responsible for the content posted by users. It also provides legal immunity for platforms that make “good faith” efforts to moderate or remove content that is illegal or “that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”

The intended purpose of Section 230 back in 1996 was to encourage the growth of the Internet and the development of online communities by providing legal protections for platforms that host user-generated content. And it did just that; many credit it with enabling the rise of social media and other online services that have become so integral to our daily lives.

Read the Policy vs Politics Section 230 Policy Brief to learn more.

The Controversy Surrounding Section 230

Despite its intended purpose, Section 230 has become the subject of intense debate and controversy, with opposing arguments coming from various sides. 

Some politicians and activists have advocated amending or repealing the law altogether, while others argue that it is essential for protecting free speech online.

In 2020, then-President Donald Trump and other Republicans pushed for changes to Section 230 in response to perceived bias against conservative voices on social media platforms.

In June 2020, the Department of Justice released a report recommending reforms to Section 230, including holding platforms responsible when they fail to address illegal content, such as child exploitation and terrorism, while maintaining protections for good-faith content moderation.

In July 2020, the Department of Commerce also filed a petition on the future of Section 230. The petition supported holding online platforms accountable for content that promotes or facilitates illegal activity, while cautioning against changes that could limit free expression online.

Since then, the debate over the future of Section 230 has continued, with lawmakers seeking to update the law so that it reflects and regulates today's Internet landscape rather than 1996's.

What Could Happen If It Is Changed or Repealed?

If Section 230 were changed or repealed, the implications for online platforms and Internet users could be significant. Proposals for reforming Section 230 generally narrow the broad immunities that platform owners currently receive.

Without the legal protections provided by Section 230, platforms would be more vulnerable to lawsuits and other legal challenges over user-generated content. This could lead to the suppression of free expression, as platforms might heavily censor content to avoid potential liability. Platforms might also become more reluctant to host user-generated content, particularly controversial or sensitive material, limiting the diversity of voices and opinions on the Internet. It could also put smaller platforms and startups at a disadvantage, as they would struggle to compete against larger, more established platforms with the resources to handle such legal challenges.

However, it is also possible that changes to Section 230 could improve online safety and content moderation. Some propose bringing US law closer to the EU standard by shifting from a “knowledge” standard to a “should have known” standard. Under this approach, platforms could be held accountable for harmful content they reasonably should have detected, even before anyone complains, incentivizing its swift removal, since they could face legal consequences for failing to act.

The Future of the Internet

In 1996, an estimated 40 million people used the Internet worldwide. As of January 2023, that number has reached 5.16 billion.

In the 27 years since its passage, Section 230 of the Communications Decency Act has become a controversial provision, with critics arguing that it allows platforms to evade responsibility for harmful content, while defenders maintain that it is necessary to protect free expression online.

Changes to the law could have significant implications for Internet users and online platforms, with potential impacts on free expression and online safety. While the debate over Section 230 is likely to continue, it is important to remember that any changes to the law must balance the interests of free expression with the need for a safe and welcoming online environment for all users.

Learn more about pending and written government policies, laws, regulations and actions that impact Americans and future generations at PolicyvsPolitics.org.
