It’s time to kill Section 230

The 26 words that helped create the internet are not serving users anymore.

By LILY CITRON
Instead of letting an almost 30-year-old piece of legislation guide our understanding of the internet, we should implement new laws that address current online issues.
(Leila Yi / Daily Trojan)

For 29 years, one passage, a small portion of the larger Communications Decency Act, has shaped the foundations of the internet as we know it. 

Often referred to as the 26 words that created the internet, Section 230(c)(1) of the CDA, titled “Treatment of publisher or speaker,” reads: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

This means that an interactive service provider on the internet will not be held responsible for the content posted by a user onto their site. In other words, platforms like Meta, YouTube and X bear essentially no legal liability for the hate speech, exploitative content or misinformation posted to their sites.


Given the current state of the internet, this may seem absurd, but in 1996, this piece of legislation was essential in safeguarding the young information superhighway. At the time, the internet largely consisted of small blogs and forums that, if held liable for harmful information posted by third parties, would have been crushed by lawsuits. Section 230 determined that platforms on the internet work more like a corkboard in a community center than like a newspaper.

This distinction is vital: because newspapers control and regulate the information they print, they are legally liable for it. Under Section 230, a social media platform is not treated as a publisher, on the assumption that it will not meaningfully alter the information on its site, so it cannot be held responsible for that content.

In recent years, it has become painfully apparent that social media corporations have become dangerous to our society. They have spread harmful misinformation, including mistruths about vaccines that have led to the deaths of thousands; they have refused to combat hate speech on their sites; and they have continually tolerated violent content. Social media platforms have been left unchecked for too long. Something needs to change.

The idea of the internet as a marketplace of ideas, or of platforms as impartial observers of the contributions of those who use them, is antiquated and unhelpful. Of course social media platforms alter how content reaches users on their sites.

This is done through content moderation, something explicitly protected in Section 230 under its Good Samaritan provision. Yet time and time again, these platforms have shown us that they are not Good Samaritans, whether by reacting too slowly to harmful content or by not reacting at all.

Along with content moderation, these platforms fundamentally change the way information exists on their sites through recommendation algorithms, which ensure that not all content receives equal reach. Algorithms inherently alter which media get pushed to users. When the CDA was passed, no one could have imagined how thoroughly social media platforms would come to shape the way information is distributed.

It is absolutely undeniable that the way social media chooses to push content is harmful. Many social media platforms, including Meta and YouTube, have been shown to push content that is controversial and hateful, because it engages viewers more than content that elicits positive feelings. The refusal to hold these sites responsible for the content they push, a shield this act establishes, is a precedent we need to overturn.

Social media platforms are due for a reckoning. They need to feel pressure, from users and legislators alike, to ensure that the content on their platforms is not hateful or harmful. As long as Section 230 exists, this will not happen. Social media platforms need to be held accountable for the ways that their content has been used to damage the mental health of young people, polarize nations and contribute to the perpetration of a genocide. 

Because social media sites have allowed the dissemination of harmful content for so long, it is clear that these platforms need to change. While I do not believe abolishing Section 230 is the only step needed to reform today’s internet, genuine change will not come until we can hold these platforms responsible for the information they promote.

This policy was passed to help an emerging form of communication stand on its own without being buried in lawsuits. But the internet has outgrown these guardrails. 
