What you need to know
The Internet is a vast medium for sharing information, from news and current events to entertainment, stories, and pictures of our families and neighbors. However, the Internet is not always a transparent and safe space, especially for younger or more vulnerable populations. Lies, threats, and misinformation often go viral and can cause real damage to individuals, organizations, and businesses. A legal provision known as Section 230 limits the liability of the people who control websites and social media forums, reducing their responsibility for the accuracy of the information presented there, including false or malicious content posted by users. Should Section 230 be reformed? Why does it exist in the first place?
What is Section 230?
Section 230, part of the 1996 Communications Decency Act, shields the owners of internet platforms (such as internet service providers, website owners, and social media websites) from legal liability for content created and posted by users.
Section 230 was enacted to regulate the new communication environment created by the Internet. Before the Internet, it was not easy to put ideas before the public. You had to go through a content publisher (work for a newspaper or TV station or write a letter to the editor) or be a publisher yourself. Today, Internet sites feature content created by users, whether on message boards like Reddit, social media sites like Facebook, YouTube, Twitter, and Instagram, or consumer review sites like Yelp. The rise of these forums raised two questions. First, under what circumstances could or should websites be held responsible for material posted by users? Second, could websites remove user-posted content they found objectionable without taking on the legal responsibilities of a publisher?
Section 230 was designed to address these issues. Under Section 230, platforms may remove “material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable” without incurring liability, so long as they act in “good faith” when moderating content. As interpreted by the courts, examples of bad faith include banning users based on their political beliefs alone (such as a website owned or influenced by a particular political party banning posts published by the opposing party or by posters who support it) or removing posts from competing companies (such as Facebook banning all ads for Twitter). These content-removal protections are used frequently. The plot below shows the number and types of posts that Facebook investigated and removed from its website in the fall of 2022.
How is Section 230 different from the laws that govern print or broadcast media?
Under long-established law, book publishers, newspapers, and TV and radio broadcasters have full control over their content. As a result, the law holds these companies liable for everything they publish. If they publish false, obscene, or defamatory content, they can be fined, taken to court, or, in the case of TV and radio stations, even lose their broadcasting license.
Section 230 creates a different set of rules for internet hosts. Hosts don’t have to check everything posted on their websites – they only have to respond if someone complains. Section 230’s impact goes beyond social media websites. For example, the home rental site Airbnb was protected by Section 230 when hosts improperly listed their short-term rentals in communities where rentals were prohibited by local laws. The court found that Airbnb was not liable for the illegal listings, and the responsibility fell on the hosts for not following local laws.
One exception to Section 230’s provisions is that internet platforms can be held legally liable for posts that facilitate sex trafficking. Platforms can also be held responsible for revenge porn (sexually explicit images posted, often by a former partner, without the subject’s consent), and it is illegal for platforms to post, promote, or publish other nonconsensual sexual images, targeted harassment, and terrorist communications.
What are the arguments against Section 230?
Some opponents of Section 230 believe that it gives platform owners too much power to decide which posts are removed, limiting the free speech of users. Thus, when Twitter, Facebook, and other social media sites suspended then-President Trump’s accounts in early 2021, some conservative commentators argued that he was being removed because platform owners disagreed with his political views and claims about the 2020 election, not because his comments were dangerous or defamatory.
Another consequence of Section 230 is that platforms can take down content without allowing the poster to appeal their decision. Thus, former President Trump had little or no way to appeal the decisions that removed his access to Facebook and Twitter. The same is true when platforms refuse to remove content. People who claim they are victims of online harassment, defamation, and other harmful conduct have few options to appeal a platform’s decision that these posts are allowable. Similarly, if Airbnb removes a rental because they believe the owner has violated the site’s rules, there is no higher authority or forum where the owner can ask for redress.
Another complaint about Section 230 is that it does not punish, fine, or otherwise require corrective action from internet platforms that distribute information later found to be misleading or false. For example, a Facebook post might falsely claim that a new drug can cure cancer. Until Facebook is notified about the falsehood and reviews it, the website is not responsible if people read the post, forgo conventional treatment, and take the drug instead.
How do other nations regulate the Internet?
In some cases, other nations place a higher level of responsibility on the owners of internet platforms to monitor the content posted to their websites and to remove illegal content. The European Union’s E-commerce Directive allows internet intermediaries that are “mere conduits” broad immunity similar to Section 230. However, this provision does not apply to most social media websites, which must promptly take down illegal content once they receive notice. In addition, under the EU regulations, platforms have some responsibility for monitoring information posted to their websites and removing false information or hate speech even if no one complains. India’s Information Technology Act of 2000 requires the removal of illegal content within 36 hours of notice and only provides immunity to intermediaries that do not initiate content, modify content, or determine who sees it.
What are the proposals for reforming Section 230?
Proposals for reforming Section 230 generally seek to narrow the broad immunities that platform owners currently enjoy. For example, one proposal would make platform owners legally liable if their decisions about keeping or removing content deviate from the criteria stated in their written policies. Under this standard, in the case of removing former President Trump from Twitter, the former President could take Twitter to court, where a judge or jury would decide whether Twitter had acted appropriately.
Additionally, some propose bringing US law closer to the EU approach by replacing the “knowledge” standard with a “should have known” standard. This change would hold platforms accountable if they fail to review information for veracity even before someone complains. Thus, in the earlier example of a Facebook post about a fake drug, the new standard would make Facebook liable if people were harmed by the useless drug. The Santa Clara Principles, proposed by a group of academics, civil rights advocates, and tech companies, provide a set of guidelines for content moderation, including greater transparency in the moderation process and an appeals process for people who believe that a platform has acted incorrectly.
There are ongoing court challenges to Section 230. In Force v. Facebook, victims of terrorist attacks argued that Facebook should be held liable for hosting content posted by Hamas, despite Section 230 immunity. Similarly, in Biden v. Knight First Amendment Institute, which asked whether a public official may block users on social media, Justice Clarence Thomas wrote a concurrence questioning the breadth of platforms’ immunity. If courts rule against Section 230 immunity in cases like these, it could significantly change current policy by holding platforms accountable for a wider range of potentially harmful content. This shift could pressure companies to moderate their content more closely, potentially limiting the free speech protections that Section 230 was designed to uphold.
Further reading
- Brannon, V., & Holmes, E. 2021. Section 230: An overview. Congressional Research Service, accessed 3/25/23, available at https://tinyurl.com/2x9pbppy
- Rohac, D. 2021. What is Section 230, and Why Is It So Controversial? The New York Times, accessed 3/25/23, available at https://tinyurl.com/5cju48km
- Romoser, J. 2022. Elon Musk, internet freedom, and how the Supreme Court might force big tech into a catch-22. SCOTUS Blog, accessed 3/10/23, available at https://tinyurl.com/2yrad2j4
Sources
What is Section 230?
- Keller, D. & Ly, O. 2021. The Breakdown: Daphne Keller explains the Communications Decency Act. The Berkman Klein Center for Internet & Society at Harvard University, accessed 3/25/23, available at https://tinyurl.com/yc7jwn3d
- Brannon, V. 2019. Liability for Content Hosts: An Overview of the Communication Decency Act’s Section 230. Congressional Research Service, accessed 3/25/23, available at https://tinyurl.com/2xn6wp7b
- Ardia, D. S. 2010. Free Speech Savior or Shield for Scoundrels: An Empirical Study of Intermediary Immunity under Section 230 of the Communications Decency Act. Loyola of Los Angeles Law Review, 43(2), 373, available at https://tinyurl.com/s8edvxz8
- Jurecic, Q. 2022. The politics of Section 230 reform: Learning from FOSTA’s mistakes. Brookings Institution, accessed 3/25/23, available at https://tinyurl.com/53w6cnhp
- Castro, D., & Johnson, A. 2021. Overview of Section 230: What it Is, why it was created, and what it has achieved. ITIF, accessed 3/25/23, available at https://tinyurl.com/yjns34ay
- Goldman, E. 2006. The Future of Section 230. Communications of the ACM, 49(12), 105-110, available at https://tinyurl.com/35db5e57
- Brannon, V., & Holmes, E. 2021. Section 230: An overview. Congressional Research Service, accessed 3/25/23, available at https://tinyurl.com/2x9pbppy
- Hermes, J. 2017. Section 230 as Gatekeeper: When Is an Intermediary Liability Case Against a Digital Platform Ripe for Early Dismissal? Litigation, 43(3), 34–41, available at https://www.jstor.org/stable/26402058
- Ardia, D. S. 2010. Free Speech Savior or Shield for Scoundrels: An Empirical Study of Intermediary Immunity under Section 230 of the Communications Decency Act. Loyola of Los Angeles Law Review, 43(2), 373, available at https://tinyurl.com/s8edvxz8
- Chart data: Transparency Center. n.d. Community Standards Enforcement Report. Meta, accessed 3/4/23, available at https://transparency.fb.com/
How is Section 230 different from the laws that govern print or broadcast media?
- Digital Media Law Project. 2022. Immunity for Online Publishers Under the Communications Decency Act. Berkman Klein Center for Internet & Society, accessed 3/25/23, available at https://tinyurl.com/3cp26fy6
- Legal Information Institute. n.d. Regulation of the Media: Overview. Cornell Law School, accessed 3/25/23, available at https://tinyurl.com/3wcbk2z7
- Brannon, V. 2019. Liability for Content Hosts: An Overview of the Communication Decency Act’s Section 230. Congressional Research Service, accessed 3/25/23, available at https://tinyurl.com/2xn6wp7b
- Goldman, E. 2018. The complicated story of FOSTA and section 230. First Amend. L. Rev., 17, 279, available at https://tinyurl.com/mrh5mptj
- Jurecic, Q. 2022. The politics of Section 230 reform: Learning from FOSTA’s mistakes. Brookings, accessed 3/25/23, available at https://tinyurl.com/53w6cnhp
What are the arguments against Section 230?
- Romoser, J. 2022. Elon Musk, internet freedom, and how the Supreme Court might force big tech into a catch-22. SCOTUS Blog, accessed 3/10/23, available at https://tinyurl.com/2yrad2j4
- US Department of Justice. 2020. Department of Justice’s review of section 230 of the Communications Decency Act of 1996. US Department of Justice Archives, accessed 2/10/23, available at https://tinyurl.com/4kr567cs
- Ardia, D. S. 2010. Free Speech Savior or Shield for Scoundrels: An Empirical Study of Intermediary Immunity under Section 230 of the Communications Decency Act. Loyola of Los Angeles Law Review, 43(2), 373, available at https://tinyurl.com/s8edvxz8
- Castro, D., & Johnson, A. 2021. Fact-checking the critiques of Section 230: What are the real problems? ITIF. https://tinyurl.com/3ub6ebsk
- Howe, A. 2023. Justices request federal government’s views on Texas and Florida social-media laws. SCOTUS Blog, accessed 3/10/23, available at https://tinyurl.com/yhwhmet2
How do other nations regulate the Internet?
- Johnson, A., & Castro, D. 2021. How Other Countries Have Dealt With Intermediary Liability. ITIF, accessed 3/10/23, available at https://tinyurl.com/by9y2c22
- Gesley, J. 2021. Germany: Network Enforcement Act Amended to Better Fight Online Hate Speech. Library of Congress, accessed 3/25/23, available at https://tinyurl.com/57x45nxa
- Morar, D., & Santos, B. M. 2022. Online content moderation lessons from outside the US. Brookings, accessed 3/26/23, available at https://tinyurl.com/2p8hhe24
What are the proposals for reforming Section 230?
- US Department of Justice. 2020. Department of Justice’s review of section 230 of the Communications Decency Act of 1996. US Department of Justice Archives, accessed 2/10/23, available at https://tinyurl.com/4kr567cs
- Citron, D. K., & Wittes, B. 2018. The Internet Will Not Break: Denying Bad Samaritans Section 230 Immunity. Wake Forest Law Review, 53, 725-784, available at https://tinyurl.com/dwe5yupy
- Santa Clara Principles. n.d. The Santa Clara Principles: On Transparency and Accountability in Content Moderation. Accessed 3/26/23, available at https://santaclaraprinciples.org/
- Wheeler, T. 2022. The Supreme Court takes up Section 230. Brookings, accessed 3/25/23, available at https://tinyurl.com/pkwxxct2
- Romoser, J. 2022. Elon Musk, internet freedom, and how the Supreme Court might force big tech into a catch-22. SCOTUS Blog, accessed 3/10/23, available at https://tinyurl.com/2yrad2j4
Contributors
- This policy brief was researched and drafted in February-March 2023 by Policy vs. Politics Interns Nithin Krishnan, Zul Norin, and Shelby (Rosa Nice) Richardson, with revision and final edits by Research Director Dr. William Bianco