Twenty-six words from a 1996 US law overhauling telecommunications have been called the foundation of the Internet as we know it. Section 230 of the Communications Decency Act provides, in essence, that no "interactive computer service" is to be treated as the publisher or speaker of content created by a third party. In effect, this shields host websites from litigation over illegal content posted by their users: Section 230(c)(1) grants providers and users of an "interactive computer service" immunity from liability for publishing information provided by third-party users.
The statute's secondary protection is the "Good Samaritan" clause. Operators of interactive computer services are shielded from liability when, in good faith, they take down third-party speech or content that is obscene or otherwise objectionable, even when that content is constitutionally protected. This law chiefly protects social media networks, though it also matters for news websites that host public forums such as comment sections.
Section 230 was enacted in reaction to a pair of early-1990s claims against Internet Service Providers (ISPs) that carried contradictory implications as to whether service providers could be regarded as publishers of material generated by their customers. Following passage of the Telecommunications Act, the CDA was challenged in court, and in Reno v. American Civil Liberties Union (1997) the Supreme Court struck down the CDA's anti-indecency provisions as an unconstitutional, content-based restriction of First Amendment free-speech rights, while leaving the protections of Section 230 in effect.
Since then, the constitutionality of Section 230 has been upheld through many court challenges.
Section 230's protections are not unrestricted: providers must still restrict content that is unlawful at the federal level, including copyright infringement. In 2018, Section 230 was amended by the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA-SESTA) to mandate the removal of content that violates federal and state sex-trafficking legislation. In the years since, Section 230's guarantees have come under much more intense scrutiny over hate speech and alleged partisan bias, and over the hold that technology firms have on political discourse; the statute became a central topic in the 2020 presidential election.
At a time when Internet usage was growing across both the scope of services and the spectrum of users in the United States, Section 230 came to be known as the primary legislation that permitted the Internet to thrive.
In assessing the protection provided by Section 230, courts typically apply a three-pronged test. To gain protection, the defendant must satisfy each prong: [A.] the defendant must be a provider or user of an "interactive computer service"; [B.] the plaintiff's cause of action must treat the defendant as the "publisher or speaker" of the harmful information at issue; and [C.] the information must be "provided by another information content provider," meaning the defendant cannot itself be the "information content provider" of the content in dispute.
The immunity of Section 230 is not absolute. Principally, the statute does not bar federal criminal liability, communications-privacy claims, or copyright infringement charges. Under § 230(e)(3), state laws consistent with the section remain enforceable, while inconsistent state laws, including some state criminal laws, have been held preempted where immunity applies.
Courts have issued contradictory rulings on the scope of the intellectual property exclusion. Since the enactment of the Digital Millennium Copyright Act in 1998, however, service providers must comply with certain copyright provisions to retain safe-harbor immunity from liability under Title II of the DMCA, the Online Copyright Infringement Liability Limitation Act.
[2.0] IMPACT OF CDA 230
Because online intermediaries that host or republish speech are protected against litigation, a wide range of Internet services can flourish, from Internet Service Providers to any other online service carrying third-party content. Despite exceptions for certain third-party content involving sex trafficking, copyright infringement, or federal crimes, CDA 230 essentially provides a safety net that allows free speech to flourish.
YouTube, in particular, has benefitted from CDA 230, since users can upload their own videos. In a more hands-on example, Craigslist can host classified ads and Amazon can host user reviews from individuals. Facebook and Twitter likewise benefit directly, because CDA 230 allows the platforms to keep hosting content without being responsible for every individual infringement. With Internet use so prevalent, it is impossible to prevent objectionable content altogether, given the sheer magnitude of content generated. CDA 230 gives intermediaries some respite from having to actively censor a wide ambit of that content.
CDA 230 is also important to bloggers: it allows content on blogs to remain even when it is readers' work or that of guest bloggers. This protection holds even if the blogger is aware of the content or makes editorial judgments about it.
CDA 230's impact is highly distinctive, the United States being one of the very few countries with such a statute. The United States thus offers strong, explicit protection for Internet providers, differing from countries like the United Kingdom, France, or Germany. The European Union, however, has established a safe-harbor regime for intermediaries under the e-Commerce Directive, under which hosting providers are not liable for hosted content provided [A.] the acts in question are neutral intermediary acts of a merely technical, automatic, and passive nature; [B.] they have no knowledge of the content's illegal character; and [C.] they act promptly to remove or disable access to the material once informed of it. The Directive also bars general obligations of constant surveillance of hosted content to prevent illegal activity. As amended, it further strengthens provider liability for failure to adopt "effective and proportionate" measures against copyright violations and to promptly take down the objectionable content.
[2.1] IMPACT OF ABROGATING OR RESTRICTING CDA 230
Without CDA 230, platforms would have to be far more cautious about the content they host. The contrary is also possible, however: platforms could avoid moderation altogether. Especially under provisions that narrow the scope of the "Good Samaritan" clause, this could significantly hamper efforts to curb hate speech and terrorist content, which platforms like Twitter already find difficult to control.
A major point of conflict with CDA 230 has been the different constitutional standard it establishes for online versus offline speech, for which there has been no sound factual justification. This gap may narrow, however, since First Amendment interpretations have shown an inclination to create different rules for different media forms, especially because Internet intermediary liability is significantly less effective than offline liability.
[3.0] POLITICAL ISSUES WITH CDA 230: CHANGES ENACTED
In August 2019, a draft presidential executive order would have allowed the Federal Communications Commission to limit CDA 230, though it was tabled until May 2020. Essentially, the order provided a way around the protections afforded to platforms: complaints of bias could be filed with the Federal Trade Commission, and the FCC would adjudicate whether a platform could retain the "good faith" exception that comprises the Good Samaritan protection. The order also changed how the statute's text was to be applied, directing agencies to follow the legislative interpretation the order itself provided rather than precedent or Congress.
On May 28, 2020, President Trump signed the "Executive Order on Preventing Online Censorship" (EO 13925), directing regulatory action at Section 230. The EO sought to narrow the safe-harbor protection of CDA 230(c)(1) by denying it to companies that edited content beyond the violent, obscene, or harassing categories, essentially tapering the good-faith clause. Judicial precedent on good-faith evaluations would be considered secondary to the EO, leaving the platforms liable for hosted content.
Meanwhile, the February 2020 Department of Justice workshop review, following antitrust probes into "big tech," prompted an evaluation of ways CDA 230 could be amended, on the premise that the statute's protection was less necessary in the current state of affairs, since the companies were no longer underdogs and could hold their own.
This discussion centered largely on cases in which host platforms allowed sexually exploitative content such as revenge pornography, harassment, and child sexual abuse; it did not address issues outside the realm of antitrust or the above specifications, and intermediary uses of the Internet were not included.
The Department of Justice's final recommendations, addressed to Congress, came in June 2020 and were: [A.] denying immunity to platforms and incentivizing them to deal with illicit content, including a carve-out for "Bad Samaritans" that solicit illicit activity, with exceptions in the areas of child abuse, terrorism, and cyber-stalking, as well as when platforms have been notified by courts of illicit material; [B.] stripping protection from civil lawsuits brought by the federal government; [C.] disallowing Section 230 protection in antitrust actions against the large Internet platforms; and [D.] promoting discourse and transparency by clarifying murky statutory language like "otherwise objectionable" and "good faith" and requiring platforms to publicly document moderation actions against content unless doing so would interfere with law enforcement or risk harm to an individual.
Proposals to reform the statute take two forms: [A.] a carve-out approach that strips protection for certain categories of content, as in FOSTA-SESTA, or [B.] a system that holds providers to higher standards to increase liability, as seen in statutes like the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act. The latter system would also likely weaken the encryption of private messaging, especially considering the United States' history with key-disclosure laws and data-privacy issues.
Consequently, while the Republican effort to modify CDA 230 has centered on the "bias" these platforms allegedly adopt, Democratic candidates have also called for the revocation of CDA 230 following the rise of hate speech, terrorism, and harassment, most notably in President-elect Joe Biden's proposal to revoke it. Sen. Bernie Sanders' policy echoes this: he believed that online platforms cannot evade responsibility under a defense of ignorance once they are alerted to and aware of content on their platforms that promotes violence.
[4.0] THE RELATIONSHIP BETWEEN CDA 230 AND THE FIRST AMENDMENT
The First Amendment matters because it prevents the government from restricting most speech. Ideally, one might expect the same constraint to apply to tech companies that moderate speech; however, private companies are allowed to create rules restricting speech within their own spheres.
CDA 230 provides more immunity than First Amendment defenses do, insofar as courts have routinely interpreted it to bar all claims against Internet services based on third-party content that are not expressly excluded by the statute. Moreover, since the First Amendment primarily constrains the government, CDA 230 is significant for the protection it provides against claims by private parties.
Commercial speech is also protected by CDA 230. Under the First Amendment, restrictions on most speech must survive strict scrutiny, an exacting standard of judicial review, while restrictions on commercial speech are often reviewed under the less demanding intermediate scrutiny, creating an inherent stratification in the protections offered. CDA 230 does not follow this model and treats most content in question equally.
The First Amendment also needs CDA 230, since in its absence Internet intermediaries would limit a significant amount of constitutionally protected speech. Even with the "actual malice" requirement, the threshold for liability would likely be high enough to cause collateral censorship: a form of self-censorship in which intermediaries limit other individuals' speech out of fear of liability, not merely restricting their own. Hosts would automatically censor most content that arises, against the spirit of the First Amendment.
1. 47 U.S.C. § 230: Protection for private blocking and screening of offensive material.
2. Section 230 says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230).
3. 47 U.S.C. § 230 – Protection for private blocking and screening of offensive material: (c) Protection for "Good Samaritan" blocking and screening of offensive material.
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
4. Reno v. American Civil Liberties Union, 521 U.S. 844 (1997).
5. Stop Enabling Sex Traffickers Act (SESTA) and Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), Pub. L. 115-164, 132 Stat. 1253 (Apr. 11, 2018).
6. 47 U.S.C. § 230 – Protection for private blocking and screening of offensive material: (e) Effect on other laws.
(1) No effect on criminal law
Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
(2) No effect on intellectual property law
Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.
(3) State law
Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.
(4) No effect on communications privacy law
Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.
(5) No effect on sex trafficking law
Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—
(A) any claim in a civil action brought under section 1595 of title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title;
(B) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of title 18; or
(C) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of title 18 and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant's promotion or facilitation of prostitution was targeted.
7. Backpage.com, LLC v. McKenna
8. Digital Millennium Copyright Act, Pub. L. 105-304.
9. Directive 2000/31/EC (e-Commerce Directive).
10. Thomas W. Hazlett et al., The Overly Active Corpse of Red Lion, 9 Nw. J. Tech. & Intell. Prop. 51, 62 (2010).
11. "Department of Justice's Review of Section 230 of the Communications Decency Act of 1996," United States Department of Justice, June 17, 2020.
12. Attorney General William Barr said that while Section 230 was needed to protect the Internet's growth while most companies were not stable, "No longer are technology companies the underdog upstarts…They have become titans of U.S. industry," and questioned the need for Section 230's broad protection.
13. “The idea that it’s a tech company is that Section 230 should be revoked, immediately should be revoked, number one. For Zuckerberg and other platforms,” Biden said. “It should be revoked because it is not merely an internet company. It is propagating falsehoods they know to be false.”
14. New York Times Co. v. Sullivan, 376 U.S. 254, 279 (1964)
15. J.M. Balkin, Essay, Free Speech and Hostile Environments, 99 Colum. L. Rev. 2295, 2296 (1999).