Section 230 of the Communications Decency Act


Section 230 of the Communications Decency Act of 1996 is a piece of United States Internet legislation, codified at 47 U.S.C. § 230. At its core, §230 provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

The statute, in §230(c)(2), further provides "Good Samaritan" protection from civil liability for operators of interactive computer services who remove or moderate third-party material they deem obscene or offensive, even constitutionally protected speech, as long as they act in good faith.
Section 230 was developed in response to a pair of lawsuits against Internet service providers in the early 1990s that reached differing interpretations of whether service providers should be treated as publishers or distributors of content created by their users. The tech industry and other experts also argued that language in the proposed CDA, which made providers responsible for indecent content posted by users, could extend to other types of questionable speech. After passage of the Telecommunications Act, the CDA was challenged in the courts, and the Supreme Court ruled in Reno v. American Civil Liberties Union that it was partially unconstitutional, leaving the Section 230 provisions in place. Since then, several legal challenges have upheld the constitutionality of Section 230.
Section 230's protections are not limitless: providers must still remove material that is illegal at the federal level, such as content infringing copyright. More recently, Section 230 was amended by the Stop Enabling Sex Traffickers Act in 2018 to require the removal of material violating federal and state sex trafficking laws. Section 230's protections have also come under recent scrutiny over hate speech and alleged ideological bias, in relation to the power technology companies hold over political discourse.
Passed at a time when Internet use was just starting to expand in both breadth of services and range of consumers in the United States, Section 230 has frequently been cited as a key law that allowed the Internet to flourish, and has been called "The Twenty-Six Words That Created the Internet".

Application and limits

Section 230, as passed, has two primary parts, both listed under §230(c) as the "Good Samaritan" portion of the law. Section 230(c)(1), quoted above, provides that an interactive computer service shall not be treated as the "publisher or speaker" of information from another provider. Section 230(c)(2) provides immunity from civil liability for information service providers that remove or restrict content from their services which they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected", as long as they act "in good faith" in doing so.
In analyzing the availability of the immunity offered by Section 230, courts generally apply a three-prong test. A defendant must satisfy each of the three prongs to gain the benefit of the immunity:
  1. The defendant must be a "provider or user" of an "interactive computer service."
  2. The cause of action asserted by the plaintiff must treat the defendant as the "publisher or speaker" of the harmful information at issue.
  3. The information must be "provided by another information content provider," i.e., the defendant must not be the "information content provider" of the harmful information at issue.
Section 230 immunity is not unlimited. The statute specifically excepts federal criminal liability (§230(e)(1)), electronic privacy violations (§230(e)(4)), and intellectual property claims (§230(e)(2)). There is also no immunity from state laws that are consistent with §230 (§230(e)(3)), though state criminal laws have been held preempted in cases such as Backpage.com, LLC v. McKenna and Voicenet Commc'ns, Inc. v. Corbett.
As of mid-2016, courts have issued conflicting decisions regarding the scope of the intellectual property exclusion set forth in §230. For example, in Perfect 10, Inc. v. CCBill, LLC, the 9th Circuit Court of Appeals ruled that the exception for intellectual property law applies only to federal intellectual property claims such as copyright infringement, trademark infringement, and patents, reversing a district court ruling that the exception applies to state-law right of publicity claims. The 9th Circuit's decision in Perfect 10 conflicts with conclusions from other courts including Doe v. Friendfinder. The Friendfinder court specifically discussed and rejected the lower court's reading of "intellectual property law" in CCBill and held that the immunity does not reach state right of publicity claims.
Additionally, with the passage of the Digital Millennium Copyright Act in 1998, service providers must comply with additional requirements for copyright infringement to maintain safe harbor protections from liability, as defined in the DMCA's Title II, Online Copyright Infringement Liability Limitation Act.

Background and passage

Prior to the Internet, case law drew a clear liability line between publishers and distributors of content: publishers were expected to be aware of the material they published and thus could be held liable for any illegal content, while distributors would likely not be aware and thus were immune. This was established in Smith v. California (1959), where the Supreme Court ruled that placing liability on the distributor would have "a collateral effect of inhibiting the freedom of expression, by making the individual the more reluctant to exercise it."
In the early 1990s, the Internet became more widely adopted and created means for users to engage in forums and other user-generated content. While this helped expand the use of the Internet, it also resulted in a number of legal cases seeking to hold service providers at fault for content generated by their users. This concern was raised in legal challenges against CompuServe and Prodigy, two early service providers. CompuServe had stated it would not attempt to regulate what users posted on its services, while Prodigy employed a team of moderators to validate content. Both faced legal challenges related to content posted by their users. In Cubby, Inc. v. CompuServe Inc., CompuServe was found not to be at fault: by allowing all content to go unmoderated, it acted as a distributor and thus was not liable for libelous content posted by users. However, Stratton Oakmont, Inc. v. Prodigy Services Co. found that because Prodigy had taken an editorial role with regard to customer content, it was a publisher and legally responsible for libel committed by its customers.
Service providers made their Congresspersons aware of these cases, believing that these rulings, if upheld across the nation, would stifle the growth of the Internet. United States Representative Christopher Cox had read an article about the two cases and felt the decisions were backwards. "It struck me that if that rule was going to take hold then the internet would become the Wild West and nobody would have any incentive to keep the internet civil," Cox stated.
At the time, Congress was preparing the Communications Decency Act (CDA), part of the omnibus Telecommunications Act of 1996, which was designed to make knowingly sending indecent or obscene material to minors a criminal offense. A version of the CDA, pushed by Senator J. James Exon, had passed the Senate, and a grassroots effort in the tech industry reacted by trying to convince the House of Representatives to challenge Exon's bill. Based on the Stratton Oakmont decision, Congress recognized that requiring service providers to block indecent content would cause them to be treated as publishers in the context of the First Amendment and thus become liable for other illegal content, such as libel, not addressed by the existing CDA. Cox and fellow Representative Ron Wyden wrote the House bill's section 509, titled the Internet Freedom and Family Empowerment Act, designed to override the Stratton Oakmont decision so that service providers could moderate content as necessary and did not have to act as wholly neutral conduits. The section was added to the CDA while the bill was in conference within the House.
The overall Telecommunications Act, with both Exon's CDA and Cox/Wyden's provision, passed both houses by near-unanimous votes and was signed into law by President Bill Clinton in February 1996. Cox/Wyden's section was codified as Section 230 of Title 47 of the United States Code. The anti-indecency portions of the CDA were immediately challenged upon passage, resulting in the 1997 Supreme Court case Reno v. American Civil Liberties Union, which ruled all of the anti-indecency sections of the CDA unconstitutional but left Section 230 as law.

Impact

The passage and subsequent legal history supporting the constitutionality of Section 230 have been considered essential to the growth of the Internet through the early part of the 21st century. Coupled with the Digital Millennium Copyright Act of 1998, Section 230 provides Internet service providers safe harbors to operate as intermediaries of content without fear of being liable for that content, as long as they take reasonable steps to delete or prevent access to illegal content. These protections allowed experimental and novel applications on the Internet without fear of legal ramifications, creating the foundations of modern Internet services such as advanced search engines, social media, video streaming, and cloud computing. NERA Economic Consulting estimated in 2017 that Section 230 and the DMCA combined contributed about 425,000 jobs to the U.S. and represented a substantial amount of annual revenue.

Subsequent history

Early challenges - Zeran v. AOL (1997–2008)

The first major challenge to Section 230 itself was Zeran v. AOL, a 1997 case decided at the Fourth Circuit. The case involved a person who sued America Online (AOL) for failing to remove, in a timely manner, libelous ads posted by AOL users that inappropriately connected his home phone number to the Oklahoma City bombing. The court found for AOL and upheld the constitutionality of Section 230, stating that Section 230 "creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service." The court asserted in its ruling that Congress's rationale for Section 230 was to give Internet service providers broad immunity "to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children's access to objectionable or inappropriate online material." In addition, Zeran notes that "the amount of information communicated via interactive computer services is... staggering. The specter of tort liability in an area of such prolific speech would have an obviously chilling effect. It would be impossible for service providers to screen each of their millions of postings for possible problems. Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted. Congress considered the weight of the speech interests implicated and chose to immunize service providers to avoid any such restrictive effect."
This ruling, cementing Section 230's liability protections, has been considered one of the most important court decisions affecting the growth of the Internet, allowing websites to incorporate user-generated content without fear of prosecution. At the same time, it has led to Section 230 being used as a shield by some website owners, as courts have ruled that Section 230 provides complete immunity for ISPs with regard to torts committed by their users over their systems. Through the next decade, most cases involving Section 230 challenges fell in favor of service providers, upholding their immunity from liability for third-party content on their sites.

Erosion of Section 230 immunity - Roommates.com (2008–2016)

While Section 230 seemed to give near-complete immunity to service providers in its first decade, new case law around 2008 started to identify circumstances in which providers could be held liable for user content because they acted as a "publisher or speaker" of that content under §230. One of the first such cases was Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008). The case centered on Roommates.com's service matching renters based on profiles they created on its website; each profile was generated from a mandatory questionnaire that included information about the user's gender and race and preferred roommates' race. The Fair Housing Council of San Fernando Valley argued this facilitated discrimination in violation of the Fair Housing Act and asserted that Roommates.com was liable. In 2008, the Ninth Circuit, in an en banc decision, ruled against Roommates.com, agreeing that its required profile system made it an information content provider and thus ineligible for the protections of §230.
The decision from Roommates.com was considered to be the most significant deviation from Zeran in how Section 230 was handled in case law. Eric Goldman of the Santa Clara University School of Law wrote that while the Ninth Circuit's decision in Roommates.com was tailored to apply to a limited number of websites, he was "fairly confident that lots of duck-biting plaintiffs will try to capitalize on this opinion and they will find some judges who ignore the philosophical statements and instead turn a decision on the opinion's myriad of ambiguities". Over the next several years, a number of cases cited the Ninth Circuit's decision in Roommates.com to limit some of the Section 230 immunity offered to websites. Law professor Jeff Kosseff of the United States Naval Academy reviewed 27 cases involving Section 230 immunity concerns in the 2015–2016 year and found that more than half of them had denied the service provider immunity, in contrast to a similar study he had performed for 2001–2002, where a majority of cases granted the website immunity; Kosseff asserted that the Roommates.com decision was the key factor in this change.

Sex trafficking - Backpage.com and FOSTA-SESTA (2012–2017)

Around 2001, a University of Pennsylvania paper warned that "online sexual victimization of American children appears to have reached epidemic proportions" due to the allowances granted by Section 230. Over the next decade, advocates against such exploitation, such as the National Center for Missing and Exploited Children and Cook County Sheriff Tom Dart, pressured major websites to block or remove content related to sex trafficking, leading sites like Facebook, MySpace, and Craigslist to pull such content. Because mainstream sites were blocking this content, those who engaged in or profited from trafficking moved to more obscure sites, leading to the creation of sites like Backpage. In addition to moving this activity out of the public eye, these new sites worked to obscure what trafficking was occurring and who was behind it, limiting the ability of law enforcement to take action. Backpage and similar sites quickly came under numerous lawsuits from victims of sex traffickers and exploiters for enabling these crimes, but courts continually found in favor of Backpage due to Section 230. Attempts to block Backpage from using credit card services, so as to deny it revenue, were also defeated in the courts, with a January 2017 ruling allowing Backpage's practices to stand under Section 230.
Due to numerous complaints from constituents, Congress began an investigation into Backpage and similar sites in January 2017, finding Backpage complicit in aiding and profiting from illegal sex trafficking. Subsequently, Congress introduced the FOSTA-SESTA bills: the Allow States and Victims to Fight Online Sex Trafficking Act, introduced in the House of Representatives by Ann Wagner in April 2017, and the Stop Enabling Sex Traffickers Act, introduced in the U.S. Senate by Rob Portman in August 2017. Combined, the FOSTA-SESTA bills modified Section 230 to remove immunity for services that knowingly facilitate or support sex trafficking, exposing them to related civil and criminal liability. The combined bill passed both houses and was signed into law by President Donald Trump on April 11, 2018.
The bills were criticized by pro-free-speech and pro-Internet groups as a "disguised internet censorship bill". Critics argued that they weaken Section 230 immunity, place unnecessary burdens on Internet companies and intermediaries that handle user-generated content or communications, require service providers to proactively take action against sex trafficking activities, and demand a "team of lawyers" to evaluate all possible scenarios under state and federal law. Critics also argued that FOSTA-SESTA did not distinguish consensual, legal sex work from non-consensual trafficking, so that websites otherwise engaged in legal offerings would be threatened with liability. Online sex workers argued that the bill would harm their safety, as the platforms they used for offering and discussing sexual services in a legal manner began to reduce their services or shut down entirely due to the threat of liability under the bill.

Debate on Section 230's protections for social media companies (2016–present)

Many social media sites, notably the Big Tech companies of Facebook, Google, and Apple, as well as Twitter, came under scrutiny as a result of alleged Russian interference in the 2016 United States elections, in which Russian agents allegedly used the sites to spread propaganda and fake news to swing the election in favor of Donald Trump. These platforms were also criticized for not taking action against users who used them for harassment and hate speech against others. Shortly after passage of the FOSTA-SESTA acts, some in Congress recognized that additional changes could be made to Section 230 to require service providers to deal with these bad actors, beyond what Section 230 already provided.

Platform neutrality

Some politicians, including Republican senators Ted Cruz and Josh Hawley, have accused major social networks of displaying a bias against conservative perspectives when moderating content. In a Fox News op-ed, Cruz argued that section 230 should only apply to providers that are politically "neutral", suggesting that a provider "should be considered to be a 'publisher or speaker' of user content if they pick and choose what gets published or spoke." Section 230 does not contain any requirements that moderation decisions be neutral. Hawley alleged that section 230 immunity was a "sweetheart deal between big tech and big government".
In December 2018, Republican House representative Louie Gohmert introduced the Biased Algorithm Deterrence Act, which would remove all Section 230 protections for any provider that used filters or any other type of algorithm to display user content when not otherwise directed by a user.
In June 2019, Hawley introduced the Ending Support for Internet Censorship Act, that would remove section 230 protections from companies whose services have more than 30 million active monthly users in the U.S. and more than 300 million worldwide, or have over $500 million in annual global revenue, unless they receive a certification from the majority of the Federal Trade Commission that they do not moderate against any political viewpoint, and have not done so in the past 2 years.
There has been both criticism and support of the proposed bill from various points on the political spectrum. A poll of more than 1,000 voters gave Senator Hawley's bill a net favorability rating of 29 points among Republicans and 26 points among Democrats. Some Republicans feared that by adding FTC oversight, the bill would fuel fears of a big government with excessive oversight powers. Democratic Speaker Nancy Pelosi has indicated support for the approach Hawley has taken, as has Senate Judiciary Committee chairman Lindsey Graham, who said he is considering legislation that would require companies to uphold "best business practices" to maintain their liability shield, subject to periodic review by federal regulators.
Legal experts have criticized the Republican push to make Section 230 encompass platform neutrality. Wyden stated in response to potential law changes that "Section 230 is not about neutrality. Period. Full stop. 230 is all about letting private companies make their own decisions to leave up some content and take other content down." Kosseff has stated that the Republican intentions are based on a "fundamental misunderstanding" of Section 230's purpose, as platform neutrality was not among the considerations at the time of passage; according to its framers, the intent was to ensure that providers could make content-removal judgments without fear of liability. There have been concerns that any attempt to weaken Section 230 could actually increase censorship if services lose their liability protections.
Attempts to seek damages from tech companies in court for apparent anti-conservative bias, arguing against Section 230 protections, have generally failed. A lawsuit brought by the non-profit Freedom Watch in 2018 against Google, Facebook, Twitter, and Apple, alleging antitrust violations for using their positions to effect anti-conservative censorship, was dismissed by the D.C. Circuit Court of Appeals in May 2020, with the judges ruling that the First Amendment's prohibition on censorship applies only to the government, not to private entities.

Hate speech

In the wake of the 2019 shootings in Christchurch, New Zealand; El Paso, Texas; and Dayton, Ohio, questions have been raised about Section 230 and liability for online hate speech. In both the Christchurch and El Paso shootings, the perpetrator posted a hate speech manifesto to 8chan, a loosely moderated imageboard known to be favorable to the posting of extreme views. Concerned politicians and citizens called on large tech companies to remove hate speech from the Internet; however, hate speech is generally protected speech under the First Amendment, and Section 230 removes any liability for tech companies that decline to moderate such content as long as it is not illegal. This has given the appearance that tech companies need not be proactive against hateful content, allowing it to proliferate online and contribute to such incidents.
Notable articles on these concerns were published after the El Paso shooting by The New York Times, The Wall Street Journal, and Bloomberg Businessweek, among other outlets, but they were criticized by legal experts including Mike Godwin, Mark Lemley, and David Kaye for implying that hate speech is protected by Section 230 when it is in fact protected by the First Amendment. The New York Times issued a correction affirming that the First Amendment, not Section 230, protects hate speech.
Members of Congress have indicated they may pass a law changing how Section 230 applies to hate speech, so as to make tech companies liable for it. Wyden, now a Senator, stated that he intended Section 230 to be both "a sword and a shield" for Internet companies: the "sword" allowing them to remove content they deem inappropriate for their service, and the "shield" protecting them from liability for offensive content posted by users. However, Wyden argued that because tech companies have not been willing to use the sword to remove content, it may be necessary to take away the shield. Some have compared Section 230 to the Protection of Lawful Commerce in Arms Act, a law that grants gun manufacturers immunity from certain types of lawsuits when their weapons are used in criminal acts. According to law professor Mary Anne Franks, "They have not only let a lot of bad stuff happen on their platforms, but they’ve actually decided to profit off of people's bad behavior."
Representative Beto O’Rourke stated that his 2020 presidential campaign would seek sweeping changes to Section 230 to make Internet companies liable for not proactively taking down hate speech; O'Rourke later dropped out of the race. Fellow candidate and former vice president Joe Biden has similarly called for Section 230 protections to be weakened or otherwise "revoked" for "big tech" companies, particularly Facebook, stating in a January 2020 interview with The New York Times that "[Facebook] is not merely an internet company. It is propagating falsehoods they know to be false", and that the U.S. needed to "[set] standards" in the same way that the European Union's General Data Protection Regulation set standards for online privacy.

Terrorism-related content

In the aftermath of the Backpage trial and subsequent passage of FOSTA-SESTA, others have found that Section 230 appears to protect tech companies from liability for content that is otherwise illegal under United States law. Professor Danielle Citron and journalist Benjamin Wittes found that, as late as 2018, several groups designated as terrorist organizations by the United States had been able to maintain social media accounts on services run by American companies, despite federal laws that make providing material support to terrorist groups subject to civil and criminal charges. However, case law from the Second Circuit has held that under Section 230, technology companies are generally not liable for civil claims based on terrorism-related content.

2020 Department of Justice review

In February 2020, the United States Department of Justice held a workshop related to Section 230 as part of an ongoing antitrust probe into "big tech" companies. Attorney General William Barr said that while Section 230 was needed to protect the Internet's growth while most companies were not yet stable, "No longer are technology companies the underdog upstarts...They have become titans of U.S. industry", and questioned the need for Section 230's broad protections. Barr said the workshop was not meant to make policy decisions on Section 230, but was part of a "holistic review" of Big Tech, since "not all of the concerns raised about online platforms squarely fall within antitrust", and that the Department of Justice would prefer to see reform and better incentives for tech companies to improve online content within the scope of Section 230 rather than change the law directly. Observers of the sessions said the talks focused only on Big Tech and on small sites engaged in revenge porn, harassment, and child sexual abuse, without considering much of the Internet's intermediate uses.
The DOJ issued their four major recommendations to Congress in June 2020 to modify Section 230. These include:
  1. Incentivizing platforms to deal with illicit content, including removing immunity from "Bad Samaritans" that solicit illicit activity, and carving out exemptions in the areas of child abuse, terrorism, and cyber-stalking, as well as when platforms have been notified by courts of illicit material;
  2. Removing protections from civil lawsuits brought by the federal government;
  3. Disallowing Section 230 protections in relationship to antitrust actions on the large Internet platforms; and
  4. Promoting discourse and transparency by defining existing terms in the statute like "otherwise objectionable" and "good faith" with specific language, and requiring platforms to publicly document when they take moderation actions against content unless that may interfere with law enforcement or risk harm to an individual.

Legislation to alter Section 230

In 2020, several bills were introduced in Congress to limit the liability protections that Internet platforms had from Section 230, as a result of events in the preceding years. These included:
  1. EARN IT Act of 2020
  2. Limiting Section 230 Immunity to Good Samaritans Act
  3. Platform Accountability and Consumer Transparency Act
  4. Behavioral Advertising Decisions Are Downgrading Services Act

Executive Order on Preventing Online Censorship

United States President Donald Trump has been a major proponent of limiting the protections of technology and media companies under Section 230 due to claims of an anti-conservative bias. In July 2019, Trump held a "Social Media Summit" that he used to criticize how Twitter, Facebook, and Google handled conservative voices on their platforms. During the summit, Trump warned that he would seek "all regulatory and legislative solutions to protect free speech".
In late May 2020, President Trump asserted in both public speeches and on his social media accounts that mail-in voting would lead to massive fraud, pushing back against the use of mail-in voting, prompted by the COVID-19 pandemic, for the upcoming 2020 primary elections. In a Twitter message on May 26, 2020, he stated, "There is NO WAY that Mail-In Ballots will be anything less than substantially fraudulent." Shortly after its posting, Twitter moderators marked the message with a "potentially misleading" warning linking readers to a special page on the site providing analysis and fact-checks of Trump's statement from media sources like CNN and The Washington Post, the first time Twitter had applied this process to Trump's messages. Jack Dorsey, Twitter's CEO, defended the moderation, stating that Twitter was not acting as an "arbiter of truth" but rather, "Our intention is to connect the dots of conflicting statements and show the information in dispute so people can judge for themselves." Trump was angered by this and shortly afterwards threatened to "strongly regulate" technology companies, asserting that these companies were suppressing conservative voices.
On May 28, 2020, Trump signed the "Executive Order on Preventing Online Censorship", an executive order directing regulatory action at Section 230. In a press conference before signing, Trump stated his rationale: "A small handful of social media monopolies controls a vast portion of all public and private communications in the United States. They've had unchecked power to censor, restrict, edit, shape, hide, alter, virtually any form of communication between private citizens and large public audiences." The EO asserts that media companies that edit content, apart from restricting posts that are violent, obscene, or harassing as outlined in the "Good Samaritan" clause §230(c)(2), are "engaged in editorial conduct" and may forfeit the safe-harbor protections granted in §230. The EO thus specifically targets the "Good Samaritan" clause covering media companies' decisions to remove offensive material "in good faith". Courts have interpreted the "in good faith" portion of the statute based on its plain language; the EO purports to establish conditions under which that good faith may be deemed absent, such as when media companies have shown bias in how they remove material from their platforms. The goal of the EO is to remove Section 230 protections from such platforms, leaving them liable for content. Whether a media platform has shown bias would be determined through a rulemaking process set by the Federal Communications Commission in consultation with the Commerce Department, the National Telecommunications and Information Administration, and the United States Attorney General, while the Justice Department and state attorneys general would handle disputes related to bias, gathering them into reports to the Federal Trade Commission, which would determine whether a federal lawsuit should be filed. Additional provisions would prevent government agencies from advertising on media platforms demonstrated to have such bias.
The EO came under intense criticism and legal analysis after its announcement. Senator Wyden called the EO a "mugging of the First Amendment", adding that while a thoughtful debate about modern considerations for Section 230 is needed, the political spat between Trump and Twitter should not drive it. Professor Kate Klonick of St. John's University School of Law in New York described the EO as "political theater" without any weight of authority. The Electronic Frontier Foundation's Aaron Mackey stated that the EO begins by misconstruing §230(c)(1) and §230(c)(2) as linked, when they were not written to be linked and have been treated by case law as independent provisions of the statute, and that it therefore "has no legal merit".
By happenstance, the EO was signed on the same day that violent riots erupted in Minneapolis, Minnesota, in the wake of the death of George Floyd, an African-American man who died during an arrest involving four officers of the Minneapolis police department. Trump tweeted about his conversation with Minnesota governor Tim Walz on bringing in the National Guard to stop the riots, but concluded with the statement, "Any difficulty and we will assume control but, when the looting starts, the shooting starts.", the latter a phrase attributed to Miami police chief Walter E. Headley, who used it in 1967 in reference to dealing with violent riots. Twitter, after internal review, marked the message with a "public interest notice" stating that the tweet "glorified violence", which would normally lead to its removal for violating the site's terms, but told journalists that they "have kept the Tweet on Twitter because it is important that the public still be able to see the Tweet given its relevance to ongoing matters of public importance." Following Twitter's marking of this tweet, Trump said in another tweet that, due to Twitter's actions, "Section 230 should be revoked by Congress. Until then, it will be regulated!"
On June 2, 2020, the Center for Democracy & Technology filed a lawsuit in the United States District Court for the District of Columbia seeking preliminary and permanent injunctions against enforcement of the EO, asserting that the EO created a chilling effect on free speech by putting all hosts of third-party content "on notice that content moderation decisions with which the government disagrees could produce penalties and retributive actions, including stripping them of Section 230's protections".
The Secretary of Commerce, via the NTIA, sent a petition with a proposed rule to the FCC on July 27, 2020, as the first step in carrying out the EO.

Case law

Numerous cases involving Section 230 have been heard in the judicial system since its introduction, many of which are rote applications of Section 230.
The following is a partial list of cases that have established case law influencing the interpretation of Section 230 in subsequent cases or have led to new legislation around Section 230.

Defamatory information

; Zeran v. AOL, 129 F.3d 327.
; Blumenthal v. Drudge, 992 F. Supp. 44, 49-53.
; Carafano v. Metrosplash.com, 339 F.3d 1119.
; Batzel v. Smith, 333 F.3d 1018.
; Green v. AOL, 318 F.3d 465.
; Barrett v. Rosenthal, 40 Cal. 4th 33.
; MCW, Inc. v. badbusinessbureau.com 2004 WL 833595, No. Civ.A.3:02-CV-2727-G.
; Hy Cite Corp. v. badbusinessbureau.com, 418 F. Supp. 2d 1142.
; Barnes v. Yahoo!, Inc. 570 F.3d

False information

; Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 830.
; Ben Ezra, Weinstein & Co. v. America Online, 206 F.3d 980, 984-985, cert. denied, 531 U.S. 824.
; Goddard v. Google, Inc., C 08-2738 JF, 2008 WL 5245490, 2008 U.S. Dist. LEXIS 101890.
; Milgram v. Orbitz Worldwide, LLC, ESX-C-142-09.
; Herrick v. Grindr, 765 F. App'x 586.

Sexually explicit content and minors

; Doe v. America Online, 783 So. 2d 1010, 1013-1017, cert. denied, 122 S.Ct. 208.
; Kathleen R. v. City of Livermore, 87 Cal. App. 4th 684, 692.
; Doe v. MySpace, 528 F.3d 413.
; Dart v. Craigslist, Inc., 665 F. Supp. 2d 961.
; Backpage.com v. McKenna, et al., CASE NO. C12-954-RSM
; Backpage.com LLC v Cooper, Case #: 12-cv-00654
; Backpage.com LLC v Hoffman et al., Civil Action No. 13-cv-03952
; Backpage.com v. Dart., CASE NO. 15-3047

Discriminatory housing ads

; Chicago Lawyers' Committee For Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666.
; Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157.

Threats

; Delfino v. Agilent Technologies, 145 Cal. App. 4th 790, cert denied, 128 S. Ct. 98.

Failure to warn

; Jane Doe No. 14 v. Internet Brands, Inc., No. 12-56638.

Terrorism

; Force v. Facebook, Inc., 934 F.3d 53.

Similar legislation in other countries

European Union

Directive 2000/31/EC, the e-Commerce Directive, establishes a safe harbor regime for hosting providers.
Article 17 of the updated Directive on Copyright in the Digital Single Market makes providers liable if they fail to take "effective and proportionate measures" to prevent users from uploading certain copyright violations and do not respond immediately to takedown requests.

Australia

In Dow Jones & Company Inc v Gutnick, the High Court of Australia treated defamatory material on a server outside Australia as having been published in Australia when it is downloaded or read by someone in Australia.
Gorton v Australian Broadcasting Commission & Anor 1 ACTR 6
Under the Defamation Act 2005, s 32, a defence to defamation is that the defendant neither knew, nor ought reasonably to have known of the defamation, and the lack of knowledge was not due to the defendant's negligence.

New Zealand

Failing to investigate the material or to make inquiries of the user concerned may amount to negligence in this context: Jensen v Clark 2 NZLR 268.

France

Directive 2000/31/CE was transposed into French law by the Loi pour la confiance dans l'économie numérique (LCEN). Article 6 of the law establishes a safe harbor for hosting providers as long as they follow certain rules.
In LICRA vs. Yahoo!, the High Court ordered Yahoo! to take affirmative steps to filter out Nazi memorabilia from its auction site. Yahoo!, Inc. and its then president Timothy Koogle were also criminally charged, but acquitted.

Germany

In 1997, Felix Somm, the former managing director for CompuServe Germany, was charged with violating German child pornography laws because of the material CompuServe's network was carrying into Germany. He was convicted and sentenced to two years probation on May 28, 1998. He was cleared on appeal on November 17, 1999.
The Oberlandesgericht Cologne, an appellate court, found that an online auctioneer does not have an active duty to check for counterfeit goods.
In one example, the first-instance district court of Hamburg issued a temporary restraining order requiring message board operator Universal Boards to review all comments before they can be posted to prevent the publication of messages inciting others to download harmful files. The court reasoned that "the publishing house must be held liable for spreading such material in the forum, regardless of whether it was aware of the content."

United Kingdom

The laws of libel and defamation will treat a disseminator of information as having "published" material posted by a user, and the onus will then be on a defendant to prove that it did not know the publication was defamatory and was not negligent in failing to know: Goldsmith v Sperrings Ltd 2 All ER 566; Vizetelly v Mudie's Select Library Ltd 2 QB 170; Emmens v Pottle & Ors 16 QBD 354.
In an action against a website operator in respect of a statement posted on the website, it is a defence to show that it was not the operator who posted the statement. The defence is defeated if it was not possible for the claimant to identify the person who posted the statement, or if the claimant gave the operator a notice of complaint and the operator failed to respond in accordance with regulations.