Second Circuit Affirms Video Sharing Site’s Immunity From Suit Under CDA Section 230 For Removal of User Content

By: Tyler G. Newby, Ethan M. Thomas

In a ruling that affirms the immunity of user-generated content platforms from suit for removing content from their sites, a panel of the U.S. Court of Appeals for the Second Circuit unanimously held that Section 230 of the Communications Decency Act (CDA) immunized Vimeo from a lawsuit challenging its removal of videos promoting sexual orientation change efforts (SOCE) from its platform.

The case, Domen v. Vimeo (2d Cir. Mar. 11, 2021), represents the first time the court applied Section 230 immunity in the context of removing (as opposed to permitting) content on an online platform. While several courts have held that Section 230(c)(1) provides immunity to online platforms for content removal (or refusal) decisions, the Second Circuit’s ruling is significant in that it is the first circuit court of appeals decision applying Section 230(c)(2)(A) as a basis for immunity at the pleading stage, without the need to develop the factual record through discovery.

The CDA and Section 230

Section 230 is the principal legal protection afforded to online platforms from lawsuits over content posted by users of their platforms. Section 230 contains three provisions specifying when platforms will be immune from suit: first, in subsection (c)(1), as a “publisher”; second, in subsection (c)(2)(A), for the Good Samaritan removal or filtering of content; and third, in subsection (c)(2)(B), as a provider of the technical means to restrict content. Subsection (c)(1) states:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

Subsection (c)(2) states:

No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

The breadth of Section 230 has been under attack in recent years. In 2018, the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (FOSTA) was signed into law, amending Section 230 to eliminate platforms’ immunity from prosecution for violating certain state sex trafficking laws. It also eliminated platforms’ immunity from civil suits brought by victims of sex trafficking for knowingly promoting and facilitating sex trafficking.

The Senate held a hearing in 2020 to address Section 230 with executives from Twitter, Facebook, and Google present, in which some senators accused the platforms of engaging in political censorship. During the past 12 months, numerous bills have been introduced that would further pare back the immunity Section 230 provides to platforms, both for removing and for failing to remove certain categories of third-party content.

Section 230 and Content Removal

Judicial decisions applying Section 230 to the content-removal context are fairly recent and limited to a handful of courts. As a magistrate judge in the Southern District of New York noted in Vimeo, although several district courts had held that subsection (c)(1) immunizes platforms from suit for content removal, no circuit court of appeals and no court in the Second Circuit had held that Section 230 specifically protected content removal decisions. See Domen v. Vimeo (S.D.N.Y. 2020) (citing Riggs v. MySpace (9th Cir. 2011); Ebeid v. Facebook (N.D. Cal. May 9, 2019); Lancaster v. Alphabet (N.D. Cal. July 8, 2016); and Mezey v. Twitter (S.D. Fla. July 19, 2018)).

Moreover, some courts had previously found that subsection (c)(2)(A) was not an appropriate basis for dismissal at the pleading stage. For example, in Nat’l Numismatic Certification v. eBay (M.D. Fla. July 8, 2008), the district court denied eBay’s motion to dismiss, on subsection (c)(2)(A) grounds, a lawsuit over its removal of a coin seller’s listings that failed to meet eBay’s certification criteria. The court held that it could not determine at the pleading stage whether eBay had acted in good faith because the complaint had plausibly alleged bad faith.

Domen v. Vimeo

Church United is a nonprofit religious corporation founded by James Domen, who claimed that after he “was a homosexual” for three years, he began identifying as a “former homosexual” because of his desire to pursue his faith in Christianity. Church United alleged that it had used its paid Vimeo account to upload videos, including ones addressing “sexual orientation as it relates to religion” and others more generally promoting SOCE. Vimeo notified Church United that the account violated the platform’s policy against videos that promote SOCE, warning that the account could be removed if the offending content was not voluntarily taken down. Church United did not remove the videos, and Vimeo deleted the account for posting “videos that harass, incite hatred, or include discriminatory or defamatory speech.”

Domen and Church United sued Vimeo for removing Church United’s account. The lawsuit alleged that Vimeo engaged in “censorship” by removing these videos and violated California and New York state laws prohibiting discrimination on the basis of sexual orientation and religion. The district court dismissed the complaint under Rule 12(b)(6) for failure to state a claim, finding that subsections (c)(1) and (c)(2) both immunize Vimeo from the state-law claims. Though the district court did not perform separate analyses for subsections (c)(2)(A) and (c)(2)(B), it relied primarily on the language from subsection (c)(2)(A) to find Vimeo was empowered to “police content” in the manner alleged.

In affirming, the Second Circuit addressed only subsection (c)(2) of Section 230 as the basis for dismissal. The court noted that subsection (c)(2) is a “broad provision” that forecloses civil liability where providers restrict access to content that they “consider[] . . . objectionable.” (emphasis in opinion). Thus, Vimeo had discretion to “consider” SOCE material objectionable. Additionally, the panel held that Section 230 does not require any particular form of content restriction, and that removing Church United’s account after a warning was therefore appropriate.

In other cases, courts have declined to dismiss content removal claims under subsection (c)(2)(A) where the plaintiffs have alleged that the platform acted in bad faith, because “bad faith” is a fact-laden issue that requires discovery. That was not the case here: the panel held that Church United’s allegations of bad faith were “far too conclusory” to allow the court to infer that Vimeo engaged in “anti-competitive conduct or self-serving behavior in the name of content regulation.”

Instead, Vimeo’s removal decision was a “straightforward consequence of Vimeo’s content policies,” which were adequately communicated to Church United. The court was unpersuaded by Church United’s arguments that Vimeo’s decision to allow other videos regarding sexual orientation and religion (though not SOCE) suggested bad faith, noting “one purpose of Section 230 is to provide interactive computer services with immunity for removing ‘some—but not all—offensive material from their websites.’” (quoting Bennett v. Google (D.C. Cir. 2018)). The court observed that content moderation is necessarily a discretionary exercise, and allegations of inconsistency are not enough to pierce the protection of immunity.

In reaching this conclusion, the court distinguished the Ninth Circuit’s opinion in Enigma Software Group USA v. Malwarebytes (9th Cir. 2019), which held that a security software company that blocked the installation and operation of another company’s software on its users’ computers was not immune from suit under Section 230(c)(2)(B) because the plaintiff there plausibly alleged that the companies were competitors.

Unlike the parties in Vimeo, the parties in Enigma were both antimalware software providers. Enigma alleged that “Malwarebytes publicly mischaracterized Enigma’s programs . . . as potentially unwanted” programs and that it did so “to interfere with Enigma’s customer base and divert those customers to Malwarebytes,” thereby violating the Lanham Act. Malwarebytes argued that it had blocked Enigma’s software for legitimate reasons and that, in any event, the claims were barred by Section 230. The Ninth Circuit reversed the district court’s finding of immunity and remanded, holding that the anticompetitive allegations were sufficient to survive dismissal at the pleading stage.

The Second Circuit held that, unlike in Enigma, Church United failed to allege that Vimeo’s removal constituted “anti-competitive conduct or self-serving behavior in the name of content regulation,” as opposed to “a straightforward consequence of Vimeo’s content policies.” Though the Second Circuit’s holding is consistent with Enigma, it made clear that plaintiffs must do more than plead bare assertions of bad faith.

Takeaways

The Vimeo decision confirms the immunity that Section 230 affords to online content providers in the context of removal decisions, and it reiterates that dismissal on this ground is appropriate at the pleading stage. However, the holding relied in many places on pleading failures that did not allow the court to infer bad faith or an illicit ulterior motive. Content providers can continue to rely on Section 230 immunity, but they should remain cognizant of litigation risk from plaintiffs who plead more plausible allegations of anti-competitive or self-serving purposes behind moderation decisions.
