Morrison & Foerster LLP

11/13/2024 | News release

The Future of Section 230: Protection Against Product Liability Claims

Courts around the country are grappling with Section 230 of the Communications Decency Act. Section 230 generally shields online platforms from liability for content posted by third-party users, but courts are now deciding whether it also protects online platforms from product liability claims. Plaintiffs have recently attempted to circumvent Section 230's robust immunity by bringing claims against online platforms based on theories of product liability, contending that Section 230 cannot protect against such claims.

Described by one scholar as the "twenty-six words that created the internet," Section 230 provides: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." In other words, Section 230 protects online platforms from civil liability for third-party content posted by a platform's users.

The Scope of Section 230

Passed in 1996, Section 230 was enacted in response to two cases addressing how to handle liability for comments that users posted to online forums. Those cases highlighted that existing law was ill-equipped to handle the liability issues accompanying the rise of the internet and prompted lawmakers to craft legislation that would encourage and protect free speech online. Since its enactment, Section 230 has allowed online platforms to operate without fear of liability for content posted by third parties.

Litigants have repeatedly argued over the statute's scope and the extent of its protections. Yet, in keeping with the statute's broad language, courts have consistently construed Section 230's protection expansively, holding that it offers online platforms near-total immunity from civil liability for providing a forum for information created by third parties.

This expansive understanding of Section 230 has historically required dismissal of practically all claims against internet companies for content posted to their platforms by third-party users.

But Section 230 has recently come under increasing fire from the plaintiffs' bar, which argues that the law, as interpreted by courts, gives online platforms too much protection from liability involving user-generated content. In an effort to circumvent these protections, plaintiffs' attorneys have begun to rebrand claims against online platforms as product liability claims, alleging that the platforms are defectively designed "products."

A Traditional Section 230 Defense

A Section 230 defense contains three elements: First, the online platform must be a provider of an interactive computer service, such as a social media website. Second, the cause of action must treat the defendant as the publisher of the content in question. And, third, the content in question must have been provided by someone other than the defendant (generally, a third-party user of the online platform).

No one disputes that social media and communications platforms meet the first two elements. The focus is on the third element: whether the content in question was provided by the online platform itself or by a third-party user. Traditionally, courts have dismissed claims against social media companies as barred by Section 230 because the companies cannot be considered the "publishers" of third-party content. As plaintiffs pursue product liability cases, however, they have tried to shift the focus of their claims away from this test.

The Product Liability Argument

Recently, some courts have been forced to evaluate product liability claims against tech companies and online platforms in cases alleging that the platforms are defectively designed "products," not publishers of third-party content entitled to Section 230 immunity. Some courts have found that the inquiry is highly fact-based and cannot be resolved on the face of the pleadings, allowing these cases to survive motions to dismiss.

But consistent with historical precedent, other courts have continued to reject claims brought against online platforms based on Section 230. These courts have held that tech companies remain immune under Section 230, despite plaintiffs' efforts to rebrand their claims.

For example, the Fifth Circuit upheld a lower court's decision that the defendant could not be held liable for sexual abuse that allegedly occurred over its photo-messaging app. Although the plaintiff attempted to characterize the claim as one for negligent design, the court found that the plaintiff ultimately sought to impose liability for information provided by a third-party user, which Section 230 expressly prohibits.

Tech companies have responded to plaintiffs' attempts to recharacterize their claims by explaining how Section 230 is critical to their business models. Limiting the scope of Section 230's protection, they caution, would lead to online censorship and curb free speech. Supporters of Section 230's protections have also noted that permitting product liability claims against online platforms would render the statute nearly meaningless and encourage the kind of misdirected litigation that Section 230 was designed to prevent.

Looking Ahead

Courts have had many opportunities to weigh in on Section 230's ability to foreclose product liability claims but have often sidestepped the issue. The Supreme Court has yet to weigh in, and the question seems ripe for future review.

Although the weight of authority counsels that Section 230 immunizes editorial decisions and algorithmic recommendations, the Third Circuit recently bucked that trend, finding that the extent to which a platform's software pushes content on users is relevant to the Section 230 inquiry. In August 2024, the Third Circuit found that Section 230 did not immunize TikTok against claims arising from the death of a 10-year-old girl who attempted a self-asphyxiation "blackout challenge" that she saw on the app. Reversing the district court's order granting TikTok's motion to dismiss, the Third Circuit held that Section 230 did not shield TikTok from liability because the platform curates recommendations and "expressive content" for users through its "For You Page" algorithm. TikTok has moved for rehearing en banc, arguing that its algorithm serves a "publisher" function that has long been held to enjoy Section 230 immunity.

In 2022, Justice Clarence Thomas called for the Supreme Court to take up the matter and define the scope of Section 230, stating: "Assuming Congress does not step in to clarify Section 230's scope, [the Court] should do so in an appropriate case." Given Justice Thomas's statement and the inconsistency of lower court rulings in recent years, guidance from the Supreme Court seemed imminent. But in July 2024, the Court declined to grant certiorari in a case appealing a Fifth Circuit ruling that Section 230 immunity barred the plaintiff's suit against a social media company. Justice Thomas, joined by Justice Neil Gorsuch, dissented from the Court's decision not to hear the case, writing, "[M]ake no mistake about it-there is danger in delay. . . . Social-media platforms have increasingly used Section 230 as a get-out-of-jail free card."

Until the Supreme Court settles the issue, companies that run online platforms should continue to take precautions to protect themselves from possible product liability exposure. For example, software developers should carefully consider what warnings might be appropriate on their platforms and continue to refine best practices for addressing potential safety concerns as they design their software. We encourage developers to consult with counsel about whether Section 230 can protect against product liability and other safety-related claims directed at their platforms, and we will continue to provide updates on these issues.