Perkins Coie LLP


Supreme Court Finds First Amendment Barriers to TX and FL Social Media Regulation

07.26.2024 | Updates

On July 1, 2024, the Supreme Court of the United States ruled in Moody v. NetChoice, LLC that laws regulating large social media platforms passed by Texas and Florida likely offend the First Amendment in at least some applications, including to the extent that those laws require platforms to change the way that user content feeds are curated and moderated. Because the Supreme Court found that the lower courts considering these cases, the U.S. Courts of Appeals for the Fifth and Eleventh Circuits, had not properly analyzed all possible applications of the laws (as required for a "facial" First Amendment challenge), the Court vacated both appellate decisions and remanded for further proceedings.

A majority of the Supreme Court rejected the states' assertion that content moderation on social media raises no First Amendment issues and can be restricted by states based on a desire to achieve "viewpoint neutrality." The Court held, instead, that platforms, like newspapers, bookstores, and other traditional publishers and curators, have First Amendment rights to determine what content to offer and how to organize it. And the Court further made clear that its traditional First Amendment precedents apply to that activity. But notably, the Court declined to go further. Instead, it instructed the lower courts to perform a highly particularized analysis, considering the individual characteristics of different platforms and how each of the laws' myriad provisions would apply on the ground.

Brief Factual and Procedural Background

Texas and Florida's laws broadly limit the ability of social media platforms to engage in content moderation by filtering, prioritizing, or labeling the materials posted by platform users. The laws also require platforms to provide individualized explanations when removing or altering users' posts. The goal of both legislatures was to address perceived bias in the application of content standards on social media that purportedly stifle conservative viewpoints.

Industry groups (led by NetChoice) challenged both laws on their face, contending that the laws infringed on platforms' constitutionally protected editorial judgments. District courts in both states issued preliminary injunctions preventing the laws from taking effect.

On appeal, the Fifth Circuit reversed and would have upheld Texas's law, reasoning that online speech platforms are analogous to the common-carrier entities that can be required by the government to accept all users seeking services, irrespective of their political viewpoint. The Fifth Circuit saw Texas's law as aimed at state regulation of conduct (purported viewpoint discrimination against conservative opinions), rather than any speech of the platforms. The Eleventh Circuit, by contrast, agreed with the district court and NetChoice that Florida's law violates NetChoice members' First Amendment right to determine how to curate the speech on their platforms, without reaching the district court's alternate holding that SB 7072 was preempted in part by Section 230 of the Communications Decency Act. The Supreme Court granted certiorari to resolve the circuit split.

Supreme Court Decision

Justice Elena Kagan authored the opinion of the Court, joined by Chief Justice John Roberts and Justices Sonia Sotomayor, Brett Kavanaugh, Amy Coney Barrett, and (in part) Ketanji Brown Jackson. The majority held that the Texas and Florida laws were "unlikely to withstand First Amendment scrutiny" because they "prevent[] exactly the kind of editorial judgments [the Supreme Court] has previously held to receive First Amendment protection." But because the lower courts in both cases had failed to conduct the analysis necessary to decide the facial constitutional challenges brought by NetChoice, the Court remanded for further proceedings. The Court noted that litigants and the lower courts had failed to consider less obvious applications of the laws (e.g., whether and how the laws apply to private messaging, email filters, and customer reviews). The Court stressed that, going forward, litigants and lower courts considering facial challenges must keep in mind that "[t]he online world is variegated and complex, encompassing an ever-growing number of apps, services, functionalities, and methods for communication and connection," and the "regulation of those diverse activities could well fall on different sides of the constitutional line."

The Court then offered guidance to the lower courts on remand to correct the flaws in the Fifth Circuit's attempted revision of First Amendment precedent to exclude online speech platforms: "The Fifth Circuit was wrong in concluding that Texas's restrictions on the platforms' selection, ordering, and labeling of third-party posts do not interfere with expression. And the [Fifth Circuit] was wrong to treat as valid Texas's interest in changing the content of the platforms' feeds."

A narrower majority (Justice Jackson objected to "deciding more than [was] necessary") explained why the Fifth Circuit's approach is foreclosed by longstanding First Amendment precedent:

  1. "The First Amendment offers protection when an entity engaging in expressive activity, including compiling and curating others' speech, is directed to accommodate messages it would prefer to exclude." That protection is implicated when Texas attempts to direct social media platforms to accommodate "both sides" of any particular topic. Texas's law would require, for example, a platform to accommodate pro-Al Qaeda messages alongside anti-Al Qaeda ones.
  2. It does not matter if an entity like the platforms exercises its editorial control sparingly by moderating only a small percentage of messages: "It 'is enough' for a compiler to exclude the handful of messages it most 'disfavor[s].'"
  3. Advancing the government's "own vision of ideological balance" is not a legitimate government purpose that could justify interfering with private speech even under intermediate First Amendment scrutiny, much less strict scrutiny.

Justice Barrett's Concurrence

Justice Barrett wrote separately to voice her concern over the "daunting, if not impossible" task posed by facial First Amendment challenges in this context. Justice Barrett, the decisive fifth vote in favor of the Court's holding that some content moderation is protected speech, raised three questions that will be of interest going forward in cases involving online speech platforms:

  1. Do viewpoint-neutral algorithms that present "content similar to posts with which the user previously engaged" (i.e., the kinds of structures that some plaintiffs and regulators have claimed are addictive) constitute expressive conduct?
  2. Similarly, is artificial intelligence (AI) content moderation expressive conduct? Justice Barrett questioned whether AI relying on "large language models to determine what is 'hateful'" and removing that content is entitled to the same First Amendment protection as "a human being with First Amendment rights" making an expressive choice. "[T]echnology may attenuate the connection" between content moderation and human beings' constitutional rights to make expressive decisions for themselves.
  3. Does the First Amendment apply differently to an online speech platform that is foreign-owned and where "corporate leadership abroad makes the policy decisions about the viewpoints and content the platform will disseminate"?

The Three Other Concurrences

Justices Jackson, Clarence Thomas, Samuel Alito, and Neil Gorsuch all opined that the Court should not have weighed in on the merits of Texas's law at all, and they criticized the majority's substantive analysis on that point as unnecessary dicta. The concurring opinions reveal that some of those Justices would be inclined to accept much more state regulation of online speech platforms in future cases.

Justice Alito, in particular, emphasized his view that the Court's holding was "narrow": as he reads the majority opinion, it holds merely that NetChoice had failed to prove that the Florida and Texas laws it challenged are facially unconstitutional. And he called out "the incompleteness of th[e] record," critiquing the majority opinion for "unreflectively assum[ing]" NetChoice's "unsupported" arguments that today's social-media platforms "are just as expressive as the newspaper editors who marked up typescripts in blue pencil 50 years ago." Justice Alito also described the Fifth Circuit's common carrier analysis as worthy of "serious treatment" on remand. And Justice Thomas, in an opinion just for himself, similarly argued that common carrier principles "may have weighty implications for the trade associations' claims."

Key Takeaways and Next Steps

The Supreme Court's ruling leaves enforcement of the Texas and Florida laws paused while NetChoice's challenges continue to be litigated. No matter the outcome of these cases, courts will continue to wrestle on a case-by-case basis with the scope of social media regulation that the First Amendment permits. Indeed, just last week, during the U.S. Court of Appeals for the Ninth Circuit's oral argument in NetChoice v. Bonta (another facial First Amendment challenge, this time to California's Age-Appropriate Design Code), members of the court raised questions about the severability of social media regulations and whether, and to what extent, parts of California's law should stand if others are found unconstitutional under Moody.

Moody offers several key takeaways for companies and individuals interested in state attempts to regulate online speech platforms:

  • A majority of Justices (five votes) agree that content moderation choices are protected speech under the First Amendment. But the nuances of a platform's content moderation procedures will be carefully scrutinized in future cases. And the use of technology, such as artificial intelligence, to make or support content moderation decisions may affect the outcome of First Amendment arguments. Similarly, the foreign ownership of a business could affect whether and how the First Amendment applies.
  • Some of the most helpful and clear guidance from the Court, about how and why the Texas law fails to pass muster under the First Amendment when applied to content moderation features, is arguably dicta. (Justices Jackson, Alito, and Thomas all say so expressly.) The precedential impact of that section of the majority opinion is likely to be contested in future cases by states seeking to regulate online speech platforms.
  • The Moody decision did not address the relationship between the challenged regulations and Section 230 of the Communications Decency Act, which multiple circuit courts have interpreted to preempt state laws that would hold covered online platforms liable for removing (or failing to remove) user-created content.
  • Justices Thomas, Alito, and Gorsuch appear open to regulating social media platforms as common carriers, but the five-justice majority gave no indication that it would support such a paradigm shift.

Businesses that may be affected by content moderation laws, including Texas's HB 20 and Florida's SB 7072, should consult with knowledgeable counsel. Perkins Coie's experienced team routinely advises on these matters and represents platforms in content-moderation litigation.

© 2024 Perkins Coie LLP
