California Chamber of Commerce

09/13/2024 | News Release

Coalition Seeks Veto of Election Disinformation Bill Based on Incorrect Assumptions

Legislation aimed at preventing online election disinformation should be vetoed because it is based on false assumptions, could have a chilling effect on online speech and raises First Amendment questions, according to a coalition including the California Chamber of Commerce and technology industry associations.

AB 2655 (Berman; D-Palo Alto) falsely assumes that online platforms definitively know whether any particular piece of content has been manipulated in a manner defined under the bill.

AB 2655 requires large platforms to:

  • block the posting or sending of materially deceptive and digitally modified or created content related to elections, during specified periods before and after an election;
  • label certain additional content as inauthentic, fake, or false during specified periods before and after an election; and
  • develop procedures for Californians to report content that has not been blocked or labeled in compliance with the bill.

The bill authorizes candidates for elected office and the Attorney General, among others, to seek injunctive relief against a large online platform for noncompliance.

The coalition has pointed out that AB 2655 is unlikely to have the desired outcomes because it incorrectly presumes that platforms are the appropriate arbiters of what constitutes election disinformation. Most digital services are not equipped with the tools or expertise to make such judgments.

If enacted, the bill would result in significant suppression of political speech out of fear of liability, in violation of the First Amendment, which affords the broadest protection to political speech.

Because covered platforms are not privy to the intent and context behind each piece of content, they may inadvertently over-block or over-label material. This could lead to user frustration and the suppression of political speech, which is fundamental to the First Amendment's purpose.

Adding to the incentive for covered platforms to remove even accurate information is the limited time frame within which a platform must respond to a user complaint about content: 36 hours, with no opportunity to appeal. A platform that fails to respond within that window is subject to civil action.

AB 2655 focuses on enforcement against the covered platforms rather than holding accountable the purveyors of the deceptive content. As a result, online platforms face an endless cycle of addressing deceptive content.

Leading online businesses invest significant resources in moderating online content, including the specific types of content AB 2655 seeks to address. For example, responsible digital services providers take aggressive steps to moderate dangerous and illegal content, consistent with their terms of service.

The companies deliver on their commitment to their user communities with a mix of automated tools and human review. In 2021, a number of online businesses announced their voluntary participation in the Digital Trust & Safety Partnership to develop and implement best practices for a safer and more trustworthy internet.

Staff Contact: Ronak Daylami