Baker & Hostetler LLP


Setting Some Ground Rules: Gavin Newsom Signs Bills Regulating AI in Political Campaigns

09/30/2024 | 3 minute read

What would you do if you received a call from the sitting president of the United States telling you not to vote in your state's upcoming party primary election?

In January 2024, many New Hampshire voters faced this exact question when they received a call that appeared to come from a former New Hampshire Democratic Party chair, carrying a message from President Joe Biden telling them not to vote in their primary. The problem? The call came from inside the house. All kidding aside, it did come from an unexpected source - artificial intelligence (AI). The former chair never made the call, and Biden never made the statement.

This story set the news ablaze early in the 2024 election cycle. The public began to witness AI's potential to manipulate voters in the coming election. Since then, further incidents have continued to illustrate the dangers:

  • On August 13, 2024, Elon Musk's AI chatbot, Grok, began allowing users to create AI-generated images from text prompts. Soon thereafter, X (formerly Twitter) users inundated the social media platform with fake photorealistic political images, including those of both presidential candidates, Vice President Kamala Harris and former President Donald Trump. The flood of potentially dangerous misinformation prompted X to implement restrictions on Grok's text-to-image ability only three days later.
  • In July 2024, a video featuring a voice that seemed to be Harris's went viral. The voice was laid over a series of short clips, some containing Harris and some not, and at points the montage was interspersed with actual footage of Harris speaking. Throughout the video, the voice made statements such as "I was selected because I am the ultimate diversity hire" and "I had four years under the tutelage of the ultimate deep-state puppet." Although the video was originally titled "Kamala Harris Campaign Ad PARODY," that label got lost as X users reposted the video both on and off the platform.
  • In September 2024, when Taylor Swift formally endorsed Harris, she cited her concerns about Trump's use of several fake AI-created images of her and her fans to claim he had her support.

Sparked by these concerns, the California Legislature sprang into action, and California Governor Gavin Newsom subsequently signed a suite of three bills into law on September 17. The laws aim to protect voters from "deepfakes" - fake photorealistic images or videos of real people.

  • AB 2355 requires political campaigns to include a disclosure in any advertisement created using AI, stating that AI was used to make it. The disclosure must be written or oral, depending on the advertising medium.
  • AB 2839 prohibits knowingly distributing materially deceptive AI-generated content involving candidates for office or elected officials within 120 days before a California election and, in some cases, for 60 days after. The law allows a judge to issue an injunction requiring the content to be taken down and, potentially, to award civil damages.
  • AB 2655 requires "large online platforms" to remove or label deepfakes within 72 hours of a user report. The law defines "large online platform" as "a public-facing internet website, web application, or digital application, including a social media platform ... , video sharing platform, advertising network, or search engine that had at least 1,000,000 California users during the preceding 12 months."

In signing the bills, Newsom stated, "Safeguarding the integrity of elections is essential to democracy, and it's critical that we ensure AI is not deployed to undermine the public's trust through disinformation - especially in today's fraught political climate."

Although the laws explicitly exempt parody and comedy, critics worry they could encroach on individuals' freedom of speech. Supporters counter that placing guardrails on potentially harmful disinformation is long overdue, especially with the 2024 election only 36 days away.

Newsom, however, vetoed a more comprehensive AI bill, SB 1047 (the "Safe and Secure Innovation for Frontier Artificial Intelligence Models Act"), which the California Legislature passed last month. SB 1047 would have imposed significant requirements on AI developers. Lawmakers and the AI and tech industries were similarly divided over whether such legislation would establish much-needed safeguards or halt AI development.

Co-author: Noah J. Morris