California Chamber of Commerce

08/07/2024 | News release

‘Godmother of AI’ Warns SB 1047 AI Bill Restricts Innovation

In a commentary for Fortune yesterday, Dr. Fei-Fei Li warned that California's artificial intelligence (AI) bill SB 1047 would have significant unintended consequences that would stifle innovation.

Widely credited with being the "Godmother of AI," Li is a renowned computer scientist, professor and co-director of Stanford's Human-Centered AI Institute.

In her commentary, Li called SB 1047 "well-meaning," but warned that because of the penalties and restrictions the legislation places on open-source development, SB 1047 will harm innovation not just in California but across the entire country.

"If passed into law, SB-1047 will harm our budding AI ecosystem, especially the parts of it that are already at a disadvantage to today's tech giants: the public sector, academia, and 'little tech.'" she said. "SB-1047 will unnecessarily penalize developers, stifle our open-source community, and hamstring academic AI research, all while failing to address the very real issues it was authored to solve."

SB 1047 is scheduled to be considered by the Assembly Appropriations Committee today.

Safe and Secure Innovation for Frontier Artificial Intelligence Models Act

SB 1047 (Wiener; D-San Francisco), the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, requires frontier AI developers to make a positive safety determination before initiating training of a covered model, among other things, subject to harsh penalties, including criminal liability. The bill creates significant uncertainty for businesses due to vague, overbroad, and at times infeasible standards, requirements, and definitions. It focuses almost exclusively on developer liability, holding developers responsible for failing to foresee and block any and all conceivable harmful uses of a model, even if a third party jailbreaks the model.

As a consequence of such issues, SB 1047 deters open-source development and undermines technological innovation and the economy. It further imposes unreasonable requirements on operators of computing clusters, including a requirement to predict whether a prospective customer "intends to utilize the computing cluster to deploy a covered model" and to implement a "kill switch" to enact a full shutdown in the event of an emergency. It also establishes an entirely new regulatory body, the "Frontier Model Division" within the Department of Technology, with an ambiguous and ambitious purview.

SB 1047 Stifles Innovation

Li pointed out that it is impossible for AI developers, particularly budding coders and entrepreneurs, to predict every possible use of their models. SB 1047's penalties unduly punish developers and will force them to pull back.

The bill also "shackles" open-source development by mandating, in certain cases, a "kill switch": a mechanism by which the program can be shut down at any time.

"If developers are concerned that the programs they download and build on will be deleted, they will be much more hesitant to write code and collaborate," she said.

Open-source development is also vital to academia, and the bill's restrictions on it would be a "death knell" for academic AI, Li warned.

"Take computer science students, who study open-weight AI models. How will we train the next generation of AI leaders if our institutions don't have access to the proper models and data? A kill switch would even further dampen the efforts of these students and researchers, already at such a data and computation disadvantage compared to Big Tech," she said.

Rather than pass an "overly and arbitrarily restrictive" mandate such as SB 1047, California should adopt a policy that will empower open-source development and put forward uniform and well-reasoned rules, she urged.