IQ Times Media – Smart News for a Smarter You
AI

California’s new AI safety law shows regulation and innovation don’t have to clash 

By IQ TIMES MEDIA · October 1, 2025 · 5 Mins Read


SB 53, the AI safety and transparency bill that California Gov. Gavin Newsom signed into law this week, is proof that state regulation doesn’t have to hinder AI progress.  

So says Adam Billen, vice president of public policy at youth-led advocacy group Encode AI, on today’s episode of Equity. 

“The reality is that policymakers themselves know that we have to do something, and they know from working on a million other issues that there is a way to pass legislation that genuinely does protect innovation — which I do care about — while making sure that these products are safe,” Billen told TechCrunch. 

At its core, SB 53 is a first-in-the-nation bill that requires large AI labs to be transparent about their safety and security protocols — specifically, how they prevent their models from contributing to catastrophic risks, like being used to commit cyberattacks on critical infrastructure or to build bio-weapons. The law also mandates that companies stick to those protocols, which will be enforced by the Office of Emergency Services.  

“Companies are already doing the stuff that we ask them to do in this bill,” Billen told TechCrunch. “They do safety testing on their models. They release model cards. Are they starting to skimp in some areas at some companies? Yes. And that’s why bills like this are important.” 

Billen also noted that some AI firms have a policy around relaxing safety standards under competitive pressure. OpenAI, for example, has publicly stated that it may “adjust” its safety requirements if a rival AI lab releases a high-risk system without similar safeguards. Billen argues that policy can enforce companies’ existing safety promises, preventing them from cutting corners under competitive or financial pressure. 

While public opposition to SB 53 was muted in comparison to its predecessor SB 1047, which Newsom vetoed last year, the rhetoric in Silicon Valley and among most AI labs has been that almost any AI regulation is anathema to progress and will ultimately hinder the U.S. in its race to beat China.  


It’s why companies like Meta, VCs like Andreessen Horowitz, and powerful individuals like OpenAI president Greg Brockman are collectively pumping hundreds of millions into super PACs to back pro-AI politicians in state elections. And it’s why those same forces earlier this year pushed for an AI moratorium that would have banned states from regulating AI for 10 years.  

Encode AI led a coalition of more than 200 organizations that worked to strike down the proposal, but Billen says the fight isn’t over. Senator Ted Cruz, who championed the moratorium, is attempting a new strategy to achieve the same goal of federal preemption of state laws. In September, Cruz introduced the SANDBOX Act, which would allow AI companies to apply for waivers to temporarily bypass certain federal regulations for up to 10 years. Billen also anticipates a forthcoming bill establishing a federal AI standard that would be pitched as a middle-ground solution but would in reality override state laws. 

He warned that narrowly scoped federal AI legislation could “delete federalism for the most important technology of our time.” 

“If you told me SB 53 was the bill that would replace all the state bills on everything related to AI and all of the potential risks, I would tell you that’s probably not a very good idea and that this bill is designed for a particular subset of things,” Billen said.  

Adam Billen, vice president of public policy, Encode AI. Image Credits: Encode AI

While he agrees that the AI race with China matters, and that policymakers need to enact regulation that will support American progress, he says killing state bills — which mainly focus on deepfakes, transparency, algorithmic discrimination, children’s safety, and governmental use of AI — isn’t the way to go about doing that. 

“Are bills like SB 53 the thing that will stop us from beating China? No,” he said. “I think it is just genuinely intellectually dishonest to say that that is the thing that will stop us in the race.” 

He added: “If the thing you care about is beating China in the race on AI — and I do care about that — then the things you would push for are stuff like export controls in Congress. You would make sure that American companies have the chips. But that’s not what the industry is pushing for.” 

Legislative proposals like the Chip Security Act aim to prevent the diversion of advanced AI chips to China through export controls and tracking devices, and the existing CHIPS and Science Act seeks to boost domestic chip production. However, some major tech companies, including OpenAI and Nvidia, have expressed reluctance or opposition to certain aspects of these efforts, citing concerns about effectiveness, competitiveness, and security vulnerabilities.  

Nvidia has its reasons — it has a strong financial incentive to continue selling chips to China, which has historically represented a significant portion of its global revenue. Billen speculated that OpenAI could hold back on chip export advocacy to stay in the good graces of crucial suppliers like Nvidia. 

There’s also been inconsistent messaging from the Trump administration. Three months after expanding an export ban on advanced AI chips to China in April 2025, the administration reversed course, allowing Nvidia and AMD to sell some chips to China in exchange for 15% of the revenue. 

“You see people on the Hill moving towards bills like the Chip Security Act that would put export controls on China,” Billen said. “In the meantime, there’s going to continue to be this propping up of the narrative to kill state bills that are actually quite light touch.” 

Billen added that SB 53 is an example of democracy in action — of industry and policymakers working together to get to a version of a bill that everyone can agree on. It’s “very ugly and messy,” but “that process of democracy and federalism is the entire foundation of our country and our economic system, and I hope that we will keep doing that successfully.” 

“I think SB 53 is one of the best proof points that that can still work,” he said. 
