
Trump and congressional GOP revive push to block state AI laws

After a defeat over the summer, AI boosters haven’t given up on federal preemption.


An effort to block states from regulating AI appears to be rising from the dead.

As states pass more rules around the burgeoning technology, AI industry boosters are once again pushing the federal government to stymie them.

Congressional Republican leaders are reportedly mulling a provision in the National Defense Authorization Act (NDAA) that would override state AI laws. Politico reported that they’re considering merging the push with rules to protect children’s safety online.

And the Trump administration floated a draft executive order that pressures states to forgo AI regulation through lawsuits and withholding federal funds, though the White House has reportedly put it on ice for now.

Over the summer, the Senate shut down a similar moratorium, which would have blocked AI-related laws at the state level for a decade, in a resounding 99–1 vote.

The defeat hasn’t stopped factions of the AI industry from pushing for a revived effort as industry-backed lobbies step up spending in individual states. They argue that a patchwork of state requirements would hobble innovation.

State lawmakers are predictably opposed to the prospect of federal preemption; around 300 state legislators from both parties signed a letter opposing the ban’s inclusion in the NDAA. A site set up by the advocacy group Americans for Responsible Innovation has collected a running list of other statements in opposition to the preemption, including from some Republican governors and national figures on the party’s more populist flank.

What’s at stake: With not much AI regulation coming out of Washington these days, statehouses have taken up the mantle.

Seventeen new AI-related state measures are expected to go into effect on New Year’s Day alone. California recently passed the nation’s first law specifically regulating frontier models, and New York may be poised to do the same by the end of the year.


New York Assemblyman Alex Bores, author of an AI safety bill currently awaiting the governor’s signature and a recent target of a major pro-industry super PAC, said he doesn’t think a patchwork of state laws is a great outcome in the long run, either. But he noted that state lawmakers have worked together to keep these laws somewhat uniform.

“What sticks with me is, this is such a weird framing of, should we preempt the states or not? The question is, is Congress solving the problem or not? And if they’re not solving the problem, obviously the states should step up,” Bores told Tech Brew last week. “What’s happening right now is not a push for a federal standard, it is a push to ban states from doing anything and leave us at the whims of Congress taking action.”

A coming showdown: Whatever the fate of these particular efforts, the stage is being set for a big-money fight over AI policy in next year’s midterm elections. Well-heeled AI companies are starting to mount big lobbying operations, including the $100 million super PAC Leading the Future, which is targeting Bores’s current congressional run and is backed by Andreessen Horowitz, Perplexity, and OpenAI President Greg Brockman.

Meanwhile, The New York Times reported last week that AI safety advocates are raising money for their own network of super PACs that would back candidates who prioritize AI regulation. The advocacy network aims to raise $50 million to counter Leading the Future’s influence, with likely support from execs at Anthropic, OpenAI’s more safety-minded rival, the NYT reported.
