White House official advocates for a ‘give and take’ on state AI preemption

President Donald Trump speaks with Michael Kratsios, director of the White House Office of Science and Technology Policy, during a roundtable on Ratepayer Protection Pledge in the Indian Treaty Room at the Eisenhower Executive Office Building on the White House campus in Washington, DC on March 4, 2026. ANDREW CABALLERO-REYNOLDS / AFP via Getty Images

Office of Science and Technology Policy Director Michael Kratsios argued the administration’s AI policy framework allows states to regulate certain aspects of the technology — like child safety and state government procurement — while ensuring a national standard.

An administration official joined Republican lawmakers in advocating for the passage of an AI moratorium on Wednesday, calling it a key regulatory step.

Speaking at the Axios DC+AI Summit, White House Office of Science and Technology Policy Director Michael Kratsios reiterated the administration’s stance as outlined in the National Policy Framework for AI and said he hopes Congress will take those principles into account as it prepares sweeping AI legislation.

“We think we have something that is very common sense and can be very bipartisan,” Kratsios said regarding the administration’s framework.

Kratsios noted that preemption of state AI regulations — which has proven controversial since the concept was floated in Congress last year — is paired in the framework with carve-outs for popular bipartisan, state-level issues, such as child safety and utility ratepayer protections, preserving states’ ability to regulate those subjects.

“We believe that we can't have preemption without other stuff attached to it, and it has to be a give and take with both,” Kratsios said. “We're not going to be preempting the way that an individual state government can procure AI, for example, and that's something that we want them to continue to be able to do anything they would like. So I do actually think it's a tailored approach to the issue of one national framework that allows states to take action, but also provides American innovators and American families a common picture for how the tech industry will be regulated.”

Lawmakers speaking at the event outlined what they will look for in an updated moratorium and future AI regulation overall.

Sen. Josh Hawley, R-Mo., underscored his desire to pass federal chatbot safety legislation for children online. And though he voted to remove the AI moratorium first proposed for inclusion in the 2025 reconciliation bill, Hawley echoed Kratsios’s approach to striking a balance between offering solid federal regulation and allowing states to enforce existing legislation on AI safeguards. 

“The idea with the framework that's new is actually calling on Congress not just to say no to a bunch of state laws, but to put down a federal standard. That's a very different thing,” Hawley said. “I think if we can protect child safety, if we can stop the data centers from sending up electricity rates, if we can make sure that states are able to continue to enforce their own zoning laws and so forth, which is carved out of the framework, then I think you've got the makings there of a national framework that could really work.”

Hawley added that the ability to offer tailored exceptions to certain issues is key to the framework’s success, particularly around child safety laws. 

“I'm encouraged that the framework carves out whole bunches of state laws, you know, particularly the children's safety laws. Those should be allowed to remain in place no matter what,” he said. 

Hawley noted that the framework still needs a level of development, or “build out,” but said it is a good step toward protecting citizens’ interests — namely child safety, low utility prices and workers.

Rep. Kat Cammack, R-Fla., cited concerns about state laws stifling innovation — another major administration concern — as the biggest rationale for federal preemption.

“The idea that we're not going to have any sort of framework or guardrails, that's just not realistic,” Cammack said. “But there's also the real concern that I certainly have, that we're going to push innovators out of the space, and so you can't get to the point where it's just become impossible to do anything, and that's going to require some preemption.” 

Cammack acknowledged that crafting a level of federal preemption for state AI regulations will be difficult to navigate, but protecting innovation is paramount. 

“The reality is that we're not going to be in a position where — particularly startups — but the companies are not going to be able to have a framework for 50 different states and really survive,” she said.

Hawley and Cammack’s perspectives offer insight into the future of what national AI legislation in Congress may look like, as well as its chances of passing the Senate and House. 

Last week, a group of House Democrats introduced a new bill to counteract federal preemptive action against state AI laws, with Sen. Brian Schatz, D-Hawaii, filing companion legislation in the Senate. 

Sen. Marsha Blackburn, R-Tenn., who, alongside Hawley, voted against the AI moratorium in 2025, published a discussion draft of her own national AI framework legislation, a document that closely mirrors the Trump administration’s policy framework. 

Included in the draft text is a section that explicitly deems federal legislation as preemptive, making an exception for cases where a state is enacting a law or regulation that “provides greater protection to minors than the protection provided by the provisions of this title.”

While the future of an AI moratorium remains uncertain, administration leaders and lawmakers have been mulling over how best to package an updated moratorium in legislation, eyeing child safety bills as a potential vehicle.