On April 17, 2026, the US federal government moved on AI governance from two directions at once.
The Trump Administration finalized its National Policy Framework for AI, a document that recommends Congress establish federal AI standards preempting state laws, creates an AI Litigation Task Force to challenge conflicting state regulations, and organizes its recommendations under seven policy pillars: Children, Communities, IP, Free Speech, Innovation, Workforce, and State Preemption. According to analysis from A&O Shearman and Holland & Knight, the framework explicitly recommends against creating a new federal AI rulemaking body, preferring instead to route oversight through existing sector-specific regulators.
Hours later, Representative Beyer and co-sponsors introduced the GUARDRAILS Act, a bill designed to prevent the federal government from imposing a moratorium on state AI regulation.
State laws remain in effect. The framework is non-binding. The GUARDRAILS Act is introduced, not enacted. But the positions are now formally on the record. And the stakeholder map that determines the outcome of this fight is now clear enough to read.
Part 1: The White House Position
The administration’s framework argues that the patchwork of state AI bills threatens to fragment the US innovation landscape. Multiple law firm analyses describe those bills as numbering over 1,000, though that figure comes from legal commentary rather than an independently verified count. The proposed solution is federal preemption: one national standard that supersedes inconsistent state requirements.
What the framework does not do is establish that standard. It recommends that Congress create one. The AI Litigation Task Force is the administration’s near-term tool for challenging state laws in court while Congress considers the longer-term legislative question.
The framework also reflects a structural preference: no new federal AI agency. Existing sector regulators (the FTC, SEC, FDA, and others) retain their AI-adjacent authorities. The White House is not asking for a US equivalent of the EU AI Act’s centralized enforcement model. It’s asking for a preemption ceiling on state regulation, not a federal floor of substantive requirements.
That distinction matters for compliance planning. Federal preemption without a robust federal standard doesn’t simplify compliance; it creates a ceiling with no floor.
Part 2: The Congressional Counter, GUARDRAILS Act
Representative Beyer’s GUARDRAILS Act is specifically designed to prevent the administration from using executive authority to impose a moratorium on state AI regulation. The bill targets a narrow but consequential scenario: an administration that tries to halt state-level AI enforcement while the federal legislative process moves slowly.
Per Holland & Knight’s analysis, the GUARDRAILS Act is a direct legislative response to the administration’s framework, introduced the same day, not coincidentally. It signals that a bipartisan consensus on federal preemption doesn’t exist. Democratic lawmakers, and potentially some Republican ones in states with active AI legislation, have incentives to preserve state authority.
The GUARDRAILS Act’s introduction is not the same as its passage. It’s a signal. What it signals is that the administration faces organized congressional resistance before it has even asked Congress to act on the framework’s recommendations.
Part 3: The State Landscape
State laws are the object of the preemption fight. Two are worth naming explicitly.
New York’s RAISE Act imposes transparency and accountability requirements on high-risk AI systems used in consequential decisions: employment, housing, credit, public services. It targets deployments within New York, not just companies headquartered there.
California has enacted multiple AI transparency and safety provisions over the past two years, including requirements for disclosure of AI-generated content and restrictions on certain high-risk automated decision systems.
The framework’s preemption recommendation, if enacted by Congress, would supersede these laws to the extent they conflict with the federal standard. The problem is that there is no federal standard yet. Until Congress acts, the state laws remain in effect.
State attorneys general from multiple states have publicly supported preserving state AI authority. The AI Litigation Task Force will face adversaries ready to challenge it in court the moment it moves against a state enforcement action.
The “1,000+ state bills” framing the administration uses is a characterization from legal analysis, but the underlying legislative reality it describes is accurate in direction, if not precisely in count. States are active. The patchwork is real. Whether federal preemption is the right solution is contested. That the problem the framework identifies exists is not.
Part 4: The EU Contrast
The US preemption debate comes into sharper relief against the EU AI Act.
The EU’s approach is the inverse of what the administration proposes. The EU AI Act establishes a centralized, federal-equivalent framework with affirmative requirements on high-risk AI systems (conformity assessments, technical documentation, human oversight mandates), enforced through national competent authorities coordinated at the EU level. It creates substantive obligations, not just a preemption ceiling.
The US framework, by contrast, proposes preempting state regulation while recommending no new federal substantive requirements and no new enforcement body. The result, if enacted as proposed, would be a regulatory landscape with fewer obligations than either the EU model or the current state-level patchwork.
For companies operating in both markets, the EU AI Act’s requirements don’t change based on US preemption outcomes. Those are parallel tracks. What US preemption determines is whether a company’s US compliance burden is set by a federal standard or by the aggregate of active state laws. Until a federal standard exists, state law is the operative floor.
Part 5: What Compliance Teams Should Do Now
Four practical positions follow from this stakeholder map.
First, don’t pause state compliance programs. The framework is non-binding. State laws are operative. Scaling back compliance investment based on a preemption proposal that still requires Congressional action and likely litigation to take effect is a risk management error.
Second, document your current state compliance posture. When a federal standard does emerge, whether through this framework’s recommendations or a successor legislative process, companies that have already mapped their obligations to specific state laws will have a head start on gap analysis.
Third, monitor the AI Litigation Task Force. If it moves against a specific state law (New York’s RAISE Act is a plausible early target given its employer obligations), the resulting litigation will be the most concrete indicator of how aggressively the administration plans to exercise preemption authority before Congress acts.
Fourth, watch the GUARDRAILS Act’s committee trajectory. If it advances, it signals that congressional resistance to executive preemption is durable. If it stalls, the administration has more room to push through the Task Force channel.
TJS Synthesis
The US AI preemption fight is structurally similar to prior federal-versus-state regulatory contests (financial regulation, environmental standards, data privacy) where the outcome took years and multiple legislative cycles to resolve. The pattern is familiar: a federal executive frames state variation as a coordination problem; states and allied legislators frame federal preemption as an accountability threat; courts adjudicate the boundary. AI governance is following this pattern at speed.
The compliance implication is straightforward. You’re in a multi-year transition. The winning strategy isn’t to bet on one outcome (federal preemption or state primacy) and design your compliance program around it. The winning strategy is to build a program that can adapt. Know your state obligations now. Map them against the framework’s seven pillars. Identify where a future federal standard would simplify your burden and where it might create new obligations. That mapping exercise is worth doing regardless of which side wins the preemption fight.