Regulation Deep Dive

Federal AI Preemption Is Unresolved: What Law Firms Are Telling Compliance Teams to Do Now

Three federal AI governance instruments are now in public circulation: the White House's National Policy Framework for AI, Senator Blackburn's TRUMP AMERICA AI Act discussion draft, and the Democratic GUARDRAILS Act. None of them is law. Major law firms, including Latham & Watkins, Gibson Dunn, Crowell, Akin Gump, and Finnegan, have now issued client guidance on what this unresolved landscape means in practice. The question compliance teams face isn't what these proposals say. It's what to do while Congress deliberates.

Three documents are reshaping how compliance teams think about U.S. AI law. None of them has been enacted.

On March 20, 2026, the White House released its National Policy Framework for AI, a set of non-binding legislative recommendations calling on Congress to adopt a “light-touch” regulatory approach. Two days earlier, on March 18, Senator Marsha Blackburn (R-Tenn.) circulated a discussion draft of the TRUMP AMERICA AI Act. Then Democratic House lawmakers introduced the GUARDRAILS Act to repeal the administration’s December 2025 executive order on AI and block any moratorium on state AI regulation.

Three proposals. Three competing visions. Zero enacted law.

What has changed in the past week is not the policy landscape itself; it's the professional guidance surrounding it. As of March 26, at least five major law firms have published client analyses of the Framework and the Blackburn draft. The question those analyses are trying to answer is the same one every compliance officer with exposure to state AI laws is asking: what do we do right now?

The Preemption Stakes

The Framework’s most consequential recommendation for compliance teams is its call on Congress to preempt state AI laws that impose “undue burdens” on innovation, while preserving states’ traditional police powers for areas like child protection, fraud prevention, and consumer protection.

That carve-out matters. It means that even under the Framework's preferred outcome, not all state AI laws disappear. The compliance picture under a preemption regime is not a clean slate; it's a filtered one. Organizations currently subject to state AI laws in California, Colorado, Texas, and elsewhere would still need to map their obligations against whatever police-powers carve-outs survive.

The Blackburn discussion draft takes a similar approach on preemption. It would establish a federal liability framework and, according to Latham & Watkins’ analysis of the draft, require chatbot developers to exercise a “duty of care” in system design. Latham & Watkins’ analysis also indicates the draft would address copyright training data use, though the specifics of that provision should be confirmed against the discussion draft text before acting on them.

One provision warrants particular caution. Latham & Watkins’ analysis references a requirement for periodic third-party audits related to political bias. This claim has not been independently confirmed by a second source. Organizations reviewing the discussion draft should treat that provision as unverified until they have read the draft text directly.

What Five Law Firms Are Advising

The volume of law firm guidance published in a single week tells its own story. Latham & Watkins, Gibson Dunn, Crowell, Akin Gump, and Finnegan each issued client alerts or analyses between March 24 and March 26. Their assessments are practitioner interpretation of a non-binding framework and an unintroduced draft bill, not legal authority. But the convergence in their guidance is a useful signal.

Several themes appear across multiple firms’ analyses:

The Framework’s preemption call creates a compliance planning problem, not a compliance obligation. Organizations cannot restructure state-law compliance programs around a legislative recommendation that may never pass. The risk of premature restructuring (abandoning state-law controls that remain legally operative) is higher than the risk of maintaining them while federal legislation remains unresolved.

The “undue burdens” standard is undefined. The Framework calls for preempting state laws that impose undue burdens, but it provides no metric for what constitutes an undue burden. That ambiguity is likely intentional: it preserves congressional flexibility. For compliance teams, it means any scenario planning based on which state laws would survive preemption is necessarily speculative at this stage.

The Blackburn draft’s duty-of-care standard for chatbot developers is the most operationally concrete provision in either instrument. A reasonable-care standard is familiar legal ground. Companies operating chatbot-facing products should be reviewing their current design and disclosure practices against a reasonable-care baseline regardless of whether the draft advances, because that standard is also consistent with emerging state-level frameworks.

These are areas of convergence. Divergence exists too. Firms differ on how to weigh the Framework’s industry-led standards approach against the Blackburn draft’s liability provisions, and on how seriously to treat the preemption recommendation given the current congressional calendar.

The Opposition Position

The GUARDRAILS Act names the stakes clearly. Representative Don Beyer (D-Va.) and four Democratic co-sponsors introduced the bill to repeal the Trump Administration’s December 2025 AI preemption executive order. According to the official statement from Beyer’s office, the bill would also block any federal moratorium on state AI regulation.

Rep. Beyer’s office stated that the administration “aims to kill state AI laws without setting even minimally acceptable federal guardrails, exposing the American public to harm.” That framing captures the core political disagreement: the preemption debate is not only about regulatory efficiency; it’s about who bears the risk if federal standards prove inadequate.

The GUARDRAILS Act is introduced legislation, not enacted law. Its primary importance at this stage is as a signal of Democratic opposition to broad federal preemption. That opposition makes passage of the Framework’s preemption recommendation, which requires legislative action, more uncertain, not less.

Comparison: Three Instruments, Key Dimensions

| Dimension | White House Framework | Blackburn Discussion Draft | GUARDRAILS Act |
| --- | --- | --- | --- |
| Legal status | Non-binding legislative recommendations | Discussion draft, not formally introduced | Introduced legislation, not enacted |
| Preemption approach | Recommends federal preemption of state laws imposing “undue burdens” | Proposes federal preemption with similar scope | Opposes preemption; would block state-law moratorium |
| New regulatory bodies | No; directs existing agencies | No; federal liability framework through courts/existing law | N/A (repeal bill) |
| Liability framework | Not specified | Federal liability framework; duty of care for chatbot developers | N/A |
| IP/copyright treatment | Addresses IP rights; according to legal analysts, leaves fair use to courts | Would address copyright training-data use (confirm against draft text) | N/A |
| Child safety provisions | Yes, explicit priority | Not confirmed in available analyses | N/A |
| Police powers carve-out | Yes: child protection, fraud, consumer protection | Similar carve-out reported | N/A |

Note: The Blackburn draft’s “annual third-party audits for political bias” provision is not included in this table. That provision is reported by a single analysis source and has not been independently confirmed. Compliance teams should read the discussion draft text directly before treating it as a planning input.

What Compliance Teams Should Do Now

The compliance-relevant takeaway from this week’s law firm guidance converges on a few practical points.

Don’t restructure. State AI laws remain operative. The Framework’s preemption recommendation creates no legal obligation and no safe harbor. Organizations that dismantle state-law compliance controls based on a non-binding recommendation are exposing themselves to enforcement risk in states where those laws apply.

Document your current state. If and when federal preemption legislation does advance, organizations with documented compliance programs will be better positioned to demonstrate good-faith efforts under any transition framework. The documentation you build now has dual value.

Monitor the Blackburn draft’s progression. A discussion draft is the earliest stage of the legislative process. It can change substantially or never advance. The duty-of-care and liability provisions are worth tracking because they represent a direction, even if the specific language shifts.

Watch for a companion Senate vehicle. The Framework was released by the White House; the Blackburn draft is a Senate Republican instrument. If a companion House bill emerges, or if the Blackburn draft is formally introduced, the legislative timeline becomes more concrete.

The GUARDRAILS Act’s introduction is itself a monitoring signal. Democratic opposition to preemption, expressed through legislation, makes bipartisan preemption legislation harder to pass. Organizations can reasonably treat broad federal preemption as a medium-to-long-term possibility rather than a near-term certainty.

TJS Synthesis

The story from this week is not that three federal AI governance instruments exist. That story was reported when each was released. The story is that major law firms have now had time to read them carefully, and their guidance reveals a notable gap between the Framework’s ambition and its practical utility for compliance planning.

A non-binding recommendation to preempt state AI laws that impose “undue burdens” tells compliance teams almost nothing actionable. The standard is undefined. The legislation is unwritten. The politics are contested. What it does do is put every organization currently running state-law AI compliance programs on notice that the ground may shift, without telling them when, or how far.

The firms tracking this most closely are telling clients the same thing: maintain what you’ve built, document it well, and watch the Blackburn draft’s trajectory. That’s cautious advice for a cautious moment. It’s also the right advice. The organizations that will be best positioned when federal AI law finally crystallizes are the ones that didn’t tear down their compliance infrastructure on the basis of a proposal.
