Three state legislatures moved on AI in April 2026. Not one of them was coordinating with the others.
That’s the context the daily brief can state but can’t fully unpack. This piece unpacks it: who is protected, who is regulated, what the competing interests look like, and what the accumulation of state-by-state AI legislation means for organizations trying to build a compliance posture that actually holds.
The April 2026 State Wave: What Each Law Does
Per NCSL’s state legislative tracking, three AI-related laws were enacted in April 2026. “Enacted” is the accurate framing used throughout this piece; operative dates should be confirmed against individual state legislative records.
New York. New York enacted legislation prohibiting AI systems from displacing state employees or overriding collective bargaining rights in workforce decisions. The law’s scope is state employment, not the private sector. Its compliance trigger is AI involvement in workforce decision-making: hiring, scheduling, performance evaluation, or any process that could result in state employee displacement. Per NCSL, the law creates a protected class of decision (union-covered state employment) that AI systems cannot autonomously override.
Montana. Montana enacted a Right to Compute law prohibiting government restrictions on private ownership of computational resources for lawful use. Per NCSL’s reporting, this is a restriction on government action, not a restriction on private actors. The law doesn’t regulate AI; it protects the infrastructure on which AI runs from state government interference. That’s a structurally different legislative theory than the other two laws.
Oregon. Oregon enacted a law prohibiting AI agents from using licensed medical professional titles (Registered Nurse is the example in the legislative record) when interacting with users. Per NCSL, the prohibition targets AI agent identity in healthcare-adjacent contexts. The compliance question is whether a given AI agent could be mistaken for a licensed professional. If yes, Oregon’s law creates an obligation to redesign the agent or clearly disclaim its non-professional status.
Stakeholder Map: Who Is Affected by Each Law
The three laws create three distinct stakeholder landscapes. Understanding which landscape you’re operating in is the first compliance step.
New York, Three Stakeholder Groups
State employers and their AI vendors are the primary regulated parties. Any vendor whose AI tool is used in New York state government workforce decisions, regardless of where the vendor is headquartered, faces compliance obligations. The law doesn’t exempt out-of-state developers. Labor unions are the protected party; the law explicitly preserves collective bargaining rights from AI override. HR technology developers building tools for state government procurement need to redesign or limit the automated decision-making scope of their tools to preserve human review at collective bargaining-adjacent decision points.
Montana, Three Stakeholder Groups
Cloud computing providers and data center operators are the most directly protected. The law bars state government from restricting private ownership or use of computational infrastructure for lawful purposes, which means Montana cannot prohibit a company or individual from owning GPU clusters, running AI inference workloads, or maintaining private compute capacity. Privacy advocates have a more complex relationship with this law: compute access protections and privacy protections don’t always point in the same direction (more compute capacity can mean more surveillance capacity). Individual developers and small AI operators benefit from the same protection as large cloud providers; Montana’s framing is ownership-neutral.
Oregon, Three Stakeholder Groups
Healthcare AI developers are the primary regulated party. Any AI tool that interacts with users in a healthcare context (patient-facing chatbots, AI-assisted triage systems, clinical decision support tools with user-facing components) needs to audit whether its agent persona could be mistaken for a licensed medical professional. Licensed medical professionals are the protected party; the law prevents AI from misappropriating professional credentials and the trust those credentials carry. Agentic AI system operators broadly, not just healthcare-specific developers, should assess whether any of their agents use professional-sounding titles in any context, because Oregon’s framing could be read expansively.
The Federal Preemption Tension
The Trump Administration’s push for a federal AI framework that preempts state law creates a direct tension with all three of these statutes. That dynamic has been analyzed in depth in this brief on the federal-state AI showdown and in the hub’s coverage of what federal preemption actually means for state law compliance.
For these three laws specifically, preemption vulnerability varies. New York’s worker protection law, applied only to state employees, may have stronger state sovereignty arguments; states have traditionally had broad authority over their own employment relationships. Montana’s Right to Compute law, framed as a restriction on state government rather than a regulation of private actors, could be characterized as a state constitutional governance question rather than an AI regulation subject to federal preemption. Oregon’s professional title prohibition addresses a domain (professional licensing) that is traditionally state-controlled. None of these is preemption-proof, but none is obviously preemption-vulnerable either.
Companies should not assume these laws will be preempted. They should assess compliance obligations now and monitor federal preemption developments in parallel.
The Compliance Patchwork Problem
Here’s what a multi-state compliance review looks like after April 2026, applied to a single hypothetical company type: an AI vendor that sells workforce optimization and clinical support tools to state government and healthcare clients across all three states.
New York: Audit whether the workforce optimization product makes or influences decisions that affect state employees covered by collective bargaining agreements. If yes, implement human-in-the-loop requirements at those decision points.
Montana: No direct compliance obligation as a private vendor; the law restricts government, not vendors. But clients in Montana who own private compute infrastructure are protected from state interference with that infrastructure. Worth knowing for contract and vendor relationship framing.
Oregon: Audit whether the clinical support tool’s agent persona uses or implies any licensed medical professional designation. If the agent introduces itself, displays credentials, or operates under a name that could suggest licensed status, redesign the identity layer.
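The Oregon audit step above can be sketched as a simple screening pass over an agent’s self-description. Everything here is illustrative: the title list, the function name, and the example intro are assumptions for demonstration, not drawn from the statute; a real audit would source title strings from Oregon’s professional licensing boards and pair automated screening with human review.

```python
import re

# Illustrative (not statutory) licensed-title patterns to screen for.
# Lowercase entries are matched case-insensitively; uppercase abbreviations
# like "RN" are matched case-sensitively to avoid false hits on ordinary words.
LICENSED_TITLE_PATTERNS = [
    "registered nurse",
    r"\bRN\b",
    "nurse practitioner",
    "physician",
    r"\bMD\b",
    "licensed clinical social worker",
]

def flags_licensed_title(agent_text: str) -> list[str]:
    """Return the licensed-title patterns an agent's self-description matches."""
    hits = []
    for pattern in LICENSED_TITLE_PATTERNS:
        flags = re.IGNORECASE if pattern.islower() else 0
        if re.search(pattern, agent_text, flags):
            hits.append(pattern)
    return hits

# Hypothetical agent intro that would warrant redesign or a disclaimer.
intro = "Hi, I'm Ava, your virtual registered nurse."
print(flags_licensed_title(intro))  # ['registered nurse']
```

A hit on this screen doesn’t settle the legal question, but it flags the agent identities that merit the redesign-or-disclaim analysis described above.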
Add EU AI Act obligations on top, which include risk classification, documentation of potential harms, and conformity assessment for high-risk systems including certain HR and clinical decision tools, and you have a multi-framework compliance matrix that no single policy document resolves. The hub has covered the EU dimension extensively in the 2026 AI compliance program planning brief.
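One way to make that multi-framework matrix concrete is a small obligations registry keyed by product and jurisdiction. This is a minimal sketch under stated assumptions: the product names, jurisdiction codes, and obligation wording paraphrase the hypothetical vendor in this section and are not legal text or a real compliance product.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Obligation:
    jurisdiction: str   # e.g. "NY", "OR", "EU"
    trigger: str        # what brings the product in scope
    action: str         # the compliance step to take

# Illustrative registry for the hypothetical vendor described above.
REGISTRY = {
    "workforce-optimization": [
        Obligation("NY", "influences decisions on union-covered state employees",
                   "human-in-the-loop review at bargaining-adjacent decision points"),
        Obligation("EU", "potential high-risk HR system under the EU AI Act",
                   "risk classification, harm documentation, conformity assessment"),
    ],
    "clinical-support": [
        Obligation("OR", "agent persona could imply a licensed medical title",
                   "redesign identity layer or add a non-professional disclaimer"),
        Obligation("EU", "potential high-risk clinical decision tool under the EU AI Act",
                   "risk classification, harm documentation, conformity assessment"),
    ],
}

def obligations_for(product: str, jurisdiction: str) -> list[Obligation]:
    """Look up the compliance steps for one product in one jurisdiction."""
    return [o for o in REGISTRY.get(product, []) if o.jurisdiction == jurisdiction]

for ob in obligations_for("clinical-support", "OR"):
    print(f"{ob.trigger} -> {ob.action}")
```

The design point is that obligations attach to (product, jurisdiction) pairs, not to the company as a whole, which is why no single policy document resolves the matrix: adding a state means adding rows, not rewriting the policy.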
What’s Coming
These three laws are consistent with a legislative pattern visible across multiple recent cycles. Worker protection framing is active in additional state legislative sessions beyond New York. The professional impersonation model (prohibiting AI from claiming credentials it doesn’t hold) is the type of targeted, scoped statute that tends to diffuse quickly because it’s narrow enough to defend against industry opposition. Montana’s compute rights framing is newer and less replicated; its durability as a legislative model is less certain.
TJS Synthesis
Three states, three legislative theories, three compliance populations. April 2026’s state AI wave isn’t a trend toward unified U.S. AI governance; it’s evidence that governance is fragmenting at exactly the moment companies most need coherent frameworks. The practical response isn’t to wait for federal preemption to resolve the patchwork. It’s to build a state-by-state tracking capability, identify which laws apply to which products in which jurisdictions, and update compliance assessments as the legislative calendar continues to produce new rows in a registry that isn’t slowing down.