Regulation Daily Brief

Florida AG Opens Criminal Investigation Into OpenAI Citing AI's Alleged Role in FSU Shooting

3 min read · Source: Florida Attorney General / My Florida Legal (partial verification)
Florida Attorney General James Uthmeier has opened a criminal investigation into OpenAI following a shooting at Florida State University, according to multiple reports. The investigation is examining a legal theory with no established precedent in US case law: whether an AI system can be criminally liable under Florida's aiding and abetting statutes.

Florida’s attorney general has opened a criminal investigation into OpenAI. The investigation was initiated on April 21, following a shooting at Florida State University, according to reports corroborated across several publications. Attorney General James Uthmeier’s office is the named authority. The existence of the investigation is confirmed; what it specifically alleges, and whether it produces charges, is not yet established.

The legal theory under examination, according to analysis from My Florida Legal, the primary source for this story, is that Florida statutes imposing criminal liability on those who aid or counsel the commission of a crime could be applied to an AI system. Legal analysts note that this application would be unprecedented in US case law: no AI system has faced criminal liability under an aiding-and-abetting theory. That doesn’t mean the theory fails; it means the courts haven’t tested it.

According to reports, OpenAI has been subpoenaed for all internal policies regarding user safety protocols and cooperation with law enforcement. That subpoena claim comes from a single source and couldn’t be independently verified against primary legal documents at the time of publication. It should be treated as reported, not confirmed, until primary documentation is available.

What this investigation introduces, regardless of its outcome, is a liability question that every company deploying consumer-facing AI must now reckon with. The argument isn’t that the AI system pulled a trigger. The argument, as reported, is that the AI system may have aided a person who did. Whether that framing survives legal scrutiny is a question for Florida courts. But the filing of the investigation means a state attorney general is willing to make that argument in a formal enforcement context, and that matters for how developer liability is assessed going forward.

This investigation lands in the same window as the FTC’s Section 5 enforcement actions against AI companies for marketing deception, covered in our earlier enforcement brief. Two different legal theories, two different jurisdictions, two different enforcement agencies, but both represent formal government action against AI companies within weeks of each other. That pattern is worth naming.

What to watch: The Florida AG’s next steps are the immediate signal to monitor. If the investigation produces formal charges, the aiding and abetting theory enters the judicial process and begins generating case law. If it closes without charges, it still establishes that state attorneys general are willing to apply criminal frameworks to AI conduct, which shapes how future investigations are framed. Watch also for any OpenAI response, which may include procedural challenges to the subpoena or preemption arguments. The federal vs. state jurisdictional question in AI enforcement, examined in our federal preemption analysis, is directly relevant here.

For legal teams at companies with consumer-facing AI products: the practical implication isn’t to restructure your product today. It’s to understand whether your systems have safety features, law enforcement cooperation protocols, and user monitoring capabilities that could withstand the kind of scrutiny this investigation is applying. Document those policies. Know where they are.

April 24, 2026