Regulation Daily Brief

EU AI Act News: EC Assesses ChatGPT for DSA's Toughest Compliance Tier as 120M EU Users Nearly Triple the 45M Threshold

3 min read · Sources: Reuters / Tech in Asia / TechPolicy.Press
The European Commission is assessing whether to formally designate ChatGPT as a Very Large Online Search Engine under the Digital Services Act, a designation triggered by scale, not search function. According to Reuters reporting, OpenAI disclosed 120.4 million average monthly EU users for the six-month period ending September 2025, nearly triple the 45 million DSA threshold.

Scale is what the Digital Services Act uses to define its most demanding compliance tier. Under DSA Article 33, platforms and search engines are designated as Very Large Online Platforms or Very Large Online Search Engines (VLOPs and VLOSEs) once they exceed 45 million average monthly active EU users. ChatGPT's disclosed figure of 120.4 million is roughly 2.7 times that threshold. The European Commission is now assessing whether to formally designate ChatGPT under that framework; no designation has been announced as of publication.
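The threshold arithmetic above can be sketched in a few lines; this is a minimal illustration using the statutory 45 million figure and the user count reported by Reuters, not any official EC calculation.

```python
# DSA Article 33 threshold check, using the figures reported above.
# The 45M threshold is statutory; the 120.4M figure is OpenAI's
# disclosure for the six months ending September 2025, per Reuters.
DSA_THRESHOLD_M = 45.0      # million average monthly active EU users
REPORTED_USERS_M = 120.4    # million average monthly EU users

exceeds_threshold = REPORTED_USERS_M > DSA_THRESHOLD_M
multiple_of_threshold = REPORTED_USERS_M / DSA_THRESHOLD_M

print(f"Exceeds DSA threshold: {exceeds_threshold}")          # True
print(f"Multiple of threshold: {multiple_of_threshold:.1f}x") # 2.7x
```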

This would be a first. VLOSE designation was designed with search engines in mind, and applying it to an AI-native chatbot is a different proposition: ChatGPT doesn't index the web the way Google Search does, but it serves a comparable volume of EU users who treat it as an information retrieval tool. The EC's assessment is, in effect, a test of whether existing platform regulation can be retrofitted to AI-native products without new legislation.

What designation would require. Upon formal VLOSE designation, DSA obligations would activate on a defined timeline. The requirements are established law, confirmed in the DSA's text: annual systemic risk assessments under Article 34, independent audits under Article 37, transparency reporting, and researcher data access. Under DSA Article 33(6), a designated provider must comply with those obligations within four months of being notified of the designation. These are not speculative outcomes; they are the confirmed statutory consequences of designation. What's unconfirmed is when, or whether, the EC will make the designation formal.
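The four-month compliance clock described above can be sketched as simple date arithmetic. The designation date below is hypothetical, since no formal designation has been announced.

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Return the same day-of-month `months` later, clamped to month end."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31,
                     30, 31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, days_in_month))

# Hypothetical designation date -- illustration only.
designation = date(2026, 6, 1)
compliance_deadline = add_months(designation, 4)
print(compliance_deadline)  # 2026-10-01
```

The clamping step matters for end-of-month dates: a notification on October 31 would yield a deadline of February 28 (or 29), not an invalid February 31.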

The distinction matters for compliance planning. DSA obligations don’t attach on the day a company exceeds the user threshold. They attach when the EC makes a formal designation decision. As of publication, that decision hasn’t been made. TechPolicy.Press reporting confirms the assessment is active as of April 2026, with no formal designation announcement.

Why it matters. If ChatGPT is formally designated, OpenAI faces annual third-party audits, risk assessment publication requirements, and researcher data access obligations in its largest non-US market. These are operationally significant: independent audits of AI systems at this scale require documentation infrastructure that doesn't exist off the shelf. More broadly, the designation would establish precedent for how the EU treats AI-native products under a regulatory framework written before generative AI existed at scale.

For other AI companies with EU user bases, the ChatGPT assessment is a threshold visibility event. The 45 million DSA figure is fixed in law. Any AI product approaching that scale in the EU should have compliance counsel tracking this designation process closely.

Context and precedent. The EU already applies the VLOP/VLOSE framework to major social platforms and search engines; ChatGPT's designation would extend that tier to conversational AI for the first time. The "Three AI Governance Models" deep-dive published earlier this cycle covered how the EU AI Act approaches high-risk AI. The DSA designation is a parallel track, focused on systemic risk at platform scale rather than use-case risk classification under the AI Act, and both can apply simultaneously to the same product.

Some analysts, per Handelsblatt reporting, have suggested DSA compliance obligations could slow EU feature rollout for OpenAI, including SearchGPT and multimodal capabilities. This remains speculative ahead of any formal designation and should not be treated as a confirmed consequence.

What to watch. Three triggers worth monitoring: (1) the EC's formal designation decision, which, once announced, starts the DSA compliance clock; (2) whether OpenAI challenges the designation before the EU General Court, a procedural option prior VLOP/VLOSE designees have used; (3) how the EC handles the interaction between DSA VLOSE obligations and GPAI model requirements under the EU AI Act, since ChatGPT potentially faces both frameworks simultaneously. No official EC statement on that interaction has been issued as of this reporting cycle.

TJS synthesis. The EC's ChatGPT assessment is less about whether ChatGPT is a search engine and more about whether the EU's existing scale-based regulatory framework can absorb AI-native products without bespoke legislation. The 120.4 million user figure isn't a technicality; it's the data point that puts ChatGPT firmly in the DSA's crosshairs. EU compliance teams at AI companies should treat the formal designation as a when, not an if, and begin mapping DSA Article 34 and 37 requirements to their documentation and audit infrastructure now.
