Nine days. That’s how long the UK has before its government must publish a report on copyright and AI that could reshape the economics of AI training data in one of the world’s most significant creative content markets.
The House of Lords Communications and Digital Committee published its report on March 9, 2026. According to legal analysis from Mishcon de Reya, the committee sets out a licensing-first approach: a framework that would require AI developers to obtain licences to use copyrighted material for training purposes, rather than relying on a broad text and data mining (TDM) exception that the creative industries have resisted since the debate began.
What the committee recommends
Based on legal analysis from Mishcon de Reya, the committee’s position is that the government should rule out a commercial TDM exception with a rightsholder opt-out. The opt-out model, where content is freely usable for AI training unless a rightsholder actively removes it, has been the most contentious proposal in the UK’s multi-year copyright consultation. Publishers, musicians, visual artists, and writers have argued consistently that an opt-out model places an impossible enforcement burden on individual rightsholders while giving AI developers default access to years of human-created work.
The committee also recommends new protections against identity and style imitation, according to Mishcon's analysis. This is a separate but related concern for professional creators whose distinctive styles can be replicated by AI systems trained on their work.
These are committee recommendations, not law. The government is not bound to adopt them.
What a licensing-first framework means in practice
If the UK government moves in the direction the committee recommends, AI developers training on UK-origin content would need to secure licences from rightsholders or their collective management organizations. For large-scale pre-training datasets, that means negotiating with publishers, record labels, stock image libraries, and news organizations: entities that have already begun demanding compensation in the US market and in EU member states.
The cost implications vary significantly by sector. Developers building general-purpose foundation models face the broadest exposure. Companies fine-tuning on narrow, controlled datasets may find their existing data governance practices largely adequate under a licensing regime. The line between those two positions is not always obvious and will likely require legal assessment on a case-by-case basis.
How this compares to the EU and US approaches
The UK’s pending decision sits between two already-established positions.
In the EU, the AI Act’s training data transparency requirements, combined with the existing Text and Data Mining provisions under the 2019 Copyright Directive, create a framework where TDM is permitted for research purposes, commercial TDM is allowed unless rightsholders opt out, and transparency about training data is increasingly required for high-risk and general-purpose AI systems. The EU approach leans toward a regulated exception with transparency obligations.
In the US, this week’s Supreme Court decision in Thaler v. Perlmutter settled the narrower question of whether AI-generated content qualifies for copyright protection: it does not, where no human author is involved. The broader question of whether AI training on copyrighted material constitutes infringement remains actively litigated and legislatively unresolved.
The UK, if it follows the committee’s recommendation, would chart a third path: mandatory licensing rather than a broad exception. That would make the UK the most restrictive of the three major markets for AI training data access, a significant commercial consideration for any developer whose training pipeline draws on English-language content.
What happens on March 18
The UK government has committed to publishing its report and impact assessment by March 18, 2026. This is a government commitment, not a policy announcement: the March 18 document is a response to the consultation, not necessarily a statement of final policy. It may signal the government’s direction, propose further consultation, or decline to adopt the committee’s recommendations.
The creative industries will be watching for whether the government explicitly rules out the opt-out TDM exception. AI developers will be watching for whether any licensing framework comes with defined costs, collective licensing structures, or transition timelines. Both groups have been waiting for clarity since the consultation opened.
What to watch
Three signals will determine whether this story accelerates or stalls after March 18:
First, whether the government’s report explicitly addresses the opt-out TDM exception. Silence on that specific mechanism would suggest continued ambiguity rather than resolution.
Second, whether any proposed licensing framework includes a collective licensing mechanism; the presence or absence of that structure determines whether compliance is operationally feasible for most developers.
Third, whether the government’s position creates alignment or tension with the EU’s existing framework. UK companies operating across both markets need compatible rules. Divergence creates dual compliance obligations that neither rightsholders nor developers have asked for.
The March 18 publication date is a milestone, not a finish line. But it’s the clearest signal of UK direction the market will have seen in years.
[COMPARISON TABLE, insert as interactive widget]
AI Training Data: US vs. UK Proposed vs. EU Approach
| Jurisdiction | Framework | TDM Access | Transparency Required | Status |
|---|---|---|---|---|
| United States | Litigation-driven | No statutory TDM exception; fair use contested | No general requirement | Active litigation |
| United Kingdom (proposed) | Licensing-first (committee recommendation) | No broad exception; licence required | Recommended | Government response due March 18 |
| European Union | Regulated exception | Commercial TDM permitted unless opt-out; research TDM broad | Yes, AI Act training data transparency | Enacted; implementation ongoing |