
An overview of the high-level takeaways from the government’s economic impact assessment and policy report on the use of copyright works in the development of AI.
The government has published its economic impact assessment and policy report under the Data (Use and Access) Act 2025 (DUAA).
The immediate effect on the balance between AI and creative industries is limited.
There will be no new legislation, no new regulator, and the courts will apply existing law.
After much anticipation, on 18 March the government released its reports as prescribed under sections 135 and 136 of the DUAA. As discussed in our previous article on the progress statement, the government faced the unenviable task of balancing the interests of the creative and AI industries despite their divergent priorities.
In this article, we provide an overview of the contents of the report and impact assessment, together with high-level insights. This article is the first in a series. Future articles will provide more detail on topics within the report and the knock-on effects for business.
The scale of the competing interests helps to explain the challenge. The UK's creative industries generated £146 billion in Gross Value Added (GVA) in 2024, representing around 6% of the UK economy. The UK AI sector, the third largest globally, contributed approximately £12 billion GVA in 2024, but could add between £20 billion and £90 billion to UK GVA by 2030.
Productivity gains from AI adoption across the whole economy also weigh in the balance. The OECD estimates that wider AI adoption could add 0.4 to 1.3 percentage points to the UK's productivity growth, equivalent to adding £55 to £140 billion to UK GVA in 2030, though these estimates are described as “highly uncertain”.
Each of the reform options outlined below has significant downsides in this context, and those downsides appear to have strongly influenced the government’s position.
Why hasn't the government reformed copyright law?
As covered in a previous article, the government's consultation from last year attracted 11,520 responses. The majority came from rights holders and the creative industries. The four options on the table were:
Do nothing (Option 0);
Strengthen copyright to require licensing in all cases (Option 1);
A broad data mining exception (Option 2); and
A data mining exception with a rights reservation opt-out (Option 3).
The government's originally preferred option was Option 3, which mirrors the Digital Single Market Directive. However, this was rejected by most respondents. Many in the creative industries were concerned that a broad exception would allow AI to learn from their works without compensation, whilst some in the AI and research sectors argued it would be more restrictive than approaches taken in other countries.
The economic case for acting decisively in either direction was also found to be weak. Whilst providing insights into the respective contributions to the UK economy, the Impact Assessment does not present a substantive cost-benefit assessment of each of the options. On the grounds that the international context is evolving rapidly, the debate has been left in much the same position as when the consultation launched in December 2024.
The most significant policy development in the report is the formal rejection of Option 3. As we noted in our comparative analysis, the proposed opt-out model appeared to be heavily inspired by Article 4 of the EU's Digital Single Market Directive. The government now states that a broad copyright exception with opt-out is no longer its preferred way forward, and proposes to gather further evidence, consider alternative approaches, and monitor developments in technology, litigation, the international arena and the licensing market. This is perhaps unsurprising as less than 3% of respondents to the consultation were in favour of Option 3, but this does mean that there will eventually be clear winners and losers as the position crystallises.
This marks a significant divergence from the EU's approach, as does the refusal to impose statutory transparency obligations. We will explore this regulatory gap in more detail in a forthcoming article, but the approach on transparency is covered below.
Transparency: light touch approach
One of the most practically significant decisions in the report is the decision not to impose statutory transparency obligations over training data. The government proposes to work with industry and experts to develop best practice on input transparency, rather than introducing statutory requirements.
This matters because transparency is a precondition for enforcement. Without knowing whether their works were used in training, rights holders cannot bring infringement claims, as exemplified by the Getty Images v Stability AI case.
Getty abandoned its primary copyright infringement and database rights claims because there was insufficient evidence that training or development had taken place in the UK. This demonstrates clearly the practical enforcement problem: without transparency obligations, rights holders often cannot obtain the evidence needed to establish where training occurred, let alone bring a primary infringement claim that their works were used.
For rights holders and creative businesses, existing rights remain intact, but enforcing them against overseas-trained models is difficult without transparency obligations in place. Taking practical steps to safeguard your data and licensing it out (whether directly or through intermediaries) may be a more accessible route to remuneration than facing the substantial evidential hurdles of litigation.
For AI developers and those deploying AI tools, proactively documenting training datasets and agreeing licensing for those that are core to the business offering remains sensible risk mitigation regardless of the outcome.
So, in short, the copyright and AI debate has not been resolved. Policy has shifted from Parliament into the courts and the commercial marketplace, at least for now.
If you would like to discuss how this affects your business, whether as a creator, AI developer or purchaser of AI solutions, please get in touch with Jonathan Bywater.
Co-authored with the current Innovation Trainee, Angus Wilson.
Disclaimer
This article is intended for general information purposes only and does not constitute legal advice. For advice specific to your situation, please contact our team at T & M Legis for a consultation with our Legal Experts.

