As readers of this blog will be aware, the rapid development of AI over the last few years has given rise to a number of significant debates on copyright. None more so than the debate over how best to balance the rights of AI developers and copyright owners when training generative AI.
Whilst this remains a fluid debate, with no clear policy direction yet emerging – at least in the UK – two recent reports have indicated that the winds may be shifting within the UK and the EU in favour of rights holders.
On 6 March, the Communications and Digital Committee of the UK House of Lords (CDC) published its report “AI, copyright and the creative industries”. This was shortly followed by the European Parliament adopting a resolution on copyright and generative artificial intelligence on 10 March (EP Resolution). It’s fair to say that both of these reports are very much pro-copyright, favouring licensing and remuneration over broad exceptions.
At the risk of over-simplification, we explore below five key areas of overlap between the two reports.
1. Voluntary licensing
Both reports propose that the default position should focus on strong copyright protection and voluntary licensing (individual or collective). They argue that AI developers should obtain permission to use copyright protected materials for training their AI models and, at least according to the EP Resolution, beyond (e.g. for inferencing and retrieval-augmented generation).
2. Remuneration
In return for granting those licences, the reports call for rights holders to receive fair remuneration for the use of their works. They also call for consideration to be given to whether, and if so how, rights holders should be paid for past uses of their works.
3. Technical standards for control
In order to support such a licensing framework, both reports note the importance of having effective, machine-readable rights-reservation mechanisms to enable rights holders to make clear whether (and, if so, on what terms) they are willing to license their content to generative AI developers. The EP Resolution goes a step further, proposing that any such opt-outs should be recorded in a European register maintained by the EU Intellectual Property Office.
4. Transparency
Increased transparency regarding content used to train generative AI models was regarded by both reports as essential to support a functioning licensing market. To be effective, however, both reports noted that any transparency obligations would need to be more granular than those required under the EU AI Act (see our earlier blog), with both reports finding those obligations to be inadequate. Acknowledging AI developers’ trade secret concerns around disclosing granular details of their training data, however, both reports suggested that AI developers could disclose those details confidentially to a trusted intermediary or regulator, which could then inform relevant rights holders about the use of their works on a need-to-know basis.
The EP Resolution also proposed that failure to comply with EU transparency obligations should give rise to a “rebuttable presumption” that the relevant copyright-protected content has been used for AI training, supported by cost consequences on the AI provider if the relevant rights holder were to ultimately prevail in court. Similar proposals were put to the CDC, but were not mentioned in the CDC’s final recommendations.
5. Territoriality
Building on the provisions already contained in Article 53(1)(c) of the EU AI Act (which we discussed in this briefing), the EP Resolution calls for EU copyright law to apply to the use of European content to train any generative AI models or systems that are made available on the EU market, regardless of where those models or systems were trained, with those models and systems that don’t comply being barred from the EU market. Extra-territorial effect is also discussed in the CDC report, but the CDC’s recommendations are more considered and limited to compliance with transparency requirements. Whilst noting that the evidence presented to it was “compelling”, the report acknowledged that “it is important to recognise the territorial nature of copyright and the limits of its ability to regulate training that occurs entirely overseas”, before simply proposing that “the Government should consider how public procurement and regulatory tools could support compliance with UK transparency requirements by AI developers operating in the UK”.
Comment
Whilst there are clear differences between the two reports, it’s interesting to see such similar themes arising at the same time in both the UK and the EU.
The CDC report is not entirely unexpected, given the pro-rightsholder stance the House of Lords has taken in the past, led by Baroness Kidron (although she isn’t a member of the Committee that was responsible for preparing this report). The clear aim is to try to influence the UK government before it (imminently) publishes the two reports due under the Data (Use and Access) Act 2025 (see our earlier blog). Thankfully, those reports are due to be published by 18 March, so we won’t have too long to wait to see them. We understand, however, that the UK government won’t be setting out its policy position in those reports, opting instead to “reset” its approach and carry on consulting on these points - effectively kicking the can further down the road. We might, however, see the government confirm that it has dropped its proposals for a broader TDM exception (if that is indeed its conclusion), as well as provide details of the steps and timetable it plans to follow to reach its final decision on the way forward, as requested by the CDC.
The EP Resolution is perhaps slightly more surprising, given that the EU is notably more advanced than the UK in plotting its way forward (with the AI Act and its existing copyright exceptions for text and data mining already in place). However, it is clear that, as in the UK, significant pressure is being applied by rights holders who are unhappy with the existing state of affairs.
This will be worrying for AI developers, as many of the proposals would impose additional (and, they may argue, unworkable) costly obligations and administrative burdens on them. However, this is all part of a healthy debate on where the balance should fairly lie, and we look forward to seeing where it all lands (if it ever does!).