THE LENS
Digital developments in focus

“I still haven’t found what I’m looking for” – back to square one after UK government abandons voluntary code of practice on copyright and AI

The UK government confirmed in its recent AI white paper consultation response (more on that here) that the technical working group convened by the UKIPO last summer (see here) has failed to reach agreement on what a voluntary code of practice on copyright and AI might look like. 

What’s at stake?

Generative AI models are trained on massive amounts of data which typically come from publicly available sources (e.g. open datasets and web crawling/scraping). Often, the data will include content protected by copyright, such as artwork, books, music or photographs, which, if copied without permission from the relevant rights holders or outside a statutory exception, carries infringement risk. While there are a number of exceptions to copyright infringement under English law, their narrow formulations make them of limited use to most AI developers. In particular, the text and data mining (“TDM”) exception is limited to use for non-commercial research purposes and doesn’t protect against infringement of other potentially relevant IP rights like database rights.

Generative AI providers say their current practices are (or at least ought to be) legal. Rights holders disagree, arguing that use of their content in training generative AI tools without a licence amounts to infringement. So this debate, and where the balance should fairly lie, goes right to the very heart of whether or not the current approach to training generative AI is workable.

What has happened so far?

As we reported previously, in the summer of 2022, following a second round of consultation, the UKIPO announced plans to introduce a new exception covering both copyright and database rights which would allow TDM for any purpose, with no ability for rights holders to opt out or contract out. That was met with strong criticism from the creative industries, which led the UK government to U-turn on its plans for legislative reform in this area.

Instead, in the summer of 2023, following a recommendation from Sir Patrick Vallance, policymakers tasked the UKIPO with brokering consensus between AI developers and rights holders in the form of a voluntary code of practice on copyright and AI. Although the code was initially promised by “the end of the Summer”, we heard very little about progress. But, as recently as 11 January 2024, the UK government reiterated its commitment to the code in its response to the Culture, Media and Sport Committee of the UK House of Commons’ report on connected tech, AI and creative technology.

A further intervention in the debate came from the Communications and Digital Committee of the UK House of Lords, which published its report on large language models and generative AI just over a couple of weeks ago, on 2 February (the “CDC Report”).

Pointing to the benefits that “upholding a globally respected copyright regime” brings to the UK, the Communications and Digital Committee reaffirmed the core principles of copyright law (rewarding creators for their efforts, preventing others from using works without permission, and incentivising innovation) before concluding that the current legal framework falls short of achieving them in the context of generative AI and that the government has a duty to act.

The CDC Report is peppered with hints of frustration over Westminster’s failure to articulate a clear stance on the issue. It recommends, among other things, measures to empower creators to exercise their rights under copyright law (e.g. on an opt-in or opt-out basis) and a mechanism allowing rights holders to check training data, and it clearly disagrees with leaving matters to the courts.

It is in that context that the government published its response to the AI white paper consultation and confirmed that the working group that was set up to agree the code of practice has failed to reach an effective agreement.

What’s next?

With a voluntary code of practice off the table, responsibility is back with DSIT and DCMS ministers, who are to engage with AI developers and rights holders anew in their search for solutions.

The UK government’s focus remains on fostering a partnership between AI developers and content creators/owners. It does, however, indicate that it is looking at “mechanisms for providing greater transparency so that rights holders can better understand whether content they produce is used as an input into AI models”, as well as the attribution of outputs.

Importantly, the government has also never ruled out legislative intervention. Whether or not new legislation is forthcoming remains to be seen, but it is now looking more likely. 

Comment

Getting two sides with diametrically opposed views to agree was always going to be a thorny assignment. However, while not unexpected, a return to square one will be a disappointing outcome for AI developers and rights holders alike. Uncertainty remains over a complex legal question that is being examined against the backdrop of a field evolving at unprecedented speed.

The CDC Report makes it clear that stakeholders are growing impatient with the absence of guidance from the UK government and do not consider it fair to leave the AI technology and creative industries in limbo while the issue is settled in court, a process which may last several years once appeals are taken into account.

And so, with policymakers promising to provide further proposals on the way forward “soon”, the waiting game continues.

“Solving this ‘Goldilocks’ problem of getting the balance right between innovation and risk, with limited foresight of market developments, will be one of the defining challenges of the current generation of policymakers.” (CDC Report)


Tags

ai, ip, emerging tech