As readers of this blog will know, the final version of the EU’s General Purpose AI (GPAI) Code of Practice was due to be published by 2 May. We, like many others, had been keeping a keen eye out for that final draft, but, alas, the deadline has come and gone without anything materialising.
The EU AI Office has now confirmed that the Code has indeed been delayed, with the final version due to be published “by August”, suggesting there may not be much time (if any) between the final draft being released and the EU AI Act’s provisions relating to GPAI model providers coming into force on 2 August.
What is the GPAI Code?
For those new to this, the GPAI Code is a voluntary code of practice, written by “independent experts” with stakeholder engagement, that is designed to help providers of GPAI models demonstrate compliance with their obligations under Articles 53 and 56 of the EU AI Act. Those obligations include provisions relating to transparency, copyright, and safety and security (see our earlier blogs here, here and here).
The majority of the commitments set out in the GPAI Code only apply to providers of GPAI models with systemic risk, but a couple apply to all providers of GPAI models that are placed on the EU market. That includes a commitment and related measures relating to copyright, which have proved controversial and which many people have been following closely.
Why the delay?
At the time of writing, no official statements have been publicly released by the EU AI Office to explain the delay. But press reports have suggested two main reasons have been given to stakeholders participating in the drafting process.
- To give participants more time to feed back on the third draft of the GPAI Code.
- To allow time for participants to respond to the EU Commission’s separate and ongoing consultation on its proposed draft GPAI guidelines (open until 22 May), which also seek to clarify certain aspects of GPAI model providers’ obligations under the EU AI Act. This includes guidance on key questions such as: What is a GPAI model? Which entities will be “providers”? And what does “placing on the market” mean? Importantly, the draft guidelines also address the effects of signing and adhering to the GPAI Code.
Others have suggested that the delay will also give the EU AI Office time to assess support for the GPAI Code from the major AI providers – after all, a key part of whether this Code will ultimately be successful will come down to whether GPAI model providers actually sign up to and adhere to it.
Comment
This delay is not entirely unexpected. Getting all stakeholders to align on the form of the GPAI Code was always going to be a tall ask, particularly given the contentious areas covered such as those relating to copyright (where we’ve also seen the UK government try and fail – see our blog).
With both sides holding strong, diametrically opposed views on some of the areas covered, however, the question arises of whether a political solution will ultimately be required. Indeed, that’s something the EU legislators clearly foresaw: the AI Act makes clear that if the GPAI Code isn’t finalised by 2 August, or if the AI Office deems the final draft inadequate following its assessment, the EU Commission may provide “common rules” through an implementing act.
It’s worth remembering, however, that the areas covered in the GPAI Code are not the only barriers facing big tech AI model developers wishing to launch AI products in the EU – they also face challenges from European data regulators grappling with GDPR questions on training with personal data. Most recently (in April 2025), the Irish Data Protection Commission announced an investigation into how the publicly accessible posts of EU users on the X platform are being used to train X’s Grok LLMs, focusing particularly on the processing of personal data in those posts and the lawfulness and transparency of that processing. A German consumer rights association has also recently warned Meta about its AI training plans involving content from Facebook and Instagram, a warning that has received public support from the privacy campaign group noyb.
With so much going on in this space, there is a lot to keep an eye on! But we will continue to watch this very closely and report any further material updates here on the Lens.