The ICO and the Competition and Markets Authority (CMA) have worked together for a number of years to enhance regulatory coherence and clarity where the data protection and competition regimes interact, including through their participation in the Digital Regulation Cooperation Forum. Unsurprisingly, one of their key areas of joint focus in recent years has been AI.
On 25 March, they published a joint article focusing on foundation models (FMs) and outlining the steps that developers and deployers can take to support competitive and innovative FM markets while protecting consumers and respecting people’s information rights.
By way of reminder, FMs are base models for AI systems that are trained on large amounts of data, can generate outputs such as text, images and audio, and can be adapted to a range of tasks. The joint article explains the ways in which FMs can be released, on a spectrum between closed and open access (recognising that this is not always a binary choice and that an FM’s release approach may change over time):
- with open access, some or all of an FM’s key assets – such as its architecture, code, weights and biases, and data – are available for anyone to view, modify and use when the model is released. This type of release is similar to open-source software.
- a closed-access release approach involves few or none of the FM’s key assets being open or publicly accessible. Instead, access is controlled and shared only to the extent that the FM developer chooses. Closed-access models may be deployed by businesses for their own internal use and operations, or the developer may make them available to external parties to use – such as on a paid-for basis via an Application Programming Interface (API) – while maintaining control over the FM.
The ICO and CMA are keen to emphasise that they do not endorse or favour one approach over the other. It is for businesses and developers to choose the approach that is most appropriate to their specific situation. What matters more is that appropriate risk mitigations and safeguards are in place to support effective data protection compliance and protection for consumers. For example, developers of FMs trained on personal data could:
- implement technical and organisational measures, in line with data protection by design and default, when releasing open-access FMs. This could include using licences or terms of use with effective data protection requirements and safeguards to ensure that deployers are using their models in a compliant way.
- rely more on technical controls, e.g. APIs, to help monitor and guard against data misuse downstream in the context of closed-access FMs.
- drive appropriate transparency, including by providing information to deployers about how an FM has been developed (for both open- and closed-access models). This will help deployers make informed decisions about personal data processing and demonstrate accountability for their own data protection and consumer protection compliance, ultimately benefiting the user.
The ICO and CMA also draw attention to their 2021 Joint Statement and highlight the recommendations from it that are relevant to FMs, including promoting user choice and control; creating a level playing field for data access; and allocating accountability across the supply chain.
Finally, looking ahead, the ICO and the CMA mention the broad range of ongoing collaboration and projects they are, or are likely to be, involved in together, including online advertising issues and Strategic Market Status investigations under the CMA’s new digital markets regime.
Given the current multitude of intersecting areas of digital regulation, it is always encouraging to see regulators co-operate. Hopefully, through continued collaboration such as this and engagement with stakeholders (which the ICO and the CMA specifically call for in the article in relation to AI and FMs), some of the experience we are seeing in practice will soon be reflected in further guidance.