THE LENS
Digital developments in focus

ICO looks at individuals’ rights in GenAI

The fourth chapter of the ICO’s series consulting on GenAI focuses on individuals’ rights and how to engineer these into GenAI models.

The chapter does not consider the right to rectification, which was covered in a previous chapter (see our blog), or the rights relating to automated decision-making (for which the ICO links to its existing guidance instead). 

The ICO focuses mainly on the development stage, i.e. the training and fine-tuning of GenAI models, but emphasises that organisations must have processes in place to enable and record the exercise of individuals’ rights throughout the entire AI lifecycle, including the output and user query stages. 

The right to be informed 

The ICO considers both when personal data is provided directly by an individual and when it is collected from other sources. The ICO helpfully calls out the impossibility/disproportionate effort exemption in the latter case, which it says is likely to apply to web-scraped datasets. However, the ICO then notes that the processing of web-scraped data for the purposes of developing GenAI models is likely to be beyond people’s reasonable expectations. Arguably, on this basis, such processing would also fall foul of the concepts of fairness and purpose limitation. 

If GenAI developers rely on this exception, the ICO still expects them to take measures to make privacy information publicly available, including by providing: 

  • specific and accessible information on data sources and types (i.e. not just ‘publicly accessible information’) and on the purposes and lawful basis of the processing; and
  • prominent and accessible mechanisms for individuals to exercise their rights.

Publishing the above on a company website would likely suffice, as acknowledged by the ICO in its general guidance on the right to be informed.  

In what seems to be a slight aside, the ICO states that resource or expense requirements should be factored into business decisions from the very start, given the requirement to apply a privacy by design and default approach to ensure transparency, fairness and accountability. Further detail would be welcome on how to do this in practice and what test to apply (e.g. reasonableness).

Right of access

The ICO states that developers should have accessible, clear, easy-to-use, documented and evidenced methods to deal with access requests at all stages. If developers cannot identify an individual’s personal data, the burden is on them to explain this to the individual, who may then choose to provide more information to help with identification. 

Rights to erasure, to restriction of processing and to object to processing

The ICO has little to add at this stage to its previous guidance on these rights and is clearly in information-gathering mode on how they are exercised in practice. 

There is an interesting reference to the ‘memorisation’ issues that GenAI models present. The ICO explains that this occurs because, during training, these models retain imprints of personal data in order to ‘learn’, and can therefore unintentionally output sections of the training data they have ‘memorised’ without being explicitly asked to do so. The ICO says it is aware that input and output filters are used to mitigate this but, again, it is looking for practical evidence of the effectiveness of these filters and of other approaches such as ‘machine unlearning’. 

The ICO also asks for evidence on the mitigation measures to protect against unfairness and statistical inaccuracy of the model itself where individuals exercise their rights as a group. 

Next steps

The consultation closes on 10 June 2024, with the next chapter focusing on controllership. It continues to be encouraging to see the ICO engage with industry and seek practical examples and views. However, this fourth chapter is quite light on detail, perhaps reflecting the fact that, in practice, certain individual rights are exercised less often than others. It will therefore be interesting to see how this evolves. 