In the first few weeks of 2023, the ICO has continued to reiterate its commitment to exercising its full spectrum of enforcement powers. Monetary penalties increasingly form only a small part of the UK’s regulatory picture as the ICO shifts from blunt, “whack-a-mole” enforcement towards a more targeted, risk-based approach (a trend we discussed in our recent article, The year in UK GDPR regulatory enforcement action).
In January 2023, the ICO published a letter to North Ayrshire Council, following its enquiry into the Council’s use of facial recognition technology (FRT) to manage ‘cashless catering’ in nine school canteens. The regulator ultimately concluded that, although it may be possible to deploy FRT lawfully in schools, in this case the deployment infringed UK GDPR provisions relating to lawful basis and consent, transparency, data protection impact assessments, data retention, data minimisation and accuracy. In particular, the ICO found that:
- the Council had not obtained valid explicit consent for the processing of special category data, in part because the consent was unlikely to have been freely given “because of the way the information was worded and how the introduction of FRT was being presented”. The ICO suggested that, to remedy this, the Council should have provided an alternative option that was as easy for the children to use as FRT, and should have made clear to parents and pupils that there was no requirement to consent to FRT in order to obtain a school lunch; and
- the content of the Council’s privacy notice did not fully explain to children in ‘child friendly terms’ the potential impacts and complexity of the processing of biometric data.
Crucially, the letter sets out the ICO’s intention to publicise key learnings from the enquiry, including through publication of the letter itself, a case study and social media, to benefit other education authorities considering the use of similar technologies. Further guidance for private (and public) sector organisations on the use of biometric technologies is expected from the ICO later this spring, following the regulator’s warning in October about the risks posed by immature biometric technologies (discussed in our blog here).
The publication of these resources chimes with the Information Commissioner’s recent commitments to expand the resources available to organisations: a more constructive regulatory approach designed to help organisations take control of their own risk-based compliance. It also reflects the spirit of the ICO’s recent decisions to publish both reprimands and data breach investigation details, increasing transparency and accountability (consistent with its ICO25 commitments) and creating certainty for controllers and data subjects.
In the same vein, in early February the ICO announced a decision to ease the reporting burden for communications service providers (CSPs). The strict 24-hour personal data breach reporting deadline under the UK’s Privacy and Electronic Communications Regulations (PECR) will no longer be enforced in trivial cases (those unlikely to result in a risk to individuals’ rights and freedoms), provided CSPs notify the regulator within 72 hours.
The ICO revealed that it receives around 10,000 notifications under Regulation 5A PECR every year, but that the majority are small-scale and do not involve harm, “usually [resulting] from human error and only [affecting] a small number of individuals”. By helping CSPs to minimise this regulatory burden, the ICO is continuing to demonstrate its appetite to reduce organisations’ excuses for non-compliance and to direct resources towards more effective protection of individuals’ rights and freedoms in line with its ICO25 regulatory approach.
Further clarity on the impact of the ICO’s reframed enforcement priorities should come with the long-awaited results of the ICO’s consultation on its draft regulatory action policy and statutory guidance, which were expected by the end of last year.