The ICO has published new draft guidance (the draft guidance) for organisations using automated decision making. The draft guidance provides support for organisations in applying the new provisions in Articles 22A-D of the UK GDPR, which were introduced by the Data (Use and Access) Act 2025 (DUA Act), as discussed in this blog.
What does the draft guidance say?
Perhaps unsurprisingly, given the additional detail on automated decision making (ADM) added to the law to balance supporting innovation and protecting individuals, the draft guidance is relatively dense. In particular, it now needs to address multiple scenarios and decision points, for example, the more relaxed position where ADM is used but no special category data is processed, as well as the more onerous rules where special category data is processed.
While the ICO has included more content (and examples) endorsing the use of ADM by businesses to support AI and innovation, it’s clear the DUA Act changes aren’t a green light for a more relaxed approach to ADM. Significant compliance analysis, documentation and rigorous processes to protect individuals will still be required.
However, the draft guidance does contain some helpful new insights on previous areas of uncertainty, including in relation to:
- Decisions: More analysis is provided on what amounts to a decision for the purposes of the ADM rules. The term is viewed as having broad meaning, relating to “a conclusion or outcome, reached after consideration or analysis” where the conclusion may impact actions taken or engage a person’s rights. Systems that just apply a rule already set by a human won’t engage the ADM rules (e.g. to accept certain payment cards not others) whereas systems that evaluate information about a person and pass judgement (e.g. eligibility for a rental property) will.
- Significant decisions: There is new recognition that ‘significance’ is contextual. What will have a ‘similarly significant effect’ on one individual may not for another, with the impact of decisions on vulnerable people and children being emphasised. Unless such cases can be identified with confidence, ADM safeguards should be applied to all decisions.
  The ICO lists various types of impact which may indicate a decision has significance, such as impacts on finance, health, employment, behaviour and choices. New examples given to support this widely drawn list appear to pave the way for a broader set of decisions (and market sectors) to fall within the ADM rules. For example, the ICO cites the potential for ‘recommendations’ that ‘nudge’ teenagers towards content promoting unhealthy eating as an example of a decision with a behavioural impact, and ‘dynamic pricing and discriminatory offers’ as examples of decisions that may impact ‘choices’.
- Meaningful human involvement: As before, the guidance emphasises that human involvement must be active and not tokenistic to take a decision outside the ADM rules, but greater emphasis is now put on the capacity of the person involved. The person must be “suitably trained and qualified to understand the system’s logic, outputs, limitations, and risks”. Ad hoc spot checking isn’t sufficient and timing is critical; human involvement must come before the decision is applied to a person.
- Individual protections (privacy notices aren’t enough): Although brought together in one new section of the UK GDPR, individuals’ rights in relation to ADM largely persist from the previous version of the law. However, the ADM information right (distinct from the right to be informed and information given in privacy notices) is given greater prominence in the draft guidance. Individuals must be given information to allow them to “meaningfully understand the decision and the specific aspects of their case that influenced it”. This must be “decision-specific information about the actual outcome, rather than simply repeating information you provide in your privacy notice”.
Broader context and next steps
The draft guidance is a good indicator of the ICO’s developing policy positions in relation to ADM (and AI) which will feed through to the regulator’s AI and ADM statutory code of practice that we are expecting this year. The positions outlined are currently subject to consultation (ending on 29 May 2026) and may yet evolve.
To accompany the draft guidance, the ICO has also published a suite of materials on ADM in the recruitment context, including a statement calling for action by businesses and a report on ICO findings from its regulatory focus on ADM to date and its expectations. These publications show that in line with its documented strategic focus, the regulator is prioritising work (and enforcement) on ADM and how it operates in practice. Organisations should take note of the direction of travel and start to review their processing and compliance practices accordingly.