Digital developments in focus

"This is not a drill" - DSIT publishes Responsible AI Toolkit

On 24 March 2024, the Department for Science, Innovation and Technology (DSIT) published its Responsible AI Toolkit (the “Toolkit”) - a repository for the latest guidance and research in responsible AI development and use. 

The Toolkit was developed by the recently rebranded Responsible Technology Adoption Unit (the “RTA”), formerly known as the Centre for Data Ethics and Innovation (the “CDEI”). The CDEI was originally established in 2018 to identify measures to “maximise the benefits of data and Artificial Intelligence for our society and economy”. The RTA’s rebrand arguably hints at a shift in the government’s approach to AI regulation: from an unabashedly pro-innovation starting point (in its original AI white paper) to a more balanced stance that also focusses on safety.

What’s in the Toolkit? 

The Toolkit currently has three compartments:

  1. AI Assurance Toolkit: 
Assurance was a common theme of the UK government’s “Pro-innovation approach to AI regulation” white paper (29 March 2023), which foresaw “an important leadership role for the UK in the development of the global AI assurance industry, including auditing and safety”. The adoption of technical standards and assurance techniques is seen as critical to supporting the government’s proposed AI regulatory framework and to fostering public trust and confidence in the deployment of AI more generally. The focus on assurance and trustworthy AI is by no means a new mandate for the RTA, which was responsible for the world’s first roadmap for catalysing the development of an AI assurance ecosystem in 2021 (see more here).
The AI Assurance Toolkit will collate guidance and reports on assurance techniques and technical standards to support anybody involved in designing, developing, deploying or procuring AI-enabled systems. For example, its most recent publication provides guidance to employers using AI systems in their recruitment processes to analyse and filter CVs. The choice of AI in recruitment as a topic is an interesting one, given it has also been identified as a “high-risk” area under the EU AI Act.
  2. Algorithmic Transparency Recording Standard (ATRS): 
The Toolkit also includes guidance on the ATRS, which is designed to help public sector organisations provide greater transparency about the algorithmic tools they use and their reasons for using them. The government announced, as part of its AI white paper consultation response in February, that use of the ATRS will become mandatory for all central government departments, and the Cabinet Office already followed the standard this January when disclosing its use of an automated tool to determine which digital records to retain for their historical value. The standard will be reviewed and updated every 6 months and may be rolled out to the broader public sector over time.
  3. Research into public attitudes towards data and AI: 
    The final compartment of the Toolkit contains the results of the RTA’s annual Public Attitudes Tracker Survey designed to measure how public attitudes to data-driven technology and AI have varied over time, with the most recent results published in February of this year (see more here). According to Viscount Camrose (Minister for Artificial Intelligence and Intellectual Property), the “findings from the Tracker Survey, which is the first in the world of its kind, will continue to underpin the government’s approach to AI and data”. 

A complete set? 

The RTA intends for the Toolkit to act as a living repository for guidance on the responsible use of AI systems, which “will be updated over time with new resources”. It seems quite light on content at present, but given the plethora of UK government publications on responsible AI use and development, it would certainly be helpful if the Toolkit did become the exhaustive one-stop-shop for government guidance on this topic.