As organisations look to procure AI, many are developing contractual clauses to manage AI-specific risks. But what should those clauses include? And will they help compliance with laws such as the EU’s AI Act? The European Commission has recently updated its working draft model contractual clauses for the public procurement of AI (MCC-AI), which are a helpful reference point for public bodies and private companies alike. The MCC-AI, which replace the previous draft issued in September 2023, were published alongside a commentary document which provides guidance on how to use and apply the clauses in practice. The commentary, together with the latest version of the MCC-AI, can be found here.
How do the clauses work?
As with the previous draft of the model contractual clauses, the MCC-AI come in two versions:
- the MCC-AI High-Risk for the procurement of AI systems that are classified as high risk under the EU AI Act – see our detailed briefing here for further details on what AI systems may fall within this definition; and
- the MCC-AI Light for the procurement of AI systems that are not classified as high-risk but whose use could still pose risks to individuals’ health and safety or fundamental rights.
The Commission does, however, note that even where an AI system does not pose any of the risks outlined above, contractual arrangements should still be put in place setting out relevant requirements in relation to the AI system.
The MCC-AI are intended to be appended to a main agreement and so do not contain all necessary provisions. For example, the commentary acknowledges that the MCC-AI do not deal with areas such as delivery deadlines, acceptance, intellectual property, data protection and liability.
Who should be using the MCC-AI?
The MCC-AI remain entirely voluntary, but are targeted at EU public bodies seeking to procure AI systems. That said, many of the provisions in the MCC-AI will be relevant, and a helpful reference point, for any private company looking to procure AI systems as well.
What do the MCC-AI include?
In terms of content, the MCC-AI generally align with the previous version of the clauses, except that the provisions have been updated to align with the now-enacted EU AI Act. The MCC-AI High-Risk version:
- sets out the technical requirements for the AI system, including details relating to risk management, data governance, transparency as well as accuracy, robustness and cybersecurity (which generally align with the requirements for high-risk AI systems in Section 2 of Chapter III of the EU AI Act);
- defines the supplier’s key obligations, including ensuring compliance with the relevant technical requirements as well as obligations relating to quality management and conformity assessment (which generally align with the high-risk provider obligations in Section 3 of Chapter III of the EU AI Act);
- requires the supplier to cooperate with the public body in carrying out any fundamental rights impact assessment and in explaining the role of the AI system in any decision-making process affecting individuals;
- sets out the scope of each party’s right to access and use the other party’s data sets (as well as any third party data sets);
- includes an indemnity: the supplier indemnifies the public body for any claims arising out of any infringement of intellectual property or data protection rights resulting from the public body’s use of the AI system or any supplier-provided data sets (and the public body provides an equivalent indemnity in respect of any data set provided by the public body); and
- grants the public body audit rights allowing it to monitor the supplier’s compliance.
The MCC-AI Light version generally mirrors the MCC-AI High-Risk version, except that the supplier’s obligations to implement quality management systems, carry out a conformity assessment and provide information for the AI register, as well as the public body’s audit right, have been removed. This approach is interesting, given it contractually imposes many EU AI Act requirements and obligations on suppliers where the AI systems would not be subject to these requirements or obligations under the EU AI Act. Based on the previous version of the MCC-AI Light, the thinking behind this appears to be to improve the trustworthiness of the AI system procured by the public body. The latest commentary does, however, acknowledge that taking this approach may not be proportionate or appropriate in all circumstances, and it will be interesting to see how suppliers respond to this in practice.
Commentary
Some of the criticism of the previous version of the MCC-AI was that the clauses were not detailed enough and largely just tracked the regulatory obligations imposed on the supplier. The latest versions of the MCC-AI do not substantially depart from this approach. Relying on the MCC-AI will therefore still require the parties to set out much of the required detail, including key technical requirements and measures, and to incorporate the MCC-AI into a main agreement. The updated commentary does not provide much guidance on how this will work in practice. It is also unclear how successful organisations will be in imposing the MCC-AI Light, given they go beyond the legal requirements of the EU AI Act. That all said, the MCC-AI remain a helpful resource for any public body, as well as any private company, looking to procure AI systems.