Lord Davies of Brixton
Grand Committee
My Lords, I will speak to Amendment 46, which comes from a slightly different angle. In its report AI in the UK: Ready, Willing and Able?, the AI Lords Select Committee, which I chair, expressed its strong belief in the value of public sector procurement of AI applications. However, as a recent research post put it:
“Public sector bodies in several countries are using algorithms, AI, and similar methods in their administrative functions that have sometimes led to bad outcomes that could have been avoided.”
The solution is:
“In most parliamentary democracies, a variety of laws and standards for public administration combine to set enough rules to guide their proper use in the public sector.”
The challenge is to work out what is lawful, safe and effective to use.
The Government clearly understand this, yet one of the baffling and disappointing aspects of the Bill is its lack of connection to the many government guidelines that apply to the procurement and use of technology such as artificial intelligence, and to the use and sharing of data, by those contracting with government. Unbelievably, it is almost as if the Government wanted to be able to issue guidance on the ethical aspects of AI and data without at the same time being accountable if those guidelines are breached and without any duty to ensure compliance.
There is no shortage of guidance available. In June 2020, the UK Government published guidelines for artificial intelligence procurement, which were developed by the UK Government’s Office for Artificial Intelligence in collaboration with the World Economic Forum, the Government Digital Service, the Government Commercial Function and the Crown Commercial Service. The UK was trumpeted as the first Government to pilot these procurement guidelines. Their purpose is to provide central government departments and other public sector bodies with a set of guiding principles for purchasing AI technology. They also cover guidance on tackling challenges that may occur during the procurement process. In connection with this project, the Office for AI also co-created the AI procurement toolkit, which provides a guide for the public sector globally to rethink the procurement of AI.
As the Government said on launch,
“Public procurement can be an enabler for the adoption of AI and could be used to improve public service delivery. Government’s purchasing power can drive this innovation and spur growth in AI technologies development in the UK.
As AI is an emerging technology, it can be more difficult to establish the best route to market for your requirements, to engage effectively with innovative suppliers or to develop the right AI-specific criteria and terms and conditions that allow effective and ethical deployment of AI technologies.”
The guidelines set out a number of AI-specific considerations within the procurement process:
“Include your procurement within a strategy for AI adoption … Conduct a data assessment before starting your procurement process … Develop a plan for governance and information assurance … Avoid Black Box algorithms and vendor lock in”,
to name just a few. The considerations in the guidelines and the toolkit are extremely useful and reassuring, if not as comprehensive or risk-based as some of us would like, but where does any duty to adhere to the principles they reflect appear in the Bill?
There are many other sets of guidance applicable to the deployment of data and AI in the public sector, including the Technology Code of Practice, the Data Ethics Framework, the guide to using artificial intelligence in the public sector, the data open standards and the algorithmic transparency standard. There is the Ethics, Transparency and Accountability Framework, and this year we have the Digital, Data and Technology Playbook, which is the government guidance on sourcing and contracting for digital, data and technology projects and programmes. There are others in the health and defence sectors. It seems that all these are meant to be informed by the OECD’s and the G20’s ethical principles, but where is the duty to adhere to them?
It is instructive to read the recent government response to Technology Rules?, the excellent report from the Justice and Home Affairs Committee, chaired by my noble friend Lady Hamwee. That response, despite some fine-sounding phrases about responsible, ethical, legitimate, necessary, proportionate and safe AI, displays a marked reluctance to be subject to specific regulation in this area. Procurement and contract guidelines are practical instruments to ensure that public sector authorities deploy AI-enabled systems that comply with fundamental rights and democratic values, but without any legal duty backing up the various guidelines, how will they add up to a row of beans beyond fine aspirations? It is quite clear that the missing link in the chain is the lack of a legal duty to adhere to these guidelines.
My amendment is formulated in general terms to allow for guidance to change from time to time, but the intention is clear: to make sure that the Government turn aspiration into action and to prompt them to adopt a legal duty and a compliance mechanism, whether centrally via the CDDO, or otherwise.
My Lords, I am speaking to my Amendments 128 and 130, although the issues raised there have already been addressed by earlier speakers. I fully support the amendments spoken to by the Front Bench and Amendment 57 tabled by the Liberal Democrats.