Public Authority Algorithmic and Automated Decision-Making Systems Bill [HL] Debate
Baroness Jones of Whitchurch (Labour - Life peer), Department for Business and Trade
Lords Chamber

My Lords, I thank all noble Lords for contributing to a very insightful debate. I particularly welcome the noble Baroness, Lady Lane-Fox, to her new role chairing the board of the new digital centre of government. I am sure she will have a great contribution to make in debates of this kind. I also thank the noble Lord, Lord Clement-Jones, for bringing forward the Bill.
The Government understand the intent of the Bill, in particular its focus on the safe, responsible and transparent use of algorithmic and automated decision-making systems in the public sector. However, for the reasons I will now outline, the Government would like to express reservations about the noble Lord’s Bill.
The Government of course believe that such systems have a positive role to play in the public sector. As many noble Lords have said, they can improve services, unlock new insights, deliver efficiencies and give citizens back their time. However, they must be used in ways that maintain public trust. The noble Lord, Lord Clement-Jones, highlighted some shocking examples of where threats of bias and racism, for example, have undermined public trust, and these issues need to be addressed.
We know that transparency is a particularly important driver of rebuilding that trust and delivering fairness. That is what the Algorithmic Transparency Recording Standard, or ATRS, aims to address. The noble Lord asked about its status in government. The ATRS is now mandatory for all government departments. This mandate was agreed in cross-government policy. The ATRS is also recommended by the Data Standards Authority for use across the broader public sector, and the standards will become publicly available on GOV.UK.
The initial groundwork to comply with this mandate is complex, particularly for large organisations. They must identify and assess algorithmic tools from across multiple functions, engaging many individuals and multidisciplinary teams. However, I am pleased to reassure my noble friend Lord Knight and other noble Lords that a number of these records have now been completed under the mandatory rollout, and the Government will publish them in the coming weeks.
The ATRS complements the UK’s data protection framework, which provides protections for individuals when their personal data is processed. The technology-neutral approach of the data protection framework means that its principles, including accuracy, security, transparency and fairness, apply to the processing of personal data regardless of the technology used.
The framework provides additional protections for solely automated decision-making which has a legal or significant effect on individuals. It places a requirement on organisations to provide stringent safeguards for individuals where this type of processing takes place, so that they are available when they matter most. These rules apply to all organisations, including the public sector.
I agree, though, with the noble Baroness, Lady Hamwee, that there are specific responsibilities for clarifying and building our trust relationship with the state. I also agree with my noble friend Lord Knight that we have to be particularly sensitive about how we handle protections at work, given their significance to the individuals involved. To ensure that these rules are effective in the light of emerging technologies and changing societal expectations, the Government have introduced reforms to these rules in the Data (Use and Access) Bill, which is currently in Committee in the Lords. I have been engaging with noble Lords on this topic and look forward to further debates on these issues next week.
The Government are confident that these reforms strike the right balance between enabling organisations to make the best use of automated decision-making technology to support economic growth, productivity and service delivery, and maintaining high data protection standards and public trust. I am grateful to the noble Baroness, Lady Freeman, and the noble Lord, Lord Tarassenko, for their specific insights, which will help us finesse our policies on these issues as we go forward.
We recognise that our approach to technology can sometimes be too fragmented across the public sector. To help address this, the Government are establishing a revitalised digital centre of government, with further details to be announced shortly. This transformation is being overseen by a digital inter-ministerial group, which will be a powerful advocate for digital change across government, setting a clear expectation of when standards such as the ATRS must be adopted. The combination of the ATRS policy mandate and the establishment of the digital centre is moving us towards a “business as usual” process for public sector bodies to share information about how and why they use algorithmic tools.
I turn to the key proposals in the noble Lord’s Bill. The Bill would require public authorities to complete a prescribed algorithmic impact assessment and an algorithmic transparency record prior to the deployment of an algorithmic or automated decision-making system. Public authorities would be required to give notice on a public register when decisions are made wholly or partly by such systems, and to give affected individuals meaningful information about these decisions. Further provisions include monitoring and validating performance, outcomes and data; mandatory training; a prohibition on the procurement of certain systems; and redress. The technical scope of the Bill is broadly similar to that of the ATRS.
The ATRS was deliberately made mandatory via cross-government policy rather than legislation in the first instance. This was to enable better testing and iteration of the ATRS; that ethos still applies. Since the introduction of the policy mandate for the ATRS, we have seen significant progress towards adoption. We are confident that the foundations are in place for a smooth ongoing approach to government algorithmic transparency, delivered from the new digital centre.
Completing and publishing ATRS records also has benefits beyond transparency. A field on risks and mitigations enables references to other resources, such as data protection impact assessments. A field on alternative solutions asks how tool owners know that this tool was the right one to deploy and, indeed, whether an algorithmic tool was necessary at all. The ATRS therefore encourages a holistic view of how the impact of the tool has been considered and potential negative outcomes avoided, overlapping considerably with the requirements of the algorithmic impact assessment that the noble Lord has proposed. As such, we do not believe that legislation for either mandatory transparency records or AIAs for public authorities is necessary at this time.
As I set out earlier, under the data protection framework, individuals already have the right to specific safeguards where they have been subject to solely automated decisions with legal or significant effects on them. These safeguards include the right to be told about a decision, the right to obtain human intervention and the right to challenge the decision. Our reforms under the Data (Use and Access) Bill specifically provide that human involvement must be meaningful. This is to prevent cursory human involvement being used to rubber-stamp decisions as though there had been meaningful oversight.
Where an individual believes that there has been a failure to comply with data protection legislation, they can bring a complaint to the independent data protection regulator, the Information Commissioner’s Office. The ICO has the authority to investigate and to impose significant penalties for non-compliance, providing robust safeguards against the misuse of personal data. Therefore, the noble Lord’s proposals are also broadly covered under the existing data protection framework.
The data protection framework also requires organisations to carry out data protection impact assessments prior to any processing likely to result in a high risk to the rights and freedoms of individuals, in order to identify and mitigate such risks.
To summarise, the Government believe that transparency in public sector algorithmic and automated decision-making is crucial both to building public trust and to accelerating innovation. Meaningful transparency should not merely identify the existence of such systems but also discuss their purpose and effectiveness. The ATRS provides an established and effective mechanism to deliver this transparency.
The Government are also committed to maintaining the UK’s strong data protection framework while delivering on the DSIT Secretary of State’s priorities of accelerating innovation, technology for good, and modern digital government through the Data (Use and Access) Bill.
The noble Baroness, Lady Lane-Fox, is quite right to identify the need to upskill civil servants. That need has certainly been identified within my department, and it is part of the wider need to upskill everyone for the future. Everyone in this generation and the next will need those skills to make the most of the exciting technological opportunities ahead, so we all have a responsibility to keep our skills up to date.
We look forward to continuing to engage with noble Lords on these important issues as we develop our approach, and to the many further opportunities we will have to do so, starting with our debates on Monday. If I have missed anything out, including the specific questions that the noble Viscount, Lord Camrose, asked at the end, I will follow up in writing.