Technology Rules: The Advent of New Technologies in the Justice System (Justice and Home Affairs Committee Report) Debate

Department: Home Office

Monday 28th November 2022

Grand Committee
The Parliamentary Under-Secretary of State, Home Office (Lord Sharpe of Epsom) (Con)

My Lords, I thank all noble Lords who have spoken in the debate today and particularly the noble Baroness, Lady Hamwee, for securing the debate. I also thank those who contributed to the Justice and Home Affairs Committee’s thoughtful and insightful report, which has paved the way for today’s discussion.

As the noble Baroness has made clear, the Government responded to that report in June, but it is nevertheless welcome that we have found time to discuss these important matters more fully. I hope this is not the last time we cover the topic; I suspect it will not be. I will remark briefly on the broad thrust of the committee’s report and the Government’s position, as well as on points made during this debate, while also—I am afraid—having to join the noble and learned Lord, Lord Hope, by admitting that I am not much good with my thumbs either.

I am not sure that this line is going to qualify as “riding to the rescue”, but there is significant agreement between the Government and the committee on the challenges posed by advanced technology and how it is rolled out into the justice system. I am sorry if noble Lords feel that the government response was in some way a brush-off, but I am sure all your Lordships would agree that the technology is very complicated. The policing and justice sector and the ethics around balancing competing human rights are also very complicated. The public expect us to have a world-class justice system, and I think all noble Lords acknowledged this. Utilising technology is a cornerstone of this. The police must use technologies to free up officer time to fight crime, by making administration more efficient, and as a tool to hold those responsible for crime to account.

The Government are committed to empowering the police to use the latest technologies because the public support their use. However, there are no easy answers, and the risk of acting without fully understanding the implications of these technologies, and getting it wrong, is very real. We are not presently persuaded by the overall recommendations put forward in the report, but the Government are committed to the aims that sit behind those recommendations: improving consistency, maintaining public trust, ensuring sufficient oversight and empowering the police.

The subject of transparency was raised by my noble friend Lord Hunt and others. In their evidence, the Government were clear that transparency is not optional. The police themselves see and understand that being transparent is in their interests. We do not agree that we should mandate specific rules on transparency across such a wide range of current and potential future technologies and uses, but that does not mean we take it any less seriously.

Transparency is an important part of data protection law. Our policing model works only if there is public consent. For the public to consent, as the noble Lord, Lord Ponsonby, has just pointed out, they must be engaged. It is in the police’s interest to hold conversations and be open about what they are doing and why. Several police forces are working with the Centre for Data Ethics and Innovation to explore how the algorithmic transparency standard may work for them. We welcome the standard as one tool that could promote the sharing of best practice, but transparency can come in many forms. Our position is that mandating a set of rules could restrict what information is ultimately provided to the public and risks turning transparency into a tick-box exercise.

Instead, we will continue to help the police to collaborate with experts and identify how they can be transparent in a way that allows scrutiny, both at a technical level by those with expert knowledge and at an ethical level by the wider public. There is no point being transparent if what is said cannot be understood. We are in agreement that the question of ethics is of fundamental importance, and the ethics of acting or using technology is not something to be considered lightly.

We have heard how important the roles of accountability and oversight are at each stage of the system. I would caution that a statutory ethics panel, as proposed in the report, may decrease democratic oversight because such powers could override local decision-making, local accountability and locally elected officials, but I note the particular reference to the West Midlands Police example. We are not persuaded that the creation of a national statutory ethics committee is the best way to bring expert insight into police practice, but we will continue to work with colleagues in policing to develop and support non-statutory models.

Our democratic system, and ultimately Parliament, is here to provide scrutiny and oversight. The committee’s report is proof of that, as is today’s debate. It is right that our institutions are held to account, especially in relation to the complex and important issues we have discussed today. The committee’s report noted that, below this, there is a range of oversight bodies tasked with providing oversight of various aspects of how the police use technology. We recognise the risk of overlap and confusion, which is why we have proposed in the Data Protection and Digital Information Bill to simplify the arrangements for biometrics and surveillance cameras.

Ultimately, it is individuals, not technology, who take the key decisions within the justice system. Technology may be used to generate insights, but the decision to arrest will always remain with the officer, while the courts will decide what material can be admitted in evidence in determining guilt and any sentence. The Government will continue to support work to equip and educate the individuals working within the justice system so that they understand the technologies they use and how to use them correctly.

My noble friend Lord Hunt and others raised governance and accountability. On accountability, I think the question was who is responsible when things go wrong—who has the day-to-day responsibility for governance? There are existing regulations covering the responsibilities of parties when undertaking a procurement and when working together to provide a service. Depending on the issue, it may be addressed in different ways: illegal activity may be a criminal offence; other unlawful activities, such as a data protection breach, would be an issue for regulators; and poor performance should be mitigated against at the contractual level.

The public expect the police to innovate, and they must be allowed to do so within the law. Decisions on which technologies to use are therefore highly operational ones for the police, independent of government. However, the police need to act within the legal framework set out by Parliament, and bans are in place where they are proportionate to the risk, such as where the technology poses a risk of lethal or less-than-lethal force. That is not the same level of risk as that associated with the types of technologies raised in the report.

Chief constables ultimately decide when and how to use new technologies. However, they and their PCC are advised, regulated and overseen by a range of technical and regulatory bodies. The police chief scientific adviser, who I will come back to, advises chief constables on important matters such as good education. The ICO can and will take action where there is a lack of compliance with data protection laws. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services has a duty to consider how forces are meeting the Peelian principles, of which the use of technology is of course a part. HMICFRS undertakes thematic reviews based on its local inspections, and the use of technology is an area which could merit specific analysis.

The noble Baroness, Lady Primarolo, asked about individual complaints challenging the use of technology. Challenging the use of technology in the courts is certainly a resource-intensive process, and it is best reserved for exceptional circumstances. However, individuals can report concerns through other avenues, and we encourage them to do so. Where there are concerns over necessity, proportionality or a policing justification, they could be raised with HMICFRS, which has a mandate to consider how professional standards are applied in its reports and investigations. If the matter relates to how individuals within policing are using technology, and to their behaviour, this may be something to take forward with the Independent Office for Police Conduct. Concerns related to fairness, equality or rights can be raised with the Equality and Human Rights Commission, while the Information Commissioner’s Office is well placed to investigate questions of data protection and privacy.

Noble Lords have acknowledged that the police are operationally independent, which is an essential principle of our system. Nevertheless, we are also alive to the need to ensure that law enforcement is given appropriate support in adapting to technological change and advancements. The role of the police chief scientific adviser, to which I have referred, was created to give policing a scientific capability, establishing a dedicated place for advice on how to innovate, test technologies and ensure that tools do what they claim. Since being appointed, the chief scientific adviser has led reform of how the sector works with the scientific community and is developing a strategy for science and technology. The NPCC’s science and technology strategy will strengthen how the police approach using validated and cutting-edge science in their mission to protect the public. The Government support this strategy and encourage its successful adoption. Those using the technology and impacted by it must be confident that it works as it should.

The Home Office is investing in policing to strengthen the technical evidence available on the most promising future technologies, as well as helping in the commission of research by the Defence Science and Technology Laboratory, which tests functional performance. Confidence in the scientific basis and validity of the technology being used is only part of the picture: there must also be confidence in the operational practice.

The wider question of technology in the justice system is clearly an area in which it is important constantly to develop best practice and future guidance. We agree that clear and consistent advice is essential to allow innovation. To this end, the sector is developing its repository of guidance and information. For example, the College of Policing published national guidance on live facial recognition earlier this year. The Government will support the sector to stay on the front foot in addressing specific technologies, as needed.

An approach centred on the “Move fast and break things” mantra may work for innovation in Silicon Valley, but it would not be appropriate in the context of UK law enforcement. We have no wish to break a system founded on the rule of law, which of course dates back a very long time. That is not to say that the Government intend to sit back and be solely reactive, but proactive regulation brings its own risks. Mandating standards without consensus in the sector on what it needs may turn certification into something that is easily gamed by bad actors, opening up public authorities to harm.

So, although I happily acknowledge that there will be an opportunity for someone to set global standards, at the moment the Government are of the opinion that certification, or kitemarking, can create false confidence in the validity of a technology. We want to ensure that responsibility for using lawful technologies is not delegated to a certification process that may be gamed. Within our existing regulatory model, the police have a responsibility to use products that are safe and meet the high ethical tests set out in the data protection, human rights and equalities legal framework.

Assessing proportionality and necessity, even if the technology works, depends on the unique factors of each use case. Organisations should not hide behind regulations or certification when it comes to deploying new technologies responsibly. The police must make justifiable decisions during procurement, development and deployment, reviewing them regularly. The current legal framework places responsibility for how to do that firmly on the organisation. However, in addition to the Centre for Data Ethics and Innovation, the Government have established an AI standards hub to help to promote good practice. But the responsibility and accountability that organisations face are theirs alone.

Although we did not generally share the committee’s overall approach of more and more legislation, we will act when the need is clear. We are confident that the regulatory model is proportionate and mature. We have established a statutory code for digital forensics and placed the Forensic Science Regulator on a statutory footing. As practice consolidates around specific standards, we will continue to learn from the relevant experiences and engage with wider learning from sectors such as healthcare.

Someone asked, although I am afraid I have forgotten who: does it actually work? The answer is yes. I have a large number of examples, but in the time available I will provide one: all forces use facial recognition retrospectively. South Wales Police produces around 100 identifications a month, which, as a noble Lord noted, reduces identification time from 14 days to a matter of hours. South Wales Police and the Met have also used live facial recognition technology and successfully disrupted mobile phone theft gangs, with no reported thefts at certain rock concerts, for example, and there were 70 arrests overall during various trials, including for offences as severe as rape, robbery and other forms of violence.

The noble Lord, Lord Clement-Jones, raised the Bridges case. That was a compliance failure by South Wales Police. The court confirmed that there was a legal basis in common law, and a legal framework including human rights, data protection and equalities law, within which live facial recognition and, by extension, other technologies could lawfully be deployed. Since the judgment, the College of Policing has published authorised professional practice guidance clarifying the “who” and “where” questions.

On the question of potential bias, noble Lords will be interested to know that the US National Institute of Standards and Technology, which is generally recognised as the world’s premier body of this type, found that the algorithm that South Wales Police and the Met use shows almost undetectable bias.

The Committee may have noticed that I am slightly between focus ranges with or without glasses, which is making life rather complicated. I wish I were relying on technology at this point.

I was asked about live facial recognition as an example. I have just mentioned that the College of Policing authorised professional practice guidance on live facial recognition. That requires chief officers to ensure training within the force on the following: how to respond to an alert; the technical capabilities of live facial recognition; the potential effects on those subject to the processing; the core principles of human rights; and the potential impact and level of intrusion on each subject.

The adoption of live facial recognition standards serves as an example of where practice has moved quickly over the last few years following legal scrutiny and greater public discourse. The sector learned from the early pilots to test, improve and evolve policies following feedback. The pilots of this tool were just that—early tests. Now that more evidence is available and the maturity of the capability is advanced, we can analyse how the legal framework is working. This process points to the strength of our legal framework as it has driven the improvement of standards without suffocating innovation.

My noble friend Lady Sanderson and the noble Baroness, Lady Ludford, asked about DCMS and cross-departmental working. The answer is that we work very closely together. The Home Office is also part of a pilot looking at how the algorithmic transparency standard works for the department’s own activities. As for the White Paper, it will come some time next year, but I am afraid I do not have a specific date.

I thank all noble Lords who have contributed to this fascinating debate. I extend my thanks again to the committee for all the work and insight that went into producing a thorough and engaging report on these very complex issues. We do not fully agree on the way forward in terms of specific steps, but I am confident in suggesting that there is a broad consensus about the need for a long-term approach. Whether that stops noble Lords being disheartened, I do not know.

For the Government’s part, we will continue to look at the entirety of the system and seek to encourage improvements at each stage, with a focus on developing policy to ensure that the benefits of new technology are realised throughout the justice system. As the report laid out so clearly, there is no option to pause or stand still. The issues discussed today are of fundamental importance to the safety and security of our citizens and our values, and I look forward to continuing our engagement on these matters.