Technology Rules: The Advent of New Technologies in the Justice System (Justice and Home Affairs Committee Report) Debate

Department: Home Office


Monday 28th November 2022


Grand Committee
Moved by
Baroness Hamwee

That the Grand Committee takes note of the Report from the Justice and Home Affairs Committee Technology rules? The advent of new technologies in the justice system (1st Report, Session 2021–22, HL Paper 180).

Baroness Hamwee (LD)

My Lords, I am delighted to move this Motion and I hope the Grand Committee will support it.

This is the first formal report of our committee, which was formed in April last year. At the start, our members knew little about new technologies—I hope I am not being unkind to any of them. After some tuition, we confessed ourselves terrified, but we should not have been terrified about not understanding technologies; in a way, that is the point. The report is about new technologies and how they affect the citizen in the justice system. We looked largely at policing because that was where the evidence led us, but our recommendations have wider application.

Quite early on I asked, rhetorically, “How would I feel if I was arrested, charged, convicted and imprisoned on the basis of evidence I did not understand and could not access?” Towards the end of our work, another member said, “Look at Horizon and the Post Office; look at what happens when you assume the computer is always right”.

We heard about the software and tools used to record, store, organise, search and analyse data, and those used to predict future risk based on the analysis of past data. Predictive policing includes identifying, say, an estate where there has been a lot of crime, putting police in and detecting more crime than in an area that is not overpoliced. The data reflects this increased detection rate as an increased crime rate, and that is embedded in the next predictions. It is a vicious circle which, as a witness said, is

“really pernicious. We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people.”

The noble Lord, Lord Blunkett, who had hoped to speak this afternoon but, given the change of time, has a clash and apologises for not being here, asked me to say the following:

“It is critical that the substantial issues addressed in the report are confronted before major problems arise, rather than because of them. The wide-ranging implications for the operation and therefore the credibility of the criminal justice system, and the unanimity supporting the committee’s findings, require something better than kicking the can down the road or believing that the present architecture can handle the growth and significance in the use of artificial intelligence.”

I heard a murmur of support when I was reading that, but I will continue even though it pretty much says what I will say over the next few minutes.

The “something better” includes welcoming innovation and regulating it appropriately. The issues are difficult, but the point was not to put them in the “too difficult” tray. I believe that the report answers the not unexpected concerns that we must not stifle innovation, that each police force should be free to take its own decision and that police and crime commissioners must ensure compliance with human rights.

Proposing regulation often raises hackles, but it is another way of requiring standards to be met. Standards are a good thing—in themselves and because something known to meet agreed standards is more likely to be trusted. For example, standards can ensure, to the greatest possible extent, that conscious and unconscious bias—such as racial bias in stop and search tools—is not baked in. That is to the benefit of the producer as well as others. In other words, standards support innovation.

Procurements deserve a lot of attention. A police officer procuring a product can be vulnerable to an overenthusiastic sales pitch—we heard some horror stories—or a one-sided contract. I would have loved to see a form of contract, for instance, about the ownership of data, both input and output. Does the commercial producer of the programme own it? It is a big question, which makes one wonder about data inadequacy, but I will not go there this afternoon. We were not able to get hold of a form of contract: commercial confidentiality gets in the way.

National standards would include requirements in respect of reliability, accuracy and performance in the context of their use, evaluation, validity, suitability and relevance. It is very worrying if standards are regarded as a threat.

We heard a lot about the independence of police and crime commissioners, and that PCCs and chiefs ensure compliance with human rights. I heard that as overdefensive. Of course each force should pick products to suit its local needs, but there are 43 forces applying the same law. By analogy, the BSI kitemark is in common use for many products in other sectors—in other words, certification. The police could have a choice among certified products. That would not preclude them picking products to suit their own local priorities. Operationally, the police would still have to assess both the necessity and proportionality of each deployment.

This is all part of governance. The point was made more than once, including by government: “You can always go to court to sort things out”, but the courts’ role is to apply the law, and nothing goes to court unless someone takes it there. That needs determination, emotional energy and money. By definition, the judgment will be neither a comprehensive assessment nor a systematic evaluation.

In a similar vein, the Minister said to us that Parliament is the national ethics body—to be fair, I think that was a throwaway line—but I doubt that we are qualified for that. However, Parliament has a role in establishing a national body: independent, on a statutory basis and with a budget. We think there should be a single national body. Our report lists 30 relevant bodies and programmes. That makes for very complicated governance.

There can never be a completely one-stop shop, but that does not mean that simplification is not needed. It is not surprising that there is confusion as to where to find guidance. The committee recommends a body where all relevant legislation, regulation and guidance are collated, drawing together high-level principles and practice. Primary legislation should be for general principles, with detailed regulation setting minimum standards—not so prescriptive as to stifle innovation, but recognising the need for the safe and ethical use of technologies. We recommend the use of statutory instruments, despite the procedural drawbacks with which your Lordships are familiar, as a vehicle for regulations and a basis for guidance, with scope for non-statutory guidelines.

To assess necessity and proportionality, we need transparency. A duty of candour is associated more with the health service, but we urge the Government to consider what level of candour would be appropriate to require of police forces regarding their use of new technologies.

We also recommend mandatory participation in the Government’s algorithmic transparency standard—currently, it is voluntary—and that its scope be extended to all advanced algorithms used in the application of law that have implications for individuals. This would in effect produce a register, under the aegis of the central body. I understand that the Information Commissioner’s Office and Thames Valley Police, and no doubt more, are involved with the standard, and there is a clear wish to link compliance with it to processes to improve technology and to enable police to exchange information about what works and what does not. There is a wish too to link it to independent oversight.

Ensuring the ethical use of any tool is fundamental. That has to be integral to the use of the tool, as we have seen with live facial recognition and the London gangs matrix, whose review apparently led to the removal of the names of some 1,000 young black men. The West Midlands Police are leaders with their ethics committee, both in having it and in how it is used—I have been very impressed by what I have heard and seen of its operation. There are similar bodies in a few, but only a few, other forces. If we get the standards right, the tools will be better trusted, by the citizen and the police themselves. That will free up police resources.

Current legislation provides that a person shall not be subject to

“a decision based solely on automated processing, including profiling, which … significantly affects him.”

The then Home Secretary assured us that decisions about humans would always be taken by humans—a human in the loop—but clicking a button on a screen is not enough when one starts from the mindset that “the computer is always right”. We agreed with the witness who said that the better way is that the machine is in the loop of human decision-making.

Does the human understand what it and he are doing? “Explainability” is essential; I had not come across that term before, but it seems to be used a lot in the sector. It is essential for the user, the citizen affected and everyone else. If the police officer does not understand the technology, how can he know if he—or it—has made a mistake? A critical approach in the best sense is needed.

The Sunday Times recently reported on new AI which will detect sex pests and thugs on trains who intend to assault rail passengers. It said:

“When a woman is sitting on her own in a carriage with empty seats, it could also assess whether she feels threatened when a man comes to sit down next to her or whether she welcomes his presence.”


There is no hint there might be some fallibility in all this. With all of this, noble Lords will not be surprised that we identified a lot of training needs.

We received the Home Office response to our report in the summer. I wrote on behalf of the committee to the then Home Secretary that we were “disheartened”—the best term I felt I could use courteously—by the

“reaction to what we hoped would be understood as constructive conclusions and recommendations. These are very much in line with the recommendations of other recently published work”.

Indeed, a workshop discussing the report last week at the Alan Turing Institute bore this out. The response read to us as more satisfied with the current position than was consonant with the evidence we had used. I will not quote from the Government’s response as I am optimistic that the Minister today will be able to indicate an understanding of our conclusions and an enthusiasm to progress our recommendations. I beg to move.

--- Later in debate ---
Baroness Hamwee (LD)

My Lords, there are more recommendations and conclusions in our report which any of us could have spoken to today, but noble Lords have covered a great deal of ground and I thank them all.

Our thanks go to the staff who supported this inquiry: Sam Kenny, our then clerk, and Achille Versaevel, our policy analyst, who, in truth, were the authors; Amanda McGrath, who kept everything in order, including the members; Aneela Mahmood, who got us coverage in an astonishing number of media outlets; David Shiels, our present clerk; and Marion Oswald, our enormously knowledgeable specialist adviser, who seems to know everyone. Of course, thanks also go to the people who gave us such powerful evidence. I thank the Alan Turing Institute, which hosted last week’s workshop, attracting contributors with such expertise, who I wish were sitting behind me, passing me notes of critique of what we have just heard. That workshop felt like an important validation of our work. My thanks go to all members of the committee, with whom I thoroughly enjoy working. None of their contributions is small.

We were drawn to the topic by the lack of a legal framework, concern for the rule of law and the potential for injustice—principles which must continue to apply. The speeches today have confirmed these and that the committee appreciates the use of AI. We have not been dismissive of it.

I thought that the noble Lord, Lord Hunt, might refer to the thalidomide case. It was mentioned at the workshop, where the point was made that it is essential to get the tests of a product right, otherwise compliance with the test is used as the defence to a claim.

I have been subjected to a type of AI at the border, where I could get through only when I took off my earrings, because I had not been wearing the same earrings when the passport photo was taken. That is such a minor example, but I felt quite rejected.

I have to say that I thought my noble friend Lord Paddick was going to say that the technology let him range freely through his twin brother’s bank because he thought he was his twin brother.

I do not think that the noble and learned Lord, Lord Hope, should begin to be apologetic about having no technical expertise. In a way, that is the point of our report. The judiciary was very much among those we regarded as affected by the use of AI.

The pace of development was referred to; it is enormous. The issues will not go away, which makes it all the more important that we should not be thinking about shutting the stable door after the horse has bolted or letting the horse bolt.

I thank the Minister for his response. It is not easy to come to this when many of us have lived with it for a long time. To sum up his response, I think the Government agree with our diagnosis, but not what we propose as the cure. We have to make transparency happen. He says it is not optional, but how do we do that, for instance?

There was a good deal of reference in his response to the public’s consenting, policing needing consent and the Peelian principles, but he then listed a number of institutions, which, frankly, confirmed our point about institutional confusion. On ethics and his point that a statutory body could override a democracy, that is not how any of the ethics organisations approach it. It is about closing the stable door too late if one addresses each specific technology only as the need arises.

A commitment to the spirit of the report gets us only so far; it does not leave the Wild West way behind in our rear-view mirror. We will indeed come back to this, maybe when we get the new data protection Bill. This is not an academic issue to be left in a pigeonhole unconnected with issues current in Parliament—I need only say: the Public Order Bill.

Motion agreed.