Dawn Butler (Labour - Brent East)
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall. Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or a Deputy Speaker. A Government Minister gives the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not form part of the official record.
As I understand it, the number of false positives recorded depends to some extent on the threshold at which the technology is set.
The report by the National Physical Laboratory said that the threshold had to be set at 0.6 to keep misidentifications down, but there is no setting at which nobody is wrongly identified. It is also easy for a police service to lower that number, and because we have no judicial oversight, that is very problematic.
The hon. Lady is completely right. I think the police are generally using the technology responsibly and setting the threshold as recommended, but that is another example of there being no requirement on them to do so; they could lower it. Regarding deployment in Essex, the chief constable told me there was just one false positive.
I attended a meeting with Baroness Chakrabarti, along with my right hon. Friend the Member for Goole and Pocklington, where Shaun Thompson, an anti-knife community worker, spoke to us. He had been held by the police for 30 minutes and forced to provide all sorts of identity documents as a result of a false positive. As to how often this is occurring and whether racial bias is involved, there is some evidence that that is the case. That makes it all the more important that we provide assurances.
We have heard from several campaign organisations that are concerned about the use of this technology. They vary in the extent to which they believe it is legitimate. Big Brother Watch has described live facial recognition technology as
“constant generalised surveillance”
and has said that it is
“indiscriminately subjecting members of the public to mass identity checks”
which undermines the presumption of innocence.
Liberty has gone further, saying:
“Creating law to govern police and private company use…will not solve the human rights concerns or the tech’s inbuilt discrimination…The only solution is to ban it.”
I do not agree with that, because I think there is clear evidence that it has a real benefit in helping the police apprehend people who are wanted for serious offences, but one of my major concerns is the lack of any clarity in law about how it should be used.
I am grateful to the Library, which has provided advice on that point. It says:
“There is no dedicated legislation in the UK on the use of facial recognition technologies.”
Instead, its use is governed by common law, by an interpretation of the Police and Criminal Evidence Act 1984 (although that Act does not mention live facial recognition technology) and by some case law, such as the Bridges case. Even in the Bridges case, the Court of Appeal found that
“The current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason do not have the necessary quality of law.”
It is a pleasure to serve under your chairmanship, Dame Siobhain. I thank the right hon. Member for Maldon (Sir John Whittingdale) —or I could say my right hon. Friend, if he does not mind—for securing this debate. I have spoken to the Secretary of State and Ministers in the Department for Science, Innovation and Technology, and there is an awareness that we need a lot of careful and considered thinking on this issue. Obviously, a new Government have just come in and this is not a new issue, as the right hon. Member for Maldon said—LFR was first used in 2017, so there is a lot of clearing up that has to be done.
Live facial recognition changes one of the cornerstones of our democracy: an individual is innocent until proven guilty. With this technology, if the machine says an individual is guilty because they have been identified using live facial recognition, they then have to prove their innocence. That is a huge change in our democracy that nobody has consented to. We have not consented to it in this place, and as we police by consent as a society, that should really worry us all.
I thank the hon. Lady for giving way; I am looking forward to this debate and to concluding it for the Opposition later.
On the question of changing the burden of proof or undermining the concept that someone is innocent until proven guilty, the technology absolutely does not change that. What it does is give the police a reason to stop somebody and check their identity to see whether they are the person wanted for a criminal offence. It certainly does not provide evidence on which a conviction might be secured. In fact, it is no different from the police stopping someone because they are suspicious of them, and it is a lot more accurate than stop and search, about which I am sure the hon. Lady has views. It is simply a tool to enable the police to stop somebody and check their identity to see whether they are the person who is wanted. It certainly does not undermine the very important principle that a person is innocent until proven guilty.
The shadow Minister has hit on an important point regarding reasonable suspicion. What is reasonable suspicion? How have the police got to that point? If he is then going to make reference to watchlists, who is put on a watchlist? We know, for instance, that the Met police has hundreds of thousands of people on its system who should not be there. We know that the watchlist can consist of people it considers to be vulnerable, such as those with mental health issues. Anybody in this room could be put on a watchlist, so I am afraid the shadow Minister has not quite nailed the point he was trying to make.
I am very much on the hon. Lady’s side of that argument, partly because we are a country where it is not normal to stop people and ask for their identity cards, which is why we have had a few battles over that in the past. Also, the technology is prone to slippage. Way back when—probably when the hon. Lady was still at school—we introduced automatic number plate recognition to monitor IRA terrorists coming from Liverpool to London. That was its exact purpose, but thereafter it got used for a dozen other things, without any legislative change or any approval by Parliament.
Order. Could I ask Members to keep interventions as interventions?
Thank you, Dame Siobhain. Yes, it is really important that we talk about this openly. That is what we are supposed to do in this place, right? Anybody can be put on the watchlist. Seven police forces are currently using LFR. One that I know of—I am not sure about the others—the Metropolitan Police Service, is in special measures. I do not think it should be given any additional powers while it is in special measures.
The thing is that we know very little about the software or what is in the black box at the heart of these systems. What we can look at is the outcome, and we know that these systems are worse at identifying the faces of black women especially, and of black and Asian people more generally. The identification rate is lower for those groups, so that is a concern.
It is also really interesting that even when LFR is set at 0.6, a police super-spotter is more accurate. We have specialist police officers who spot people very quickly, and they are more accurate than this system, so it becomes the case that a police service will try to prove that the system it has bought is value for money. We can imagine a police officer not getting many hits with LFR at 0.6 and lowering that to 0.5 so that they can get more hits, which in turn means that more people are misidentified, so there should be regulation around this issue.
Taking away somebody’s liberty is one of the most serious things we can do in society, so we need to think very carefully if we are going to introduce something that accelerates that. It is good that for the first time we are having the debate on this issue. As the right hon. Member for Maldon said, the EU permits LFR only where there is prior judicial authorisation and in cases in which the police need to locate a missing person, for instance. That is something we need to consider.
I want to say this: I like technology. I am very much into our civil liberties. We need to protect our digital rights as human beings and individuals. I love technology— I used to be a coder—but we should not rush to do things because people get excited. There are really four people in the debate on this issue. It reminds me of four of my mates when we go out clubbing. Bear with me. We have the person who will stay at home because they are not bothered—they do not care—and we have the people who do not care about this issue: “It is going to happen; let it happen.” We have the person who will come, but they are a bit moany. They do not really like the music, but they will come anyway because they do not want to miss out.
We then have the person who is completely drunk on it all: “Give it to me. I’ll take everything.” There are people who just love anything to do with technology and will say, “Look, let’s just throw it all in the mix and it’ll all be fine.” And there is me. I am the person who likes the music and the food, but I need to keep sober to make sure everyone gets home safely. In this debate about AI, we need to be sober to make sure that everybody gets home safely and that when we roll out AI, we do so in a way that is fair and compassionate and in line with our values as British citizens.
I thank every Member here for coming to this debate and I thank the right hon. Member for Maldon (Sir John Whittingdale) for securing it in the first place.
I have worked on this issue for many years. In my previous job, I attended and observed the first deployments of live facial recognition by the Metropolitan police, which is many years ago now. Since then, the gap between its increasing use and the lack of a legislative basis has grown wider and wider. In that time, many thousands of people have had their personal data captured and used by the police when there was absolutely no reason for that. Many people have been misidentified, but the accuracy issue is not my main concern.
The unlegislated use of the technology is incredibly worrying. In my previous job on the London Assembly, I asked the Met and the Mayor of London many questions about that. I asked for watchlist transparency, but I did not get it. I heard the initial promises—“Oh, it will be very transparently used, we will communicate it, and no one will have to walk past it without knowing.” All those reassurances just faded away, because there is no real scrutiny or legislation. We need to debate the subject from first principles. As other Members have pointed out, we have had proper debates about identity cards and fingerprint and DNA data, but not about this extremely intrusive technology. It is more concerning than other technologies because it can be used on us without our knowledge. It really does engage our human rights in profound ways.
For all those reasons, the use of facial recognition by the police has been challenged by the Information Commissioner, the Surveillance Camera Commissioner, the Biometrics Commissioner, London Assembly members, of whom I was one, Senedd Members and Members of Parliament here. The only detailed scrutiny of the technology has resulted in calls for a halt to its use; I am thinking of the Science, Innovation and Technology Committee. The Justice and Home Affairs Committee has also called for primary legislation. That is the absolutely key question. The EU has had the debate and looked at the issue in detail, with the result that, over there, the kind of use that is so common among UK police forces is restricted to only the most serious cases of genuine public safety. That absolutely needs to happen here.
The legislation needs to look not just at police use of the technology, but private use. I have seen its use by private companies in the privately owned public space in King’s Cross. Data from there has been shared with the police; the police initially denied knowing anything about it and then later apologised for that denial. If private companies are collecting data and sharing it with the police, that needs to be scrutinised. If private companies are using the technology, that needs to be legislated for as well.
The hon. Lady is making an incredibly powerful speech. Is she aware of the Big Brother Watch campaign to try to stop large shops from capturing people’s faces and saying that they are shoplifters? They then get stopped in other places, but they are not aware of that process.
Yes, I am aware of Big Brother Watch’s excellent campaigning on this issue. It has identified a serious breach of human rights. There is the potential for a serious injustice if people are denied access to their local shops based on a suspicion that has put them on a watchlist that may or may not be accurate. There is no oversight. We need to debate these things and legislate for them.
I tabled a written question to the Minister about putting regulation and legislation behind the police use of live facial recognition. The answer stated that the technology is governed by data protection and equality and human rights legislation, and supplemented by specific police guidance. I do not believe that police guidance is sufficient, given the enormous risks to human rights. We need a debate on primary legislation. I hope that the Minister will announce that that process will start soon and that this unlawful grey area will not be invading our privacy for much longer. This issue is urgent.
I appreciate that we are having this debate, because it is surprising that we have got to where we are without legislation and firm frameworks in place. I really like the phrase “first principles”, and one of the first principles of the police is “without fear or favour”. That is an exceptional phrase that, if perfectly implemented, we would all benefit from, although of course we recognise that in the real world there is no such thing as perfect.
I am grateful that concerns have been raised about how the technology we are discussing impacts the presumption of innocence—we should all be very careful about that—although I also appreciate the point that it does not remove that presumption but provides the opportunity for a human to check. If done properly, that is no bad thing, but we are right to discuss the issue in serious terms in our legislature because there is a danger of an unofficial assumption of guilt. Let us take the example of local shopping centres, which we heard about earlier. If an issue has not been escalated to the police or courts, but some local security officers have seen the same images on cameras and that information has gone round by radio, a gentleman or a lady out with their children doing the weekly shop may suddenly not be able to get in and do what they need to do. That is the kind of pervasive and damaging thing that could easily slip under the radar; we should all be mindful of that.
I want to touch briefly on transparency. This is clearly a developing technology and we would be wrong not to look at its benefits, but we must be mindful of the harm it could do along the way. If people find that they are getting an unfair crack of the whip—that is probably an inappropriate term—and are suffering as a result of this technology, we need to nip that in the bud, and be very direct and open about the failures so that we can make adjustments.
Is the hon. Gentleman aware that black men are eight times more likely to be stopped and searched by the police than their white counterparts, and 35 times more likely under section 60? This technology accelerates the discrimination that is already in the system.
Absolutely. Let me put it like this: if any of us were to turn up at a social event and unexpectedly find a large swarm of police, that would give us a moment’s pause for thought. We need to be careful to ensure that this technology is not a more pervasive version of that example. It must not be constantly in existence, attached to every CCTV camera, without us even being aware of it.
To go back to transparency, we have to be open and frank about any issues with how the technology is being implemented, so that we can fix them. I agree that there absolutely could be issues, and we definitely want to be on the right path.
When we look at the numbers, something like 0.5% of scans—I cannot remember the exact statistic—still result in somebody being misidentified.
On the misidentification rate, I think the Bridges court case set a standard of a false positive rate of one in 1,000: out of every 1,000 people stopped, 999 are the people the police think they are, while one is misidentified. The Minister may have more up-to-date figures, but from my recollection the system in practice is running at about one in 6,000. That is an extraordinarily high accuracy rate—much more accurate than a regular stop and search.
About 25% to 30% of regular physical stops and searches, where a police officer stops someone and searches them for drugs or a knife or something, are successful. About 70% are unsuccessful, while the equivalent figure for live facial recognition is 0.02%. That means that this technology is roughly 3,500 times less likely to result in someone being inappropriately stopped than a regular stop and search. It therefore hugely—by three orders of magnitude—reduces the likelihood of someone being improperly stopped and searched.
I turn to the use of the technology on the ground. I asked for it to be trialled in the centre of Croydon, which is the borough I represent in Parliament. Over the past nine months or so, it has been deployed on a relatively regular basis: about once a week. I believe that the Minister was supposed to go down this morning to have a look; I certainly encourage her to go again as soon as she can. By the way, the hon. Member for Birmingham Perry Barr (Ayoub Khan) asked whether people know when the technology is being used. The answer is yes: one of the guidelines is that public signage must be displayed telling the public that the technology is in use.
Over that period in Croydon, there have been approximately 200 arrests of people who would not otherwise have been arrested, including for all kinds of offences such as class A drugs supply, grievous bodily harm, fraud and domestic burglary. It has also included a man who had been wanted for two rapes dating back to 2017. That wanted rapist would be free to this day if not for this technology. Just a couple of weeks ago, a man was stopped and subsequently arrested in relation to a rape allegation from June this year. There are people who are alleged to have committed rape who would not have been stopped—who would still be walking free—if not for this technology. It is only the fact that they walked past a camera outside East Croydon station or somewhere that has meant they were stopped by the police. They will now have a normal trial with the normal standards of evidence, but they would not have been caught in the first place if not for this technology.
I have done quite a lot of public meetings on this. I explain, “These are the people who get caught, and the price the public pay is that you might get scanned when you walk down Croydon High Street, but if you are innocent your picture is immediately deleted.” By and large, the overwhelming majority of the people in Croydon think that a reasonable trade-off.
There should be protections, of course. Several hon. Members, including my right hon. Friend the Member for Maldon, have rightly said that there should be guidelines, rules and procedures. However, it is not true that there is a complete vacuum as far as rules and regulations are concerned. The Bridges case at the Court of Appeal in 2020 looked at how South Wales police were using the technology between 2017 and 2020. It found that some of the ways they were using the technology were not appropriate because they broke rules on things like data protection and privacy. It set out in case law the guidelines that must be adhered to for the technology to be lawful—things like public signage, the rate of accuracy and the absence of racial bias.
Secondly—I do hope I am not taking the Minister’s entire speech—there are guidelines for police. The College of Policing has national authorised professional practice guidelines that the police are supposed to stick to. There is a debate to be had about whether, for the sake of clarity and democratic accountability, we in Parliament should set something out more formal; my right hon. Friend the Member for Maldon made that point. I think there would be some merit in clarifying at a national level where the guidelines sit, but I would not go as far as Europe. If we had done so, those rapists would not have been arrested. I would also be careful to ensure that any legislation is flexible enough to accommodate changing technology. Primary legislation may not be the right vehicle: a regulation-making power might be a more sensible approach, so that things can be kept up to date from time to time.
While we consider that, I strongly urge the Minister not to halt the use of the technology. As we speak, it is helping the police arrest criminals in Croydon and elsewhere who would not otherwise be caught. I urge her to continue supporting the police to roll it out. I think some money was allocated in the Budget for the current financial year to continue developing the technology. I would welcome an update from the Minister on whether that money is still being spent in the current financial year. I do hope it has not somehow been snaffled by the Treasury in a misguided cost-saving effort—