Automated Facial Recognition Surveillance Debate
Kit Malthouse (Conservative - North West Hampshire)
Commons Chamber

Urgent Questions are proposed each morning by backbench MPs, and up to two may be selected each day by the Speaker. Chosen Urgent Questions are announced 30 minutes before Parliament sits each day. Each Urgent Question requires a Government Minister to give a response on the debate topic.

This information is provided by Parallel Parliament and does not form part of the official record.
(Urgent Question): To ask the Secretary of State for the Home Department if she will make a statement on police use of automated facial recognition surveillance.
The Government are supporting the police and empowering them with the tools they need to deliver on the people’s priorities by cutting the crime that is blighting our communities. We have already pledged 20,000 more officers, new powers and the biggest funding increase in a decade, but embracing new technology is also vital and we support the use of live facial recognition, which can help to identify, locate and arrest violent and dangerous criminals who may otherwise evade justice.
Live facial recognition compares the images of people passing a camera with a specific and predetermined list of those sought by the police. It is then up to officers to decide whether to stop and speak to those flagged as a possible match. This replicates traditional policing methods such as using spotters at a football match. The technology can make the search for suspects quicker and more effective, but it must be used strictly within the law.
The High Court has found that there is an appropriate legal framework for the police use of live facial recognition, and that includes police common-law powers, data protection and human rights legislation, and the surveillance camera code. Those restrictions mean that sensitive personal data must be used appropriately for policing purposes, and only where necessary and proportionate. There are strict controls on the data gathered. If a person’s face does not match any on the watchlist, the record is deleted immediately. All alerts against the watchlist are deleted within 31 days, including the raw footage, and police do not share the data with third parties.
The Metropolitan Police Service informed me of its plans in advance, and it will deploy this technology where intelligence indicates it is most likely to locate serious offenders. Each deployment will have a bespoke watchlist made up of images of wanted people, predominantly those wanted for serious and violent offences. It will also help the police to tackle child sexual exploitation and to protect the vulnerable. Live facial recognition is an important addition to the tools available to the police to protect us all and to keep murderers, drug barons and terrorists off our streets.
We must not allow the UK to become a society in which innocent people feel as though their every movement is being watched by the police. We must not throw away UK citizens’ right to privacy or their freedom to go about their lawful business without impediment.
An independent review of the Met’s facial recognition trial was published last July, and its conclusions are damning. Does the Minister agree with the report that the legal basis for this roll-out is questionable at best and is likely to be in conflict with human rights law? According to an analysis of the Met’s test data, 93% of supposed matches in the four years of trials have been wrong. As well as being inaccurate, facial recognition technology has been shown to be much less accurate in identifying women and ethnic minorities than in identifying white men. This means that women and black, Asian and minority ethnic people are much more likely to be stopped without reason than white men. Given that a black person is already 10 times more likely to be stopped and searched than a white person, does the Minister share the Liberal Democrats’ concern that this technology will increase discrimination and further undermine trust in the police among BAME communities?
The biometrics commissioner, the Information Commissioner and the surveillance camera commissioner have all raised concerns about facial recognition surveillance, and all three have argued that its impact on human rights must be resolved before a wider roll-out. What steps has the Minister taken since those warnings to examine and address the human rights issues they raise?
The hon. Lady rightly raises a number of issues that need to be addressed in the operation of this technology. I assume she is referring to last year’s statement by the Information Commissioner’s Office. The commissioner reviewed the Met’s operation and raised some concerns about how it was operating the pilot of live facial recognition. Happily, the ICO put out a statement on Friday saying that it is broadly encouraged by the fact that the Met has adopted some of its recommendations in this deployment, although she is right that the ICO remains concerned about the legal basis.
Since the ICO report was published, we have had the judgment in a case brought against South Wales police’s deployment of this technology, in which the High Court found there is an appropriate legal basis for the operation of facial recognition. However, I understand that there may be an appeal, and there is a suspended judicial review into the Met’s operation, which may be restarted, so if Members do not mind, I will limit what I say about that.
As for disproportionality, there is no evidence of it at the moment; the Met has not found disproportionality in its data in the trials it has run, and certainly a Cardiff University review of the South Wales police deployment could not find any evidence of it at all. The hon. Lady is, however, right to say that in a country that prides itself on being an open and liberal society, we need to take care with people’s impressions of how technology may impinge upon that. As she will know, live facial recognition has an awful lot of democratic institutions looking at it, not only this House: the London Assembly has a policing ethics panel; we have the Surveillance Camera Commissioner and the Information Commissioner; and there is a facial recognition and biometrics board at the National Police Chiefs’ Council, which brings people together to look at these issues. There is lots of examination to make sure that it is used appropriately, and I am pleased to say that the Met will be operating it on a very transparent basis. As I understand it, the Met will be publishing information about what data was gathered and the success rate, and other information that will allow the public to have confidence that where the technology is deployed to identify wanted criminals it is having the effect intended.
If I am wanted for questioning, what difference does it make to my rights if I am fingered by a police officer or a bit of software?
In his usual pithy manner, my right hon. Friend puts his finger on the button. As Members will know, the police have used facial recognition since their establishment. There is an analogue version—a wanted poster. We will have seen those and they crowdsource the identification of wanted criminals. The only question here is whether a human being does it, such as a spotter at a football match, or a machine does it. We acknowledge that if a machine is doing it, more circumspection and democratic control are required, and that is what we will be providing.
Facial recognition technology is potentially an important crime-fighting tool, but not without the correct safeguards, and the Minister has failed to persuade the House thus far that all the correct safeguards are in place. Does he accept that the random use of facial recognition technology requires not just a High Court judgment, but a specific legal framework and specific arrangements for scrutiny? After all, when blood, saliva or hair samples are provided, they are given voluntarily or under compulsory detention and charge. Facial recognition evidence is given involuntarily. He will have heard different reports about the unreliability of the evidence. Does that put people at risk of being wrongly accused of a crime? He will have heard the reports that facial recognition technology finds it difficult to recognise black people and women, and that the technology deployed is often inaccurate. To bring in technology that might be inaccurate and mean that the guilty go unapprehended and the innocent are wrongly identified would be a spectacular own goal, leading to a breakdown of the bond of trust between the police and the public.
The right hon. Lady is right to say that the police must deploy technology so as to increase the trust of those they seek to protect, rather than to diminish it. We certainly believe that the use of this technology could, as she said, have enormous potential for crimefighting, if deployed in the correct way. She asked whether the random use of facial technology could undermine that confidence. It might, but of course we are not intending to use it in a random way and the police are not doing so. In effect, they will be operating it in a very specific intelligence-led way, with lots of notification in the area in which it is to be deployed against a known list of wanted suspects or criminals; a specific area will be identified where the police have intelligence that that person might be passing through. Those very specific and focused arrangements will be authorised by a very senior officer above commander rank.
As for unreliability, as technology is rolled out it obviously becomes more and more effective and reliable—[Interruption.] Well, I am the lucky owner of a telephone that allows me to make banking payments on the basis of recognising my face. That technology was not available in the last iteration of the phone—it is an iPhone—which used my thumb instead. So there are developments in technology. South Wales police found in trials that there was a 1:4,500 chance of triggering a false alert and more than an 80% chance of a correct alert. It is worth bearing in mind that even when the system does alert the police to a possible identification, the final decision as to whether to intervene with an individual is still taken by a human being.
Will my hon. Friend explain how the proportionate use of facial recognition technology could help to tackle the offences, such as county lines drug offending, that are the scourge of many communities, including those in my constituency?
My hon. Friend raises an extremely important point. The British people want to see the technology used, as he rightly says, in a proportionate way. It is certainly the intention that live facial recognition is used against the most violent and serious criminals, who are often wanted urgently when the police are having problems locating them. One key area of LFR governance will be the surveillance camera code, one of the key tenets of which is that LFR is used proportionately to the offence committed and, specifically, that it is absolutely necessary—that is, the police have no other way of locating that person or have had trouble locating them in the past. We all have a duty to monitor this development carefully, see how it is rolled out and judge it by its results, which we hope will be spectacular.
As we have heard, there are huge concerns about the impact of automated facial recognition technology on privacy and freedoms such as the freedom of assembly, and about the danger of bias and discrimination because, as the hon. Member for Richmond Park (Sarah Olney) said, there is evidence that AFR technology can disproportionately misidentify women and BAME people, which means that they are more likely to be wrongly stopped and questioned. Those concerns are widely held, including by the independent Biometrics and Forensics Ethics Group, which advises the Home Office on facial recognition.
The Scottish Government are employing an approach that involves a comprehensive, up-to-date legislative framework and a regularly updated code of conduct with strong oversight through a commissioner. In that way, my colleagues in Edinburgh hope to ensure that the use of the technology is proportionate, necessary and targeted, and that it respects human rights, privacy and data protection rules. Will the Minister follow suit?
Finally, so far as I am aware, there is no evidence that the use of this technology in the manner contemplated is effective in fighting crime. If I am wrong about that, will the Minister direct me to the evidence that says that it is effective? If not, why not employ less risky measures, such as following the Scottish Government’s example and employing more police officers in a meaningful way?
The identification of individuals at large, by any method, is a standard policing technique—whether it is done by a human, a machine or, indeed, a member of the public—so increasing its effectiveness is absolutely key. I am pleased that the Scottish Government are mirroring many of the arrangements that are being put in place in the rest of the United Kingdom to deal with this technology because, as the hon. and learned Lady said, it has enormous potential for us. We have seen the successful use of the technology in pilots elsewhere. I was even told of an occasion on which a police force—I forget which it was; it might have been South Wales police—advertised the use of live facial recognition at a rock concert where in the past there had been significant problems with what they call “dipping”, which is in effect the pickpocketing of wallets and phones. The mere advertising of the technology resulted in there being no offences committed.
If it is subject to the appropriate ethical controls and privacy requirements, I see this technology more as a benefit than a threat. It is another tool in the police toolbox for fighting crime. Does the Minister envisage its application in locating the more than 300,000 people, predominantly children, who went missing in this country last year? Speed is of the essence in locating them for their own safety.
My hon. Friend highlights an extremely important opportunity for us. As he quite rightly points out, many, many people go missing every year. Some people want to disappear for various reasons, but, often, young people do not want to do so. Where it is proportionate, necessary and in line with the code, the identification of missing vulnerable people, particularly young people, would certainly be an incredibly good use of the technology.
I welcome you to your place, Madam Deputy Speaker. I have not yet had the chance to congratulate you on your new role.
In the previous Parliament, the Science and Technology Committee looked at this issue as part of the biometrics and forensics strategy review. All of the key stakeholders recognised that at the root of the issue was a biometrics strategy that was not fit for purpose and not of the quality required to provide a regulatory framework for facial recognition technology. Can the Minister confirm whether that strategy has been updated since last April?
The hon. Gentleman is quite right to raise concerns about the framework, and I will have to get back to him on whether the strategy has been updated. I do not think that it has, but I will check and make sure. He will be pleased to know that, at the recent general election, the Conservative party manifesto contained a commitment that, while we wanted the police to use the ever-increasing capabilities that technology was presenting to them, we wanted them to do so within a strict legal framework. We will be giving consideration over the months to come to what form that will take.
Does my hon. Friend not agree that liberty also means freedom from crime and antisocial behaviour? That is why I strongly welcome these measures. Will he expand on how the technology will deal with antisocial behaviour and drug running, on which he has touched before, as we face those problems in my constituency of Harlow?
At the moment, this technology is being deployed only by South Wales police and the Metropolitan police. However, as I explained earlier, where the police have a wanted, serious and violent criminal who they believe may be moving around in a particular location, they will deploy this camera with a watchlist and, hopefully, identify that individual. Areas that surround London often suffer from the movement of violent criminals, mainly to deal in drugs; identifying them as they move through particular areas, and therefore apprehending them, will no doubt bring benefits to many towns, such as his and, indeed, the one in my constituency, that sit around the capital.
Like all artificial intelligence, and unlike the spotter in a football crowd that the Minister cites, facial recognition technology automates the prejudices of those who design it and the limitations of the data on which it has trained. If it is not diverse by design, it will be unequal by outcome, so what minimum standards is he placing on this technology before it is rolled out?
The hon. Lady is quite right to raise what has been a concern in the media, but none of the evidence from the trials thus far—[Interruption.] Okay, the concern has been elsewhere as well. However, none of the evidence in trials thus far is pointing to that disproportionality. One of the key things that the Met will be doing, however, is that, after every deployment—[Interruption.] Madam Deputy Speaker, I am trying to answer the hon. Lady’s question, but she is still barracking me from a seated position. I would like, if possible, to explain it. I understand that it is a very sensitive issue, but we are, nevertheless, dealing with very serious crime and this may help the police in apprehending those people. Frankly, if the police were seeking to apprehend the killer of my child, I would want them to consider using this technology. We owe it to people to make the police as effective as possible. However, the Metropolitan police will be publishing the results of every deployment on their website. The democratic scrutiny will be exposed through the London Assembly and, indeed, I am sure, through this place. As the technology is rolled out and we consider what changes may be needed to the legal framework so that it operates in a position of confidence with the public, no doubt Members here will have their say.
Policing sporting events such as the Cheltenham festival, which will soon be upon us, presents unique challenges for the police. How does the Minister see this technology, once appropriately considered and reviewed, acting to assist the police to ensure that those who might wish to do harm to large numbers of people can be properly apprehended?
My hon. Friend, in his usual way, raises an extremely important point. It is worth reiterating that there is no intention of our having random surveillance using live facial recognition. The deployment of a camera will be against a known wanted list and against intelligence that an individual is likely to be in a particular location and is either wanted or is intent on harm and causing a crime or, indeed, perpetrating some sort of awful event in a large crowd. This is a tool we would be foolish to neglect, given its potential, but we in this House have a duty to set a framework that strikes a balance between protecting our invaluable civil liberties and keeping the public safe.
I thank the Minister for his answers so far. Does he agree that although personal privacy is a right, anything that is used in the correct manner to prevent crime and apprehend those who have committed a crime must be considered and utilised where appropriate?
I do. It is worth repeating what I said at the beginning about how the system works. If an individual passes in front of a camera and there is no match, the information that that individual is there is instantly deleted; if there is a match, the information will be retained for 31 days and then deleted; and even if there is a match, it is for the police officer on the scene at the time to decide, on viewing the evidence, whether to stop the individual. We will see how this goes over the next few months and years, but we hope and believe it will be of enormous benefit in fighting crime.
Does my hon. Friend not accept the view of the surveillance camera commissioner, who has said that the guidelines are insufficient at present and there is no transparency? Do the Government plan to update the guidelines to take account of developments in technology?
I am grateful to my right hon. Friend for his question, which points to the heart of the matter. As he knows, there is a facial recognition and biometrics board, which is soon to have a new chair. As part of that renewal of leadership, we will review the board’s terms of reference and its mission, especially in the light of technological developments. What emanates from that, and whether it is a change in the terms of the code, we will have to wait and see, but as I said at the start, I am very aware of the duty we have in this House to strike the right balance between security and liberty.
The approach of trying it out and seeing how it goes is exactly the wrong way to maintain public trust. Many of my constituents use King’s Cross railway station, and last year they discovered that they were, in effect, being spied on. The legal framework is not in place. When even the head of Google is saying we should move more slowly, because we need to keep the public with us, is it not right that we follow the example of the European Union and put it on pause while we work out the right way to proceed?
No, it is not right. The hon. Gentleman is incorrect to say that there is no legal framework, and in saying that he disagrees with the High Court, which only last year found in a case that there was such a framework and that the police could therefore roll the technology out. The Information Commissioner looked at this and issued a report, and the Met has adopted many of the recommendations of that report. Like every development in crime fighting, the technology is not static; we have to be agile and sensitive to its use. For example, the past 100 years have seen enormous developments in fingerprint technology—in detection and retrieval and in the identification of individuals using fingerprints. We keep fingerprints in a way that we do not keep facial recognition information, and there are good reasons for that, but these things should be kept under review at all times, and that is what we intend to do with LFR.
Whether it is county lines gangs or cyber-fraudsters, we know that criminals are using technology to spread crime. People expect us to ensure that our police can use the best technology to tackle crime. Will the Government work with expert organisations such as the Ada Lovelace Institute on ensuring that we develop world-class ethics governing how best to use technology to tackle crime?
Of course we want to maintain public confidence in the use of the technology, and that means that we have to be as transparent as possible about both its deployment and the results obtained from it, but we must get this in proportion. Those who believe that the technology should not be used at all must ask themselves why we publicise the faces of wanted criminals on programmes such as “Crimewatch”, and use the wisdom of crowds to identify criminals as quickly as possible. There are circumstances where the police have a duty to try to find people quickly, effectively and efficiently, and this will help them to do that.
We are aware that facial recognition is used in Xinjiang in China for mass oppression through mass surveillance. People who oppose war or the climate crisis are concerned that their assembly will be systematically recorded and used, or misused, against them—that liberty will be oppressed in the name of security. What assurances can the Minister give to people who want legally to participate in such assemblies that we will not go down the road of mass surveillance and oppression under a new, more authoritarian regime?
As I understand it, the use of this technology in such circumstances would be illegal, and we are the guardians of what is legal in this country.
In the age of smartphones, automated number recognition and especially CCTV, is it not already virtually impossible to preserve one’s privacy when one is out in public? As it is only a matter of time before CCTV becomes pin-sharp, is it not inevitable that this technology cannot be stopped, because we are already going to be recorded on systems that will provide exactly the same technique for identifying people for whom the authorities are rightly searching?
It is definitely the case that in a world where identification technology of all types is accelerating, one of the challenges we face is the preservation of our privacy, and there have been many debates in this House and in the public realm about how we do that. We believe that we have a good, strong and transparent framework in which data can be gathered legally but then kept private, and through which individuals can seek their own privacy by way of the deletion or amendment of data. As I said earlier, we are the guardians of the system. This House is the crucible in which the decisions are made, so we must look sharp about it and not assume that these technological developments are outwith our control.
Congratulations on your election to the Chair, Madam Deputy Speaker.
Has the Minister seen the concerns raised by the think-tank Future Advocacy that the deployment of this technology may infringe upon the rights of Muslim women who wear the niqab, and wider concerns about technology being less accurate, particularly with women and ethnic minorities?
I understand that that specific issue has been raised with the Metropolitan police, and they have made it clear that nobody will be required to remove their niqab or other facial coverings. It is worth remembering what the police are seeking to do with this deployment. They are looking for wanted criminals, suspects in crimes, and possibly missing persons. When the system makes a match, it is then for a human being to decide whether intervention is proportionate or not. It is not a kind of conveyor belt. Human judgment is still required, as it will always be in sensitive and proportionate policing.
There are clearly data privacy and human rights issues bound up with facial recognition technology, which I admit will be very useful for solving crimes. However, technology moves on quickly, and it is my understanding that bodily recognition is already being developed, in which faces will not actually count as the cameras will look at people’s movement. Are we not just behind the curve on all this? As a Parliament, should we not be looking to put in place a framework that will envelop all the new technologies as they move on, rather than being one step behind? I think we should be doing a little bit more, proactively.
My hon. Friend raises an extremely important and useful point. He is quite right that the acceleration of technology needs to be embraced by the House in a way that perhaps it has not been in the past. Both he and I stood on a manifesto that contained a commitment to the enabling of technology in a strict and controlled legal framework, and we will be thinking about that over the next few months. Some years ago, I came across a company that was working on online financial security. It had a system that identified someone not only from their password when they entered it, but from the way in which that person typed their password, because apparently the way we type is very characteristic. Those are the sorts of technologies we can deploy to great effect, but with democratic control.
This technology is potentially a very powerful tool to fight crime, including serious crimes like knife crime where deprived and minority ethnic communities are, sadly, disproportionately likely to be the victims. It could also help to clear up cases like the awful recent murder and aggravated burglary in my constituency. However, will the Minister reassure the House that we will use this powerful new technology only in a proportionate way?
I can absolutely give that assurance. The police, who are of course operationally independent and have devised the system themselves, have reassured me that there is, first, no mass retention of movement data. As I say, if there is no match on the system, the record of someone’s presence in the area is instantly deleted, and any other data is deleted after 31 days unless it is needed as evidence. There is no intention that we should use this other than for the apprehension of the most serious and violent criminals, which, as my hon. Friend says, will bring benefits across the country.
The usual prize—thank you, Madam Deputy Speaker.
The hon. Member for Newcastle upon Tyne Central (Chi Onwurah) made an important point. The embedding of bias in technology is a major issue that will worsen with the early widespread adoption of artificial intelligence. The Government will inherit these biases as a user of these technologies, so will my hon. Friend, noting that the American studies show that the disproportionality of false recognition for ethnic minority women was between 10 and 100 times that for Caucasians, look seriously at how those technologies are improving as he progresses the adoption of this technology?
Of course I will. I recognise the possible controversy that my hon. Friend points to. As I say, in the trials and deployments thus far there is no evidence of bias either way that we can see, but in a world where technology is to come under democratic control, we all have a duty to watch for these unintended consequences and correct them when they occur—and he has my undertaking that we will do exactly that.