Security and Policing: Facial Recognition Technology Debate

Department: Home Office


Thursday 1st March 2018

Lords Chamber
Asked by
Baroness Jones of Moulsecoomb

To ask Her Majesty’s Government what proposals they have for the use of facial recognition technology in security and policing.

Baroness Jones of Moulsecoomb (GP)

My Lords, I congratulate noble Peers on their fortitude and stamina in being here today. My partner and I have to get to Dorset for our 20th anniversary party—it would not be the same without us; our guests would probably miss us—and it is hard enough for me to be here. I also express my utmost gratitude to Silkie Carlo and the NGO Big Brother Watch, who have supported me and others in preparing for this important debate.

I have asked questions about this issue before but the answers were not satisfactory. I have therefore brought this debate before the House simply because I believe that the use of automated facial recognition technology represents a turning point in our civil liberties and human rights in the UK. It has barely been acknowledged anywhere that this could be a problem, and that is the reason for today’s debate.

I do not doubt that, used appropriately, it will provide many opportunities to solve crimes, just as DNA research has done over the past few decades. However, we are currently faced with an unregulated and frankly terrifying mess, which uses data illegally and disproportionately interferes with our fundamental human rights. The current system—or, more correctly, the lack of a current system—means that there is no law, no oversight and no policy regulating police use of automated facial recognition. The limited trials that we know about have shown that it can be completely ineffective and potentially discriminatory.

The truth is that we are being watched all the time. People have had concerns about CCTV for some time but now it is beginning to recognise and identify us. The purpose of today’s debate is to understand how much we are being watched and automatically identified. I want to know how that is being governed—if it is being governed at all—and what legislative frameworks need to be put in place to properly regulate facial recognition.

In response to today’s debate, I call on the Government to do two things. First, they should place an immediate ban on police forces using automated facial recognition with surveillance cameras. The reasons for this will become clear during my speech. Secondly, I call on the Government to automatically remove the thousands of images of unconvicted individuals from the police’s custody image database. I will come back to this, but it is nearly six years since a court ruling said that the current system is illegal, so I am not sure how we are still using it.

Automated facial recognition uses technology to identify people in real time against pictures stored in a database. South Wales Police has been leading on its deployment and testing in the UK, funded by a £2 million grant from the Home Office. It has used it at a whole range of sports events, concerts and shopping centres. The Metropolitan Police has also used facial recognition technology at a number of events, including Remembrance Sunday and the Notting Hill Carnival. Very little information has been released to the public on the accuracy and reliability of these tests, but the anecdotal evidence from police using it at Notting Hill Carnival was that they had 35 false positives, with only one positive match. Some five people were asked to prove their identity to the police, having been flagged up on the computer, all of whom turned out to be innocent. Big Brother Watch itself saw two people wrongly identified by the computer, which had matched them with the police records of two men.

It is not looking like a great start. The Government cannot stand idly by while allowing this intrusion into individuals’ rights and identities. Big concerns about equality issues arise from this technology, particularly the risk of misidentifying people of colour, who are already disproportionately affected by policing tactics. If these concerns turn out to be correct, that will be another legal challenge in the pipeline.

If the public are ever to trust the use of this technology, it must be subject to the highest standards. We need the results of these tests to be made public and subject to rigorous scrutiny. As far as I can tell, there is absolutely no legal or regulatory framework governing how the police use automatic facial recognition. I hope the Minister can give me a straight answer on that. At the moment, it seems the Government are letting police forces get on with it as an operational matter. This is clearly not just an operational matter. We have rules about road signs, speed cameras and gathering evidence, so why would we not have rules about how the police use something as potentially intrusive as facial recognition? There is a regulatory gap here that must be filled.

I am very concerned that this technology is being used with a database full of illegal images of innocent people—I include myself in that number. It seems that the facial recognition technology is using the police national database, which contains the images of tens of thousands of people who were never charged with or convicted of an offence. It is six years since the High Court ruled that the policy of retaining the mugshots of innocent people was unlawful, but the police still do it and they still upload them to the police national database. The Government’s solution in 2017 was to allow individuals to write to the police, asking for their images to be deleted. That is just not good enough. My pictures will be on the database, along with those of hundreds of other people who have been arrested at peaceful, perfectly lawful protests and never charged with an offence. So will the pictures of people whose charges were dropped, who were wrongly accused or who were found not guilty by a jury of their peers. No one chooses to have their photograph taken by the police; it is extracted under coercion.

The burden should be on the police to delete the images of everyone who has not been found guilty of an offence. I ask the Minister whether the Home Office will take immediate steps to automatically delete the images of every single innocent person from the police national computer and prevent the database being used for facial recognition until it no longer contains images of innocent people. Will she also inform the House whether other sources of personal images, such as driving licence and passport databases, are available for use by the police and the security services?

I turn my attention to the security services and, in doing so, I extend my respect and gratitude to the NSA whistleblower Edward Snowden—a true hero of our times. Among his revelations was a GCHQ programme called Optic Nerve. It is alleged that millions of innocent people were spied on through their webcams to experiment with facial recognition. Parliament has since passed laws that make bulk surveillance and interception lawful, so it seems that we are moving towards more, rather than less, of this kind of mass surveillance. I would appreciate the Minister informing the House about the security services’ use of facial recognition technology, and ask her not to hide behind the cloaking words of “national security”. I am not asking for details; I am asking for process.

There are very real concerns about the use of mass surveillance and facial recognition technology; we are moving into the kind of territory that even George Orwell could not have imagined. Whistleblowers such as Edward Snowden are being persecuted, when we should really be offering them political asylum for their heroism in exposing these nefarious, illegal schemes. We must look hard at this issue now; millions of pounds of taxpayers’ money are already being spent on deploying such systems in south Wales, London and beyond. I do not want us to come back to this in a few years’ time only to be told, “The police have invested far too much money already for us to start making changes”.

It is easy to write this issue off by saying that it is not about privacy because everyone has their face out in public anyway, but that is to look at it from the wrong end. Our faces are now being used like fingerprints and DNA, but the difference is that our faces are so obvious and public that it makes the intrusion into our private lives all the greater. If the police were taking our fingerprints and DNA at sports events, carnivals and remembrance parades, it would cause great discomfort and concern. We should be no less discomfited and concerned about their automatically scanning and identifying our faces.

It occurred to me that we could perhaps use this technology ourselves, here in this House. We could have a facial recognition camera over the doors so that we would not have to be given a little tick by doorkeepers. Perhaps the Minister would like to consider that and see whether Members of the House like it.

I reiterate my call on the Government immediately to ban the use of automatic facial recognition and to clean the police national computer of all images of innocent people.

--- Later in debate ---
The Minister of State, Home Office (Baroness Williams of Trafford) (Con)

I thank the noble Lord, Lord Kennedy, for that and thank the noble Baroness, Lady Jones of Moulsecoomb, for bringing forward this debate on a very important issue, now and in the future. I start by stressing the importance the Government place on giving law enforcement the tools it needs to prevent terrorism and cut crime. However, it is also important to build public trust in our use of biometrics, including the use of facial images and facial recognition technology.

Biometric data is of critical importance in law enforcement, and various forms and uses of biometric data have an increasingly significant role in everyday life in the UK. However, the technology is of course changing rapidly. The noble Lord, Lord Kennedy, talked about gait analysis technology, voice technology and other types of technology that are rapidly emerging. We are committed to producing a framework that ensures that organisations can innovate in their use and deployment of biometric technologies, such as facial recognition, and do so, crucially, in a transparent and ethical way. Noble Lords have talked about ethics in this as well. Maintaining public trust and confidence is absolutely key; achieving this involves a more open approach to the development and deployment of new technologies. We remain committed to ensuring that our use of biometrics, including those provided to law enforcement partners, is legal, ethical, transparent and robust.

In answer to the point made by the noble Lord, Lord Evans of Weardale, we will publish the Home Office biometrics strategy in June this year, as I outlined to the Science and Technology Committee. The strategy will address the use of facial recognition technology. There is ongoing work to implement last year’s custody images review, which provides a right to request deletion, and we are planning improvements to the governance of police use of custody images and facial recognition technology.

Automatic facial recognition, or AFR, is a rapidly evolving technology with huge potential, as the noble Lord, Lord Evans, and others powerfully illustrated. There have been some suggestions that there is no guidance on police use of AFR. The Home Office has published the Surveillance Camera Code of Practice, which sets out the guiding principles for striking a balance between protecting the public and upholding civil liberties. The noble Lords, Lord Kennedy and Lord Evans, and the right reverend Prelate the Bishop of St Albans all pointed this out, as did others. Police forces are obliged under the Protection of Freedoms Act—POFA—to have regard to this code. Similarly, the Information Commissioner’s Office has issued a code of practice, which explains how data protection legislation applies to the use of surveillance cameras and promotes best practice. However, to address the point of the noble Lord, Lord Scriven, we believe that more can be done to improve governance around AFR and we are discussing options for doing this with the commissioners and the police. I am very pleased to see the really good practice already being followed in this area, such as the work being done by South Wales Police, which I will go into in a bit more detail in a few minutes. We are working to ensure that this is consistently applied across all areas by tightening up our oversight arrangements of AFR.

The noble Baroness, Lady Jones of Moulsecoomb, and others talked about the retention of custody images and whether that was illegal, following the 2012 High Court ruling. The noble Lord, Lord Paddick, also alluded to this. The Police and Criminal Evidence Act 1984 gives police the power to take facial photographs of anyone detained following arrest. The regime governing the retention of custody images is set out in the Code of Practice on the Management of Police Information and statutory guidance contained in the College of Policing’s authorised professional practice. The Police Act 1996 requires chief officers to have regard to such codes of practice. In addition, the Information Commissioner and Surveillance Camera Commissioner promote their respective codes of practice.

Following the custody images review, people who are not subsequently convicted of an offence may request that their custody image be deleted from all police databases, with a presumption that it will be deleted unless there is an exceptional policing reason for it to be retained, such as if an individual has known links to organised crime or terrorism. Assuming that the noble Baroness, Lady Jones, has links to neither—

Baroness Williams of Trafford

Not yet—you heard it first at the Dispatch Box. I suggested some months ago that the noble Baroness should request that her image be removed. I am assuming that she has now done so and that, therefore, it is in the process of being removed. But the police should automatically review all the custody images of convicted people that they hold, in line with scheduled review periods set out in the College of Policing’s Authorised Professional Practice to ensure that they retain only those that they need to keep.

On the point about illegality suggested by a couple of noble Lords, the court did not rule that there was an issue with applying facial recognition software to legitimately retained images. Following the custody images review, we are clear that unconvicted people have the right to apply for the deletion of their image, with a presumption in favour of deletion. However, the police, as I said, have the right to retain an image in the cases that I outlined.

The noble Baroness, Lady Jones, and the noble Lord, Lord Scriven, talked about oversight. This is a very good question which was brought out by the Science and Technology Committee. Noble Lords also talked about the Biometrics and Forensics Ethics Group. In line with the recommendations of the triennial review of the Home Office science bodies, the Biometrics and Forensics Ethics Group’s remit has been extended to cover the ethical issues associated with all forensic identification techniques, including, but not limited to, facial recognition technology and fingerprinting. The Government are exploring the expansion of oversight of facial recognition systems. They are also seeking to establish an oversight board to enable greater co-ordination and transparency on the use of facial recognition by law enforcement. Noble Lords will not be surprised to hear that we are consulting with stakeholders such as the NPCC, the Surveillance Camera Commissioner, the Information Commissioner and the Biometrics Commissioner.

Noble Lords mentioned two specific instances: the Notting Hill Carnival and the South Wales Police pilot. I think that I have time to talk about both. In 2016-17, when facial recognition technology was piloted at the Notting Hill Carnival, the Metropolitan Police published details of the pilot on its website. That is in keeping with its being a pilot and with the importance of the police letting people know about it. The public were informed that the technology involved the use of overt—not covert—cameras, which scan the faces of those passing by and flag up potential matches against a specific database of custody images, and that the database had been populated with about 500 images of individuals who were forbidden to attend the carnival, as well as individuals wanted by the police who, it was believed, might attend the carnival to commit offences. I must stress that this system does not involve a search against all images held on the police national database or the Met’s systems. The public were also advised that if a match was made by the system, officers would be alerted and would seek to speak to the individuals to verify their identity, making arrests if necessary. I think that it was the noble Lord, Lord Paddick, who talked about mismatches with BME people, even between men and women. That goes back to the point that this is evolving technology and in no way would it be used at this point other than in a pilot situation.

South Wales Police took a very proactive approach to communications in its pilot. In addition to the more formal press briefing notices, it used social media in the form of YouTube and Facebook to explain the technology to the public and publicise its deployment—and, most importantly, it published the results. In its publicity, South Wales Police has been very aware of concerns about privacy and has stressed that it has built checks and balances into its methodology to make sure that the approach is justified and balanced. It consulted the Biometrics Commissioner, the Information Commissioner and the Surveillance Camera Commissioner, all of whom are represented on the South Wales Police automatic facial recognition strategic partnership board, and gave them the opportunity to comment on the privacy impact assessment that was carried out in relation to the pilot. This resulted in a very positive press response to the pilot. The force also published a public round-up of six months of the pilot on its Facebook page.

I turn now to the privacy impact assessment, or PIA, which links to that point. The noble Lord, Lord Scriven, asked whether the Government had carried one out. I can confirm that the Home Office biometrics programme carried out privacy impact assessments on all of its strategic projects to ensure that they maximised the benefits to the public while protecting the privacy of individuals and also addressed any potential impact of data aggregation.

The noble Lords, Lord Harris and Lord Kennedy, asked about arrangements for the storage of images. The Police National Database is based in the UK. Images are taken from custody systems run by each police force and then loaded on to the PND.

The noble Baroness, Lady Jones, asked whether passport and driving licence photos were available to police. They are not used by the police when deploying facial recognition technology. They may be used under specific conditions for other policing purposes.

I thank noble Lords once again for their participation in this debate and thank the noble Baroness, Lady Jones.