
Surveillance Camera Code of Practice

Debate between Lord Alton of Liverpool and Lord Clement-Jones
Wednesday 2nd February 2022


Lords Chamber
Lord Clement-Jones (LD)

My Lords, I have raised the subject of live facial recognition many times in this House and elsewhere, most recently last November, in connection with its deployment in schools. Following an incredibly brief consultation exercise, timed to coincide with the height of the summer holidays, the Government laid an updated Surveillance Camera Code of Practice, pursuant to the Protection of Freedoms Act 2012, before both Houses on 16 November last year; it came into effect on 12 January 2022.

The subject matter of this code is of great importance. The last Surveillance Camera Commissioner did a survey shortly before stepping down, and found that there are over 6,000 systems and 80,000 cameras in operation across 183 local authorities. The UK is now the most camera-surveilled country in the western world. According to recently published statistics, London remains the third most surveilled city in the world, with 73 surveillance cameras for every 1,000 people. We are also faced with a rising tide of the use of live facial recognition for surveillance purposes.

Let me briefly give a snapshot of the key arguments why this code is insufficient as a legitimate legal or ethical framework for the police’s use of facial recognition technology and is incompatible with human rights requirements surrounding such technology. The Home Office has explained that changes were made mainly to reflect developments since the code was first published, including changes introduced by legislation such as the Data Protection Act 2018 and those necessitated by the successful appeal of Councillor Ed Bridges in the Court of Appeal judgment on police use of live facial recognition issued in August 2020, which ruled that South Wales Police’s use of AFR—automated facial recognition—had not in fact been in accordance with the law on several grounds, including in relation to certain convention rights, data protection legislation and the public sector equality duty.

During the fifth day in Committee on the Police, Crime, Sentencing and Courts Bill last November, the noble Baroness, Lady Williams of Trafford, the Minister, described those who know about the Bridges case as “geeks”. I am afraid that does not diminish its importance to those who want to see proper regulation of live facial recognition. In particular, the Court of Appeal held in Bridges that South Wales Police’s use of facial recognition constituted an unlawful breach of Article 8—the right to privacy—as it was not in accordance with law. Crucially, the Court of Appeal held that certain bare minimum safeguards were required before the question of lawfulness could even be considered.

The previous surveillance code of practice failed to provide such a basis. The updated version still fails to meet the necessary standards: the code allows wide discretion to individual police forces to develop their own policies on facial recognition deployments, including the categories of people included on a watch-list and the criteria used to determine when to deploy. There are but four passing references to facial recognition in the code itself. Such scant guidance cannot be considered a suitable regulatory framework for the use of facial recognition.

There is, in fact, no reference to facial recognition in the Protection of Freedoms Act 2012 itself or indeed in any other UK statute. There has been no proper democratic scrutiny over the code and there remains no explicit basis for the use of live facial recognition by police forces in the UK. The forthcoming College of Policing guidance will not satisfy that test either.

There are numerous other threats to human rights that the use of facial recognition technology poses. To the extent that it involves indiscriminately scanning, mapping and checking the identity of every person within the camera’s range—using their deeply sensitive biometric data—LFR is an enormous interference with the right to privacy under Article 8 of the ECHR. A “false match” occurs where someone is stopped following a facial recognition match but is not, in fact, the person included on the watch-list. In the event of a false match, a person attempting to go about their everyday life is subject to an invasive stop and may be required to show identification, account for themselves and even be searched under other police powers. These privacy concerns cannot be addressed by simply requiring the police to delete images captured of passers-by or by improving the accuracy of the technology.

The ECHR requires that any interference with the Article 10 right to freedom of expression or the Article 11 right to free association is in accordance with law and is both necessary and proportionate. The use of facial recognition technology can be highly intimidating. If we know our faces are being scanned by the police and that we are being monitored when using public spaces, we are more likely to change our behaviour, altering where we go and whom we choose to associate with.

Article 14 of the ECHR ensures that no one is denied their rights because of their gender, age, race, religion or beliefs, sexual orientation, disability or any other characteristic. Police use of facial recognition gives rise to two distinct discrimination issues: bias inherent in the technology itself and the use of the technology in a discriminatory way.

Liberty has raised concerns regarding the racial and socioeconomic dimensions of police trial deployments thus far—for example, at Notting Hill Carnival for two years running, as well as twice in the London Borough of Newham. The disproportionate use of this technology in communities against which it “underperforms”—according to its proponents’ own standards—is deeply concerning.

As regards inherent bias, a range of studies has shown that facial recognition technology disproportionately misidentifies women and BAME people, meaning that people from these groups are more likely to be wrongly stopped and questioned by police and to have their images retained as the result of a false match.

The Court of Appeal determined that South Wales Police had failed to meet its public sector equality duty, which requires public bodies and others carrying out public functions to have due regard to the need to eliminate discrimination. The revised code not only fails to provide any practical guidance on the public sector equality duty but also, given the inherent bias within facial recognition technology, fails to emphasise the rigorous analysis and testing that the duty requires.

The code itself covers no one other than the police and local authorities; it does not extend to Transport for London, central government or private users, where there have also been concerning developments in the use of police data. For example, it was revealed that the Trafford Centre in Manchester scanned the faces of every visitor, approximately 15 million people, over a six-month period in 2018, using watch-lists provided by Greater Manchester Police. LFR was also used at the privately owned but publicly accessible site around King’s Cross station. Both the Met and the British Transport Police had provided images for their use, despite originally denying doing so.

It is clear from the current and potential future human rights impact of facial recognition that this technology has no place on our streets. In a recent opinion, the former Information Commissioner took the view that South Wales Police had not ensured that a fair balance had been struck between the strict necessity of the processing of sensitive data and the rights of individuals.

The breadth of public concern around this issue is growing clearer by the day. Several major cities in the US have banned the use of facial recognition, and the European Parliament has called for a ban on police use of facial recognition technology in public places and on predictive policing. In response to the Black Lives Matter uprisings in 2020, Microsoft, IBM and Amazon announced that they would cease selling facial recognition technology to US law enforcement bodies. Facebook, now Meta, recently announced that it will shut down its facial recognition system and delete the “face prints” of more than a billion people after concerns were raised about the technology.

In summary, it is clear that the Surveillance Camera Code of Practice is an entirely unsuitable framework to address the serious rights risk posed by the use of live facial recognition in public spaces in the UK. As I said in November in the debate on facial recognition technology in schools, the expansion of such tools is a

“short cut to a widespread surveillance state.”—[Official Report, 4/11/21; col. 1404.]

Public trust is crucial. As the Biometrics and Surveillance Camera Commissioner said in a recent blog:

“What we talk about in the end, is how people will need to be able to have trust and confidence in the whole ecosystem of biometrics and surveillance”.


I have on previous occasions, not least through a Private Member’s Bill, called for a moratorium on the use of LFR. In July 2019, the House of Commons Science and Technology Committee published a report entitled The Work of the Biometrics Commissioner and the Forensic Science Regulator. It repeated a call made in an earlier 2018 report that

“automatic facial recognition should not be deployed until concerns over the technology’s effectiveness and potential bias have been fully resolved.”

The much-respected Ada Lovelace Institute has also called for

“a voluntary moratorium by all those selling and using facial recognition technology”,

which would

“enable a more informed conversation with the public about limitations and appropriate safeguards.”

Rather than update toothless codes of practice to legitimise the use of new technologies like live facial recognition, the UK should have a root and branch surveillance camera review which seeks to increase accountability and protect fundamental rights. The review should investigate the novel rights impacts of these technologies, the scale of surveillance we live under and the regulations and interventions needed to uphold our rights.

We were reminded by the leader of the Opposition on Monday of what Margaret Thatcher said, and I also put this to the Minister earlier this week:

“The first duty of Government is to uphold the law. If it tries to bob and weave and duck around that duty when it’s inconvenient, if Government does that, then so will the governed and then nothing is safe—not home, not liberty, not life itself.”


It is as apposite for this debate as it was for that debate on the immigration data exemption. Is not the Home Office bobbing and weaving and ducking precisely as described by the late Lady Thatcher?

Lord Alton of Liverpool (CB)

My Lords, the noble Lord, Lord Clement-Jones, has given an eloquent exposition of the reasons for supporting his Motion of Regret. The Motion refers to the ethical and human rights considerations that attach to the use of surveillance camera technology, and it is to those two considerations that I shall address my remarks. I especially draw the Minister’s attention to the Amnesty International report of 3 June 2021 about the use of surveillance technology in New York, to which the noble Lord referred, and to the serious civil liberty questions that that report raised. Concerns about Chinese technology in Belgrade were raised in Japan’s Yomiuri Shimbun on 28 December and in the Financial Times on 10 June; the Asia News Monitor carried a report from Thailand in November 2021 about mass surveillance against Uighurs in Xinjiang; and in the Telegraph of 1 December the head of MI6, Richard Moore, said that

“technologies of control … are increasingly being exported to other governments by China—expanding the web of authoritarian control around the planet”.

It is not just control—it is also a keystone in the export of truly shocking crimes against humanity and even genocide. Just a week ago, we marked Holocaust Memorial Day, on which many colleagues from across the House signed the Holocaust Memorial Day book or issued statements recommitting never to allow such a genocide to happen again. Yet, sadly, in 2022, as the Foreign Secretary has said, a genocide against the Uighur Muslims is taking place in Xinjiang. As I argued in our debate on Monday, we are doing far too little to sanction those companies that are actively involved, or to regulate and restrict the facial recognition software that has allowed the Chinese state to incarcerate and enslave more than a million Uighurs.

In the 1940s, we did not allow IBM’s machines, or other tools of genocide used in Nazi Germany and manufactured by slave labour in factories and concentration camps, to be sold widely in the United Kingdom. Today we find ourselves in the perverse situation of having Chinese surveillance cameras with facial recognition software in use in government departments, hospitals, schools and local councils, as well as in shops such as Tesco and Starbucks. It is an issue that I doggedly raised during our debates on the telecommunications Bills that have recently been before your Lordships’ House. As I said in those debates, a series of freedom of information requests in February 2021 found that more than 70% of local councils use surveillance cameras and software from either Dahua Technology or Hikvision, companies rightly subject to United States sanctions for their involvement in the development and installation of technology and software that target Uighur Muslims. Nevertheless, these companies are free to operate in the United Kingdom.

So much for co-ordinating our response with our Five Eyes allies, which was the subject of one amendment that I laid before your Lordships’ House. Far from being a reputable or independent private company, Hikvision is more than 42% owned by Chinese state-controlled enterprises. According to Hikvision’s accounts for the first half of 2021, the company received RMB 223 million in state subsidies, while working hand in glove with the authorities in Xinjiang, having signed five public-private partnerships with them since 2017. Perhaps just as disturbing are recent reports in the Mail on Sunday that Hikvision received up to £10,000 per month in furlough money from United Kingdom taxpayers from December 2020 until February 2021. How can it be right that, at a time when the US Government are sanctioning Hikvision for its links to Uighur concentration camps, the UK Government are giving it taxpayer money in Covid furlough funds?

It is clear that the introduction and use of this type of facial recognition technology by the police need substantial regulation and oversight, especially because of the dominance of sanctioned Chinese companies in the UK surveillance market. Hikvision alone holds nearly 20% of the global surveillance camera market and is working hard to penetrate and dominate the UK surveillance technology sector. In May 2021, it launched a consultant support programme and demonstration vehicles so that it could bring its technology

“to all parts of the United Kingdom”.

In October, it became a corporate partner of the Security Institute, the UK’s largest membership body for security professionals, and it has launched a dedicated UK technology partner programme. All of this deserves further investigation by our domestic intelligence services.