Surveillance Camera Code of Practice Debate

Department: Home Office

Surveillance Camera Code of Practice

Wednesday 2nd February 2022


Lords Chamber
Lord Anderson of Ipswich (CB)

My Lords, as expectations of privacy are lower in public places than at home, overt surveillance, such as by street cameras, is generally seen as a lesser intrusion into our liberties than either covert surveillance by intelligence agencies—the subject of my 2015 report, A Question of Trust—or so-called surveillance capitalism, the monitoring and monetising of our personal data by big tech. However, that assessment has been cast into doubt by automatic facial recognition and similar technologies, which potentially enable their users to put a name to every person picked up by a camera, to track their movements and to store images of them on vast databases that can be efficiently searched using AI-driven analytics.

Those databases are not all owned by the police: the company Clearview AI has taken more than 10 billion facial images from publicly available web sources and boasts on its website that its database is available to US law enforcement on a commercial basis. This technology, part of the information revolution in whose early stages we now find ourselves, can no more be stopped than, two centuries ago, could the steam engine, but, as has been said, the abuses of overt surveillance are already obvious in the streets of China and Hong Kong. To show the world that we are better, we must construct for those who wish to use these powers, as our forebears did in the Industrial Revolution, a democratic licence to operate.

We start in this country with a number of advantages. We have a strong tradition of citizen engagement and, as the noble Lord, Lord Alton, said, a culture of policing by consent. We inherited strong data protection laws from the EU and we still have legislation that gives real protection to human rights. We even had—almost uniquely in the world—a Surveillance Camera Commissioner, Tony Porter. I pay tribute to the extraordinary work that he did, on a part-time basis and without any powers of inspection, audit or sanction, including the issue of a 70-page document with detailed recommendations for police users of this technology.

I regret that the Surveillance Camera Code of Practice is, by comparison, a slim and highly general document. It is not comparable to the detailed codes of practice issued under the Investigatory Powers Act 2016 and overseen by the world-leading Investigatory Powers Commissioner’s Office. The designated bodies which must have regard to it are confined to local authorities and policing bodies; they do not include, as the noble Lord, Lord Clement-Jones, said, health, education or transport providers, private operators or, indeed, the Government themselves. Consultation on the latest version made no attempt to involve the public but was limited to statutory consultees.

The recent annual report of Tony Porter’s impressively qualified but thinly spread successor, the Biometrics and Surveillance Camera Commissioner, Fraser Sampson, commented that his formal suggestions for the code were largely dismissed as being “out of scope”. He added:

“That my best endeavours to get even a sentence reminding relevant authorities of the ethical considerations were rejected on the grounds that it would be too burdensome is perhaps an indication of just how restrictive this scope—wherever it is to be found—must have been.”


I do not know whether the highly general provisions of the code will be effective to improve local policies on the ground and ensure the consistency between them that my noble and learned friend Lord Etherton and his colleagues gently pointed out was desirable in their judgment in the Bridges case. In the absence of an IPCO-style inspection regime, perhaps we never will know. I suspect that the need not to stifle innovation, advanced in the code as a justification for its brevity, is a less than adequate excuse for the failure to do more to develop the code itself against a changing legal and technological background.

The words of the Motion are harsher than I would have chosen but, as the Snowden episode a few years ago showed, public trust in these increasingly intrusive technologies can be suddenly lost and requires huge effort to regain. I hope that the next revision of this code will be more energetic and ambitious than the last.

Baroness Falkner of Margravine (CB)

My Lords, it is a pleasure to follow three incredibly distinguished speakers in this debate. With reference to the remarks of the noble Lord, Lord Clement-Jones, attributed to the Minister, I must say that if this is a subject for geeks, I am delighted to join the band of geeks.

I fear I shall demonstrate a level of ignorance tonight, because I am a newcomer to the debate. In fact, I emailed the noble Lord, Lord Clement-Jones, earlier today because I had only just realised that it was taking place tonight. I am also speaking in a hybrid capacity—I now understand the true meaning of “hybrid”—so my opening remarks will be personal, but for those that follow, I will need to declare an interest, so I shall do so in advance of making those remarks.

In my opening remarks I have to say just a few things that demonstrate what a parlous state we are in as a country in terms of respect for human rights. The level of permissiveness in the capture—state capture, policy capture—of institutions that operate in authoritarian regimes, a list of which the noble Lord, Lord Alton, has given us, is truly staggering. We bang on about how fantastic our sanctions regime is, and so on, yet these companies, many of them Chinese, as the noble Lord described, operate here with complete impunity and we seem entirely content to allow them to do so, while we also recognise, in our foreign policy statements, that some of these countries have very ignoble intentions towards any freedom-loving democracy. I know the noble Baroness represents the Home Office, but I hope it is something the Government at large will take account of, because commercial surveillance, commercial espionage, commercial authority and commercial capture of the economy are all things we need to be incredibly vigilant about. One needs only to look at Russia’s capture of the German political debate, through Nord Stream 2, and what we are facing now with the Ukraine issue, to understand what is being discussed here by the noble Lord, Lord Alton.

Those are my general remarks. My remarks on it as chair of the Equality and Human Rights Commission now follow. There, I have to say to the noble Lord, Lord Clement-Jones, that I am so relieved he managed to secure this regret Motion. Articles 8, 9, 10, 11 and 14—the general article against discrimination—of the European Convention on Human Rights are engaged in this, so the fact that we get a document as thin as this is truly remarkable. I understand why only statutory bodies were consulted—it was a means for the Government to get it through in six weeks without having to address wider concerns—but it is regrettable. The Bridges case directly engaged the public sector equality duty. The Equality and Human Rights Commission is the regulator of the public sector equality duty, yet the idea that it was not consulted, post the judgment, on how we might strengthen the code in light of that judgment is a matter of deep regret to me.

I have a couple of points on the code. In paragraph 10.4 we are told that effective review and audit mechanisms should be published regularly. The summary of such a review has to be made available publicly, so my question to the noble Baroness is: why only a summary? In the interests of transparency and accountability, it is essential that these bodies regularly give a full explanation of what they are doing. The public sector equality duty requires legitimate aims to be addressed objectively, verifiably and proportionately. We, the public, will not be capable of assessing whether those tests have been met if there is only an executive summary to go by.

My other point concerns section 12.3, “When using a surveillance camera” and so on. The third bullet point requires “having due regard” and states that

“chief police officers should … have regard to the Public Sector Equality Duty, in particular taking account of any potential adverse impact that the LFR algorithm may have on members of protected groups.”

Again, no practical examples are provided in this rather thin document. We know from publishing statutory codes that the public, and even the bodies that use this technology, want practical examples. A code is effective, of value and of use, to the providers as well as the public, only when it gives those practical examples, because you cannot test the legal interpretation of those examples until you have that evidence before you.

We, the EHRC, have been unable at short notice to assess whether the code is in compliance with the Bridges judgment—I wonder, myself, whether it is—but we do not take a clear position on the legality of the revised code, and I should say that by way of clarification. However, we have recommended previously that the Government scrutinise the impact of any policing technologies, in particular for the impact on ethnic minorities, because we have a mountain of evidence piling up to say that they discriminate against people of darker skin colour.

We wanted mandatory independent equality and human rights impact assessments. These should ensure that decisions regarding the use of such technologies are informed by those impact assessments and the publication of the relevant data—this takes me back to my point about executive summaries—and then evaluated on an ongoing basis, and that appropriate mitigating action is taken through robust oversight, including the development of a human rights compliant legal, regulatory and policy framework. That is in conformity with our role as a regulator. We have recommended that, in light of evidence regarding their inaccuracy, and potentially discriminating impacts, the Government review the use of automated facial recognition and predictive programs in policing, pending completion of the above independent impact assessments and consultation processes, and the adoption of appropriate mitigation action. We await action from the Government on the basis of this recommendation.