Technology Rules: The Advent of New Technologies in the Justice System (Justice and Home Affairs Committee Report) Debate

Department: Home Office

Monday 28th November 2022

Grand Committee
Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow three such excellent opening speeches. I draw attention to my interests in the register, particularly my interest in artificial intelligence technologies as a former chair of the AI Select Committee of this House. As a non-member of her committee, I congratulate my noble friend Lady Hamwee and the committee on such a comprehensive and well-argued report.

I entirely understand and welcome the breadth of the report, but today I shall focus on live facial recognition technology, a subject that I have raised many times over the last five years in this House and elsewhere, in Questions and debates and even in a Private Member’s Bill. The previous debate involving a Home Office Minister, the noble Baroness, Lady Williams, the predecessor of the noble Lord, Lord Sharpe, took place in April and concerned the new College of Policing guidance on live facial recognition.

On each occasion, I drew attention to the reasons why guidance or codes are regarded as insufficient, by me and by many organisations such as Liberty, Big Brother Watch, the Ada Lovelace Institute, the former Information Commissioner, current and former Biometrics and Surveillance Camera Commissioners and the Home Office’s own Biometrics and Forensics Ethics Group, not to mention the Commons Science and Technology Committee. On each occasion, I have raised the lack of a legal basis for the use of this technology, and on each occasion government Ministers have denied that new explicit legislation or regulation is needed, as they have again in the wholly inadequate response to this report.

In the successful appeal brought by Liberal Democrat Councillor Ed Bridges, the Court of Appeal case on police use of live facial recognition decided in August 2020, the court ruled that South Wales Police’s use of the technology had not been in accordance with the law on several grounds, including in relation to rights under the European Convention on Human Rights, data protection legislation and the public sector equality duty. So it was with considerable pleasure that I read the Justice and Home Affairs Committee report, which noted the complicated institutional landscape around the adoption of this kind of technology, emphasised the need for public trust and recommended a stronger legal framework: primary legislation embodying general principles supported by detailed regulation, a single national regulatory body, minimum scientific standards, and local or regional ethics committees put on a statutory basis.

Despite what paragraph 4 of the response says, neither House of Parliament has ever adequately considered or rigorously scrutinised automated facial recognition technology. We remain in the precarious position of police forces dictating the debate, taking it firmly out of the hands of elected parliamentarians and instead—as with the recent College of Policing guidance—marking their own homework. A range of studies have shown that facial recognition technology disproportionately misidentifies women and BAME people, meaning that people from those groups are more likely to be wrongly stopped and questioned by police, and to have their images retained as the result of a false match.

The response urges us to be more positive about the use of new technology, but the UK is now the most camera-surveilled country in the Western world. London remains the third most surveilled city in the world, with 73 surveillance cameras for every 1,000 people. Shortly before stepping down, the last Surveillance Camera Commissioner conducted a survey and found over 6,000 systems and 80,000 cameras in operation across 183 local authorities in England and Wales. The ubiquity of surveillance cameras, which can be retrofitted with facial recognition software and fed into police databases, means that there is already an apparatus in place for large-scale intrusive surveillance, which could easily be augmented by the widespread adoption of facial recognition technology. Indeed, many surveillance cameras in the UK already have advanced capabilities such as biometric identification, behavioural analysis, anomaly detection, item and clothing recognition, vehicle recognition and profiling.

The breadth of public concern around this issue is growing clearer by the day. Many cities in the US have banned the use of facial recognition, while the European Parliament has called for a ban on police use of facial recognition technology in public places and on predictive policing. In 2020 Microsoft, IBM and Amazon announced that they would cease selling facial recognition technology to US law enforcement bodies.

Public trust is crucial. Sadly, the new Data Protection and Digital Information Bill does not help. As the Surveillance Camera Commissioner said last year, in a blog about the consultation leading up to it:

“This consultation ought to have provided a rare opportunity to pause and consider the real issues that we talk about when we talk about accountable police use of biometrics and surveillance, a chance to design a legal framework that is a planned response to identified requirements rather than a retrospective reaction to highlighted shortcomings, but it is an opportunity missed.”

Now we see that the role of Surveillance Camera Commissioner is to be abolished in the new data protection Bill; talk about shooting the messenger. The much-respected Ada Lovelace Institute, in its report Countermeasures and the associated Ryder review published in June this year, has called for new primary legislation to govern the use of biometric technologies by both public and private actors, for a new oversight body and for a moratorium until comprehensive legislation is passed.

The Justice and Home Affairs Committee stopped short of recommending a moratorium on the use of live facial recognition, but I agree with the institute that a moratorium is a vital first step. We need to put a stop to this unregulated invasion of our privacy, pausing the use of the technology while a careful review is carried out and a proper regulatory framework is put in place. Rather than updating toothless codes of practice to legitimise the use of new technologies such as live facial recognition, as the Government urge us to do, the UK should undertake a root-and-branch review of surveillance cameras and biometrics that seeks to increase accountability and protect fundamental rights. The committee’s report is extremely authoritative in this respect. I hope that the Government will listen today but, so far, I am not filled with optimism about their approach to AI governance.