Facial Recognition Surveillance Debate
Lord Clement-Jones (Liberal Democrat - Life peer)
Lords Chamber

My Lords, this was recently tested in court and the High Court found that the police were operating within the law, so we do not feel that there is any need for further legislation at this point. However, I understand that the decision is being appealed, so that is probably about as far as I can go today.
My Lords, I confess to being rather baffled by the Government’s agreement to this. Only in September, the Metropolitan Police Commissioner said in the context of live facial recognition technology that the UK risks becoming a
“ghastly, Orwellian, omniscient police state”
with
“potential for bias in the data or the algorithm.”
The Information Commissioner expressed deep concern in her last report and in her reaction to the Met’s deployment. She said:
“We reiterate our call for Government to introduce a statutory and binding code of practice for LFR as a matter of priority.”
The Home Office’s own Biometrics and Forensics Ethics Group has questioned the accuracy of live facial recognition technology and noted its potential for biased outputs and biased decision-making on the part of system operators. The Science and Technology Committee recommended a moratorium in its report of just over a year ago. Nor was the Minister’s response to my Oral Question about the watchlist reassuring: the watchlist is extensive. Is the answer not a moratorium as a first step, to put a stop to this unregulated invasion of our privacy? I commend to the Minister in that context my Private Member’s Bill, due to have a First Reading next week.
My Lords, I wish the noble Lord’s Private Member’s Bill all the very best when it comes to your Lordships’ House—without pre-empting, obviously, its outcome.
As for inaccuracy, LFR has been shown to be 80% accurate. It has thrown up one false result in 4,500 and there was no evidence of racial bias against BME people. I should point out that a human operative always makes the final decision; this is not decision by machine.
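[Editorial note: the two figures quoted above measure different things, and the sketch below is purely illustrative. "80% accurate" reads as the share of alerts that turn out to be true matches, while "one false result in 4,500" reads as false alerts per face scanned. The numbers of faces scanned and true alerts used here are assumptions chosen only to show how both figures could describe the same deployment; they are not taken from the debate.]

```python
# Illustrative sketch only: all inputs are assumed values, not figures from the debate.
# It shows how a per-alert accuracy of 80% and a false-alert rate of 1 in 4,500
# can coexist, because they are rates over different denominators.

faces_scanned = 45_000                    # assumed number of faces passing the cameras
false_alerts = faces_scanned / 4_500      # "one false result in 4,500" -> 10 false alerts
true_alerts = 40                          # assumed correct matches against the watchlist

total_alerts = true_alerts + false_alerts
alert_accuracy = true_alerts / total_alerts       # share of alerts that are genuine matches
false_alert_rate = false_alerts / faces_scanned   # false alerts per face scanned

print(f"Alerts raised: {total_alerts:.0f}")
print(f"Accuracy per alert: {alert_accuracy:.0%}")                 # 80% under these assumptions
print(f"False alerts per face scanned: 1 in {1 / false_alert_rate:.0f}")  # 1 in 4,500
```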