Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what assessment they have made of any bias and inconsistency of application in the use of facial recognition assessments and algorithms for Black and Asian men and women.
The algorithm used for retrospective facial recognition searches on the Police National Database (PND) has been independently tested by the National Physical Laboratory (NPL), which found that in a limited set of circumstances it was more likely to incorrectly include some demographic groups in its search results. At the settings used by police, the NPL also found that if a correct match was in the database, the algorithm found it in 99% of searches.
We take these findings very seriously. A new algorithm, which independent testing has shown can be used at settings with no statistically significant bias, has been procured. It is due to be operationally tested in the coming months and will be subject to evaluation.
Manual safeguards embedded in police training, operational practice and guidance have always required trained users and investigating officers to visually assess all potential matches. Training and guidance have been re-issued and promoted to remind them of these long-standing manual safeguards. The National Police Chiefs’ Council has also updated and published data protection and equality impact assessments.
Given the importance of this issue, the Home Secretary has asked HMICFRS, supported by the Forensic Science Regulator, to inspect police and relevant law enforcement agencies’ use of retrospective facial recognition, with work expected to begin before the end of March.
It is important to note that no decisions are made by the algorithm or solely on the basis of a possible match; matches are intelligence, which must be corroborated with other information, as with any other police investigation.
For live facial recognition, NPL testing found a false alert rate of 1 in 6,000 on a watchlist containing 10,000 images. In practice, the police have reported that the false alert rate has been far lower than this. The NPL also found no statistically significant performance differences by gender, age, or ethnicity at the settings used by the police.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.