Biometrics: Ethnic Groups

(asked on 9th June 2020)

Question to the Department for Digital, Culture, Media & Sport:

To ask Her Majesty's Government what steps they are taking to ensure that commercial facial recognition technology is (1) registered, (2) accurate, and (3) not discriminatory towards people from BAME communities.


Answered by
Baroness Barran
Parliamentary Under-Secretary (Department for Education)
This question was answered on 23rd June 2020

Uses of facial recognition technology in the UK, both private and public, are regulated by the GDPR and the Data Protection Act 2018, which set standards for protecting personal data. Organisations have an obligation to ensure that any personal data they hold is accurate and processed in a manner that is lawful, fair and transparent.

Facial images, which constitute 'special category' data for the purposes of the legislation, are subject to heightened safeguards and can only be processed if specific conditions in the legislation are met. Processing must be necessary, proportionate and justified. The legislation is enforced by the Information Commissioner's Office, which has shown a willingness to take action against commercial organisations that are acting unlawfully.

To ensure the safe use of facial recognition technology (FRT) in all sectors, the government tasked the Centre for Data Ethics and Innovation (CDEI) with producing a Snapshot briefing paper looking at the uses and potential implications of facial recognition technology’s deployment in the UK. The paper was published on 28 May and we are considering its findings. The CDEI is currently working on a review into bias in algorithmic decision-making and will continue to examine the impacts of FRT and algorithms on society and provide recommendations on how to minimise bias.
