Data Protection Bill [HL] Debate
Lords Chamber
I may have to add later to what I have said, which I think the Minister will find totally unpalatable. I will try to move on.
The Minister also said:
“You are concerned that, if consent is not a genuine option in these situations and there are no specific processing conditions in the Bill to cover this on grounds of substantial public interest, processing in these circumstances would be unlawful. To make their consent GDPR-compliant, an employer or school must provide a reasonable alternative that achieves the same ends, for example, offering ‘manual’ entry by way of a reception desk”.
Consent is rarely valid in an employment context. If an employer believes that certain premises require higher levels of security, and that biometric access controls are a necessary and proportionate solution, those controls cannot be optional, with less secure alternatives on offer, as that undermines the very security need that prompted them in the first place: for example, where an employer secures a specific office in which staff work on highly sensitive or confidential matters, or a specific room within an office, such as a server room, to which only a small number of people may have access and where access needs to be more secure.
Biometrics are unique to each person. A pass card can easily be lost or passed to someone else. It is not feasible or practical to insist that organisations employ extra staff for each secure office or secure room to act as security guards to manually let people in.
The Minister further stated:
“You also queried whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. Article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing that appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards of Clause 18 of the Bill. For the purposes of the GDPR, ‘scientific research’ has a broad meaning. When taken together with the obvious possibility of consent-based research, we are confident that the Bill allows for the general type of testing you have described”.
It is good to hear that the Government interpret the research provisions as being broad enough to accommodate the research and development described. However, for organisations to use these provisions with confidence, they need to know whether the ICO and courts will take the same broad view.
There are other amendments that would broaden the understanding of the research definition, which no doubt the Minister will speak to and which the Government could support to leave organisations in no doubt. However, it is inaccurate to assume that all R&D will be consent based; in fact, very little of it will be. Given that consent must be a genuine choice to be valid, organisations can rarely rely on it, as they need a minimum amount of reliable data for R&D that provides a representative sample for whatever they are doing. That is undermined by allowing individuals to opt in and out whenever they choose. In particular, for machine learning and AI, there is a danger of discrimination and bias if R&D uses incomplete datasets and data that does not accurately represent the population. There have already been cases of poor facial recognition programmes in other parts of the world that do not recognise people of certain races because the input data did not contain sufficient samples of the particular ethnicity with which to train the model.
This is even more the case where the biometric data for research and development is for the purpose of improving systems to improve security. Those employing security and fraud prevention measures have constantly to evaluate and improve their systems to stay one step ahead of those with malicious intent. The data required for this needs to be guaranteed and not left to chance by allowing individuals to choose. The research and development to improve the system is an integral aspect of providing the system in the first place.
I hope that the Minister recognises some of those statements that he made in his letter and will be able, at least to some degree, to respond to the points that I have made. There has been some toing and froing, so I think that he is pretty well aware of the points being raised. Even if he cannot accept these amendments, I hope that he can at least indicate that biometrics is the subject of live attention within his department and that work will be ongoing to find a solution to some of the issues that I have raised. I beg to move.
My Lords, I wonder whether I might use this opportunity to ask a very short question regarding the definition of biometric data and, in doing so, support my noble friend. The definition in Clause 188 is the same as in the GDPR and includes reference to “behavioural characteristics”. It states that,
“‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data”.
Well:
“There’s no art
To find the mind’s construction in the face”.
How do behavioural characteristics work in this context? The Minister may not want to reply to that now, but I would be grateful for an answer at some point.
My Lords, I thank the noble Lord, Lord Clement-Jones, for engaging constructively on this subject since we discussed it in Committee. I know that he is keen for data controllers to have clarity on the circumstances in which the processing of biometric data would be lawful. I recognise that the points he makes are of the moment: my department is aware of these issues and will keep an eye on them, even though we do not want to accept his amendments today.
To reiterate some of the points I made in my letter so generously quoted by the noble Lord, the GDPR regards biometric data as a “special category” of data due to its sensitivity. In order to process such data, a data controller must satisfy a processing condition in Article 9 of the GDPR. The most straightforward route to ensure that processing of such data is lawful is to seek the explicit consent of the data subject. However, the GDPR acknowledges that there might be occasions where consent is not possible. Schedule 1 to the Bill makes provision for a range of issues of substantial public interest: for example, paragraph 8, which permits processing necessary for the prevention or detection of an unlawful act. My letter to noble Lords following day two in Committee went into more detail on this point.
The noble Lord covered much of what I am going to say about businesses such as banks making use of biometric identification and verification mechanisms. Generally speaking, such mechanisms are offered as an alternative to more conventional forms of access, such as use of passwords, and service providers should have no difficulty in seeking the data subject’s free and informed consent, but I take the point that obtaining proper, GDPR-compliant consent is more difficult when, for example, the controller is the data subject’s employer. I have considered this issue carefully following our discussion in Committee, but I remain of the view that there is not yet a compelling case to add new exemptions for controllers who wish to process sensitive biometric data without the consent of data subjects. The Bill and the GDPR make consent pre-eminent wherever possible. If that means employers who wish to install biometric systems have to ensure that they also offer a reasonable alternative to those who do not want their biometric data to be held on file, then so be it.
There is legislative precedent for this principle. Section 26 of the Protection of Freedoms Act 2012 requires state schools to seek parental consent before processing biometric data and to provide a reasonable alternative mechanism if consent is not given or is withdrawn. I might refer the noble Lord to any number of speeches given by members of his own party—the noble Baroness, Lady Hamwee, for example—on the importance of those provisions. After all, imposing a legislative requirement for consent was a 2010 Liberal Democrat manifesto commitment. The GDPR merely extends that principle to bodies other than schools. The noble Lord might respond that his amendment’s proposed subsection (1) is intended to permit processing only in a tight set of circumstances where processing of biometric data is undertaken out of necessity. To which I would ask: when is it genuinely necessary to secure premises or authenticate individuals using biometrics, rather than just cheaper or more convenient?
We also have very significant concerns with the noble Lord’s subsections (4) and (5), which seek to drive a coach and horses through fundamental provisions of the GDPR—purpose limitation and storage limitation, in particular. The GDPR does not in fact allow member states to derogate from Article 5(1)(e), so subsection (5) would represent a clear breach of European law.
For completeness, I should also mention concerns raised about whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. I reassure noble Lords, as I did in Committee, that Article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards in Clause 18 of the Bill. Whatever your opinion of recitals and their ultimate resting place, recital 159 is clear that the term “scientific research” should be interpreted,
“in a broad manner including for example technological development and demonstration”.
This is a fast-moving area where the use of such technology is likely to increase over the next few years, so I take the point of the noble Lord, Lord Clement-Jones, that this is an area that needs to be watched. That is partly why Clause 9(6) provides a delegated power to add further processing conditions in the substantial public interest if new technologies, or applications of existing technologies, emerge. That would allow us to make any changes that are needed in the future, following further consultation with the parties that are likely to be affected by the proposals, both data controllers and, importantly, data subjects whose sensitive personal data is at stake. For those reasons, I hope the noble Lord is persuaded that there are good reasons for not proceeding with his amendment at the moment.
The noble Baroness, Lady Hamwee, asked about behavioural issues. I had hoped that I might get some inspiration, but I fear I have not, so I will get back to her and explain all about behavioural characteristics.