Data Protection Bill [HL] Debate
Lords Chamber

My Lords, I will speak to Amendment 117 in my name, but before I do I warmly congratulate my noble friend Lady Kidron on obtaining this important code of practice for children. I apologise for not having spoken in the debate on this Bill previously, but Amendment 117 is significant and is also a children’s rights issue.
If there is to be—correctly—a sensitivity concerning age-appropriate understanding by children in relation to information services, the same should be no less true in the school setting, where the personal data given out ranges from that entered into a new maths app to data collected by the DfE for the national pupil database. A code of practice needs to be introduced that centres on the rights of the child—children are currently disempowered in relation to their own personal data in schools. Although not explicitly referred to in this amendment, such a code ought to reflect the child’s right to be heard as set out in Article 12 of the UN Convention on the Rights of the Child. Among other things, it would allow children, parents, school staff and systems administrators to build trust together in safe, fair and transparent practice.
The situation is complicated in part by the fact that it is parents who make decisions on behalf of children up to the age of 18, although that in itself makes it even more necessary that children are made aware of the data about themselves that is collected and every use to which that data may be put, including the handing on to third-party users, as well as the justification for so doing. The current reality is that children may well go through life without knowing that data on a named basis is held permanently by the DfE, let alone passed on to others. There may, of course, be very good research reasons why data is collected, but such reasons should not override children’s rights, even as an exemption.
It is because there is no clear code of practice for a culture of increased data gathering in the school setting that we now have the current situation of growing controversy, enforcement and misuse. It is important, for instance, that both parents and children, according to their capacity to understand, are made aware—as schools should be—of what data can be provided optionally. However, when nationality and place of birth were introduced by the DfE last year, many schools demanded that passports be brought into the classroom. In effect, the DfE operated an opt-out system. The introduction of nationality and place of birth data also raises the question of the relevance of data to improving education and its ultimate use. Many parents do not believe that such data has anything to do with the improvement of education. Last week, Against Borders for Children, supported by Liberty, launched an action against the Government on this basis.
There is now also considerable concern about the further expansion of the census data in January next year to include alternative provision data on mental health, pregnancy and other sensitive information without consent from parents or children, with no commitment to children’s confidentiality and without ceasing the release of identifying data for third-party use.
It was only after FOI requests and questions from Caroline Lucas that we discovered that the DfE had passed on individual records to the Home Office for particular immigration purposes. As defenddigitalme said, such action,
“impinges on fundamental rights to privacy and the basic data protection principles of purpose limitation and fairness”.
I appreciate that as the Bill stands such purposes are an exemption, but teachers are not border guards.
In 2013, a large number of records were passed to the Daily Telegraph by the DfE. In an Answer given on 31 October this year to a Question by Darren Jones, Nick Gibb incorrectly said that individuals could not be identified. There is no suggestion that there was any sinister intent, but many parents and schoolchildren would be appalled that a newspaper had possession of this data or that such a transfer of information was possible. Moreover, in the same Answer he said that he did not know how many datasets had been passed on. This is unacceptable. There needs to be a proper auditing process, as data needs to be kept safe. It is wrong too that a company may have more access to a pupil’s data than the pupil themselves, or that a pupil cannot have such data corrected if it is wrong.
It is clear that from the Government’s point of view, one reason for having a good code of practice is to restore confidence in the Government, but this should not be the main reason. In September, Schools Week reported that the Information Commissioner’s Office was critical of the current DfE guidance, which is aimed at schools rather than parents or children and is, in the main, procedural. It said that rights were not given enough prominence. Both children and parents need to be properly informed of these rights and the use to which data is put at every stage throughout a child’s school life and, where applicable, beyond.
My Lords, I add my own very strong welcome for this amendment to that already given from these Benches. I endorse everything that my noble friend Lord McNally said about the noble Baroness, Lady Kidron, and her energy and efforts. In fact, I believe that she was far too modest in her introduction of the amendment. I agree with the noble Lord, Lord Best, that, quite honestly, this is essentially a game-changer in the online world for children. As he said, the process of setting standards could be much wider than simply the UK. As the noble Lord, Lord Puttnam, said, these major tech companies need to wake up and understand that they have to behave in an ethical fashion. Having been exposed to some of these issues in recent weeks, I find it obvious that as technology becomes ever more autonomous, the way tech companies adopt ethical forms of behaviour becomes ever more important. This is the start of something important in this field. Otherwise, the public will turn away and will not understand why all this is happening. That will inevitably be the consequence.
My Lords, in moving Amendment 8 I will speak to Amendment 21. I will be a little longer than perhaps those waiting on their dinner would like. I apologise for that, but this is an important set of amendments for those wishing to make use of new technologies using biometrics.
In Committee the Minister focused on the use of biometrics in a clear context, such as using a fingerprint to unlock a mobile device. In that context he may be correct to say that the enabling of this security feature by the user constitutes consent—although without a record of the consent it would still fall short of GDPR requirements. However, the contexts I was aiming to cover are those where the biometric data processing is an integral part of a service or feature, and the service or feature simply will not function without it.
Other contexts I was looking to cover include where banks decide to use biometric technology as extra security when you phone up or access your account online. Some banks offer this as an option, but it is not hard to envisage this becoming a requirement as banks are expected to do more to protect account access. If it is a mandatory requirement, consent is not appropriate—nor would it be valid. HMRC has begun to use voice recognition so that people will not have to go through all the usual security questions. If HMRC does this after 25 May 2018 it could be unlawful.
This is certainly the case with biometric access systems at employment premises. It is also the case where biometrics are used in schools and nurseries, such as for access controls and identifying who is picking up a child. In schools, biometrics are sometimes used to determine entitlements, such as free meals, in a way that does not identify or risk stigmatising those who receive them, and avoids children having to remember swipe cards or carry money.
In these contexts, providing an alternative system that does not use biometrics would probably undermine the security and other reasons for having biometrics in the first place. Without any specific lawful basis for biometric data, organisations will rely entirely on the Government, the ICO and the courts, accepting that their uses fall within the fraud prevention/substantial public interest lawful bases and within the definition of “scientific research”.
The amendments are designed to meet all these objections. In particular, the research elements of the amendments replicate the research exemption in Section 33 of the Data Protection Act 1998. The effect of this exemption is that organisations processing personal data for research purposes are exempt from certain provisions of the Act, provided that they meet certain conditions. The key conditions are that the data is not used to support measures or decisions about specific individuals and that there is no substantial damage or distress caused by the processing.
In this context—I am afraid this is the reason for taking rather longer than I had hoped—it is important to place on the record a response to a number of points made in the Minister’s letter of 5 December to me about biometric data. First, he said:
“As you are aware, the General Data Protection Regulation … regards biometric data as a ‘special category’ of data due to its sensitivity”.
This is precisely why the amendment is needed. The change in status risks current lawful processing becoming unlawful. This type of data is being processed now using conditions for processing that will no longer be available once it becomes sensitive data.
I may have to add later to what I have said, which I think the Minister will find totally unpalatable. I will try to move on.
The Minister also said:
“You are concerned that, if consent is not a genuine option in these situations and there are no specific processing conditions in the Bill to cover this on grounds of substantial public interest, processing in these circumstances would be unlawful. To make their consent GDPR compliant, an employer or school must provide a reasonable alternative that achieves the same ends, for example, offering ‘manual’ entry by way of a reception desk”.
Consent is rarely valid in an employment context. If an employer believes that certain premises require higher levels of security, and that biometric access controls are a necessary and proportionate solution, those controls cannot be optional, with less secure alternative mechanisms available, as that undermines the security reasons for needing the higher levels of security in the first place: for example, where an employer secures a specific office in which staff work on highly sensitive or confidential matters, or a specific room within an office, such as a server room, to which only a small number of people can have access and where that access needs to be more secure.
Biometrics are unique to each person. A pass card can easily be lost or passed to someone else. It is not feasible or practical to insist that organisations employ extra staff for each secure office or secure room to act as security guards to manually let people in.
The Minister further stated:
“You also queried whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. Article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing that appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards of clause 18 of the Bill. For the purposes of GDPR, ‘scientific research’ has a broad meaning. When taken together with the obvious possibility of consent-based research, we are confident that the Bill allows for the general type of testing you have described”.
It is good to hear that the Government interpret the research provisions as being broad enough to accommodate the research and development described. However, for organisations to use these provisions with confidence, they need to know whether the ICO and courts will take the same broad view.
There are other amendments that would broaden the understanding of the research definition, which no doubt the Minister will speak to and which the Government could support to leave organisations in no doubt. However, it is inaccurate to assume that all R&D will be consent based; in fact, very little of it will be. Given the need for consent to be a genuine choice to be valid, organisations can rarely rely on it, as they need a minimum amount of reliable data for R&D that presents a representative sample for whatever they are doing. That is undermined by allowing individuals to opt in and out whenever they choose. In particular, for machine learning and AI, there is a danger of discrimination and bias if R&D has incomplete datasets and data that does not accurately represent the population. There have already been cases of poor facial recognition programmes in other parts of the world that do not recognise certain races because the input data did not contain sufficient samples of a particular ethnicity with which to train the model.
This is even more the case where the biometric data for research and development is for the purpose of improving systems to improve security. Those employing security and fraud prevention measures have constantly to evaluate and improve their systems to stay one step ahead of those with malicious intent. The data required for this needs to be guaranteed and not left to chance by allowing individuals to choose. The research and development to improve the system is an integral aspect of providing the system in the first place.
I hope that the Minister recognises some of those statements that he made in his letter and will be able, at least to some degree, to respond to the points that I have made. There has been some toing and froing, so I think that he is pretty well aware of the points being raised. Even if he cannot accept these amendments, I hope that he can at least indicate that biometrics is the subject of live attention within his department and that work will be ongoing to find a solution to some of the issues that I have raised. I beg to move.
My Lords, I wonder whether I might use this opportunity to ask a very short question regarding the definition of biometric data and, in doing so, support my noble friend. The definition in Clause 188 is the same as in the GDPR and includes reference to “behavioural characteristics”. It states that,
“‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data”.
Well:
“There’s no art
To find the mind’s construction in the face”.
How do behavioural characteristics work in this context? The Minister may not want to reply to that now, but I would be grateful for an answer at some point.
My Lords, I thank the noble Lord, Lord Clement-Jones, for engaging constructively on this subject since we discussed it in Committee. I know that he is keen for data controllers to have clarity on the circumstances in which the processing of biometric data would be lawful. I recognise that the points he makes are of the moment: my department is aware of these issues and will keep an eye on them, even though we do not want to accept his amendments today.
To reiterate some of the points I made in my letter, so generously quoted by the noble Lord, the GDPR regards biometric data as a “special category” of data due to its sensitivity. In order to process such data, a data controller must satisfy a processing condition in Article 9 of the GDPR. The most straightforward route to ensure that processing of such data is lawful is to seek the explicit consent of the data subject. However, the GDPR acknowledges that there might be occasions where consent is not possible. Schedule 1 to the Bill makes provision for a range of issues of substantial public interest: for example, paragraph 8 permits processing necessary for purposes such as the prevention or detection of an unlawful act. My letter to noble Lords following day two in Committee went into more detail on this point.
The noble Lord covered much of what I am going to say about businesses such as banks making use of biometric identification verification mechanisms. Generally speaking, such mechanisms are offered as an alternative to more conventional forms of access, such as use of passwords, and service providers should have no difficulty in seeking the data subject’s free and informed consent, but I take the point that obtaining proper, GDPR-compliant consent is more difficult when, for example, the controller is the data subject’s employer. I have considered this issue carefully following our discussion in Committee, but I remain of the view that there is not yet a compelling case to add new exemptions for controllers who wish to process sensitive biometric data without the consent of data subjects. The Bill and the GDPR make consent pre-eminent wherever possible. If that means employers who wish to install biometric systems have to ensure that they also offer a reasonable alternative to those who do not want their biometric data to be held on file, then so be it.
There is legislative precedent for this principle. Section 26 of the Protection of Freedoms Act 2012 requires state schools to seek parental consent before processing biometric data and to provide a reasonable alternative mechanism if consent is not given or is withdrawn. I might refer the noble Lord to any number of speeches given by members of his own party—the noble Baroness, Lady Hamwee, for example—on the importance of those provisions. After all, imposing a legislative requirement for consent was a 2010 Liberal Democrat manifesto commitment. The GDPR merely extends that principle to bodies other than schools. The noble Lord might respond that his amendment’s proposed subsection (1) is intended to permit processing only in a tight set of circumstances where processing of biometric data is undertaken out of necessity. To which I would ask: when is it genuinely necessary to secure premises or authenticate individuals using biometrics, rather than just cheaper or more convenient?
We also have very significant concerns with the noble Lord’s subsections (4) and (5), which seek to drive a coach and horses through fundamental provisions of the GDPR—purpose limitation and storage limitation, in particular. The GDPR does not in fact allow member states to derogate from Article 5(1)(e), so subsection (5) would represent a clear breach of European law.
For completeness, I should also mention concerns raised about whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. I reassure noble Lords, as I did in Committee, that Article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards in Clause 18 of the Bill. Whatever your opinion of recitals and their ultimate resting place, recital 159 is clear that the term “scientific research” should be interpreted,
“in a broad manner including for example technological development and demonstration”.
This is a fast-moving area where the use of such technology is likely to increase over the next few years, so I take the point of the noble Lord, Lord Clement-Jones, that this is an area that needs to be watched. That is partly why Clause 9(6) provides a delegated power to add further processing conditions in the substantial public interest if new technologies, or applications of existing technologies, emerge. That would allow us to make any changes that are needed in the future, following further consultation with the parties that are likely to be affected by the proposals, both data controllers and, importantly, data subjects whose sensitive personal data is at stake. For those reasons, I hope the noble Lord is persuaded that there are good reasons for not proceeding with his amendment at the moment.
The noble Baroness, Lady Hamwee, asked about behavioural issues. I had hoped that I might get some inspiration, but I fear I have not, so I will get back to her and explain all about behavioural characteristics.
My Lords, I realise that, ahead of the dinner break business, the House is agog at details of the Data Protection Bill, so I will not prolong the matter. The Minister said that things are fast-moving, but I think the Government are moving at the pace of the slowest in the convoy on this issue. We are already here. The Minister says it is right that we should have alternatives, but for a bank that wants to use facial recognition techniques, having alternatives is just not practical. The Government are going to have to rethink this, particularly in the employment area. As more and more banks require it as part of their identification techniques, it will become of great importance.
We are just around the corner from these things, so I urge the Minister, during the passage of the Bill, to look again at whether there are at least some obvious issues that could be dealt with. I accept that some areas may be equivocal at this point, but we are not really talking about the future; we are talking about the present. I understand what the Minister says and I will read his remarks very carefully, as no doubt will the industry that increasingly uses and wants to use biometrics. In the meantime, I beg leave to withdraw the amendment.