Data Protection Bill [HL]

Monday 11th December 2017

Lords Chamber

Lord Stevenson of Balmacara (Lab)

My Lords, we have had a good discussion this evening about topics raised in Committee, where the strength of feeling and expertise displayed was highly instrumental in persuading Ministers to think again about the approach they were taking towards the regulatory process for children’s data being transferred into the internet. It shows that well-argued cases can get through even the most impervious armour put on by Ministers when they start battling on their Bills. I am delighted to see it.

The noble Lord, Lord Clement-Jones, commented on Amendment 117, tabled by the noble Earl, Lord Clancarty. I wondered why that amendment had been included in the group because it seemed to point in a different direction. It deals with data collected and used by the Government, having cleared what would presumably be the highest standards of propriety in relation to it. However, the story that emerged, endorsed by the noble Lord, Lord Clement-Jones, is shocking and I hope that the Minister will be able to help us chart a path through this issue. Several things seem to be going wrong. The issues were raised by my noble friend Lord Knight in Committee, but this amendment and the paperwork supplied with it give me a chill. The logic behind the amendment’s being in this group is that this is the end-product of the collection of children’s data—admittedly by others who are providing it for them in this case—and it shows the kinds of dangers that are about. I hope that point will be answered well by the Minister when he comes to respond.

I turn to the substantive amendment; it is an honour to have been invited to sign up to it. I have watched with admiration—as have many others—the skilful way in which the noble Baronesses, Lady Kidron and Lady Harding, and others have put together a case, then an argument and then evidence that has persuaded all of us that something can be done, should be done and now will be done to make sure that our children and grandchildren will have a safe environment in which they can explore and learn from the internet.

When historic moments such as this come along you do not often notice them. However, tonight we are laying down a complete change in the way in which individuals relate to the services that have now been provided on such a huge scale, as has been described. I welcome that—it is an important point—and we want to use it, savour it and build on it as we go forward.

I first sensed that we were on the right path here when I addressed an industry group of data-processing professionals recently. Although I wowed them with my knowledge of the automatic processing of data and biometric arguments—I even strayed into de-anonymisation, and got the word right as I spoke in my cups—they did not want anything to do with that: they only wanted to talk about what we were going to do to support the noble Baroness, Lady Kidron, and her amendments. When the operators in industry are picking up these debates and realising that this is something that they had always really wanted but did not know how to do—and now it is happening and they are supporting it all they can—we are in the right place.

The noble Baroness, Lady Harding, said something interesting about it being quite clear now that self-regulation does not work—she obviously has not read Adam Smith recently; I could have told her that she might have picked that up from earlier studies. She also said, to redeem herself, that good regulation has a chance to change behaviour and to inculcate a self-regulatory approach, where those who are regulated recognise the strength of the regulations coming forward and then use it to develop a proper approach to the issue and more. In that sense she is incredibly up to date. Your Lordships’ House discussed this only last week in a debate promoted by the noble Baroness, Lady Neville-Rolfe, on what good regulation meant and how it could be applied. We on these Benches are on all fours with her on this. It is exactly the way to go. Regulation for regulation’s sake does not work. Stripping away regulation because you think it is red tape does not work. Good regulation or even better regulation works, and that is where we want to go.

There are only three points I want to pick out of the contribution made by the noble Baroness, Lady Kidron, when she introduced the amendment. First, it is good that the problem we saw at the start of the process about how we were going to get this code applied to all children has been dealt with by the Government in taking on the amendment and bringing it back in a different way. As the noble Baroness admits, their knowledge and insight was instrumental in getting this in the Bill. I think that answers some of the questions that the noble Baroness, Lady Howe, was correctly asking. How do the recommendations and the derogation in the Bill reducing the age from 16 to 13 work in relation to the child? They do so because the amendment is framed in such a way that all children, however they access the internet, will be caught by it, and that is terrific.

The second point I want to make picks up on a concern also raised by the noble Baroness, Lady Harding. While we are probably not going to get a timescale today, the Bill sets a good end-stop for when the code is going to be implemented. However, one hopes that when the Minister comes to respond, he will be able to give us a little more hope than having to wait for 18 months. The amendment does say,

“as soon as reasonably practicable”,

but that is usually code for “not quite soon”. I hope that we will not have to wait too long for the code because it is really important. The noble Baroness, Lady Harding, pointed out that if the message goes out clearly and the descriptions of what we intend to do are right, the industry will want to move before then anyway.

Thirdly, I turn to the important question of how the code will be put into force in such a way that it makes sure that those who do not follow it will be at risk. Yes, there will be fines, and I hope that the Minister is able to confirm what the noble Baroness asked him when introducing her amendment. I would also like to pick up the point about the need to ensure that we encourage the Government to think again about the derogation from article 82. I notice in a document recently distributed by the Information Commissioner that she is concerned about this, particularly in relation to vulnerable people and children, who might not be expected to know whether and how they can exercise their rights under data protection law. It is clear that very young people will not be able to do that. If they cannot or do not understand the situation they are in, how is enforcement going to take place? Surely the right thing to do is to make sure that the bodies which have been working with the noble Baroness, Lady Kidron, which know and understand the issues at stake here, are able to raise what are known as super-complaint-type procedures on behalf of the many children to whom damage might be being done but who do not have a way of exercising their rights.

If we can have a response to that when we come to it later in the Bill, and in the interim get answers to some of the questions I have set out, we will be at the historic moment of being able to bless on its way a fantastic approach to how those who are the most vulnerable but who often get so much out of the internet can be protected. I am delighted to be able to support the amendment.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, first, like other noble Lords, I pay tribute to the noble Baroness, Lady Kidron, for her months—indeed, years—of work to ensure that the rights and safety of children are protected online. I commend her efforts to ensure that the Bill properly secures those rights. She has convinced us that it is absolutely right that children deserve their own protections in the Bill. The Government agree that these amendments do just that for the processing of a child’s personal data.

Amendment 109 would require the Information Commissioner to produce a code of practice on age-appropriate design of online services. The code will carry the force of statutory guidance and set out the standards expected of data controllers to comply with the principles and obligations on data processors as set out by the GDPR and the Bill. I am happy to undertake that the Secretary of State will work in close consultation with the Information Commissioner and the noble Baroness, Lady Kidron, to ensure that this code is robust, practical and, most importantly, meets the development needs of children in relation to the gathering, sharing, storing and commoditising of their data. I have also taken on board the recommendations of the noble Lord, Lord Clement-Jones, on the internet safety strategy. We have work to do on that and I will take his views back to the department.

The Government will support the code by providing the Information Commissioner with a list of minimum standards to be taken into account when designing it. These are similar to the standards proposed by the noble Baroness in Committee. They include default privacy settings, data minimisation standards, the presentation and language of terms and conditions and privacy notices, uses of geolocation technology, automated and semi-automated profiling, transparency of paid-for activity such as product placement and marketing, the sharing and resale of data, the strategies used to encourage extended user engagement, user reporting and resolution processes and systems, the ability to understand and activate a child’s right to erasure, rectification and restriction, the ability to access advice from independent, specialist advocates on all data rights, and any other aspect of design that the commissioner considers relevant.

--- Later in debate ---
I am sorry. I am just finding the right place in my notes.
Lord Clement-Jones

I may have to add later to what I have said, which I think the Minister will find totally unpalatable. I will try to move on.

The Minister also said:

“You are concerned that, if consent is not a genuine option in these situations and there are no specific processing conditions in the Bill to cover this on grounds of substantial public interest, processing in these circumstances would be unlawful. To make their consent GDPR compliant, an employer or school must provide a reasonable alternative that achieves the same ends, for example, offering ‘manual’ entry by way of a reception desk”.


Consent is rarely valid in an employment context. If an employer believes that certain premises require a higher level of security, and that biometric access controls are a necessary and proportionate solution, those controls cannot be made optional, with less secure alternatives on offer, as that undermines the very reason for requiring the higher level of security in the first place: for example, where an employer secures a specific office in which staff are working on highly sensitive or confidential matters, or where the employer secures a specific room within an office, such as a server room, to which only a small number of people may have access and where that access needs to be more secure.

Biometrics are unique to each person. A pass card can easily be lost or passed to someone else. It is not feasible or practical to insist that organisations employ extra staff for each secure office or secure room to act as security guards to manually let people in.

The Minister further stated:

“You also queried whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. Article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing that appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards of clause 18 of the Bill. For the purposes of GDPR, ‘scientific research’ has a broad meaning. When taken together with the obvious possibility of consent-based research, we are confident that the Bill allows for the general type of testing you have described”.


It is good to hear that the Government interpret the research provisions as being broad enough to accommodate the research and development described. However, for organisations to use these provisions with confidence, they need to know whether the ICO and courts will take the same broad view.

There are other amendments which would broaden the understanding of the research definition, which no doubt the Minister will speak to and which the Government could support to leave no room for doubt for organisations. However, it is inaccurate to assume that all R&D will be consent based; in fact, very little of it will be. Given the need for consent to be a genuine choice to be valid, organisations can rarely rely on this as they need a minimum amount of reliable data for R&D that presents a representative sample for whatever they are doing. That is undermined by allowing individuals to opt in and out whenever they choose. In particular, for machine learning and AI, there is a danger of discrimination and bias if R&D has incomplete datasets and data that does not accurately represent the population. There have already been cases of poor facial recognition programmes in other parts of the world that do not recognise certain races because the input data did not contain sufficient samples of that particular ethnicity with which to train the model.

This is even more the case where the biometric data for research and development is for the purpose of improving systems to improve security. Those employing security and fraud prevention measures have constantly to evaluate and improve their systems to stay one step ahead of those with malicious intent. The data required for this needs to be guaranteed and not left to chance by allowing individuals to choose. The research and development to improve the system is an integral aspect of providing the system in the first place.

I hope that the Minister recognises some of those statements that he made in his letter and will be able, at least to some degree, to respond to the points that I have made. There has been some toing and froing, so I think that he is pretty well aware of the points being raised. Even if he cannot accept these amendments, I hope that he can at least indicate that biometrics is the subject of live attention within his department and that work will be ongoing to find a solution to some of the issues that I have raised. I beg to move.

Baroness Hamwee (LD)

My Lords, I wonder whether I might use this opportunity to ask a very short question regarding the definition of biometric data and, in doing so, support my noble friend. The definition in Clause 188 is the same as in the GDPR and includes reference to “behavioural characteristics”. It states that,

“‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data”.

Well:

“There’s no art
To find the mind’s construction in the face”.

How do behavioural characteristics work in this context? The Minister may not want to reply to that now, but I would be grateful for an answer at some point.

Lord Ashton of Hyde

My Lords, I thank the noble Lord, Lord Clement-Jones, for engaging constructively on this subject since we discussed it in Committee. I know that he is keen for data controllers to have clarity on the circumstances in which the processing of biometric data would be lawful. I recognise that the points he makes are of the moment: my department is aware of these issues and will keep an eye on them, even though we do not want to accept his amendments today.

To reiterate some of the points I made in my letter so generously quoted by the noble Lord, the GDPR regards biometric data as a “special category” of data due to its sensitivity. In order to process such data, a data controller must satisfy a processing condition in Article 9 of the GDPR. The most straightforward route to ensure that processing of such data is lawful is to seek the explicit consent of the data subject. However, the GDPR acknowledges that there might be occasions where consent is not possible. Schedule 1 to the Bill makes provision for a range of issues of substantial public interest: for example, paragraph 8, which permits processing necessary for the prevention or detection of an unlawful act. My letter to noble Lords following day two in Committee went into more detail on this point.

The noble Lord covered much of what I am going to say about businesses such as banks making use of biometric identification verification mechanisms. Generally speaking, such mechanisms are offered as an alternative to more conventional forms of access, such as use of passwords, and service providers should have no difficulty in seeking the data subject’s free and informed consent, but I take the point that obtaining proper, GDPR-compliant consent is more difficult when, for example, the controller is the data subject’s employer. I have considered this issue carefully following our discussion in Committee, but I remain of the view that there is not yet a compelling case to add new exemptions for controllers who wish to process sensitive biometric data without the consent of data subjects. The Bill and the GDPR make consent pre-eminent wherever possible. If that means employers who wish to install biometric systems have to ensure that they also offer a reasonable alternative to those who do not want their biometric data to be held on file, then so be it.

There is legislative precedent for this principle. Section 26 of the Protection of Freedoms Act 2012 requires state schools to seek parental consent before processing biometric data and to provide a reasonable alternative mechanism if consent is not given or is withdrawn. I might refer the noble Lord to any number of speeches given by members of his own party—the noble Baroness, Lady Hamwee, for example—on the importance of those provisions. After all, imposing a legislative requirement for consent was a 2010 Liberal Democrat manifesto commitment. The GDPR merely extends that principle to bodies other than schools. The noble Lord might respond that his amendment’s proposed subsection (1) is intended to permit processing only in a tight set of circumstances where processing of biometric data is undertaken out of necessity. To which I would ask: when is securing premises or authenticating individuals using biometrics genuinely necessary, rather than just cheaper or more convenient?

We also have very significant concerns with the noble Lord’s subsections (4) and (5), which seek to drive a coach and horses through fundamental provisions of the GDPR—purpose limitation and storage limitation, in particular. The GDPR does not in fact allow member states to derogate from article 5(1)(e), so subsection (5) would represent a clear breach of European law.

For completeness, I should also mention concerns raised about whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. I reassure noble Lords, as I did in Committee, that article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards in Clause 18 of the Bill. Whatever your opinion of recitals and their ultimate resting place, recital 159 is clear that the term “scientific research” should be interpreted,

“in a broad manner including for example technological development and demonstration”.

This is a fast-moving area where the use of such technology is likely to increase over the next few years, so I take the point of the noble Lord, Lord Clement-Jones, that this is an area that needs to be watched. That is partly why Clause 9(6) provides a delegated power to add further processing conditions in the substantial public interest if new technologies, or applications of existing technologies, emerge. That would allow us to make any changes that are needed in the future, following further consultation with the parties that are likely to be affected by the proposals, both data controllers and, importantly, data subjects whose sensitive personal data is at stake. For those reasons, I hope the noble Lord is persuaded that there are good reasons for not proceeding with his amendment at the moment.

The noble Baroness, Lady Hamwee, asked about behavioural characteristics. I had hoped that I might get some inspiration, but I fear I have not, so I will get back to her and explain all about behavioural characteristics.

Lord Clement-Jones

My Lords, I realise that, ahead of the dinner break business, the House is agog at details of the Data Protection Bill, so I will not prolong the matter. The Minister said that things are fast-moving, but I do not think the Government are; they are moving at the pace of the slowest in the convoy on this issue. We are already here. The Minister says it is right that we should have alternatives, but for a lab that wants to use facial recognition techniques, having alternatives is just not practical. The Government are going to have to rethink this, particularly in the employment area. As more and more banks require it as part of their identification techniques, it will become of great importance.

We are just around the corner from these things, so I urge the Minister, during the passage of the Bill, to look again at whether there are at least some obvious issues that could be dealt with. I accept that some areas may be equivocal at this point, only we are not really talking about the future but the present. I understand what the Minister says and I will read his remarks very carefully, as no doubt will the industry that increasingly uses and wants to use biometrics. In the meantime, I beg leave to withdraw the amendment.