Data Protection Bill [HL]

Baroness Neville-Rolfe Excerpts
Committee: 6th sitting (Hansard): House of Lords
Wednesday 22nd November 2017


Lords Chamber
Amendment Paper: HL Bill 66-VI Sixth marshalled list for Committee (PDF, 286KB) - (20 Nov 2017)
Lord Griffiths of Burry Port (Lab)

My Lords, I thank the noble Lord for his eloquent disquisition, which made me much more aware of the issues than I was before. I have no problem in aligning myself with the two points of view that have just been expressed. I had come to the conclusion partly myself, but to be told that the wording is not in the equivalent article in the European GDPR just adds to my simple conclusion that the words “other adverse effects” add precisely nothing but open a potential cave of dark possibilities. The rain of the noble Lord’s eloquence has found a crack in my roof, and I am very happy to align myself with his remarks.

Baroness Neville-Rolfe (Con)

I also share the concerns expressed by my noble friend Lord Hunt, based on my experience both in government and in a number of different businesses. We have experience not only of the motor sector, which has been talked about, but also of PPI, where compensation needed to be paid but the whole business took years and generated claims management companies, nuisance calls and many other harms. This is an area in which one has to be very careful, and I support looking carefully at the drafting to see what can be done, and at my noble friend’s idea of trying to estimate the economic impact—the costs—for those affected. That would help one to come to a sensible conclusion on what is appropriate in this important Bill.

Baroness Chisholm of Owlpen (Con)

My Lords, I thank my noble friend Lord Hunt for explaining Amendment 170A, and other noble Lords who have spoken. The amendment seeks to clarify the definition of “damage” provided by Clause 159 and its relationship to the language used in article 82 of the GDPR. This is important because article 82 of the GDPR provides a right to compensation when a person has suffered damage as the result of an infringement of their rights during the processing of their personal data.

Currently, the type of damage that can be claimed is broader under article 82 than under Section 13 of the 1998 Act, as article 82 expressly extends to “non-material” damage. As a result, in drafting the Bill, the Government considered that some definition of “damage” was necessary, including specifying that it extends to distress, to provide clarity and certainty for data subjects and others as to their rights under article 82.

I stress that Clause 159 does not seek to provide a wider definition of “damage” than is currently provided in the GDPR, and nor indeed could it. The intention is simply to clarify the GDPR’s meaning. My noble friend Lord Hunt asked what estimates have been made of the financial consequences of the increase in litigation, but as Clause 159 does not provide a wider definition of damage there will be no financial consequence.

The concept of “damage” included in the GDPR reflects developments in case law over a period of some years. As such, I cannot agree with my noble friend’s suggestion that the Bill or the GDPR will suddenly unleash a free-for-all of claims. However, I am happy to reflect on my noble friend’s point that the Bill’s use of the term “other adverse effects” may unintentionally provide uncertainty rather than clarity. With the reassurance that I will go away and look at that, I hope my noble friend feels able to withdraw his amendment.

--- Later in debate ---
Lord Clement-Jones (LD)

We are in the thickets here at the interface between technology, techno-speak and legality. Picking our way through Clause 162 is going to be rather important.

There are two schools of thought. The first is that we can amend this clause in fairly radical ways—and I support many of the amendments proposed by the noble Lord, Lord Stevenson. Of course, I am speaking to Amendment 170E as well, which tries to simplify the language and make it much more straightforward in terms of retroactive approval for actions taken in this respect, and I very much hope that parliamentary draftsmen will approve of our efforts to simplify the language. However, another, more drastic school of thought is represented by many researchers—and the noble Lord, Lord Stevenson, has put very well the case that they have put to us: that the cause of security research will be considerably hampered. But it is not just the research community that is concerned, although it is extremely concerned about the lack of definition, the sanctions and the restrictions that the provisions appear to place on its activities. Business is also concerned, as numerous industry practices might be considered illegal and a criminal offence, including browser fingerprinting, data linkage in medicine, what they call device reconciliation, and offline purchase tracking. So there is a lot of uncertainty for business as well as for the academic research community.

This is where we get into the techno-language. We are advised that modern, privacy-enhancing technologies such as differential privacy, homomorphic encryption—I am sure that the Minister is highly familiar with that—and question and answer systems are being used and further developed. There is nothing worse than putting a chill on the kind of research that we want to see by not acknowledging that there is the technology to make sure that we can do what we need to do and can keep our consumers safe in the circumstances. The fact is that quite often anonymisation, as we are advised, can never be complete. It is only by using this new technology that we can do that. I very much hope that the Minister is taking the very best legal and technology advice in the drafting and purposes of this clause. I am sure that he is fully aware that there is a great deal of concern about it.
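To make the reference concrete, here is a minimal, purely illustrative sketch of one of the privacy-enhancing techniques mentioned above, differential privacy: a count query is answered with calibrated Laplace noise so that the presence or absence of any single individual cannot be inferred from the result. The dataset, function names and the epsilon value are invented for the example and are not taken from the Bill or the debate.

```python
# Illustrative sketch only: a differentially private count query.
# The records, predicate and epsilon value are hypothetical examples.
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float = 0.5) -> float:
    """Answer 'how many records satisfy the predicate?' with noise
    calibrated to the query's sensitivity (1 for a count), masking the
    presence or absence of any single person."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(scale=1.0 / epsilon)

if __name__ == "__main__":
    patients = [{"age": 34, "condition": "asthma"},
                {"age": 61, "condition": "diabetes"},
                {"age": 47, "condition": "asthma"}]
    print(private_count(patients, lambda r: r["condition"] == "asthma"))
```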

Baroness Neville-Rolfe

I rise to support the noble Lords, Lord Stevenson and Lord Clement-Jones, and some of the amendments in this group on this, the final day in Committee. I congratulate my noble friends Lord Ashton and Lady Chisholm of Owlpen as well as the indefatigable Bill team for taking this gargantuan Bill through so rapidly.

The problem caused by criminalising re-identification was brought to my attention by one of our most distinguished universities and research bodies, Imperial College London. I thought that this was a research issue, which troubled me but which I thought might be easy to deal with. However, talking to the professor in the computational privacy group today, I found, as the noble Lord, Lord Clement-Jones, said, that it goes wider and could cause problems for companies as well. That leads me to think that I should probably draw attention to my relevant interests in the House of Lords register of interests.

The computational privacy group explained that the curious addition of Clause 162—which is different in character and language from other parts of the Bill, as the noble Lord, Lord Stevenson, said—draws on Australian experience, but risks halving the work of the privacy group, which is an academic body, and possibly creating costs and problems for other organisations and companies. I am not yet convinced that we should proceed with this clause at all, for two reasons. First, it will not address the real risk of unethical practice by people outside the UK. As the provision is not in the GDPR or equivalent frameworks in most other countries, only UK and Australian bodies or companies will be affected, which could lead to the migration of research teams and data entrepreneurs to Harvard, Paris and other sunny and sultry climes. Secondly, because it will become criminal in the UK to re-identify de-identified data—it is like saying “seashells on the seashore”—the clause could perversely increase the risk of data being re-identified and misused. It will limit the ability of researchers to show up the vulnerability of published datasets, which will make life easier for hackers and fraudsters—another perversity. For that reason, it may be wise to recognise the scope and value of modern privacy-enhancing technologies in ensuring the anonymous use of data somewhere in the Bill, which could perhaps be looked at.
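By way of a purely illustrative sketch of the kind of re-identification research described here, namely showing up the vulnerability of published datasets, the following links a "de-identified" table back to named individuals wherever a combination of quasi-identifiers (postcode district, birth year and sex) is unique in both datasets. All records and field names are invented and do not come from any real dataset.

```python
# Illustrative sketch only: linkage re-identification on quasi-identifiers.
# All records and field names are invented for the example.

def reidentify(deidentified_rows, public_register):
    """Match 'anonymous' rows to named people where the combination of
    postcode district, birth year and sex is unique in both datasets."""
    matches = []
    for row in deidentified_rows:
        key = (row["postcode_district"], row["birth_year"], row["sex"])
        candidates = [p for p in public_register
                      if (p["postcode_district"], p["birth_year"], p["sex"]) == key]
        if len(candidates) == 1:  # unique combination => re-identified
            matches.append((row["record_id"], candidates[0]["name"]))
    return matches

deidentified = [{"record_id": 17, "postcode_district": "SW7",
                 "birth_year": 1968, "sex": "F", "diagnosis": "asthma"}]
register = [{"name": "A. Example", "postcode_district": "SW7",
             "birth_year": 1968, "sex": "F"}]
print(reidentify(deidentified, register))   # [(17, 'A. Example')]
```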

I acknowledge that there are defences in Clause 162—so, if a person faces prosecution, they have a defence. However, in my experience, responsible organisations do not much like to rely on defences to criminal prohibitions, as these can be open to dispute. I am also grateful to the noble Lord, Lord Stevenson—I am so sorry about his voice, although it seems to be getting a bit better—for proposing an exemption in cases where re-identification relates to demonstrating how personal data can be re-identified or is vulnerable to attack. However, I am not sure that the clause and its wider ramifications have been thought through. I am a strong supporter of regulation to deal with proven harm, especially in the data and digital area, where we are still learning about the externalities. But it needs to be reasonable, balanced, costed, careful and thought through—and time needs to be taken for that purpose.

I very much hope that my noble friend the Minister can find a way through these problems but, if that is not possible, I believe that the Government should consider withdrawing the clause.

Lord Lucas (Con)

I very much support what my noble friend has just said. The noble Lord, Lord Stevenson, has tried to give an exemption for researchers, but a lot of these things will happen in the course of other research. You are not spending your time solely trying to break some system; you are trying to understand what you can get from it, and suddenly you see someone you know, or you can see a single person there. It is something that you can discover as a result of using the data; you can get to the point where you understand that this is a single person, and you could find out more about them if you wanted to. If it is a criminal offence, of course, you will then tell nobody, which rather defeats the point. You ought to be going back to the data controller and saying that it is not quite right.

There are enormous uses in learning how to make a city work better by following people around with mobile phone data, for instance, but how do you anonymise it? Given greater computational power and more datasets becoming available, what can you show and use which does not have the danger of identifying people? This is ongoing technology—there will be new ways of breaking it and of maintaining privacy, and we have to have that as an active area of research and conversation. To my mind, this clause as it presently is just gets in the way.
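A toy sketch of the point about mobile phone data: with only a handful of place-and-time observations, many traces become unique, which is why anonymisation of mobility data is so hard. The "unicity" measure below counts the fraction of users pinned down by two such points. The traces are invented; real studies work with millions of records.

```python
# Illustrative sketch only: how identifying a few location points can be.
# The trace data is invented for the example.
from itertools import combinations

def unicity(traces: dict, points: int = 2) -> float:
    """Fraction of users whose trace is uniquely determined by some
    combination of `points` (place, hour) observations."""
    unique_users = 0
    for user, trace in traces.items():
        for combo in combinations(sorted(trace), points):
            holders = [u for u, t in traces.items() if set(combo) <= t]
            if holders == [user]:
                unique_users += 1
                break
    return unique_users / len(traces)

traces = {
    "user_a": {("Paddington", 8), ("Canary Wharf", 9), ("Soho", 19)},
    "user_b": {("Paddington", 8), ("Westminster", 9), ("Soho", 19)},
    "user_c": {("Paddington", 8), ("Canary Wharf", 9), ("Camden", 20)},
}
print(unicity(traces, points=2))   # 1.0: two points suffice for everyone here
```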

--- Later in debate ---
Debate on whether Clause 162 should stand part of the Bill.
Baroness Neville-Rolfe

My Lords, I simply wish to associate myself with the comments of the noble Lord, Lord Stevenson, and say that a meeting on this would be helpful. As I said, I hope that we can find a solution. If we cannot, I have reservations about this measure being part of the Bill.

Lord Ashton of Hyde

I make it plain to my noble friend—my predecessor in this position—that I will arrange a meeting.