Data Protection and Digital Information Bill Debate
Grand Committee

My Lords, I rise to speak in favour of Amendments 1 and 5 in this group, and with sympathy towards Amendment 4. The noble Lord, Lord Clement-Jones, will remember when I was briefly Minister for Health. We had lots of conversations about health data. One of the things we looked at was a digitised NHS. It was essential if we were to solve many of the problems of the future and have a world-class NHS, but we had to make sure that patients were comfortable with the use of their data and the contexts in which it could be used.
When we were looking to train AI, it was important to make sure that the data was as anonymous as possible; for example, we looked at techniques such as synthetic and pseudonymised data. There is another point: having done the analysis and looked at the dataset, if you see an identifiable group of people who may well be at risk, how can you reverse-engineer that data so that those patients can be notified that they should be contacted for further medical interventions?
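As background to the reversibility point raised here, the sketch below is a minimal illustration of keyed pseudonymisation, one common way of squaring this circle: direct identifiers are replaced with HMAC-derived tokens before the data is analysed, and only the data controller, who holds the secret key and a private lookup table, can map an at-risk token back to a patient to arrange follow-up. All field names, numbers and thresholds are hypothetical; nothing here describes an actual NHS system.

```python
# Illustrative sketch only: keyed pseudonymisation of patient identifiers.
# All identifiers and clinical values below are invented.
import hmac
import hashlib

SECRET_KEY = b"held-only-by-the-data-controller"  # never shared with researchers

def pseudonymise(nhs_number: str) -> str:
    """Replace a direct identifier with a stable opaque token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

# The controller keeps a private mapping so that findings can be acted on later.
patients = ["943 476 5919", "943 476 5870"]           # hypothetical NHS numbers
token_to_patient = {pseudonymise(p): p for p in patients}

# Researchers see only tokens alongside clinical variables (values invented).
research_dataset = [
    {"token": pseudonymise("943 476 5919"), "hba1c": 9.1},
    {"token": pseudonymise("943 476 5870"), "hba1c": 5.4},
]

# Analysis flags an at-risk group by token alone...
at_risk_tokens = [row["token"] for row in research_dataset if row["hba1c"] > 8.0]

# ...and only the holder of the key and mapping can resolve tokens back to
# patients, which is the "reverse" step the speaker asks about.
for token in at_risk_tokens:
    print("Arrange follow-up for patient:", token_to_patient[token])
```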
I know that this makes it far too complicated; I just wanted to rise briefly, before the new rules come in next week, to support the noble Lord, Lord Clement-Jones, on this issue. It is essential that users and patients, in other spheres as well as health, have absolute confidence that their data is theirs and that they are given the opportunity to give permission, or to opt out, as far as possible.
One of the things that I said during my brief time as a Health Minister was that we can have the best digital health system in the world, but it is no good if people choose to opt out or do not have confidence in it. We need to make sure that the Bill gives patients that confidence where their data is used in other areas. We need to toughen this part of the Bill up. That is why I support Amendments 1 and 5 in the name of the noble Lord, Lord Clement-Jones.
My Lords, anonymisation of data is crucially important in this debate. I want to see, through the Bill, a requirement for personal data, particularly medical data, to be held within trusted research environments. This is a well-developed technique and Britain is the leader. It should be a legal requirement. I am not quite sure that we have got that far in the Bill; maybe we will need to return to the issue on Report.
The extent to which pseudonymisation—I cannot say it—is possible is vastly overrated. There is a sport among data scientists of being able to spot people within generally available datasets. For example, the data available to TfL through people’s use of Oyster cards and so on reveals an immense amount about individuals. Medical data is particularly susceptible to this, although the problem is not restricted to medical data. I will cite a simple example from publicly available data.
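To make the re-identification point concrete, the toy sketch below shows a simple linkage attack: a "pseudonymised" travel dataset keeps opaque tokens in place of card numbers, yet a couple of publicly knowable facts about someone's regular journeys are enough to single their record out. All journeys, stations and tokens are invented for illustration; no real TfL or Oyster data is involved.

```python
# Illustrative toy only: a linkage attack on "pseudonymised" data.
# All journeys, stations and tokens below are invented.

# Travel records with card numbers replaced by opaque tokens, but the
# journey pattern itself left intact.
journeys = {
    "token_a1": {("Brixton", "Westminster"), ("Westminster", "Brixton")},
    "token_b2": {("Morden", "Bank"), ("Bank", "Morden")},
    "token_c3": {("Brixton", "Canary Wharf"), ("Canary Wharf", "Brixton")},
}

# Background knowledge about a named person: they live near Brixton and
# work near Westminster, so their regular journeys are easy to guess.
known_pattern = {("Brixton", "Westminster"), ("Westminster", "Brixton")}

# Linkage step: find every token whose journeys contain the known pattern.
matches = [token for token, trips in journeys.items() if known_pattern <= trips]

if len(matches) == 1:
    print(f"Unique match: {matches[0]} almost certainly belongs to the target")
else:
    print(f"{len(matches)} candidates remain; more quasi-identifiers would narrow it")
```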
My Lords, I will be brief because I associate myself with everything that the noble Baroness, Lady Kidron, just said. This is where the rubber hits the road from our previous group. If we all believe that it is important to maintain children’s protections, I hope that my noble friend the Minister will be able to accept, if not the exact wording of the children-specific amendments in this group, then at least their direction of travel, and that he will commit to coming back and working with us to make sure that we can get wording into the Bill.
I am hugely in favour of research in the private sector as well as in universities and the public sector; we should not close our minds to that at all. We need to be realistic that all the meaningful research in AI is currently happening in the private sector, so I do not want to close that door at all, but I am extremely uncomfortable with a Secretary of State having the ability to amend access to personal data for children in this context. It is entirely sensible to have a defined code of conduct for the use of children’s data in research. We have real evidence that a code of conduct setting out how to protect children’s rights and data in this space works, so I do not understand why we would not take the same approach here, where we want the research to happen but we want children’s rights to be protected at a much higher level.
It seems to me that this group is self-evidently sensible, in particular Amendments 8, 22, 23 and 145. I put my name to all of them except Amendment 22 but, the more I look at the Bill, the more uncomfortable I get with it; I wish I had put my name to Amendment 22. We have discussed Secretary of State powers in each of the digital Bills that we have looked at and we know about the power that big tech has to lobby. It is not fair on Secretaries of State in future to have this ability to amend—it is extremely dangerous. I express my support for Amendment 22.
I just want to say that I agree with what the previous speakers have said. I particularly support Amendment 133; in effect, I have already made my speech on it. At that stage, I spoke about pseudonymised data, but I focused my remarks on scientific research. I suspect that the Minister’s assurances will not go far enough, although I do not want to pre-empt what he says and I will listen carefully to it. I am sure that we will have to return to this on Report.
I make a small additional point: I am not as content as the noble Baroness, Lady Harding of Winscombe, about commercial research. Different criteria apply; if we look in more detail at ensuring that research data is protected, there may be special factors relating to commercial research that need to be covered in a potential code of practice or more detailed regulations.