Data (Use and Access) Bill [HL] Debate
Lord Davies of Brixton (Labour - Life peer)
Lords Chamber

I welcome the Bill and thank my noble friend Lady Jones of Whitchurch for her clear introduction. It represents a significant improvement on the Data Protection and Digital Information Bill that we had such fun discussing last year under the previous Government. I thank the noble Viscount, Lord Camrose, for his handling of the Bill at that stage and look forward to continuing these discussions.
However, there are some concerns on which it would be good to have some reassurance from the Government, so I welcome the opportunity to discuss potential improvements during the Bill’s passage through the House. It is also worth bearing in mind the remarks of the noble Lord, Lord Holmes of Richmond, that this is a fast-moving field and there is a continual danger of fighting the last war. I think that is certainly the case in relation to AI. So, time in Committee will have to be spent considering whether there is more that needs to be done because of the way the world has developed.
I am pleased that the Bill no longer covers information for social security purposes. I am not so pleased that it is going to reappear through the separate fraud, error and debt Bill. That is, of course, a discussion for another day. My Government announced that Bill two months ago and we have not yet seen it, so fingers crossed they are having second thoughts.
My prime concern with the Bill, and where I want to ensure that there are adequate safeguards, is individuals’ health data and what the provisions in the Bill mean for patients and for the public. It is notable that one of the key stated purposes of the Bill is to
“build an NHS fit for the future”,
which is of course one of the Government’s five missions.
My noble friend Lord Stevenson of Balmacara, who is not in his place, set out the issues very clearly. Nevertheless, I will repeat them, because I think that the point is so important. We have the problem that data regulation can slow down the pace of data sharing, increase risk aversion and make research and innovation more difficult. That is all true—it is true of all data, but particularly of health data. However, patients and the public rightly expect high standards for data protection, particularly when it comes to their health data, and I am worried that the protections in the Bill are not as strong as might be wished. This will need close examination during its passage through Committee. To get this wrong would damage public trust, negatively impact patient care, complicate the running of the health service and have a harmful effect on academic research and our life sciences industry. We must do our best to ensure that these concerns prove misplaced; I hope that I am wrong.
Under current data protection laws there are transparency obligations, which mean that data subjects must be given information explaining how their data is used. Reusing data for a different purpose is currently possible, but only in limited circumstances—for example, reuse by the UK Health Security Agency. The main point of concern with the Bill, however, is Clause 77, which, in the words of the BMA,
“will water down the transparency of information to patients”.
I suggest that we have to take the BMA’s concerns most seriously, both on this point, which I am highlighting, and on the other points it has made. What we have is a situation where data collected for one purpose can be reused for scientific research. In those circumstances, there is not necessarily a requirement to tell the data subjects about it. The definition of “scientific research” is very wide. It can be commercial or non-commercial. It can be funded publicly or privately. It also covers technological development, which broadens the idea of scientific research considerably.
Clearly, this is thought to be a good thing. It will remove barriers to valuable health research—timely availability of data is important when you are undertaking research—and it is always possible that, during the course of the research, you will identify things that were not in the original proposal. All that is right, but there is a risk of data being reused for activities that data subjects might not have supported, have no control over and do not even know are happening. This feels like it contradicts the “no surprises” Caldicott principle. It is unclear to me at this stage who exactly is going to have oversight of all the data reuses to check that they are ethical and that the right standards are being applied.
The consequence is a real risk of the loss of patient and public trust in data use and sharing within the health sector and more widely. To reiterate, patients and the public rightly expect high standards of data processing to protect their confidential health data. I have serious concerns that the Bill, in its current state, runs the risk of diluting those standards and protections.
The underlying policy priority for the Bill, as I understand it, is to stimulate innovation through broadening the definition of “scientific research”. However, there is concern—for example, that expressed by the Ada Lovelace Institute—that, as currently written, the provisions in the Bill are susceptible to misuse. We must ensure that the Bill explicitly forbids the mass reuse of personal data scraped from the internet or acquired through social media for AI product development under the auspices of “scientific research”; such reuse carries the potential for considerable public backlash. Voluntary commitments from the tech industry to protect people from the potential harms of AI models are welcome, of course, but are not good enough. Only hard rules enshrined in law can incentivise the developers and deployers of AI to comply, and empower the regulators to act.
Another unknown at this stage—I hope my noble friend can guide us here—is how far the Bill diverges from EU standards and potentially puts at risk the free flow of personal data between the EU and the UK. This free flow is critical to medical research and innovation and must be maintained.
I am also concerned about the issue of making data anonymous. It is incredibly difficult to make medical data truly anonymous. Medical data is valueless in most cases if you do not know the subject’s age or their pre-existing conditions, and as soon as it contains that sort of detail it is open to manipulation. I believe that to counter those problems we need to expand the use of so-called trusted research environments. This is a well-developed technique in which Britain is the leader. I believe it should be a legal requirement in this field. The Bill does not go that far. It is certainly something we should discuss in Committee.
This is a system where the information—the subject’s data—is kept within a locked box. It stays within the box. The medical researchers, who are crucial, come up with their program, using a sandbox, which is then applied to the locked-away data. The researchers do not get the data; they get only the results of their inquiry. They do not go anywhere near the data. This level of protection is required to achieve public support. The outcome of the research in these circumstances is identical, but the subjects’ medical information—crucially, but not only, their genetic information—is kept away and kept secure.
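To illustrate the pattern, here is a minimal sketch in Python. Every name in it is hypothetical rather than drawn from any real trusted research environment: the researcher submits a program, the environment runs it against the locked-away records, and only aggregate results are released.

```python
# Illustrative sketch only: how a trusted research environment keeps
# subject-level data inside a "locked box" while still answering research
# questions. All names here are hypothetical, not any real TRE's API.

from dataclasses import dataclass
from typing import Callable


@dataclass
class PatientRecord:
    age: int
    condition: str
    outcome: bool


class TrustedResearchEnvironment:
    """Holds the records; researchers never receive them directly."""

    def __init__(self, records: list[PatientRecord]):
        self._records = records  # stays inside the environment

    def run_query(self, analysis: Callable[[list[PatientRecord]], dict]) -> dict:
        """Run the researcher's submitted program against the locked-away
        data and release only the result, after a simple disclosure check."""
        result = analysis(self._records)
        if any(isinstance(value, PatientRecord) for value in result.values()):
            raise ValueError("Row-level data may not leave the environment")
        return result


# The researcher writes and submits their program ("the sandbox") ...
def diabetes_outcome_rate(records: list[PatientRecord]) -> dict:
    cohort = [r for r in records if r.condition == "diabetes"]
    return {
        "cohort_size": len(cohort),
        "positive_outcome_rate": sum(r.outcome for r in cohort) / len(cohort),
    }


# ... and receives only the results of the inquiry, never the data itself.
tre = TrustedResearchEnvironment([
    PatientRecord(67, "diabetes", True),
    PatientRecord(54, "diabetes", False),
    PatientRecord(71, "asthma", True),
])
print(tre.run_query(diabetes_outcome_rate))
# {'cohort_size': 2, 'positive_outcome_rate': 0.5}
```

A real environment adds approvals, auditing and statistical disclosure checks before any output is released; the sketch shows only the core design choice, which is that the researcher’s code travels to the data rather than the data travelling to the researcher.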
Finally, another point of concern that has been mentioned by a number of speakers is automated decision-making. The Bill removes the general prohibition on automated decision-making, placing responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. Even with the new safeguards being introduced, people will struggle to get meaningful explanations about decisions that will deeply affect their lives and will have difficulty exercising their right to appeal against automated decisions when the basis on which the decisions have been made is kept from them.
With those concerns, which I am sure we will discuss in Committee, I support the Bill.