Thursday 24th June 2021

Commons Chamber
Mr David Davis (Haltemprice and Howden) (Con)

In winding up the last debate, the Minister for the Armed Forces referred to volunteering a mucker for the guardroom. I hope that my entire speech does not sound like that to the Secretary of State; it is not intended to.

Every couple of years, Whitehall, like an overexcited teenager expecting a new mobile phone, becomes fixated with data. Most recently, the fixation has been the power of big data mining, and I am sure that is not just because of the influence of Mr Dominic Cummings. The Department of Health and Social Care wants to open our GP medical records, some 55 million patients' worth, to pharmaceutical companies, universities and researchers.

Managed properly, that data could transform, innovate and help to overcome the great challenges of our time, such as cancer, dementia and diabetes. Those are proper and worthwhile ambitions in the national interest, and I have little doubt that that was the Government’s aim, but that data is incredibly personal, full of facts that might harm or embarrass the patient if they were leaked or misused. Psychiatric conditions, history of drug or alcohol abuse, sexually transmitted infections, pregnancy terminations—the list is extensive. Revealing that data may not be embarrassing for everyone, but it could be life-destroying for someone.

Unfortunately, in keeping with the Department’s long history of IT failures, the roll-out of the programme has been something of a shambles. The Government have failed to explain exactly how they will use the data, have failed to say who will use it and—most importantly—have failed to say how they will safeguard this treasure trove of information. They describe the data as “pseudonymised” because it is impossible to fully anonymise medical records, a fact that is well understood by experts in the field.

Even when the data is pseudonymised, anyone can be identified if someone tries hard enough. Take Tony Blair, who was widely known to have developed a heart condition, supraventricular tachycardia, in October 2003. He was first admitted to Stoke Mandeville and then rushed to Hammersmith. One year later, in September 2004, he visited Hammersmith again for a corrective operation. Even the name of the cardiologist is in the public record. A competent researcher would make very short work of finding such individual records in a mass database. That cannot be for the public good. Moreover, the Government seem to intend to keep hold of the keys to unlock the entire system and identify an individual if the state feels the need to do so.
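
The re-identification risk described above is known as a linkage attack: a handful of publicly known events is enough to single out one pseudonym in a large extract. The sketch below is a toy illustration only; all identifiers, events and dates are invented for the example and bear no relation to any real dataset.

```python
# Toy linkage attack on "pseudonymised" records: names are replaced with
# opaque IDs, but clinical events and dates are left intact.
pseudonymised = [
    {"pid": "a91f", "event": "SVT admission",  "hospital": "Stoke Mandeville", "date": "2003-10"},
    {"pid": "c07b", "event": "appendectomy",   "hospital": "St Thomas'",       "date": "2003-10"},
    {"pid": "a91f", "event": "cardioversion",  "hospital": "Hammersmith",      "date": "2004-09"},
]

# Facts about one individual that are already in the public record.
public_facts = [
    {"event": "SVT admission", "hospital": "Stoke Mandeville", "date": "2003-10"},
    {"event": "cardioversion", "hospital": "Hammersmith",      "date": "2004-09"},
]

def reidentify(records, facts):
    """Return the pseudonym IDs whose history matches every public fact."""
    candidates = {r["pid"] for r in records}
    for fact in facts:
        # Keep only pseudonyms that have a record matching this fact.
        matching = {r["pid"] for r in records
                    if all(r[k] == fact[k] for k in fact)}
        candidates &= matching
    return candidates

print(reidentify(pseudonymised, public_facts))  # the set narrows to one pseudonym
```

With only two publicly known hospital visits, the candidate set collapses to a single record, which is why "pseudonymised" is not the same as "anonymised".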

Jim Shannon (Strangford) (DUP)

I congratulate the right hon. Gentleman on securing the debate; I have been inundated with the same concerns from many of my constituents. Does he agree that a system that allows a diversion from the court-appointed warrant to collect information is a dangerous precedent in terms of judicial due process? We must ensure that anyone who opts out is completely opted out, as is promised.

--- Later in debate ---
Mr David Davis

I take the hon. Gentleman’s point and will elaborate on it as I make progress. As presented, the plan is to collect the data first and think about the problems second, but the information is too important and the Department’s record of failed IT is too great for it to be trusted with carte blanche over our privacy.



There is also the so-called honeypot problem. Data gathered centrally inevitably attracts actors with nefarious intentions: the bigger the database, the greater the incentive to hack it. If the Pentagon, the headquarters of the US Department of Defense, and even Microsoft have suffered successful cyber-attacks, what chance does our NHS have?

Mr Deputy Speaker (Mr Nigel Evans)

Order. As we are coming towards 5 o’clock, I will just go through the following technical process.

--- Later in debate ---
Motion made, and Question proposed, That this House do now adjourn.—(James Morris.)
Mr David Davis

Thank you, Mr Deputy Speaker. I take it you do not want me to start from the beginning again. That might test people’s patience a little.

As I was saying, if the giants of data security can be hacked, what chance the NHS? Big databases and big systems are intrinsically vulnerable. In 2017, a ransomware attack brought parts of the NHS to its knees. Trusts were forced to turn away patients, ambulances were diverted and 20,000 operations were cancelled. That highlights significant problems that the Government have not yet addressed. Despite those problems, the Government have been determined to press ahead with their data plans regardless. They undertook no widespread consultation, provided no easy opt-out, and showed no particular willingness to listen, as would be proper with such an important move. The public were given little over a month to opt out of a data grab that few knew existed. The plan was described by the British Medical Association as “a complete failure” and “completely inadequate”.

The Government’s riding roughshod over our privacy was halted only when a coalition of organisations, including digital rights campaign group Foxglove, the Doctors’ Association UK, the National Pensioners Convention and myself, challenged the legality of the state’s actions. Our letter before legal action and threat of injunction forced a delay of two months. That is a welcome pause, but it has not resolved the issue.

Earlier this week, the Secretary of State published a data strategy that raised the possibility of using health data to improve care, something I know is close to his heart, but plans for securing and handling our data were consigned to a single paragraph—almost an afterthought. If the Government do not take corrective action to address our concerns, there will inevitably be a full judicial review. I have no doubt that, without clear action to both protect privacy and give patients control of their own data, the Government will find themselves on the losing side of any legal case.

Today, I hope and believe the Government will have the courtesy to listen. Indeed, if I may, I thank the Secretary of State for being here in person today. It is very unusual for a Secretary of State, who must be the busiest man in the Government, to take the time to address such a debate himself. That he has done so is, I think, a compliment to him.

A comprehensive health database undoubtedly has the potential to revolutionise patient treatment and save hundreds of thousands of lives. However, this data grab is not the correct approach. There are much better, safer and more effective ways to do this in the national interest. No system is ever going to be 100% safe, but it must be as safe as possible. We must find the proper balance between privacy and progress, research and restrictions, individual rights and academic insights. That also means controlling the companies we allow into our health system. Patient trust is vital to our NHS, so foreign tech companies such as Palantir, with their history of supporting mass surveillance, assisting in drone strikes, immigration raids and predictive policing, must not be placed at the heart of our NHS. We should not be giving away our most sensitive medical information lightly under the guise of research to huge companies whose focus is profits over people.

Of course, this was not Whitehall’s first attempt at a medical data grab. The failed care.data programme was the most notorious attempt to invade our privacy. Launched in 2013, NHS Digital’s project aimed to extract data from GP surgeries into a central database and sell the information to third parties for profit. NHS Digital claimed the data was going to be anonymised, not realising that that was actually impossible. The Cabinet Office described the disaster as having

“major issues with project definition, schedule, budget, quality and/or benefits delivery, which at this stage do not appear to be manageable or resolvable.”

The project was scrapped in July 2016, having wasted £8 million.

However, care.data was just one example. I am afraid the Department has a long and problematic history with IT. Before care.data, the NHS National Programme for IT was launched by Labour in 2003. It sought to link more than 30,000 GPs to nearly 300 hospitals with a centralised medical records system for 50 million patients. The initial budget of £2.3 billion (note billion, not million) ballooned to £20 billion, which had to be written off when the programme collapsed in 2011. My old Committee, the Public Accounts Committee, described the failed programme as one of the

“worst and most expensive contracting fiascos”

ever.

The possibilities to make research more productive, quicker and more secure are goals worth pursuing. There is no doubt that we all agree on the aims, but the path to progress must be agreed on, and there is clear concern among the public, GPs and professional bodies about this new data system.

Rachael Maskell (York Central) (Lab/Co-op)

I am very grateful to the right hon. Gentleman not only for giving way, but for leading today’s very important debate. It has been a really difficult year both for clinicians and for the public. The public understand the importance of research and planning, but they need confidence that their data—often about very intimate health needs—is secure. Given the need to maintain the special relationship between the clinician and patient, does he agree that the insufficiency of the current processes will damage that relationship, and therefore that we need a complete rethink about how data is collected and then used appropriately?

--- Later in debate ---
Mr David Davis

I do absolutely agree. I think there is a common interest, frankly, between everybody in this House, including those on the Front Bench. The worst thing that can happen here is a failure of trust. The failure of public trust in the care.data system saw some 2 million people opt out. That is not what we want to see here, but we could easily exceed that figure with this programme.

A lack of trust will undermine the usefulness of the dataset the Government hope to collect. The Guardian reported this month:

“All 36 doctors’ surgeries in Tower Hamlets…have already agreed to withhold the data”

had the collection gone ahead on 1 July as was planned. Other parts of the country are seeing more than 10% of patients withdraw their data via their GP surgery, and that is with little to no public awareness campaign. Much of this would have been avoided had the Government trusted Parliament and the public with a detailed and carefully thought-through plan. As the BMA noted:

“Rushing through such fundamental changes to confidential healthcare data, losing the confidence of the public and the profession, will severely undermine the programme and threaten any potential benefits it can bring”.

The BMA is entirely correct.

Despite the errors so far, this proposal need not be consigned to the ash heap of NHS history. There are ways of safely achieving the vast majority of what the Government want. OpenSAFELY is a new analytics platform, principally authored by Dr Ben Goldacre, Liam Smeeth and Seb Bacon, that was created during the pandemic to provide urgent data insights, so I know the Health Secretary will be very familiar with it. Working with 58 million NHS records distributed across a range of databases, rather than gathered into one, their software keeps health data within the secure systems where it is already stored. The data is not transported outside the existing servers, and no central honeypot target is created.

The programme sees the data, but the researcher does not. Furthermore, all activity involving the data is logged for independent review. The way it works is that the researcher sets up the experiment, and the programme returns the results, such as a hypothesis test, a regression analysis or an associational graph. At no point does the researcher need to see the raw patient data; they simply see the outcome of their own experiment. This is very important because the biggest risk with any new data system is losing control of data dissemination. Once it is out, like Pandora’s box, you cannot close the lid.
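
The model described above, in which the researcher submits an analysis and receives only its results, can be sketched in miniature. This is an illustrative toy under stated assumptions, not the real OpenSAFELY API; the class name, the audit-log shape and all the patient data are invented for the example.

```python
# Minimal sketch of the "researcher sees results, not records" model.
from datetime import datetime, timezone

class SecureAnalyticsBroker:
    """Holds patient records server-side; releases only aggregate results."""

    def __init__(self, records):
        self._records = records      # never exposed to the caller
        self.audit_log = []          # every query is logged for independent review

    def run_query(self, researcher, description, aggregate_fn):
        """Log the request, run the aggregate function, return only its result."""
        self.audit_log.append({
            "researcher": researcher,
            "description": description,
            "time": datetime.now(timezone.utc).isoformat(),
        })
        # A real platform would also check the result is a safe aggregate
        # (e.g. suppressing small counts) before releasing it.
        return aggregate_fn(self._records)

# Invented toy data.
records = [
    {"age": 54, "condition": "diabetes"},
    {"age": 67, "condition": "diabetes"},
    {"age": 41, "condition": "asthma"},
]
broker = SecureAnalyticsBroker(records)

count = broker.run_query(
    "dr_example",
    "count of diabetes patients over 50",
    lambda rs: sum(1 for r in rs if r["condition"] == "diabetes" and r["age"] > 50),
)
print(count)                  # 2; the researcher never handled the raw rows
print(len(broker.audit_log))  # 1 entry recorded for later review
```

The researcher specifies the computation; the platform runs it next to the data and releases only the answer, so the raw records never leave the secure environment.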

OpenSAFELY gets us 80% to 90% of the way to the Government’s objectives. Operated under rigorous access controls, it could deliver the vast majority of the research benefit with very little risk to the security of the data. It is therefore a viable approach, provided there is a properly thought-through opt-out system for patients. That is what has been severely lacking so far: where are the texts, the emails and the letters to patients that should have been there at the beginning? On the “Today” programme earlier this week, the Health Secretary indicated that he was now willing to contact every patient. That is very welcome. I hope he is now writing to every single patient involved in this proposed database and informing them properly. That information should be in plain English, or another community language, not technical jargon. Everything in the letter must be easily verifiable: clear facts for clear choices. The letter should have the approval of the relevant civil society organisations that campaign on privacy and medical data issues, to give it credibility. In contrast with the disastrous scenes of only a few weeks ago, patients should be able to opt out through their choice of a physical form with a pre-paid return, an easily accessible online form, or a simple notification to their GP. As well as the physical letter, a reminder should be sent shortly before their data is accessed, which, again, should give the patient a clear way to change their mind and opt out. The overall aim must be to give patients more control, more security and more trust in the process, and that requires very high levels of transparency.

However, my understanding is that the Government want to go further than the 80% or 90% that we could do absolutely safely. They want to allow, I think, partial downloads of datasets by researchers, albeit under trusted research environment conditions. They may even go further and wish to train AIs in this area, or allow outside third-party companies to do so. In my view, that is a bridge too far. One of the country’s leading professors of software security told me only this week that it is difficult to ensure that some designs of AI will not retain details of individual data. The simple fact is that at the moment AI is, effectively, a digital technology with analogue oversight. Other researchers argue for other reasons that they need to have more direct access to the data. Again, as I understand it, the Government’s response is downloading partial samples of these databases under the control of technology that will track the researcher’s every click, keystroke and action, and take screenshots of what their computer shows at any point in time. I am afraid that I am unpersuaded of the security of that approach. Downloading any of these databases, even partially, strikes me as being a serious risk.

The stark fact is that whether it be data downloads, AI or other concerns that we are not yet aware of, there are significant ethical and risk implications. If the Government want to go beyond what is demonstrably safe and secure, an opt-out system is not sufficient. In this scenario, a database would only be viable as an opt-in system, with volunteers, if you like: people who have decided they are happy that their data is used in a system that is perhaps not perfectly secure. The risk is too great to work on the presumption of consent that an opt-out system has. The Government must make these risks of exposure and privacy absolutely clear to those willing to donate their data. It is obvious that an opt-in system will be significantly constrained by a much smaller data sample, but that is the only way we should countenance such risks. My strong recommendation to the Secretary of State is that the Government pursue the first stage properly with a closed technology like OpenSAFELY that can provide proper security, proper access for researchers, and proper reassurance to the public.
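
The distinction drawn above between presumed and explicit consent can be stated very simply in code. This is a schematic sketch; the field names and NHS numbers are invented for illustration.

```python
# Opt-out vs opt-in inclusion rules for a research extract (toy data).
patients = [
    {"nhs_number": "111", "opted_out": False, "opted_in": True},
    {"nhs_number": "222", "opted_out": False, "opted_in": False},
    {"nhs_number": "333", "opted_out": True,  "opted_in": False},
]

def eligible_opt_out(ps):
    """Opt-out model: everyone is included unless they actively object."""
    return [p["nhs_number"] for p in ps if not p["opted_out"]]

def eligible_opt_in(ps):
    """Opt-in model: only those who positively volunteered are included."""
    return [p["nhs_number"] for p in ps if p["opted_in"]]

print(eligible_opt_out(patients))  # ['111', '222'] - consent is presumed
print(eligible_opt_in(patients))   # ['111'] - a smaller, volunteered sample
```

The opt-in sample is necessarily smaller, which is the trade-off the speech accepts: for uses that cannot be shown to be safe, only freely volunteered data should be in scope.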

There is no doubt that this is a complex issue. However, it would be a dereliction of our duty if this House did not hold the Government to account on what could have been, and could still be, a colossal failure. Whether it intended it or not, the Department of Health has given us the impression that it did not take the privacy and security of our personal health records sufficiently seriously. This is extremely damaging to the Government’s cause, which I have no doubt is well-meaning. The Department needs to explain to the House how it will address the legitimate concerns and safeguard this most sensitive of personal data. Only by properly respecting the privacy of the citizen, and by obtaining freely given informed consent, can the Department deliver on its prime purpose, which must be enhancing the health of the nation—something that I know is absolutely close to the Secretary of State’s heart.