Artificial Intelligence in Healthcare Debate

Department: Department of Health and Social Care


Daniel Zeichner Excerpts
Thursday 5th September 2019


Westminster Hall



Daniel Zeichner (Cambridge) (Lab)

It is a pleasure to serve under your chairmanship, Mr Paisley. I congratulate the hon. Member for Crawley (Henry Smith) not only on securing the debate but on his thoughtful and comprehensive introduction to an extraordinarily complicated subject that I suspect will require much more debate in this place in future. I chair the all-party parliamentary group on data analytics and represent a constituency that is, of course, well known for its health services, innovation and tech cluster, not just in the city but around it.

The issue is therefore close to my heart. When I was elected as the Member of Parliament for Cambridge, I never imagined that I would spend quite so much time on such issues, but there are many jobs involved and huge opportunities available, exactly as the hon. Gentleman said. However, I suspect that I will be slightly less optimistic than him, because as I have begun to look at the issue more closely, it has struck me, as he said, that the only way that we will make it work is by maintaining the trust of patients, which is difficult—particularly given the behaviour of some of the major tech companies. It is not a lost cause, in my view, but we are going to need a qualitative change in regulation and protection if we are to secure some of the benefits that have already been referenced. Every day in Cambridge, I hear about new innovations and developments that convince me and, I think, many others that we really are on the cusp of a technological revolution across a range of sectors. Everywhere one goes in Cambridge, one sees people working on the most extraordinary things, and the gains are potentially huge, not just for our citizens but across the world.

It is hard to explain a lot of this to the public. I feel that I am in a privileged position going around Cambridge; I sometimes feel that I am the only person who is seeing all the various things that are going on, and one of my challenges is to try to spread the word about all the stuff that is happening. My worry is that often it is poorly communicated and poorly understood, and that misunderstanding can easily lead to a public backlash. I read with great interest the report from the all-party parliamentary group on heart and circulatory diseases; a very distinguished panel of people was behind it, and I will highlight some of the crucial points.

Ensuring that artificial intelligence really does enhance patient healthcare—and that it does not, as some of us fear could easily happen, get diverted on to a profit-seeking route—requires the following key elements: stakeholder engagement; an exact explanation of the risks and benefits; keeping researchers and academics involved; digital inclusion in general; proper development of policy, focusing on AI for public values; and the development of standards.

There are others, of course, working in a similar field. I am delighted to see present a fellow member of the APPG on data analytics, the hon. Member for North East Derbyshire (Lee Rowley). A few months ago he and my hon. Friend the Member for Bristol North West (Darren Jones) led a very good inquiry and produced, with a similarly illustrious panel of experts, an excellent report entitled “Trust, Transparency and Technology”. It is amazing how many people are working in this field at the moment. Part of that report—I suspect the hon. Gentleman will refer to it when he speaks—was focused on healthcare. He did the work, so I do not want to steal his thunder, but I will pick out a particular couple of things.

We drew on a 2018 survey by the Open Data Institute, whose statistics reflected those cited by the hon. Member for Crawley. Some 64% of consumers trusted the NHS and healthcare organisations with their personal data, which is more than the 57% who trusted their family and friends. Consumers also trusted the NHS more than they trusted their bank, the figure for which was also 57%; local government, for which it was 41%; and online retailers, for which it was 22%. I do not think they asked about the level of trust in politics; that is probably not recorded. Nearly half of respondents, 47%, were prepared to share medical data about themselves. I have seen different figures, and I would also reflect on the fact that 53% were not prepared to share data. Those who were willing to share, however, were prepared to do so provided that it helped develop new medicines and treatments. In terms of the trade-offs for data sharing, they were most keen to participate when it was for medical research.

As we know in politics, however, trust is hard won and easily lost, and we have to be careful. A few months ago, I was asked to write a foreword to a report by the think-tank Polygeia, entitled “Technology in Healthcare: Advancing AI in the NHS”. The report is consistent with other work in this field and comes to broadly similar conclusions to those we have already heard. There is also a sense that NHS staff need to be closely involved in these developments, to ensure that they are not just kept informed but given a genuine understanding of, and confidence in, how this can work. The black box algorithm to which the hon. Member for Crawley referred is still a little baffling and scary to a lot of people. If we are going to make this work, it is crucial that we consult, educate and take people with us. We must rely on the advice of medical and healthcare professionals, who are best placed to understand the concerns of both their patients and their colleagues.

We are constantly seeing new developments in the news. One of the joys of modern life is that when we go on holiday, we still watch our iPad. This summer I noticed the debate about DeepMind and its new ability to predict acute kidney failure; it gains up to 48 hours’ warning by looking at huge volumes of data and doing the number crunching. That is bound to be a good thing but, typically, there were people who questioned the methodology and who raised concerns about unforeseen consequences. I too think that these kinds of changes carry unforeseen consequences, and I will touch on one or two.

So far, I have been profoundly non-partisan and non-political, but I have to say that the new Secretary of State for Health and Social Care did rather wade in early on with his support for Babylon Health and GP at Hand. Those kinds of technologies provide tremendous opportunities but, as the hon. Member for Crawley said, such developments can be disruptive to the organisation of the national health service. There has been disruption to funding flows, particularly across London, and I hope the Minister will be able to reassure us about that. Simon Stevens made a commitment, but these changes are happening quickly and one of the things that we know about the NHS is that it is quite a tanker to turn around. Quick, unintended consequences are not always benign ones for the people on the receiving end.

The wider point, of course, is that some of us are worried that the NHS, which is free at the point of use, is being undermined by the creeping in of a potentially competitive system. That can be resolved in some ways: we can change the administrative structures, for example, and sort out the financial flows. My bigger fear relates to confidentiality and what is happening with patient data. It is frequently argued that the data will be anonymised, and this is where we get into the realm of the techies. Plenty of people have explained to me that it is possible to reverse that anonymisation process, because as clever as these machines are in terms of machine learning, they are also pretty clever at doing the reverse. I am now pretty much persuaded that there is no such thing as anonymity. We must face the fact that there are consequences to these tremendous gains, and think through how we should deal with some of them.

In itself, that does not necessarily matter. I remember years ago when, under the previous Labour Government, Alistair Darling unfortunately had to come to Parliament and explain that his Department had lost millions of people’s data. That week, everyone thought the world was going to come to an end, but it did not. An awful lot of data is out there already, which is not great, because we do not know who knows what about us. That is not necessarily a disaster, but if data is being used for the wrong purposes, it could be very damaging. This is my key point, I suppose: I am afraid that the evidence from the big tech companies, as we see almost daily, is that they have been doing things with our data that we did not know about. That is a problem we previously experienced with the Care.data failure in the NHS, which damaged public trust. It is absolutely essential that we do better in future if we are going to keep the public on board.

The report from the APPG on data analytics states:

“Key lessons from this failure are around data security and consent, and reinforce the need for proper public engagement in the development of data collection programmes, and gaining the right level of consent, if such consent is not subsequently to be withdrawn with major clinical and value for money implications. In the case of DeepMind, Dame Fiona Caldicott, the National Data Guardian at the Department of Health, concluded that she ‘did not believe that when the patient data was shared with Google DeepMind, implied consent for direct care was an appropriate legal basis’.”

There is a significant number of concerns and the issues are profound and difficult. We have a whole range of structures in place to try to deal with some of them, and I have huge respect for the Information Commissioner’s Office. The Information Commissioner frequently tells those of us who ask that that office does have the appropriate resources. Given the scale and difficulty of the task, I must say that I find that hard to believe, because it is a very big task indeed. The hon. Member for Crawley mentioned the Centre for Data Ethics and Innovation, which is at an early stage. Frankly, it, too, will struggle to find the resources to meet the scale of the task.

I sat on the Bill Committee for the Data Protection Act 2018, which implemented the general data protection regulation. Some parts could have been strengthened. I tabled amendments that would have tightened up the assurance that research institutions must process healthcare data ethically and for patient benefit, but sadly, the Government chose not to adopt them. I hope that they might look at the issue again. A feature of the lengthy discussions in Committee, particularly in the Opposition’s observations, was that although the legislation is worthy, it felt like legislation for the previous period rather than for the future, given the pace of change that we are likely to encounter. We were not convinced that it would keep up.

We need a much more radical set of safeguards. To stray slightly into the technical areas, when my local paper asked what my summer reading was going to be, it was surprised to hear that it was Shoshana Zuboff’s magnum opus, “The Age of Surveillance Capitalism”. It is a thought-provoking work and astonishing in the way she untangles the range of uses to which our data is being put every time we pick up our smartphone—or, in some cases, when we do not even turn it on. Many people are surprised to find that, far from being a phone, it is a tracking device. As she says, the question is not just who knows about us, but who decides what data is used, and who decides who decides what that data is used for. She talks about a shadow text, effectively; there is the data that we put on there and then there are all the connections that are made.

Staggeringly, huge amounts of information are being held about all of us that we do not have any access to and do not even know about. At the moment, those companies consider that it belongs to them. We have to change that, because if it is about us, it belongs to us. That is a huge challenge, because if it were to happen, it would fundamentally undermine the business model of those fabulously wealthy tech giants, which are hardly likely to give it up easily. The only way to tackle it, however, is through Governments and regulation. I hate to mention the issue of the hour, but that is one reason, of course, why those companies dislike the European Union: we need large organisations to counter the giant power that we face.

We have a fantastic opportunity, particularly with our national health service, which, as is often observed, has access to huge amounts of data that no other health system in the world has. In this country, we have the fantastic raw material and a fantastic data science industry. We have the expertise and the knowledge. We also, just about, have the good will of our citizens. We have a great opportunity, but we will need much tougher regulatory frameworks to unlock that potential in the right way. I fear that, so far, compared with what we have to do, we have merely been tinkering.

There are huge opportunities. I have raised a range of issues that go beyond the immediate ones. I hope that Parliament will find an opportunity to have those discussions in the period ahead. If I were asked whether we are in a position to meet the challenge, I would say, “Not yet.” I do not think it is impossible, but it will be difficult, so it is vital to start the discussion. I thank the hon. Member for Crawley for giving us the opportunity to do that today.