Terminally Ill Adults (End of Life) Bill Debate
Baroness Berridge (Conservative - Life peer)
Lords Chamber
Baroness Gerada (CB)
My Lords, under this amendment as it stands, we would have patients who could not have computerised records, because we have AI sitting behind every computer. The AI starts at the beginning. It starts with our telephone system, so, in fact, the patient would not even be able to use the telephone to access us; they or a relative would have to come in. They certainly would not be allowed to have computerised records, because of the digital and AI systems that we have in order to pick out diseases and to make sure that we are safely practising.
They also would not be able to have electronic prescribing, in many ways, because the pharmacy end also uses AI to make sure that patients are not being overmedicated and to check for drug interactions, et cetera; and, if they are using a computer system, AI is also used to digitally scribe consultations. So I understand the essence of this amendment, which, as many have said, is not to allow AI to make decisions about somebody at the end of their life, but, as it stands, I have to warn noble Lords that it is unworkable in clinical practice.
Baroness Berridge (Con)
My Lords, I am grateful to my noble friend for laying such a broad amendment, and obviously I agree with much of what the right reverend Prelate said. It is interesting that this is coming straight after the debate on face-to-face conversations. We are all used to ticking the “I am not a robot” box, but AI now has the ability to create persons, and it is often very difficult, if you are not face to face, to judge whether the person on screen is actually a person. I cannot believe we have got there quite so quickly.
However, it is also important to consider public confidence and understanding at the moment. This is, as we keep saying, such an important life-or-death decision. There is a lack of understanding, and people are potentially worried about these implications, often with regard to employment but also for other purposes. For instance, as I was preparing this, it made me reflect, as the noble Baroness, Lady Gerada, said, on how your GP uses AI. When Patchs told me recently that the NHS guidance was that I should not take an over-the-counter drug for more than two weeks, I queried it.
However, only yesterday, I thought: was that answer actually from my GP, or was it from an AI tool sitting behind the system? We really need to be careful about the level of public understanding and awareness of its use. This use of AI is also one step on from, and connected to, Clause 42, which relates to advertising. I am grateful that the noble and learned Lord is going to bring forward some amendments on that clause. I hope that the connection with AI, as well as with the Online Safety Act 2023, has been considered. Given that the noble and learned Lord, when he gave evidence on 22 October last year, accepted that there was additional work to be done on Clause 42, I am disappointed that we have had no assurance that those amendments will be with us by the end of Committee.
I said at Second Reading that the Bill is currently drafted for an analogue age. I do not want to take us back to some kind of quill-pen, no-use-of-AI situation. Obviously, as other noble Lords have said, the Bill does not deal with pressure or coercion that does not come from a human being. It also does not consider that coercion can now be more hidden with the use of AI. The Bill does not deal with people being able to learn to answer certain tools by watching YouTube. Therefore, we could be in a situation where someone who would not qualify under a face-to-face, non-AI system could learn those answers and qualify.
There are also good studies showing that its use in GP practices has produced some inaccuracies. In many circumstances, there is a lack of transparency and accountability in tracing where a decision has come from. We do not even understand the algorithms that send us advertisements for different shops, let alone how they could be connected to a decision such as this.
Finally, my biggest concern is that there will be a limited number of practitioners who will want to participate in this process. That has been accepted on numerous occasions in your Lordships’ House. I will quote from a public letter written on 12 June last year, to which all of Plymouth’s senior palliative medicine doctors were signatories, warning us of the risks of the Bill and saying that the
“changes would significantly worsen the delivery of our current health services in Plymouth through the complexity of the conversations required when patients ask us about the option of assistance to die”.
That is relevant for two reasons. First, if we have a shortage of practitioners in parts of the country such as the south-west, because those doctors’ opposition to the Bill translates into their not being involved, there may therefore be an increased temptation to resort to more use of AI. I hope that the noble and learned Lord or the Minister can help on this point.
Many of these systems—I am speaking as a layperson here—rely on data groups and information within the system: the learning is created from that. If you have a very small pool of practitioners and some form of AI being used, does that not affect the creation of the AI tool itself? I hope that I have explained that correctly. With such a small group doing it, will that not affect the technology itself?
I come to this amendment with a good deal of suspicion. I am always worried when the House of Lords decides that it is getting worried about some new thing that is coming along, so we had better do something about it. The noble Baroness, Lady Coffey, explained that this was a broad amendment in order that we should concentrate on the important bit. I recommend that those in the House who were not here for last night’s debate on super-clever AI read it, because it explains why we should be concerned about this. If it will not embarrass him, I shall say that I hope the House will read with care the speech by the right reverend Prelate the Bishop of Hereford, which brought his scientific knowledge and moral concern together in a most interesting and perceptive way. If his quoting Saint Thomas Aquinas interests people, there is a remarkable book called Why Aquinas Matters Now, which is well worth reading in the context of this particular Bill.
I am grateful to the noble Baroness, Lady Coffey, for raising artificial intelligence. There was, broadly, a consensus around the Committee, which the noble Baroness supported, that the amendment is much too blunt, but as she said, fairly, it gives us an opportunity to talk about AI. I will also pick up the right reverend Prelate the Bishop of Hereford’s contribution; he rightly said, as has been echoed around the Committee, that there have been huge benefits for patients from AI.
I think four concerns were raised during the debate. The first was: will AI affect decision-making? I think the underlying point there is that we do not want machines to make the decisions that are referred to in the Bill; we want human beings to make them. In particular, the decisions I have in mind are the decision of the first doctor, the decision of the second doctor, the decision of the panel, and the decision of the doctor, at the point that the assistance is being given, that the conditions are still satisfied. Everybody around the Chamber wants that to be decided by a doctor or a panel, depending on which it is, and I completely and unreservedly endorse and accept that.
Does that need to be made even clearer in the Bill? I will consider it, but I do not think that it does. The acid test for me is that if you fail to comply with your obligations as a doctor or as a panel, you can go to prison for up to five years. It is very difficult to imagine how you could put a machine in prison, so it is pretty clear that these decisions must be made by a human being. For my part and for everybody who supports the Bill, that must remain the position.
The second concern is advertising, which the noble Baroness, Lady Berridge, referred to. She is absolutely right. I have made it clear that I will bring forward amendments. Those amendments, which are almost finally drafted, make provision specifically in relation to digital advertising—they do not specifically refer to AI, but we need to address that in the advertising provision. I will lay those amendments so that the House can consider them.
The third concern is slightly generalised, which is that AI is very persuasive, particularly in persuading people to do things that they do not necessarily want to do. The first thing on that is that there is a wider societal requirement to address the pervasive impacts of AI in a whole range of things. We should all try to contribute to that. More focused on this is the question of the safeguards in the Bill, because they then become incredibly important. In particular, the safeguards require that there is doctor-to-patient discussion in relation to the decision for that patient, and they are specifically required in the preliminary conversation, the first conversation and the second conversation. It is those safeguards that one must see as the antidote to the persuasive aspect of AI, but I completely accept what people said on that.
The fourth issue, which was touched on very briefly, was the operation of devices. That, I think, referred to the fact that quite a number of medical devices can be operated by, for example, the blink of an eye or something quite minor. Again, that needs to be properly safeguarded. Those may not necessarily be AI problems but problems with other sorts of developments in technology.
I thank the noble Baroness, Lady Coffey, for raising this. We need to consider all the points she made. At the moment, apart from the advertising amendment, which I will bring forward, I am not sure that it requires amendment to the Bill.
Is there a guarantee that we will see those amendments in Committee rather than on Report? That is important, because there is a very different procedure in Committee, in which we can go back and forth and query amendments.