Terminally Ill Adults (End of Life) Bill Debate

Department: Department of Health and Social Care

Terminally Ill Adults (End of Life) Bill

Baroness O'Loan Excerpts
Friday 30th January 2026

Lords Chamber
Baroness Whitaker (Lab)

My Lords, that provision is in the Bill, if the noble Baroness would just look. I am afraid that I cannot put my finger on the actual clause, but the assessing doctor is required to provide interpreters where necessary.

Baroness O'Loan (CB)

My Lords, I have added my name to Amendment 65 in this group, but will first respond briefly to what the noble Baroness, Lady Gerada, said about remote consultations. This is the core of what we are discussing today. It is not just the doctor who needs to be able to see and understand. The patient needs to be able to see, understand and interact with the doctor.

During Covid, my brother tried for six months to see his doctor. There were regular telephone calls. On each occasion, he was told that his symptoms were resulting from cardiac problems and other problems that he had had, and that all he needed to do was take painkillers. When he finally presented to A&E six months later, he had stage 4 lung cancer and bone cancer. Remote consultations do not always protect. Because this is a matter of life and death, because this is a situation in which someone is seeking death, we need to be very sure of what we are doing.

--- Later in debate ---
Lord Falconer of Thoroton (Lab)

I would not look a gift horse in the mouth if I were the noble Lord. First, I do not regard accepting amendments in Committee of the House of Lords as indicating that a Bill is fundamentally flawed; I regard it as listening and making appropriate changes. Secondly, in relation to the question of the Parliament Act, the last thing I want is for this to happen through the Parliament Act. I want this House to do the job that it is supposed to do, which is to scrutinise the Bill and then send it back to the Commons.

In the debate on 8 January 2026, I gave in detail the reasons why I thought we were not going about it properly, and I thought that the House agreed with me when it said that we needed to move more quickly. The problem is not that everybody does not have good ideas; the problem is that it is taking not just far too long but disproportionately long. My experience of the Lords is that we can do this, and we can do it much more quickly than we are doing it while still maintaining quality. That was the point I made on 8 January and that I understood the House to have accepted.

Baroness O'Loan (CB)

My Lords, if I remember it correctly, the Motion that the noble and learned Lord put before the House on 8 January was a Motion that more time was required, not that the House needed to race through the Bill and proceed more quickly. The Motion did not say that we needed to move more quickly; it said that we needed more time. That is what the House agreed to.

Lord Falconer of Thoroton (Lab)

The Motion said that more time should be given so that we could complete the stages of the Bill so that it would have sufficient time to get back to the Commons. I was particularly impressed during that debate by what my friend, the noble and learned Baroness, Lady Butler-Sloss, said—that we have to “get through” this. We are not getting through it.

Baroness O'Loan (CB)

My Lords, the noble and learned Lord has made the point that I was making. We were saying—we agreed and did not vote against him—that more time was required. But the discussion earlier this morning was about the fact that the House must not rush this, because we are talking about life and death, and in particular the life and death of very vulnerable and marginalised people who are living in poverty and all those things. These things require the kind of consideration which has occurred this morning, as reflected in the noble and learned Lord's response.

--- Later in debate ---
Baroness Coffey (Con)

My Lords, I am conscious that I might be accused of preferring quill and pen to the latest technology in Amendment 66. In recognising how artificial intelligence is emerging, I thought I would put down a blunt amendment to allow us at least to have a debate. Inevitably, in a variety of legal and health situations, we will start to see artificial intelligence being used routinely. There was a recent legal case in which it turned out that a judge had relied entirely on AI and gave a completely inaccurate ruling as a result. This is not simply about what would be considered by medical practitioners.

I worry about judgment. We have already heard, reasonably, that trying to predict when somebody will pass away due to a terminal illness involves a bit of science but is largely an art. Perhaps I am being ungenerous in that regard. Certainly, in the DWP, we moved accelerated access to benefits from a six-month consideration to 12 months, simply because the NHS does not routinely require its practitioners to assess a six-month prognosis; it is much more accurate at assessing a 12-month one. It is interesting that this Bill is focused on six months when, routinely, the NHS does not use that period. However, I am straying slightly from the point about artificial intelligence.

I was somewhat interested in the previous debate, because there seemed to be a majority—I will not say a consensus—who felt that face to face was an important part of this happening in practice. But there are still a significant number of people who seem happy that we use a variety of technology for some of the interactions.

Forgive me for fast forwarding, but I see this whole issue becoming pretty routine. What I want to avoid is outsourcing. It strikes me how much people rely on Wikipedia and think that they are actually dealing with the Encyclopaedia Britannica, even though a lot of what is on Wikipedia is a complete load of garbage. What is even more worrying is that many of the AI mechanisms use sources such as Wikipedia, or simply put two and two together and come up with 22. I saw this, not that long ago, when I was trying to find something from when I had been on the Treasury Committee and interrogated the FCA about something. The first thing that came out of ChatGPT was that, somehow, I had become a non-executive director of the FCA—if only. That certainly was not the case. I am concerned that an overreliance on AI might start to happen in this regard.

I want to avoid a world of chatbots that removes the human element. That is why I keep coming back to the themes of face to face, being in this country and this having a personal element. I am conscious that the NHS and other practitioners, including legal practitioners, will continue to evolve—I am not stuck in some dinosaur age—but I feel that the concerns held by those of us worried about the Bill will continue. We completely understand why people might want to do this, but we want to make sure that the safeguards, particularly around coercion, are as robust as possible. That is why I have raised for debate the consideration of whether, as a matter of principle, artificial intelligence should not be used in the deployment of the future Act.

As I said, there may be evolution in medicine; we see that that is already happening. I do not know to what extent the Government have confidence in the use of AI in the prognosis of life expectancy. A new development in government is that AI is now starting to handle consultations. That might get tested in court at some point, to see whether it is a meaningful way to handle consultations—it is certainly a cost-efficient way to do so. My point is that, according to the Wednesbury rule, there is supposed to be proper consultation, not just a tick-box exercise.

I will not dwell on this, but I would be very interested to hear, from not only the sponsor but the Government, their consideration of artificial intelligence in relation to the practicality and operability of the Bill if it were to become law. I beg to move.

Baroness O'Loan (CB)

My Lords, I have put my name to Amendment 66, in the name of the noble Baroness, Lady Coffey. At present, the Bill makes no allowance for any restriction on the possibility of the use of non-human assessment and automated administration devices during the application and decision-making process for assisted death. Obviously, AI will be used for recording meetings and stuff like that—I am not a quill and paper person to that extent—but AI has already been proposed for use in killing patients in the Netherlands, where doctors are unwilling to participate.

The Data (Use and Access) Act 2025 established a new regulatory architecture for automated decision-making and data interoperability in the NHS. It provides that meaningful human involvement is maintained for significant decisions—decisions which may affect legal status, rights or health outcomes. Of course, assisted death would come within that definition.

That reflects the purpose of the NHS. We have talked about its constitution. I looked at the constitution and the guidance. It says that the purpose of the NHS is

“to improve our health and wellbeing, supporting us to keep mentally and physically well, to get better when we are ill and, when we cannot fully recover, to stay as well as we can to the end of our lives”.

I know that the noble and learned Lord, Lord Falconer, is going to put down an amendment suggesting that the constitution and guidance will have to be amended, but the current situation is that that is the purpose of the NHS. The assisted suicide of patients is certainly not provided for in the NHS, nor should AI be used in the crucial assessment and decision-making process for assisted dying, given the extreme difficulties in identifying coercion and assessing nuanced capacity, and the irreversible nature of death. What plans does the noble and learned Lord have to address these issues?

In the Commons, an amendment was passed allowing the Secretary of State to regulate devices for self-administration. The amendment was not put to a vote; in fact, only seven votes were permitted by the Speaker on the more than 80 non-Leadbeater amendments. The Commons have accepted that devices will be used for self-administration. Of course, the assisted suicide Bill requires self-administration. Nothing in the Bill prohibits a device that uses AI to verify identity or capacity at the final moment. If a machine makes the final go/no-go decision based on an eye blink or a voice command, have we not outsourced the most lethal decision-making in a person's life to technology? I have to ask: is this safe?

Public education campaigns on assisted suicide are explicitly allowed for in Clause 43. The Government have said that there will be an initial education campaign to ensure that health and social care staff are aware of the changes, and that there would likely be a need to provide information to a much wider pool of people, including all professionals who are providing or have recently provided health or social care to the person, as well as family members, friends, unpaid carers, other support organisations and charities. That controls only government activity. The other observation I would make is that I presume the public education campaign will inform families that they have no role in a person’s decision to choose assisted death, and that the first they may know of an assisted death is when they receive the phone call telling them that the person is dead. It is profoundly important that people know this.

There is nothing to prevent an AI chatbot or search algorithm helpfully informing a patient about assisted dying services and prioritising assisted dying over palliative care search results. By legalising this service, the Bill will feed the training data that makes these AIs suggest death as a solution. I would ask the noble and learned Lord, Lord Falconer, how he intends to police that situation.

There is also a risk of algorithmic bias. If prognostic AI is trained on biased datasets—we know the unreliability of the prognosis of life expectancy—it could disproportionately label certain demographics as terminal, subtly influencing the care options, including assisted dying, presented to them. The National Commission into the Regulation of AI in Healthcare established by the MHRA in 2025 is currently reviewing these risks to ensure that patient safety is at the heart of regulatory innovation. I ask the Minister: will that work cover assisted dying?

The AI Security Institute’s Frontier AI Trends Report in December 2025 highlights that:

“The persuasiveness of AI models is increasing with scale”,

and:

“Targeted post-training can increase persuasive capabilities further”.

In a healthcare context, this raises the risk of automated coercion, where the person interacting with a chatbot or an AI voice agent might be subtly persuaded towards certain end-of-life choices. The AISI has said that safeguards will not prevent all AI misuse. We have to remember that there will be financial incentives to provide assisted suicide; after all, the CEO of Marie Stopes received between £490,000 and £499,000 in 2024. There is big money involved, even though this will be charitable or NHS work. Clause 5 allows doctors to direct the person to where they can obtain information and have the preliminary discussion. At present, the source of that information could be an AI or a chatbot.

Dr Sarah Hughes, giving evidence to the Lords Select Committee, said there was a real risk of “online coercion”. With newly developed AI functions and chatbots, we already know there are cases all around the world of individuals being coerced into all sorts of different behaviours, practices and decision-making. There is also an issue of misinformation around diagnosis and prognosis. Hannah van Kolfschooten questioned who has ultimate responsibility if the technology fails. She said:

“In traditional euthanasia settings, a doctor is accountable, but in AI-driven scenarios, accountability could become ambiguous, potentially resting between manufacturers, healthcare providers, and even the patient”.

AIs also have a record of encouraging suicide. We know that, and we have seen terrible cases among young people; these systems have no regard for human life.

Evidence shows that doctors suspect elder abuse in only 5% of cases. Detecting subtle coercion requires, as was said in the previous group, professional judgment to interpret things such as non-verbal cues, body language and discomfort. AI systems are ill-equipped to handle these nuanced, non-quantifiable elements. It is imperative for trust in the system that the individual circumstances of each request for assisted death are recorded and are available for interrogation, or even potentially a criminal investigation, by the panel or another regulatory authority. The only insight into what happened in the consulting room will come from these records; the patient will be dead. The Bill as currently drafted offers an individual in these circumstances no protection against the use of AI, with its algorithmic bias. Can the noble and learned Lord, Lord Falconer, explain how he proposes to deal with these concerns?

Lord Carlile of Berriew (CB)

My Lords, I will add only a very short sentence to my noble friend’s excellent speech, and it is what AI says about AI. It says: “AI is technically capable of providing advice or information relating to suicide, but it is critically dangerous to rely on it for this purpose”. Enough said.

--- Later in debate ---
Baroness Finlay of Llandaff (CB)

I am most grateful to the noble Lord, Lord Hendy, with whom I have had conversations going back to last September. I have looked after many patients dying of mesothelioma, and it seemed that there would be a loophole if the chain of causation was not kept completely intact.

We had advice in several calls from extremely wise sources—I will not list them all—and I learned a great deal about the legal side and the Fatal Accidents Act. I have some questions for the noble and learned Lord, whom I met with yesterday. He assured me that he would be bringing forward amendments, but unfortunately, I did not see them at the time; it was only much later that they appeared in my inbox. I have not been able to go through them in detail to examine the precise wording.

The concern is that unless this is watertight, these companies will wriggle out of any type of compensation. Therefore, what is the position of coronial oversight in these cases, where perhaps even the diagnosis might be questioned by a company, and it could be difficult for a family to provide the evidence it might be demanding? One does not know. Also, what is the position regarding the life insurance policy of the individual when they have an industrial disease and there is a chain of causation? They might be eligible, one hopes, for compensation. That needs to be followed through. However, somebody could claim that in some way, the chain of causation had been broken.

Baroness O'Loan (CB)

My Lords, I express my appreciation to the noble Lords, Lord Hendy and Lord Harper, for bringing these matters to our attention. I had prepared a speech describing the awful situation of mesothelioma, et cetera. I will not talk about that but will just say a couple of other things.

This has clearly identified a huge gap in provision in this legislation: a Bill is being passed that may have consequences it does not provide for in any way. I am thinking in particular of the Fatal Accidents Act: people are dying of these industrial diseases, including military victims. I had no idea that military victims would lose compensation in that situation. I am very grateful to them for identifying such a significant gap. This is very important for members of the Armed Forces, because many of them suffer from mental illnesses as a consequence of their service, in addition to any other condition from which they may suffer. That always makes life harder for them in trying to negotiate their way through and make decisions of such a profound kind.

The noble Baroness, Lady Finlay, mentioned the problem with insurance policies and suicides. Most insurance companies will pay out after a suicide, provided that the minimum time has elapsed since taking out the policy. If someone has an assisted suicide, we do not quite know how that will affect their insurance policy; but it now appears that if the underlying cause of death—the terminal illness which led to the granting of assisted suicide—is something such as cancer, that may send the insurer straight back to find out what underlying habits were disclosed, such as the person being a smoker. It all becomes enormously complicated for the person suffering from a terminal illness who is trying to decide whether to seek an assisted death. There is no provision in this legislation for consequences for their families in situations such as this.

Baroness Grey-Thompson (CB)

My Lords, this is a really interesting group of amendments, and it has probably raised more questions for me than it has answered. When we talk about injury, I immediately think about people who have had a spinal cord injury and who have become a quadriplegic or a paraplegic.

By the very nature of my former career, I know a lot of wheelchair users who have been through various compensation cases. Luckily, these days the survival rate for someone with paraplegia or tetraplegia is very high. We also have to take that into account. I had not thought before about the impact on anyone who has been in the military. I know quite a few people who are injured through the military. Generally, the public are very supportive of the military and what they have gone through, and we would not want any unintended consequences for them.

When I was looking at conditions such as asbestosis, and others that have been debated on this group, it became very clear that in many cases these conditions present quite late and treatment is then very difficult, and many patients die before the compensation claim has gone through. We have talked before about coercion, and I know that Ms Leadbeater has said in various debates and comments that she is concerned about people being coerced not to end their life.

This is a situation where I could see this happening. If you go online and google asbestosis compensation or spinal cord injury compensation, a plethora of websites come up straightaway with calculators, so that you can have an indication of how much you could possibly gain. I had a look; it goes from a couple of thousand pounds for a back injury—which obviously would not account for this—up to £493,000 for someone with quadriplegia. The figures given as a range for asbestosis were £50,000 to £1 million. That is a life-changing amount of money for many families in this country, and it will colour the decisions they make.

It is slightly strange, because we talk about someone being a burden, but people will make a different decision because they are thinking of their children and grandchildren and protecting them for the rest of their lives. So a lot of clarity is needed to make sure that coercion does not go either way. I would be very interested in understanding what the noble and learned Lord intends to do to offer greater clarification for this group of amendments.

--- Later in debate ---
Lord Falconer of Thoroton (Lab)

The reason, from discussing and thinking about this issue, is that the Government see a review as the most convenient way of doing it, because a review can make sure that every single aspect is covered. That is the argument for the review.

Baroness O'Loan (CB)

It seems from what the noble and learned Lord just said that the Government have been discussing this issue. If they have, is it his intention to ensure that, in providing such information as the Government provide under the terms of the Bill, they warn people that if they opt for assisted suicide in certain circumstances, it is possible that they will lose compensation to which they would otherwise be entitled and that this is a matter on which advice needs to be sought? Are the Government aware of any other situations in which this may happen to people who may opt for assisted suicide?

Lord Falconer of Thoroton (Lab)

I am afraid that I am not the Government. On the issue of risk, my proposal—although I recognise that some people want to go further—sets out a sensible course to reach the aim that everyone wants to reach, which is that the problem does not arise. One will have to look at the extent to which one has to warn against that problem when one sees where the review goes, because the question of what warnings have to be given can be addressed only once the review has reported and any action has been taken on it.