Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the role of digital technologies in supporting treatment and recovery services for people experiencing drug and alcohol addiction.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
The Government is continuing to invest in improvements to local alcohol and drug treatment services to ensure those in need can access high quality help and support. From 2026, all drug and alcohol treatment and recovery funding will be channelled through the Public Health Grant, with over £13.45 billion allocated across three years, including £3.4 billion ringfenced for drug and alcohol treatment and recovery.
Local authorities are responsible for assessing local needs for alcohol and drug prevention and treatment in their area, and commissioning services to meet these needs. The Government works with local treatment systems to provide a number of digital products including guidance, subject-matter expertise and data tools to help them deliver their service.
Digital products are derived from the National Drug Treatment Monitoring System and other related health datasets, and are made available via a dedicated website so that local treatment systems can monitor treatment access and better manage outcomes.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the impact of proposals to indefinitely recognise CE-marked medical devices on the availability of medical technologies in the UK.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
Approximately 90% of medical devices currently on the British market are CE marked and their continued supply to the National Health Service and wider health system is vital for patient access to essential products. The Medicines and Healthcare products Regulatory Agency recognises CE marked products until 2028 or 2030, depending on risk classification and the European Union legislation they comply with. The proposals are intended to allow continued access to medical devices that have been assessed as safe and effective in the EU while aligning with international best practice. As Northern Ireland follows EU medical devices regulations, continued recognition of CE marked medical devices in Great Britain would further support the functioning of the UK Internal Market, as manufacturers could continue to place the same product on the entire United Kingdom market.
The proposals are anticipated to drive growth in the medical technology sector by reducing administrative costs and safeguarding the continued supply of medical technologies. The purpose of the proposed policy is to enable indefinite market access for CE marked medical devices on the British market. The impact on safety, availability, and favourability may vary depending on whether all devices are recognised, or only those in the same or a lower risk class in Great Britain. An assessment of each proposal against the availability of medical devices can be found in Annex C of the published consultation document, which is available on the GOV.UK website.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the role of AI tools in supporting radiologists and improving diagnostic capacity in the NHS breast screening programme.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
The Department is actively testing artificial intelligence (AI) in areas with significant impact on health and the economy. AI tools have demonstrated clear potential in aiding radiologists and enhancing diagnostic capacity within the National Health Service, especially in breast screening.
While no formal assessment has yet been completed, emerging evidence is already highlighting the benefits that AI can provide to the NHS. Previously, two radiologists were required to review each scan, but now an AI assistant can perform a preliminary check, which is then verified by a qualified radiologist. This approach reduces the number of radiologists needed to review each scan, but it does not result in fewer radiologists employed by the NHS. Instead, it enables clinicians to work more efficiently and to review a greater volume of scans, thereby improving diagnostic capacity and ensuring more patients are seen promptly.
Furthermore, on 4 February 2025, the Department announced that nearly 700,000 women nationwide will participate in the world-leading Early Detection using Information Technology in Health (EDITH) trial. This initiative aims to test advanced AI tools to detect breast cancer cases earlier and is supported by £11 million of Government funding through the National Institute for Health and Care Research.
The Department is pursuing significant initiatives to evaluate and expand the use of AI in NHS breast screening. Early evidence points to improved efficiency and diagnostic capacity, and the EDITH trial will further examine the potential of AI in delivering earlier detection of breast cancer for patients across the country.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the use of AI and assistive technologies in adult social care and elderly support services, in particular its impact on improving independence and quality of life for older people.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
Artificial intelligence (AI) and assistive technologies can support people to live high-quality, independent lives for longer. Such technologies are already being used across adult social care by care providers and local authorities to enable more preventative and personalised care, save staff time, and improve care coordination.
To help assess the use of technologies in adult social care, the Government has funded testing and evaluation of technologies in social care, including AI-enabled technologies, through the Adult Social Care Technology Fund. Emerging evidence indicates positive outcomes for people in receipt of care, care professionals, and the wider health and social care system. People using technology experienced greater independence, safety, wellbeing, and quality of life. We will publish the findings from these projects.
The Government is committed to supporting safe and appropriate adoption of technologies in social care. We are setting new national standards for care technologies and producing trusted guidance, so that people can confidently buy and use technologies which support them or the people they care for. To support appropriate use of AI in adult social care, we have published guidance for care providers on AI use cases and tips for safe and responsible use. We will be setting out the Government’s strategic approach to AI in adult social care, alongside its approach to AI in health, through the National AI Roadmap. We have also launched the Adult Social Care Assessments Improvement Toolkit to help local authorities find digital and AI-enabled tools to improve services and the quality of care delivery.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the data protection and confidentiality risks of the deployment of generative AI workplace tools in public sector bodies; and what guidance they have issued regarding the use of those tools in environments handling sensitive or personal data, including NHS organisations.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
The Government recognises that the deployment of generative artificial intelligence (AI) workplace tools across the public sector presents data protection, confidentiality, and security risks, particularly where these tools may process sensitive or personal data.
The Government has assessed these risks which are addressed within the Artificial Intelligence Playbook for the UK Government and the Generative AI Framework for UK Government, both published in February 2025.
The AI playbook makes clear that public sector organisations must comply with UK data protection law when using generative AI, including the UK General Data Protection Regulation and the Data Protection Act 2018. It emphasises the need for data protection impact assessments, clear accountability, human oversight, and restrictions on the use of generative AI tools in environments handling sensitive or personal data unless appropriate safeguards are in place. The generative AI framework provides detailed guidance on privacy, security, and information governance, including data minimisation, purpose limitation, and preventing the disclosure of personal or confidential information through prompts or outputs.
Specific guidance has also been issued for health and care settings. NHS England has published information governance guidance on the use of AI, which has been reviewed by the Health and Care Information Governance Working Group, including the Information Commissioner's Office and National Data Guardian. This guidance addresses confidentiality, lawful processing, consent, and human oversight, and applies to NHS organisations considering or deploying AI technologies, including generative AI tools. NHS bodies are expected to operate within established information governance frameworks and, where appropriate, adopt local AI governance and acceptable use policies consistent with national guidance.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the use of artificial intelligence technologies by hospice and palliative care providers; and what safeguards are in place to ensure that those technologies maintain patient safety, data protection and equitable access to high-quality end of life care.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
No formal assessment has been made of the use of artificial intelligence (AI) technologies by hospices and other palliative care providers. The majority of hospices are independent charitable organisations and so are free to make their own decisions regarding the adoption and deployment of AI tools.
NHS England is dedicated to enabling the safe deployment and adoption of AI technologies, providing clear guidance on approval, implementation, information governance, security, privacy, and controls. NHS England provides guidance on how technologies should be selected, deployed, and scaled to ensure they are safe, accurate, effective, and eligible for National Health Service adoption. NHS trusts are expected to ensure that access to the AI tools they employ is safe, ethical, effective, and equitable for all within their remit.
Strict safeguards are in place across the NHS to guarantee patient safety and data protection. All NHS organisations, including NHS palliative care and end-of-life care services, are expected to comply with the Medical Devices Regulations (SI 2002 No 618, as amended) (UK MDR 2002) and digital clinical safety standards.
Providers handling patient data must comply with UK General Data Protection Regulation and the Data Protection Act 2018. Each health organisation is required to appoint a Caldicott Guardian, whose role is to advise on the protection and proper use of health and care data, including where AI is involved.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the use of AI tools by NHS hospitals to support clinical documentation, including real-time note-taking systems; and what safeguards are in place to ensure that those tools maintain accuracy, patient safety and data protection.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
National Health Service hospitals are increasingly using artificial intelligence (AI) tools, such as real-time note-taking systems, to support clinical documentation. Among these, Ambient Voice Technologies (AVT) hold transformative potential to improve both patient care and operational efficiency. These tools have been shown to halve the time clinicians spend on paperwork, giving them more time for other important tasks, such as interacting with their patients.
NHS England is dedicated to enabling the safe deployment and adoption of such technologies, providing clear guidance on approval, implementation, information governance, security, privacy, and controls. National standards and additional guidance will explain how AVT solutions should be selected, deployed, and scaled to ensure they are safe, accurate, effective, and eligible for NHS adoption.
Strict safeguards are in place across the NHS to guarantee patient safety and data protection. All NHS organisations must comply with the Medical Devices Regulations (SI 2002 No 618, as amended) (UK MDR 2002) and digital clinical safety standards. Providers handling patient data must comply with the UK General Data Protection Regulation and the Data Protection Act 2018. Each health organisation is required to appoint a Caldicott Guardian, whose role is to advise on the protection and proper use of health and care data, including where AI is involved.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the quality of surgical outcome data collected by NHS trusts; and what steps they are taking to support NHS trusts to use that data to improve patient safety.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
The National Clinical Audit and Patient Outcomes Programme (NCAPOP) is commissioned, managed, and developed by the Healthcare Quality Improvement Partnership on behalf of NHS England, the Welsh Government, and other devolved administrations.
The programme currently consists of over 30 national clinical audits, registries, and databases as well as five clinical outcome review programmes.
The audit and registry topics include, for example, the national vascular registry, the national emergency laparotomy audit, and multiple cancer topics, all of which monitor a variety of clinical metrics including surgical outcomes.
The role of the NCAPOP is to detect unwarranted clinical variation and to feed this back to National Health Service trusts in an agile manner. Timely feedback to trusts enables them to make quick improvements to clinical practice. The NCAPOP work programme achieves this by making trust data available in near-real-time dynamic dashboards. The NCAPOP audits also operate a statistically rigorous outlier process with the aim of detecting negative trust outcomes. Outlier information is provided to the trust concerned, NHS England, and the Care Quality Commission.
The dashboard and outlier data can be used by trusts to influence quality governance, improve patient safety and reduce patient harm, and enable tailored clinical quality improvement programmes.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the use of AI forecasting tools by NHS trusts to manage demand for and waiting times in accident and emergency; and how the use of that AI is informing wider NHS digital transformation policy.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
The 10-Year Health Plan was published on 3 July 2025 and sets out how the Government will ensure the National Health Service is fit for the future, and that artificial intelligence (AI) will play a fundamental role in this transformation. As part of the 10-Year Health Plan, the Government is supporting the use of AI-enabled appointment and scheduling tools to reduce the administrative burden on clinicians, with early trials showing an increase in productivity and clinician time saved.
An accident and emergency demand forecasting tool is now available to all NHS trusts and is already in use by 50 NHS organisations, helping them plan how many people are likely to need emergency care and treatment on any given day. While this tool does not schedule appointments specifically, it uses AI to predict emergency care demand, enabling trusts to plan staffing and resources more effectively and reduce pressure on services.
The tool forms part of a wider set of Government‑supported innovations in operational AI, which include technologies to streamline scheduling, automate administrative tasks, and enhance clinical workflows. These collectively aim to free up staff time, improve care quality, and reduce waiting times across the system.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what steps they are taking, if any, to support the use of AI-enabled appointment and scheduling tools in the NHS.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
The 10-Year Health Plan, published on 3 July 2025, sets out how the Government will ensure the National Health Service is fit for the future, with artificial intelligence (AI) playing a fundamental role in this transformation. As part of the 10-Year Health Plan, the Government is supporting the use of AI-enabled appointment and scheduling tools to reduce the administrative burden on clinicians, with early trials showing an increase in productivity and clinician time saved.
An accident and emergency demand forecasting tool is now available to all NHS trusts and is already in use by 50 NHS organisations, helping them plan how many people are likely to need emergency care and treatment on any given day. While this tool does not schedule appointments specifically, it uses AI to predict emergency care demand, enabling trusts to plan staffing and resources more effectively and reduce pressure on services.
The NHS continues to fund both pilots and scaling of different software products that enable the use of AI in scheduling and managing secondary care appointments. Typically, these products can predict which appointments are likely not to be attended, reschedule appointments at short notice, and improve utilisation of clinician time.
Work has begun to deliver the NHS’s Medium Term Planning Framework commitment that, from April 2026, the NHS will begin to move to a unified access model, using AI-assisted triage. This model should effectively guide patients to self-care or to the appropriate care setting, through a single user interface delivered via the NHS App but with an integrated telephony and in-person offering.
Further to this, features set to be developed through the NHS App will include the ability to book and manage remote or face-to-face appointments, receive personalised health advice, see whether vaccinations are up to date and book appointments for any that are due, and find travel vaccine information.
Additionally, DrDoctor, an AI tool, held a three-year contract from 2021 to 2024 under the NHS AI Lab Award. It supports hospitals by providing AI guidance on overbooking as a more efficient and economical way to increase NHS appointment capacity. This has been shown to free up clinician and administrative time, improve patient care and experience, and predict which patients are at the highest risk of missing an appointment through its "Did Not Attend" (DNA) prediction feature.