
Written Question
Artificial Intelligence: Safety
Monday 24th November 2025

Asked by: Baroness Ritchie of Downpatrick (Labour - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what discussions they are having with the technology industry to ensure artificial intelligence models are tested robustly before deployment, and to embed safeguards such as suicide prevention into model development.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The Government has ongoing partnerships with artificial intelligence developers to ensure the safety of the models they develop. It is essential that AI models are appropriately tested so that safeguards are robust, possible harms are considered and risks are mitigated, ensuring the British public are protected.

The role of the AI Security Institute (AISI) is to build an evidence base of these risks, to inform government decision making and help make AI more secure and reliable. AISI works in close collaboration with AI companies to assess model safeguards and suggest mitigations. To date, AISI has tested over 30 models from leading AI companies, including OpenAI, Google DeepMind and Anthropic. AISI’s findings lead to tangible changes to AI models before deployment, reducing the risk from day one.

Once deployed, many AI services are captured by the Online Safety Act 2023, which places robust duties on all in-scope user-to-user and search services, including those deploying generative artificial intelligence chatbots, to prevent users from encountering illegal suicide and self-harm content. These duties apply regardless of whether content is created by AI or by humans.


Written Question
Press: Regulation
Monday 24th November 2025

Asked by: Lord Bradshaw (Liberal Democrat - Life peer)

Question to the Department for Digital, Culture, Media & Sport:

To ask His Majesty's Government what assessment they have made of the YouGov report Press regulation: public attitudes and expectations, published by the Press Recognition Panel on 5 November; and whether they plan to take further action in this area.

Answered by Baroness Twycross - Baroness in Waiting (HM Household) (Whip)

The Department for Culture, Media and Sport considers a range of evidence concerning public attitudes to news media, and we have noted the publication of the report.

The UK has a self-regulatory system for the press, which is independent from Government. This is vital to ensure the public has access to accurate and trustworthy information from a range of different sources. Our aim as a Government is to ensure we strike the balance between freedom of the press and protecting the public from harm. We are carefully considering next steps to determine the best route forward to safeguard public trust in our news media.


Written Question
Prisons: Crimes of Violence
Friday 21st November 2025

Asked by: Jim Shannon (Democratic Unionist Party - Strangford)

Question to the Ministry of Justice:

To ask the Secretary of State for Justice, what assessment he has made of trends in the level of violence in women's prisons.

Answered by Jake Richards - Assistant Whip

Violence in prisons may be caused, or triggered, by a range of factors, including personal characteristics such as existing patterns of behaviour, substance misuse or traumatic life experiences. Factors particularly relevant to the women’s estate include trauma, relational complexities and separation from children.

Information on the rate of assaults in female establishments in the 12 months to June 2025 can be found at the following link: Safety in Custody Statistics, England and Wales: Deaths in Prison Custody to September 2025, Assaults and Self-harm to June 2025 - GOV.UK.

The Managing Women in Crisis Working Group in His Majesty’s Prison & Probation Service (HMPPS) was established to increase understanding of complex behaviour in this group of prisoners, and to consider how best to support them. This includes developing guidance and training for staff. In addition, HMPPS’s Women’s Estate Case Advice and Support Panel supports establishments in the management of women with complex needs. It aims to help reduce risk and to enable women to progress in their sentences.


Written Question
Prisoners: Death
Thursday 20th November 2025

Asked by: Kim Johnson (Labour - Liverpool Riverside)

Question to the Ministry of Justice:

To ask the Secretary of State for Justice, what estimate he has made of the current total amount of (a) compensation and (b) civil claim payments made to families of prisoners who have died in custody while serving a sentence of Imprisonment for Public Protection.

Answered by Jake Richards - Assistant Whip

The information requested is not held centrally. Information on payments relating to civil claims following the death in custody of prisoners is not broken down by sentence type.

It remains a priority for the Government that all those on IPP sentences receive the support they need to progress towards safe release from custody or, where they are being supervised on licence in the community, towards having their licence terminated altogether. Guidance has been provided to all prison staff and partner agencies to raise awareness of the heightened risk of self-harm and suicide amongst IPP prisoners. An IPP Safety Toolkit has also been developed, with a range of resources to promote learning and to help front-line staff support and engage those serving the IPP sentence effectively.


Written Question
Internet: Self-harm and Suicide
Wednesday 19th November 2025

Asked by: Lisa Smart (Liberal Democrat - Hazel Grove)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the effectiveness of the Online Safety Act 2023 to protect internet users from (a) suicide and (b) self-harm content on artificial intelligence platforms.

Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

Every death by suicide is a tragedy, and the government is deeply concerned about the role that online content can play in facilitating suicide and self-harm. This government is committed to keeping people safe online. For the first time, platforms now have a legal duty to ensure that they are protecting users from illegal content and, in particular, safeguarding children from harmful content. But we have gone further still. We have made encouraging self-harm and cyber-flashing, and now strangulation, priority offences. We will go further by backing Ofcom to make sure that enforcement is robust too.

Some chatbots, including those with live search and user-to-user functionality, are in scope of the Online Safety Act 2023, and we want to ensure that enforcement against them, where relevant, is robust. The Secretary of State has commissioned work so that any gaps in the legislation are examined fully and robust action is taken.


Written Question
Mental Health Services
Friday 14th November 2025

Asked by: Baroness Ritchie of Downpatrick (Labour - Life peer)

Question to the Department of Health and Social Care:

To ask His Majesty's Government what steps they are taking to ensure users of artificial intelligence platforms can safely access mental health support and are protected from harmful content such as suicide and self-harm content.

Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)

We recognise the growing use of artificial intelligence (AI) platforms and the potential risks they pose, particularly when people are seeking mental health support.

The National Health Service operates within a comprehensive regulatory framework for AI, underpinned by rigorous standards for safety and effectiveness. Publicly available AI applications that are not deployed by the NHS are not regulated as medical technologies and may offer incorrect or harmful information. Users are strongly advised to be careful when using these technologies.

Regardless of whether content is created by AI or by humans, the Online Safety Act places robust duties on all in-scope services to prevent users from encountering illegal content, including content on suicide and self-harm.


Written Question
Internet: Children and Young People
Wednesday 12th November 2025

Asked by: Gerald Jones (Labour - Merthyr Tydfil and Aberdare)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of the potential impact of the Online Safety Act 2023 on protecting children and young people from online harms.

Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

I have met Ofcom’s chief executive and made clear that keeping children safe online is my top priority.

Since taking post, I have already strengthened the Online Safety Act by making encouraging self-harm and cyber-flashing priority offences, so services must proactively remove this abhorrent content.

And today I can announce that we will amend the Crime and Policing Bill to ensure AI models cannot produce child sexual abuse material, and to address vulnerabilities where they can.

I will not hesitate to go further where evidence shows it’s needed.

Parents should be able to have confidence that children and young people are safe as they benefit from the opportunities that being online offers.


Written Question
Animal Experiments: Primates
Tuesday 11th November 2025

Asked by: Manuela Perteghella (Liberal Democrat - Stratford-on-Avon)

Question to the Home Office:

To ask the Secretary of State for the Home Department, what assessment she has made of the potential impact of importing cynomolgus monkeys born in (a) Africa and (b) Asia for use in scientific procedures on the welfare of those animals.

Answered by Dan Jarvis - Minister of State (Cabinet Office)

The Home Office is committed to maintaining the highest standards of animal welfare regarding the use of non-human primates in scientific procedures. The use of cynomolgus monkeys in the United Kingdom is strictly regulated under the Animals (Scientific Procedures) Act 1986 (ASPA).

The Home Office commissioned a comprehensive assessment from the expert Animals in Science Committee on the welfare implications associated with the use of non-human primates bred and imported for use in scientific procedures. You can find the report here: https://www.gov.uk/government/publications/nonhuman-primates-bred-for-use-in-scientific-purposes.

Following recommendations from the Committee, the Home Office has introduced a time-limited transitional period relating to the sourcing of non-human primates. During this period, the use of first-generation cynomolgus macaques will only be permitted where there is a scientific need, where there is an inability to reasonably source self-sustaining animals, where there is a robust plan to transition to a sustainable supply, and where their use will prevent culling, thereby reducing harm. You can read the Government’s response to the report here: https://www.gov.uk/government/publications/non-human-primates-bred-for-use-in-scientific-purposes-response-from-lord-hanson.


Written Question
Social Media: Harassment and Stalking
Monday 10th November 2025

Asked by: Elsie Blundell (Labour - Heywood and Middleton North)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to address the role of social media companies in enabling (a) harassment and (b) stalking through their platforms.

Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

Since its implementation in 2023, the Online Safety Act has placed legal duties on social media companies to tackle online harms, including harassment and stalking. Platforms must assess risks, swiftly remove illegal content, and implement measures to prevent abuse. They are also required to provide clear reporting tools. Ofcom, the independent regulator, is responsible for ensuring services comply with their safety duties. The Act also introduced new communications offences, including cyber-flashing and threatening communications, strengthening protections against online harassment and stalking. To strengthen the Act further, the Secretary of State is taking steps to make cyber-flashing, and assisting and encouraging self-harm, priority offences, in addition to stalking and harassment, which are already priority offences.


Written Question
Mental Health Services: Prisons and Young Offender Institutions
Thursday 6th November 2025

Asked by: Rebecca Paul (Conservative - Reigate)

Question to the Department of Health and Social Care:

To ask the Secretary of State for Health and Social Care, with reference to the Independent Monitoring Board's report entitled Annual report of the Independent Monitoring Board at HMP/YOI Downview, published on 3 September 2025, what steps he is taking to ensure acutely mentally unwell prisoners are swiftly (a) identified and (b) given care in an appropriate facility at (a) HMP/YOI Downview, (b) other prisons and (c) other young offenders institutions.

Answered by Zubir Ahmed - Parliamentary Under-Secretary (Department of Health and Social Care)

NHS England commissions prison health care services for HMP/YOI Downview and every other prison and young offender institution in England. Every prison has onsite health care services, including primary care, mental health, dentistry, and substance misuse teams.

The National Service Specification for integrated mental health sets out how patients within secure settings who require support for their mental wellbeing should receive the same level of healthcare as people in the community, both in the range of interventions available to meet their needs and in the quality and standards of those interventions.

This includes access to crisis intervention and crisis prevention for those at high risk of self-harm and suicide, where such behaviours relate to poor emotional wellbeing and/or minor psychiatric morbidity.

Access to mental health provision is available to every person in prison at any stage of their sentence, beginning at the point of entry. NHS England commissions first-night reception screening, in which a registered nurse or practitioner reviews each patient’s medical history to address any immediate health needs and risks and to ensure medication is made available as soon as possible. The screening also ensures onward referrals are made to onsite healthcare teams, including mental health services, with urgent face-to-face appointments within 24 hours and routine face-to-face appointments within five working days.

Outside of reception screening, people in prison can be referred, or can self-refer, to mental health services within those timeframes.

When someone is acutely unwell, they can be transferred from prisons and other places of detention to hospital for treatment, under the Mental Health Act, within the target transfer period of 28 days. The Mental Health Bill, currently going through Parliament, introduces a statutory 28-day time limit within which agencies must seek to ensure individuals who meet the criteria for detention under the act are transferred to hospital for treatment. NHS England’s South East Health and Justice team is funding a transfer and remissions co-ordinator from January 2025, to improve, where possible, safe, effective, and efficient transfers to hospital level treatment and interventions.

NHS England is reviewing the National Integrated Prison Service Specification to ensure it continues to meet the needs of the prison population.