Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government what assessment they have made of the number of young girls who are currently in danger of online grooming; and what procedures are in place to support their wellbeing.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
In the year ending September 2025 there were 7,527 recorded offences of sexual grooming (which includes sexual communication with a child). In the same period there were 1,000 defendants prosecuted and 1,085 convicted for sexual grooming offences. Girls are more likely to be affected by sexual offending than boys. However, the majority of child sexual abuse (CSA) remains hidden and under-identified. The Centre of Expertise on Child Sexual Abuse (CSA Centre) estimates that 15% of girls experience some form of sexual abuse before age 16, compared with 5% of boys (year ending March 2020). The Home Office funds the CSA Centre to drive system-wide improvements in professionals’ ability to identify and respond to child sexual abuse.
The Home Office also equips UK law enforcement with the capabilities required to identify and tackle more child sex offenders, including those who groom children online. The Home Office funds a network of Undercover Online (UCOL) officers based in Regional Organised Crime Units. This network uses specially trained teams and infrastructure to target those who seek to groom children for sexual purposes.
The Home Office also provides funding to voluntary sector organisations to support victims and survivors of CSA through the Support for Victims and Survivors of Child Sexual Abuse fund. In 2025, as part of our response to recommendation 16 of the Independent Inquiry into Child Sexual Abuse, the Government set out ambitious proposals to strengthen therapeutic support for victims, announcing it will provide up to £50 million in new funding to expand the Child House (Barnahus) model to every NHS region in England. This internationally recognised model, rightly viewed as the gold standard for supporting children who have experienced sexual abuse, will ensure that wherever a child lives, they can access the specialist, trauma-informed care they need to begin recovering and rebuilding their lives.
The Online Safety Act is also designed to drive down online grooming. This landmark piece of legislation protects citizens, especially children, from abuse and harm online, such as grooming. There are over 40 specific measures in Ofcom’s Codes of Practice which will protect children from the risk of online grooming. The Government is committed to supporting Ofcom’s effective implementation of the Act.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government what measures are in place to ensure that young girls that are subject to online grooming in the UK are supported indiscriminately, regardless of their faith and race.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The Ministry of Justice is investing £550 million in victim support services over the next three years, including funding to Police and Crime Commissioner areas to commission victim support services locally, as well as funding to over 60 specialist support organisations through the Rape and Sexual Abuse Support Fund. These organisations provide support for victims and survivors of sexual abuse, including recent and non-recent victims of child sexual abuse, to cope and move forward with their lives. The Home Office’s Support for Victims and Survivors of Child Sexual Abuse Fund also supports seven voluntary and community sector organisations to have national reach and to support victims and survivors of child sexual abuse with a range of one-to-one, peer and survivor-led support groups. These services support all victims and survivors irrespective of personal characteristics such as faith and race.
In 2025, as part of our response to recommendation 16 of the Independent Inquiry into Child Sexual Abuse, the Government set out ambitious proposals to strengthen therapeutic support for victims, announcing it will provide up to £50 million in new funding to expand the Child House (Barnahus) model to every NHS region in England. This internationally recognised model, rightly viewed as the gold standard for supporting children who have experienced sexual abuse, will ensure that wherever a child lives, they can access the specialist, trauma-informed care they need to begin recovering and rebuilding their lives.
Children’s Independent Sexual Violence Advisers also provide practical and emotional support to children and young people aged 4 to 17 who have experienced rape, sexual abuse or sexual exploitation at any time in their lives. They liaise between the police, courts and other agencies, acting as an advocate for the survivor.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government what steps they are taking to prevent cases of online grooming by terrorists by educating people about the consequential danger to their wellbeing and the potential deprivation of their citizenship.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The Government takes the threat from online grooming by terrorist individuals and organisations seriously. Terrorist activity online and illegal radicalising content should have no place on the internet. However, the borderless nature of the internet means that the threat remains persistent.
The Home Office works to influence industry partners to increase action to tackle online content used to radicalise, recruit and incite terrorism by providing threat assessment, insight and support.
We also work with international partners to collaborate on tackling online radicalisation, aligning approaches where possible and responding to emerging threats.
Under the Online Safety Act, tech companies are accountable to Ofcom, the independent online safety regulator, to keep their users safe, and they need to have in place systems and processes to remove and limit the spread of illegal content, including terrorist material.
Through our Prevent programme, partners also deliver a range of activity, including face-to-face workshops, online sessions, conference sessions and school assemblies, around building resilience to extremist and terrorist narratives, online safety and the impact of terrorism.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government what assessment they have made of the role of online grooming in the case of Shamima Begum.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Shamima Begum was deprived of her British citizenship, a decision that has been upheld by the UK courts and which the Government supports.
We do not comment on individual cases, operational intelligence or security matters and it would be inappropriate to comment on the specifics of Ms Begum’s case whilst legal proceedings are ongoing.
Depriving an individual of British citizenship keeps the very worst, high-harm offenders out of the UK. Each case is assessed individually on the basis of all available evidence and always comes with a right of appeal.
The Government’s top priority remains maintaining our national security and keeping the public safe.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what assessment they have made of any bias and inconsistency of application in the use of facial recognition assessments and algorithms for Black and Asian men and women.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The algorithm used for retrospective facial recognition searches on the Police National Database (PND) has been independently tested by the National Physical Laboratory (NPL), which found that in a limited set of circumstances it was more likely to incorrectly include some demographic groups in its search results. At the settings used by police, the NPL also found that if a correct match was in the database, the algorithm found it in 99% of searches.
We take these findings very seriously. A new algorithm has been procured and independently tested, which can be used at settings with no statistically significant bias. It is due to be operationally tested in the coming months and will be subject to evaluation.
Manual safeguards embedded in police training, operational practice and guidance have always required trained users and investigating officers to visually assess all potential matches. Training and guidance have been re-issued and promoted to remind them of these long-standing manual safeguards. The National Police Chiefs’ Council has also updated and published data protection and equality impact assessments.
Given the importance of this issue, the Home Secretary has asked HMICFRS, supported by the Forensic Science Regulator, to inspect police and relevant law enforcement agencies’ use of retrospective facial recognition, with work expected to begin before the end of March.
It is important to note that no decisions are made by the algorithm or solely on the basis of a possible match: matches are intelligence, which must be corroborated with other information, as with any other police investigation.
For live facial recognition, NPL testing found a 1 in 6,000 false alert rate on a watchlist containing 10,000 images. In practice, police have reported that the false alert rate has been far better than this. The NPL also found no statistically significant performance differences by gender, age or ethnicity at the settings used by the police.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they are taking to correct and define new large language models for facial recognition to ensure errors and potential racial bias are removed.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for accuracy and bias. Independent testing is important because it helps determine the setting in which an algorithm can safely and fairly be used.
Where potential bias or performance issues are identified, the Home Office works with policing partners to ensure their guidance, practices, and oversight processes minimise any risks arising from use of the technology.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they will take to ensure that data and information collected as a result of the increased use of facial recognition (1) remains in British jurisdiction, (2) is managed by the government, and (3) is not transferred to any third party entities or nations.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Custody images used for retrospective facial recognition searches are stored on the Police National Database. The data is held at a secure location in the UK.
Police use of facial recognition is governed by data protection legislation, which requires that any processing of biometric data is lawful, fair, proportionate and subject to appropriate safeguards.
Police forces act as the data controllers for facial recognition use and must manage data, including any international transfers, in line with data protection law and established policing standards.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government what assessment they have made of the potential provision of content regarding tackling violence against women and girls through (1) immersive, and (2) other, electronic media.
Answered by Lord Sharpe of Epsom - Shadow Minister (Business and Trade)
The Home Office has worked to identify the most impactful and cost-effective channels to provide content under its tackling violence against women and girls campaign, Enough. This has included a variety of digital channels, including social media advertising, video-on-demand, digital audio and search engine optimisation.
Immersive forms of electronic media were considered as part of the campaign’s PR activity, but not pursued, following advice from sector experts who felt this type of activity could carry an increased risk of triggering trauma among victims of abuse.
We will continue to ensure that any future campaign activity explores and utilises innovative ways of reaching our audiences and delivering this vital message.