Crime and Policing Bill Debate

Department: Home Office

Lord Hacking Excerpts
Thursday 27th November 2025

Lords Chamber
Lord Hanson of Flint (Lab)

My Lords, I will also speak to further amendments later. I just want to say thank you to the noble Lord, Lord Blencathra, for his kind words before he goes. My reputation is ruined, but there we go. I thank him anyway.

The government amendments in this group and the clauses to which they relate are vital in safeguarding the public from some of the gravest harms emerging from the digital age. All the amendments in this group, starting with Amendments 295A and 295B, pertain to the introduction of a defence for authorised persons to test and investigate technologies for their capability to generate child sexual abuse material, extreme pornography and non-consensual intimate imagery. These are abhorrent crimes and we must ensure that our laws keep pace with them.

Noble Lords will know that the rapid advancement and prevalence of AI technologies without adequate guardrails have increased the volume of AI-generated abuse imagery circulating online. These harms fall disproportionately on women and children. We must get ahead of these risks. At present, AI developers and public safety organisations seeking to test for these risks face significant legal jeopardy: testers could be liable to prosecution if they create illegal images during testing. We want to support government and public safety organisations in their commitment to research internet safety. If we are serious about AI safety, it is essential that we support continuous and rigorous testing, so that testers can be confident that models are safe to use, supporting our ambition to drive down CSAM online.

This defence could give a technology company the ability to understand the capabilities of its models, identify weaknesses and design out harmful outputs. Amendment 295A introduces a power to create, by regulations, new testing defences. The Secretary of State will authorise persons to carry out technology testing subject to rigorous conditions. I confirm that any regulations that are brought forward will be subject to the affirmative parliamentary procedure, and testing will be subject to rigorous oversight and strict mandatory operational safeguards. The regulation-making power will also extend to making provision for the enforcement of any breaches of conditions and may include creating criminal offences.

Amendment 295B lists the offences to which this defence applies. The Secretary of State will have the power to amend this list of offences as the law evolves. This will ensure that the defence remains fit for purpose. I hope the Committee welcomes that the Scottish Government and Northern Ireland Department of Justice want this defence to be extended to Scotland and Northern Ireland. The offences listed may be amended, as appropriate, for England and Wales as well as for Scotland and Northern Ireland. The Secretary of State will be required to consult Scottish Ministers and the Department of Justice in Northern Ireland before making any regulations that would affect the Scottish Parliament or the Northern Ireland Assembly.

Clause 63 criminalises artificial intelligence image generators, which are used by offenders to create child sexual abuse imagery. Our law is clear that AI-generated child sexual abuse material is illegal. However, the fine-tuned models that facilitate the creation of such material are not currently illegal. Therefore, the Government are making it illegal to possess, make, adapt, supply or offer to supply a child sexual abuse image generator, punishable by up to five years’ imprisonment.

Government Amendments 267 and 268 ensure that we take a unified approach across the United Kingdom. This is why we are creating equivalent offences in Scotland and Northern Ireland. Clause 64 amends Section 69 of the Serious Crime Act 2015 to criminalise the possession of advice or guidance on using artificial intelligence to create child abuse imagery. Sadly, there are so-called paedophile manuals that contain guidance for offenders on how to abuse children sexually and how to create indecent photographs or pseudo-photographs—which are illegal under the existing offence in the Serious Crime Act 2015. However, this offence does not include guidance for offenders about how to use AI to create illegal images of children and is applicable only to England, Wales and Northern Ireland. Amendment 269 extends the offence, as amended by Clause 64, to Scotland, ensuring that these vile manuals can be tackled across the whole of the United Kingdom. The other amendments in this group are consequential on the main amendments that I have described.

Together, these government amendments will enhance the protection of women and children, prevent criminal use of AI technologies and improve long-term safety by design and the resilience of future AI development. I commend the amendments to the Committee. I beg to move.

Lord Hacking (Lab)

My Lords, if I could intervene for a moment, the Bill is going at a fine pace through the House, but I am a little concerned about Amendment 263. The problems of modern slavery that I have raised in the House are very severe.

A noble Baroness

That has been debated.

Lord Hacking (Lab)

I know. I am just asking for some assistance with this—does the proposed new clause in Amendment 263 still stand?

Lord Hanson of Flint (Lab)

The Committee has considered that amendment. If the noble Lord wishes to write to me on any details, I will certainly write back to him, but, in the interests of progress, it would be better if that was dealt with outside the Chamber, given that we have debated those matters already.

--- Later in debate ---
Lord Hampton (CB)

My Lords, I briefly add my support to all these amendments, particularly the amendment of the noble Lord, Lord Nash, which is fascinating. If we can get the software to do this, then why would we not? I offer a challenge to Ofcom, the Government and tech firms. If they can produce such sophisticated software that it can persuade children to kill themselves, why are BT and eBay’s chatbots so rubbish? We have to make AI a force for good, not for evil.

Lord Hacking (Lab)

My Lords, having arrived in this House a very long time ago—53 years ago—I know this House works best if it treats legislation as an evolutionary process. The Online Safety Act seemed to be a very good Act when we passed it two years ago, but now we have further, drastic evidence, which we have heard in this debate. I am confident my noble friend the Minister will treat the speeches made in this debate as part of the evolutionary process which, I emphasise again, this House does best.

Baroness Doocey (LD)

My Lords, I thank the noble Baroness, Lady Kidron, for bringing forward these amendments and for explaining them so clearly. The understanding of the Independent Reviewer of Terrorism Legislation, Jonathan Hall, is that AI chatbots do not trigger the illegal content duties since these tools are not considered to show mental intent. As a result, chatbots can generate prompts that are not classified as illegal, even though the exact same content would be illegal and subject to regulation if produced by a human. I find that quite extraordinary.

By accepting these amendments, the Government would be acting decisively to address a fast-evolving threat which this year saw child sexual abuse material rise by 380%. In April 2024, the Internet Watch Foundation reported that a manual circulating on the dark web, which the Minister referred to earlier, instructed paedophiles to use AI to create nude images of children, then use these to extort or coerce money or extreme material from their young victims. The charity warned that AI was generating astoundingly realistic abusive content.

Text-to-image generative AI tools and AI companion apps have proliferated, enabling abusers to create AI chatbot companions specifically to enable realistic and abusive roleplay with child avatars. Not only do they normalise child sexual abuse, but evidence shows that those who abuse virtual children are much more likely to go on to abuse real ones. Real children are also increasingly subjected to virtual rape and sexual abuse online. It is wrong to dismiss this as less traumatic simply because it happens in a digital space.

The measures in the Bill are welcome but, given the speed at which technology is moving, how easy or otherwise will it be to future-proof it in order to keep pace with technology once the Bill is enacted?