Crime and Policing Bill Debate

Department: Ministry of Justice
Baroness Bertin (Con)

I will make a very small intervention because people have spoken so eloquently before me. I support the amendment 100%, and I am surprised that the Front Benches are not taking a different view. For crying out loud, I am not easily shocked, but the briefing we attended this afternoon, which we have all spoken about, shocked me. We are so behind the curve on this and we have to get ahead of it, so I support the amendment.

The Earl of Erroll (CB)

My Lords, I can see what the noble Lord, Lord Stevenson, is saying about Third Reading, but it would be wiser to vote for this amendment now—if noble Lords have any conscience at all, they have to vote for it—and if it is slightly defective it can be amended at Third Reading. If we do not do it now, there is a huge risk of it not coming back.

Lord Clement-Jones (LD)

My Lords, from these Benches, I strongly support Amendment 209, which was so convincingly spoken to by the noble Baroness, Lady Kidron. I was very pleased to have signed it, alongside the noble Lord, Lord Russell of Liverpool, and the noble Baroness, Lady Morgan of Cotes.

This amendment is a vital safeguard against the “innovation first, safety later” culture of big tech. Although the Bill will rightly prohibit the creation of models specifically designed to generate CSA images, it remains silent on general-purpose models that can be easily manipulated or jailbroken to produce the same horrific results. As the unacceptable use of tools such as Grok—referred to by my noble friend Lady Benjamin in her powerful speech—has recently illustrated, we cannot leave the safety of our children to chance. We face a technological and moral emergency. The Internet Watch Foundation, represented at the meeting today which the noble Lord, Lord Russell, and my noble friend mentioned, has warned of a staggering 380% increase in confirmed cases of AI-generated child exploitation imagery. The noble Lord, Lord Russell, is right that the extent of this abuse is sickening beyond imagination.

The amendment would mandate a safety-by-design intervention, requiring providers to proactively risk-assess their services and report identified risks to Ofcom within 48 hours. In Committee, the Minister, the noble Lord, Lord Hanson, pushed back against this proposal, arguing that it

“would place unmanageable and unnecessary operational burdens on … the National Crime Agency and Ofcom”.—[Official Report, 27/11/25; col. 1533.]

He further claimed that these measures risk creating “legal uncertainty” by “duplicating” the Online Safety Act. Both assertions need rebutting. First, protecting children from an industrial-scale explosion of AI-generated abuse is not an unnecessary burden; it is the primary duty of our law enforcement and regulatory bodies. Secondly, we cannot rely on the theoretical protections of an Online Safety Act designed for a world before generative AI. Ofcom itself has maintained what might be called a tactical ambiguity about how the Act applies to stand-alone AI chatbots and large language models.

Alongside the noble Baroness, Lady Kidron, whom we will support if she puts the amendment to a vote, we ask for an ex ante duty: providers must check whether their models can be used to generate CSAM before they are released to the public. Voluntary commitments and retrospective enforcement are simply not enough. The Government have already committed to this principle; it is time to put that commitment into statute. I urge the Minister to accept Amendment 209 and ensure that we move away from ex post measures that address harm only after a child has been victimised.

The current definitions of “search” and “user-to-user” services do not neatly or comprehensively capture these new generative technologies. We cannot allow a situation in which tech developers release highly capable models to the public without first explicitly checking whether they can be used to generate CSAM. We need this explicit statutory duty in the Bill today.