All 4 Lord Nash contributions to the Crime and Policing Bill 2024-26


Thu 27th Nov 2025 — Crime and Policing Bill, Lords Chamber (Committee stage, part two)
Tue 9th Dec 2025 — Crime and Policing Bill, Lords Chamber (Committee stage, part one)
Mon 2nd Mar 2026 — Crime and Policing Bill, Lords Chamber
Wed 18th Mar 2026 — Crime and Policing Bill, Lords Chamber (Report stage, part one)

Crime and Policing Bill Debate

Department: Home Office
This House has long campaigned for the Government to include addictiveness as a stand-alone harm. We believed we had secured it on the last day of Report on what is now the OSA, but Ofcom has repeatedly said that it does not have the power. I recognise that this last point is outwith this Bill and my amendments, but can the Minister go back to the Government and ask: if the regulator does not have the power to regulate addictiveness, would the Secretary of State use her powers under the Act to bring forward a code of conduct on it? When we advocate for a safer online environment by making an analogy with smoking, very often a Minister, an interviewer, a tech lobbyist or a civil servant interjects to say that it is a false analogy because tech does not kill. We are well past that; my inbox is a litany of bereaved parents. It does kill. I beg to move.
Lord Nash (Con)

My Lords, Amendment 271A is in my name and I support the other amendments in this group. As this is the first time I have spoken on the Bill, I draw attention to my interests on the register, particularly the fact that I am an investor in a wide range of companies, including many software companies.

My Amendment 271A, if passed, would have the effect of software being used to screen out all child sexual abuse material, including live-streaming, on smartphones and tablets, and in due course on all devices. It would also apply to private communications, which is where the majority of live-streamed child sexual abuse takes place and which is not covered by the Online Safety Act.

--- Later in debate ---
Lord Hanson of Flint (Lab)

I am grateful to my noble friend for that and for her contribution to the debate and the experiences she has brought. The monitoring and evaluation of the online safety regime is a responsibility of DSIT and Ofcom, and they have developed a framework to monitor the implementation of the Act and evaluate core outcomes. This monitoring and evaluation is currently tracking the effect of the online safety regime and feeding into a post-implementation review of the 2023 Act. Where there is evidence of a need to go further to keep children safe online, including from AI-enabled harms, the Government will not hesitate to act.

If the noble Baroness, Lady Kidron, will allow DSIT and Ofcom to look at those matters, I will make sure that DSIT Ministers are apprised of the discussion that we have had today. It is in this Bill, which is a Home Office Bill, but it is important that DSIT Ministers reflect on what has been said. I will ensure that we try to arrange that meeting for the noble Baroness in due course.

I want also to talk about Amendments 271A and 497ZA from the noble Lord, Lord Nash, which propose that smartphone and tablet manufacturers, importers and distributors be required to ensure that any device they supply is preinstalled with technology that prevents the recording and viewing of child sexual abuse material or similar material. I acknowledge the noble Lord’s very valid intention concerning child safety and protection, and to prevent the spread of child sexual abuse material online. To that end, there is a shared agreement with the Government on the need to strengthen our already world-leading online safety regime wherever necessary.

I put to the noble Lord, and to the noble Lord, Lord Bethell, on his comments in support, that if nudity detection technology could be effectively deployed at scale, there could be a significant limiting impact on the production and sharing of child sexual abuse material. I accept that, but we must get this right. Application of detection technology that detects and blocks all nudity, adult and child, but which is primarily targeted at children, would be an effective intervention. I and colleagues across government want to gather evidence about the application of such technology and its effectiveness and impact. However, our assessment is that further work is needed to understand the accuracy of such tools and how they may be implemented.

We must also consider the risks that could arise from accepting this amendment, including legitimate questions about user privacy and data security. If it helps the noble Lord, Lord Nash, we will continue to assess the effect of detection tools on the performance of mobile devices so that we can see how easy it is to circumvent them, how effective they are and a range of other matters. The Government’s focus is on protective measures within the Online Safety Act, but we are actively considering in parallel the potential benefits of the technology that the noble Lord has mentioned and others like it. There will be further government interventions in future, but they must be proportionate and driven by evidence. At the moment, we do not have sufficient evidence to enable us to accept the amendment from the noble Lord, but the direction of travel is one that we would support.

Lord Nash (Con)

Will the Minister meet me and representatives from software companies to explain why they say this technology works?

Lord Hanson of Flint (Lab)

I am very happy to arrange a meeting with an appropriate Minister. I would be very happy to sit in on it. Other Ministers may wish to take the lead on this, because there are technology issues as well. I have Home Office responsibilities across the board, but I have never refused a meeting with a Member of this House in my 16 months here and I am not going to start now, so the answer to that question is yes. The basic presumption at the moment is that we are not convinced that the technology is yet at the stage that the noble Lord believes it to be, but that is a matter for future operation. I again give him the assurance that, in the event that the technology proves to be successful, the Government will wish to examine it in some detail.

I have absolutely no doubt that we will revisit these matters but, for the moment, I hope that the noble Baroness can withdraw her amendment.

Crime and Policing Bill Debate

Department: Ministry of Justice
Lord Nash (Con)

My Lords, I support the amendments in this group. It is shameful that we have not yet legislated for parity between the regulation of online and offline pornography and that we are so very late in playing catch-up. What people can view online at a couple of clicks—including children often diverted to this sort of stuff without asking for it—is horrifying. As the report of the noble Baroness, Lady Bertin, stated, over half of 11 to 13 year-olds have seen pornography, often accidentally, and many have seen appalling images of choking, strangulation or sex where one partner is asleep, which is of course a non-consensual act—rape.

Therapists and front-line practitioners often describe a growing number of clients stating that porn consumption led them to child sexual abuse material. In the late 1980s, the Home Office commissioned a study that showed that fewer than 10,000 child sexual abuse images were available online. Today, it is conservatively estimated that, worldwide, the number of child sexual abuse images is 70 million to 80 million.

The internet has become a place where you can search for and find absolutely anything. If you cannot find it, you can create it yourself using AI and LLMs that are on the market, with no guard-rails. For example, generative AI can be and has been used to create pictures of someone’s older self abusing their younger self, including, in one series of images, that self as an eight year-old abusing themself as a two year-old. This is not a problem of the dark web; this is available easily, at a few clicks, on popular social media sites. One social media site alone hosts and facilitates by far the greatest number of cases of sextortion and, in a number of cases, this has led to young people taking their own lives.

Bad actors are also exploiting generative AI to sexually extort. Com groups are driving abuse and exploitation behaviours that are unimaginable, including cutting competitions where the winner is the person who cuts the deepest. Other com groups are used by adults—bad actors—to groom the most vulnerable children and control them to engage in the most horrifying acts, including suicide. One survivor described watching multiple suicides in one group.

Children are using social media to create their own payment models for live sex shows, like the one the recent TV series “Wild Cherry” showed, but much worse. More than half of the 107,000 child sexual abuse and exploitation cases recorded in 2022—a figure that has quadrupled in the last 10 years—were committed by children. Pornography has to play a large part in this. The amendments of the noble Baroness, Lady Bertin, have the support of the NSPCC, the Children’s Commissioner and many other organisations. We must listen to them. It would be completely morally irresponsible for us, as guardians of children, not to act now.

In the last Committee session, the Minister promised me a meeting with the appropriate person and officials to talk about my amendment to allow new technology that is now available to block out child sexual abuse material. He indicated that officials were unsure whether this technology works. Since then, I have met with the providers of this technology again and they have assured me that it does work, certainly for young children, and that they are in active dialogue at a senior level with the head of the technical solutions team at the Home Office, DSIT, the Internet Watch Foundation, the NCA and GCHQ. I very much look forward to that meeting.

I should say that, although I do not think this will happen—I am fully aware of the rules—I have committed to a radio interview, so it is just possible that I may not be here to the end. I think I will be, but I apologise if I am not.

Baroness Owen of Alderley Edge (Con)

My Lords, I pay tribute to my noble friend Lady Bertin for her hard work and her review. I fully support all her amendments, but will focus my remarks on a couple of them. I declare my interest as a guest of Google at its Future Forum, an AI policy conference, and my interest as receiving pro bono legal advice from Mishcon de Reya on my work on intimate image abuse.

On Amendment 292, it is vital that we always remember that consent is a live process, and our law should protect those who have featured in pornographic content and wish to withdraw their consent, no matter how long after publication. One content creator said, “A lot of the videos, I have no rights under; otherwise, I would probably have deleted them all by now”, and went on to describe it as a stigma that will follow her for the rest of her life. Given the huge scale of the porn industry, it is vital that our law protects those who feature and offers them recourse to remove their content should they wish to.

Crime and Policing Bill Debate

Department: Ministry of Justice
Moved by
239A: After Clause 76, insert the following new Clause—
“Action to forestall the sexual exploitation of children by combating CSAM
(1) Within 12 months of the passing of this Act the Secretary of State must, for the purpose of forestalling the sexual exploitation of children, make and bring into force regulations which require manufacturers, importers and distributors of relevant devices to satisfy the CSAM requirement specified in subsection (2).
(2) The ‘CSAM requirement’ is that any relevant device supplied for use in the UK must have installed tamper-proof system software which is highly effective at preventing the recording, transmitting (by any means, including livestreaming) and viewing of CSAM using that device.
(3) The duties of manufacturers, importers and distributors to comply with the CSAM requirement specified by regulations under subsection (1) must be subject to enforcement as if the CSAM requirement was a security requirement for the purposes of Part 1 of the Product Security and Telecommunications Infrastructure Act 2022.
(4) Regulations under subsection (1) must—
(a) enable the Secretary of State, by further regulations, to expand the definition of ‘relevant devices’ to include other categories of device which may be used to record, transmit or view CSAM, and
(b) protect the privacy of the users of relevant devices through making provision to ensure that software of the kind required by subsection (2) does not, and cannot be used to, collect, retain, copy or transmit any data outside of the relevant device on which it is operating, or determine by any means the identity of the user of the relevant device on which it is operating.
(5) A statutory instrument containing regulations under subsection (1) may not be made unless a draft of the instrument has been laid before and approved by a resolution of each House of Parliament.
(6) For the purposes of this section—
“CSAM” means images, video recordings or live videos involving child sexual abuse, including—
(a) any indecent photograph or pseudo-photograph of a child within the meaning of the Protection of Children Act 1978, and
(b) any prohibited image of a child, within the meaning of section 62 of the Coroners and Justice Act 2009, that is not an excluded image within the meaning of section 63 of that Act;
“relevant devices” are smartphones or tablet computers which are either internet-connectable products or network-connectable products for the purposes of section 5 of the Product Security and Telecommunications Infrastructure Act 2022;
“manufacturer”, “importer”, “distributor” and “supply” is each as defined in the Product Security and Telecommunications Infrastructure Act 2022.”
Member’s explanatory statement
This new clause would require the Secretary of State to take action to forestall the sexual exploitation of children by mandating the installation of software which prevents the creation, viewing and sharing of child sexual abuse material on smartphones, tablets, and subsequently other devices, which are supplied for use in the UK.
Lord Nash (Con)

My Lords, as this is the first time I have spoken on Report, I draw attention to my interests in the register, particularly the fact that I am—and have been for many years—an investor in many technology companies, mainly software companies.

I do not think I need to spend too much time telling noble Lords of the appalling worldwide industry of child sexual abuse, as I know many noble Lords are only too aware of it. There have been many powerful speeches about it already today and I went through it in quite a lot of detail in Committee, but I will mention a few facts. It is estimated that in the Philippines alone, one in every 100 children is coerced into this industry, often with their parents’ consent, for the gratification of paedophile customers across the world. It is estimated that around 70 million child sexual abuse images are floating around the internet, many of which are of very young children and some—quite a few, sadly—even of babies, as the noble Lord, Lord Russell of Liverpool, mentioned earlier. Many depict incest.

Some of the victims in these images have been viewed tens of millions of times. Imagine what it is like as a young girl or an adult walking down the street and seeing a man—it would be a man—look at you and peer at you for a few seconds, and to wonder whether that man has seen you raped online. With the advent of AI, it is, as we now know, possible to generate increasingly appalling images from just a text prompt.

Depending on whose statistics you look at, this country is the second- or third-largest consumer of this dreadful stuff in the world. The National Crime Agency issued a report last month saying that it arrests 1,000 paedophiles a month in this country. There are tens of thousands of outstanding investigations, and it is estimated that there are well over half a million offenders in the UK alone. For some offenders, this online abuse is a gateway to real-life contact abuse, as the noble Lord, Lord Stevenson of Balmacara, has mentioned already. There is no doubt that some of this is fuelled by addiction to pornography and the desire for even more extreme content.

Under existing legislation, material can be taken down only once it has been seen—often by children. With livestreaming of this abuse, which is a very large industry, the images are watched in the moment and often immediately taken down. The tech companies already have methods of taking down much of this non-livestreamed material, but most of them are not using these methods effectively. Technology is now available to block on device the viewing of child sexual abuse images, or the making or livestreaming of them.

My amendment would mandate that this technology be installed on smartphones and tablets supplied in the UK. Of course, it would be open to manufacturers to develop their own technology to do that if they did not want to purchase a third-party product. Everyone I have spoken to, from regulators to technology experts and the companies themselves, is completely confident that that can be done. The problem is not the technology; it is achieving very high accuracy levels, at 99%, and very low false positives, at under 1%.

Of course, the Government will also need to be satisfied that the technology works effectively. Several discussions about this have already taken place between the Home Office, DSIT, the Internet Watch Foundation and the technology company I introduced to them. The Government may also, initially at least, because of the difficulty sometimes of telling a 16 or 17 year-old from an 18 year-old, want to bring it in for a lower age. Since at least half of children being abused are under 13, that would be a very good start. My amendment would require the regulations to be brought into force within 12 months, but the regulations could mandate a further period for implementation.

Noble Lords will have noted that in place of my original Amendment 239, I now have down Amendment 239A. The difference is the addition of proposed new subsection (4)(b) to ensure user privacy, which is perfectly possible under the technology because it is on the device; the data is not stored and does not go into the cloud.

We have the opportunity under the Bill to effectively hamper this appalling activity—indeed, industry—thereby saving and protecting many children from harm. I believe we have a moral obligation to pass this into law.

Baroness Benjamin (LD)

My Lords, I have put my name to Amendment 239A in the name of the noble Lord, Lord Nash, as I believe we need to protect our children, however and wherever we can, from child sexual abuse material being created and shared. Shockingly, over 70 million images—yes, 70 million—are being circulated around the world, far beyond these shores, via the scourge of the online world. There is sexual imagery involving children as young as seven to 11, exploited and watched by an ever-growing audience. This is not only immoral but cruel, despicable and illegal. It makes me weep to think that children’s childhoods are being snatched away from them as we speak.

Organisations such as the Internet Watch Foundation have helped to secure arrests of those responsible for CSAM offences but, despite those arrests, the number of offenders continues to grow. Demand is not being diminished; it is being fed by sick-minded, perverted individuals. Heartbreakingly, where demand for new imagery grows, so does the abuse of real children to produce it.

Social media is central to how offenders operate. Some 40% of CSAM offenders attempted to contact a child after viewing material, with 70% doing so online, mostly through social media, gaming and messaging platforms, while 77% of offenders found CSAM on the open web, with 29% citing social media.

I have met young people who have remained victims of this vile practice years after they became adults. They describe the ongoing harm they suffer because the images of their abuse remain in circulation. They have had their abuse material viewed millions and millions of times. Research has confirmed that survivors with an online element to their abuse suffered significantly higher levels of long-lasting harm, including depression and anxiety, post-traumatic stress disorder, self-harm, substance abuse, social isolation and sexual dysfunction, compared with survivors whose abuse was never recorded or shared online.

The cruelty that these survivors must endure extends even further. Some are actively hunted in adult life by offenders seeking to see how they look today. Can your Lordships believe this? With AI, offenders are now generating new abuse imagery featuring adult survivors—in some cases producing material in which the survivor appears to be abusing their younger self. Does that not make you want to cry?

Imagine if it was your child or grandchild, and what it means to live that reality. Imagine a survivor, as the noble Lord, Lord Nash, described, walking down the street, catching the eye of a stranger and immediately, involuntarily, thinking, “Have you seen the image of me being abused?” Does that not make your heart bleed? This is the daily experience of people whose abuse is permanently accessible online.

--- Later in debate ---
I hope that the noble Lord accepts that. I suggest to him that there is no difference between us. We are looking to do more work, and we want to make sure, for the reasons mentioned by the noble Lords, Lord Clement-Jones and Lord Davies of Gower, that the technology has a feasible and impactful way of achieving the same objective in due course.
Lord Nash (Con)

I am grateful to the Minister for his answer and to the other Members who have spoken today. I am satisfied that the Government are seized of this issue. I do not think it will be difficult to satisfy them and Members of this House and the other place that the technology works, the privacy issues can be sorted and we can deal with all their concerns. On the basis of the commitment the Minister has made today, I will not be testing the opinion of the House. I beg leave to withdraw the amendment.

Amendment 239A withdrawn.

Crime and Policing Bill Debate

Department: Home Office
Baroness Cass (CB)

My Lords, I will be very brief. When it comes to assessing risk to children, a plastic bath duck has better risk assessment than AI chatbots. I fully support my noble friend’s amendments.

Lord Nash (Con)

My Lords, I support the amendments in the names of the noble Baroness, Lady Kidron, and others; I commend them on bringing them forward. Social media companies have captured our children’s attention, and now AI chatbots are coming for their affection—and worse. In legislating against harms caused by technology, we are always going to be playing catch-up, but we need to learn quickly to play catch-up much faster. These amendments offer us the opportunity to do that, and we should seize it.

Lord Alton of Liverpool (CB)

My Lords, brevity is the order of the day but, like some of my noble friends, I would like to add my support to the amendments that have been laid before your Lordships’ House by my noble friend Lady Kidron.

The Joint Committee on Human Rights, which I have the privilege of chairing, is currently conducting an inquiry into AI and human rights. We have concluded our evidence taking, and I commend to your Lordships the evidence given by, in particular, Google, Meta and Microsoft. I also highlight some of the concerns that have been raised around child safety.

My noble friend Lady Kidron gave me, the noble Baroness, Lady Boycott, and others the opportunity to meet the parents of Sewell Setzer. It was an extraordinary moment. He was a 14 year-old boy who took his own life because he had been befriended by a chatbot. I was struck by a report from Internet Matters that said that two-thirds of UK children aged between nine and 17 have used AI chatbots, with many engaging often. More than a third—35%—of them say that it is like talking to a friend; that figure rises to 50% among vulnerable children.

It is the obligation of your Lordships’ House to take this issue seriously. We should all be greatly indebted to my noble friend Lady Kidron for laying these amendments before us.