Crime and Policing Bill (Seventh sitting)

Debate between Joe Robertson and Jess Phillips
Joe Robertson

I also rise to support the clauses. As we have heard, artificial intelligence poses one of the biggest threats to online child safety in a generation. It is too easy for criminals to use AI to generate and distribute sexually explicit content of children.

As the UK’s frontline against child sexual abuse imagery, the IWF was among the first to sound the alarm about AI being used in this way. In October 2023, the IWF revealed the presence of more than 20,000 AI-generated images, 3,000 of which depicted criminal child sexual abuse. The creation and distribution of AI-generated child sexual abuse material is already an offence under UK law, but AI’s capabilities have far outpaced our laws. My concern is that they will continue to do so. We must keep the law in this area under constant review.

Offenders can now legally download the tools that they need to generate these images and produce as many as they want offline, with the high level of anonymity that can be achieved through open-source technology. Herein lies a problem: software created for innocent purposes can be appropriated and used for the most grim and hideous purposes. It is all very well making the activity illegal—I support the Government in tackling it—but the Government must also take steps, as indeed they are, to limit, curtail and disrupt criminals’ access to the tools used to carry out their crimes. The Government would do so with regard to any other crime, and it so happens that this is a particularly evil crime that uses cutting-edge and developing technology.

I am concerned about detection in this area. The Minister has been asked to confirm—I am sure she will—that social media companies carrying out lawful activity will not be captured by this law. I do not think it is controversial to say that, in other areas, social media companies have not lived up to their responsibilities to detect crime, support law enforcement agencies in detecting crime and detect criminals who are using their platforms to enhance and enable their own criminal activities.

I hope and am sure that the Government are bringing pressure to bear on social media companies to help with detection of these crimes. It is all very well for social media companies, which are probably exclusively very large, international or multinational companies, to say that they are not the perpetrators of crime, but they do provide platforms and they have huge capabilities to enable detection. I would expect them to step up and put all the resources that they have into detecting or helping law enforcement to detect these vile and horrible crimes.

Jess Phillips

I completely agree with the hon. Member for Isle of Wight East that there is a real responsibility on our tech giants. The hon. Member for Windsor talked about the Internet Watch Foundation; the basis of its model is a partnership with social media firms whereby they provide it with huge amounts of the data, so they are not without efforts in the space of child abuse detection—they have been partners in it for many years. However, I think that it is uncontroversial to say that more needs to be done. We as policymakers and lawmakers have to keep a constant eye on how things change.

The shadow Minister, the hon. Member for Gordon and Buchan, asked a series of questions. She asked, “What if someone uses electronic services without the knowledge of the service provider?” An individual must have the intention of facilitating child sexual exploitation and abuse to be convicted under this offence. Where an internet service is used without the knowledge or intention of a service provider to carry out child sexual exploitation and abuse, the service provider will not be criminally responsible.

The shadow Minister also asked about the interplay with the Online Safety Act. These criminal offences are designed to ensure that we can better counter the threat of AI-generated CSAM. Offences that criminalise the individual user are not in scope of the Online Safety Act; however, the interplay arises in relation to the content created, where that content is in scope. Companies and platforms would then fall under the OSA. I hope that that answers the hon. Lady’s questions.

Question put and agreed to.

Clause 38 accordingly ordered to stand part of the Bill.

Schedule 6 agreed to.

Clauses 39 and 40 ordered to stand part of the Bill.

Clause 41

Notification requirements for offence under section 38

Amendment made: 13, in clause 41, page 46, line 7, at end insert—

“(6) In Schedule 4 to the Modern Slavery Act 2015 (offences to which defence in section 45 does not apply), in paragraph 36D (inserted by section 17), after the entry for section 17 insert—

“section 38 (online facilitation of child sexual exploitation and abuse)”.”—(Jess Phillips.)

This amendment excepts the offence of online facilitation of child sexual exploitation and abuse from the defence in section 45 of the Modern Slavery Act 2015.

Clause 41, as amended, ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Keir Mather.)