My Lords, in moving Amendment 266, I will speak also to Amendments 479 and 480, all of which are in my name. I thank the noble Baroness, Lady Morgan, the noble Lords, Lord Clement-Jones and Lord Russell, and the noble Viscount, Lord Colville, for their support.
All three amendments concern illegal or harmful online activity. Amendment 266 places a legal duty on online services, including generative AI services, to conduct risk assessments evaluating the likelihood that their systems could be used to create or facilitate child sexual abuse material. Subsection (1) of the proposed new clause establishes that duty. Subsection (2) requires providers to report the results to Ofcom or the National Crime Agency, depending on whether or not they are regulated under the Online Safety Act. Subsections (3) to (7) set out the enforcement mechanisms, drawing on Ofcom’s existing enforcement powers under the OSA or equivalent powers for the NCA.
Amendment 266 complements Clause 63, which creates the new offence relating to the supply of CSA image generators to which the Minister has just spoken; it is in addition to those powers. In June 2023, the BBC reported that the open-source AI model Stable Diffusion was being used to generate child sexual abuse material. Researchers at Stanford University subsequently found that Stable Diffusion had been trained on datasets containing child sexual abuse material. This issue is not confined to a single model: the Internet Watch Foundation and the chair of the AI Security Institute have both warned of the potential for open-source AI models to be used to create CSAM.
I am very happy to arrange a meeting with an appropriate Minister, and I would be happy to sit in on it myself. Other Ministers may wish to take the lead on this, because there are technology issues as well. I have Home Office responsibilities across the board, but I have never refused a meeting with a Member of this House in my 16 months here and I am not going to start now, so the answer to that question is yes. The basic presumption at the moment is that we are not convinced that the technology is yet at the stage that the noble Lord believes it to be, but that is a matter for the future. I again give him the assurance that, should the technology prove successful, the Government will wish to examine it in some detail.
I have absolutely no doubt that we will revisit these matters but, for the moment, I hope that the noble Baroness can withdraw her amendment.
I pay tribute to the noble Lord, Lord Nash, for his amendment, for his fierce pursuit of this issue and for bringing it to our attention. I recognise that this is a Home Office Bill and that some of these matters cross over to DSIT, but what we are also witnessing is crime. The Home Office must understand that not everything can be pushed to DSIT.
Your Lordships have just met the tech Lords. These are incredibly informed people from all over the Chamber who share a view that we want a technological world that puts kids front and centre. We are united in that and, as the Minister has suggested, we will be back.
I have three very quick points. First, legal challenges, operational difficulties and the capacity of the NCA and Ofcom were exactly the same reasons why Clause 63 was not in the Online Safety Bill or the Data (Use and Access) Bill. It is unacceptable for officials always to answer with those generalities. Many noble Lords said, during the passage of the Online Safety Bill, “It’s so difficult” and “This is new”. It is not new: we raised these issues before. If we had acted three or four years ago, we would not be in this situation. I urge this Government to get on the front foot, because we know what is coming.
My Lords, I thank the Minister for her kind words about the new offence with respect to child sexual abuse image generators and I take the opportunity to recognise the work of the specialist police unit that has worked alongside me on these and other issues. Working at the front line of child sexual abuse detection and enforcement is to come up against some of the most sordid and horrendous scenarios that can make you lose faith in humanity, so I want to put on record our huge debt to those in the unit for their courage and commitment.
I was also pleased to hear the Minister’s commitment to criminalise pornography that depicts acts of strangulation and suffocation. This is one of a number of concerns that the noble Baroness, Lady Bertin, will speak to shortly, and I shall be supporting her on all her amendments. During the Recess, I chaired a meeting of extremely senior health professionals; the prevalence of young people presenting in clinical settings suffering from violence and abuse during sex was simply horrific, with outcomes ranging from fear and trauma to death itself. There is an epidemic of sexual violence, normalised and driven by pornography, and I very much hope that the noble Baroness will have the support of the whole House on this matter.
I have four further areas of concern and I am going to touch on each very briefly. First, this House successfully introduced amendments to the Data (Use and Access) Act to empower coroners to require technology companies to preserve data when a child has died. At the time, we proposed that preservation notices should be automatic and that statutory guidance should be developed, but this was refused. We now have the law, but bereaved families are still unable to benefit from its provisions, because preservation notices are not being used quickly enough, nor are the powers fully understood. It is simply heartbreaking to see a parent who has just lost a child become a victim of a system that does not understand or use its own powers. I will be tabling amendments to make the new law work as was promised and as Parliament intended.
Secondly, we have all seen media reports of chatbots suggesting illegal content or activity to children. I remain unclear about the Government’s appetite to strengthen Ofcom’s codes or to resolve the differences of opinion between Parliament and the regulator about the scope of the Online Safety Act. Nevertheless, I will be seeking to ensure that AI chatbots that suggest or facilitate illegal activity are addressed in the Crime and Policing Bill.
Thirdly, as I have indicated, I welcome the CSAM generator offence in the Bill, but a gap remains and I will be tabling amendments to place clear, legally binding duties on developers of generative AI systems to conduct risk assessments, identifying whether and how their systems could be misused for this narrow but devastating purpose.
Finally, I am curious about youth diversion orders. I am by no means against them, but I would like to understand whether they are to be backed up by other support, such as autism screening and therapeutic support. Many at the front line of this issue say that there is a serious lack of resource, and I would be interested to hear from Ministers how young people are to be supported once diverted, and whether the Government have plans to look further at the responsibility of tech companies that deliberately design for constant engagement, even if extreme content is being used simply as bait. It is no longer possible to consider the online world as separate from any other environment and, if we do not impose the legal order we require elsewhere, we will continue to create a place of lawlessness and abuse.