Lord Bethell (Conservative - Excepted Hereditary)
Lords Chamber

My Lords, I shall speak very briefly. Earlier—I suppose it was this morning—we talked about child criminal exploitation at some length, thanks particularly to the work of the noble Baroness, Lady Casey, and Professor Jay. Essentially, what we are talking about in this group of amendments is child commercial exploitation. All these engines, all these technologies, are there for a commercial purpose. They have investors who are expecting a return and, to maximise the return, these technologies are designed to drive traffic, to drive addiction, and they do it very successfully. We are way behind the curve—we really are.
I echo what the noble Baroness, Lady Morgan, said about the body of knowledge within Parliament, in both Houses, that was very involved in the passage of the Online Safety Act. There is a very high level of concern, in both Houses, that we were perhaps too ambitious in assuming that a regulator that had not previously had any responsibilities in this area would be able to live up to the expectations held, and indeed some of the promises made, by the Government during the passage of that Act. I think we need to face up to that: we need to accept that we have not got it off to as good a start as we wanted and hoped, and that the technologies we have been hearing about are now racing ahead so quickly that we are finding it hard to catch up. Indeed, looking at the body language of your Lordships in the Chamber, and at the expressions on our faces as some of what we are talking about is described: if it is having that effect on us, imagine what effect it is having on the children who in many cases are the subjects of these technologies.
I plead with the Minister to work very closely with his new ministerial colleague, the noble Baroness, Lady Lloyd, and DSIT. We really need to get our act together and focus; otherwise, we will have repeats of these sorts of discussions where we raise issues that are happening at an increasing pace, not just here but all around the world. I fear that we are going to be holding our hands up, saying “We’re doing our best and we’re trying to catch up”, but that is not good enough. It is not good enough for my granddaughter and not good enough for the extended families of everybody here in this Chamber. We really have to get our act together and work together to try to catch up.
My Lords, I too support the amendments in this group, particularly those tabled by my noble friend Lord Nash on security software and by the noble Baroness, Lady Kidron, on AI-generated child sexual abuse material. I declare my interest as a trustee of the Royal Society for Public Health.
As others have noted, the Online Safety Act was a landmark achievement and, in many ways, something to be celebrated, but technology has not stood still—we said it at the time—and nor can our laws. It is important that we revisit it in examining this legislation, because generative AI presents an egregious risk to our children that was barely imaginable even two years ago when we were discussing that Act. These amendments would ensure that our regulatory architecture keeps pace.
Amendment 266 on AI CSAM risk assessment is crucial. It addresses a simple but profound question: should the provider of a generative AI service be required to assess whether that service could be used to create or facilitate child sexual abuse material? Surely the answer is yes. This is not a theoretical risk, as we have heard in testimony from many noble Lords. We know that AI can generate vivid images, optimised on datasets scraped from children themselves on the open internet, and can be prompted to create CSAM-like content. On this, there is no ambiguity at all. We know that chatbots trained on vast corpora of text from children can be manipulated to generate grooming scripts and sexualised narratives to engage children and make them semi-addicted to those conversations. We know that these tools are increasingly accessible, easy to use and almost impossible to monitor by parents and, it seems, regulators.
There are a number of issues before the Committee today and the Government will reflect on all the points that have been mentioned. However, the view at the moment is that these amendments would risk creating significant legal uncertainty by duplicating and potentially undermining aspects of the Online Safety Act.
My Lords, I am enormously grateful to the Minister for reassuring us that all chatbots are captured by the Online Safety Act; that is very good news indeed. Can he reassure us that Ofcom will confirm that in writing to the House? I appreciate that he is a Home Office Minister, but he speaks on behalf of all of government. I think it is fair, given the nature of the Bill, that he seeks an answer from Ofcom in this matter.
My assessment is that the vast majority of chatbots are captured—
My Lords, I rise to speak to my Amendment 273, which is a very simple amendment that aims to put into action what IICSA recommended: that mandatory reporting of child sexual abuse should happen with no exceptions. The inquiry argued that, even if abuse is disclosed in the context of confession, the person—in this case, the priest—should be legally required to report it. It proposes that failing to report such abuse should itself be a criminal offence.
I am very glad that the right reverend Prelate the Bishop of Manchester is in his place, because I know he has spent a long time on working parties looking at this issue. In earlier discussions in the House, in response to the right reverend Prelate, the noble Lord, Lord Hanson of Flint, said that he had received representations from churches on this issue and expressed the hope that this would be further debated as the Crime and Policing Bill went through Parliament. My amendment is simply here to enable that debate to happen.
My Lords, I rise to speak in support of my noble friend Lord Polak and his Amendment 286A. As he lucidly put it, this amendment proposes to close several glaring loopholes in the offences outlined in Clause 79; otherwise, I fear it will fail to meet the aims and expectations placed on it by this Committee.
Our criminal justice system should be equipped with new laws to hold accountable all those who cover up child sexual abuse. The noble Baroness, Lady Featherstone, put that case incredibly well and touchingly. It needs to be known that if someone acts purposefully to stop child sexual abuse being properly investigated and so denies the victims and survivors the protection and justice they are entitled to, they will face strong criminal penalties. That is why I support the Bill’s inclusion of Clause 79, which seeks to introduce new criminal offences for preventing or deterring someone, under the new mandatory reporting duty, from making a report. However, its drafting means that it would be limited in its ability to contribute meaningfully to the important mission of tackling child sexual abuse that we across the Committee strongly support.
Clause 79 is dependent upon the new mandatory duty to report. The clause not only requires the action taken to directly involve a reporter under the duty; it also requires the person attempting to conceal the abuse to know that the person they are deterring is a mandated reporter. This brings with it a whole host of legal complexities. What does it mean to know that someone is under the duty? Does it require them also to know that the child sex offence has taken place, so as to trigger the said duty? How could it be convincingly proved in the courts that someone accused of putting the needs of their institution above protecting a child also understood what the duty is, who it applies to and how that factored into their actions? These are important questions that need to be resolved.