Crime and Policing Bill

Debate between Viscount Colville of Culross and Baroness Morgan of Cotes
Viscount Colville of Culross (CB)

My Lords, I put my name to Amendments 479 and 480, and I support the other amendments in this group. I must once again thank my noble friend Lady Kidron for raising an issue which I had missed and which, I fear, the regulator might have missed as well. After extensive research, I too am very worried that the Online Safety Act, which many of your Lordships spent many hours refining, does not cover some of the new developments in the digital world, especially personalised AI chatbots. They are hugely popular with children under 18: 31% use Snapchat’s My AI and 32% use Google’s Gemini.

The Online Safety Act Network set up an account on ChatGPT-5 using a 13-year-old persona. Within two minutes, the chatbot was engaging with the user about mental health and eating disorders and giving advice on how to cut yourself safely. Within 40 minutes, it had generated a list of pills for overdosing. The OSA was intended to stop such online behaviour. Your Lordships worked so hard to ensure that the OSA covered search and user-to-user functions in the digital space, but AI chatbots have varied functionalities that, as my noble friend pointed out, are not clearly covered by the legislation.

My noble friend Lady Kidron pointed out that, although Dame Melanie Dawes confirmed to the Communications and Digital Committee that chatbots are covered by the OSA, Ofcom, in its paper Era of Answer Engines, admits:

“Under the OSA, a search service means a service that is, or which includes, a search engine, and this applies to some (though not all) GenAI search tools”.


There is doubt about whether the AI interpretive process, which can change the original search findings, takes such tools out of the scope of search under the OSA. More significantly, AI chatbots are not covered where the provider creates content that is personalised for one user and cannot be forwarded to another user. I am advised that this is not a user-to-user service as defined under the Act.

One chatbot that seems to fall into this category is Replika. I had never heard of it until I started my research for this amendment. However, 2% of all children aged nine to 17 say that they have used the chatbot, and 18% have heard of it. Its aim is to simulate human interaction by creating a replica chatbot personal to each user. It is very sophisticated in its output, using avatars to create images of a human interlocutor on screen and a speaking voice to reply conversationally to requests. The concern is that, unlike traditional search engines, it is programmed for sycophancy: in other words, to affirm and engage with the user’s responses. The more positive the response, the more engaged the child user becomes. This has led to conversations in which the AI companion talks the child user into self-harm and even suicidal ideation.

Research by Internet Matters found that a third of child users think that interacting with chatbots is like talking to a friend. Most concerning is the level of trust they generate in children, with two in five saying that they have no concerns about the advice they are getting. However, because the replies are designed to be positive, what might have started as trustworthy advice develops into unsafe advice as the conversation continues. My concern is that chatbots are not only reinforcing the echo chambers that we have seen developing for over a decade as a result of social media polarisation but are further eroding children’s critical faculties. We cannot leave the development of critical faculties to the already inadequate media literacy campaigns that Ofcom is developing. The Government need to discourage sycophancy and a lack of critical thinking at their digital source.

A driving force behind the Online Safety Act was the realisation that tech developers were prioritising user engagement over user safety. Once again, we find new AI products that are based on the same harmful principles. In looking at the Government’s headlong rush to surrender to tech companies in the name of AI growth, I ask your Lordships to read the strategic vision for AI laid out in the AI Opportunities Action Plan. It focuses on accelerating innovation but does not once mention any concern about children’s safety. Your Lordships have fought hard to make children’s safety a priority online in legislation. Once again, I ask for these amendments to be scrutinised by Ofcom and the Government to ensure that children’s safety is at the very centre of their thinking as AI develops.

Baroness Morgan of Cotes (Non-Afl)

My Lords, I support the amendments of the noble Baroness, Lady Kidron. I was pleased to add my name to Amendments 266, 479 and 480. I also support the amendment proposed by the noble Lord, Lord Nash.

I do not want to repeat the points that were made—the noble Baroness ably set out the reasons why her amendments are very much needed—so I will make a couple of general points. As she demonstrated, what happens online has what I would call real-world consequences—although I was reminded this week by somebody much younger than me that of course, for the younger generation, there is no distinction between online and offline; it is all one world. For those of us who are older, it is worth remembering that, as the noble Baroness set out, what happens online has real-world, and sadly often fatal, consequences. We should not lose sight of that.

We have already heard many references to the Online Safety Act, which is inevitable. We all knew, even as we were debating the Bill before it was enacted, that there would have to be an Online Safety Act II, and no doubt other versions as well. As we have heard, technology is changing at an enormously fast rate, turbocharged by artificial intelligence. The Government recognise that in Clause 63. But surely the lesson from the past decade or more is that, although technology can be used for good, it can also be used to create and disseminate deeply harmful content. That is why the arguments around safety by design are absolutely critical, yet they have been lacking in some of the regulation and enforcement that we have seen. I very much hope that the Minister will be able to give the clarification that the noble Baroness asked for on the status of LLMs and chatbots under the Online Safety Act, although he may not be able to do so today.

I will make some general points. First, I do not think the Minister was involved in the debate on and scrutiny of, particularly in this Chamber, what became the Online Safety Act. As I have said before, it was a masterclass in what cross-party, cross-House working can achieve, in an area where, basically, we all want to get to the same point: the safety of children and vulnerable people. I hope that the Ministers and officials listening to and involved in this will work with this House, and with Members such as the noble Baroness who have huge experience, to improve the Bill, and no doubt to bring forward changes in the next piece of legislation and the one after that. We will always be chasing after developments in technology unless we are able to adopt that safety-by-design and preventive approach.

During the passage of the then Online Safety Bill, a number of Members of both Houses, working with experienced and knowledgeable outside bodies, spotted the harms and loopholes of the future. No one has all the answers, which is why it is worth working together to try to deal with the problems caused by new and developing technology. I urge the Government not to play belated catch-up, as we did with internet regulation, platform regulation, search-engine regulation and, more generally, the Online Safety Act. If we can work together to spot the dangers, whether from chatbots, LLMs, AI-generated CSAM or deepfakes, we will do an enormous service to young people, both in this country and globally.

Online Safety Bill

Debate between Viscount Colville of Culross and Baroness Morgan of Cotes
Viscount Colville of Culross (CB)

My Lords, I, too, thank the Minister for the great improvements that the Government have made to the Secretary of State’s powers in the Bill during its passage through this House. I rise to speak briefly today to praise the Government’s new Amendments 1 and 2 to Clause 44. As a journalist, I was worried by the lack of transparency around these powers in the clause; I am glad that the lessons of Section 94 of the Telecommunications Act 1984, which had to be repealed, have been learned. In a world of conspiracy theories that can damage public trust in governmental and regulatory processes, it has never been more important that Parliament and the public are informed about the actions of the Government when it gives directions to Ofcom about the draft codes of practice. So I am glad that these new amendments resolve those concerns.

Baroness Morgan of Cotes (Con)

My Lords, I welcome Amendments 5 and 6, as well as the amendments that reflect the work done and comments made in earlier stages of this debate by the noble Baroness, Lady Kennedy. Of course, we are not quite there yet with this Bill, but we are well on the way as this is the Bill’s last formal stage in this Chamber before it goes back to the House of Commons.

Amendments 5 and 6 relate to the categorisation of platforms. I do not want to steal my noble friend’s thunder, but I echo the comments made about the engagement both from my noble friend the Minister and from the Secretary of State. I am delighted that the indications I have received are that they will accept the amendment to Schedule 11, which this House voted on just before the Recess; that is a significant and extremely welcome change.

When commentators outside talk about the work of a revising Chamber, I hope that this Bill will be used as a model for cross-party, non-partisan engagement in how we make a Bill as good as it possibly can be—particularly when it is as ground-breaking and novel as this one is. My noble friend the Minister said in a letter to all of us that this Bill had been strengthened in this Chamber, and I think that is absolutely right.

I also want to echo the thanks to the Bill team, some of whom I was working with four years ago when we were talking about this Bill. They have stuck with it through thick and thin. I thank not only noble Lords across the House for their support for the amendments but also all of those outside this House who have committed such time, effort, support and expertise to making sure that this Bill is as good as possible. I wish it well in its final stages. I think we all look forward to both Royal Assent and the next big challenge, which is implementation.