Lord Allan of Hallam (LD)

My Lords, on behalf of my noble friend Lord Clement-Jones, I will speak in support of Amendments 195, 239, 263 and 286, to which he added his name. He wants me to thank the Carnegie Trust and the Institution of Engineering and Technology, which have been very helpful in flagging relevant issues for the debate.

Some of the issues in this group of amendments will range much more widely than simply the content we have before us in the Online Safety Bill. The right reverend Prelate the Bishop of Chelmsford is right to flag the question of a risk assessment. People are flagging to us known risks. Once we have a known risk, it is incumbent on us to challenge the Minister to see whether the Government are thinking about those risks, regardless of whether the answer is something in the Online Safety Bill or that there needs to be amendments to wider criminal law and other pieces of legislation to deal with it.

Some of these issues have been dealt with for a long time. If you go back and look at the Guardian for 9 May 2007, you will see the headline,

“Second Life in virtual child sex scandal”.


That case was reported in Germany about child role-playing in Second Life, which is very similar to the kind of scenarios described by various noble Lords in this debate. If Second Life was the dog that barked but did not bite, we are in quite a different scenario today, not least because of the dramatic expansion in broadband technology, for which we can thank the noble Baroness, Lady Harding, in her previous role. Pretty much everybody in this country now has incredible access, at huge scale, to high-speed broadband, which allows those kinds of real life, metaverse-type environments to be available to far more people than was possible with Second Life, which tended to be confined to a smaller group.

The amendments raise three significant groups of questions: first, on scope, and whether the scope of the Online Safety Bill will stretch to what we need; secondly, on behaviour, including the kinds of new behaviours, which we have heard described, that could arise as these technologies develop; and, finally, on agency, which speaks to some of the questions raised by the noble Baroness, Lady Fox, on AIs, including the novel questions about who is responsible when something happens through the medium of artificial intelligence.

On scope, the key question is whether the definition of “user-to-user”, which is at the heart of the Bill, covers everything that we would like to see covered by the Bill. Like the noble Baroness, Lady Harding, I look forward to the Minister’s response; I am sure that he has very strongly prepared arguments on that. We should take a moment to give credit to the Bill’s drafters for coming up with these definitions for user-to-user behaviours, rather than using phrases such as, “We are regulating social media or specific technology”. It is worth giving credit, because a lot of thought has gone into this, over many years, with organisations such as the Carnegie Trust. Our starting point is a better starting point than many other legislative frameworks which list a set of types of services; we at least have something about user-to-user behaviours that we can work with. Having said that, it is important that we stress-test that definition. That is what we are doing today: we are stress-testing, with the Minister, whether the definition of “user-to-user” will still apply in some of the novel environments.

It certainly seems likely—and I am sure that the Minister will say this—that a lot of metaverse activity would be in scope. But we need detailed responses from the Minister to explain why the kinds of scenario that have been described—if he believes that this is the case; I expect him to say so—would mean that Ofcom would be able to demand things of a metaverse provider under the framework of the user-to-user requirements. Those are things we all want to see, including the risk assessments, the requirement to keep people away from illegal content, and any other measures that Ofcom deems necessary to mitigate the risks on those platforms.

It will certainly be useful for the Minister to clarify one particular area. Again, we are fortunate in the UK that pseudo-images of child sexual abuse are illegal and have been illegal for a long time. That is not the case in every country around the world, and the noble Lord, Lord Russell, is quite right to say that this is an area where we need international co-operation. From my experience of dealing with this on the platforms, I know that some countries have actively chosen not to criminalise pseudo-images; others simply have not considered it.

In the UK, we were ahead of the game in saying, “If it looks like a photo of child abuse, we don’t care whether you created it on Photoshop, or whatever—it is illegal”. I hope that the Minister can confirm that avatars in metaverse-type environments would fall under that definition. My understanding is that the legislation refers to photographs and videos. I would interpret an avatar or activity in a metaverse as a photo or video, and I hope that is what the Government’s legal officers are doing.

Again, it is important in the context of this debate and the exchange that we have just had between the noble Baronesses, Lady Harding and Lady Fox, that people out there understand that they do not get away with it. If you are in the UK and you create a child sexual abuse image, you can be taken to court and go to prison. People should not think that, if they do it in the metaverse, it is okay—it is not okay, and it is really important that that message gets out there.

This brings us to the second area of behaviours. Again, some of the behaviours that we see online will be extensions of existing harms, but some will be novel, based on technical capabilities. Some of them we should just call by their common or garden term, which is sexual harassment. I was struck by the comments of the noble Baroness, Lady Berridge, on this. If people go online and start approaching other people in sexual terms, that is sexual harassment. It does not matter whether it is happening in a physical office, on public transport, on traditional social media or in the metaverse—sexual harassment is wrong and, particularly when directed at minors, a really serious offence. Again, I hope that all the platforms recognise that and take steps to prevent sexual harassment on their platforms.

That is quite a lot of the activity that people are concerned about, but other activities are much more complex and may require updates to legislation. Those are particularly activities such as role-playing online, where people play roles and carry out acts that would be illegal if done in the real world. That is particularly difficult when it takes place between consenting adults who choose to carry out a role-playing activity that would replicate an illegal act were it done in the real world. That is hard—and those with long memories may remember the group of cases around Operation Spanner in the 1990s, in which a group of men was prosecuted for consensual sadomasochistic behaviour. The case went backwards and forwards, but it spoke to something that the noble Baroness, Lady Fox, may be sympathetic to—the point at which the state should intervene on sexual activities that many people find abhorrent but which take place between consenting adults.

In the context of the metaverse, I see those questions coming front and centre again. There are all sorts of things that people could role-play in the metaverse, and we will need to take a decision on whether the current legislation is adequate or needs to be extended to cater for the fact that it now becomes a common activity. Also important is the nature of it. The fact that it is so realistic changes the nature of an activity; you get a gut feeling about it. The role-playing could happen today outside the metaverse, but once you move it in there, something changes. Particularly when children are involved, it becomes something that should be a priority for legislators—and it needs to be informed by what actually happens. A lot of what the amendments seek to do is to make sure that Ofcom collects the information that we need to understand how serious these problems are becoming and whether they are, again, something that is marginal or something that is becoming mainstream and leading to more harm.

The third and final question that I wanted to cover is the hardest one—the one around agency. That brings us to thinking about artificial intelligence. When we try to assign responsibility for inappropriate or illegal behaviour, we are normally looking for a controlling mind. In many cases, that will hold true online as well. I know that the noble Lord, Lord Knight of Weymouth, is looking at bots—and with a classic bot, you have a controlling mind. When the bots were distributing information in the US election on behalf of Russia, that was happening on behalf of individuals in Russia who had created those bots and sent them out there. We still had a controlling mind, in that instance, and a controlling mind can be prosecuted. We have that in many instances, and we can expect platforms to control them and expect to go after the individuals who created the bots in the same way that we would go after things that they do as a first party. There is a lot of experience in the fields of spam and misinformation, where “bashing the bots” is the daily bread and butter of a lot of online platforms. They have to do it just to keep their platforms safe.

We can also foresee a scenario with artificial intelligence whereby it is less obvious that there is a controlling mind, or who the controlling mind should be. I can imagine a situation whereby an artificial intelligence has created illegal content, whether that is child sexual abuse material or something else in the schedule of illegal content in the Bill, without the user having expected it to happen or the developer having believed or contemplated that it could happen. Let us say that the artificial intelligence goes off and creates something illegal, and that the user can show the question they asked of it and the developer can show how they coded it, demonstrating that neither of them intended that thing to happen. By definition, the artificial intelligence has its own agency in that scenario. The artificial intelligence cannot be fined or sent to prison. There are some things that we can do: we can try to retrain it, or we can kill it. There is always a kill switch; we should never forget that with artificial intelligence. Sam Altman at OpenAI can turn off ChatGPT if it is behaving in an illegal way.

There are some really important questions around that issue. There is the liability for the specific instance of the illegality happening. Who do we hold liable? Even if everyone says that it was not their intention, is there someone that we can hold liable? What should the threshold be at which we can execute that death sentence on the AI? If an AI is being used by millions of people and on a small number of occasions it does something illegal, is that sufficient? At what point do we say that the AI is rogue and that, effectively, it needs to be taken out of operation? Those are much wider questions than the ones we are dealing with immediately in the Bill, but I hope that the Minister can at least point to what the Government are thinking about these kinds of legal questions, as we move from a world of user-to-user engagement to user-to-user-to-machine engagement, when that machine is no longer a creature of the user.

Baroness Berridge (Con)

I have had time just to double-check the offences. The problem that exists—and it would be helpful if my noble friend the Minister could confirm this—is that the criminal law is defined in terms of a person. It is not automatic that sexual harassment, particularly if you are not wearing a haptic suit, would actually fall within the criminal law, as far as I understand it, which is why I am asking the Minister to clarify. That was the point that I was making. Harassment per se also needs a course of conduct, so a single touch of your avatar of a sexual nature would clearly fall outside the criminal law. That is the point of clarification that we might need on how the criminal law is framed at the moment.

Lord Allan of Hallam (LD)

I am grateful to the noble Baroness. That is very helpful.

Baroness Harding of Winscombe (Con)

That is exactly the same issue as with child sexual abuse images—it is about the way in which the criminal law is written. Not surprisingly, it is not up to date with the evolution of technology.

Lord Allan of Hallam (LD)

I am grateful for that intervention as well. That summarises the core questions that we have for the Minister. Of the three areas that we have for him, the first is the question of scope and the extent to which he can assure us that the Bill as drafted will be robust in covering the metaverse and bots, which are the issues that have been raised today. The second, on behaviours, relates to the two interventions that we have just had. We have been asking whether the criminality of behaviours that are criminal today will stretch to new, similar forms of behaviour taking place in new environments—let us put it that way. The behaviour, the intent and the harm are the same, but the environment is different. We want to understand the extent to which the Government are thinking about that, where that thinking is happening and how confident they are that they can deal with it.

Finally, on the question of agency, how do the Government expect to deal with the fact that we will have machines operating in a user-to-user environment when the connection between the machine and another individual user is qualitatively different from anything that we have seen before? Those are just some small questions for the Minister on this Thursday afternoon.

Lord Knight of Weymouth (Lab)

My Lords, the debate on this group has been a little longer, deeper and more important than I had anticipated. It requires all of us to reflect before Report on some of the implications of the things we have been talking about. It was introduced masterfully by the noble Baroness, Lady Harding, and her comments—and those from the noble Baronesses, Lady Finlay and Lady Berridge—were difficult to listen to at times. I also congratulate the Government Whip on the way he handled the situation so that innocent ears were not subject to some of that difficult listening. But the questions around the implications of virtual reality, augmented reality and haptic technology are really important, and I hope the Minister will agree to meet with the noble Baroness, Lady Berridge, and the people she referenced to reflect on some of that.

--- Later in debate ---
Clause 159 requires the Secretary of State to undertake a review into the operation of the regulatory framework between two and five years after the provisions come into effect. This review will consider any new emerging trends or technologies, such as AI, which could have the potential to compromise the efficacy of the Bill in achieving its objectives. I am happy to assure the noble Viscount, Lord Colville of Culross, and the right reverend Prelate the Bishop of Chelmsford that the review will cover all content and activity being regulated by the Bill, including legal content that is harmful to children and content covered by user-empowerment tools. The Secretary of State must consult Ofcom when she carries out this review.
Lord Allan of Hallam (LD)

Will the review also cover what has been happening in criminal cases where, in some of the examples that have been described, people have tried to take online activity to court? At that point we will understand whether the judges believe that existing offences cover some of these novel forms of activity. I hope the review will extend not just to what Ofcom does as a regulator but to what the courts are doing with the definitions of criminal activity, and whether those definitions are proving effective in the new online spaces.

Lord Parkinson of Whitley Bay (Con)

I believe it will. Certainly, both government and Parliament will take into account judgments in the court on this Bill and in related areas of law, and will, I am sure, want to respond.