All 2 Baroness Berridge contributions to the Online Safety Act 2023


Tue 25th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 25th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)

Online Safety Bill

Baroness Berridge Excerpts
Baroness Benjamin (LD)

My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.

As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.

The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.

While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Up until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data. It has perhaps been deemed that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House the statistics of how children are accessing pornography and the harm it causes. The Children’s Commissioner also recently highlighted the issue and concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be to ensure that children are protected online.

I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.

Children must be kept safe wherever they are online. This Bill must have the widest possible scope to keep children safe, and ensuring online/offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far-reaching as possible but consistently applied across the online and offline worlds. These are the reasons why I support the amendments in this group.

Baroness Berridge (Con)

My Lords, I will lend my support to Amendments 19 and 22. It is a pleasure to speak after the noble Baroness, Lady Benjamin. I may be one of those people in your Lordships’ House who relies significantly on the British Board of Film Classification for movie watching, as I am one of the faint-hearted.

In relation to app stores, it is not just children under 18 for whom parents need the age verification. If you are a parent of a child who has significant learning delay, the internet is a wonderful place where they can get access to material and have development that they might not ordinarily have had. But, of course, turning 17 or 18 is not the threshold for them. I have friends who have children with significant learning delay. Having that assurance, so they know which apps are which in the app store, goes well beyond 18 for them. Obviously it will not be a numerical equivalent for their child—now a young adult—but it is important to them to know that the content they get on a free app or an app purchased from the app store is suitable.

I just wanted to raise that with noble Lords, as children and some vulnerable adults—not all—would benefit from the kind of age verification that we have talked about. I appreciate the points that the noble Lord, Lord Allan, raised about where the Bill has ended up conceptually and the framework that Ofcom will rely on. Like him, I am a purist sometimes but, pragmatically, I think that the third concept raised by the noble Baroness, Lady Kidron, about protection and putting this in the app store and bringing it parallel with things such as classification for films and other video games is really important.

Lord Clement-Jones (LD)

My Lords, this has been a really fascinating debate and I need to put a stake in the ground pretty early on by saying that, although my noble friend Lord Allan has raised some important points and stimulated an important debate, I absolutely agree with the vast majority of noble Lords who have spoken in favour of the amendment so cogently put forward by the noble Baronesses, Lady Kidron and Lady Harding.

Particularly as a result of the Bill's having been the subject of a Joint Committee, it has changed considerably over time in response to comment, pressure, discussion and debate, and I believe very much that during Committee stage we will be able to make further changes; I hope the Minister will be flexible enough. I do not believe that the framework of the Bill is set in concrete. There are many things we can do as we go through, particularly in the field of making children safer, if we take some of the amendments that have been put forward on board. In particular, the noble Baroness, Lady Kidron, set out why the current scope of the Bill will fail to protect children if it is kept to user-to-user and search services. She talked about blogs with limited functionalities and gaming without user functionalities, and mentioned the whole immersive environment, which the noble Lord, Lord Russell, described as eye-watering. As she said, it is not fair to leave parents or children to work out whether they are on a regulated service. Children must be safe wherever they are online.

As someone who worked with the noble Baroness, Lady Kidron, in putting the age-appropriate design code in place in that original Data Protection Act, I am a fervent believer that it is perfectly appropriate to extend it in the way that is proposed today. I also support her second amendment, which would bring the Bill's child user condition in line with the threshold of the age-appropriate design code. It is the expectation—I do not think it an unfair expectation—of parents, teachers and children themselves that the Bill will apply to children wherever they are online. Regulating only certain services will mean that emerging technologies that do not fit the rather narrow categories will not be subject to safety duties.

Online Safety Bill

Baroness Berridge Excerpts
Baroness Berridge (Con)

My Lords, I rise to support Amendment 241, in the name of the noble Baroness, Lady Finlay, as she mentioned. I also spoke in the Private Member’s Bill that the noble Baroness previously brought before your Lordships’ House, in a similar vein, regarding future-proofing.

The particular issue in Amendment 241 that I wish to address is

“the extent to which new communications and internet technologies allow for behaviours which would be in breach of the law if the equivalent behaviours were committed in the physical world”.

The use of “behaviours” brings into sharp focus the applicability of the Online Safety Bill in the metaverse. Since that Private Member’s Bill, I have learned much about future-proofing from the expert work of the Dawes Centre for Future Crime at UCL. I reached out to the centre as it seemed to me that some conduct and crimes in the physical world would not be criminal if committed in the metaverse.

I will share the example, which seems quite banal, that led me to contact them. The office meeting now takes place in the metaverse. All my colleagues are represented by avatars. My firm has equipped me with the most sophisticated haptic suit. During the meeting, the avatar of one of my colleagues slaps the bum of my avatar. The haptic suit means that I have a physical response to that, to add to the fright and shock. Even without such a suit, I would be shocked and frightened. Physically, I am, of course, working in my own home.

Lord Harlech (Con)

My Lords, I apologise to my noble friend. I ask that we pause the debate to ask this school group to exit the Chamber. We do not think that the subject matter and content will be suitable for that audience. I am very sorry. The House is pausing.

Baroness Finlay of Llandaff (CB)

In this moment while we pause, I congratulate the noble Lord, the Government Whip, for being so vigilant: some of us in the Chamber cannot see the whole Gallery. It is appreciated.

Baroness Berridge (Con)

I, too, thank my noble friend the Government Whip. I apologise too if I have spoken out of discourtesy in the Committee: I was not sure whose name was on which amendment, so I will continue.

Physically, I am, of course, working in my home. If that behaviour had happened in the office, it would be an offence, an assault: “intentional or reckless application of unlawful force to another person”. It will not be an offence in the metaverse and it is probably not harassment because it is not a course of conduct.

Although the basic definition of user-to-user content covers the metaverse, as does "encountering", which has been mentioned in relation to content under Clause 207 and is broad enough to cover the haptic suits, the restriction to illegal content could be problematic, as the metaverse is a complex of live interactions that mimics real life and real-life behaviours, including criminal ones. Also, the avatar of an adult could sexually assault the avatar of a child in the metaverse, and with haptic technologies this would not be just a virtual experience. Potentially even more fundamentally than Amendment 125, the Bill is premised on the internet being a solely virtual environment when it comes to content that can harm. But what I am seeking to outline is that conduct, too, can harm.

I recognise that we cannot catch everything in this Bill at this moment. This research is literally hot off the press; it is only a few weeks old. At the very least, it highlights the need for future-proofing. I am aware that some of the issues I have highlighted about the fundamental difference between conduct and content refer to clauses noble Lords may already have debated. However, I believe that these points are significant; it is just happenstance that the research came out when it did. I would be grateful if the Minister would meet the Dawes Centre urgently to consider whether there are further changes the Government need to make to the Bill to ensure that it covers the harms I have outlined.

Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 195, 239 and 263. I also strongly support Amendment 125 in the name of my noble friend Lady Kidron.

During this Committee there have been many claims that a group of amendments is the most significant, but I believe that this group is the most significant. This debate comes after the Prime Minister and the Secretary of State for Science and Technology met the heads of leading AI research companies in Downing Street. The joint statement said:

“They discussed safety measures … to manage risks”

and called for

“international collaboration on AI safety and regulation”.

Surely this Bill is the obvious place to start responding to those concerns. If we do not future-proof this Bill against the changes in digital technology, which are ever increasing at an ever-faster rate, it will be obsolete even before it is implemented.

My greatest concern is the arrival of AI. The noble Baroness, Lady Harding, has reminded us of the warnings from the godfather of AI, Geoffrey Hinton. If he is not listened to, who on earth should we be listening to? I wholeheartedly support Amendment 125. Machine-generated content is present in so much of what we see on the internet, and its presence is increasing daily. It is the future, and it must be within scope of this Bill. I am appalled by the examples that the noble Baroness, Lady Harding, has brought before us.

In the Communications and Digital Committee inquiry on regulating the internet, we decided that horizon scanning was so important that we called for a digital authority to be created which would look for harms developing in the digital world, assess how serious a threat they posed to users and develop a regulated response. The Government did not take up these suggestions. Instead, Ofcom has been given the onerous task of enforcing the triple shield which under this Bill will protect users to different degrees into the future.

Amendment 195 in the name of the right reverend Prelate the Bishop of Oxford will ensure that Ofcom has knowledge of how well the triple shield is working, which must be essential. Surveys of thousands of users undertaken by companies such as Kantar give an invaluable snapshot of what is concerning users now. These must be fed into research by Ofcom to ensure that future developments across the digital space are monitored, updated and brought to the attention of the Secretary of State and Parliament on a regular basis.

Amendment 195 will reveal trends in harms which might not be picked up by Ofcom under the present regime. It will look at the risk arising for individuals from the operation of Part 3 services. Clause 12 on user empowerment duties has a list of content and characteristics from which users can protect themselves. However, the characteristics for which or content with which users can be abused will change over time and these changes need to be researched, anticipated and implemented.

This Bill has proved in its long years of gestation that it takes time to change legislation, while changes on the internet take just minutes or are already here. The regime set up by these future-proofing amendments will at least go some way to protecting users from these fast-evolving harms. I stress to your Lordships’ Committee that this is very much precautionary work. It should be used to inform the Secretary of State of harms which are coming down the line. I do not think it will give power automatically to expand the scope of harms covered by the regime.

Amendment 239 inserts a new clause for an Ofcom future management of risks review. This will help feed into the Secretary of State review regime set out in Clause 159. Clause 159(3)(a) currently looks at ensuring that regulated services are operating using systems and processes which, so far as relevant, are minimising the risk of harms to individuals. The wording appears to mean that the Secretary of State will be viewing all harms to individuals. I would be grateful if the Minister could explain to the Committee the scope of the harms set out in Clause 159(3)(a)(i). Are they meant to cover only the harms of illegality and harms to children, or are they part of a wider examination of the harms regime to see whether it needs to be contracted or expanded? I would welcome an explanation of the scope of the Secretary of State’s review.

The real aim of Amendment 263 is to ensure that the Secretary of State looks at research work carried out by Ofcom. I am not sure how politicians will come to any conclusions in the Clause 159 review unless they are required to look at all the research published by Ofcom on future risk. I would like the Minister to explain what research the Secretary of State would rely on for this review unless this amendment is accepted. I hope Amendment 263 will also encourage the Secretary of State to look at possible harms not only from content, but also from the means of delivering this content.

This aim was the whole point of Amendment 261, which has already been debated. However, it needs to be borne in mind when considering that harms come not just from content, but also from the machine technology which delivers it. Every day we read about new developments and threats posed by a fast-evolving internet. Today it is concerns about ChatGPT and the race for the most sophisticated artificial intelligence. The amendments in this group will provide much-needed reinforcement to ensure that the Online Safety Bill remains a beacon for continuing safety online.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, on behalf of my noble friend Lord Clement-Jones, I will speak in support of Amendments 195, 239, 263 and 286, to which he added his name. He wants me to thank the Carnegie Trust and the Institution of Engineering and Technology, which have been very helpful in flagging relevant issues for the debate.

Some of the issues in this group of amendments will range much more widely than simply the content we have before us in the Online Safety Bill. The right reverend Prelate the Bishop of Chelmsford is right to flag the question of a risk assessment. People are flagging to us known risks. Once we have a known risk, it is incumbent on us to challenge the Minister to see whether the Government are thinking about those risks, regardless of whether the answer is something in the Online Safety Bill or that there needs to be amendments to wider criminal law and other pieces of legislation to deal with it.

Some of these issues have been dealt with for a long time. If you go back and look at the Guardian for 9 May 2007, you will see the headline,

“Second Life in virtual child sex scandal”.

That case, reported in Germany, concerned child role-playing in Second Life, which is very similar to the kind of scenarios described by various noble Lords in this debate. If Second Life was the dog that barked but did not bite, we are in quite a different scenario today, not least because of the dramatic expansion in broadband technology, for which we can thank the noble Baroness, Lady Harding, in her previous role. Pretty much everybody in this country now has incredible access, at huge scale, to high-speed broadband, which allows those kinds of real-life, metaverse-type environments to be available to far more people than was possible with Second Life, which tended to be confined to a smaller group.

The amendments raise three significant groups of questions: first, on scope, and whether the scope of the Online Safety Bill will stretch to what we need; secondly, on behaviour, including the kinds of new behaviours, which we have heard described, that could arise as these technologies develop; and, finally, on agency, which speaks to some of the questions raised by the noble Baroness, Lady Fox, on AIs, including the novel questions about who is responsible when something happens through the medium of artificial intelligence.

On scope, the key question is whether the definition of “user-to-user”, which is at the heart of the Bill, covers everything that we would like to see covered by the Bill. Like the noble Baroness, Lady Harding, I look forward to the Minister’s response; I am sure that he has very strongly prepared arguments on that. We should take a moment to give credit to the Bill’s drafters for coming up with these definitions for user-to-user behaviours, rather than using phrases such as, “We are regulating social media or specific technology”. It is worth giving credit, because a lot of thought has gone into this, over many years, with organisations such as the Carnegie Trust. Our starting point is a better starting point than many other legislative frameworks which list a set of types of services; we at least have something about user-to-user behaviours that we can work with. Having said that, it is important that we stress-test that definition. That is what we are doing today: we are stress-testing, with the Minister, whether the definition of “user-to-user” will still apply in some of the novel environments.

It certainly seems likely—and I am sure that the Minister will say this—that a lot of metaverse activity would be in scope. But we need detailed responses from the Minister to explain why the kinds of scenario that have been described—if he believes that this is the case; I expect him to say so—would mean that Ofcom would be able to demand things of a metaverse provider under the framework of the user-to-user requirements. Those are things we all want to see, including the risk assessments, the requirement to keep people away from illegal content, and any other measures that Ofcom deems necessary to mitigate the risks on those platforms.

It will certainly be useful for the Minister to clarify one particular area. Again, we are fortunate in the UK that pseudo-images of child sexual abuse are illegal and have been illegal for a long time. That is not the case in every country around the world, and the noble Lord, Lord Russell, is quite right to say that this an area where we need international co-operation. Having dealt with it on the platforms, some countries have actively chosen not to criminalise pseudo-images; others just have not considered it.

In the UK, we were ahead of the game in saying, “If it looks like a photo of child abuse, we don’t care whether you created it on Photoshop, or whatever—it is illegal”. I hope that the Minister can confirm that avatars in metaverse-type environments would fall under that definition. My understanding is that the legislation refers to photographs and videos. I would interpret an avatar or activity in a metaverse as a photo or video, and I hope that is what the Government’s legal officers are doing.

Again, it is important in the context of this debate and the exchange that we have just had between the noble Baronesses, Lady Harding and Lady Fox, that people out there understand that they do not get away with it. If you are in the UK and you create a child sexual abuse image, you can be taken to court and go to prison. People should not think that, if they do it in the metaverse, it is okay—it is not okay, and it is really important that that message gets out there.

This brings us to the second area of behaviours. Again, some of the behaviours that we see online will be extensions of existing harms, but some will be novel, based on technical capabilities. Some of them we should just call by their common or garden term, which is sexual harassment. I was struck by the comments of the noble Baroness, Lady Berridge, on this. If people go online and start approaching other people in sexual terms, that is sexual harassment. It does not matter whether it is happening in a physical office, on public transport, on traditional social media or in the metaverse—sexual harassment is wrong and, particularly when directed at minors, a really serious offence. Again, I hope that all the platforms recognise that and take steps to prevent sexual harassment on their platforms.

That is quite a lot of the activity that people are concerned about, but others are much more complex and may require updates to legislation. Those are particularly activities such as role-playing online, where people play roles and carry out activities that would be illegal if done in the real world. That is particularly difficult when it is done between consenting adults, when they choose to carry out a role-playing activity that replicates an illegal activity were it to take place in the real world. That is hard—and those with long memories may remember a group of cases around Operation Spanner in the 1990s, whereby a group of men was prosecuted for consensual sadomasochistic behaviour. The case went backwards and forwards, but it talked to something that the noble Baroness, Lady Fox, may be sympathetic to—the point at which the state should intervene on sexual activities that many people find abhorrent but which take place between consenting adults.

In the context of the metaverse, I see those questions coming front and centre again. There are all sorts of things that people could role-play in the metaverse, and we will need to take a decision on whether the current legislation is adequate or needs to be extended to cater for the fact that it now becomes a common activity. Also important is the nature of it. The fact that it is so realistic changes the nature of an activity; you get a gut feeling about it. The role-playing could happen today outside the metaverse, but once you move it in there, something changes. Particularly when children are involved, it becomes something that should be a priority for legislators—and it needs to be informed by what actually happens. A lot of what the amendments seek to do is to make sure that Ofcom collects the information that we need to understand how serious these problems are becoming and whether they are, again, something that is marginal or something that is becoming mainstream and leading to more harm.

The third and final question that I wanted to cover is the hardest one—the one around agency. That brings us to thinking about artificial intelligence. When we try to assign responsibility for inappropriate or illegal behaviour, we are normally looking for a controlling mind. In many cases, that will hold true online as well. I know that the noble Lord, Lord Knight of Weymouth, is looking at bots—and with a classic bot, you have a controlling mind. When the bots were distributing information in the US election on behalf of Russia, that was happening on behalf of individuals in Russia who had created those bots and sent them out there. We still had a controlling mind, in that instance, and a controlling mind can be prosecuted. We have that in many instances, and we can expect platforms to control them and expect to go after the individuals who created the bots in the same way that we would go after things that they do as a first party. There is a lot of experience in the fields of spam and misinformation, where “bashing the bots” is the daily bread and butter of a lot of online platforms. They have to do it just to keep their platforms safe.

We can also foresee a scenario with artificial intelligence whereby it is less obvious that there is a controlling mind or who the controlling mind should be. I can imagine a situation whereby an artificial intelligence has created illegal content, whether that is child sexual abuse material or something else that is in the schedule of illegal content in the Bill, without the user having expected it to happen or the developer having believed or contemplated that it could happen. Let us say that the artificial intelligence goes off and creates something illegal, and that both the user and the developer can show the question that they asked of the artificial intelligence and show how they coded it, showing that neither of them intended for that thing to happen. In the definition of artificial intelligence, it has its own agency in that scenario. The artificial intelligence cannot be fined or sent to prison. There are some things that we can do: we can try to retrain it, or we can kill it. There is always a kill switch; we should never forget that with artificial intelligence. Sam Altman at OpenAI can turn off ChatGPT if it is behaving in an illegal way.

There are some really important questions around that issue. There is the liability for the specific instance of the illegality happening. Who do we hold liable? Even if everyone says that it was not their intention, is there someone that we can hold liable? What should the threshold be at which we can execute that death sentence on the AI? If an AI is being used by millions of people and on a small number of occasions it does something illegal, is that sufficient? At what point do we say that the AI is rogue and that, effectively, it needs to be taken out of operation? Those are much wider questions than we are dealing with immediately in the Bill, but I hope that the Minister can at least point to what the Government are thinking about these kinds of legal questions, as we move from a world of user-to-user engagement to user-to-user-to-machine engagement, when that machine is no longer a creature of the user.

Baroness Berridge (Con)

I have had time just to double-check the offences. The problem that exists—and it would be helpful if my noble friend the Minister could confirm this—is that the criminal law is defined in terms of the person. It is not automatic that sexual harassment, particularly if you do not have a haptic suit on, would fall within the criminal law, as far as I understand it, which is why I am asking the Minister to clarify. That was the point that I was making. Harassment per se also needs a course of conduct, so a single touch of your avatar of a sexual nature clearly falls outside the criminal law. That is the point of clarification that we might need on how the criminal law is framed at the moment.

Lord Allan of Hallam (LD)

I am grateful to the noble Baroness. That is very helpful.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I believe it will. Certainly, both government and Parliament will take into account judgments in the court on this Bill and in related areas of law, and will, I am sure, want to respond.

Baroness Berridge (Con)

It is not just the judgments of the courts; it is about how the criminal law as a very basic point has been framed. I invite my noble friend the Minister to please meet with the Dawes Centre, because it is about future crime. We could end up with a situation in which more and more violence, particularly against women and girls, is being committed in this space, and although it may be that the Bill has made it regulated, it may not fall within the province of the criminal law. That would be a very difficult situation for our law to end up in. Can my noble friend the Minister please meet with the Dawes Centre to talk about that point?

Lord Parkinson of Whitley Bay (Con)

I am happy to reassure my noble friend that the director of the Dawes Centre for Future Crime sits on the Home Office’s Science Advisory Council, whose work is very usefully fed into the work being done at the Home Office. Colleagues at the Ministry of Justice keep criminal law under constant review, in light of research by such bodies and what we see in the courts and society. I hope that reassures my noble friend that the points she raised, which are covered by organisations such as the Dawes Centre, are very much in the mind of government.

The noble Lord, Lord Allan of Hallam, explained very effectively the nuances of how behaviour translates to the virtual world. He is right that we will need to keep both offences and the framework under review. My noble friend Lady Berridge asked a good and clear question, to which I am afraid I do not have a similarly concise answer. I can reassure her that generated child sexual abuse and exploitation material is certainly illegal, but she asked about sexual harassment via a haptic suit; that would depend on the specific circumstances. I hope she will allow me to respond in writing, at greater length and more helpfully, to the very good question she asked.

Under Clause 56, Ofcom will also be required to undertake periodic reviews into the incidence and severity of content that is harmful to children on the in-scope services, and to recommend to the Secretary of State any appropriate changes to regulations based on its findings. Clause 141 also requires Ofcom to carry out research into users’ experiences of regulated services, which will likely include experiences of services such as the metaverse and other online spaces that allow user interaction. Under Clause 147, Ofcom may also publish reports on other online safety matters.

The questions posed by the noble Lord, Lord Russell of Liverpool, about international engagement are best addressed in a group covering regulatory co-operation, which I hope we will reach later today. I can tell him that we have introduced a new information-sharing gateway for the purpose of sharing information with overseas regulators, to ensure that Ofcom can collaborate effectively with its international counterparts. That builds on existing arrangements for sharing information that underpin Ofcom’s existing regulatory regimes.

The amendments tabled by the noble Lord, Lord Knight of Weymouth, relate to providers’ judgments about when content produced by bots is illegal content, or a fraudulent advertisement, under the Bill. Clause 170 sets out that providers will need to take into account all reasonably available relevant information about content when making a judgment about its illegality. As we discussed in the group about illegal content, providers will need to treat content as illegal when this information gives reasonable grounds for inferring that an offence was committed. Content produced by bots is in scope of providers’ duties under the Bill. This includes the illegal content duties, and the same principles for assessing illegal content will apply to bot-produced content. Rather than drawing inferences about the conduct and intent of the user who generated the content, the Bill specifies that providers should consider the conduct and the intent of the person who can be assumed to have controlled the bot at the point it created the content in question.

The noble Lord’s amendment would set out that providers could make judgments about whether bot-produced content is illegal, either by reference to the conduct or mental state of the person who owns the bot or, alternatively, by reference to the person who controls it. As he set out in his explanatory statement and outlined in his speech, I understand he has brought this forward because he is concerned that providers will sometimes not be able to identify the controller of a bot, and that this will impede providers’ duties to take action against illegal content produced by them. Even when the provider does not know the identity of the person controlling the bot, however, in many cases there will still be evidence from which providers can draw inferences about the conduct and intent of that person, so we are satisfied that the current drafting of the Bill ensures that providers will be able to make a judgment on illegality.