Baroness Berridge: debates involving the Department for Digital, Culture, Media & Sport

Baroness Berridge (Con)

My Lords, I rise to support Amendment 241, in the name of the noble Baroness, Lady Finlay, as she mentioned. I also spoke, in a similar vein regarding future-proofing, on the Private Member’s Bill that the noble Baroness previously brought before your Lordships’ House.

The particular issue in Amendment 241 that I wish to address is

“the extent to which new communications and internet technologies allow for behaviours which would be in breach of the law if the equivalent behaviours were committed in the physical world”.

The use of “behaviours” brings into sharp focus the applicability of the Online Safety Bill in the metaverse. Since that Private Member’s Bill, I have learned much about future-proofing from the expert work of the Dawes Centre for Future Crime at UCL. I reached out to the centre as it seemed to me that some conduct and crimes in the physical world would not be criminal if committed in the metaverse.

I will share the example, which seems quite banal, that led me to contact them. The office meeting now takes place in the metaverse. All my colleagues are represented by avatars. My firm has equipped me with the most sophisticated haptic suit. During the meeting, the avatar of one of my colleagues slaps the bum of my avatar. The haptic suit means that I have a physical response to that, to add to the fright and shock. Even without such a suit, I would be shocked and frightened. Physically, I am, of course, working in my own home.

Lord Harlech (Con)

My Lords, I apologise to my noble friend. I ask that we pause the debate to ask this school group to exit the Chamber. We do not think that the subject matter and content will be suitable for that audience. I am very sorry. The House is pausing.

Baroness Finlay of Llandaff (CB)

In this moment while we pause, I congratulate the noble Lord, the Government Whip, for being so vigilant: some of us in the Chamber cannot see the whole Gallery. It is appreciated.

Baroness Berridge (Con)

I, too, thank my noble friend the Government Whip. I apologise too if I have spoken out of discourtesy in the Committee: I was not sure whose name was on which amendment, so I will continue.

Physically, I am, of course, working in my home. If that behaviour had happened in the office, it would be an offence, an assault: “intentional or reckless application of unlawful force to another person”. It will not be an offence in the metaverse and it is probably not harassment because it is not a course of conduct.

Although the basic definition of user-to-user content covers the metaverse, as does the definition of “encountering” content under Clause 207, which, as has been mentioned, is broad enough to cover haptic suits, the restriction to illegal content could be problematic, because the metaverse is a complex of live interactions that mimics real life and its behaviours, including criminal ones. Also, the avatar of an adult could sexually assault the avatar of a child in the metaverse, and with haptic technologies this would not be just a virtual experience. Potentially even more fundamentally than Amendment 125, the Bill is premised on the internet being a solely virtual environment when it comes to content that can harm. But what I am seeking to outline is that conduct can also harm.

I recognise that we cannot catch everything in this Bill at this moment. This research is literally hot off the press; it is only a few weeks old. At the very least, it highlights the need for future-proofing. I am aware that some of the issues I have highlighted about the fundamental difference between conduct and content refer to clauses noble Lords may already have debated. However, I believe that these points are significant; it is just happenstance that the research came out when it did. I would be grateful if the Minister would meet the Dawes Centre urgently to consider whether there are further changes the Government need to make to the Bill to ensure that it covers the harms I have outlined.

Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 195, 239 and 263. I also strongly support Amendment 125 in the name of my noble friend Lady Kidron.

During this Committee there have been many claims that a group of amendments is the most significant, but I believe that this group is the most significant. This debate comes after the Prime Minister and the Secretary of State for Science and Technology met the heads of leading AI research companies in Downing Street. The joint statement said:

“They discussed safety measures … to manage risks”

and called for

“international collaboration on AI safety and regulation”.

Surely this Bill is the obvious place to start responding to those concerns. If we do not future-proof this Bill against the changes in digital technology, which are ever increasing at an ever-faster rate, it will be obsolete even before it is implemented.

My greatest concern is the arrival of AI. The noble Baroness, Lady Harding, has reminded us of the warnings from the godfather of AI, Geoffrey Hinton. If he is not listened to, who on earth should we be listening to? I wholeheartedly support Amendment 125. Machine-generated content is present in so much of what we see on the internet, and its presence is increasing daily. It is the future, and it must be within scope of this Bill. I am appalled by the examples that the noble Baroness, Lady Harding, has brought before us.

In the Communications and Digital Committee inquiry on regulating the internet, we decided that horizon scanning was so important that we called for a digital authority to be created which would look for harms developing in the digital world, assess how serious a threat they posed to users and develop a regulated response. The Government did not take up these suggestions. Instead, Ofcom has been given the onerous task of enforcing the triple shield which under this Bill will protect users to different degrees into the future.

Amendment 195 in the name of the right reverend Prelate the Bishop of Oxford will ensure that Ofcom has knowledge of how well the triple shield is working, which must be essential. Surveys of thousands of users undertaken by companies such as Kantar give an invaluable snapshot of what is concerning users now. These must be fed into research by Ofcom to ensure that future developments across the digital space are monitored, updated and brought to the attention of the Secretary of State and Parliament on a regular basis.

Amendment 195 will reveal trends in harms which might not be picked up by Ofcom under the present regime. It will look at the risk arising for individuals from the operation of Part 3 services. Clause 12 on user empowerment duties has a list of content and characteristics from which users can protect themselves. However, the characteristics for which or content with which users can be abused will change over time and these changes need to be researched, anticipated and implemented.

This Bill has proved in its long years of gestation that it takes time to change legislation, while changes on the internet take just minutes or are already here. The regime set up by these future-proofing amendments will at least go some way to protecting users from these fast-evolving harms. I stress to your Lordships’ Committee that this is very much precautionary work. It should be used to inform the Secretary of State of harms which are coming down the line. I do not think it will give power automatically to expand the scope of harms covered by the regime.

Amendment 239 inserts a new clause for an Ofcom future management of risks review. This will help feed into the Secretary of State’s review regime set out in Clause 159. Clause 159(3)(a) currently looks at ensuring that regulated services are operating using systems and processes which, so far as relevant, are minimising the risk of harms to individuals. The wording appears to mean that the Secretary of State will be viewing all harms to individuals. I would be grateful if the Minister could explain to the Committee the scope of the harms set out in Clause 159(3)(a)(i). Are they meant to cover only the harms of illegality and harms to children, or are they part of a wider examination of the harms regime to see whether it needs to be contracted or expanded? I would welcome an explanation of the scope of the Secretary of State’s review.

The real aim of Amendment 263 is to ensure that the Secretary of State looks at research work carried out by Ofcom. I am not sure how politicians will come to any conclusions in the Clause 159 review unless they are required to look at all the research published by Ofcom on future risk. I would like the Minister to explain what research the Secretary of State would rely on for this review unless this amendment is accepted. I hope Amendment 263 will also encourage the Secretary of State to look at possible harms not only from content, but also from the means of delivering this content.

This aim was the whole point of Amendment 261, which has already been debated. However, it needs to be borne in mind when considering that harms come not just from content, but also from the machine technology which delivers it. Every day we read about new developments and threats posed by a fast-evolving internet. Today it is concerns about ChatGPT and the race for the most sophisticated artificial intelligence. The amendments in this group will provide much-needed reinforcement to ensure that the Online Safety Bill remains a beacon for continuing safety online.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, on behalf of my noble friend Lord Clement-Jones, I will speak in support of Amendments 195, 239, 263 and 286, to which he added his name. He wants me to thank the Carnegie Trust and the Institution of Engineering and Technology, which have been very helpful in flagging relevant issues for the debate.

Some of the issues in this group of amendments will range much more widely than simply the content we have before us in the Online Safety Bill. The right reverend Prelate the Bishop of Chelmsford is right to flag the question of a risk assessment. People are flagging to us known risks. Once we have a known risk, it is incumbent on us to challenge the Minister to see whether the Government are thinking about those risks, regardless of whether the answer is something in the Online Safety Bill or that there needs to be amendments to wider criminal law and other pieces of legislation to deal with it.

Some of these issues have been dealt with for a long time. If you go back and look at the Guardian for 9 May 2007, you will see the headline,

“Second Life in virtual child sex scandal”.
That case was reported in Germany about child role-playing in Second Life, which is very similar to the kind of scenarios described by various noble Lords in this debate. If Second Life was the dog that barked but did not bite, we are in quite a different scenario today, not least because of the dramatic expansion in broadband technology, for which we can thank the noble Baroness, Lady Harding, in her previous role. Pretty much everybody in this country now has incredible access, at huge scale, to high-speed broadband, which allows those kinds of real life, metaverse-type environments to be available to far more people than was possible with Second Life, which tended to be confined to a smaller group.

The amendments raise three significant groups of questions: first, on scope, and whether the scope of the Online Safety Bill will stretch to what we need; secondly, on behaviour, including the kinds of new behaviours, which we have heard described, that could arise as these technologies develop; and, finally, on agency, which speaks to some of the questions raised by the noble Baroness, Lady Fox, on AIs, including the novel questions about who is responsible when something happens through the medium of artificial intelligence.

On scope, the key question is whether the definition of “user-to-user”, which is at the heart of the Bill, covers everything that we would like to see covered by the Bill. Like the noble Baroness, Lady Harding, I look forward to the Minister’s response; I am sure that he has very strongly prepared arguments on that. We should take a moment to give credit to the Bill’s drafters for coming up with these definitions for user-to-user behaviours, rather than using phrases such as, “We are regulating social media or specific technology”. It is worth giving credit, because a lot of thought has gone into this, over many years, with organisations such as the Carnegie Trust. Our starting point is a better starting point than many other legislative frameworks which list a set of types of services; we at least have something about user-to-user behaviours that we can work with. Having said that, it is important that we stress-test that definition. That is what we are doing today: we are stress-testing, with the Minister, whether the definition of “user-to-user” will still apply in some of the novel environments.

It certainly seems likely—and I am sure that the Minister will say this—that a lot of metaverse activity would be in scope. But we need detailed responses from the Minister to explain why the kinds of scenario that have been described—if he believes that this is the case; I expect him to say so—would mean that Ofcom would be able to demand things of a metaverse provider under the framework of the user-to-user requirements. Those are things we all want to see, including the risk assessments, the requirement to keep people away from illegal content, and any other measures that Ofcom deems necessary to mitigate the risks on those platforms.

It will certainly be useful for the Minister to clarify one particular area. Again, we are fortunate in the UK that pseudo-images of child sexual abuse are illegal and have been illegal for a long time. That is not the case in every country around the world, and the noble Lord, Lord Russell, is quite right to say that this is an area where we need international co-operation. From dealing with this on the platforms, I know that some countries have actively chosen not to criminalise pseudo-images; others just have not considered it.

In the UK, we were ahead of the game in saying, “If it looks like a photo of child abuse, we don’t care whether you created it on Photoshop, or whatever—it is illegal”. I hope that the Minister can confirm that avatars in metaverse-type environments would fall under that definition. My understanding is that the legislation refers to photographs and videos. I would interpret an avatar or activity in a metaverse as a photo or video, and I hope that is what the Government’s legal officers are doing.

Again, it is important in the context of this debate and the exchange that we have just had between the noble Baronesses, Lady Harding and Lady Fox, that people out there understand that they do not get away with it. If you are in the UK and you create a child sexual abuse image, you can be taken to court and go to prison. People should not think that, if they do it in the metaverse, it is okay—it is not okay, and it is really important that that message gets out there.

This brings us to the second area of behaviours. Again, some of the behaviours that we see online will be extensions of existing harms, but some will be novel, based on technical capabilities. Some of them we should just call by their common or garden term, which is sexual harassment. I was struck by the comments of the noble Baroness, Lady Berridge, on this. If people go online and start approaching other people in sexual terms, that is sexual harassment. It does not matter whether it is happening in a physical office, on public transport, on traditional social media or in the metaverse—sexual harassment is wrong and, particularly when directed at minors, a really serious offence. Again, I hope that all the platforms recognise that and take steps to prevent sexual harassment on their platforms.

That is quite a lot of the activity that people are concerned about, but others are much more complex and may require updates to legislation. Those are particularly activities such as role-playing online, where people play roles and carry out activities that would be illegal if done in the real world. That is particularly difficult when it is done between consenting adults, when they choose to carry out a role-playing activity that replicates an illegal activity were it to take place in the real world. That is hard—and those with long memories may remember a group of cases around Operation Spanner in the 1990s, whereby a group of men was prosecuted for consensual sadomasochistic behaviour. The case went backwards and forwards, but it talked to something that the noble Baroness, Lady Fox, may be sympathetic to—the point at which the state should intervene on sexual activities that many people find abhorrent but which take place between consenting adults.

In the context of the metaverse, I see those questions coming front and centre again. There are all sorts of things that people could role-play in the metaverse, and we will need to take a decision on whether the current legislation is adequate or needs to be extended to cater for the fact that it now becomes a common activity. Also important is the nature of it. The fact that it is so realistic changes the nature of an activity; you get a gut feeling about it. The role-playing could happen today outside the metaverse, but once you move it in there, something changes. Particularly when children are involved, it becomes something that should be a priority for legislators—and it needs to be informed by what actually happens. A lot of what the amendments seek to do is to make sure that Ofcom collects the information that we need to understand how serious these problems are becoming and whether they are, again, something that is marginal or something that is becoming mainstream and leading to more harm.

The third and final question that I wanted to cover is the hardest one—the one around agency. That brings us to thinking about artificial intelligence. When we try to assign responsibility for inappropriate or illegal behaviour, we are normally looking for a controlling mind. In many cases, that will hold true online as well. I know that the noble Lord, Lord Knight of Weymouth, is looking at bots—and with a classic bot, you have a controlling mind. When the bots were distributing information in the US election on behalf of Russia, that was happening on behalf of individuals in Russia who had created those bots and sent them out there. We still had a controlling mind, in that instance, and a controlling mind can be prosecuted. We have that in many instances, and we can expect platforms to control them and expect to go after the individuals who created the bots in the same way that we would go after things that they do as a first party. There is a lot of experience in the fields of spam and misinformation, where “bashing the bots” is the daily bread and butter of a lot of online platforms. They have to do it just to keep their platforms safe.

We can also foresee a scenario with artificial intelligence whereby it is less obvious that there is a controlling mind or who the controlling mind should be. I can imagine a situation whereby an artificial intelligence has created illegal content, whether that is child sexual abuse material or something else that is in the schedule of illegal content in the Bill, without the user having expected it to happen or the developer having believed or contemplated that it could happen. Let us say that the artificial intelligence goes off and creates something illegal, and that both the user and the developer can show the question that they asked of the artificial intelligence and show how they coded it, showing that neither of them intended for that thing to happen. In the definition of artificial intelligence, it has its own agency in that scenario. The artificial intelligence cannot be fined or sent to prison. There are some things that we can do: we can try to retrain it, or we can kill it. There is always a kill switch; we should never forget that with artificial intelligence. Sam Altman at OpenAI can turn off ChatGPT if it is behaving in an illegal way.

There are some really important questions around that issue. There is the liability for the specific instance of the illegality happening. Who do we hold liable? Even if everyone says that it was not their intention, is there someone that we can hold liable? What should the threshold be at which we can execute that death sentence on the AI? If an AI is being used by millions of people and on a small number of occasions it does something illegal, is that sufficient? At what point do we say that the AI is rogue and that, effectively, it needs to be taken out of operation? Those are much wider questions than we are dealing with immediately in the Bill, but I hope that the Minister can at least point to what the Government are thinking about these kinds of legal questions, as we move from a world of user-to-user engagement to user-to-user-to-machine engagement, when that machine is no longer a creature of the user.

Baroness Berridge (Con)

I have had time just to double-check the offences. The problem that exists—and it would be helpful if my noble friend the Minister could confirm this—is that the criminal law is defined in terms of a person. It is not automatic that sexual harassment, particularly if you do not have a haptic suit on, would fall within the criminal law, as far as I understand it, which is why I am asking the Minister to clarify. That was the point that I was making. Harassment per se also requires a course of conduct, so a single touch of your avatar of a sexual nature would clearly fall outside the criminal law. That is the point of clarification that we might need on how the criminal law is currently framed.

Lord Allan of Hallam (LD)

I am grateful to the noble Baroness. That is very helpful.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I believe it will. Certainly, both government and Parliament will take into account judgments in the court on this Bill and in related areas of law, and will, I am sure, want to respond.

Baroness Berridge (Con)

It is not just the judgments of the courts; it is about how the criminal law as a very basic point has been framed. I invite my noble friend the Minister to please meet with the Dawes Centre, because it is about future crime. We could end up with a situation in which more and more violence, particularly against women and girls, is being committed in this space, and although it may be that the Bill has made it regulated, it may not fall within the province of the criminal law. That would be a very difficult situation for our law to end up in. Can my noble friend the Minister please meet with the Dawes Centre to talk about that point?

Lord Parkinson of Whitley Bay (Con)

I am happy to reassure my noble friend that the director of the Dawes Centre for Future Crime sits on the Home Office’s Science Advisory Council, whose work is very usefully fed into the work being done at the Home Office. Colleagues at the Ministry of Justice keep criminal law under constant review, in light of research by such bodies and what we see in the courts and society. I hope that reassures my noble friend that the points she raised, which are covered by organisations such as the Dawes Centre, are very much in the mind of government.

The noble Lord, Lord Allan of Hallam, explained very effectively the nuances of how behaviour translates to the virtual world. He is right that we will need to keep both offences and the framework under review. My noble friend Lady Berridge asked a good and clear question, to which I am afraid I do not have a similarly concise answer. I can reassure her that generated child sexual abuse and exploitation material is certainly illegal, but she asked about sexual harassment via a haptic suit; that would depend on the specific circumstances. I hope she will allow me to respond in writing, at greater length and more helpfully, to the very good question she asked.

Under Clause 56, Ofcom will also be required to undertake periodic reviews into the incidence and severity of content that is harmful to children on the in-scope services, and to recommend to the Secretary of State any appropriate changes to regulations based on its findings. Clause 141 also requires Ofcom to carry out research into users’ experiences of regulated services, which will likely include experiences of services such as the metaverse and other online spaces that allow user interaction. Under Clause 147, Ofcom may also publish reports on other online safety matters.

The questions posed by the noble Lord, Lord Russell of Liverpool, about international engagement are best addressed in a group covering regulatory co-operation, which I hope we will reach later today. I can tell him that we have introduced a new information-sharing gateway for the purpose of sharing information with overseas regulators, to ensure that Ofcom can collaborate effectively with its international counterparts. That builds on existing arrangements for sharing information that underpin Ofcom’s existing regulatory regimes.

The amendments tabled by the noble Lord, Lord Knight of Weymouth, relate to providers’ judgments about when content produced by bots is illegal content, or a fraudulent advertisement, under the Bill. Clause 170 sets out that providers will need to take into account all reasonably available relevant information about content when making a judgment about its illegality. As we discussed in the group about illegal content, providers will need to treat content as illegal when this information gives reasonable grounds for inferring that an offence was committed. Content produced by bots is in scope of providers’ duties under the Bill. This includes the illegal content duties, and the same principles for assessing illegal content will apply to bot-produced content. Rather than drawing inferences about the conduct and intent of the user who generated the content, the Bill specifies that providers should consider the conduct and the intent of the person who can be assumed to have controlled the bot at the point it created the content in question.

The noble Lord’s amendment would set out that providers could make judgments about whether bot-produced content is illegal, either by reference to the conduct or mental state of the person who owns the bot or, alternatively, by reference to the person who controls it. As he set out in his explanatory statement and outlined in his speech, I understand he has brought this forward because he is concerned that providers will sometimes not be able to identify the controller of a bot, and that this will impede providers’ duties to take action against illegal content produced by them. Even when the provider does not know the identity of the person controlling the bot, however, in many cases there will still be evidence from which providers can draw inferences about the conduct and intent of that person, so we are satisfied that the current drafting of the Bill ensures that providers will be able to make a judgment on illegality.

Online Safety Bill

Thursday 25th May 2023

Lords Chamber
Research by Bumble has found alarming statistics, including that: 35% of women have received an unsolicited nude image at work; 27% of women have received an unsolicited nude image when on public transport; one in five women have received unsolicited nude images when walking down the street; and almost half of 18 to 24 year-olds have received a sexual image that they did not consent to. Let us make the law clearer and stronger by basing the offence on consent. If anyone has not consented to receiving sexual images then it should be an offence. I therefore urge that Clause 167 is left out and replaced with the new clause as proposed by the noble Baroness, Lady Featherstone.
Baroness Berridge (Con)

My Lords, I am grateful to noble Lords who have added their names to my Amendment 271, which arose out of concerns that there are now seemingly several offences that laudably aim to protect women but are not being enforced effectively. The most notable in this category is the low rate of rape cases that are prosecuted and lead to convictions. In theory, the amendment is not affected by the definition of cyberflashing, whether it is in the form recommended by the Law Commission, based on specific intent, or based on consent. However, if the offence remains in the specific-intent form, the victim will not be required to go to court, so in practice the amendment would be more effective if the offence remained on that basis. Yet even if the victim does not need to go to court, someone who has been cyberflashed is, as other noble Lords have mentioned, unlikely to go to the police station to report what has happened.

This amendment is designed to put an obligation on the providers of technology to provide a reporting mechanism on phones and to collate that information before passing it to the prosecuting authorities. The Minister said that there are various issues with how the amendment is currently drafted, such as “the Crown Prosecution Service” rather than “the police”, and perhaps the definition of “providers of internet services” as it may be a different part of the tech industry that is required to collate this information.

Drawing on our discussions on the previous group of amendments regarding the criminal law here, I hope that my noble friend can clarify the issues of intent, which is mens rea and different from motive in relation to this matter. The purpose of the amendment is to ensure that there will be resources and expertise from the technology sector to provide these reporting mechanisms for the offences. One can imagine how many people will report cyberflashing if they only have to click on an app, or if their phone is enabled to retain such an image, since some of them disappear after a short while. You should be able to sit on the bus and report it. The tech company would then store and collate that information, in a manner from which patterns could become clear. For instance—because, as we have just heard, this happens so much—if six people on the 27 bus report multiple times a week that they have received the same image, that would prompt the police to get the CCTV from the bus company to identify who this individual is, if the tech company data did not provide that specificity. Or, is someone hanging out every Friday night at the A&E department and cyberflashing as they sit there? This is not part of the amendment, but such an app or mechanism could also include a reminder to change the security settings on your phone so that you cannot be AirDropped.

I hope that His Majesty’s Government will look at the purpose of this amendment. It is laudable that we are making cyberflashing an offence, but this amendment is about the enforcement of that offence and will support that. Only with such an easy mechanism to report it can what will be a crime be effectively policed.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I, too, wish the noble Baroness, Lady Featherstone, a very speedy recovery. Her presence here today is missed, though the amendments were very ably moved by the noble Baroness, Lady Burt. Having worked in government with the noble Baroness, Lady Featherstone, I can imagine how frustrated she is at not being able to speak today on amendments bearing her name.

As my noble friend said, this follows our debate on the wider issues around violence against women and girls in the online world. I do not want to repeat anything that was said there, but I am grateful to him for the discussions that we have had since. I support the Government in their introduction of Amendment 135A and the addition of controlling or coercive behaviour to the priority offences list. I will also speak to the cyberflashing amendments and Amendment 271, introduced by my noble friend Lady Berridge.

I suspect that many of us speaking in this debate today have had briefings from the wonderful organisation Refuge, which has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As a result of this, Refuge pioneered a specialist technology-facilitated domestic abuse team, which uses expertise to support survivors and to identify emerging trends of online domestic abuse.

I draw noble Lords’ attention to a publication released since we debated this last week: the National Police Chiefs’ Council’s violence against women and girls strategic threat risk assessment for 2023, in which a whole page is devoted to tech and online-enabled violence against women and girls. In its conclusions, it says that one of the key threats is tech-enabled VAWG. The fact that we are having to debate these specific offences, but also the whole issue of gendered abuse online, shows how huge an issue this is for women and girls.

I will start with Amendment 271. I entirely agree with my noble friend about the need for specific user reporting and making that as easy as possible. That would support the debate we had last week about the code of practice, which would generally require platforms and search engines to think from the start how they will enable those who have been abused to report that abuse as easily as possible, so that the online platforms and search engines can then gather that data to build up a picture and share it with the regulator and law enforcement as appropriate. So, while I suspect from what the Minister has said that he will not accept this amendment, the points that my noble friend made are absolutely necessary in this debate.

I move on to the cyberflashing amendment. It has been very ably covered already, so I do not want to say too much. It is clear that women and girls experience harms regardless of the motives of the perpetrator. I also point out that, as we have heard, motivations are very difficult to prove, meaning that prosecutions are often extremely unlikely.

I was very proud to introduce the amendments to what became the Domestic Abuse Act 2021. It was one of my first contributions in this House. I remember that, in the face of a lockdown, most of us were working virtually. But we agreed, and the Government introduced, amendments on intimate image abuse and revenge porn. Even as I proposed those amendments and they were accepted, it was clear that they were not quite right and did not go far enough. As we have heard, for the intimate image abuse proposals, the Law Commission is proposing a consent-based image abuse offence. Can my noble friend be even clearer—I am sorry that I was not able to attend the briefing—about the distinction between consent-based intimate image abuse offences and motive-based cyberflashing offences, and why the Government decided to make it?

I also gently point out to him that I know that this is complicated, but we are still waiting for drafting of the intimate image abuse offences. We are potentially running out of time. Perhaps we will see them at the next stage of the Bill—unless he reveals them like a rabbit out of a hat this afternoon, which I suspect is not the case. These are important offences and it will be important for us to see the detail so that we can scrutinise them properly.

Finally, in welcoming the Government’s amendment on coercive control, I say that it is generally poorly understood by technology companies. Overall, the use of the online world to perpetrate abuse on women and girls, particularly in the domestic abuse context, is certainly being understood more quickly, but we are all playing catch-up in how this happens while the perpetrators are running ahead of us. More can be done to recognise the ways that the online world can be used to abuse and intimidate victims, as the Government have recognised with this amendment and as the noble Baroness, Lady Gohir, said. It is very necessary in debating the Bill. I look forward to hearing the Minister’s remarks at the end of this debate.

Baroness Benjamin Portrait Baroness Benjamin (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.

As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.

The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.

While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Up until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data. It has perhaps been deemed that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House the statistics of how children are accessing pornography and the harm it causes. The Children’s Commissioner also recently highlighted the issue and concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be to ensure that children are protected online.

I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.

Children must be kept safe wherever they are online. This Bill must have the widest scope possible to keep children safe, but ensuring online/offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far reaching as possible but consistently applied across the online/offline world. These are the reasons why I support the amendments in this group.

Baroness Berridge Portrait Baroness Berridge (Con)
- View Speech - Hansard - -

My Lords, I will lend my support to Amendments 19 and 22. It is a pleasure to speak after the noble Baroness, Lady Benjamin. I may be one of those people in your Lordships’ House who relies significantly on the British Board of Film Classification for movie watching, as I am one of the faint-hearted.

In relation to app stores, it is not just children under 18 for whom parents need the age verification. If you are a parent of a child who has significant learning delay, the internet is a wonderful place where they can get access to material and have development that they might not ordinarily have had. But, of course, turning 17 or 18 is not the threshold for them. I have friends who have children with significant learning delay. Having that assurance, so they know which apps are which in the app store, goes well beyond 18 for them. Obviously it will not be a numerical equivalent for their child—now a young adult—but it is important to them to know that the content they get on a free app or an app purchased from the app store is suitable.

I just wanted to raise that with noble Lords, as children and some vulnerable adults—not all—would benefit from the kind of age verification that we have talked about. I appreciate the points that the noble Lord, Lord Allan, raised about where the Bill has ended up conceptually and the framework that Ofcom will rely on. Like him, I am a purist sometimes but, pragmatically, I think that the third concept raised by the noble Baroness, Lady Kidron, about protection and putting this in the app store and bringing it parallel with things such as classification for films and other video games is really important.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this has been a really fascinating debate and I need to put a stake in the ground pretty early on by saying that, although my noble friend Lord Allan has raised some important points and stimulated an important debate, I absolutely agree with the vast majority of noble Lords who have spoken in favour of the amendment so cogently put forward by the noble Baronesses, Lady Kidron and Lady Harding.

Particularly as a result of the Bill’s being the subject of a Joint Committee, it has changed considerably over time in response to comment, pressure, discussion and debate and I believe very much that during Committee stage we will be able to make changes, and I hope the Minister will be flexible enough. I do not believe that the framework of the Bill is set in concrete. There are many things we can do as we go through, particularly in the field of making children safer, if we take some of the amendments that have been put forward on board. In particular, the noble Baroness, Lady Kidron, set out why the current scope of the Bill will fail to protect children if it is kept to user-to-user and search services. She talked about blogs with limited functionalities, gaming without user functionalities and mentioned the whole immersive environment, which the noble Lord, Lord Russell, described as eye-watering. As she said, it is not fair to leave parents or children to work out whether they are on a regulated service. Children must be safe wherever they are online.

As someone who worked with the noble Baroness, Lady Kidron, in putting the appropriate design code in place in that original Data Protection Act, I am a fervent believer that it is perfectly appropriate to extend in the way that is proposed today. I also support her second amendment, which would bring the Bill’s child user condition in line with the threshold of the age-appropriate design code. It is the expectation—I do not think it an unfair expectation—of parents, teachers and children themselves that the Bill will apply to children wherever they are online. Regulating only certain services will mean that emerging technologies that do not fit the rather narrow categories will not be subject to safety duties.

OFCOM (Duty regarding Prevention of Serious Self-harm and Suicide) Bill [HL]

Baroness Berridge Excerpts
Baroness Berridge Portrait Baroness Berridge (Con)
- View Speech - Hansard - -

My Lords, I too am very grateful to the noble Baroness, Lady Finlay, for introducing this Private Member’s Bill, which supplements the lengthy Online Safety Bill that your Lordships’ House discussed earlier this week. That Bill would set up Ofcom as an online safety regulator.

At first, I thought that this Bill was “getting on the front foot” legislation, but it is more aptly “keeping us on the front foot” legislation, when arguably we have been on the back foot for so long. It is not about censoring content before it is online but about ensuring that Ofcom is keeping the Government, Parliament and the public up to date with what is happening online in terms of self-harm and suicide content.

The Bill would ensure that the Government get both advice on the effectiveness of regulations and recommendations from Ofcom. Importantly, it would ensure that we do not get into a stop-start pattern of reviews when we have cases of self-harm and suicide. Reviews are often triggered only by a terrible tragedy and the comments of the coroner. That puts real pressure on a family and puts them through additional pain. If the Government knew that Ofcom had this role of recommendation and monitoring content, then it would be the body that they would go to and there would be a regular pattern of reporting to government. We know that the internet and technology are always developing, so we need a vehicle to keep us abreast of this.

When we legislate, I always look for precedent and analogy. This role for Ofcom would be akin to the role that the Advisory Council on the Misuse of Drugs has in relation to the Home Office. That council keeps under review the situation of drugs which appear to be being misused. We saw it respond nimbly to the swift development of legal highs by establishing the novel psychoactive substances committee. In that context, the Government cannot wait for legislation or statutory instruments to deal with these fast-changing chemical developments. The body proposed in the Bill would enable us, to some extent, to keep pace with developments on the internet.

I understand that His Majesty’s Government have committed to introducing an additional offence of encouraging and assisting self-harm. When it comes to the notices and penalties under the Online Safety Bill, obviously some firms will have our best lawyers looking at cases. I am not in that category, but might there be arguments about whether self-harm, with “self” meaning “the human person”, would cover content that uses humanoids? It could be argued that they are not too much like human beings at the moment, so putting that kind of content online could not possibly encourage someone to self-harm. However, as they and the evidence on our human response to seeing humanoids through our phones develop, they might be found to encourage self-harm. It is on that kind of development and the evidence behind it that we need recommendations as to whether we should change what the Online Safety Bill covers.

It would also be useful to monitor this content because it will ensure that Ofcom reports to us on what content it feels is within the Online Safety Bill and what content it has decided is outside it. Ofcom may come to us with more recommendations for the Government to consider whether that content should be brought from beyond the Online Safety Bill and into its coverage. However, only if we see this monitoring by Ofcom, as suggested in this Bill, can the Government and Parliament be properly equipped to achieve His Majesty’s Government’s intention of making Britain the safest place to be online in the world.

Football: Illegal Entry to Matches

Wednesday 1st February 2023


Lords Chamber
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

Absolutely—and we are not. As I have explained to the noble Lord before, we have taken action to implement a series of changes to the football banning order legislation with which he was associated when he was in government to help ensure safety at football matches. That included adding football-related online hate crime to the list of offences, amending the threshold for the imposition of a banning order, extending the legislation to the women’s domestic game and adding football-related class A drug crimes to the list of offences. We continue to work with the police and football bodies to review disorder and consider whether any further action is necessary.

Baroness Berridge Portrait Baroness Berridge (Con)
- View Speech - Hansard - -

My Lords, in relation to tailgating, could my noble friend the Minister outline whether the Government are considering making this an offence and making it slightly broader? This happens a lot on the Tube. Particularly as a woman, being tailgated through a barrier by somebody trying to come in behind you means you are virtually assaulted. TfL’s policy is not to do anything, probably because it is not an offence. Could the Minister review this to see whether it should be made an offence not just in football but on the Tube?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My noble friend is right to point to the impact on people being followed through ticket barriers. Fare evasion is a criminal offence and Transport for London publishes its revenue enforcement and prosecutions policy. If convicted, people face a criminal record and a fine of up to £1,000, as well as compensation for the fares they have avoided, a victim surcharge and prosecution costs—so this is something that should not be done.

Suicide: Online Products

Monday 27th June 2022


Lords Chamber
Baroness Berridge Portrait Baroness Berridge (Con)
- Hansard - -

My Lords, my noble friend has mentioned various statutory agencies, but is this not a particular category of legal but harmful content? Assisting suicide is a criminal offence, as is potentially conspiring to assist suicide. Will he ensure that all those statutory bodies involved really relate to the boundaries of the criminal law that exists today? These companies should be ensuring that they are not assisting or conspiring to assist suicide.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My noble friend is right: there are existing criminal sanctions here and content which encourages or assists suicide, and therefore breaks the existing law, will be covered as well by the safety duties providing for illegal content under the Online Safety Bill. We want to ensure that the Bill adds to the armoury that we have to prevent as many suicides as we can.

Birmingham Commonwealth Games Bill [HL]

Tuesday 25th June 2019


Lords Chamber
Baroness Berridge Portrait Baroness Berridge (Con)
- Hansard - -

My Lords, I declare my interest as an executive member of CPA UK and thank my noble friend the Minister for his meeting with interested Peers, which was really useful.

Although the hosting of the Commonwealth Games by Birmingham in 2022 is undoubtedly great news for the West Midlands region, or what is now called the midlands engine—I am from the East Midlands, so I prefer that phrase—I am sure that noble Lords will share my regret that that is because Durban in South Africa was unable to do so. In its modern format since 1978, the Commonwealth Games has been held outside the ABC and Z countries of the Commonwealth only twice: in 1998 in Kuala Lumpur and in 2010 in Delhi, India. This is regrettable, as this is the Games of 53 nations, 94% of whose people live in Asia or Africa. I hope that the Games federation will look at how the Games, which are the part of the Commonwealth most known to many people, can go to every corner of the Commonwealth.

I hope that the Bill’s framework for the protection of commercial rights, creating civil offences for a limited period to protect intellectual property rights, will be a template for other nations to use. This could then limit the legislative work for the creation of essential safeguards needed by the next host of the Games. I hope that Her Majesty’s Government are already looking for the legacy of these Games to be an operational structure that is streamlined and will assist the Games moving to countries and continents of the Commonwealth which have yet to host them.

I lived in Manchester during the time of the Commonwealth Games there and I worked for three years at the University of Birmingham on a Commonwealth project. Part of the reason we chose the university for such a project was the fact that, like the Commonwealth, where 60% of the population of 2.4 billion is under the age of 30, Birmingham is the youngest city in Europe, with 40% of the population aged under 25.

Also, much of the migration to the West Midlands is from the Commonwealth. The 2011 census stated that 13.5% of migrants in Birmingham were of Pakistani origin, 6% were Indian and 4.4% were Caribbean. Birmingham is more ethnically diverse than London. It is important to see such an international event outside London. Although I accept that it is not under the direct control of Her Majesty’s Government, much of the budget will come from central government. I therefore hope that the employment opportunities that come with these Games will reflect the diverse nature of the West Midlands population; that is one area where the London Olympics and Paralympics struggled. Ensuring that local people are employed to deliver their Games is really important; it would be disappointing if people were predominantly relocating temporarily from London or elsewhere to take up jobs.

He cannot be here today, but it was good to learn from the right reverend Prelate the Bishop of Birmingham, at the Minister’s meeting, of the support of faith communities for the Games. In the Manchester Commonwealth Games, one of the key services offered by faith communities was the opportunity to host athletes’ families in the homes of local residents, rather than them having to spend money on hotels. Many athletes struggle financially; if their family can afford the flight to the UK to watch them, the accommodation costs can often be one step too far. I hope that my noble friend the Minister will do what he can to nudge the right reverend Prelate to see whether this scheme could be of use in the Birmingham Games.

Often with these large events, what happens in the margins is also valuable. CPA UK is looking at whether to hold an event on the importance of regional governance across the Commonwealth. The Birmingham Games will, I hope, be a great model both of that and of different authorities controlled by different political parties working together. Often in Commonwealth nations, regional governance structures can also deliver changes, such as the state governors in Nigeria. I was grateful to hear from my noble friend the Minister about the transport plan in the Bill. As someone who lives in the West Midlands, I repeat my request: remember that people do not always travel north to south on trains, as the east-west routes can often be problematic, and ensure that this is taken into account in the delivery of the Games.

I hope that looking outside London for such events will remain a focus for Her Majesty’s Government. It was really encouraging that when we last hosted the NATO summit in 2014, it was in Newport, Wales. The year 2022 will be a time of great national celebration, with the platinum jubilee marking the 70th anniversary of Her Majesty’s reign. What more fitting tribute to the Head of the Commonwealth than having the Commonwealth Games in Birmingham?