Online Safety Bill Debate
Baroness Kidron (Crossbench, Life peer)
Lords Chamber

My Lords, a lot of positive and interesting things have been said that I am sympathetic to, but this group of amendments raises concerns about a democratic deficit: if too much of the Bill is either delegated to the Secretary of State or open to interference in relation to the Secretary of State and Ofcom, who decides what those priorities are? I will ask for a couple of points of clarification.
I am glad to see that the term “public policy” has been replaced, because what did that mean? Everything. But I am not convinced that saying that the Secretary of State can decide not just on national security but on public safety and public health is reassuring in the present circumstances. The noble Lord, Lord Allan, has just pointed out what it feels like to be leaned on. We had a very recent example internationally of Governments leaning on big tech companies, in relation to Covid policies, lockdowns and so on, to remove material that was seen to contradict official public health advice—often public health advice that turned out not to be accurate at all. There should at least have been a lot more debate about what were political responses to a terrible virus. Noble Lords will know that censorship became a matter of course during that time, and Governments interfering in or leaning on big tech directly was problematic. I am not reassured that the Government hold to themselves the ability to lean on Ofcom around those issues.
It is also worth remembering that the Secretary of State already has a huge amount of power to designate, as we have discussed previously. They can designate what constitute priority illegal offences and priority content harmful to children, and that can all change beyond what we have discussed here. We have already seen a constant expansion of what those harms can be, and having those decisions made through secondary legislation alone, removed from Parliament and from public scrutiny, really worries me. It is likely to give a green light to every identity group and special interest NGO to demand that the list of priority harms and so on be expanded. That is likely to make the job of the Secretary of State in responding to “something must be done” moral panics all the more difficult. If that is going to happen, we should have parliamentary scrutiny of it; it cannot just be allowed to happen elsewhere.
It is ironic that the Secretary of State, because they are elected, is more democratic than an unelected regulator. I just feel that there is a danger in so much smoke and mirrors. When the Minister very kindly agreed to see the noble Lord, Lord Moylan, and me, I asked in a rather exasperated way why Ofcom could not make freedom of expression a priority, with codes of practice so that it would have to check on freedom of speech. The Minister said, “It’s not up to me to tell Ofcom what to do”, and I thought, “The whole Bill is telling Ofcom what to do”. That did not seem to make any sense.
I had another exchange with the present Secretary of State—again, noble Lords will not be surprised to hear that it was not a sophisticated intervention on my part—in which I said, “Why can’t the Government force the big tech companies to put freedom of expression in their terms and conditions or terms of service?” The Minister said, “They are private companies; we’re not interfering in what they do”. So you just end up thinking, “The whole Bill is telling companies that they’re going to be compelled to act in relation to harm and safety, but not on freedom of expression”. What that means is that you feel all the time as though the Government are saying that they are outsourcing this to third parties, which means that you cannot hold anyone to account.
The civil liberties campaigner Guy Herbert compared this to what is happening with the banks at the moment: they are being blamed by the Government and held to account for things such as politically exposed persons, and terms and conditions that overconcentrate on values such as EDI and ESG, which may be leading to citizens of this country having their bank accounts closed down. The Government say that they will tell the regulator that it has to act and say that the banks cannot behave in this way, but this all came from legislation—it is not as though the regulator was doing it off its own bat. Maybe it overinterpreted the legislation, and the banks then overinterpreted it again and overremoved.
The obvious analogy for me is that there is a danger here that we will not be able to hold anyone to account for overremoval of legitimate democratic discussion from the online world, because everyone is pointing the finger at everyone else. At the very least, the amendments are trying to say that any changes beyond what we have discussed so far on this Bill must come before Parliament. That is very important for any kind of democratic credibility to be attached to this legislation.
My Lords, I too express my admiration to the noble Baroness, Lady Stowell, for her work on this group with the Minister and support the amendments in her name. To pick up on what the noble Baroness, Lady Harding, said about infinite ping-pong, it can be used not only to avoid making a decision but as a form of power and of default decision-making—if you cannot get the information back, you are where you are. That is a particularly important point and I add my voice to those who have supported it.
I have a slight concern that I want to raise in public, so that I have said it once, and get some reassurance from the Minister. New subsection (B1)(d) in Amendment 134 concerns the Secretary of State directing Ofcom to change codes that may affect
“relations with the government of a country outside the United Kingdom”.
Many of the companies that will be regulated sit in America, which has been very forceful about protecting its sector. Without expanding on this too much, when it was suggested that senior managers would face some sort of liability in international fora, various parts of the American Government and state apparatus certainly made their feelings clearly known.
I am sure that the channels between our Government and the US are much more straightforward than any that I have witnessed, but it is absolutely definite that more than one Member of your Lordships’ House was approached about the senior management liability and told, “This is a worry to us”. I believe that where we have landed is very good, but I would like the Minister to say what the limits of that power are and to acknowledge that it could get into a bit of a muddle with the economic outcomes that we were talking about, celebrating that they had been taken off the list, and government relations. That was the thing that slightly worried me in the government amendments, which, in all other ways, I welcome.
My Lords, this has been a consistent theme ever since the Joint Committee’s report. It was reported on by the Delegated Powers and Regulatory Reform Committee, and the Communications and Digital Committee, chaired by the noble Baroness, Lady Stowell, has rightly taken up the issue. Seeing some movement from the Minister, particularly on Clause 29 and specifically in Amendments 134 to 137, is very welcome and consistent with some of the concerns that have been raised by noble Lords.
There are still questions to answer about Amendment 138, which my noble friend has raised. I have also signed the amendments to Clause 38 because I think the timetabling is extremely welcome. However, like other noble Lords, I believe we need Amendments 139, 140, 144 and 145 in place, as proposed by the noble Baroness, Lady Stowell of Beeston. The phrase “infinite ping-pong” makes us all sink into gloom in current circumstances—it is a very powerful phrase. I think the Minister really does have to come back with something better; I hope he will give us that assurance, and that his discussions with the noble Baroness, Lady Stowell, will bear further fruit.
I may not agree with the noble Lord, Lord Moylan, about the Clause 39 issues, but I am glad he raised issues relating to Clause 159. It is notable that, of the recommendations by the Delegated Powers and Regulatory Reform Committee, the Government accepted four out of five but did not accept the one related to what is now Clause 159. I have deliberately de-grouped the questions of whether Clauses 158 and 159 should stand part of the Bill, so I am going to pose a few questions; when we get to the second group, which contains my clause stand part proposition, I hope the Minister will be able to tell me effortlessly what he is going to do. This will save me from putting down further amendments on those clauses, because it seems to me that the Government are being extraordinarily inconsistent in how they are dealing with Clauses 158 and 159 compared with how they have amended Clause 39.
For instance, Clause 158 allows the Secretary of State, where they have reasonable grounds for believing that there is a threat to public health and safety or national security, to issue a direction to Ofcom: they can direct Ofcom to set objectives for how it uses its media literacy powers under Section 11 of the Communications Act for a specific period to address the threat, and can make Ofcom issue a public statement notice. That is rather extraordinary. I will not go into great detail at this stage, and I hope the Minister can avoid me having to make a long speech further down the track, but the Government should not be in a position to direct a media regulator on a matter of content. For instance, the Secretary of State has no powers over Ofcom on the content of broadcast regulation—indeed, they have limited powers to direct over radio spectrum and wires—and there is no provision for parliamentary involvement, although I accept that the Secretary of State must publish reasons for the direction. There is also the general question of whether the threshold is high enough to justify this kind of interference. So Clause 158 is not good news at all. It raises a number of questions which I hope the Minister will start to answer today, and maybe we can avoid a great debate further down the track.
My Lords, I rise briefly to support the noble Baroness, Lady Morgan, to welcome the government amendment and to say that this is a moment of delight for many girls—of all varieties. I echo the noble Baroness, Lady Fox, on the issue of having a broad consultation, which is a good idea. While our focus during the passage of this Bill was necessarily on preventing harm, I hope this guidance will be part of the rather more aspirational and exciting part of the digital world that allows young people to participate in social and civic life in ways that do not tolerate abuse and harm on the basis of their gender. In Committee, I said that we have a duty not to allow digital tech to be regressive for girls. I hope that this is a first step.
My Lords, on behalf of my party, all the groups mentioned by the noble Baroness, Lady Morgan, and potentially millions of women and girls in this country, I briefly express my appreciation for this government amendment. In Committee, many of us argued that a gender-neutral Bill would not achieve strong enough protection for women and girls as it would fail to recognise the gendered nature of online abuse. The Minister listened, as he has on many occasions during the passage of the Bill. We still have differences on some issues—cyberflashing, for instance—but in this instance I am delighted that he is amending the Bill, and I welcome it.
Why will Ofcom be required to produce guidance and not a code, as in the amendment originally tabled by the noble Baroness, Lady Morgan? Is there a difference, or is it a case of a rose by any other name? Is there a timescale by which Ofcom should produce this guidance? Are there any plans to review Ofcom’s guidance once produced, just to see how well it is working?
We all want the same thing: for women and girls to be free to express themselves online and not to be harassed, abused and threatened as they are today.
My Lords, I am most grateful to the noble Lord, Lord Clement-Jones, for tabling the amendment. If I had been quicker, I would have added my name to it, because he may—I use the word “may” advisedly, because I am not sure—have identified quite a serious gap in terms of future-proofing. As far as I understand it, in a somewhat naive way, the amendment probes whether there is a gap between provider-generated content and user-generated content, and whether provider-generated content could lead to a whole lot of ghastly stuff in the metaverse without any way of tackling it because it is deemed to have fallen outside the scope of the Bill.
I am grateful to Carnegie UK for having tried to talk me through this—it is pretty complicated. As a specific example, I understand that a “Decentraland” avatar pops up on gaming sites, and it is useful because it warns you about the dangers of gambling and what it can lead to. But then there is the problem about the backdrop to this avatar: at the moment, it seems to be against gambling, but you can see how those who have an interest in gambling would be quite happy to have the avatar look pretty hideous but have a backdrop of a really enticing casino with lots of lights and people streaming in, or whatever. I am not sure where that would fit, because it seems that this type of content would be provider-generated. When it comes to the metaverse and these new ways of interacting with 3D immersion, I am not clear that we have adequately caught within the Bill some of these potentially dangerous applications. So I hope that the Minister will be able to clarify it for us today and, if not, possibly to write between now and the next time that we debate this, because I have an amendment on future-proofing, but it is in a subsequent group.
My Lords, I am interested to hear what the Minister says, but could he also explain to the House the difference in status of this sort of material in Part 5 versus Part 3? I believe that the Government brought in a lot of amendments that sorted it out and that many of us hoped were for the entire Bill, although we discovered, somewhat to our surprise, that they were only in Part 5. I would be interested if the Minister could expand on that.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content
“generated directly on the service by a user”,
which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content
“uploaded to or shared on the service by a user”,
which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.
A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.
Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—which is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect the user from these virtual characters interacting with users in virtual worlds?
We talked about bots controlled by service providers before; the noble Lord, Lord Knight, asked questions on this. The Bill is designed to make online service providers responsible for the safety of their users in light of harmful activities that their platforms might facilitate. Providers of a user-to-user service will need to adhere to their duties of care, which apply to all user-generated content present on their service. The Bill does not, however, regulate content published by user-to-user providers themselves. That is because the providers are liable for the content they publish on the service themselves. The one exception to this—as the noble Baroness, Lady Kidron, alluded to in her contribution—is pornography, which poses a particular risk to children and is regulated by Part 5 of the Bill.
I am pleased to reassure the noble Lord, Lord Clement-Jones, that the Bill—
I thank the noble Lord for giving way. The Minister just said that providers will be responsible for the content they publish themselves. I would love to understand what mechanism makes a provider responsible for that content.
I will write to noble Lords with further information and will make sure that I have picked up correctly the questions that they have asked.
On Amendment 152A, which the noble Lord, Lord Clement-Jones, has tabled, I am pleased to assure him that the Bill already achieves the intention of the amendment, which seeks to add characters and objects that might interact with users in the virtual world to the Bill’s definition of user-generated content. Let me be clear again: the Bill already captures any service that facilitates online user-to-user interaction, including in the metaverse or other augmented reality or immersive online worlds.
The Bill broadly defines “content” as
“anything communicated by means of an internet service”,
so it already captures the various ways in which users may encounter content. Clause 211 makes clear that “encounter” in relation to content for the purposes of the Bill means to,
“read, view, hear or otherwise experience”
content. That definition extends to the virtual worlds which noble Lords have envisaged in their contributions. It is broad enough to encompass any way of encountering content, whether that be audio-visually or through online avatars or objects.
In addition, under the Bill’s definition of “functionality”,
“any feature that enables interactions of any description between users of the service”
will be captured. That could include interaction between avatars or interaction by means of an object in a virtual world. All in-scope services must therefore consider a range of functionalities as part of their risk assessment and must put in place any necessary measures to mitigate and manage any risks that they identify.
I hope that that provides some assurance to the noble Lord that the concerns that he has raised are covered, but I shall happily write on his further questions before we reach the amendment that the noble Baroness, Lady Finlay, rightly flagged in her contribution.
My Lords, I strongly support Amendment 180, tabled by the noble Baroness, Lady Merron. I will also explain why I put forward Amendment 180A. I pay tribute to the noble Baroness, Lady Hayman, who pursued this issue with considerable force through her Question in the House.
There is clearly an omission in the Bill. One of its primary aims is to protect children from harmful online content, and animal cruelty content causes harm to the animals involved and, critically, to the people who view it, especially children. In Committee, in the Question and today, we have referred to the polling commissioned by the RSPCA, which found that 23% of 10 to 18 year-olds had seen animal cruelty on social media sites. I am sure that the numbers have increased since that survey in 2018. A study published in 2017 found—if evidence were needed—that:
“There is emerging evidence that childhood exposure to maltreatment of companion animals is associated with psychopathology in childhood and adulthood.”
The noble Baroness made an extremely good case, and I do not think that I need to add to it. When the Bill went through the Commons, assurances were given by the former Minister, Damian Collins, who acknowledged that the inclusion of animal cruelty content in the Bill deserves further consideration as the Bill progresses through its parliamentary stages. We need to keep up that pressure, and we will be very much supporting the noble Baroness if she asks for the opinion of the House.
Turning to my Amendment 180A, like the noble Baroness, I pay tribute to the Social Media Animal Cruelty Coalition, which is a very large coalition of organisations. We face a global extinction crisis which the UK Government themselves have pledged to reverse. Algorithmic amplification tools and social media recommendation engines have driven an explosive growth in online wildlife trafficking. A National Geographic article from 2020 quoted US wildlife officials describing the dizzying scale of the wildlife trade on social media. The UK’s national wildlife crime units say that cyber-enabled wildlife crime has become their priority focus, since virtually all wildlife cases they now investigate have a cyber component to them, usually involving social media or e-commerce platforms. In a few clicks it is easy to find pages, groups and postings selling wildlife products made from endangered species, such as elephant ivory, rhino horn, pangolin scales and marine turtle shells, as well as big cats, reptiles, birds, primates and insects for the exotic pet trade. This vast, unregulated trade in live animals and their parts is not only illegal but exacerbates the risk of another animal/human spillover event such as the ones that caused Ebola, HIV and the Covid-19 pandemic.
In addition to accepting the animal welfare amendment tabled by the noble Baroness, which I hope they do, the Government should also add offences under the Control of Trade in Endangered Species Regulations 2018 to Schedule 7 to the Bill. This would definitely help limit the role of social media platforms in enabling wildlife trafficking, helping to uphold the UK’s commitments to tackling global wildlife crime.
My Lords, I rise very briefly to support the noble Baroness, Lady Merron, and to make only one point. As someone who has the misfortune of seeing a great deal of upsetting material of all kinds, I have to admit that it sears an image on your mind. I have had the misfortune to see the interaction of animal and human cruelty in the same sequences, again and again. In making the point that there is a harm to humans in witnessing and normalising this kind of material, I offer my support to the noble Baroness.
My Lords, Amendments 180 and 180A seek to require the Secretary of State to conduct a review of existing legislation and how it relates to certain animal welfare offences and, contingent on this review, to make them priority offences under the regulatory framework.
I am grateful for this debate on the important issue of protecting against animal cruelty online, and all of us in this House share the view of the importance of so doing. As the House has discussed previously, this Government are committed to strong animal welfare standards and protections. In this spirit, this Government recognise the psychological harm that animal cruelty content can cause to children online. That is why we tabled an amendment that lists content that depicts real or realistic serious violence or injury against an animal, including by fictional creatures, as priority content that is harmful to children. This was debated on the first day of Report.
In addition, all services will need proactively to tackle illegal animal cruelty content where this amounts to an existing offence such as extreme pornography. User-to-user services will be required swiftly to remove other illegal content that targets an individual victim once made aware of its presence.
The noble Baroness asked about timing. We feel it is important to understand how harm to animals as already captured in the Bill will function before committing to the specific remedy proposed in the amendments.
As discussed in Committee, the Bill’s focus is rightly on ensuring that humans, in particular children, are protected online, which is why we have not listed animal offences in Schedule 7. As many have observed, this Bill cannot fix every problem associated with the internet. While we recognise the psychological harm that can be caused to adults by seeing this type of content, listing animal offences in Schedule 7 is likely to dilute providers’ resources away from protecting humans online, which is the Bill’s main purpose.
However, I understand the importance of taking action on animal mistreatment when committed online, and I am sympathetic to the intention of these amendments. As discussed with the noble Baroness, Defra is confident that the Animal Welfare Act 2006 and its devolved equivalents can successfully bring prosecutions for the commission and action of animal torture when done online in the UK. These Acts do not cover acts of cruelty that take place outside the UK. I know from the discussion we have had in this House that there are real concerns that the Animal Welfare Act 2006 cannot tackle cross-border content, so I wish to make a further commitment today.
The Government have already committed to consider further how the criminal law can best protect individuals from harmful communications, alongside other communications offences, as part of changes made in the other place. To that end, we commit to include the harm caused by animal mistreatment communications as part of this assessment. This will then provide a basis for the Secretary of State to consider whether this offence should be added to Schedule 7 to the OSB via the powers in Clause 198. This work will commence shortly, and I am confident that this, in combination with animal cruelty content listed as priority harms to children, will safeguard users from this type of content online.
For the reasons set out, I hope the noble Baroness and the noble Lord will consider not pressing their amendments.
My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.
On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat such material.
Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts: they have committed an offence in doing so. As somebody who works inside a company, your computer has to be quarantined, taken away and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying, “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.
One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.
I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.
As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.
My Lords, I rise to make a slightly lesser point, but I also welcome these amendments. I want to ask the Minister where the consultation piece of this will lie and to check that all the people who have been in this space for many years will be consulted.
My Lords, as ever, my noble friend Lord Allan and the noble Baroness, Lady Kidron, have made helpful, practical and operational points that I hope the Minister will be able to answer. In fact, the first half of my noble friend’s speech was really a speech that the Minister himself could have given in welcoming the amendment, which we do on these Benches.