Chris Philp

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change; the Bill did not previously include it. We inserted the affirmative procedure vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill on which the Joint Committee commented.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedure gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

Kirsty Blackman (Aberdeen North) (SNP)

Before the Minister moves off the point about exceptional circumstances, it was previously the case that an amendment of the law resolution was always considered with Finance Bills. In recent years, that stopped, on the basis that a general election coming up constituted exceptional circumstances. Then the Government changed their position, and now they never table an amendment of the law resolution, because they have decided that it is a minor change. Something has gone from being exceptional to being minor, in the view of this Government.

The Minister said that he envisions that this measure will be used only in exceptional circumstances. Can he commit himself to its being used only in exceptional circumstances? Failing that, can he give a commitment that he expects it to be used only in exceptional circumstances, rather than simply envisioning that it will be?

Chris Philp

I have made clear how we expect the clause to be used. I am slightly hesitant to be more categorical simply because I do not want to make comments that might unduly bind a future Secretary of State—or, indeed, a future Parliament, because the measure is subject to the affirmative procedure—even were that Secretary of State, heaven forbid, to come from a party other than mine. Circumstances might arise, such as the pandemic, in which a power such as this needs to be exercised for good public policy reasons—in that example, public health. I would not want to be too categorical, which the hon. Lady is inviting me to be, lest I inadvertently circumscribe the ability of a future Parliament or a future Secretary of State to act.

The power is also limited in the sense that, in relation to matters that are not to do with national security, terrorism or CSEA, the power to direct can be exercised only at the point at which the code is submitted to be laid before Parliament. It cannot be exercised at a time of the Secretary of State’s choosing; there is one moment, and one moment only, when that power can be exercised.

I also want to make it clear that the power will not allow the Secretary of State to direct Ofcom to require a particular regulated service to take a particular measure. The power relates to the codes of practice; it does not give the power to intrude any further, beyond the codes of practice, into the arena of regulated activity.

I understand the points that have been made. We have listened to the Joint Committee, and we have made an important change: the move to the affirmative procedure. I hope my explanation leaves the Committee feeling that, following that change, this is a reasonable place for clauses 40 and 41 to rest. I respectfully resist amendment 84 and new clause 12, and urge the Committee to allow clauses 40 and 41 to stand part of the Bill.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp

I can see that that is the most popular thing I have said during the entire session—when you say, “And finally,” in a speech and the crowd cheers, you know you are in trouble.

Regulated user-to-user and search services will have duties to keep records of their risk assessments and the measures they take to comply with their safety duties, whether or not those are the ones recommended in the codes of practice. They must also undertake a children’s access assessment to determine whether children are likely to access their service.

Clause 48 places a duty on Ofcom to produce guidance to assist service providers in complying with those duties. It will help to ensure a consistent approach from service providers, which is essential in maintaining a level playing field. Ofcom will have a duty to consult the Information Commissioner prior to preparing this guidance, as set out in clause 48(2), in order to draw on the expertise of the Information Commissioner’s Office and ensure that the guidance is aligned with wider data protection and privacy regulation.

Question put and agreed to.

Clause 48 accordingly ordered to stand part of the Bill.

Clause 49

“Regulated user-generated content”, “user-generated content”, “news publisher content”

Kirsty Blackman

I beg to move amendment 89, in clause 49, page 45, line 16, leave out subsection (e).

This amendment would remove the exemption for comments below news articles posted online.

The Chair

With this it will be convenient to discuss amendment 43, in clause 49, page 45, line 19, at end insert—

“(2A) Subsection (2)(e) does not apply in respect of a user-to-user service which is operated by an organisation which—

(a) is a relevant publisher (as defined in section 41 of the Crime and Courts Act 2013); and

(b) has an annual UK turnover in excess of £100 million.”

This amendment removes comments sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content.

--- Later in debate ---
Kirsty Blackman

Thank you, Ms Rees, for your hard work in chairing the Committee this morning; we really appreciate it. Amendment 89 relates to below-the-line comments on newspaper articles. For the avoidance of doubt, if we do not get amendment 89, I am more than happy to support the Labour party’s amendment 43, which has a similar effect but covers slightly fewer—or many fewer—organisations and places.

Below-the-line comments in newspaper articles are infamous. They are places that everybody fears to go. They are worse than Twitter. In a significant number of ways, below-the-line comments are an absolute sewer. I cannot see any reasonable excuse for them to be excluded from the Bill. We are including Twitter in the Bill; why are we not including below-the-line comments for newspapers? It does not make any sense to me; I do not see any logic.

We heard a lot of evidence relating to freedom of speech and a free press, and I absolutely, wholeheartedly agree with that. However, the amendment would not stop anyone writing a letter to the editor. It would not stop anyone engaging with newspapers in the way that they would have in the print medium. It would still allow that to happen; it would just ensure that below-the-line comments were subject to the same constraints as posts on Twitter. That is the entire point of amendment 89.

I do not think that I need to say much more, other than to add one point about comments directing people to other, more radical and extreme pieces or bits of information. It is sometimes the case that the comments on a newspaper article will direct people to even more extreme views. The newspaper article itself may be only slightly derogatory, while some of the comments may contain links or references to other pieces, and other places on the internet, where people can find a more radical point of view. That is exactly what happens on Twitter, and it is exactly some of the stuff that we are trying to avoid—sending people down an extremist rabbit hole. I do not understand how the Minister thinks that the clause, which excludes below-the-line newspaper comments, is justifiable or acceptable.

Having been contacted by a number of newspapers, I understand and accept that some newspapers have moderation policies for their comments sections, but that is not strong enough. Twitter has a moderation policy, but that does not mean that there is actually any moderation, so I do not think that subjecting below-the-line comments to the provisions of the Bill is asking too much. It is completely reasonable for us to ask for this to happen, and I am honestly baffled as to why the Minister and the Government have chosen to make this exemption.

Alex Davies-Jones

Before I address the amendments, I will speak to clause 49 more broadly.

Labour has concerns about a number of subsections of the clause, including subsections (2) and (8) to (10), commonly known as the news publisher content exemption, which I have spoken about previously. We understand that the intention of the exemption is to shield broadcasters and traditional newspaper publishers from the Bill’s regulatory effects. Clause 50(2) defines a “recognised news publisher” as a regulated broadcaster or any other publisher that publishes news, has an office, and has a standards code and complaints process. There is no detail about the latter two requirements, thus enabling almost any news publishing enterprise to design its own code and complaints process, however irrational, and so benefit from the exemption. “News” is also defined broadly, and may include gossip. There remains a glaring omission, which amendment 43 addresses and which I will come to.

During an earlier sitting of the Committee, in response to comments made by my hon. Friend the Member for Liverpool, Walton as we discussed clause 2, the Minister claimed that

“The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic.”––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 204.]

Clause 49 exempts one-to-one live aural communications from the scope of regulation. Given that much interaction in virtual reality is live aural communication, including between two users, it is hard to understand how that would be covered by the Bill.

There is also an issue about what counts as content. Most standard understandings would define “content” as text, video, images and audio, but one of the worries about interactions in VR is that behaviour such as physical violence can be replicated virtually, with psychologically harmful effects. It is very unclear how that would be within the scope of the current Bill, as it does not clearly involve content, so could the Minister please address that point? As he knows, Labour advocates a systems-based approach, with risk assessments and systems operating in a more upstream and tech-agnostic way than under the current approach. At present, the Bill would struggle to be expanded effectively enough to cover those risks.

Amendment 43 removes comments sections operated by news websites whose publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content. If the Bill is to be effective in protecting the public from harm, the least it must accomplish is a system of accountability that covers all the largest platforms used by British citizens. Yet as drafted, the Bill would exempt some of the most popular social media platforms online: those hosted on news publisher websites, otherwise known as comments sections. The amendment would close that loophole and ensure that the comments sections of the largest newspaper websites are subject to the regime of regulation set out in the Bill.

Newspaper comments sections are no different from the likes of Facebook and Twitter, in that they are social media platforms that allow users to interact with one another. This is done through comments under stories, comments in response to other comments, and other interactions—for example, likes and dislikes on posts. In some ways, their capacity to cause harm to the public is even greater: for example, their reach is in many cases larger than even the biggest of social media platforms. Whereas there are estimated to be around 18 million users of Twitter in the UK, more than twice that number of British citizens access newspaper websites every month, and the harm perpetuated on those platforms is severe.

In July 2020, the rapper Wiley posted a series of antisemitic tweets, which Twitter eventually removed after an unacceptable delay of 48 hours, but under The Sun’s coverage of the incident, several explicitly antisemitic comments were posted. Those comments contained holocaust denial and alleged a global Jewish conspiracy to control the world. They remained up and accessible to The Sun’s 7 million daily readers for the best part of a week. If we exempt comments sections from the Bill’s proposed regime and the duties that the Bill sets for platforms, we will send the message that that kind of vicious, damaging and harmful racism is acceptable.

Similarly, after an antisemitic attack in the German city of Halle, racist comments followed in the comments section under the coverage in The Sun. There are more examples: Chinese people being described as locusts and attacked with other racial slurs; 5G and Bill Gates conspiracy theories under articles on the Telegraph website; and of course, the most popular targets for online abuse, women in public life. Comments describing the Vice-President of the United States as a “rat” and a “ho” appeared on MailOnline. A female union leader has faced dozens of aggressive and abusive comments about her appearance, and many such comments remain accessible on newspaper comments sections to this day. Some have been up for months, others for years.

Last week, the Committee was sent a letter from a woman who was the victim of comments section abuse, Dr Corinne Fowler. Dr Fowler said of the comments that she received:

“These comments contained scores of suggestions about how to kill or injure me. Some were general ideas, such as hanging, but many were gender specific, saying that I should be burnt at the stake like a witch. Comments focused on physical violence, one man advising that I should be slapped hard enough to make my teeth chatter”.

She added:

“I am a mother: without me knowing, my son (then 12 years old) read these reader comments. He became afraid for my safety.”

Without the amendment, the Bill cannot do anything to protect women such as Dr Fowler and their families from this vile online abuse, because comments sections will be entirely out of scope of the Bill’s new regime and the duties designed to protect users.

As I understand it, two arguments have been made to support the exemption. First, it is argued that the complaints handlers for the press already deal with such content, but the handler for most national newspapers, the Independent Press Standards Organisation, will not act until a complaint is made. It then takes an average of six months for a complaint to be processed, and it cannot do anything if the comments have not been moderated. The Opposition do not feel that that is a satisfactory response to the seriousness of the harms that we know occur, and which I have described. IPSO does not even have a code to deal with cases of antisemitic abuse such as those that appeared in the comments section of The Sun. IPSO’s record speaks for itself in the examples that I have given, and the many more besides, and it has proven to be no solution to the severity of harms that appear in newspaper comments sections.

The second argument for an exemption is that publishers are legally responsible for what appears on comments sections, but that is only relevant for illegal harms. For everything else, from disinformation to racial prejudice and abuse, regulation is needed. That is why it is so important that the Bill does the job that we were promised. To keep the public safe from harm online, comments sections must be covered under the Bill.

The amendment is a proportionate solution to the problem of comments section abuse. It would protect users’ freedom of expression and, given that it is subject to a turnover threshold, ensure that duties and other requirements do not place a disproportionate burden on smaller publishers such as locals, independents and blogs.

I have reams and reams of examples from comments sections, all of them containing incredibly harmful abuse that should be covered by the Bill. I could be here for hours reading them all out, and while I do not think that anybody in Committee would like me to, I urge Committee members to take a look for themselves at the types of comments under newspaper articles and ask themselves whether those comments should be covered by the terms of the Bill. I think they know the answer.

Kirsty Blackman

On a point of order, Ms Rees. Are we considering clause 49 now? I know that it is supposed to be considered under the next set of amendments, but I just wondered, because I have separate comments to make on that clause that I did not make earlier, having spoken purely to the amendment.

The Chair

I did not want to stop Alex Davies-Jones in full flow. When we come to consideration of clause 49, I was going to ask for additional comments, but it is for the Committee to decide whether it is content with that, or would like the opportunity to elaborate on that clause now.

Kirsty Blackman

I am happy to speak on clause 49 now—I can see the Minister is nodding. I really appreciate it, Ms Rees, because I did not want to lose the opportunity to raise concerns about this matter. I have not tabled an amendment but I would appreciate it if the Minister gave consideration to my following comments.

My concern relates to subsection (5) of clause 49, which exempts one-to-one live aural communications in relation to user-to-user services; specifically, it relates to child sexual abuse and grooming. I am worried that exempting those one-to-one live aural communications gives bad actors—people who are out to attack children—a loophole through which to do so. We know that on games such as Fortnite, one-to-one aural communication happens.

I am not entirely sure how communication happens on Roblox and whether there is an opportunity for that there. However, we also know that a number of people who play online games communicate on Discord at the same time. Discord is incredibly popular, and we know that there is an opportunity for, and a prevalence of, grooming on there. I am concerned that the exemption creates a loophole for people to attack children in exactly the way that the Minister is trying to prevent with the Bill. I understand why the provision is there, but I am concerned about the loophole it creates.

--- Later in debate ---
Chris Philp

The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that are at risk of leading to illegal activity, and things that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the “legal but harmful to adults” duties apply only to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the earlier provisions that we debated, about the duty to “have regard to” free speech, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill have a sense of balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance: we just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.

In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.

I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.

Kirsty Blackman

I have a question on anonymity. On social media there will be a requirement to verify users’ identities, so if somebody posts on Twitter that they want to lynch me, it is possible to find out who that is, provided they do not have an anonymous account. There is no such provision for newspaper comments sections, so I assume it would be much more difficult for the police to find the person, or for me to avoid seeing anonymous comments below the line of newspaper articles that threaten my safety—comments that are just as harmful as those that threaten my safety on social media. Can the Minister convince me otherwise?

Chris Philp

I can confirm that the hon. Lady is correct in her analysis. Rather as with the previous point, because of the interaction with freedom of the press—the argument that the newspapers and broadcasters have advanced—and because this is an inherently less viral environment, we have drawn the balance where we have. She is right to highlight a reasonable risk, but we have struck the balance in the way we have for that reason.

The shadow Minister, the hon. Member for Pontypridd, asked whether very harmful or illegal interactions in the metaverse would be covered, or whether they would have a metaphorical “get out of jail free” card owing to the exemption in clause 49(2)(d) for “one-to-one live aural communications”. In essence, she is asking whether, if two users in the metaverse went off somewhere and interacted only with each other, that exemption would apply and they would therefore be outwith the scope of the Bill. I am pleased to tell her that they would not, because the exemption in clause 49(2)(d) refers on to clause 49(5), which defines “live aural communications”. Clause 49(5)(c) states that the exemption applies only if the communication

“is not accompanied by user-generated content of any other description”.

The actions of a physical avatar in the metaverse constitute user-generated content of another description. Owing to that fact, the exemption in clause 49(2)(d) would not apply in the metaverse.

I am happy to provide clarification on that. It is a good question, and I hope I have provided an example of how, even though the metaverse had not been conceived when the Bill was drafted, the Bill nonetheless has effect there.

Kirsty Blackman

On that point, when it comes to the definition of content, we have tabled an amendment about “any other content”. I am not convinced that the definition adequately covers what the Minister stated, because it is limited, does not include every possible scenario in which content is user generated, and is not sufficiently future-proofed. When we get to that point, I would appreciate it if the Minister would look at the amendment and ensure that what he intends is what happens.

Chris Philp

I am grateful to the hon. Lady for thinking about that so carefully. I look forward to her amendment. For my information, which clause does her amendment seek to amend?

Kirsty Blackman

I will let the Minister know in a moment.

Chris Philp

I am grateful. It is an important point.

Chris Philp

I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.

Kirsty Blackman

The amendment that the Minister is asking about is to clause 189, which states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

It is amendment 76 that, after “including”, would insert “but not limited to”, in order that the Bill is as future-proofed as it can be.

--- Later in debate ---
Chris Philp

I thank my right hon. Friend for that intervention. First, clearly, if something illegal is said online about someone, they have the normal recourse of going to the police, and the police can seek to exercise their powers to investigate the offence, including requesting that the company hosting the comments—in this case, a newspaper’s or broadcaster’s website—provide any relevant information that might help to identify the person involved. That person might have an account; if they do not, there might be a log-on or IP address. So the normal criminal investigatory procedures would apply.

Secondly, if the content were defamatory—I realise that only people like Arron Banks can sue for libel—there is obviously civil recourse. I think there are powers in the civil procedure rules that allow court orders to be made requiring organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are the voluntary steps that the news publisher might take to remove content. News publishers say that they do that, although, as we know, their implementation is patchy. Nevertheless, there is that voluntary route.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls, such as Skype conversations—one-to-one conversations that are low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation, so if we found in the future that the exemption caused a problem, we could remove it.

Kirsty Blackman

That is helpful for understanding the rationale, but in the light of how people communicate online these days, although exempting telephone conversations makes sense, exempting what I am talking about does not. I would appreciate it if the Minister came back to me on that, and he does not have to give me an answer now. It would also help if he explained the difference between “aural” and “oral”, which are mentioned at different points in the Bill.

Chris Philp

I will certainly come back with a more complete analysis of the point about protecting children—as parents, that clearly concerns us both. The literal definitions are that “aural” means “heard” and “oral” means “spoken”. They occur in different places in the Bill.

This is a difficult issue and legitimate questions have been raised, but as I said in response to the hon. Member for Batley and Spen, in this area as in others, there are balances to strike and different considerations at play—freedom of the press on the one hand, and the level of risk on the other. I think that the clause strikes that balance in an appropriate way.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones

Briefly, as with earlier clauses, the Labour party recognises the challenge in finding the balance between freedom of expression and keeping people safe online. Our debate on the amendment has illustrated powerfully that the exemptions as they stand in the Bill are hugely flawed.

First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many of them in my earlier contributions, and I have paid the price for that online.

Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions. I recall that Kyle from FairVote made that point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not itself be sufficient reason to treat such content differently from that which is posted by private citizens.

Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, yet there is no parallel stipulation that non-broadcast media be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media, the series of requirements is not rational, excludes many independent publishers and leaves room for ambiguity.

Kirsty Blackman

I have a couple of questions that were probably too long for interventions. The Minister said that if comments on a site are the only user-generated content, they are not in scope. It would be really helpful if he explained what exactly he meant by that. We were talking about services that do not fall within the definition of “recognised news publishers”, because we were trying to add them to that definition. I am not suggesting that the Minister is wrong in any way, but I do not understand where the Bill states that those comments are excluded, and how this all fits together.

--- Later in debate ---
Alex Davies-Jones

With your permission, Ms Rees, I will speak to clause 52 before coming to amendment 61. Illegal content is defined in clause 52(2) as

“content that amounts to a relevant offence.”

However, as the Minister will know from representations from Carnegie UK to his Department—we share its concerns—the illegal and priority illegal regimes may not be able to operate as intended. The Bill requires companies to decide whether content “amounts to” an offence, with limited room for movement. We share concerns that that points towards decisions on an item-by-item basis; it means detecting intent for each piece of content. However, such an approach does not work at the scale on which platforms operate; it is bad regulation and poor risk management.

There seem to be two different problems relating to the definition of “illegal content” in clause 52. The first is that it is unclear whether we are talking about individual items of content or categories of content—the word “content” is ambiguous because it can be singular or plural—which is a problem for an obligation to design and run a system. Secondly, determining when an offence has taken place will be complex, especially bearing in mind mens rea and defences, so the providers are not in a position to get it right.

The use of the phrase “amounts to” in clause 52(2) seems to suggest that platforms will be required to identify accurately, in individual cases, where an offence has been committed, without any wriggle room drafted in, unlike in the draft Bill. As the definition now contains no space for error either side of the line, it could be argued that there are more incentives to avoid false negatives than false positives—providers can set higher standards than the criminal law—and that leads to a greater risk of content removal. That becomes problematic, because it seems that the obligation under clause 9(3) is then to have a system that is accurate in all cases, whereas it would be more natural to deal with categories of content. This approach seems not to be intended; support for that perspective can be drawn from clause 9(6), which recognises that there is a distinction between categories of content and individual items, and that the application of terms of service might specifically have to deal with individual instances of content. Critically, the “amounts to” approach cannot work in conjunction with a systems-based approach to harm reduction. That leaves victims highly vulnerable.

This problem is easily fixed by a combination of reverting to the draft Bill’s language, which required reasonableness, and using concepts found elsewhere in the Bill that enable a harm mitigation system to operate for illegal content. We also remind the Minister that Ofcom raised this issue in the evidence sessions. I would be grateful if the Minister confirmed whether we can expect a Government amendment to rectify this issue shortly.

More broadly, as we know, priority illegal content, which falls within illegal content, includes,

“(a) terrorism content,

(b) CSEA content, and

(c) content that amounts to an offence specified in Schedule 7”,

as set out in clause 52(7). Such content attracts a greater level of scrutiny and regulation. Situations in which user-generated content will amount to “a relevant offence” are set out in clause 52(3). Labour supports the inclusion of a definition of illegal content as outlined in the grouping; it is vital that service providers and platforms have a clear indication of the types of content that they will have a statutory duty to consider when building, or making changes to the back end of, their business models.

We have also spoken about the importance of parity between the online and offline spaces—what is illegal offline must be illegal online—so the Minister knows we have more work to do here. He also knows that we have broad concerns around the omissions in the Bill. While we welcome the inclusion of terrorism and child sexual exploitation content as priority illegal content, there remain gaps in addressing violence against women and girls content, which we all know is hugely detrimental to many online.

The UK Government stated that their intention for the Online Safety Bill was to make the UK the safest place in the world to be online, yet the Bill does not mention online gender-based violence once. More than 60,000 people have signed the Glitch and End Violence Against Women Coalition’s petition calling for women and girls to be included in the Bill, so the time to act is now. We all have a right not just to survive but to thrive, engage and play online, without having our freedom of expression curtailed or our voices silenced by perpetrators of abuse. The online space is just as real as the offline space. The Online Safety Bill is our opportunity to create safe digital spaces.

The Bill must name the problem. Violence against women and girls, particularly those who have one or multiple protected characteristics, is creating harm and inequality online. We must actively and meaningfully name this issue and take an intersectional approach to ending online abuse to ensure that the Bill brings meaningful change for all women. We also must ensure that the Bill truly covers all illegal content, whether it originated in the UK or not.

Amendment 61 brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content. The aim of the amendment is to clarify whether the Bill covers content created overseas that would be illegal if what was shown in the content took place in the UK. For example, animal abuse and cruelty content is often filmed abroad. The same can be said for dreadful human trafficking content and child sexual exploitation. The optimal protection would be if the Bill’s definition of illegal content covered matter that would be illegal in either the UK or the country it took place in, regardless of whether it originated in the UK.

Kirsty Blackman

I do not intend to make a speech, but I want to let the hon. Lady know that we wholeheartedly support everything that she has said on the clause and amendment 61.

Alex Davies-Jones

I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.

The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of its provisions. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about the overwhelming practical challenges it presents. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:

“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]

That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.