Online Safety Bill (Tenth sitting)

Chris Philp

There are one or two points to pick up on. A question was raised about algorithms, and it is worth saying that the risk assessments that platforms must undertake will include consideration of the operation of algorithms. It is important to make it absolutely clear that that is the case.

The shadow Minister asked about the definition of harm, and whether all the harms that might concern Parliament, and many of us as parents, will be covered. It may be helpful to refer to the definition of harm provided in clause 187, at the top of page 153. Committee members will note that the definition is very wide and that subsection (2) defines it as “physical or psychological harm”, so I hope that partly answers the shadow Minister’s question.

Dean Russell (Watford) (Con)

I am jumping ahead a bit, but I know that we will discuss clause 150, Zach’s law and epilepsy in particular at some point. Given the definition that my hon. Friend has just cited, am I correct to assume that the physical harm posed to those with epilepsy who might be targeted online will be covered, and that it is not just about psychological harm?

Chris Philp

I admire my hon. Friend’s attention to the debate. The definition of harm for the harmful communications offence in clause 150 is set out in clause 150(4). In that context, harm is defined slightly differently, as

“psychological harm amounting to at least serious distress”.

The definition of harm in clause 187 that I read out is the definition used elsewhere in the Bill. However, as I have said before in the House and in the evidence session, the Government’s belief and intention is that epilepsy trolling would fall within the scope of clause 150: giving someone an epileptic fit clearly has a physical implication, as my hon. Friend said, but it also causes psychological harm.

Although the definition of harm in clause 187 does not apply to clause 150, which has its own definition, I am absolutely categorical that epilepsy trolling is caught by clause 150 because of the psychological harm it causes. I commend my hon. Friend the Member for Watford for being so attentive on the question of epilepsy, and also in this debate.

Returning to the definition of harm in clause 187: besides the wide definition covering physical and psychological harm, clause 187(4) makes it clear that harm may arise not just directly but also where the content prompts individuals to

“act in a way that results in harm to themselves or that increases the likelihood of harm to themselves”.

Clause 187(4)(b) covers content where the

“individuals do or say something to another individual that results in”

that individual suffering harm. I hope the shadow Minister is reassured that the definition of harm that applies here is extremely wide in scope.

There was a question about media literacy, which I think the hon. Member for Batley and Spen raised in an intervention. Media literacy duties on Ofcom already exist in the Communications Act 2003. The Government published a comprehensive and effective media literacy strategy about a year ago. In December—after the first version of the Bill was produced, but before the second and updated version—Ofcom updated its policy in a way that went beyond the duties contained in the previous version of the Bill. From memory, that related to the old clause 103, in the version of the Bill published in May last year, which is of course not the same clause in this version of the Bill, as it has been updated.

The hon. Member for Aberdeen North raised, as ever, some important points of detail. She asked about future-proofing. The concept of harm expressed in the clause is a general one: whatever is harmful to children, which includes things that we do not know about at the moment and that may arise in the future. Secondly, primary priority content and priority content that is harmful to children can be updated from time to time by statutory instrument, so if some new harm emerges that we think deserves that designation, we can add it.

The hon. Lady also asked about the exclusions in clause 53(5). The first exclusion, in subsection (5)(a), is illegal content, which is excluded because it is covered elsewhere in the Bill, in clause 52. The second limb, subsection (5)(b), covers certain financial offences. Those are excluded because financial services are separately regulated. The hon. Lady used the example of gambling, which is separately regulated by the Gambling Act 2005, a review of which is imminent. That Act already contains very strong provisions, enforced by the regulator, the Gambling Commission, including a hard-edged prohibition on gambling by those under 18.

--- Later in debate ---
Barbara Keeley

I very much agree. We cannot emphasise that enough, and it is useful that my hon. Friend has set that out, adding to what I was saying.

Amendment 55 sets out, in new paragraph 31A, the details of the information that Ofcom must request be provided in a transparency report. First, transparency disclosures required by the Bill should include how large companies allocate resources to tackling harm in different languages—an issue that was rightly raised by the hon. Member for Ochil and South Perthshire. As we heard from Frances Haugen, many of Meta’s safety systems have detection capability for only a subset of languages other than English. Languages such as Welsh have almost no safety systems live on Facebook. That is neither fair nor safe.

When we consider that more than 250 languages are spoken in London alone, the inconsistency of safety systems becomes very concerning. Charities have warned that people accessing Facebook in different languages are being exposed to very different levels of risk, with some versions of Facebook having few or none of the safety systems that protect other versions of the site in different languages.

When giving evidence to the Committee last month, Richard Earley disclosed that Meta regulated only 70 languages. Given that around 3 billion people use Facebook on a monthly basis across the world, that is clearly inadequate.

Dean Russell

One of the things we found on the Joint Committee last year was the consistent message that we should not need to put this Bill in place. I want to put on the record my continued frustration that Meta and the other social media platforms are requiring us to do so because they are not doing the monitoring, not engaging and not putting users first. I hope that the process of going through the Bill has helped them to see the need for more monitoring. It is disappointing that we have had to get to this point, with the UK Government having to lead the world by putting this Bill in place when it should not be necessary. I hope that the companies do not simply follow what we are putting forward, but go much further and see that it is imperative to change the way they work and support their users around the world.

Barbara Keeley

I thank the hon. Gentleman and I agree. It is a constant frustration that we need this Bill. We do need it, though. In fact, amendment 55 would really assist with that, by requiring those services to go further in transparency reporting and to disclose

“the languages in which the service has safety systems or classifiers”.

We need to see what they are doing on this issue. It is an easily reported piece of information that will have an outsized impact on safety, even for English speakers. It will help linguistic groups in the multilingual UK and around the world.

Reporting on language would not be a big burden on companies. In her oral evidence, Frances Haugen told the Committee that large platforms can trivially produce this additional data merely by changing a single line of code when they do their transparency reports. We must not become wrapped up in the comfort of the language we all speak and ignore the gaping loophole left for other languages, which allows harms to slip through.

--- Later in debate ---
Chris Philp

It is a great shame that the hon. Member for Ochil and South Perthshire is occupied in the main Chamber, because I could have pointed to this change as one of the examples of the Government listening to the Joint Committee, on which he and many others served. However, I hope that the hon. Member for Aberdeen North will communicate my observation to him, which I am sure he will appreciate.

In all seriousness, this is an example of the Government moving the Bill on in response to widespread parliamentary and public commentary. It is right that we extend the duties to cover commercial pornographic content as well as the user-to-user pornography covered previously. I thank the Opposition parties for their support for the inclusion of those measures.

Dean Russell

As a member of the Joint Committee, on which I worked with the hon. Member for Ochil and South Perthshire, I thank the Minister for including this clause on a point that was debated at length by the Joint Committee. Its inclusion is crucial to organisations in my constituency such as Dignify—a charity that works to raise awareness and campaign on this important point, to protect children but also wider society. As this is one of the 66 recommendations that the Minister took forward in the Bill, I would like to thank him; it is very welcome, and I think that it will make a huge difference to children and to society.

Chris Philp

I thank my hon. Friend for his intervention and for his work on the Joint Committee, which has had a huge impact, as we have seen. I hope that colleagues will join me in thanking the members of the Joint Committee for their work.

My final point on this important clause is in response to a question that the shadow Minister raised about clause 66(3), which makes reference to

“a person acting on behalf of the provider”.

That is just to ensure that the clause is comprehensively drafted without any loopholes. If the provider used an agent or engaged some third party to disseminate content on their behalf, rather than doing so directly, that would be covered too. We just wanted to ensure that there was absolutely no loophole—no chink of light—in the way that the clause was drafted. That is why that reference is there.

I am delighted that these clauses seem to command such widespread support. It therefore gives me great pleasure to commend them to the Committee.

Question put and agreed to.

Clause 66 accordingly ordered to stand part of the Bill.

Clause 67 ordered to stand part of the Bill.

Schedule 9 agreed to.

Clause 68

Duties about regulated provider pornographic content