All 1 Shaun Bailey contributions to the Online Safety Act 2023


Online Safety Bill (Seventh sitting)

Committee stage
Thursday 9th June 2022
Public Bill Committees

Amendment Paper: Public Bill Committee Amendments as at 9 June 2022
Shaun Bailey (West Bromwich West) (Con)

It is a pleasure to see you in the Chair, Ms Rees, and to make my first contribution in Committee—it will be a brief one. It is great to follow the hon. Member for Aberdeen North, and I listened intently to my right hon. Friend the Member for Basingstoke, from whom I have learned so much having sat with her in numerous Committees over the past two years.

I will speak to clause 18 stand part, in particular on the requirements of the technical specifications that the companies will need to use to ensure that they fulfil the duties under the clause. The point, which has been articulated well by numerous Members, is that we can place such a duty on service providers, but we must also ensure that the technical specifications in their systems allow them to follow through and deliver on it.

I sat in horror during the previous sitting as I listened to the hon. Member for Pontypridd talking about the horrendous abuse that she has to experience on Twitter. What that goes to show is that, if the intentions of this clause and the Bill are to be fulfilled, we must ensure that the companies have the specifications in their systems on the ground to deliver the requirements of the Bill. That might mean that the secondary legislation is slightly more prescriptive about what those systems look like.

It is all well and good us passing primary legislation in this place to try to control matters, but my fear is that if those companies do not have systems that enable them to follow through, there is a real risk that what we want will not materialise. As we proceed through the Bill, there will be mechanisms to ensure that that risk is mitigated, but the point that I am trying to make to my hon. Friend the Minister is that we should ensure that we are on top of this, and that companies have the technical specifications in their complaints procedures to meet the requirements under clause 18.

We must ensure that we do not allow the excuse, “Oh, well, we’re a bit behind the times on this.” I know that later clauses seek to deal with that, but it is important that we do not simply fall back on excuses. We must embed a culture that allows the provisions of the clause to be realised. I appeal to the Minister to ensure that we deal with that and embed a culture that strides forward in dealing with complaints procedures, so that these companies have the technical capabilities on the ground to deal with these things swiftly and in the right way. Ultimately, as my right hon. Friend the Member for Basingstoke said, it is all well and good us making these laws, but it is vital that we ensure that they can be applied.

Chris Philp

Let me address some of the issues raised in the debate. First, everyone in the House recognises the enormous problem at the moment with large social media firms receiving reports about harmful and even illegal content that they just flagrantly ignore. The purpose of the clause, and indeed of the whole Bill and its enforcement architecture, is to ensure that those large social media firms no longer ignore illegal and harmful content when they are notified about it. We agree unanimously on the importance of doing that.

The requirement for those firms to take the proper steps is set out in clause 18(2)(b), at the very top of page 18—it is rather depressing that we are on only the 18th of a couple of hundred pages. That paragraph creates a statutory duty for a social media platform to take “appropriate action”—those are the key words. If the platform is notified of a piece of illegal content, or content that is harmful to children, or of content that it should take down under its own terms and conditions because it is harmful to adults, then it must take that action. If it fails to do so, Ofcom will have the enforcement powers available to it to compel compliance—ultimately escalating to a fine of up to 10% of global revenue or even service disconnection.