Online Safety Bill

Eleanor Laing Excerpts
2nd reading
Tuesday 19th April 2022


Commons Chamber
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. Before I call the shadow Secretary of State, it will be obvious to the House that we have approximately one hour for Back-Bench contributions and that a great many people want to speak. I warn colleagues that not everybody will have the opportunity and that there will certainly be a time limit, which will probably begin at five minutes.

--- Later in debate ---
Madam Deputy Speaker (Dame Eleanor Laing)

Order. The hon. Lady is not giving way. Let us get on with the debate.

Lucy Powell

The business managers have failed everybody on both sides given the time available.

A systems-based approach also has the benefit of tackling the things that platforms can control, such as how content spreads, rather than what they cannot control, such as what people post. We would avoid the cul-de-sac of arguing over the definitions of what content is or is not harmful, and instead go straight to the impact. I urge the Government to adopt the recommendations that have been made consistently to focus the Bill on systems and models, not simply on content.

Turning to other aspects of the Bill, key issues with its effectiveness remain. The first relates to protecting children. As any parent will know, children face significant risks online, from poor body image, bullying and sexist trolling to the most extreme grooming and child abuse, which is, tragically, on the rise. This Bill is an important opportunity to make the internet a safe place for children. It sets out duties on platforms to prevent children from encountering illegal, harmful or pornographic content. That is all very welcome.

However, despite some of the Government’s ambitious claims, the Bill still falls short of fully protecting children. As the National Society for the Prevention of Cruelty to Children argues, the Government have failed to grasp the dynamics of online child abuse and grooming—[Interruption.] Again, I am being heckled from the Front Bench, but if Ministers engage with the children’s charities they will find a different response. For example—[Interruption.] Yes, but they are not coming out in support of the Bill, are they? For example, it is well evidenced that abusers will often first interact with children on open sites and then move to more encrypted platforms. The Government should require platforms to collaborate to reduce harm to children, prevent abuse from being displaced and close loopholes that let abusers advertise to each other in plain sight.

The second issue is illegal activity. We can all agree that what is illegal offline should be illegal online, and all platforms will be required to remove illegal content such as terrorism, child sex abuse and a range of other serious offences. It is welcome that the Government have set out an expanded list, but they can and must go further. Fraud was the single biggest crime in the UK last year, yet the Business Secretary dismissed it as not affecting people’s everyday lives.

The approach to fraud in this Bill has been a bit like the hokey-cokey: the White Paper said it was out, then it was in, then it was out again in the draft Bill and finally it is in again, but not for the smaller sites or the search services. The Government should be using every opportunity to make it harder for scammers to exploit people online, backed up by tough laws and enforcement. What is more, the scope of this Bill still leaves out too many of the Law Commission’s recommendations on online crimes.

The third issue is disinformation. The war in Ukraine has unleashed Putin’s propaganda machine once again. That comes after the co-ordinated campaign by Russia to discredit the truth about the Sergei Skripal poisonings. Many other groups have watched and learned: from covid anti-vaxxers to climate change deniers, the internet is rife with dangerous disinformation. The Government have set up a number of units to tackle disinformation and claim to be working with social media companies to take it down. However, that is opaque and far from optimal. The only mention of disinformation in the Bill is that a committee should publish a report. That is far from enough.

Returning to my earlier point, it is the business models and systems of social media companies that create a powerful tool for disinformation and false propaganda to flourish. Being a covid vaccine sceptic is one thing, but being able to quickly share false evidence dressed up as science to millions of people within hours is a completely different thing. It is the power of the platform that facilitates that, and it is the business models that encourage it. This Bill hardly begins to tackle those societal and democratic harms.

The fourth issue is online abuse. From racism to incels, social media has become a hotbed for hate. I agree with the Secretary of State that that has poisoned public life. I welcome steps to tackle anonymous abuse. However, we still do not know what the Government will designate as legal but harmful, which makes it very difficult to assess whether the Bill goes far enough, or indeed too far. I worry that those definitions are left entirely to the Secretary of State to determine. A particularly prevalent and pernicious form of online hate is misogyny, but violence against women and girls is not mentioned at all in the Bill—a serious oversight.

The decision on which platforms will be regulated by the Bill is also arbitrary and flawed. Only the largest platforms will be required to tackle harmful content, yet smaller platforms, which can still have a significant, highly motivated, well-organised and particularly harmful user base, will not. Ofcom should regulate based on risk, not just on size.

The fifth issue is that the regulator and the public need the teeth to take on the big tech companies, with all the lawyers they can afford. It is a David and Goliath situation. The Bill gives Ofcom powers to investigate companies and fine them up to 10% of their turnover, and there are some measures to help individual users. However, if bosses in Silicon Valley are to sit up and take notice of this Bill, it must go further. It should include stronger criminal liability, protections for whistleblowers, a meaningful ombudsman for individuals, and a route to sue companies through the courts.

The final issue is future-proofing, which we have heard something about already. This Bill is a step forward in dealing with the likes of Twitter, Facebook and Instagram—although it must be said that many companies have already begun to get their house in order ahead of any legislation—but it will have taken nearly six years for the Bill to appear on the statute book.

Since the Bill was first announced, TikTok has emerged on the scene, and Facebook has renamed itself Meta. The metaverse is already posing dangers to children, with virtual reality chat rooms allowing them to mix freely with predatory adults. Social media platforms are also adapting their business models to avoid regulation; Twitter, for example, says that it will decentralise and outsource moderation. There is a real danger that when the Bill finally comes into effect, it will already be out of date. A duty of care approach, focused on outcomes rather than content, would create a much more dynamic system of regulation, able to adapt to new technologies and platforms.

In conclusion, social media companies are now so powerful and pervasive that regulating them is long overdue. Everyone agrees that the Bill should reduce harm to children and prevent illegal activity online, yet there are serious loopholes, as I have laid out. Most of all, the focus on individual content rather than business models, outcomes and algorithms will leave too many grey areas and black spots, and will not satisfy either side in the free speech debate.

Despite full prelegislative scrutiny, the Government have been disappointingly reluctant to accept those bigger recommendations. In fact, they are going further in the wrong direction. As the Bill progresses through the House, we will work closely with Ministers to improve and strengthen it, to ensure that it truly becomes a piece of world-leading legislation.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

We will begin with a time limit of five minutes, but that is likely to reduce.

Julian Knight (Solihull) (Con)

Some colleagues have been in touch with me to ask my view on one overriding matter relating to this Bill: does it impinge on our civil liberties and our freedom of speech? I say to colleagues that it does neither, and I will explain how I have come to that conclusion.

In the mid-1990s, when social media and the internet were in their infancy, the forerunners of the likes of Google scored a major win in the United States. Effectively, they got the US Congress to agree to the greatest “get out of jail free” card in history: namely, to agree that social media platforms are not publishers and are not responsible for the content they carry. That has led to a huge flowering of debate, knowledge sharing and connections between people, the likes of which humanity has never seen before. We should never lose sight of that in our drive to fairly regulate this space. However, those platforms have also been used to cause great harm in our society, and because of their “get out of jail free” card, the platforms have not been accountable to society for the wrongs that are committed through them.

That is quite simplistic. I emphasise that as time has gone by, social media platforms have to some degree recognised that they have responsibilities, and that the content they carry is not without impact on society—the very society that they make their profits from, and that nurtured them into existence. Content moderation has sprung up, but it has been a slow process. It is only a few years ago that Google, a company whose turnover is higher than the entire economy of the Netherlands, was spending more on free staff lunches than on content moderation.

Content moderation is decided by algorithms, based on terms and conditions drawn up by the social media companies without any real public input. That is an inadequate state of affairs. Furthermore, where platforms have decided to act, there has been little accountability, and there can be unnecessary takedowns, as well as harmful content being carried. Is that democratic? Is it transparent? Is it right?

These masters of the online universe have a huge amount of power—more than any industrialist in our history—without facing any form of public scrutiny, legal framework or, in the case of unwarranted takedowns, appeal. I am pleased that the Government have listened in part to the recommendations published by the Digital, Culture, Media and Sport Committee, in particular on Parliament’s being given control through secondary legislation over legal but harmful content and its definition—an important safeguard for this legislation. However, the Committee and I still have queries about some of the Bill’s content. Specifically, we are concerned about the risks of cross-platform grooming and breadcrumbing—perpetrators using seemingly innocuous content to trap a child into a sequence of abuse. We also think that it is a mistake to focus on category 1 platforms, rather than extending the provisions to other platforms such as Telegram, which is a major carrier of disinformation. We need to recalibrate to a more risk-based approach, rather than just going by the numbers. These concerns are shared by charities such as the National Society for the Prevention of Cruelty to Children, as the hon. Member for Manchester Central (Lucy Powell) said.

On a systemic level, consideration should be given to allowing organisations such as the Internet Watch Foundation to identify where companies are failing to meet their duty of care, in order to prevent Ofcom from being influenced and captured by the heavy lobbying of the tech industry. There has been reference to the lawyers that the tech industry will deploy. If we look at any newspaper or LinkedIn, we see that right now, companies are recruiting, at speed, individuals who can potentially outgun regulation. It would therefore be sensible to bring in outside elements to provide scrutiny, and to review matters as we go forward.

On the culture of Ofcom, there needs to be greater flexibility. Simply reacting to a large number of complaints will not suffice. There needs to be direction and purpose, particularly with regard to the protection of children. We should allow for some forms of user advocacy at a systemic level, and potentially at an individual level, where there is extreme online harm.

On holding the tech companies to account, I welcome the sanctions regime and having named individuals at companies who are responsible. However, this Bill gives us an opportunity to bring about real culture change, as has happened in financial services over the past two decades. During Committee, the Government should actively consider the suggestion put forward by my Committee—namely, the introduction of compliance officers to drive safety by design in these companies.

Finally, I have concerns about the definition of “news publishers”. We do not want Ofcom to be effectively a regulator or a licensing body for the free press. However, I do not want in any way to do down this important and improved Bill. I will support it. It is essential. We must have this regulation in place.

John Nicolson (Ochil and South Perthshire) (SNP)

Thank you, Madam Deputy Speaker, but I was under the impression that I was to wind up for my party, rather than speaking at this juncture.

--- Later in debate ---
Madam Deputy Speaker

If the hon. Gentleman would prefer to save his slot until later—

John Nicolson

I would, Madam Deputy Speaker, if that is all right with you.

Madam Deputy Speaker

Then we shall come to that arrangement. I call Dame Margaret Hodge.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. After the next speaker, the time limit will be reduced to four minutes.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. I am reluctant to reduce the time limit, but I am receiving appeals for me to try to get more people in, so I will reduce it to three minutes. However, not everyone will have a chance to speak this evening.

--- Later in debate ---
Chris Philp

I have so many points to reply to that I have to make some progress.

The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, for content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper right of appeal has taken place. I am pleased to confirm that now.

We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.

Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.

If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.

When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. And if that is not enough, the Minister for Security and Borders, who is sitting here next to me, intends to bring forward legislation at the earliest opportunity to counter hostile state threats more generally. This matter will be addressed in the Bill that he will prepare and bring forward.

I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.

My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.

The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.

My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament are saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.

The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.

When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it will simply be overwhelmed, by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.

I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.

The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those drafting improvements.

I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.

Question put and agreed to.

Bill accordingly read a Second time.

Madam Deputy Speaker (Dame Eleanor Laing)

The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.

Online Safety Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Online Safety Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.

Other proceedings

(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Money)

Queen’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:

(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and

(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:

(1) the charging of fees under the Act, and

(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)

Question agreed to.

Deferred Divisions

Motion made, and Question put forthwith (Standing Order No. 41A(3)),

That at this day’s sitting, Standing Order 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)

Question agreed to.

Madam Deputy Speaker (Dame Eleanor Laing)

Order. Really, people just ought to have more courtesy than to get up and, when there is still business going on in this House, to behave as if it is not sitting because it is after 10 o’clock. We really have to observe courtesy at all times in here.

Online Safety Bill (Carry-Over)

Motion made, and Question put forthwith (Standing Order No. 80A(1)(a)),

That if, at the conclusion of this Session of Parliament, proceedings on the Online Safety Bill have not been completed, they shall be resumed in the next Session.—(Michael Tomlinson.)

Question agreed to.