Online Safety Bill: Second Reading
Tuesday 19th April 2022

Commons Chamber
Lucy Powell (Manchester Central) (Lab/Co-op)

Thank you, Madam Deputy Speaker. It has been a busy day, and I will try to keep my remarks short. It is a real shame that the discussion of an important landmark Bill, with so many Members wanting to contribute, has been squeezed into such a tiny amount of time.

Labour supports the principles of the Online Safety Bill. The online world has been a wild west for too long. Huge platforms such as Facebook and Google began as start-ups but now wield enormous influence over almost every aspect of our lives: how we socialise and shop, where we get our news and views, and even the outcomes of elections and propaganda wars. There have been undoubted benefits, but the lack of regulation has let harms and abuses proliferate. From record reports of child abuse to soaring fraud and scams, from racist tweets to Russia’s disinformation campaigns, there are too many harms that, as a society, we have been unable or unwilling to address.

There is currently no regulator. However, neither the Government nor Silicon Valley should have control over what we can say and do online. We need strong, independent regulation.

Dan Carden (Liverpool, Walton) (Lab)

Will my hon. Friend give way?

Lucy Powell

I will give way once on this point.

Dan Carden

I am grateful. The Secretary of State talked about getting the tech giants to follow their own rules, but we know from Frances Haugen, the Facebook whistleblower, that companies were driving children and adults to harmful content, because it increased engagement. Does that not show that we must go even further than asking them to follow their own rules?

Lucy Powell

I very much agree with my hon. Friend, and I will come on to talk about that shortly.

The Online Safety Bill is an important step towards strong, independent regulation. We welcome the Bill’s overall aim: the duty of care framework based on the work of the Carnegie Trust. I agree with the Secretary of State that the safety of children should be at the heart of this regulation. The Government have rightly now included fraud, online pornography and cyber-flashing in the new draft of the Bill, although they should have been in scope all along.

Wera Hobhouse (Bath) (LD)

Will the hon. Lady give way?

Lucy Powell

I am not going to give way, sorry.

Before I get on to the specifics, I will address the main area of contention: the balance between free speech and regulation, most notably expressed in the “legal but harmful” clauses.

Christian Wakeford (Bury South) (Lab)

Will my hon. Friend give way?

Lucy Powell

I will give way one last time.

Christian Wakeford

I thank my hon. Friend. The Government have set out the priority offences in schedule 7 to the Bill, but legal harms have clearly not been specified. Given the torrent of racist, antisemitic and misogynistic abuse that grows every single day, does my hon. Friend know why the Bill has not been made more cohesive with a list of core legal harms, allowing for emerging threats to be dealt with in secondary legislation?

--- Later in debate ---
Lucy Powell

I will come on to some of those issues. My hon. Friend makes a valid point.

I fear the Government’s current solution to the balance between free speech and regulation will please no one and takes us down an unhelpful rabbit hole. Some believe the Bill will stifle free speech, with platforms over-zealously taking down legitimate political and other views. In response, the Government have put in what they consider to be protections for freedom of speech and have committed to setting out an exhaustive list of “legal but harmful” content, thus relying almost entirely on a “take down content” approach, which many will still see as Government overreach.

On the other hand, those who want harmful outcomes addressed through stronger regulation are left arguing over a yet-to-be-published list of Government-determined harmful content. This content-driven approach moves us in the wrong direction away from the “duty of care” principles the Bill is supposed to enshrine. The real solution is a systems approach based on outcomes, which would not only solve the free speech question, but make the Bill overall much stronger.

What does that mean in practice? Essentially, rather than going after individual content, go after the business models, systems and policies that drive the impact of such harms—[Interruption.] The Minister for Security and Borders, the right hon. Member for East Hampshire (Damian Hinds), says from a sedentary position that that is what the Bill does, but none of the leading experts in the field think the same. He should talk to some of them before shouting at me.

The business models of most social media companies are currently based on engagement, as my hon. Friend the Member for Liverpool, Walton (Dan Carden) outlined. The more engagement, the more money they make, which rewards controversy, sensationalism and fake news. A post containing a racist slur or anti-vax comment that nobody notices, shares or reads is significantly less harmful than a post that quickly goes viral. A collective pile-on can have a profoundly harmful effect on the young person on the receiving end, even though most of the individual posts would not meet the threshold of harm.

Matt Rodda (Reading East) (Lab)

Will my hon. Friend give way on that point?

Lucy Powell

I will not, sorry. Facebook whistleblower Frances Haugen, whom I had the privilege of meeting, cited to the Joint Committee on the draft Online Safety Bill many examples of Facebook’s models and algorithms making things much worse. Had the Government chosen to follow the Joint Committee’s recommendations for a systems-based approach rather than a content-driven one, the Bill would be stronger and concerns about free speech would be reduced.

Lucy Powell

I am sorry, but too many people want to speak. Members should talk to their business managers, who have cut—[Interruption.] I know the hon. Gentleman was Chair of the Committee—[Interruption.]

Madam Deputy Speaker (Dame Eleanor Laing)

Order. The hon. Lady is not giving way. Let us get on with the debate.

Lucy Powell

The business managers have failed everybody on both sides given the time available.

A systems-based approach also has the benefit of tackling the things that platforms can control, such as how content spreads, rather than what they cannot control, such as what people post. We would avoid the cul-de-sac of arguing over the definitions of what content is or is not harmful, and instead go straight to the impact. I urge the Government to adopt the recommendations that have been made consistently to focus the Bill on systems and models, not simply on content.

Turning to other aspects of the Bill, I note that key issues with its effectiveness remain. The first relates to protecting children. As any parent will know, children face significant risks online, from poor body image, bullying and sexist trolling to the most extreme grooming and child abuse, which is, tragically, on the rise. This Bill is an important opportunity to make the internet a safe place for children. It sets out duties on platforms to prevent children from encountering illegal, harmful or pornographic content. That is all very welcome.

However, despite some of the Government’s ambitious claims, the Bill still falls short of fully protecting children. As the National Society for the Prevention of Cruelty to Children argues, the Government have failed to grasp the dynamics of online child abuse and grooming—[Interruption.] Again, I am being heckled from the Front Bench, but if Ministers engage with the children’s charities, they will find a different response. For example—[Interruption.] Yes, but they are not coming out in support of the Bill, are they? For example, it is well evidenced that abusers will often first interact with children on open sites and then move to encrypted platforms. The Government should require platforms to collaborate to reduce harm to children, prevent abuse from being displaced and close loopholes that let abusers advertise to each other in plain sight.

The second issue is illegal activity. We can all agree that what is illegal offline should be illegal online, and all platforms will be required to remove illegal content relating to terrorism, child sexual abuse and a range of other serious offences. It is welcome that the Government have set out an expanded list, but they can and must go further. Fraud was the single biggest crime in the UK last year, yet the Business Secretary dismissed it as not affecting people’s everyday lives.

The approach to fraud in this Bill has been a bit like the hokey-cokey: the White Paper said it was out, then it was in, then it was out again in the draft Bill, and finally it is in again, but not for the smaller sites or the search services. The Government should be using every opportunity to make it harder for scammers to exploit people online, backed up by tough laws and enforcement. What is more, the scope of this Bill still leaves out too many of the Law Commission’s recommendations on online crimes.

The third issue is disinformation. The war in Ukraine has unleashed Putin’s propaganda machine once again. That comes after the co-ordinated campaign by Russia to discredit the truth about the Sergei Skripal poisonings. Many other groups have watched and learned: from covid anti-vaxxers to climate change deniers, the internet is rife with dangerous disinformation. The Government have set up a number of units to tackle disinformation and claim to be working with social media companies to take it down. However, that is opaque and far from optimal. The only mention of disinformation in the Bill is that a committee should publish a report. That is far from enough.

Returning to my earlier point, it is the business models and systems of social media companies that create a powerful tool for disinformation and false propaganda to flourish. Being a covid vaccine sceptic is one thing, but being able to quickly share false evidence dressed up as science to millions of people within hours is a completely different thing. It is the power of the platform that facilitates that, and it is the business models that encourage it. This Bill hardly begins to tackle those societal and democratic harms.

The fourth issue is online abuse. From racism to incels, social media has become a hotbed for hate. I agree with the Secretary of State that that has poisoned public life. I welcome steps to tackle anonymous abuse. However, we still do not know what the Government will designate as legal but harmful, which makes it very difficult to assess whether the Bill goes far enough, or indeed too far. I worry that those definitions are left entirely to the Secretary of State to determine. A particularly prevalent and pernicious form of online hate is misogyny, but violence against women and girls is not mentioned at all in the Bill—a serious oversight.

The decision on which platforms will be regulated by the Bill is also arbitrary and flawed. Only the largest platforms will be required to tackle harmful content, yet smaller platforms, which can still have a significant, highly motivated, well-organised and particularly harmful user base, will not. Ofcom should regulate based on risk, not just on size.

The fifth issue is that the regulator and the public need the teeth to take on the big tech companies, with all the lawyers they can afford. It is a David and Goliath situation. The Bill gives Ofcom powers to investigate companies and fine them up to 10% of their turnover, and there are some measures to help individual users. However, if bosses in Silicon Valley are to sit up and take notice of this Bill, it must go further. It should include stronger criminal liability, protections for whistleblowers, a meaningful ombudsman for individuals, and a route to sue companies through the courts.

The final issue is future-proofing, which we have heard something about already. This Bill is a step forward in dealing with the likes of Twitter, Facebook and Instagram—although it must be said that many companies have already begun to get their house in order ahead of any legislation—but it will have taken nearly six years for the Bill to appear on the statute book.

Since the Bill was first announced, TikTok has emerged on the scene, and Facebook has renamed itself Meta. The metaverse is already posing dangers to children, with virtual reality chat rooms allowing them to mix freely with predatory adults. Social media platforms are also adapting their business models to avoid regulation; Twitter, for example, says that it will decentralise and outsource moderation. There is a real danger that when the Bill finally comes into effect, it will already be out of date. A duty of care approach, focused on outcomes rather than content, would create a much more dynamic system of regulation, able to adapt to new technologies and platforms.

In conclusion, social media companies are now so powerful and pervasive that regulating them is long overdue. Everyone agrees that the Bill should reduce harm to children and prevent illegal activity online, yet there are serious loopholes, as I have laid out. Most of all, the focus on individual content rather than business models, outcomes and algorithms will leave too many grey areas and black spots, and will not satisfy either side in the free speech debate.

Despite full prelegislative scrutiny, the Government have been disappointingly reluctant to accept those bigger recommendations. In fact, they are going further in the wrong direction. As the Bill progresses through the House, we will work closely with Ministers to improve and strengthen it, to ensure that it truly becomes a piece of world-leading legislation.

Several hon. Members rose—