Thursday 13th January 2022

Commons Chamber
Alex Davies-Jones (Pontypridd) (Lab)

I join the House in congratulating the hon. Member for Folkestone and Hythe (Damian Collins) on convening the debate and on all his hard and excellent work in leading the Joint Committee. This has been one of the most interesting debates in this place that I have ever participated in and, given the urgent need to improve online safety, it could not have come at a more crucial time. I am grateful to all members of the Joint Committee, who had a tough job in cleaning up this confusing and long-delayed Bill. It has been a very long time coming.

Current legislation on the online space is from the analogue age and lags far behind the digital age in which most of us now live. The Bill has the potential to be the world-leading legislation that we need it to be, sending a message to social media giants who, for too long, have got away with allowing—and in some cases even promoting—harmful content online. That cannot be allowed to continue. Most of us recognise the huge impact of the Government’s failure to regulate the online space, notably on young people, yet still, as the Joint Committee report suggests, the draft legislation is not ambitious or broad enough in scope to tackle the issues at their root.

Let me be clear that some of the trends emerging online can be extremely detrimental to both physical and mental wellbeing. We have all heard desperately tragic stories involving young people. We heard such stories today from the hon. Member for Stourbridge (Suzanne Webb) and from my hon. Friends the Members for Leeds East (Richard Burgon) and for Reading East (Matt Rodda) about young people harming themselves, taking their own lives and, in some cases, even being murdered at the hands of social media. I pay tribute to Molly Russell, Joe Nihill and Olly Stephens and to their families and friends for campaigning to make social media a much safer place so that no other young people have to go through what they did.

We all know about the other harms faced online, from the spread of fake news—including dangerous anti-vax content—to financial scams offering supposedly lucrative incentives that can be hard to decipher even for the most internet-literate of people. However, despite years of warnings from the Opposition alongside campaigning groups and charities, the Government have so far failed to take robust action. In my constituency, whenever I meet young people through a school visit or a community group, the conversations almost always centre around a common interest: social media. I know that those sentiments are not unique to my area. That is why it is so utterly wrong that tech giants have been left unaccountable for so long. Labour therefore welcomes the Joint Committee’s recommendations calling for the Government to hold online tech giants to account for the design and operation of their systems. We firmly believe that regulation should be governed through legislation and by an independent regulator instead of by a distant body in Silicon Valley.

In recent weeks, we have been reminded once again of the real power and influence of material shared online in generating and spreading fake news. In the pandemic, tackling dangerous anti-vax content is critical to vaccinating the unvaccinated. With the majority of people requiring serious care in hospital for coronavirus being unvaccinated, Government inaction and complacency in tackling dangerous anti-vax sentiment is costing lives and putting pressure on the NHS.

Labour have repeatedly called on the Government to work cross-party to introduce emergency legislation that includes financial and criminal penalties for companies who fail to act to stamp out dangerous anti-vax content, yet once again they have failed to act. They must stand up to big tech companies. As my hon. Friend the Member for Newcastle upon Tyne Central (Chi Onwurah) said, we must ignore those companies’ excuses and introduce financial and criminal penalties for failures that lead to serious harm. That is echoed in the Joint Committee’s report, which recommends that responsibilities of individuals at the very top of online tech organisations go further, with full accountability for the messages that those companies are hosting and, at times, even promoting.

In our dialogue about the responsibilities of tech firms, we must remember that we need to consider the role of so-called niche organisations, too. In line with that, Labour commends the Committee’s recognition of concerns raised by Hope not Hate and the Antisemitism Policy Trust, among others, about the harms caused by these alternative platforms. Our party leader raised concerns about one such example—Telegram—during Prime Minister’s questions, and there are numerous other platforms on which misogyny, racism and homophobia run rampant, including BitChute, Gab, BrandNewTube and 4chan, to name just a few. It is absolutely right that the Government look again at categorisation so that harm caused on and by such platforms is assessed by risk and not the current determinants of size and functionality.

The Committee has also rightly noted that, while search does not operate in the same way as user-to-user platforms do, harm can still be caused through algorithmic programming and auto-prompts. We therefore urge the Government to include search engines and search services within the regulatory scope of the Bill, recognising that they, too, have a role to play in addressing not just illegal content but also legal and harmful content.

This brings me to another excellent recommendation raised by the Joint Committee. Notably excluded from the draft legislation is the ability to regulate and hold social media giants accountable for paid-for advertising hosted on their websites. We have heard from a host of Members from across the House about how important it is that this be included in the legislation. The Committee concluded that

“The exclusion of paid-for advertising from the scope of the Online Safety Bill would obstruct the Government’s stated aim of tackling online fraud and activity that creates a risk of harm more generally.”

The Government have repeatedly claimed that regulating paid-for advertisements is beyond the scope of this legislation and that instead it will be the role of the online advertising programme to manage how adverts are monitored. But we are now almost three years down the line since the OAP was first mentioned, and we still have little more than a press release and an outdated call for evidence to confirm exactly how the programme will function.

As right hon. and hon. Members, including my right hon. Friend the Member for East Ham (Stephen Timms), have said, the Government must adopt the Joint Committee’s recommendation and expand the Bill to include paid-for adverts, which are central to so many instances of fraud and harm online more generally. Crucial to this debate, therefore, is the need to define exactly what constitutes harm. As the Joint Committee recommends, the Government must publish their definition as soon as possible. The concept of harm underpins how this legislation will be drafted and eventually enacted. It is vital that the Government’s definition is published before the Bill is introduced to ensure that Ofcom, as the regulator, is fully prepared and resourced for its role. I hope that the Minister will be able to give the House an update on that point in his comments.

I move on to address some of the detail in the Committee’s recommendations. I am pleased that the Committee has addressed many of the concerns raised about the complexity of the Bill in its current form, and Labour supports the move to bring our focus firmly back to the regulation of social media giants’ systems and processes. The Joint Committee’s report rightly reflects the strong concerns about the scale of the Secretary of State’s powers in the Bill, and we have heard from other Members about concerns regarding the scope of the regulator’s independence in many areas. We also know very little about the disinformation and misinformation unit, on which far more detail is required. Madam Deputy Speaker, I could go on. We know that this legislation is vital.

To conclude, I believe that, without the big changes recommended by the Joint Committee alongside a faster-paced and increased understanding of the wider issues, more people will find themselves at risk of harms online. The danger is that, even with the excellent recommendations of the Joint Committee, the Online Safety Bill will be inadequate and simply out of date when it eventually becomes law. The Government have a once-in-a-generation opportunity to change that, and I urge the Minister to take seriously the concerns raised by Members in the House.