Online Harms Debate
Alex Davies-Jones (Labour, Pontypridd)
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall. Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister gives the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record.
It is always a pleasure to serve under your chairship, Mr Dowd. I am grateful to be here representing the Opposition in this important debate. This is the first time I have overwhelmingly agreed with every single excellent contribution in this Chamber. That goes to show that, as my friend the hon. Member for Aberdeen North (Kirsty Blackman) said, this does cross party lines and is not a political issue—at least, it should not be. There is huge cross-party consensus in this place, and the other place, about getting the Bill on the statute book and in action to protect everybody on the internet.
I pay particular tribute to the right hon. Member for East Hampshire (Damian Hinds) who, as a former Education Secretary, comes to this debate with a huge breadth of knowledge and experience. He is a former colleague of mine; we sat together on the Digital, Culture, Media and Sport Committee, where we scrutinised this legislation and these issues in depth. I know it is an issue he cares very deeply about. I echo his and other Members’ sentiments on the reappointment of the Minister, who brings a breadth of experience to the role and cares deeply about these issues. I am very pleased to see him in his post.
Regulation to tackle online abuse was first promised many years ago. In the initial White Paper, the Conservatives promised world-leading legislation. However, when the draft Online Safety Bill was published in May 2021, those proposals were watered down and incomplete. The Bill is no longer world-leading. Since this Government first announced their intention to regulate the online space, seven jurisdictions have introduced online safety laws. Although those pieces of legislation are not perfect, they are in place. In that time, online crime has exploded, child sex abuse online has become rife and scams have continued to proliferate. The Minister knows that, and he may share my frustration and genuine concern at the cost that the delay is causing.
I recognise that we are living in turbulent political times, but when it comes to online harms, particularly in the context of children, we cannot afford to wait. Last week, the coroner’s findings in the inquest into the tragic death of Molly Russell brought into sharp relief the serious impact that harmful social media content is having on young people across the UK every day. Let me be clear: Molly Russell’s death is a horrific tragedy. I pay tribute to her father Ian and her family, who have, in the most harrowing of circumstances, managed to channel their energy into tireless campaigning that has quite rightly made us all sit up and listen.
Molly’s untimely death, to which, as the coroner announced last week, harmful social media content was a contributing factor, has stunned us all. It should force action from the Government. While I was pleased to note in the business statement last week that the Online Safety Bill will return to the House on Tuesday, I plead with the Minister to work with Labour, the SNP and all parties to get it through, with some important amendments. Without measures on legal but harmful content—or harmful but legal, as we are now referring to it—it is not likely that suicide and self-harm content such as that faced online by Molly or by Joe Nihill, the constituent of my hon. Friend the Member for Leeds East (Richard Burgon), will be dealt with.
Enough is enough. Children and adults—all of us—need to be kept safe online. Labour has long campaigned for stronger protections online for children and the public, to keep people safe, secure our democracy and ensure that everyone is treated with decency and respect. There is broad consensus that social media companies have failed to regulate themselves. That is why I urge the Minister to support our move to ensure that those at the top of multi-million-pound social media companies are held personally accountable for failures beyond those currently in the Bill relating to information notices.
The Online Safety Bill is our opportunity to do better. I am keen to understand why the Government have failed to introduce or support personal criminal liability measures for senior leaders who fall short of their statutory duty to protect us online. Such measures exist in other areas, such as financial services. The same goes for the Government’s approach to the duties of care for adults under the Bill—what we call harmful but legal. The Minister knows that the Opposition have concerns over the direction of the Bill, as do other Members here today.
Freedom of speech is vital to our democracy, but it absolutely must not come at a harmful cost. The Bill Committee, of which I was a member, heard about multiple racist, antisemitic, extremist and otherwise harmful publishers, from Holocaust deniers to white supremacists, that would stand to benefit from the recognised news publisher exemption as currently drafted, either overnight or by making minor administrative changes.
In Committee, in response to an amendment from my hon. Friend the Member for Batley and Spen (Kim Leadbeater), the Minister promised the concession that Russia Today would be excluded from the recognised news publisher exemption. I am pleased that the Government have indeed promised to exclude sanctioned news titles such as Russia Today through an amendment that they have said they will introduce at a later stage, but that does not go far enough. Disinformation outlets rarely have the profile of Russia Today. Often they operate more discreetly and are less likely to attract sanctions. For those reasons, the Government must go further. As a priority, we must ensure that the current exemption cannot be exploited by bad actors. The Government must not give a free pass to those propagating racist or misogynistic harm and abuse.
Aside from freedom of speech, Members have raised myriad harms that appear online, many of which we tried to tackle with amendments in Committee. A robust corporate and senior management liability scheme for routine failures was rejected. Basic duties that would have meant that social media companies had to publish their own risk assessments were rejected. Amendments to bring into scope small but high-harm platforms that we have heard about today were also rejected. The Government would not even support moves to name violence against women and girls as a harm in the Bill, despite the huge amount of evidence suggesting that women and people of colour are more at risk.
Recent research from the Centre for Countering Digital Hate has found that Instagram fails to act on nine out of 10 reports of misogyny received via its direct messages. One in 15 DMs sent to women by strangers was abusive or contained violent and sexual images. Of 330 examples reported to Twitter and Instagram, only nine accounts were removed. More than half of the accounts that were reported continued to offend. The Government are letting down survivors and putting countless women and girls at risk of gendered harms, such as image-based sexual abuse—so-called revenge porn—rape threats, doxxing and tech abuse perpetrated by an abusive partner. What more will it take for meaningful change to be made?
I hope the Minister will address those specific omissions. Although I recognise that he was not in his role as the Bill progressed in Committee, he is in the unfortunate position of having to pick up the pieces. I hope he will today give us some reassurances, which I know many of us are seeking.
I must also raise with the Minister once again the issue of online discriminatory abuse, particularly in the context of sport. In oral questions I recently raised the very serious problem of rising discrimination faced not just by players but by their families, referees, coaches, pundits, fans and others. I know the hon. Member for Barrow and Furness (Simon Fell) tried to make this point in his contribution. Abuse and harm faced online are not virtual; they are real and have a lasting impact. Labour Members believe it is essential that tech firms are held to account when harmful abuse and criminal behaviour appear on, are amplified by and therefore flourish on their platforms.
There are genuine issues with the Government’s approach to the so-called legal but harmful provisions in the Bill, which will, in essence, fail to capture some of the most harmful content out there. We have long called for a more systems-based approach to the Bill, and we need only look at the research from Kick It Out to recognise the extent of the issue. That organisation used artificial intelligence to identify violent abuse that falls below the criminal thresholds outlined in the current draft of the Bill. There is no need for me to repeat the vile language in this place today. We have only to cast our minds back to Euro 2020 and the disgraceful abuse—and more—targeted at members of the England team to know the realities of the situation online. But it does not have to be this way.
Labour colleagues have repeatedly raised concerns that the AI moderation practices used by the big social media giants are seemingly incapable of adapting to the rapid rate at which new internet language, emojis and euphemisms develop. It is wrong of the Government to pursue an online harms agenda so clearly focused on content moderation, rather than on the business models that underpin those harmful practices. Worse still, we now know that those business models drive a wide range of the harmful content that we see online.
The Times recently reported that TikTok users were easily able to evade safety filters and share suicide and self-harm posts by using slang terms and simple misspellings. Some of the content in question had been online for more than a year, despite including direct advice on how to self-harm. TikTok’s community guidelines forbid content that depicts or encourages suicide or self-harm, yet such content remains online for everyone to see.
We have concerns that the Government’s current approach will have little impact unless the big firms are held more accountable. What we really need is a consistent approach from the Government, and a commitment to tackling myriad online harms that is fit for the modern age and for emerging tech, too. There is a widespread political consensus on the importance of getting this right, and the Minister can be assured of success if only his Department is prepared to listen.
It is a pleasure to serve under your chairmanship, Mr Dowd. This is my first appearance as a Minister in Westminster Hall, and your first appearance in the Chair, so we are both making our debuts. I hope we have long and successful reigns in our respective roles.
It is a great pleasure to respond to the debate secured by my right hon. Friend the Member for East Hampshire (Damian Hinds) and to his excellent opening speech. He feels strongly about these issues—as he did both in Government and previously as a member of the Digital, Culture, Media and Sport Committee—and he has spoken up about them. I enjoyed working with him when he was a Minister at the Home Office and I chaired the prelegislative scrutiny Committee, which discussed many important features of the Online Safety Bill. One feature of the Bill, of course, is the inclusion of measures on fraud and scam advertising, which was a recommendation of the Joint Committee. It made my life easier that, by the time I became a Minister in the Department, the Government had already accepted that recommendation and brought those measures into the Bill, and I will come on to talk about that in more detail.
My right hon. Friend, the hon. Member for Pontypridd (Alex Davies-Jones) and other Members raised the case of Molly Russell, and it is important to reflect on that case. I share the sentiments expressed about the tragedy of Molly’s death, its avoidable nature and the tireless work of the Russell family, and particularly her father, Ian Russell, whom I have met several times to discuss this. The Russell family pursued a very difficult and complicated case, which required a huge release of evidence from the social media companies, particularly Instagram and Pinterest, to demonstrate the sort of content to which Molly Russell was exposed.
One of the things Ian Russell talks about is the work done by the investigating officers in the coroner’s inquest. Tellingly, the inquest restricted the amount of time that people could be exposed to the content that Molly was exposed to, and ensured that police officers who were investigating were not doing so on their own. Yet that was content that a vulnerable teenage girl saw repeatedly, on her own, in isolation from those who could have helped her.
When online safety issues are raised with social media companies, they say things like, “We make this stuff very hard to find.” The lived experience of most teenagers is not searching for such material; it is such material being selected by the platforms and targeted at the user. When someone opens TikTok, their first exposure is not to content that they have searched for; it is to content recommended to them by TikTok, which data-profiles the user and chooses things that will engage them. Those engagement-based business models are at the heart of the way the Online Safety Bill works and has to work. If platforms choose to recommend content to users to increase their engagement with the platform, they make a business decision. They are selecting content that they think will make a user want to return more frequently and stay on the platform for longer. That is how free apps make money from advertising: by driving engagement.
It is a fair criticism that, at times, the platforms are not effective enough at recognising the kinds of engagement tools they are using, the content that is used to engage people and the harm that that can do. For a vulnerable person, the sad truth is that their vulnerability will probably be detected by the AI that drives the recommendation tools. That person is far more likely to be exposed to content that will make their vulnerabilities worse. That is how a vulnerable teenage girl can be held by the hand—by an app’s AI recommendation tools—and walked from depression to self-harm and worse. That is why regulating online safety is so important and why the protection of children is so fundamental to the Bill. As hon. Members have rightly said, we must also ensure that we protect adults from some of the illegal and harmful activity on the platforms and hold those platforms to account for the business model they have created.
I take exception to the suggestion from the hon. Member for Pontypridd that this is a content-moderation Bill. It is not; it is a systems Bill. The content that we cite, and often refer to, is an exemplar of the problem—an exemplar of things going wrong. On all the different areas of harm that are listed in the Bill, particularly the priority offences in schedule 7, our challenge to the companies is: “You have to demonstrate to the regulator that you have appropriate systems in place to identify this content, to ensure that you are not amplifying or recommending it and to mitigate it.” Mitigation could be suppressing the content—not letting it be amplified by their tools—removing it altogether or taking action against the accounts that post it. It is the regulator’s job to work with the companies, assess the risk, create codes of practice and then hold the companies to account for how they work.
There is criminal liability for the companies if they refuse to co-operate with the regulator. If they refuse to share information or evidence asked for by the regulator, a named company director will be criminally liable. That was in the original Bill. The Joint Committee report recommended that that provision be commenced within months of the Bill coming into force, rather than after two years as originally planned. That is in the Bill today, and it is important that it is there so that companies know they have to comply with requests.
The Bill is world-leading in the sense that it goes further than other countries’ Bills but, as the hon. Member for Pontypridd says, other Bills have already been enacted elsewhere in the world. That is why it is important that we get on with this.
The Minister is right to say that we need to get on with this. I appreciate that he is not responsible for the business of this House, but his party and his Government are, so will he explain why the Bill has been pulled from the timetable next week, if it is such an important piece of legislation?
As the hon. Lady knows, I can speak to the Bill; I cannot speak to the business of the House—that is a matter for the business managers in the usual way. Department officials—some here and some back at the Department—have been working tirelessly on the Bill to ensure we can get it through in a timely fashion. I want to see it complete its Commons stages and go to the House of Lords as quickly as possible. Our target is to ensure that it receives safe passage in this Session of Parliament. Obviously, I cannot speak to the business of the House, which may alter as a consequence of changes to the Government.