Online Safety Bill Debate
Alex Davies-Jones (Labour - Pontypridd), debate with the Department for Digital, Culture, Media & Sport
Commons Chamber
The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—
That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.
I talked about harmful communications, recognising that we could leave a potential gap in the criminal law. The Government have also decided not to repeal existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Along with planned changes to the harmful communications offence, we are making a number of additional changes to the Bill—that will come later, Mr Speaker, and I will not tread too much into that, as it includes the removal of the adult safety duties, often referred to as the “legal but harmful” provision. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.
The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make clear that we expect platforms to use age assurance technology when identifying the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access by those below their minimum age, and to enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.
That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.
It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.
We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.
It may be a drop in the ocean to the likes of Elon Musk or Mark Zuckerberg—these multibillionaires who are taking over social media and using it as their personal plaything. They are not going to listen to fines; the only way they are going to listen, sit up and take notice is if criminal liability puts their necks on the line and makes them answer for some of the huge failures of which they are aware.
The right hon. and learned Member says that he shares the sentiment of the amendment but feels it could be the wrong approach. We have an opportunity here to put things right and put responsibility where it belongs: with the tech companies, the platforms and the managers responsible. In a similar way to what happens in the financial sector or in health and safety regulation, it is vital that people be held responsible for issues on their platforms. We feel that criminal liability will make that happen.
May I intervene on a point of fact? The hon. Lady says that fines are a drop in the ocean. The turnover of Google is $69 billion; 10% of that is just shy of $7 billion. That is not a drop in the ocean, even to Elon Musk.
We are looking at putting people themselves on the line. It needs to be something that people actually care about. Money does not matter to these people, as we have seen with the likes of Google, Elon Musk and Mark Zuckerberg; what matters to them is actually being held to account. Money may matter to Government Members, but it will be criminal liability that causes people to sit up, listen and take responsibility.
While I am not generally in the habit of predicting the Minister’s response or indeed his motives—although my job would be a hell of a lot easier if I did—I am confident that he will try to peddle the line that it was the Government who introduced director liability for compliance failures in an earlier draft of the Bill. Let me be crystal clear in making this point, because it is important. The Bill, in its current form, makes individuals at the top of companies personally liable only when a platform fails to supply information to Ofcom, which misses the point entirely. Directors must be held personally liable when safety duties are breached. That really is quite simple, and I am confident that it would be effective in tackling harm online much more widely.
We also support new clause 28, which seeks to establish an advocacy body to represent the interests of children online. It is intended to deal with a glaring omission from the Bill, which means that children who experience online sexual abuse will receive fewer statutory user advocacy protections than users of a post office or even passengers on a bus. The Minister must know that that is wrong and, given his Government’s so-called commitment to protecting children, I hope he will carefully consider a new clause which is supported by Members on both sides of the House as well as the brilliant National Society for the Prevention of Cruelty to Children. In rejecting new clause 28, the Government would be denying vulnerable children a strong, authoritative voice to represent them directly, so I am keen to hear the Minister’s justification for doing so, if that is indeed his plan.
Members will have noted the bundle of amendments tabled by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) relating to Labour’s concerns about the unnecessary powers to overrule Ofcom that the Bill, as currently drafted, gives the Secretary of State of the day. During Committee evidence sessions, we heard from Will Perrin of the Carnegie UK Trust, who, as Members will know, is an incredibly knowledgeable voice when it comes to internet regulation. He expressed concern about the fact that, in comparison with other regulatory frameworks such as those in place for advertising, the Bill
“goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 117.]
Labour shares that concern. Ofcom must be truly independent if it is to be an effective regulator. Surely we have to trust it to undertake logical processes, rooted in evidence, to arrive at decisions once this regime is finally up and running. It is therefore hard to understand how the Government can justify direct interference, and I hope that the Minister will seriously consider amendments 23 to 30, 32, and 35 to 41.
Before I address Labour’s main concerns about the Government’s proposed changes to the Bill, I want to record our support for new clauses 29 and 30, which seek to bring media literacy duties back into the scope of the Bill. As we all know, media literacy is the first line of defence when it comes to protecting ourselves against false information online. Prevention is always better than cure. Whether it is a question of viral conspiracy theories or Russian disinformation, Labour fears that the Government’s approach to internet regulation will create a two-tier internet, leaving some more vulnerable than others.
However, I am sorry to say that the gaps in this Bill do not stop there. I was pleased to see that my hon. Friend the Member for Rotherham (Sarah Champion) had tabled new clause 54, which asks the Government to formally consider the impact that the use of virtual private networks will have on Ofcom’s ability to enforce its powers. This touches on the issue of future-proofing, which Labour has raised repeatedly in debates on the Bill. As we have heard from a number of Members, the tech industry is evolving rapidly, with concepts such as the metaverse changing the way in which we will all interact with the internet in the future. When the Bill was first introduced, TikTok was not even a platform. I hope the Minister can reassure us that the Bill will be flexible enough to deal with those challenges head-on; after all, we have waited far too long.
That brings me to what Labour considers to be an incredible U-turn by the Government relating to amendment 239, which seeks to remove the new offence of harmful communications from the Bill entirely. As Members will know, the communications offence was designed by the Law Commission with the intention of introducing a criminal threshold for the most dangerous online harms. Indeed, in Committee it was welcome to hear the then Minister—the present Minister for Crime, Policing and Fire, the right hon. Member for Croydon South (Chris Philp)—being so positive about the Government’s consultation with the commission. In relation to clause 151, which concerns the communications offences, he even said:
“The Law Commission is the expert in this kind of thing…and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do.” ––[Official Report, Online Safety Public Bill Committee, 21 June 2022; c. 558.]
Less than six months down the line, we are seeing yet another U-turn from this Government, who are doing precisely the opposite of what was promised.
Removing these communications offences from the Bill will have real-life consequences. It will mean that harmful online trends such as hoax bomb threats, abusive social media pile-ons and fake news such as encouraging people to drink bleach to cure covid will be allowed to spread online without any consequence.
No Jewish person should have to log online and see Hitler worship, but what we have seen in recent weeks from Kanye West has been nothing short of disgusting, from him saying “I love Hitler” to inciting online pile-ons against Jewish people, and this is magnified by the sheer number of his followers, with Jews actually being attacked on the streets in the US. Does my hon. Friend agree that the Government’s decision to drop the “legal but harmful” measures from the Bill will allow this deeply offensive and troubling behaviour to continue?
I thank my hon. Friend for that important and powerful intervention. Let us be clear: everything that Kanye West said online is completely abhorrent and has no place in our society. It is not for any of us to glorify Hitler, his comments or his actions; that is absolutely abhorrent and it should never be online. Sadly, however, that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the Government’s swathes of changes to the Bill. Kanye West has 30 million followers online, who will be able to look at, share and glorify that content without any consequence, because it will remain freely available online.
Further to that point, it is not just that some of the content will be deeply offensive to the Jewish community; it could also harm wider society. Some further examples of postings that would be considered legal but harmful are likening vaccination efforts to Nazi death camps and alleging that NHS nurses should stand trial for genocide. Does my hon. Friend not agree that the changes the Government are now proposing will lead to enormous and very damaging impacts right through society?
My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.
We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.
I call the Chair of the Select Committee.