Online Safety Bill Debate
Commons Chamber

First and foremost, as we approach the remaining stages of this Bill, we must remember its importance. As MPs, we hear stories of the dangers of online harms that some would not believe. I think it is fair to say that those of my generation were very fortunate to grow up in a world where social media did not exist; as I just said to my hon. Friend the Member for South Antrim (Paul Girvan) a few minutes ago, I am really glad I did not have to go through that. Social media is so accessible nowadays and children are being socialised in that environment, so it is imperative that we do all we can to ensure that they are protected and looked after.
I will take a moment to discuss the importance of new clause 2. There are many ongoing discussions about where the responsibility lies when it comes to the regulation of online harms, but new clause 2 ultimately would make it an offence for service providers not to comply with their safety duties in protecting children.
The hon. Member for Penistone and Stocksbridge (Miriam Cates) has described the world of social media as
“a modern Wild West, a lawless and predatory environment”—
how true those words are. I put on record my thanks to her and to the hon. Member for Stone (Sir William Cash) for all their endeavours to deliver change—they have both been successful, and I say well done to them.
Some 3,500 online child sexual offences are recorded by the police every month. Every month, 1.4 million UK children access online porn, the majority of which is degrading, abusive and violent. As drafted, the Bill would not hold tech bosses individually liable for their failures to keep children and the public safe. New clause 2 must be supported, and I am very pleased that the Government are minded to accept it.
Fines are simply not enough. If we fail to address that in the Bill, this House will be liable, because senior tech bosses seem not to be. I am minded, as is my party, to support the official Opposition’s new clause 4, “Safety duties protecting adults and society: minimum standards for terms of service”.
New clause 8 is also important. Over the last couple of years, my office has received numerous stories from parents who have watched their children deal with the consequences of an eating disorder. I have a very close friend whose 16-year-old daughter is experiencing that at the moment. It is very hard on the family. Social media pages are just brutal. I have heard of TikTok pages glorifying bulimia and anorexia, and Instagram pages providing tips for self-harm—that is horrendous. It is important that we do not pick and choose which forms of harm are written into the Bill. It is not fair that some forms of harm are addressed under the Bill or referred to Ofcom while others are simply ignored.
Communication and engagement with third-party stakeholders are the way to tackle this matter. Let us take, for example, a social media page that was started to comment on eating disorders and is generally unsafe and unhelpful to young people who are struggling. Such a page should be flagged to healthcare professionals, including GPs and nurses, who know best. If we can do that through the Bill, it would be a step in the right direction. On balance, we argue that the definition of harmful content should be set out in regulations, which should be informed by proper stakeholder engagement.
I will touch briefly on new clause 3, which would require providers to include features that child users may use or apply if they wish to increase their control over harmful content. Such features are currently restricted to adults. Although we understand the need to empower young people to take responsibility for, and be knowledgeable about, the decisions they make, we recognise the value of targeting such a duty at adults, many of whom hold their parental responsibilities very close to their hearts. More often than not, that is just as important as regulation.
To conclude, we have seen too many suicides and too much danger emerge from the online world and social media. Social media has the potential to be an educational and accessible space for all, including young people. However, there must be safety precautions for the sake of young people, who can very easily fall into traps, as we are all aware. In my constituency, we have had a spate of suicides among young people—it seems to be in a clique of friends, and that really worries me. This is all about regulation: ensuring that harmful content is dealt with and removed, and that the right, well-informed people are making the decisions about what is and is not safe. I have faith that the Minister, the Government and the Bill will address the outstanding issues. The Bill will not stop every online evil, but it will, as the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) said, make being online safer. If the Bill does that, we can support it, because that would be truly good news.
Thank you, Mr Deputy Speaker—if I may say so, it is a pleasure to see my east Kent neighbour in the Chair.
I will speak to amendment 82, which was tabled in my name, and in support of new clause 2 and amendment 83. At the last Report stage I spoke at some length on an associated amendment, and I am conscious that many Members wish to speak, so I will keep my comments brief.
I am grateful to the many right hon. and hon. Friends who supported my amendment, whether or not their names appear next to it on the amendment paper. I thank in particular my right hon. Friend the Member for South Holland and The Deepings (Sir John Hayes) for his considerable assistance in securing changes.
Amendment 82 sets out a requirement to remove content that may result in serious harm or death to a child while crossing the English Channel in small boats. The risk of harm or death from Channel crossings is very real. Four children have drowned in the past 15 months, with many more harmed through exposure to petrol and saltwater burns, and put in danger here and abroad by organised crime and people traffickers. Social media is playing a direct role in this criminal enterprise. It must be brought to book, and the videos and other content that encourage such activity must be taken down.
I thank my right hon. Friend for those comments. I will wrap up shortly, Mr Deputy Speaker. On that point, I have said before that the use of algorithms on platforms is, to my mind, very similar to addictive drugs: they get people addicted, change their behaviours, cut them off from their friends and family, and then direct them in ways that we would not allow if we could wrap our arms around them and stop it. Yet this is happening in their own bedrooms, classrooms and playgrounds.
I applaud the work on the Bill. Yes, there are ways in which it could be improved, and a committee that looks at ways to improve it as the dynamics of social media change will be essential. However, letting the Bill go to the other place will be a major step forward in protecting our young people, both now and in the future.