Online Safety Bill Debate
Damian Collins (Conservative - Folkestone and Hythe)
(1 year, 11 months ago)
Commons Chamber

I am aware of that case, which is truly appalling and shocking. That is exactly why we need such protections in the Bill: to stop those cases proliferating online, to stop the platforms from choosing their own terms of service, and to give Ofcom real teeth, as a regulator, to take on those challenges.
Does the hon. Lady accept that the Bill does give Ofcom the power to set minimum safety standards based on the priority legal offences written into the Bill? That would cover almost all the worst kinds of offences, including child sexual exploitation, inciting violence and racial hatred, and so on. Those are the minimum safety standards that are set, and the Bill guarantees them.
What is not in those minimum safety standards is all the horrendous and harmful content that I have described: covid disinformation, harmful content from state actors, self-harm promotion, antisemitism, misogyny and the incel culture, all of which is proliferating online and being amplified by the algorithms. This set of minimum safety standards can be changed overnight.
As the hon. Lady knows, foreign-state disinformation is covered because it is part of the priority offences listed in the National Security Bill, so those accounts can be disabled. Everything that meets the criminal threshold is in this Bill because it is in the National Security Bill, as she knows. The criminal thresholds for all the offences she lists are set in schedule 7 of this Bill.
That is just the problem, though, isn’t it? A lot of those issues would not be covered by the minimum standards—that is why we have tabled new clause 4—because they do not currently meet the legal threshold. That is the problem. There is a grey area of incredibly harmful but legal content, which is proliferating online, being amplified by algorithms and by influencers—for want of a better word—and being fed to everybody online. That content is then shared incredibly widely, and that is what is causing harm and disinformation.
No, I will not. I need to make progress; we have a lot to cover and a lot of amendments, as I have outlined.
Under the terms of the Bill, platforms can issue whatever minimum standards they wish and then simply change them at will overnight. In tabling new clause 4, our intention is to ensure that the platforms are not able to avoid safety duties by changing their terms and conditions. As I have said, this group of amendments will give Ofcom the relevant teeth to act and keep everybody safe online.
We all recognise that there will be a learning curve for everyone involved once the legislation is enacted. We want to get that right, and the new clauses will ensure that platforms have specific duties to keep us safe. That is an important point, and I will continue to make it clear at every opportunity, because the platforms and providers have, for far too long, got away with zero regulation—nothing whatsoever—and enough is enough.
During the last Report stage, I made it clear that Labour considers individual liability essential to ensuring that online safety is taken seriously by online platforms. We have been calling for stronger criminal sanctions for months, and although we welcome some movement from the Government on that issue today, enforcement now rests on a narrower set of measures because the Government gutted much of the Bill before Christmas. That last-minute U-turn is another to add to a long list, but, to be frank, very little surprises me when it comes to this Government's approach to law-making.
I rise to speak in favour of new clause 4, on minimum standards. In particular, I shall restrict my remarks to minimum standards in respect of incel culture.
Colleagues will know of the tragedy that took place in Plymouth in 2021. Indeed, the former Home Secretary, the right hon. Member for Witham (Priti Patel), visited Plymouth to meet and have discussions with the people involved. I really want to rid the internet of the disgusting, festering incel culture that is capturing so many of our young people, especially young men. In particular, I want minimum standards to apply and to make sure that, on big and small platforms where there is a risk, those minimum standards include the recognition of incel content. At the moment, incel content is festering in the darkest corners of the internet, where young men are taught to channel their frustrations into an insidious hatred of women and to think of themselves as brothers in arms in a war against women. It is that serious.
In Parliament this morning I convened a group of expert stakeholders, including those from the Centre for Countering Digital Hate, Tech Against Terrorism, Moonshot, Girlguiding, the Antisemitism Policy Trust and the Internet Watch Foundation, to discuss the dangers of incel culture. I believe that incel culture is a growing threat online, with real-world consequences. Incels are targeting young men, young people and children to swell their numbers. Andrew Tate may not necessarily be an incel, but his type of hate and division is growing and is very popular online. He is not the only one, and the model of social media distribution that my right hon. Friend the Member for Barking (Dame Margaret Hodge) spoke about incentivises hate to be viewed, shared and indulged in.
This Bill does not remove incel content online and therefore does not prevent future tragedies. As chair of the all-party parliamentary group on social media, I want to see minimum standards to raise the internet out of the sewer. Where is the compulsion for online giants such as Facebook and YouTube to remove incel content? Five of the most popular incel channels on YouTube have racked up 140,000 subscribers and 24 million views between them, and YouTube is still platforming four of those five. Why? How can these channels apparently pass YouTube’s current terms and conditions? The content is truly harrowing. In these YouTube videos, men who have murdered women are described as saints and lauded in incel culture.
We know that incels use mainstream platforms such as YouTube to reel in unsuspecting young men—so-called normies—before linking them to their own small, specialist websites that show incel content. This is called breadcrumbing: driving traffic and audiences from mainstream platforms to smaller platforms—which will be outside the scope of category 1 provisions and therefore any minimum standards—where individuals start their journey to incel radicalisation.
I think we need to talk less about freedom of speech and more about freedom of reach. We need to talk about enabling fewer and fewer people to see that content, and about down-ranking sites with appalling content like this to increase the friction and reduce audience reach. Incel content not only includes sexist and misogynist material; it also frequently includes antisemitic, racist, homophobic and transphobic items layered on top of one another. However, without a "legal but harmful" provision, the Bill does nothing to force search engines to downrate harmful content. If it is to be online, it needs to be harder and harder to find.
I do not believe that a toggle will be enough to deal with this. I agree with amendment 43—if we are to have a toggle, the default should be the norm—but I do not think a toggle will work, because it will be possible to evade it with a simple Google Chrome extension that will auto-toggle and therefore make it almost redundant immediately. It will be a minor inconvenience, not a game changer. Some young men spend 10 hours a day looking at violent incel content online. Do we really think that a simple button, a General Data Protection Regulation annoyance button, will stop them from doing so? It will not, and it will not prevent future tragedies.
However, this is not just about the effect on other people; it is also about the increase in the number of suicides. One of the four largest incel forums is dedicated to suicide and self-harm. Suicide is normalised in the forum, and is often referred to as “catching the bus.” People get together to share practical advice on how they can take their own lives. That is not content to which we should be exposing our young people, but it is currently legal. It is harmful, but it will remain legal under the Bill because the terms and conditions of those sites are written by incels to promote incel content. Even if the sites were moved from category 2 to category 1, they would still pass the tests in the Bill, because the incels have written the terms and conditions to allow that content.
Why are smaller platforms not included in the Bill? Ofcom should have the power to bring category 2 sites into scope on the basis of risk. Analysis conducted by the Centre for Countering Digital Hate shows that on the largest incel website, rape is mentioned in posts every 29 minutes, with 89% of those posts referring to it in a positive sense. Moreover, 50% of users' posts about child abuse on the same site are supportive of paedophilia. Indeed, the largest incel forum has recently changed its terms and conditions to allow mention of the sexualisation of pubescent minors—unlike pre-pubescent minors; it makes that distinction. This is disgusting and wrong, so why is it not covered in the Bill? I think there is a real opportunity to look at incel content, and I would be grateful if the Minister met the cross-party group again to discuss how we can ensure that it is harder and harder to find online and is ultimately removed, so that we can protect all our young people from going down this path.
My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made an excellent speech about new clause 2, a clause with which I had some sympathy. Indeed, the Joint Committee that I chaired proposed that there should be criminal liability for failure to meet the safety duties set out in the Bill, and that that should apply not just to child safety measures, but to any such failure.
However, I agree with my right hon. and learned Friend that, as drafted, the new clause is too wide. If it is saying that the liability exists when the failure to meet the duties has occurred, who will determine that? Will it be determined when Ofcom has issued a notice, or when it has issued a fine? Will it be determined when guidance has been given and has not been followed? What we do not want to see is a parallel judicial system in which decisions are made that differ from those of the regulator on when the safety duties have not been met.
I think it is when there are persistent breaches of the safety duties, when companies have probably already been fined and issued with guidance, and when it has been demonstrated that they are clearly in breach of the codes of practice and are refusing to abide by them, that the criminal liability should come in. Similar provisions already exist in the GDPR legislation for companies that are in persistent breach of their duties and obligations. The Joint Committee recommended that this should be included in the Bill, and throughout the journey of this legislation the criminal liability provisions have been consistently strengthened. When the draft Bill was published, there was no immediate commencement of any criminal liability, even for not complying with the information notices given by Ofcom, but that was included when the Bill was presented for Second Reading. I am pleased that the Government are now going to consider how we can correctly define what a failure to meet the safety duties would be, and therefore what the criminal sanction that goes with it would be. That would be an important measure for companies that are in serial breach of their duties and obligations and have no desire to comply.
My hon. Friend has referenced the proposals from my hon. Friend the Member for Dover (Mrs Elphicke). I am grateful to the Minister and the Secretary of State for the discussions they have had with me on making modern slavery a specific priority offence, as well as illegal immigration. I think this is very important.
I agree with my right hon. Friend; that is exactly right, and it is also right that we look at including additional offences on the face of the Bill in schedule 7 as offences that will be considered as part of the legislation.
Where this touches on advertising, the Government have already accepted, following the recommendation of the Joint Committee, that the promotion of fraud should be regulated in the Bill, even if it is in advertising. There are other aspects of this, too, including modern slavery and immigration, where we need to move at pace to close the loophole created by leaving advertising to be considered outside the Bill, through the online advertising review. The principle has already been accepted that illegal activity promoted through an advert on an online platform should be regulated just as it would be if it were an organic posting. That general provision does not yet exist, however. Given that the Government have considered these additional amendments, which was the right thing to do, they also need to look at the general presumption that any illegal activity that is a breach of the safety duties should be included and regulated, and that such activity does not become exempt simply because it appears in an advert when it would be regulated as an organic posting.
I would like to focus on new clause 1, dealing with redress, new clause 43, dealing with the toggle default, and new clause 4 on minimum standards. This Bill is a very important piece of legislation, but I am afraid that it has been seriously watered down by the Government. In particular, it has been seriously weakened by the removal of measures to tackle legal but harmful content. I acknowledge that some progress has been made recently, now that the Government have accepted the need for criminal sanctions for senior managers of tech companies. However, there are still many gaps in the Bill and I want to deal with some of them in the time available to me tonight.
First, I pay tribute to the families who have lost children due to issues related to social media. Some of those families are in the Public Gallery tonight. In particular, I want to mention the Stephens family from my Reading East constituency. Thirteen-year-old Olly Stephens was murdered in an horrific attack following a plot hatched on social media. The two boys who attacked Olly had both shared dozens of images of knives online, and they used 11 different social media platforms to do so. Sadly, none of the platforms took down the content, which is why these matters are so important to all of us and our communities.
Following this awful case, I support a number of new clauses that I believe would lead to a significant change in the law to prevent a similar tragedy. I stress the importance of new clause 1, which would help parents to make complaints. As Olly’s dad, Stuart, often says, “You simply cannot contact the tech companies. You send an email and get no reply.” It is important to tackle this matter, and I believe that new clause 1 would go some way towards doing that.
As others have said, surely it makes sense for parents to know their children have some protection from harmful content. New clause 43 would provide reassurance by introducing a default position of protecting children. I urge Members on both sides of the House to support this new clause. Both children and vulnerable adults should be better protected from legal but harmful content, and further action should be taken. New clause 43 would take clear steps in that direction.
I am aware of time, and I support many other important new clauses. I reiterate my support and backing for my Front-Bench colleague, my hon. Friend the Member for Pontypridd (Alex Davies-Jones). Thank you, Madam Deputy Speaker, for the opportunity to contribute to this debate.
For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.
May I follow up my hon. Friend’s response to our right hon. Friend the Member for Bromsgrove (Sajid Javid)? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?
We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—
I met my right hon. Friend today to discuss that very point, which is particularly important and powerful. I look forward to continuing to work with her and the Ministry of Justice as we progress this Bill through the other place.
The changes are balanced with new protections for free speech and journalism—two of the core pillars of our democratic society. There are amendments to the definition of recognised news publishers to ensure that sanctioned outlets such as RT cannot benefit.
Since becoming Secretary of State I have made a number of my own changes to the Bill. First and foremost, we have gone even further to boost protections for children. Social media companies will face a new duty on age limits so they can no longer turn a blind eye to the estimated 1.6 million underage children who currently use their sites. The largest platforms will also have to publish summaries of their risk assessments for illegal content and material that is harmful for children—finally putting transparency for parents into law.
I believe it is blindingly obvious and morally right that we should have a higher bar of protection when it comes to children. Things such as cyber-bullying, pornography and posts that depict violence do enormous damage. They scar our children and rob them of their right to a childhood. These measures are all reinforced by children and parents, who are given a real voice in the legislation by the inclusion of the Children’s Commissioner as a statutory consultee. The Bill already included provisions to make senior managers liable for failure to comply with information notices, but we have now gone further. Senior managers who deliberately fail children will face criminal liability. Today, we are drawing our line in the sand and declaring that the UK will be the world’s first country to comprehensively protect children online.
Those changes are completely separate from the changes I have made for adults. Many Members and stakeholders had concerns over the "legal but harmful" section of the Bill. They were concerned that it would be a serious threat to legal free speech and would set up a quasi-legal grey area in which tech companies would be encouraged to take down content that is perfectly legal to say on our streets. I shared those concerns, so we have removed "legal but harmful" for adults. We have replaced it with a much simpler, fairer and, crucially, much more effective mechanism that gives adults a triple shield of protection. If it is illegal, it has to go. If it is banned under the company's terms and conditions, it has to go.
Lastly, social media companies will now offer adults a range of tools to give them more control over what they see and interact with on their own feeds.
My right hon. Friend makes an important point about things that are illegal offline but legal online. The Bill has still not defined a lot of content that could be illegal and yet promoted through advertising. As part of their ongoing work on the Bill and the online advertising review, will the Government establish the general principle that content that is illegal will be regulated whether it is an ad or a post?
I completely agree with my hon. Friend on the importance of this topic. That is exactly why we have the online advertising review, a piece of work we will be progressing to tackle the nub of the problem he identifies. We are protecting free speech while putting adults in the driving seat of their own online experience. The result is today’s Bill.
I thank hon. Members for their hard work on this Bill, including my predecessors, especially my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries). I thank all those I have worked with constructively on amendments, including my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates), for Stone (Sir William Cash), for Dover (Mrs Elphicke), for Rutland and Melton (Alicia Kearns), and my right hon. Friends the Members for South Holland and The Deepings (Sir John Hayes), for Chelmsford (Vicky Ford), for Basingstoke (Dame Maria Miller) and for Romsey and Southampton North (Caroline Nokes).
I would like to put on record my gratitude for the hard work of my incredibly dedicated officials—in particular, Sarah Connolly, Orla MacRae and Emma Hindley, along with a number of others; I cannot name them all today, but I note their tremendous and relentless work on the Bill. Crucially, I thank the charities and devoted campaigners, such as Ian Russell, who have guided us and pushed the Bill forward in the face of their own tragic loss. Thanks to all those people, we now have a Bill that works.
Legislating online was never going to be easy, but it is necessary. It is necessary if we want to protect our values—the values that we protect in the real world every single day. In fact, the NSPCC called this Bill "a national priority". The Children's Commissioner called it
“a once-in-a-lifetime opportunity to protect all children”.
But it is not just children's organisations that are watching. Every parent across the country will know at first hand just how difficult it is to shield their children from inappropriate material when social media giants consistently put profit above children's safety. This legislation finally puts that right.