Online Safety Bill Debate
Priti Patel (Conservative - Witham)
Commons Chamber

I rise to speak to new clause 2 on the offence of failing to comply with a relevant duty. I pay tribute to my right hon. and hon. Friends who have championed new clause 2 to strengthen protections for children by introducing criminal liability for senior managers.
The issues of evolving technology and holding people to account are hugely important. May I make the general point that digital education could underpin all those safeguards? The teaching of digital literacy should be conducted in parallel with all the other good efforts made across our schools.
The hon. Member is absolutely right, and I do not think anyone in the House would disagree with that. We have to carry on learning in life, and that links to technology and other issues. That applies to all of us across the board, and we need people in positions of authority to ensure that the right kind of information is shared, to protect our young people.
I look forward to hearing from the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), who has been so good in engaging on this issue, and I thank him for the proactive way in which he has spent time with all of us. Will we see the Government’s amendment prior to the Bill going to the other place for its Second Reading there? It is vital for all colleagues who support new clause 2 to have clear assurances that the provisions we support, which could have passed through this House, will not be diluted in the other place by Ministers. Furthermore—we should discuss this today—what steps are the Government and Ofcom taking to secure the agreement of tech companies to work to ensure that senior managers are committed and proactive in meeting their duties under clause 11?
I recognise that a lot of things will flow through secondary legislation, but on top of that, engagement with tech companies is vital, so that they can prepare, be ready and know what duties will be upon them. We also need to know what further guidance and regulation will come forward to secure the delivery of clause 11 duties and hold tech companies to account.
In the interests of time, I will shorten my remarks. I trust and hope that Ministers will give those details. It is important to give those assurances before the Bill moves to the House of Lords. We need to know that those protections will not be diluted. This is such a sensitive issue. We have come a long way, and that is thanks to colleagues on both sides of the House. It is important that we get the right outcomes, because all of us want to make sure that children are protected from the dreadful harms that we have seen online.
This is a really important piece of legislation. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said, it has taken far too long to get to this point. The Bill has been considered in a painstaking way by Members across the House. While today’s announcement that we will introduce senior manager and director liability is most welcome, the recent decisions to strip out vast chunks of the Bill—clauses that would have contributed to making the online world a safe place for us all—represent a tragic opportunity missed by the Government, and it will fall to a Labour Government to put things right. I know from the assurances given by those on our Front Bench that they will do just that.
I do not want to spend too much time on it, but in discussing the removal of provisions on “legal but harmful” content, I have to talk a little bit about the Jewish community. The hope that the Online Safety Bill would give us some respite from the torrent of antisemitic abuse that some of us have been subjected to has been thwarted. The Centre for Countering Digital Hate has conducted research in this area, and it found that nine out of 10 antisemitic posts on Facebook and Twitter stay there, despite requests to have them removed. Its analysis of 714 posts containing anti-Jewish hate found that they were viewed by more than 7.3 million people across the platforms, and that 80% of posts containing Holocaust denial and 70% of posts identified as neo-Nazi were not acted on, although they were in breach of the rules set by the platforms. People like me are left with a sense of bitterness that our suffering has to be tolerated because of some ideological, misplaced, flawed and ill-thought-out interpretation of freedom of speech.
I turn to new clause 2, tabled by the hon. Member for Stone (Sir William Cash) and the hon. Member for Penistone and Stocksbridge (Miriam Cates). I congratulate them on the work they have done in bringing this forward. I think they will probably agree with me that this issue should never have divided us as it did before Christmas, when I tabled a similar amendment. It is not a party political issue; it is a common-sense measure that best serves the national interest and will make the online world a safer place for children. I am pleased that the hon. Members for Stone and for Penistone and Stocksbridge have persuaded their colleagues of the justification and that the Government have listened to them—I am only sorry that I was not as successful.
This is an important measure. The business model that platforms operate encourages, not just passively but actively, the flourishing of abusive content online. They do not just fail to remove that content, but actively promote its inclusion through the algorithms that they employ. Sadly, people get a kick out of reading hateful, harmful and abusive content online, as the platform companies and their senior managers know. It is in their interest to encourage maximum traffic on their platforms, and if that means letting people post and see vile abuse, they will. The greater the traffic on such sites, the more attractive they become to advertisers and the more advertisers are willing to pay for the ads that they post on the sites. The platforms make money out of online abuse.
Originally, the Government wanted to deal with the problem by fining the companies, but companies would simply treat such fines as a cost to their business. It would not change their model or the platforms’ behaviour, although it might add to the charges for those who want to advertise on the platforms. Furthermore, we know that senior directors, owners and managers personally take decisions about the content that they allow to appear on their platforms and that their approach affects what people post.
Elon Musk’s controversial and aggressive takeover of Twitter, where he labelled the sensible moderation of content as a violation of freedom of speech, led to a 500% increase in the use of the N-word within 12 hours of his acquisition. Telegram, whose CEO is Pavel Durov, has become the app of choice of terror networks such as ISIS, according to research conducted by the Middle East Media Research Institute. When challenged about that, however, Durov refused to act on the intelligence to moderate content and said:
“You cannot make messaging technology secure for everybody except for terrorists.”
If senior managers have responsibility for the content on their platforms, they must be held to account, because we know that doing so will mean that the online world becomes a safer place for our children.
We have to decide whose side we are on. Are we really putting our children’s wellbeing first, or are we putting the platforms’ interest first? Of course, everybody will claim that we are putting children’s interests first, but if we are, we have to put our money where our mouth is, which involves making the managers truly accountable for what appears on their platforms. We know that legislating for director liability works, because it has worked for health and safety on construction sites, in the Bribery Act 2010 and on tax evasion. I hope to move similar amendments when we consider the Economic Crime and Corporate Transparency Bill on Report next week.
This is not simply a punitive measure—in fact, the last thing we want to do is lock up a lot of platform owners—but a tool to transform behaviour. We will not be locking up the tech giants, but we will be ensuring that they moderate their content. Achieving this change shows the House truly working at its best, cross-party, and focusing on the merits of the argument rather than playing party politics with such a serious issue. I commend new clause 2 to the House.
We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—
I am grateful to the Minister for giving way. He was commenting on my earlier remarks about new clause 2 and the specifics around a timetable. I completely recognise that much of this work is under development. In my remarks, I asked for a timetable on engagement with the tech firms as well as transparency to this House on the progress being made on developing the regulations around criminal liability. It is important that this House sees that, and that we follow every single stage of that process.
I thank my right hon. Friend for that intervention. We want to have as many conversations as possible in this area with Members on all sides, and I hope we can be as transparent as possible in that operation. We have already started the conversation. The Secretary of State and I met some of the big tech companies just yesterday to talk about exactly this area.
My hon. Friend the Member for Dover, my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead (Mrs May) and others are absolutely right to highlight concerns about illegal small boat crossings and the harm that can be caused to people crossing in dangerous situations. The use of highly dangerous methods to enter this country, including unseaworthy, small or overcrowded boats and refrigerated lorries, presents a huge challenge to us all. Like other forms of serious and organised crime, organised immigration crime endangers lives, has a corrosive effect on society, puts pressure on border security resources and diverts money from our economy.
As the Prime Minister has said, stopping these crossings is one of the Government’s top priorities for the next year. The situation needs to be resolved and we will not hesitate to take action wherever that can have the most effect, including through this Bill. Organised crime groups continue to facilitate most migrant journeys to the UK and have no respect for human life, exploiting vulnerable migrants, treating them as commodities and knowingly putting people in life-threatening situations. Organised crime gangs are increasingly using social media to facilitate migrant crossings and we need to do more to prevent and disrupt the crimes facilitated through these platforms. We need to share best practice, improve our detection methods and take steps to close illegal crossing routes as the behaviour and methods of organised crime groups evolve.
However, amendment 82 risks having unforeseen consequences for the Bill. It could bring into question the meaning of the term “content” elsewhere in the Bill, with unpredictable implications for how the courts and companies would interpret it. Following constructive discussions with my hon. Friend the Member for Dover and my right hon. Friend the Member for Maidenhead, I can now confirm that in order to better tackle illegal immigration encouraged by organised gangs, the Government will add section 2 of the Modern Slavery Act 2015 to the list of priority offences. Section 2 makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation.
We will also add section 24 of the Immigration Act 1971 to the priority offences list in schedule 7. Although the offences in section 24 cannot be carried out online, paragraph 33 of the schedule states that priority illegal content includes the inchoate offences relating to the offences listed. Therefore, aiding, abetting, counselling and conspiring in those offences by posting videos of people crossing the channel that show the activity in a positive light could be an offence committed online and would therefore fall within the scope of priority illegal content. The result of this amendment would therefore be that platforms would have to proactively remove that content. I am grateful to my hon. Friend the Member for Dover and my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead for raising this important issue, and I would be happy to offer them a meeting with my officials to discuss the drafting of this amendment ahead of it being tabled in the other place.
We recognise the strength of feeling on the issue of harmful conversion practices and remain committed to protecting people from these practices and making sure that they can live their lives free from the threat of harm or abuse. We have had constructive engagement with my hon. Friend the Member for Rutland and Melton (Alicia Kearns) on her amendment 84, which seeks to prevent children from seeing harmful online content on conversion practices. It is right that this issue is tackled through a dedicated and tailored legislative approach, which is why we are announcing today that the Government will publish a draft Bill to set out a proposed approach to banning conversion practices. This will apply to England and Wales. The Bill will protect everybody, including those targeted on the basis of their sexuality or being transgender. The Government will publish the Bill shortly and will ask for pre-legislative scrutiny by a Joint Committee in this parliamentary Session.
This is a complex area, and pre-legislative scrutiny exists to help ensure that any Bill introduced to Parliament does not cause unintended consequences. It will also ensure that the Bill benefits from stakeholder expertise and input from parliamentarians. The legislation must not, through a lack of clarity, inadvertently criminalise or chill legitimate conversations that parents or clinicians may have with children, thereby harming the growing number of children and young adults experiencing gender-related distress. This is an important issue, and it needs the targeted and robust approach that a dedicated Bill would provide.