Online Safety Bill Debate
Priti Patel (Conservative - Witham)
Department for Digital, Culture, Media & Sport
Commons Chamber

I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.
I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.
The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.
Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.
I am listening to my hon. Friend with great interest on this aspect of child sexual abuse and exploitation, which is a heinous crime. Will he go on to speak about how the Ofcom role will interact with law enforcement, in particular the National Crime Agency, when dealing with these awful crimes?
It is important that we tackle this in a number of ways. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and I spoke earlier, and I will come to some of what he will outline. It is important that Ofcom recognises the technologies that are available and—with the Children’s Commissioner as one of the statutory consultees—liaises with the social media platforms, and the agencies, to ensure that there are codes of practice that work, and that we get this absolutely right. It is about enforcing the terms and conditions of the companies and being able to produce the evidence and track the exchanges, as I will outline later, for the agency to use for enforcement.
With the rapid developments in technology, on occasions there will be no existing accredited technology available that will satisfactorily mitigate the risks. Similarly, tech companies might be able to better design solutions that integrate more easily with their services than those that are already accredited. The new regulatory framework must incentivise tech companies to ensure that their safety measures keep pace with the evolving threat, and that they design their services to be safe from the outset. It is for these reasons that the Government have tabled the amendments that we are discussing.
New clauses 11 and 12 establish options for Ofcom when deploying its powers under notices to deal with terrorism content and CSEA content. These notices will empower Ofcom to require companies to use accredited technology to identify and remove illegal terrorism and CSEA content or to prevent users from encountering that content or, crucially, to use their best endeavours to develop or to source technology to tackle CSEA. That strikes the right balance of supporting the adoption of new technology, while ensuring that it does not come at the expense of children’s physical safety.
Because if those companies do not, they will get a fine of up to £18 million or 10% of their global turnover, whichever is higher. As we are finding with Twitter, there is also a commercial impetus, because advertisers are fleeing that platform as they see the uncertainty being caused by those changes. A lot of things are moving here to ensure that safety is paramount; it is not just for the Government to act in this area. All we are doing is making sure that those companies enforce their own terms and conditions.
This point is important: we are speaking about terrorism and counter-terrorism and the state’s role in preventing terrorist activity. For clarity, will the Minister update the House later on the work that takes place between his Department and the platforms and, importantly, between the Home Office and the security services? In particular, some specialist work takes place with the Global Internet Forum to Counter Terrorism, which looks at online terrorist and extremist content. That work can ensure that crimes are prevented and that the right kinds of interventions take place.
My right hon. Friend talks with experience from her time at the Home Office. She is absolutely right that the Bill sets a framework to adhere to the terms and conditions of the platforms. It also sets out the ability for the services to look at things such as terrorism and CSEA, which I have been talking about—for example, through the evidence of photos being exchanged. The Bill is not re-examining and re-prosecuting the interaction between all the agencies, however, because that is apparent for all to see.
New clauses 11 and 12 bring those powers in line with the wider safety duties by making it clear that the tools may seek to proactively prevent CSEA content from appearing on a service, rather than focusing only on identification and removal after the fact. That will ensure the best possible protection for children, including on services that offer livestreaming.
The safeguards around those powers remain as strong as before to protect user privacy. Any tools that are developed will be accredited using a rigorous assessment process to ensure that they are highly accurate before the company is asked to use them. That will avoid any unnecessary intrusions into user privacy by minimising the risk that the tools identify false positives.
Crucially, the powers do not represent a ban on or seek to undermine any specific type of technology or design, such as end-to-end encryption. They align with the UK Government’s view that online privacy and cyber-security must be protected, but that technological changes should not be implemented in a way that diminishes public safety.
If my hon. Friend will bear with me—I need to make some progress—I think that will be teased out today and in Committee, should the Bill be recommitted, as we amend the clauses relating directly to what she is talking about, and then as the Bill goes through the other place.
I am grateful to the Minister, who has taken a number of interventions. I fully agree with my hon. Friend the Member for Gosport (Dame Caroline Dinenage). This is a grey area and has consistently been so—many Members have given their views on that in previous stages of the Bill. Will the Minister come back in the later stages on tackling violence against women and girls, and show how the Bill will incorporate key aspects of the Domestic Abuse Act 2021, and tie up with the criminal justice system and the work of the forthcoming victims Bill? We cannot look at these issues in isolation—I see that the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is also on the Front Bench. Rather, they all have to be put together in a golden thread of protecting victims, making sure that people do not become victims, and ensuring that we go after the perpetrators—we must not forget that at all. The Minister will not be able to answer that now, but I would ask him to please do so in the latter stages.
Before I speak to specific clauses I pay tribute to all the campaigners, particularly the families who have campaigned so hard to give their loved ones a voice through this Bill and to change our laws. Having had some prior involvement in the early stages of this Bill three years ago as Home Secretary, I also pay tribute to many of the officials and Members of this House on both sides who have worked assiduously on the construction, development and advancement of this Bill. In particular, I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and the work of the Joint Committee; when I was Home Secretary we had many discussions about this important work. I also thank the Minister for the assiduous way in which he has handled interventions and actually furthered the debate with this Bill. Many Government Departments are involved and engaged in this work.
The victims must be at the heart of everything that we do now to provide safeguards and protections. Children and individuals have lost their lives because of the online space. We know there is a great deal of good in the online space, but also a great deal of harm, and that must unite us all in delivering this legislation. We have waited a long time for this Bill, but we must come together, knowing that this is foundational legislation, which will have to be improved and developed alongside the technology, and that there is much more work to do.
I start by focusing on a couple of the new clauses, beginning with Government new clause 11 on end-to-end encryption. The House will not be surprised by my background in dealing with end-to-end encryption, particularly the harmful content, the types of individuals and the perpetrators who hide behind end-to-end encryption. We must acknowledge the individuals who harm children or who peddle terrorist content through end-to-end encryption while recognising that encryption services are important to protect privacy.
There is great justification for encryption—business transactions, working for the Government and all sorts of areas of importance—but we must acknowledge in this House that there is more work to do, because these services are being used by those who would do harm to our country, threaten our national interest or threaten the safety of young people and children in particular. We know for a fact that there are sick-minded individuals who seek to abuse and exploit children and vulnerable adults. The Minister will know that, and I am afraid that many of us do. I speak now as a constituency Member of Parliament, and one of my first surgery cases back in 2010 was the sad and tragic case of a mother who came to see me because her son had accessed all sorts of content. Thanks to the Bill, that content will now be ruled as harmful. There were other services associated with that access which the family could not see or get access to, and encryption platforms are part of that.
There are shocking figures, and I suspect that many of my colleagues in the House will be aware of them. Almost 100,000 reports relating to online child abuse were received by UK enforcement agencies in 2021 alone. That is shocking. The House will recognise my experience of working with the National Crime Agency, to which we must pay tribute for its work in this space, as we should to law enforcement more widely. Police officers and all sorts of individuals in law enforcement are, day in, day out, investigating these cases and looking at some of the most appalling images and content, all in the name of protecting vulnerable children, and we must pay tribute to them as well.
It is also really shocking that that figure of 100,000 reports in 2021 alone is a 29% increase on the previous year. The amount of disturbing content is going up and up, and we are, I am afraid, looking only at the tip of the iceberg. So, I think it is absolutely right—and I will always urge the Government and whichever Secretary of State, be they in the Home Office, DCMS or the MOJ—to put the right measures and powers in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind the platforms of encryption and, importantly, bring those involved to face justice. End-to-end encryption is one thing, but we need end-to-end justice for victims and the prevention of the most heinous crimes.
This is where we, as a House, must come together. I commend the hon. Member for Rotherham (Sarah Champion) in particular for her work relating to girls, everything to do with the grooming gangs, and the most appalling crimes against individuals, quite frankly. I will always urge colleagues to support the Bill, on which we will need to build going forward.
I think I can speak with experience about the difficulties in drafting legislation—both more broadly and specifically in this area, which is complex and challenging. It is hard to foresee the multiplicity of circumstances. My hon. Friend the Member for Folkestone and Hythe was absolutely right to say in his comments to the SNP spokesman, the hon. Member for Ochil and South Perthshire (John Nicolson), that we have to focus on illegal content. It is difficult to get the balance right between the lawful and harmful. The illegal side is what we must focus on.
I also know that many campaigners and individuals—they are not just campaigners, but families—have given heartbreaking and devastating accounts of their experiences of online harms. As legislators, we owe them this Bill, because although their suffering is not something that we will experience, it must bring about the type of changes that we all want to see for everyone—children, adults and vulnerable individuals.
May I ask the Minister for reassurances on the definition of “best endeavours”? As my right hon. Friend the Member for Basingstoke (Dame Maria Miller) touched on, when it comes to implementation, that will be the area where the rubber hits the road. That is where we will need to know that our collective work will be meaningful and will deliver protections—not just change, but protections. We must be honest about the many serious issues that will arise even after we pass the Bill—be it, God forbid, a major terrorist incident, or cases of child sexual exploitation—and there is a risk that, without clarity in this area, when a serious issue does arise, we may not know whether a provider undertook best endeavours. I think we owe it to everyone to ensure that we run a slide rule over every single granular detail.
Questions of best endeavours are debated and discussed extensively in court cases, coroners’ inquests and social services child safeguarding work, for example—all right hon. and hon. Members here will have experience of dealing with social services on behalf of their constituents in child protection cases—or, even worse, in serious case reviews or public inquiries that could come in future. I worry that in any response a provider could say, as a defence, that it did its best and had undertaken its best endeavours. That would be unacceptable. It would lead those affected to feel as if they had suffered an even greater injustice than the violations that they experienced. It is not clear whether best endeavours will be enough to change the culture, behaviour and attitudes of online platforms.
I raise best endeavours in the context of changing attitudes and cultures because in many institutions that very issue is under live debate right now—whether in policing, in attitudes towards women and girls, or in how we protect other vulnerable groups, even in other services such as the fire service, which we have heard about recently. It is important that we ask those questions and have the scrutiny. We need to hear more about what constitutes best endeavours. Who will hold the providers to account? Ofcom clearly has a role. I know the Minister will do a very earnest and diligent job to provide answers, but the best endeavours principle goes wider than just the Minister on the Front Bench—it goes across the whole of Government. He knows that we will give him every backing to use his sharp elbows—perhaps I can help with my sharp elbows—to ensure that others are held to account.
It will also be for Ofcom to give further details and guidance. As ever, the guidance will be so important. The guidance has to have teeth and statutory powers. It has to be able to put the mirror up and hold people to account. For example, would Ofcom be able, in its notices to providers, to instruct them to use specific technologies and programmes to tackle and end the exposure to exploitation, in relation to end-to-end encryption services, to protect victims? That is an open question, but one that could be put to Ofcom and could be an implementation test. There is no reason why we should not put a series of questions to Ofcom around how it would implement the Bill in practice.
I would like to ask the Minister why vulnerable adults and victims of domestic abuse and violence against women and girls are not included. We must do everything in this House. This is not about being party political. When it comes to all our work on women and violence against women and girls, there should be no party politics whatsoever. We should ensure that what is right for one group is consistent and that the laws are strengthened. That will require the MOJ, as well as the Home Office, to ensure that the work is joined up in the right kind of way.
It is right that powers are available for dealing with terrorist threats and tackling child sexual abuse thoroughly. There is some good work around terrorist content. There is excellent work in GIFCT, the Global Internet Forum to Counter Terrorism. The technology companies are doing great work. There is international co-operation in this space. The House should take some comfort in the fact that the United Kingdom leads the world in this space. We owe our gratitude to our intelligence and security agencies. I give my thanks to MI5 in particular for its work and to counter-terrorism policing, because they have led the world robustly in this work.
My right hon. Friend makes an important point about this being a cross-Government effort. The Online Safety Bill creates a regulatory framework for the internet, but we need to make sure that we have the right offences in law clearly defined. Then it is easy to read them and cross-reference them with the legislation. If we do not have that, it is a job for the whole of Government.
Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is not straightforward and easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.
The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on CT is so important, because across the world with Five Eyes we have been consistent. Again, there are good models out there that can be built upon. We will not fix all this through one Bill—we know that. This Bill is foundational, which is why we must move forward.
On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.
I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in acting on reports and removing web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?
My hon. Friend is absolutely right. I thank her for not just her intervention but her steadfast work when she was a Home Office Minister with responsibility for safeguarding. I also thank the Internet Watch Foundation; many of the statistics and figures that we have been using about child sexual abuse and exploitation content, and the take-downs, are thanks to its work. There is some important work to do there. The Minister will be familiar with its work—[Interruption.] Exactly that.
We need the expertise of the Internet Watch Foundation, so it is about integrating that skillset. There is a great deal of expertise out there, including at the Internet Watch Foundation, at GIFCT on the CT side and, obviously, in our services and agencies. As my right hon. Friend the Member for Basingstoke said, it is crucial that we pool organisations’ expertise to implement the Bill, as we will not be able to create it all over again overnight in government.
I thank my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) for tabling new clause 16, which would create new offences to address the challenges caused by those who promote, encourage and assist self-harm. That has been the subject of much of the debate already, which is absolutely right when we think about the victims and their families. In particular, I thank the Samaritans and others for their work to highlight this important issue. I do not need to dwell on the Samaritans’ report, because I think all hon. Members have read it.
All hon. Members who spoke in the early stages of the Bill, which I did not because I was in government, highlighted this essential area. It is important to ensure that we do everything we can to address it in the right way. Like all right hon. and hon. Members, I pay tribute to the family of Molly Russell. There are no words for the suffering that they have endured, but their campaign of bravery, courage and fortitude aims to close every loophole to stop other young people being put at risk.
Right hon. and hon. Members meet young people in schools every week, and we are also parents and, in some cases, grandparents. To know that this grey area leaves so many youngsters at risk is devastating, so we have almost a collective corporate duty to stand up and do the right thing. The long and short of it is that we need to be satisfied, when passing the Bill, that we are taking action to protect vulnerable people and youngsters who are susceptible to dangerous communications.
As I have emphasised, we should also seek to punish those who cause and perpetrate this harm and do everything we can to protect those who are vulnerable, those with learning disabilities, those with mental health conditions, and those who are exposed to self-harm content. We need to protect them and we have a duty to do that, so I look forward to the Minister’s reply.
I welcome new clauses 45 to 50, tabled by my right hon. Friend the Member for Basingstoke. I pay tribute to her for her work; she has been a strong campaigner for protecting the privacy of individuals, especially women and children, and for closing loopholes that have enabled people to be humiliated or harmed in the ways she has spoken about so consistently in the House. I am pleased that the Deputy Prime Minister, my right hon. Friend the Member for Esher and Walton (Dominic Raab), announced last month that the Government would table amendments in the other place to criminalise the sharing of intimate images, photographs and videos without consent; that is long overdue. When I was Home Secretary I heard the most appalling cases, with which my right hon. Friend the Member for Basingstoke will be familiar. I have met so many victims and survivors, and we owe it to them to do the right thing.
It would be reassuring to hear not just from the Minister in this debate, but from other Ministers in the Departments involved in the Bill, to ensure they are consistent in giving voice to the issues and in working through their Ministries on the implementation—not just of this Bill, but of the golden thread that runs throughout the legislation. Over the last three years, we have rightly produced a lot of legislation to go after perpetrators, and support women and girls, including the Domestic Abuse Act 2021. We should use those platforms to stand up for the individuals affected by these issues.
I want to highlight the importance of the provisions to protect women and girls, particularly the victims and survivors of domestic abuse and violence. Some abusive partners and ex-partners use intimate images in their possession; as the Minister said, that is coercive control, which means that the victim ends up living their life in fear. That is completely wrong. We have heard and experienced too many harrowing and shocking stories of women who have suffered as a result of the use of such images and videos. It must now be a priority for the criminal justice system, and the online platforms in particular, to remove such content. This is no longer a negotiation. Too many of us—including myself, when I was Home Secretary—have phoned platforms at weekends and insisted that they take down content. Quite frankly, I have then been told, “Twitter doesn’t work on a Saturday, Home Secretary” or “This is going to take time.” That is not acceptable. It is an absolute insult to the victims, and is morally reprehensible and wrong. The platforms must be held to account.
Hon. Members will be well aware of the Home Office’s work on the tackling violence against women and girls strategy. I pay tribute to all colleagues, but particularly my hon. Friend the Member for Redditch (Rachel Maclean), who was the Minister at the time. The strategy came about after much pain, sorrow and loss of life, and it garnered an unprecedented 180,000 responses. The concerns raised were predominantly related to the issues we are discussing today. We can no longer stay mute and turn a blind eye. We must ensure that the safety of women in the public space offline—on the streets—and online is respected. We know how women feel about the threats. The strategy highlighted so much; I do not want to go over it again, as it is well documented and I have spoken about it in the House many times.
It remains a cause of concern that the Bill does not include a specific VAWG code of practice. We want and need the Bill. We are not going to fix everything through it, but, having spent valued time with victims and survivors, I genuinely believe that we could move towards a code of practice. Colleagues, this is an area on which we should unite, and we should bring such a provision forward; it is vital.
Let me say a few words in support of new clause 23, which was tabled by my right hon. Friend the Member for Basingstoke. I have always been a vocal and strong supporter of services for victims of crime, and of victims full stop. I think it was 10 years ago that I stood in this House and proposed a victims code of practice—a victims Bill is coming, and we look forward to that as well. This Government have a strong record of putting more resources into support for victims, including the £440 million over three years, but it is imperative that offenders—those responsible for the harm caused to victims—are made to pay, and it is absolutely right that they should pay more in compensation.
Companies profiteering from online platforms where these harms are being perpetrated should be held to account. When companies fail in their duties and have been found wanting, they must make a contribution for the harm caused. There are ways in which we can do that. There has been a debate already, and I heard the hon. Member for Pontypridd (Alex Davies-Jones) speak for the Opposition about one way, but I think we should be much more specific now, particularly in individual cases. I want to see those companies pay the price for their crimes, and I expect the financial penalties issued to reflect the severity of the harm caused—we should support that—and that such money should go to supporting the victims.
I pay tribute to the charities, advocacy groups and other groups that, day in and day out, have supported the victims of crime and of online harms. I have had an insight into that work from my former role in Government, but we should never underestimate how traumatic and harrowing it is. I say that about the support groups, but we have to magnify that multiple times for the victims. This is one area where we must ensure that more is done to provide extra resources for them. I look forward to hearing more from the Minister, but also from Ministers from other Departments in this space.
I will conclude on new clause 28, which has already been raised, on the advocacy body for children. There is a long way to go with this—there really is. Children are harmed in just too many ways, and the harm is unspeakable. We have touched on this in earlier debates and discussions on the Bill, in relation to child users on online platforms, and there will be further harm. I gently urge the Government—if not today or through this Bill, then later—to think about how we can pull together the skills and expertise in organisations outside this House and outside Government that give voice to children who have nowhere else to go.
This is not just about the online space; in the cases in the constituency of the hon. Member for Rotherham (Sarah Champion) and other constituencies, we have seen children being harmed under cover. Statutory services failed them and the state failed them. It was state institutional failure that let children down in the cases in Rotherham and other child grooming cases. We could see that all over again in the online space, and I really urge the Government to make sure that that does not happen—and actually never happens again, because those cases are far too harrowing.
There really is a lot here, and we must come together to ensure that the Bill comes to pass, but there are so many other areas where we can collectively put aside party politics and give voice to those who really need representation.
I pay tribute to all the relatives and families of the victims of online abuse who have chosen to be with us today. I am sure that, for a lot of you, our debate is very dry and detached, yet we would not be here but for you. Our hearts are with you all.
I welcome the Minister to his new role. I hope that he will guide his Bill with the same spirit set by his predecessors, the right hon. Member for Croydon South (Chris Philp) and the hon. Member for Folkestone and Hythe (Damian Collins), who is present today and has done much work on this issue. Both Ministers listened and accepted ideas suggested by Back Benchers across the House. As a result, we had a better Bill.
I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.
There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority legal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.
We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority legal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes; their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that content that cannot be posted organically on a Facebook page can nevertheless appear if money is put behind it and it is run as an advertisement.
My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.
(1 year, 10 months ago)
Commons Chamber

I rise to speak to new clause 2, on the offence of failing to comply with a relevant duty. I pay tribute to my right hon. and hon. Friends who have championed new clause 2 to strengthen protections for children by introducing criminal liability for senior managers.
The issues of evolving technology and holding people to account are hugely important. May I make the general point that digital education could underpin all those safeguards? The teaching of digital literacy should be conducted in parallel with all the other good efforts made across our schools.
The hon. Member is absolutely right, and I do not think anyone in the House would disagree with that. We have to carry on learning in life, and that links to technology and other issues. That applies to all of us across the board, and we need people in positions of authority to ensure that the right kind of information is shared, to protect our young people.
I look forward to hearing from the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), who has been so good in engaging on this issue, and I thank him for the proactive way in which he has spent time with all of us. Will we see the Government’s amendment prior to the Bill going to the other place for its Second Reading there? It is vital for all colleagues who support new clause 2 to have clear assurances that the provisions we support, which could have passed through this House, will not be diluted in the other place by Ministers. Furthermore—we should discuss this today—what steps are the Government and Ofcom taking to secure the agreement of tech companies to work to ensure that senior managers are committed and proactive in meeting their duties under clause 11?
I recognise that a lot of things will flow through secondary legislation, but on top of that, engagement with tech companies is vital, so that they can prepare, be ready and know what duties will be upon them. We also need to know what further guidance and regulation will come forward to secure the delivery of clause 11 duties and hold tech companies to account.
In the interests of time, I will shorten my remarks. I trust and hope that Ministers will give those details. It is important to give those assurances before the Bill moves to the House of Lords. We need to know that those protections will not be diluted. This is such a sensitive issue. We have come a long way, and that is thanks to colleagues on both sides of the House. It is important that we get the right outcomes, because all of us want to make sure that children are protected from the dreadful harms that we have seen online.
This is a really important piece of legislation. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said, it has taken far too long to get to this point. The Bill has been considered in a painstaking way by Members across the House. While today’s announcement that we will introduce senior manager and director liability is most welcome, the recent decisions to strip out vast chunks of the Bill—clauses that would have contributed to making the online world a safe place for us all—represent a tragic opportunity missed by the Government, and it will fall to a Labour Government to put things right. I know from the assurances given by those on our Front Bench that they will do just that.
I do not want to spend too much time on it, but in discussing the removal of provisions on “legal but harmful” content, I have to talk a little bit about the Jewish community. The hope that the Online Safety Bill would give us some respite from the torrent of antisemitic abuse that some of us have been subjected to has been thwarted. The Centre for Countering Digital Hate has conducted research in this area, and it found that nine out of 10 antisemitic posts on Facebook and Twitter stay there, despite requests to have them removed. Its analysis of 714 posts containing anti-Jewish hate found that they were viewed by more than 7.3 million people across the platforms, and that 80% of posts containing Holocaust denial and 70% of posts identified as neo-Nazi were not acted on, although they were in breach of the rules set by the platforms. People like me are left with a sense of bitterness that our suffering has to be tolerated because of some ideological, misplaced, flawed and ill-thought-out interpretation of freedom of speech.
I turn to new clause 2, tabled by the hon. Member for Stone (Sir William Cash) and the hon. Member for Penistone and Stocksbridge (Miriam Cates). I congratulate them on the work they have done in bringing this forward. I think they will probably agree with me that this issue should never have divided us as it did before Christmas, when I tabled a similar amendment. It is not a party political issue; it is a common-sense measure that best serves the national interest and will make online a safer place for children. I am pleased that the hon. Members for Stone and for Penistone and Stocksbridge have persuaded their colleagues of the justification and that the Government have listened to them—I am only sorry that I was not as successful.
This is an important measure. The business model that platforms operate encourages, not just passively but actively, the flourishing of abusive content online. They do not just fail to remove that content, but actively promote its inclusion through the algorithms that they employ. Sadly, people get a kick out of reading hateful, harmful and abusive content online, as the platform companies and their senior managers know. It is in their interest to encourage maximum traffic on their platforms, and if that means letting people post and see vile abuse, they will. The greater the traffic on such sites, the more attractive they become to advertisers and the more advertisers are willing to pay for the ads that they post on the sites. The platforms make money out of online abuse.
Originally, the Government wanted to deal with the problem by fining the companies, but companies would simply treat such fines as a cost to their business. It would not change their model or the platforms’ behaviour, although it might add to the charges for those who want to advertise on the platforms. Furthermore, we know that senior directors, owners and managers personally take decisions about the content that they allow to appear on their platforms and that their approach affects what people post.
Elon Musk’s controversial and aggressive takeover of Twitter, where he labelled the sensible moderation of content as a violation of freedom of speech, led to a 500% increase in the use of the N-word within 12 hours of his acquisition. Telegram, whose CEO is Pavel Durov, has become the app of choice of terror networks such as ISIS, according to research conducted by the Middle East Media Research Institute. When challenged about that, however, Durov refused to act on the intelligence to moderate content and said:
“You cannot make messaging technology secure for everybody except for terrorists.”
If senior managers have responsibility for the content on their platforms, they must be held to account, because we know that doing so will mean that the online world becomes a safer place for our children.
We have to decide whose side we are on. Are we really putting our children’s wellbeing first, or are we putting the platforms’ interest first? Of course, everybody will claim that we are putting children’s interests first, but if we are, we have to put our money where our mouth is, which involves making the managers truly accountable for what appears on their platforms. We know that legislating for director liability works, because it has worked for health and safety on construction sites, in the Bribery Act 2010 and on tax evasion. I hope to move similar amendments when we consider the Economic Crime and Corporate Transparency Bill on Report next week.
This is not simply a punitive measure—in fact, the last thing we want to do is lock up a lot of platform owners—but a tool to transform behaviour. We will not be locking up the tech giants, but we will be ensuring that they moderate their content. Achieving this change shows the House truly working at its best, cross-party, and focusing on the merits of the argument rather than playing party politics with such a serious issue. I commend new clause 2 to the House.
We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—
I am grateful to the Minister for giving way. He was commenting on my earlier remarks about new clause 2 and the specifics around a timetable. I completely recognise that much of this work is under development. In my remarks, I asked for a timetable on engagement with the tech firms as well as transparency to this House on the progress being made on developing the regulations around criminal liability. It is important that this House sees that, and that we follow every single stage of that process.
I thank my right hon. Friend for that intervention. We want to have as many conversations as possible in this area with Members on all sides, and I hope we can be as transparent as possible in that operation. We have already started the conversation. The Secretary of State and I met some of the big tech companies just yesterday to talk about exactly this area.
My hon. Friend the Member for Dover, my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead (Mrs May) and others are absolutely right to highlight concerns about illegal small boat crossings and the harm that can be caused to people crossing in dangerous situations. The use of highly dangerous methods to enter this country, including unseaworthy, small or overcrowded boats and refrigerated lorries, presents a huge challenge to us all. Like other forms of serious and organised crime, organised immigration crime endangers lives, has a corrosive effect on society, puts pressure on border security resources and diverts money from our economy.
As the Prime Minister has said, stopping these crossings is one of the Government’s top priorities for the next year. The situation needs to be resolved and we will not hesitate to take action wherever that can have the most effect, including through this Bill. Organised crime groups continue to facilitate most migrant journeys to the UK and have no respect for human life, exploiting vulnerable migrants, treating them as commodities and knowingly putting people in life-threatening situations. Organised crime gangs are increasingly using social media to facilitate migrant crossings and we need to do more to prevent and disrupt the crimes facilitated through these platforms. We need to share best practice, improve our detection methods and take steps to close illegal crossing routes as the behaviour and methods of organised crime groups evolve.
However, amendment 82 risks having unforeseen consequences for the Bill. It could bring into question the meaning of the term “content” elsewhere in the Bill, with unpredictable implications for how the courts and companies would interpret it. Following constructive discussions with my hon. Friend the Member for Dover and my right hon. Friend the Member for Maidenhead, I can now confirm that in order to better tackle illegal immigration encouraged by organised gangs, the Government will add section 2 of the Modern Slavery Act 2015 to the list of priority offences. Section 2 makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation.
We will also add section 24 of the Immigration Act 1971 to the priority offences list in schedule 7. Although the offences in section 24 cannot themselves be carried out online, paragraph 33 of the schedule states that priority illegal content includes the inchoate offences relating to the offences listed. Aiding, abetting, counselling and conspiring in those offences by posting videos of people crossing the channel that show the activity in a positive light could therefore be an offence committed online, and so fall within the scope of priority illegal content. The result of this change would be that platforms would have to proactively remove such content. I am grateful to my hon. Friend the Member for Dover and my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead for raising this important issue, and I would be happy to offer them a meeting with my officials to discuss the drafting of this amendment ahead of it being tabled in the other place.
We recognise the strength of feeling on the issue of harmful conversion practices and remain committed to protecting people from these practices and making sure that they can live their lives free from the threat of harm or abuse. We have had constructive engagement with my hon. Friend the Member for Rutland and Melton (Alicia Kearns) on her amendment 84, which seeks to prevent children from seeing harmful online content on conversion practices. It is right that this issue is tackled through a dedicated and tailored legislative approach, which is why we are announcing today that the Government will publish a draft Bill to set out a proposed approach to banning conversion practices. This will apply to England and Wales. The Bill will protect everybody, including those targeted on the basis of their sexuality or being transgender. The Government will publish the Bill shortly and will ask for pre-legislative scrutiny by a Joint Committee in this parliamentary Session.
This is a complex area and pre-legislative scrutiny exists to help ensure that any Bill introduced to Parliament does not cause unintended consequences. It will also ensure that the Bill benefits from stakeholder expertise and input from parliamentarians. The legislation must not, through a lack of clarity, harm the growing number of children and young adults experiencing gender-related distress through inadvertently criminalising or chilling legitimate conversations that parents or clinicians may have with children. This is an important issue, and it needs the targeted and robust approach that a dedicated Bill would provide.