Paul Scully

The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.

Damian Collins (Folkestone and Hythe) (Con)

I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?

Paul Scully

My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will be a criminal liability under the Bill.

--- Later in debate ---
John Nicolson

I rise to speak to the amendments in my name and those of my right hon. and hon. Friends, which of course I support.

It is welcome to see the Online Safety Bill back in the House. As we have debated this Bill and nursed it, as in my case, through both the Bill Committee and the Joint Committee, we have shone a light into some dark corners and heard some deeply harrowing stories. Who can forget the testimony given to us by Molly Russell’s dad, Ian? As we have heard, in the Public Gallery we have bereaved families who have experienced the most profound losses due to the extreme online harms to which their loved ones have been exposed; representatives of those families are watching the proceedings today. The hon. Member for Pontypridd (Alex Davies-Jones) mentioned that Ian is here, but let me mention the names of the children. Amanda and Stuart Stephens are here, and they are the parents of Olly; Andy and Judy Thomas are here, and they are the parents of Frankie; and Lorin LaFave, the mother of Breck, is here, as is Ruth Moss, the mother of Sophie. All have lost children in connection with online harms, and I extend to each our most sincere condolences, as I am sure does every Member of the House. We have thought of them time and time again during the passage of this legislation; we have thought about their pain. All of us hope that this Bill will make very real changes, and we keep in our hearts the memories of those children and other young people who have suffered.

In our debates and Committee hearings, we have done our best to harry the social media companies and some of their secretive bosses. They have often been hiding away on the west coast of the US, to emerge blinking into the gloomy Committee light when they have to answer some questions about their nefarious activities and their obvious lack of concern for the way in which children and others are impacted.

We have debated issues of concern and sometimes disagreement in a way that shows the occasional benefits of cross-House co-operation. I have been pleased to work with friends and colleagues in other parties at every stage of the Bill, not least on Zach’s law, which we have mentioned. The result is a basis of good, much-needed legislation, and we must now get it on to the statute book.

It is unfortunate that the Bill has been so long delayed. That has caused great stress to some people deeply affected by the issues raised, who have sometimes doubted our good faith as a result. These delays are not immaterial. Children and young teenagers have grown older in an online world full of self-harm content—soon to be illegal harms, we hope. It is a world full of easy-to-access pornography with no meaningful age verification, and of algorithms that serve harmful content to vulnerable people.

I have been pleased to note that calls from Members on the SNP Benches and from across the House to ensure that specific protection is granted to women and girls online have been heeded. New communications offences on cyber-flashing and intimate image abuse, and similar offences, are to be incorporated. The requirements for Ofcom to consult with the Victims’ Commissioner and the Domestic Abuse Commissioner are very welcome. Reporting tools should also be more responsive.

New clause 28 is an important new clause that SNP Members have been proud to sponsor. It calls for an advocacy body to represent the interests of children. That is vital, because the online world that children experience is ever evolving. It is not the online world that we in this Chamber tend to experience, nor is it the one experienced by most members of the media covering the debate today. We need, and young people deserve, a dedicated and appropriately funded body to look out for them online—a strong, informed voice able to stand up to the representations of big tech in the name of young people. This will, we hope, ensure that regulators get it right when acting on behalf of children online.

I am aware that there is broad support for such a body, including from those on the Labour Benches. We on the SNP Benches oppose the removal of the aspect of the Bill related to legal but harmful material. I understand the free speech arguments, and I have heard Ministers argue that the Government have proposed alternative approaches, which, they say, will give users control over the content that they see online. But adults are often vulnerable, too. Removing measures from the Bill that can protect adults, especially those in a mental health spiral or with additional learning needs, is a dereliction of our duty. An on/off toggle for harmful content is a poor substitute for what was originally proposed.

The legal but harmful discussion was and is a thorny one. It was important to get the language of the Bill right, so that people could be protected from harm online without impinging on freedom of expression, which we all hold dear. However, by sending aspects of the Bill back to Committee, with the intention of removing the legal but harmful provisions, I fear that the Government are simply running from a difficult debate, or worse, succumbing to those who have never really supported this Bill—some who rather approve of the wild west, free-for-all internet. It is much better to rise to the challenge of resolving the conflicts, such as they are, between free speech and legal but harmful. I accept that the Government’s proposals around greater clarity and enforcement of terms and conditions and of transparency in reporting to Ofcom offer some mitigation, but not, in my view, enough.

Damian Collins

The hon. Gentleman will remember that, when we served on the Joint Committee that scrutinised the draft Bill, we were concerned that the term “legal but harmful” was problematic and that there was a lack of clarity. We thought it would be better to have more clarity and enforcement based on priority illegal offences and on the terms of service. Does he still believe that, or has he changed his mind?

John Nicolson

It is a fine debate. Like so much in legislation, there is not an absolute right and an absolute wrong. We heard contradictory evidence. It is important to measure the advantages and the disadvantages. I will listen to the rest of the debate very carefully, as I have done throughout.

As a journalist in a previous life, I have long been a proponent of transparency and open democracy—something that occasionally gets me into trouble. We on the SNP Benches have argued from the outset that the powers proposed for the Secretary of State are far too expansive and wide-reaching. That is no disrespect to the Minister or the new Secretary of State, but they will know that there have been quite a few Culture Secretaries in recent years, some more temperate than others.

In wishing to see a diminution of the powers proposed we find ourselves in good company, not least with Ofcom. I note that there have been some positive shifts in the proposals around the powers of the Secretary of State, allowing greater parliamentary oversight. I hope that these indicate a welcome acknowledgement that our arguments have fallen on fertile Government soil—although, of course, it could be that the Conservative Secretary of State realises that she may soon be the shadow Secretary of State and that it will be a Labour Secretary of State exercising the proposed powers. I hope she will forgive me for that moment’s cynicism.

--- Later in debate ---
Priti Patel (Witham) (Con)

Before I speak to specific clauses I pay tribute to all the campaigners, particularly the families who have campaigned so hard to give their loved ones a voice through this Bill and to change our laws. Having had some prior involvement in the early stages of this Bill three years ago as Home Secretary, I also pay tribute to many of the officials and Members of this House on both sides who have worked assiduously on the construction, development and advancement of this Bill. In particular, I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and the work of the Joint Committee; when I was Home Secretary we had many discussions about this important work. I also thank the Minister for the assiduous way in which he has handled interventions and actually furthered the debate with this Bill. Many Government Departments have been closely involved and engaged in this work.

The victims must be at the heart of everything that we do now to provide safeguards and protections. Children and individuals have lost their lives because of the online space. We know there is a great deal of good in the online space, but also a great deal of harm, and that must unite us all in delivering this legislation. We have waited a long time for this Bill, but we must come together, knowing that this is foundational legislation, which will have to be improved and developed alongside the technology, and that there is much more work to do.

I start by focusing on a couple of the new clauses, beginning with Government new clause 11 on end-to-end encryption. The House will not be surprised by my background in dealing with end-to-end encryption, particularly the harmful content, the types of individuals and the perpetrators who hide behind end-to-end encryption. We must acknowledge the individuals who harm children or who peddle terrorist content through end-to-end encryption while recognising that encryption services are important to protect privacy.

There is great justification for encryption—business transactions, working for the Government and all sorts of areas of importance—but we must acknowledge in this House that there is more work to do, because these services are being used by those who would do harm to our country, threaten our national interest or threaten the safety of young people and children in particular. We know for a fact that there are sick-minded individuals who seek to abuse and exploit children and vulnerable adults. The Minister will know that, and I am afraid that many of us do. I speak now as a constituency Member of Parliament, and one of my first surgery cases back in 2010 was the sad and tragic case of a mother who came to see me because her son had accessed all sorts of content. Thanks to the Bill, that content will now be ruled as harmful. There were other associated services that the family could not see and could not get access to, and encryption platforms are part of that.

There are shocking figures, and I suspect that many of my colleagues in the House will be aware of them. Almost 100,000 reports relating to online child abuse were received by UK enforcement agencies in 2021 alone. That is shocking. The House will recognise my experience of working with the National Crime Agency, to which we must pay tribute for its work in this space, as we should to law enforcement more widely. Police officers and all sorts of individuals in law enforcement are, day in, day out, investigating these cases and looking at some of the most appalling images and content, all in the name of protecting vulnerable children, and we must pay tribute to them as well.

It is also really shocking that that figure of 100,000 reports in 2021 alone is a 29% increase on the previous year. The amount of disturbing content is going up and up, and we are, I am afraid, looking only at the tip of the iceberg. So, I think it is absolutely right—and I will always urge the Government and whichever Secretary of State, be they in the Home Office, DCMS or the MOJ—to put the right measures and powers in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind the platforms of encryption and, importantly, bring those involved to face justice. End-to-end encryption is one thing, but we need end-to-end justice for victims and the prevention of the most heinous crimes.

This is where we, as a House, must come together. I commend the hon. Member for Rotherham (Sarah Champion) in particular for her work relating to girls, everything to do with the grooming gangs, and the most appalling crimes against individuals, quite frankly. I will always urge colleagues to support the Bill, on which we will need to build going forward.

I think I can speak with experience about the difficulties in drafting legislation—both more broadly and specifically in this area, which is complex and challenging. It is hard to foresee the multiplicity of circumstances. My hon. Friend the Member for Folkestone and Hythe was absolutely right to say in his comments to the SNP spokesman, the hon. Member for Ochil and South Perthshire (John Nicolson), that we have to focus on illegal content. It is difficult to get the balance right between the lawful and harmful. The illegal side is what we must focus on.

I also know that many campaigners and individuals—they are not just campaigners, but families—have given heartbreaking and devastating accounts of their experiences of online harms. As legislators, we owe them this Bill, because although their suffering is not something that we will experience, it must bring about the type of changes that we all want to see for everyone—children, adults and vulnerable individuals.

May I ask the Minister for reassurances on the definition of “best endeavours”? As my right hon. Friend the Member for Basingstoke (Dame Maria Miller) touched on, when it comes to implementation, that will be the area where the rubber hits the road. That is where we will need to know that our collective work will be meaningful and will deliver protections—not just change, but protections. We must be honest about the many serious issues that will arise even after we pass the Bill—be it, God forbid, a major terrorist incident, or cases of child sexual exploitation—and there is a risk that, without clarity in this area, when a serious issue does arise, we may not know whether a provider undertook best endeavours. I think we owe it to everyone to ensure that we run a slide rule over every single granular detail of this.

Cases and issues relating to best endeavours are debated and discussed extensively in court cases, coroners’ inquests and social services proceedings relating to child safeguarding issues, for example—all right hon. and hon. Members here will have experience of dealing with social services on behalf of their constituents in child protection cases—or, even worse, in serious case reviews or public inquiries that could come in future. I worry that in any response a provider could say, as a defence, that it did its best and had undertaken its best endeavours. That would be unacceptable. It would lead those affected to feel as if they had suffered an even greater injustice than the violations that they experienced. It is not clear whether best endeavours will be enough to change the culture, behaviour and attitudes of online platforms.

I raise best endeavours in the context of changing attitudes and cultures because in many institutions that very issue is under live debate right now, whether in policing, in attitudes towards women and girls, or in how we protect other vulnerable groups, even in other services such as the fire service, which we have heard about recently. It is important that we ask those questions and have that scrutiny. We need to hear more about what constitutes best endeavours. Who will hold the providers to account? Ofcom clearly has a role. I know the Minister will do a very earnest and diligent job to provide answers, but the best endeavours principle goes wider than just the Minister on the Front Bench—it goes across the whole of Government. He knows that we will give him every backing to use his sharp elbows—perhaps I can help with my sharp elbows—to ensure that others are held to account.

It will also be for Ofcom to give further details and guidance. As ever, the guidance will be so important. The guidance has to have teeth and statutory powers. It has to be able to hold up the mirror and hold people to account. For example, would Ofcom be able, in its notices to providers, to instruct them to use specific technologies and programmes to tackle and end the exposure to exploitation, in relation to end-to-end encryption services, to protect victims? That is an open question, but one that could be put to Ofcom and could be an implementation test. There is no reason why we should not put a series of questions to Ofcom about how it would implement these provisions in practice.

I would like to ask the Minister why vulnerable adults and victims of domestic abuse and violence against women and girls are not included. We must do everything in this House. This is not about being party political. When it comes to all our work on women and violence against women and girls, there should be no party politics whatsoever. We should ensure that what is right for one group is consistent and that the laws are strengthened. That will require the MOJ, as well as the Home Office, to ensure that the work is joined up in the right kind of way.

It is right that powers are available for dealing with terrorist threats and tackling child sexual abuse thoroughly. There is some good work around terrorist content. There is excellent work in GIFCT, the Global Internet Forum to Counter Terrorism. The technology companies are doing great work. There is international co-operation in this space. The House should take some comfort in the fact that the United Kingdom leads the world in this space. We owe our gratitude to our intelligence and security agencies. I give my thanks to MI5 in particular for its work and to counter-terrorism policing, because they have led the world robustly in this work.

Damian Collins

My right hon. Friend makes an important point about this being a cross-Government effort. The Online Safety Bill creates a regulatory framework for the internet, but we need to make sure that we have the right offences in law clearly defined. Then it is easy to read them across into that regulatory framework. If we do not have that clarity, it is a job for the whole of Government.

--- Later in debate ---
Dame Margaret Hodge

Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but may I make a more general point, because it speaks to the point about legal but harmful? What I really fear with the removal of the legal but harmful provisions is that we create more and more laws to make content illegal, which, ironically, locks up more and more people rather than creating structures and systems that prevent the harm occurring in the first place. So I am not always in favour of new laws that simply criminalise individuals. I would love us to have kept to the legal but harmful route.

We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

“The only person that I’ve ever come across in this whole world…that thought that content”—

the content that Molly viewed—

“was safe was…Meta.”

There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service—for instance, a failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

Damian Collins

As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

--- Later in debate ---
Damian Collins

My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

Damian Collins

I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

Dame Margaret Hodge

I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

Damian Collins

I thank the right hon. Lady for that information.

Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be decrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

Ian Paisley (North Antrim) (DUP)

I thank the hon. Member for jointly sponsoring my private Member’s Bill, the Digital Devices (Access for Next of Kin) Bill. Does he agree that the best way to make progress is to ensure open access for the next of kin to devices that a deceased person leaves behind?

Damian Collins

The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

Like other Members who served on the Joint Committee —I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and the hon. Member for Ochil and South Perthshire (John Nicolson) raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

Dean Russell

May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

Damian Collins

I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority legal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority legal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that content that cannot be posted organically on a Facebook page can nevertheless appear there when money is put behind it and it is run as an advertisement.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.

Natalie Elphicke Portrait Mrs Natalie Elphicke (Dover) (Con)
- Hansard - - - Excerpts

Will my right hon. Friend give way?

Damian Collins Portrait Damian Collins
- Hansard - -

Of course.

Natalie Elphicke Portrait Mrs Elphicke
- Hansard - - - Excerpts

I am grateful to my right hon. Friend for raising this and for his support in this important area, which affects our constituencies so much. I will be speaking later to the details of this, which go beyond the advertising payment to the usage, showing and sharing of this material. As he has mentioned schedule 7, does he agree that there is—as I have set out in my amendment—a strong case for making sure that it covers all those illegal immigration and modern slavery offences, given the incredible harm that is being caused and that we see on a day-to-day basis?

Damian Collins Portrait Damian Collins
- Hansard - -

I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way or through the online advertising review or through new legislation, this is a loophole that needs to be closed.

Sarah Champion Portrait Sarah Champion (Rotherham) (Lab)
- View Speech - Hansard - - - Excerpts

I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, that is only the cases that are recorded. The number is going to be far greater than that. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide their private information, such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on TV services such as Netflix, people might be able to access something only in the US, so they could use a VPN to circumvent that restriction and watch it in this country.

During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation uses a blocking list to remove this content from internet service providers, but users with a VPN are usually not covered by those protections. It also concerns me that a VPN could be used in court to circumvent this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogenous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which are placing their interests over children’s needs.

I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.

Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections to women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is instated so that a good proportion of the revenue made from the Bill’s provisions goes towards funding these vital services.

The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—

--- Later in debate ---
David Davis Portrait Mr David Davis
- View Speech - Hansard - - - Excerpts

I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.

To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.

My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.

Opposition Members will not agree, but I am grateful that the Government decided to remove the “legal but harmful” clause. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but on the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not “legal but harmful” but, “Is it harmful enough to be made illegal?” Obviously, self-harm material is harmful enough to be made illegal.

The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.

The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.

One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.

Damian Collins Portrait Damian Collins
- Hansard - -

Rather than it being magic, does my right hon. Friend agree that a company could not ignore it if we demystified the process? If there is an existing technology that is available and proven to work, the company would have to explain why it is not using that technology or something better.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

I will come back to that in some detail.

The first time I used encryption it was one-time pads and Morse, so it was a long time ago. The last time was much more recent. The issue here is that clause 104 causes pressure by requiring real-time decryption. The only way to do that is by either having it unencrypted on the server, having it weakly encrypted or creating a back door. I am talking not about metadata, which I will come back to in a second, but about content. In that context, if the content needs to be rapidly accessible, it is bound to lead to weakened encryption.

This is perhaps a debate for a specialist forum, but it is very dangerous in a whole series of areas. What do we use encryption for? We use it for banking, for legal and privileged conversations, and for conversations with our constituents and families. I could go on and on about the areas in which encryption matters.

--- Later in debate ---
David Davis Portrait Mr Davis
- Hansard - - - Excerpts

I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.

I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place the responsibility, and how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.

There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic and it is algorithmic with an explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators—we have to get this right.

Damian Collins Portrait Damian Collins
- Hansard - -

My right hon. Friend is right to raise this important point. The big area here is not only clickbait, but AI-generated recommendation tools, such as a news feed on Facebook or “next up” on YouTube. Mitigating the illegal content on the platforms is not just about content moderation and removal; it is about not promoting it in the first place.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

My hon. Friend is exactly right about that. I used the example of clickbait as shorthand. The simple truth is that “AI-generated” is also a misnomer, because these things are not normally AI; they are normally algorithms written specifically to recommend and to maximise returns and revenue. We are not surprised at that. Why should we be? After all, these are commercial companies we are talking about and that is what they are going to do. Every commercial company in the world operates within a regulatory framework that prevents them from making profits out of antisocial behaviour.