Debates between Sarah Champion and Damian Collins during the 2019-2024 Parliament

Mon 5th Dec 2022

Online Safety Bill

Damian Collins

I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill, with Ofcom playing a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way, through the online advertising review or through new legislation, this is a loophole that needs to be closed.

Sarah Champion (Rotherham) (Lab)

I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, those are only the cases that are recorded; the true number will be far greater. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide private information such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on streaming services such as Netflix, people might be able to access certain content only in the US, so they could use a VPN to circumvent that restriction and watch it in this country.

During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation uses a blocking list to remove this content from internet service providers, but users with a VPN can usually bypass the protections those providers apply. It also concerns me that a VPN could be relied on in court to circumvent this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogenous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which are placing their interests over children’s needs.

I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.

Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections for women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is adopted so that a good proportion of the revenue raised under the Bill’s provisions goes towards funding these vital services.

The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—