Debates between Miriam Cates and John Nicolson during the 2019 Parliament

Mon 5th Dec 2022

Online Safety Bill

Debate between Miriam Cates and John Nicolson
Miriam Cates

I thank the hon. Gentleman for his intervention. He is absolutely right: inciting a child to harm their body, whatever that harm is, should be criminalised, and I support the sentiment of new clause 16, which seeks to do that. Sadly, lots of children, particularly girls, go online and type in “I don’t like my body”. Maybe they are drawn to eating disorder sites, as my right hon. Friend the Member for Chelmsford (Vicky Ford) has mentioned, but often they are drawn into sites that glorify transition, often with adult men that they do not even know in other countries posting pictures of double mastectomies on teenage girls.

John Nicolson (Ochil and South Perthshire) (SNP)

The hon. Lady must realise that this is fantasy land. It is incredibly difficult to get gender reassignment surgery. The “they’re just confused” stuff is exactly what was said to me as a young gay man. She must realise that this really simplifies a complicated issue and patronises people going through difficult choices.

Miriam Cates

I really wish it was fantasy land, but I am in contact with parents each and every day who tell me stories of their children being drawn into this. Yes, in this country it is thankfully very difficult to get a double mastectomy when you are under 18, but it is incredibly easy to buy testosterone illegally online and to inject it, egged on by adults in other countries. Once a girl has injected testosterone during puberty, she will have a deep voice and facial hair for life and male-pattern baldness, and she will be infertile. That is a permanent change, it is self-harm and it should be criminalised under this Bill, whether through this clause or through the Government’s new plans. The hon. Member for Kirkcaldy and Cowdenbeath (Neale Hanvey) is absolutely right: this is happening every day and it should be classed as self-harm.

Going back to my comments about the effect on children of viewing pornography, I absolutely support the idea of putting children’s experience at the heart of the Bill but it needs to be about children’s welfare and not about what children want. One impact of the internet has been to blur the boundary between adults and children. As adults, we need to be able to say, “This is the evidence of what is harmful to children, and this is what children should not be seeing.” Of course children will say that they want free access to all content, just like they want unlimited sweets and unlimited chocolate, but as adults we need to be able to say what is harmful for children and to protect them from seeing it.

This brings me to Government new clause 11, which deals with making sure that child sexual abuse material is taken offline. There is a clear link between the epidemic of pornography and the epidemic of child sexual abuse material. The algorithms on porn sites work by drawing users deeper and deeper into more and more extreme content—other Members have mentioned this in relation to other areas of the internet—so someone might go on to what they think is a mainstream pornography site and be drawn into more and more explicit, extreme and violent criminal pornography. At the end of this, normal people are drawn into watching children being abused, often in real time and often in other countries. There is a clear link between the epidemic of porn and the child sexual abuse material that is so prevalent online.

Last week in the Home Affairs Committee we heard from Professor Alexis Jay, who led the independent inquiry into child sexual abuse. Her report, written over seven years, is harrowing. Sadly, its conclusion is that seven years on, there are now even more opportunities for people to abuse children because of the internet, so making sure that providers have a duty to remove any child sexual abuse material that they find is crucial. Many Members have referred to the Internet Watch Foundation. One incredibly terrifying statistic is that in 2021, the IWF removed 252,194 web pages containing child sexual abuse material and an unknown number of images. New clause 11 is really important, because it would put the onus on the tech platforms to remove those images when they are found.

It is right to put the onus on the tech companies. All the way through the writing of this Bill, at all the consultation meetings we have been to, we have heard the tech companies say, “It’s too hard; it’s not possible because of privacy, data, security and cost.” I am sure that is what the mine owners said in the 19th century when they were told by the Government to stop sending children down the mines. It is not good enough. These are the richest, most powerful companies in the world. They are more powerful than an awful lot of countries, yet they have no democratic accountability. If they can employ real-time facial recognition at airports, they can find a way to remove child abuse images from the internet.

This leads me on to new clause 17, tabled by the right hon. Member for Barking (Dame Margaret Hodge), which would introduce individual director liability for non-compliance. I completely support that sentiment and I agree that this is likely to be the only way we will inject some urgency into the process of compliance. Why should directors who are profiting from the platforms not be responsible if children suffer harm as a result of using their products? That is certainly the case in many other industries. The right hon. Lady used the example of the building trade. Of course there will always be accidents, but if individual directors face the prospect of personal liability, they will act to address the systemic issues, the problems with the processes and the malevolent algorithms that deliberately draw users towards harm.