Online Safety Bill Debate
William Cash (Conservative - Stone)
Debate with the Department for Digital, Culture, Media & Sport
(2 years ago)
Commons Chamber
Many years ago, in the 1970s, I was much involved in the Protection of Children Bill, which was one of the first steps in condemning and making illegal explicit imagery of children and their involvement in the making of such films. We then had the broadcasting Acts and the video Acts, and I was very much involved at that time in saying that we ought to prohibit such things in videos and so on. I got an enormous amount of flak for that. We have now moved right the way forward and it is tremendous to see not only the Government but the Opposition co-operating on this theme. I very much sympathise not only with what my right hon. Friend has just said—I am very inclined to support his new clause for that reason—but with what the right hon. Member for Barking (Dame Margaret Hodge) said. I was deeply impressed by the way in which she presented the argument about the personal liability of directors. We cannot distinguish between a company and the people who run it, and I am interested to hear what the Government have to say in reply to that.
I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.
I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place the responsibility, and how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.
There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic and it is algorithmic with an explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators— we have to get this right.
I really wish it was fantasy land, but I am in contact with parents each and every day who tell me stories of their children being drawn into this. Yes, in this country it is thankfully very difficult to get a double mastectomy when you are under 18, but it is incredibly easy to buy testosterone illegally online and to inject it, egged on by adults in other countries. Once a girl has injected testosterone during puberty, she will have a deep voice and facial hair for life and male-pattern baldness, and she will be infertile. That is a permanent change, it is self-harm and it should be criminalised under this Bill, whether through this clause or through the Government’s new plans. The hon. Member for Kirkcaldy and Cowdenbeath (Neale Hanvey) is absolutely right: this is happening every day and it should be classed as self-harm.
Going back to my comments about the effect on children of viewing pornography, I absolutely support the idea of putting children’s experience at the heart of the Bill but it needs to be about children’s welfare and not about what children want. One impact of the internet has been to blur the boundary between adults and children. As adults, we need to be able to say, “This is the evidence of what is harmful to children, and this is what children should not be seeing.” Of course children will say that they want free access to all content, just like they want unlimited sweets and unlimited chocolate, but as adults we need to be able to say what is harmful for children and to protect them from seeing it.
This brings me to Government new clause 11, which deals with making sure that child sexual abuse material is taken offline. There is a clear link between the epidemic of pornography and the epidemic of child sexual abuse material. The way the algorithms on porn sites work is to draw users deeper and deeper into more and more extreme content—other Members have mentioned this in relation to other areas of the internet—so someone might go on to what they think is a mainstream pornography site and be drawn into more and more explicit, extreme and violent criminal pornography. At the end of this, normal people are drawn into watching children being abused, often in real time and often in other countries. There is a clear link between the epidemic of porn and the child sexual abuse material that is so prevalent online.
Last week in the Home Affairs Committee we heard from Professor Alexis Jay, who led the independent inquiry into child sexual abuse. Her report, written over seven years, is harrowing. Sadly, its conclusion is that, seven years on, there are now even more opportunities for people to abuse children because of the internet, so making sure that providers have a duty to remove any child sexual abuse material that they find is crucial. Many Members have referred to the Internet Watch Foundation. One incredibly terrifying statistic is that in 2021, the IWF removed 252,194 web pages containing child sexual abuse material and an unknown number of images. New clause 11 is really important, because it would put the onus on the tech platforms to remove those images when they are found.
It is right to put the onus on the tech companies. All the way through the writing of this Bill, at all the consultation meetings we have been to, we have heard the tech companies say, “It’s too hard; it’s not possible because of privacy, data, security and cost.” I am sure that is what the mine owners said in the 19th century when they were told by the Government to stop sending children down the mines. It is not good enough. These are the richest, most powerful companies in the world. They are more powerful than an awful lot of countries, yet they have no democratic accountability. If they can employ real-time facial recognition at airports, they can find a way to remove child abuse images from the internet.
This leads me on to new clause 17, tabled by the right hon. Member for Barking (Dame Margaret Hodge), which would introduce individual director liability for non-compliance. I completely support that sentiment and I agree that this is likely to be the only way we will inject some urgency into the process of compliance. Why should directors who are profiting from the platforms not be responsible if children suffer harm as a result of using their products? That is certainly the case in many other industries. The right hon. Lady used the example of the building trade. Of course there will always be accidents, but if individual directors face the prospect of personal liability, they will act to address the systemic issues, the problems with the processes and the malevolent algorithms that deliberately draw users towards harm.
My hon. Friend knows that I too take a great interest in this, and I am glad that the Government have agreed to continue discussions on this question. Is she aware that the personal criminal liability of directors flows from the corporate criminal liability of the company of which they are directors? Their link to the criminal act itself, even if the company has not been or is not being prosecuted, means that the matter has to be made clear in the legislation, so that we do not have any uncertainty about the relationship between the company director and the company of which he is a director.
I was not aware of that, but I am now. I thank my hon. Friend for that information. This is a crucial point. We need the accountability of the named director associated with the company, the platform and the product in order to introduce the necessary accountability. I do not know whether the Minister will accept this new clause today, but I very much hope that we will look further at how we can make this possible, perhaps in another place.
I very much support the Bill. We need to get it on the statute book, although it will probably need further work, and I support the Government amendments. However, given the link between children viewing pornography and child sexual abuse, I hope that when the Bill goes through the other place, their lordships will consider how regulations around pornographic content can be strengthened, in order to drastically reduce the number of children viewing porn and eventually being drawn into criminal activities themselves. In particular, I would like their lordships to look at tightening and accelerating age verification, and at giving equal treatment to all pornography, whether it is on a porn site or a user-to-user service and whether it is online or offline. Porn is harmful to children in whatever form it comes, so the liability on directors and the criminality must be exactly the same. I support the Bill and the amendments in the Government's name, but it needs to go further when it goes to the other place.
I have raised this on a number of occasions in the past few hours, as have my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) and the right hon. Member for Barking (Dame Margaret Hodge). Will the Minister be good enough to ensure that this matter is thoroughly looked at and, furthermore, that the needed clarification is thought through?
I was going to come to my hon. Friend in two seconds.
In the absence of clearly defined offences, the changes we are making to the Bill mean that it is likely to be almost impossible to take enforcement action against individuals. We are confident that Ofcom will have all the tools necessary to drive the necessary culture change in the sector, from the boardroom down.
This is not the last stage of the Bill. It will be considered in Committee—assuming it is recommitted today—and will come back on Report and Third Reading before going to the House of Lords, so there is plenty of time further to discuss this and to give my hon. Friend the clarification he needs.