I say this to the Whip on the Front Bench, and I hope that I have his attention: the Bill needs many more days on Report. I hope that he will reflect that back to the Chief Whip at the end of this business, because only with more days can we get it right. This is probably one of the most important Bills to go through this House in this decade, and we have not quite got it right yet.
John Nicolson (Ochil and South Perthshire) (SNP)

I rise to speak to the amendments in my name and those of other right hon. and hon. Members. I welcome the Minister to his place after his much-deserved promotion; as other hon. Members have said, it is great to have somebody who is both passionate and informed as a Minister. I also pay tribute to the hon. Member for Croydon South (Chris Philp), who is sitting on the Back Benches: he worked incredibly hard on the Bill, displayed a mastery of detail throughout the process and was extremely courteous in his dealings with us. I hope that he will be speedily reshuffled back to the Front Bench, which would be much deserved—but obviously not that he should replace the Minister, who I hope will remain in his current position or indeed be elevated from it.

But enough of all this souking, as we say north of the border. As one can see from the number of amendments tabled, the Bill is not only an enormous piece of legislation but a very complex one. Its aims are admirable—there is no reason why this country should not be the safest place in the world to be online—but a glance through the amendments shows how many holes hon. Members think it still has.

The Government have taken some suggestions on board. I welcome the fact that they have finally legislated outright to stop the wicked people who attempt to trigger epileptic seizures by sending flashing GIFs; I did not believe that such cruelty was possible until I was briefed about it in preparation for debates on the Bill. I pay particular tribute to wee Zach, whose name is often attached to what has been called Zach’s law.

The amendments to the Bill show that there has been a great deal of cross-party consensus on some issues, on which it has been a pleasure to work with friends in the Labour party. The first issue is addressed, in various ways, by amendments 44 to 46, 13, 14, 21 and 22, which all try to reduce the Secretary of State’s powers under the Bill. In all the correspondence that I have had about the Bill, and I have had a lot, that is the area that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted too many powers under the Bill, which threatens the independence of the regulator. Businesses are also wary of the powers, in part because they cause uncertainty.

A reduction of the ministerial powers under the Bill was recommended by the Joint Committee on the Draft Online Safety Bill and by the Select Committee on Digital, Culture, Media and Sport, on both of which I served. In Committee, I asked the then Minister whether any stakeholder had come forward in favour of these powers. None had.

Even DCMS Ministers do not agree with the powers. The new Minister was Chair of the Joint Committee, and his Committee’s report said:

“The powers for the Secretary of State to a) modify Codes of Practice to reflect Government policy and b) give guidance to Ofcom give too much power to interfere in Ofcom’s independence and should be removed.”

The Government have made certain concessions with respect to the powers, but they do not go far enough. As the Minister said, the powers should be removed.

We should be clear about exactly what the powers do. Under clause 40, the Secretary of State can

“modify a draft of a code of practice”.

That allows the Government a huge amount of power over the so-called independent communications regulator. I am glad that the Government have listened to the suggestions that my colleagues and I made on Second Reading and in Committee, and have committed to using the power only in “exceptional circumstances” and by further defining “public policy” motives. But “exceptional circumstances” is still too opaque and nebulous a phrase. What exactly does it mean? We do not know. It is not defined—probably intentionally.

The regulator must not be politicised in this way. Several similar pieces of legislation are going through their respective Parliaments or are already in force. In Germany, Australia, Canada, Ireland and the EU, with the Digital Services Act, different Governments have grappled with the issue of making digital regulation future-proof and flexible. None of them has added political powers. The Bill is sadly unique in making such provision.

When a Government have too much influence over what people can say online, the implications for freedom of speech are particularly troubling, especially when the content that they are regulating is not illegal. There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach that these powers give. When we allow the Executive powers over the communications regulator, the protections must be absolute and iron-clad, but as the Bill stands, it gives leeway for abuse of those powers. No matter how slim the Minister feels the chance of that may be, as parliamentarians we must not allow it.

Amendment 187 on human trafficking is an example of a relatively minor change to the Bill that could make a huge difference to people online. Our amendment seeks to deal explicitly with what Meta and other companies refer to as domestic servitude, which is very newsworthy, today of all days, and which we know better as human trafficking. Sadly, this abhorrent practice has been part of our society for hundreds if not thousands of years. Today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.

Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell and co-ordinate the trafficking of young women. One would have thought that the issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported, The Wall Street Journal found that

“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”

I and my friends across the aisle who sat on the DCMS Committee and the Joint Committee on the draft Bill know exactly what it is like to have Facebook’s high heid yins before us. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.

The omission of human trafficking from schedule 7 is especially worrying, because if human trafficking is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know from their previous behaviour that the platforms never do anything that will cost them money unless they are forced to do so. We understand that it is difficult to regulate in respect of human trafficking on platforms: it requires work across borders and platforms, with moderators speaking different languages. It is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and for the costs that will be entailed. If human trafficking is not designated as a priority harm, I fear that it will fall by the wayside.

In Committee, the then Minister said that the relevant legislation was covered by other parts of the Bill and that it was not necessary to incorporate offences under the Modern Slavery Act 2015 into priority illegal content. He referred to the complexity of offences such as modern slavery, and suggested that the priority offences covering illegal immigration and prostitution might already capture it. That is simply not good enough. Human traffickers use platforms as part of their arsenal at every stage of the process, from luring in victims to co-ordinating their movements and threatening their families. The largest platforms have ample capacity to tackle these problems and must be forced to be proactive. The consequences of inaction will be grave.

Chris Philp (Croydon South)

It is a pleasure to follow the hon. Member for Ochil and South Perthshire (John Nicolson).

Let me begin by repeating my earlier congratulations to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on assuming his place on the Front Bench. Let me also take this opportunity to extend my thanks to those who served on the Bill Committee with me for some 50 sitting hours—it was, generally speaking, a great pleasure—and, having stepped down from the Front Bench, to thank the civil servants who have worked so hard on the Bill, in some cases over many years.

--- Later in debate ---
Sir Jeremy Wright

I follow that point. I will channel, with some effort, the hon. Member for Birmingham, Yardley (Jess Phillips), who I suspect would say that these things are already up for debate and discussed in other contexts—the ability to distinguish between art and pornography is something that we have wrestled with in other media. Actually, in relation to the Bill, I think that one of our guiding principles ought to be that we do not reinvent the wheel where we do not have to, and that we seek to apply to the online world the principles and approaches that we would expect in all other environments. That is probably the answer to my hon. Friend’s point.

I think it is very important that we recognise the need for platforms to do all they can to ensure that the wrong type of material does not reach vulnerable users, even if that material is a brief part of a fairly long piece. Those, of course, are exactly the principles that we apply to the classification of films and television. It may well be that a small portion of a programme constitutes material that is unsuitable for a child, but we would still seek to put it the wrong side of the 9 o’clock watershed or use whatever methods we think the regulator ought to adopt to ensure that children do not see it.

Good points are being made. The practicalities are important; it may be that because of a lack of available time and effort in this place, we have to resolve those elsewhere.

John Nicolson

I wish to speak to new clause 33, my proposed new schedule 1 and amendments 201 to 203. I notice that the Secretary of State is off again. I place on record my thanks to Naomi Miles of CEASE—the Centre to End All Sexual Exploitation—and Ceri Finnegan of Barnardo’s for their support.

The UK Government have taken some steps to strengthen protections on pornography and I welcome the fact that young teenagers will no longer be able to access pornography online. However, huge quantities of extreme and harmful pornography remain online, and we need to address the damage that it does. New clause 33 would seek to create parity between online and offline content—consistent legal standards for pornography. It includes a comprehensive definition of pornography and puts a duty on websites not to host content that would fail to attain the British Board of Film Classification standard for R18 classification.

The point of the Bill, as the Minister has repeatedly said, is to make the online world a safer place, by doing what we all agree must be done—making what is illegal offline, illegal online. That is why so many Members think that the lack of regulation around pornography is a major omission in the Bill.

The new clause stipulates age and consent checks for anyone featured in pornographic content. It addresses the proliferation of pornographic content that is both illegal and harmful, protecting women, children and minorities on both sides of the camera.

The Bill presents an opportunity to end the proliferation of illegal and harmful content on the internet. Representations of sexual violence, animal abuse, incest, rape, coercion, abuse and exploitation—particularly directed towards women and children—are rife. Such content can normalise dangerous and abusive acts and attitudes, leading to real-world harm. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said in her eloquent speech earlier, we are seeing an epidemic of violence against women and girls online. When bile and hatred is so prolific online, it bleeds into the offline space. There are real-world harms that flow from that.

The Minister has said how much of a priority tackling violence against women and girls is for him. Knowing that, and knowing him, he will understand that pornography is always harmful to children, and certain kinds of pornographic content are also potentially harmful to adults. Under the Video Recordings Act 1984, the BBFC has responsibility for classifying pornographic content to ensure that it is not illegal, and that it does not promote an interest in abusive relationships, such as incest. Nor can it promote acts likely to cause serious physical harm, such as breath restriction or strangulation. In the United Kingdom, it is against the law to supply pornographic material that does not meet this established BBFC classification standard, but there is no equivalent standard in the online world because the internet evolved without equivalent regulatory oversight.

I know too that the Minister is determined to tackle some of the abusive and dangerous pornographic content online. The Bill does include a definition of pornography, in clause 66(2), but that definition is inadequate; it is too brief and narrow in scope. In my amendment, I propose a tighter and more comprehensive definition, based on that in part 3 of the Digital Economy Act 2017, which was debated in this place and passed into law. The amendment will remove ambiguity and prevent confusion, ensuring that all websites know where they stand with regard to the law.

The new duty on pornographic websites aligns with the UK Government’s 2020 legislation regulating UK-established video-sharing platforms and video-on-demand services, both of which apply the BBFC’s R18 classification standard. The same “high standard of rules in place to protect audiences”, as the 2020 legislation put it, and “certain content standards” should apply equally to online and offline pornography, to UK-established video-sharing platforms and to video-on-demand services.

Let me give some examples sent to me by Barnardo’s, the children’s charity, which, with CEASE, has done incredibly important work in this area. The names have been changed in these examples, for obvious reasons.

“There are also children who view pornography to try to understand their own sexual abuse. Unfortunately, what these children find is content that normalises the most abhorrent and illegal behaviours, such as 15-year-old Elizabeth, who has been sexually abused by a much older relative for a number of years. The content she found on pornography sites depicted older relatives having sex with young girls and the girls enjoying it. It wasn’t until she disclosed her abuse that she realised that it was not normal.

Carrie is a 16-year-old who was being sexually abused by her stepfather. She thought this was not unusual due to the significant amount of content she had seen on pornography sites showing sexual relationships within stepfamilies.”

That is deeply disturbing evidence from Barnardo’s.

Although in theory the Bill will prevent under-18s from accessing such content, the Minister knows that under-18s will be able to bypass regulation through technology like VPNs, as the DCMS Committee and the Bill Committee—I served on both—were told by experts in various evidence sessions. The amendment does not create a new law; it merely moves existing laws into the online space. There is good cause to regulate and sometimes prohibit certain damaging offline content; I believe it is now our duty to provide consistency with legislation in the online world.

Kirsty Blackman

I want to talk about several things, but particularly new clause 7. I am really pleased that the new clause has come back on Report, as we discussed it in the Bill Committee but unfortunately did not get enough support for it there—as was the case with everything we proposed—so I thank the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for tabling it. I also thank my hon. Friend the Member for Inverclyde (Ronnie Cowan) for his lobbying and for providing us with lots of background information. I agree that it is incredibly important that new clause 7 is agreed, particularly the provisions on consent and making sure that participants are of an appropriate age to be taking part. We have heard so many stories of so many people whose videos are online—whose bodies are online—and there is nothing they can do about it because of the lack of regulation. My hon. Friend the Member for Ochil and South Perthshire (John Nicolson) has covered new clause 33 in an awful lot of detail—very good detail—so I will not comment on that.

The right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) mentioned how we need to get the balance right, and specifically talked about the role of the regulator. In many ways, this Bill has failed to get the balance right in its attempts to protect children online. Many people who have been involved in writing this Bill, talking about this Bill, scrutinising this Bill and taking part in every piece of work that we have done around it do not understand how children use the internet. Some people do, absolutely, but far too many of the people who have had any involvement in this Bill do not. They do not understand the massive benefits to children of using the internet, the immense amount of fun they can have playing Fortnite, Fall Guys, Minecraft, or whatever it is they happen to be playing online and how important that is to them in today’s crazy world with all of the social media pressures. Children need to decompress. This is a great place for children to have fun—to have a wonderful time—but they need to be protected, just as we would protect them going out to play in the park, just the same as we would protect them in all other areas of life. We have a legal age for smoking, for example. We need to make sure that the protections are in place, and the protections that are in place need to be stronger than the ones that are currently in the Bill.

I did not have a chance earlier—or I do not think I did—to support the clause about violence against women and girls. As I said in Committee, I absolutely support that being in the Bill. The Government may say, “Oh we don’t need to have this in the Bill because it runs through everything,” but having it written in the Bill would make clear to internet service providers—to all those people providing services online and hosting user-generated content on their sites—how important this is and how much of a scourge it is. Young women who spend their time on social media are more likely to have lower outcomes in life as a result of problematic social media use and the pain and suffering it causes. We should be putting such a measure in the Bill, and I will continue to argue for that.

We have talked a lot about pornographic content in this section. There is not enough futureproofing in the Bill. My hon. Friend the Member for Ochil and South Perthshire and I tabled amendment 158 because we are concerned about that lack of futureproofing. The amendment edits the definition of “content”. The current definition of “content” covers basically anything online, and it includes a list of examples. We have suggested that it should say “including but not limited to”, on the basis that we do not know what the internet will look like in two years’ time, let alone what it will look like in 20 years’ time. If this Bill is to stand the test of time, it needs to be clear that that list is not exhaustive. It needs to be clear that, when we are getting into virtual reality metaverses where people are meeting each other, that counts as well. It needs to be clear that the sex dungeon that exists in the children’s game Roblox is an issue, whether or not that content fits the current definition of “content” as written communication, images or whatever else. If it is anything harmful that children can find on the internet, it should be included in the definition of “content”, no matter whether it fits any of those specific categories. We just do not know what the internet is going to look like.

I have one other specific thing in relation to the issues of content and pornography. One of the biggest concerns that we heard is the massive increase in the amount of self-generated child sexual abuse images. A significant number of new images of child sexual abuse are self-generated. Everybody has a camera phone these days. Kids have camera phones these days. They have much more potential to get themselves into really uncomfortable and difficult situations than when most of us were younger. There is so much potential for that to be manipulated unless we get this right.