Damian Collins

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing it.

With regard to the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend the Member for Haltemprice and Howden (Mr Davis)? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.

Damian Collins

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

--- Later in debate ---
Chris Philp

The Government are often in possession of information—for example, security information relating to the UK intelligence community—that Ofcom, as the proposer of a code or a revised code, may not be in possession of. So the ability of the Secretary of State to propose amendments in those narrow fields, based on information that only the Government have access to, is not wholly unreasonable. My hon. Friend will obviously comment further on this in his speech, and no doubt the other place will give anxious scrutiny to the question as well.

I welcome the architecture in new clause 14 in so far as it relates to the definition of illegal content; that is a helpful clarification. I would also like to draw the House’s attention to amendment 16 to clause 9, which makes it clear that acts that are concerned with the commission of a criminal offence or the facilitation of a criminal offence will also trigger the definitions. That is a very welcome widening.

I do not want to try the House’s patience by making too long a speech, given how much the House has heard from me already on this topic, but there are two areas where, as far as I can see, there are no amendments down but which others who scrutinise this later, particularly in the other place, might want to consider. These are areas that I was minded to look at a bit more over the summer. No doubt it will be a relief to some people that I will not be around to do so. The first of the two areas that might bear more thought is clause 137, which talks about giving academic researchers access to social media platforms. I was struck by Frances Haugen’s evidence on this. The current approach in the Bill is for Ofcom to do a report that will take two years, and I wonder if there could be a way of speeding that up slightly.

The second area concerns the operation of algorithms promoting harmful content. There is of course a duty to consider how that operates, but when it comes to algorithms promoting harmful content, I wonder whether we could be a bit firmer in the way we treat that. I do not think that would restrain free speech, because the right of free speech is the right to say something; it is not the right to have an algorithm automatically promoting it. Again, Frances Haugen had some interesting comments on that.

Sir Jeremy Wright

I agree that there is scope for more to be done to enable those in academia and in broader civil society to understand more clearly what the harm landscape looks like. Does my hon. Friend agree that if they had access to the sort of information he is describing, we would be able to use their help to understand more fully and more clearly what we can do about those harms?

Chris Philp

My right hon. and learned Friend is right, as always. We can only expect Ofcom to do so much, and I think inviting expert academic researchers to look at this material would be welcome. There is already a mechanism in clause 137 to produce a report, but on reflection it might be possible to speed that up. Others who scrutinise the Bill may also reach that conclusion. It is important to think particularly about the operation of algorithmic promotion of harmful content, perhaps in a more prescriptive way than we do already. As I have said, Frances Haugen’s evidence to our Committee in this area was particularly compelling.

--- Later in debate ---
Madam Deputy Speaker (Dame Eleanor Laing)

Order. The House will see that a great many people still wish to speak. May I explain that there are two groups of amendments? We will finish debating this group at 4.30 pm, after which there will be some votes, and debate on the next group of amendments will last until 7 o’clock. By my calculations, there might be more time for speeches during the debate on the next group, so if anyone wishes to speak on that group rather than the current group, I would be grateful if they came and indicated that to me. Meanwhile, if everyone takes about eight minutes and no longer, everyone will have the opportunity to speak. I call Sir Jeremy Wright.

Sir Jeremy Wright

I shall speak to the amendments in my name and the names of other right hon. and hon. Members, to whom I am grateful for their support. I am also grateful to the organisations that helped me to work through some of the problems I am about to identify, including the Carnegie Trust, Reset and the Antisemitism Policy Trust.

On the first amendments I shall talk about, amendments 42 and 43, I have been able to speak to Lego, so I can honestly say that these amendments were put together with Lego. Let me explain. The focus of the Bill, quite rightly, is on safety, and there is no safety more important than the safety of children. In that respect, the Bill is clear: platforms must give the safety of children the utmost priority and pay close attention to ways to enhance it. In other parts of the Bill, however, there are countervailing duties—for example, in relation to freedom of speech and privacy—where, predominantly in relation to adults, we expect platforms to conduct a balancing exercise. It seems right to me to think about that in the context of children, too.

As I said, the emphasis is rightly on children’s safety, but the safest approach would be to prohibit children from any online activity at all. We would not regard such an approach as sensible, because there are benefits to children in being able to engage—safely, of course—in online activity and to use online products and services. It seems to me that we ought to recognise that in the language of the Bill. Amendment 42 would do that when consideration is given to the safety duties designed to protect children set out in clause 11, which requires that “proportionate measures” must be taken to protect children’s safety and goes on to explain what factors might be taken into account when deciding what is proportionate, by adding

“the benefits to children’s well-being”

of the product or service in that list of factors. Amendment 43 would do the same when consideration is given to the online safety objectives set out in schedule 4. Both amendments are designed to ensure that the appropriate balance is struck when judgments are taken by platforms.

Others have spoken about journalistic content, and I am grateful for what the Minister said about that, but my amendment 10 is aimed at the defect that I perceive in clause 16. The Bill gives additional protections and considerations to journalists, which is entirely justifiable, given the important role that journalism plays in our society, but those extra protections mean that it will be harder for platforms to remove potentially harmful content that is also journalistic content. We should be sure, therefore, that the right people get the benefit of that protection.

It is worth having a look at what clause 16 says and does. It sets out that a platform—a user-to-user service—in category 1 will have

“A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about…how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and…whether to take action against a user generating, uploading or sharing such content.”

So it is important, because of the significance of those protections, that we get right the definitions of those who should benefit from them. Amendment 10 would amend clause 16(8), which states that:

“For the purposes of this section content is “journalistic content”, in relation to a user-to-user service, if…the content is”

either

“news publisher content in relation to that service”—

the definition of which I will return to—

“or…regulated user-generated content in relation to that service”.

That is the crucial point. The content also has to be

“generated for the purposes of journalism”

and be linked to the UK.

The first problem here is that journalism is not defined in the Bill. There are definitions of journalism, but none appears in the text of this Bill. “UK-linked” does not narrow it down much, and “regulated user-generated content” is a very broad category indeed. Clause 16 as drafted offers the protection given to journalistic content not just to news publishers, but to almost everybody else who chooses to define themselves as a journalist, whether or not that is appropriate. I do not think that that is what the Bill is intended to do, or an approach that this House should endorse. Amendment 10 would close the loophole by removing the second limb, regulated user-generated content that is not news publisher content. Let me be clear: I do not think that that is the perfect answer to the question I have raised, but it is better than the Bill as it stands, and if the Government can come up with a way of reintroducing protections of this kind for types of journalistic content beyond news publisher content that clearly deserve them, I will be delighted and very much open to it. Currently, however, the Bill is defective and needs to be remedied.

That brings us to the definition of news publisher content, because it is important that if we are to give protection to that category of material, we are clear about what we mean by it. Amendments 11 and 12 relate to the definition of news publisher content that arises from the definition of a recognised news publisher in clauses 49 and 50. That matters for the same reason as I just set out: we should give these protections only to those who genuinely deserve them. That requires rigorous definition. Clause 50 states that if an entity is not named in the Bill, as some are, it must fulfil a set of conditions set out in subsection (2), which includes having a standards code and policies and procedures for handling and resolving complaints. The difficulty here is that in neither case does the Bill refer to any quality threshold for those two things, so having any old standards code or any old policy for complaints will apparently qualify. That cannot be right.

I entirely accept that inserting a provision that the standards code and the complaints policies and procedures should be both “suitable and sufficient” opens up the question of whose job it becomes to decide what is suitable and sufficient. I am familiar with all the problems that may ensue, so again, I do not say that the amendment is the final word on the subject, but I do say that the Government need to look more carefully at what the value of those two items on the list really is if the current definition stands. If we are saying that we want these entities to have a standards code and a complaints process that provide some reassurance that they are worthy of the protections the Bill gives, it seems to me that meaningful criteria must apply, which currently they do not.

The powers of the Secretary of State have also been discussed by others, but I perhaps differ from their view in believing that there should be circumstances in which the Secretary of State should hold powers to act in genuine emergency situations. However, being able to direct Ofcom, as the Bill allows the Secretary of State to do, to modify a code of practice

“for reasons of public policy”

is far too broad. Amendment 13 would simply remove that capacity, with amendment 14 consequential upon it.

I accept that on 7 July the Secretary of State issued a written statement that helps to some extent on that point—it was referred to by my hon. Friend the Member for Croydon South (Chris Philp). First, it states that the Secretary of State would act only in “exceptional circumstances”, although it does not say who defines what exceptional circumstances are, leaving it likely that the Secretary of State would do so, which does not help us much. Secondly, it states the intention to replace the phrase

“for reasons of public policy”

with a list of circumstances in which the Secretary of State might act. I agree with my hon. Friend the Member for Solihull (Julian Knight) that that is still too broad. The proposed list comprises

“national security, public safety, public health, the UK’s international relations and obligations, economic policy and burden to business.”—[Official Report, 7 July 2022; Vol. 717, c. 69WS.]

The platforms we are talking about are businesses. Are we really saying that a burden on them would give the Secretary of State reason to say to Ofcom, the independent regulator, that it must change a code of practice? That clearly cannot be right. This is still too broad a provision. The progress that has been made is welcome, but I am afraid that there needs to be more to further constrain this discretion. That is because, as others have said, the independence of the regulator is crucial not just to this specific part of the Bill but to the credibility of the whole regulatory and legislative structure here, and therefore we should not undermine it unless we have to.

--- Later in debate ---
Julian Knight

Although this is not contained within these measures, it pertains to them. Does my right hon. and learned Friend agree that, down the line, Ofcom will want to look at a regime of compliance officers in order to give the guidance that he seeks?

Sir Jeremy Wright

Yes, that is a possible way forward. Ofcom will need to produce a code of practice in this area. I am sure my hon. Friend on the Front Bench will say that that is a suitable way to deal with the problem that I have identified. It may well be, but at this stage, it is right for the House to recognise that the drafting of the Bill at the moment seeks to offer support to platforms, for which I am sure they will be grateful, but it will need to offer some more in order to allow these judgments to be made.

I restate the point that I have made in previous debates on this subject: there is little point in this House passing legislation aimed at making the internet a safer place if the legislation does not work as it is intended to. If our regime does not work, we will keep not a single person any safer. It is important, therefore, that we think about this Bill not in its overarching statements and principles but, particularly at this stage of consideration, in terms of how it will actually work.

You will not find a bigger supporter of the Bill in this House than me, Madam Deputy Speaker, but I want to see it work well and be effective. That means that some of the problems that I am highlighting must be addressed. Because humility is a good way to approach debates on something as ground-breaking and complex as this, I do not pretend that I have all the right answers. These amendments have been tabled because the Bill as it stands does not quite yet do the job that we want it to do. It is a good Bill—it needs to pass—but it can be better, and I very much hope that this process will improve it.

Joanna Cherry

I rise to speak to new clause 24 and amendments 193 and 191 tabled in my name. I also want to specifically give my support to new clause 6 and amendments 33 and 34 in the name of the right hon. Member for Kingston upon Hull North (Dame Diana Johnson).

The purpose of my amendments, as I have indicated in a number of interventions, is to ensure that, when moderating content, category 1 service providers such as Twitter abide by the anti-discrimination law of our domestic legal systems—that is to say the duties set out in the Equality Act 2010 not to discriminate against, harass or victimise their users on the grounds of a protected characteristic.

I quickly want to say a preliminary word about the Bill. Like all responsible MPs, I recognise the growing concern about online harms, and the need to protect service users, especially children, from harmful and illegal content online. That said, the House of Lords’ Communications and Digital Committee was correct to note that the internet is not currently the unregulated Wild West that some people say it is, and that civil and criminal law already applies to activities online as well as offline.

The duty of care, which the Bill seeks to impose on online services, will be a significant departure from existing legislation regulating online content. It will allow for a more preventative approach to regulating illegal online content and will form part of a unified regulatory framework applying to a wide range of online services. I welcome the benefits that this would represent, especially with respect to preventing the proliferation of child sexual and emotional abuse online.

Before I became an MP, I worked for a number of years as a specialist sex crimes prosecutor, so I am all too aware of how children are targeted online. Sadly, there are far too many people in our society, often hiding in plain sight, who seek to exploit children. I must emphasise that child safeguarding should be a No. 1 priority for any Government. In so far as this Bill does that, I applaud it. However, I do have some concerns that there is a significant risk that the Bill will lead to censorship of legal speech by online platforms. For the reasons that were set out by the right hon. Member for Haltemprice and Howden (Mr Davis), I am also a bit worried that it will give the Government unacceptable controls over what we can and cannot say online, so I am keen to support any amendments that would ameliorate those aspects of the Bill. I say this to those Members around the Chamber who might be looking puzzled: make no mistake, when the Bill gives greater power to online service providers to regulate content, there is a very real risk that they will be lobbied by certain groups to regulate what is actually legal free speech by other groups. That is partly what my amendment is designed to avoid.

Sir Jeremy Wright

What the hon. and learned Lady says is sensible, but does she accept—this is a point the Minister made earlier—that, at the moment, the platforms have almost unfettered control over what they take down and what they leave up? What this Bill does is present a framework for the balancing exercise that they ought to apply in making those decisions.

Joanna Cherry

That is why I am giving the Bill a cautious welcome, but I still stand by my very legitimate concerns about the chilling effect of aspects of this Bill. I will give some examples in a moment about the problems that have arisen when organisations such as Twitter are left to their own devices on their moderation of content policy.

As all hon. Members will be aware, under the Equality Act there are a number of protected characteristics. These include: age; gender reassignment; being married or in a civil partnership; being pregnant or on maternity leave; disability; race, including colour, nationality, ethnic or national origin; religion or belief; sex and sexual orientation. It is against the law to discriminate, victimise or harass anyone because of any of those protected characteristics, but Twitter does discriminate against some of the protected characteristics. It often discriminates against women in the way that I described in an intervention earlier. It takes down expressions of feminist belief, but refuses to take down expressions of the utmost violent intent against women. It also discriminates against women who hold gender-critical beliefs. I remind hon. Members that, in terms of the Employment Appeal Tribunal’s decision in the case of Maya Forstater, the belief that sex matters is worthy of respect in a democratic society and, under the Equality Act, people cannot lawfully discriminate against women, or indeed men, who hold those views.

Twitter also sometimes discriminates against lesbians, gay men and bisexual people who assert that their sexual orientation is on the basis of sex, not gender, despite the fact that same-sex orientation, such as I hold, is a protected characteristic under the Equality Act.

At present, Twitter claims not to be covered by the Equality Act. I have seen correspondence from its lawyers that sets out the purported basis for that claim, partly under reference to schedule 25 to the Equality Act, and partly because it says:

“Twitter UK is included in an Irish Company and is incorporated in the Republic of Ireland. It does pursue economic activity through a fixed establishment in the UK but that relates to income through sales and marketing with the main activity being routed through Ireland.”

I very much doubt whether that would stand up in court, since Twitter is clearly providing a service in the United Kingdom, but it would be good if we took the opportunity of this Bill to clarify that the Equality Act applies to Twitter, so that when it applies moderation of content under the Bill, it will not discriminate against any of the protected characteristics.

The Joint Committee on Human Rights, of which I am currently the acting Chair, looked at this three years ago. We had a Twitter executive before our Committee and I questioned her at length about some of the content that Twitter was content to support in relation to violent threats against women and girls and, on the other hand, some of the content that Twitter took down because it did not like the expression of certain beliefs by feminists or lesbians.

We discovered on the Joint Committee on Human Rights that Twitter’s hateful conduct policy does not include sex as a protected characteristic. It does not reflect the domestic law of the United Kingdom in relation to anti-discrimination law. Back in October 2019, in the Committee’s report on democracy, freedom of expression and freedom of association, we recommended that Twitter should include sex as a protected characteristic in its hateful conduct policy, but Twitter has not done that. It seems Twitter thinks it is above the domestic law of the United Kingdom when it comes to anti-discrimination.

At that Committee, the Twitter executive assured me that certain violent memes that often appear on Twitter directed against women such as me and against many feminists in the United Kingdom, threatening us with death by shooting, should be removed. However, just in the past 48 hours I have seen an example of Twitter’s refusing to remove that meme. Colleagues should be assured that there is a problem here, and I would like us to direct our minds to it, as the Bill gives us an opportunity to do.

Whether or not Twitter is correctly praying in aid the loophole it says there is in the Equality Act—I think that is questionable—the Bill gives us the perfect opportunity to clarify matters. Clause 3 clearly brings Twitter and other online service providers within the regulatory scheme of the Bill as a service with

“a significant number of United Kingdom users”.

The Bill squarely recognises that Twitter provides a service in the United Kingdom to UK users, so it is only a very small step to amend the Bill to make it absolutely clear that when it does so it should be subject to the Equality Act. That is what my new clause 24 seeks to do.

I have also tabled amendments 193 and 191 to ensure that Twitter and other online platforms obey non-discrimination law regarding Ofcom’s production of codes of practice and guidance. The purpose of those amendments is to ensure that Ofcom consults with persons who have expertise in the Equality Act before producing those codes of practice.

I will not push the new clause and amendments to a vote. I had a very productive meeting with the Minister’s predecessor, the hon. Member for Croydon South (Chris Philp), who expressed a great deal of sympathy when I explained the position to him. I have been encouraged by the cross-party support for them, both in discussions before today with Members from all parties and in some of the comments made by various hon. Members today.

I am really hoping that the Government will take these proposals away and give them very serious consideration, that they will look at the Joint Committee’s report from October 2019 and that either they will adopt them or perhaps somebody else will take them forward in the other place.

--- Later in debate ---
At its core, the Online Safety Bill should be about reducing harm, and we are all aligned on that aim. I am disappointed that the Government have reversed some of the effectiveness of the scrutiny in Committee by now amending the Bill to such a degree. I hope the Minister considers our amendments in the collaborative spirit in which they are intended, and recognises their potential to make this Bill stronger and more effective for all.
Sir Jeremy Wright

I think it is extraordinarily important that this Bill does what the hon. Member for Worsley and Eccles South (Barbara Keeley) has just described. As the Bill moves from this place to the other place, we must debate what the right balance is between what the Secretary of State must do—in the previous group of amendments, we heard that many of us believe that is too extensive as the Bill stands—what the regulator, Ofcom, must do and what Parliament must do. There is an important judgment call for this House to make on whether we have that balance right in the Bill as it stands.

These amendments are very interesting. I am not convinced that the amendments addressed by the hon. Lady get the balance exactly right either, but there is cause for further discussion about where we in this House believe the correct boundary is between what an independent regulator should be given authority to do under this legislative and regulatory structure and what we wish to retain to ourselves as a legislature.

Adam Afriyie

My right hon. and learned Friend is highlighting, and I completely agree, that there is a very sensitive balance between different power bases and between different approaches to achieving the same outcome. Does he agree that as even more modifications are made—the nipping and tucking I described earlier—this debate and future debates, and these amendments, will contribute to those improvements over the weeks and months ahead?

Sir Jeremy Wright

Yes, I agree with my hon. Friend about that. I hope it is some comfort to the hon. Member for Worsley and Eccles South when I say that if the House does not support her amendment, it should not be taken that she has not made a good point that needs further discussion—probably in the other place, I fear. We are going to have to think carefully about that balance. It is also important that we do not retain to ourselves as a legislature those things that the regulator ought to have in its own armoury. If we want Ofcom to be an effective and independent regulator in this space, we must give it sufficient authority to fulfil that role. She makes interesting points, although I am not sure I can go as far as supporting her amendments. I know that is disappointing, but I do think that what she has done is prompt a further debate on exactly this balance between Secretary of State, Executive, legislature and regulator, which is exactly where we need to be.

I have two other things to mention. The first relates to new clause 7 and amendment 33, which the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) tabled. She speaks powerfully to a clear need to ensure that this area is properly covered. My question, however, is about practicalities. I am happy to take an intervention if she can answer it immediately. If not, I am happy to discuss it with her another time. She has heard me speak many times about making sure that this Bill is workable. The challenge in what she has described in her amendments may be that a platform needs to know how it is to determine and “verify”—that is the word she has used—that a participant in a pornographic video is an adult and a willing participant. It is clearly desirable that the platform should know both of those things, but the question that will have to be answered is: by what mechanism will it establish that? Will it ask the maker of the pornographic video and be prepared to accept the assurances it is given? If not, by what other mechanism should it do this? For example, there may be a discussion to be had on what technology is available to establish whether someone is an adult or is not—that bleeds into the discussion we have had about age assurance. It may be hard for a platform to establish whether someone is a willing participant.

Jess Phillips (Birmingham, Yardley) (Lab)

This has been quite topical this week. With anything shown on any platform that is broadcast on television, people absolutely have to have signed forms to say that they are a willing participant. It is standard across all other broadcast media that people sign consent forms and that people’s images are not allowed to be used without their consent.

Sir Jeremy Wright

Yes, I am grateful to the hon. Lady for that useful addition to this debate, but it goes to the point I was seeking to clarify: whether what the right hon. Member for Kingston upon Hull North has in mind is to ensure that a platform would be expected to make use of those mechanisms that already exist in order to satisfy itself of the things that she rightly asks it to be satisfied of, or whether something beyond that would be required to meet her threshold. If it is the former, that is manageable for platforms and perfectly reasonable for us to expect of them. If it is the latter, we need to understand a little more clearly how she expects a platform to achieve that greater assurance; if she can show that, she makes an interesting point.

Finally, let me come to amendment 56, tabled by my hon. Friend the Member for Windsor (Adam Afriyie). Again, I have a practical concern. He seeks to ensure that the pornographic content is “taken as a whole”, but I think it is worth remembering why we have included pornographic content in the context of this Bill. We have done it to ensure that children are not exposed to this content online and that where platforms are capable of preventing that from happening, that is exactly what they do. If we take this content as a whole, it is perfectly conceivable that there may be content online that is four hours long, only 10 minutes of which is pornographic in nature. It does not seem to me that that in any way diminishes our requirement of a platform to ensure that children do not see those 10 minutes of pornographic content.

Adam Afriyie

I am very sympathetic to that view. I am merely flagging up for the Minister that if we get the opportunity, we need to have a look at it again in the Lords, to be absolutely certain that we are not ruling out certain types of art, and certain types of community sites that we would all think were perfectly acceptable, that are probably not accessible to children, just to ensure that we are not creating further problems down the road that we would have to correct.

Sir Jeremy Wright

I follow that point. I will channel, with some effort, the hon. Member for Birmingham, Yardley (Jess Phillips), who I suspect would say that these things are already up for debate and discussed in other contexts—the ability to distinguish between art and pornography is something that we have wrestled with in other media. Actually, in relation to the Bill, I think that one of our guiding principles ought to be that we do not reinvent the wheel where we do not have to, and that we seek to apply to the online world the principles and approaches that we would expect in all other environments. That is probably the answer to my hon. Friend’s point.

I think it is very important that we recognise the need for platforms to do all they can to ensure that the wrong type of material does not reach vulnerable users, even if that material is a brief part of a fairly long piece. Those, of course, are exactly the principles that we apply to the classification of films and television. It may well be that a small portion of a programme constitutes material that is unsuitable for a child, but we would still seek to put it the wrong side of the 9 o’clock watershed or use whatever methods we think the regulator ought to adopt to ensure that children do not see it.

Good points are being made. The practicalities are important; it may be that because of a lack of available time and effort in this place, we have to resolve those elsewhere.

John Nicolson

I wish to speak to new clause 33, my proposed new schedule 1 and amendments 201 to 203. I notice that the Secretary of State is off again. I place on record my thanks to Naomi Miles of CEASE—the Centre to End All Sexual Exploitation—and Ceri Finnegan of Barnardo’s for their support.

The UK Government have taken some steps to strengthen protections on pornography and I welcome the fact that young teenagers will no longer be able to access pornography online. However, huge quantities of extreme and harmful pornography remain online, and we need to address the damage that it does. New clause 33 would seek to create parity between online and offline content—consistent legal standards for pornography. It includes a comprehensive definition of pornography and puts a duty on websites not to host content that would fail to attain the British Board of Film Classification standard for R18 classification.

The point of the Bill, as the Minister has repeatedly said, is to make the online world a safer place, by doing what we all agree must be done—making what is illegal offline, illegal online. That is why so many Members think that the lack of regulation around pornography is a major omission in the Bill.

The new clause stipulates age and consent checks for anyone featured in pornographic content. It addresses the proliferation of pornographic content that is both illegal and harmful, protecting women, children and minorities on both sides of the camera.

The Bill presents an opportunity to end the proliferation of illegal and harmful content on the internet. Representations of sexual violence, animal abuse, incest, rape, coercion, abuse and exploitation—particularly directed towards women and children—are rife. Such content can normalise dangerous and abusive acts and attitudes, leading to real-world harm. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said in her eloquent speech earlier, we are seeing an epidemic of violence against women and girls online. When bile and hatred is so prolific online, it bleeds into the offline space. There are real-world harms that flow from that.

The Minister has said how much of a priority tackling violence against women and girls is for him. Knowing that, and knowing him, he will understand that pornography is always harmful to children, and certain kinds of pornographic content are also potentially harmful to adults. Under the Video Recordings Act 1984, the BBFC has responsibility for classifying pornographic content to ensure that it is not illegal, and that it does not promote an interest in abusive relationships, such as incest. Nor can it promote acts likely to cause serious physical harm, such as breath restriction or strangulation. In the United Kingdom, it is against the law to supply pornographic material that does not meet this established BBFC classification standard, but there is no equivalent standard in the online world because the internet evolved without equivalent regulatory oversight.

I know too that the Minister is determined to tackle some of the abusive and dangerous pornographic content online. The Bill does include a definition of pornography, in clause 66(2), but that definition is inadequate; it is too brief and narrow in scope. In my amendment, I propose a tighter and more comprehensive definition, based on that in part 3 of the Digital Economy Act 2017, which was debated in this place and passed into law. The amendment will remove ambiguity and prevent confusion, ensuring that all websites know where they stand with regard to the law.

The new duty on pornographic websites aligns with the UK Government’s 2020 legislation regulating UK-established video-sharing platforms and video-on-demand services, both of which refer to the BBFC’s R18 classification standards. The same “high standard of rules in place to protect audiences”, as the 2020 legislation put it, and “certain content standards” should apply equally to online pornography and offline pornography, UK-established video-sharing platforms and video-on-demand services.

Let me give some examples sent to me by Barnardo’s, the children’s charity, which, with CEASE, has done incredibly important work in this area. The names have been changed in these examples, for obvious reasons.

“There are also children who view pornography to try to understand their own sexual abuse. Unfortunately, what these children find is content that normalises the most abhorrent and illegal behaviours, such as 15-year-old Elizabeth, who has been sexually abused by a much older relative for a number of years. The content she found on pornography sites depicted older relatives having sex with young girls and the girls enjoying it. It wasn’t until she disclosed her abuse that she realised that it was not normal.

Carrie is a 16-year-old who was being sexually abused by her stepfather. She thought this was not unusual due to the significant amount of content she had seen on pornography sites showing sexual relationships within stepfamilies.”

That is deeply disturbing evidence from Barnardo’s.

Although in theory the Bill will prevent under-18s from accessing such content, the Minister knows that under-18s will be able to bypass regulation through technology like VPNs, as the DCMS Committee and the Bill Committee—I served on both—were told by experts in various evidence sessions. The amendment does not create a new law; it merely moves existing laws into the online space. There is good cause to regulate and sometimes prohibit certain damaging offline content; I believe it is now our duty to provide consistency with legislation in the online world.