
Digital Economy Act 2017 (Commencement of Part 3) Bill [HL]

1st reading
Wednesday 9th June 2021


Lords Chamber
First Reading
12:51
A Bill to bring into force the remaining sections of Part 3 of the Digital Economy Act 2017.
The Bill was introduced by Lord Morrow, read a first time and ordered to be printed.
12:52
Sitting suspended.

Digital Economy Act 2017 (Commencement of Part 3) Bill [HL]

2nd reading
Friday 28th January 2022


Lords Chamber
Second Reading
10:14
Moved by
Lord Morrow

That the Bill be now read a second time.

Lord Morrow (DUP)

My Lords, first, I take the opportunity to acknowledge the noble Lords attending today to support this Bill. I particularly want to thank those who have made a special effort to attend today and speak on this important issue. I look forward to hearing everyone’s contribution to this debate; I know that many others would have liked to be here today but are unable to be present. It is clear that there is strong cross-party support for this issue and for action to be taken.

I also want to put on the record my thanks for the pioneering work of Baroness Howe, who tabled this Bill in the previous Parliament, before her retirement in June 2020. She set a high bar for those of us concerned with online safety.

Before I set out the reasons for this Bill, I want to explain what it seeks to achieve. It is a very simple, one-clause Bill that seeks to ensure that regulations are brought into force to commence Part 3 of the Digital Economy Act by 20 June this year. In brief, Part 3 requires commercial pornographic websites to introduce age verification so that children are unable to access pornographic material. It requires the appointment of a regulator to oversee age verification and to instruct internet service providers to block sites that do not have age verification or that contain illegal and extreme pornography.

It seems remarkable that this Bill is even needed. The Digital Economy Act received Royal Assent on 27 April 2017, and it has been almost five years since Parliament passed that legislation into law. I am not sure what the correct adjective is to use: “shocking” does not seem to be strong enough, but it is shocking that, almost five years after legislation was passed to protect children from accessing pornography, the Government have continued to fail in their obligation to introduce this part of the 2017 Act and provide the children of the UK with some level of protection from accessing pornography online. Indeed, the July 2021 report of the Communications and Digital Committee of your Lordships’ House said:

“The Government’s inaction has severely impacted children”.


It is not just children who are failed by the Government’s inaction; it is also women who are let down by this legislation not being brought into force. As I said, Part 3 of the DEA provided for a regulator with robust power to deal with extreme and violent pornography. We do not have to cast our minds back too far into the past to see the harm caused by extreme pornography. Just last spring, a woman who should have felt safe walking home in the early evening through the streets of London was attacked and died at the hands of a man who was addicted to the type of pornography that Part 3 of the DEA seeks to address. We will of course never know what might have happened in the case of Sarah Everard if this legislation had been in force, but what we can say with certainty is that action would have been taken to address the type of pornography to which her attacker had formed an addiction.

This legislation should have been in force in 2018, but there were several delays in getting the framework in place. Just as everyone thought that Part 3 was about to come into effect, it came to light that there was a departmental error: the department failed to inform the EU that the legislation was going into force, and implementation was delayed again. Two weeks after the EU notification period was concluded, in October 2019, the Government announced that they were not going to implement the legislation after all, despite spending £2.2 million to ensure that the BBFC was ready to be the regulator. The implementation of the legislation was shelved. The Government gave no indication that they were going to take this course of action; they did not consult with interested parties or speak to children’s charities or the many organisations helping women across this country. They simply buried the legislation.

The question that has never been answered is why. When the Government made their decision in 2019, we were told that new proposals would be brought forward in early 2020 for pre-legislative scrutiny. While Covid could be advanced as a reason for delay, it is surprising that pre-legislative scrutiny of what is now the online safety Bill did not begin until May 2021, more than 18 months after the Government promised new proposals. As of today, the online safety Bill has completed that pre-legislative scrutiny and is still a few months away from starting its parliamentary journey.

Five years after Parliament legislated that age verification be placed on pornographic websites, children do not have the protection of that technology to keep them safe online. While the Government will probably respond that the online safety Bill will fix the problem—and it may well do—the jury is definitely out on that matter, since the Bill as currently drafted does not have robust age-verification measures contained within its provisions and pornography is not mentioned on the face of the Bill. Put simply, the online safety Bill, as currently drafted, is less robust in protecting children compared with Part 3 of the DEA. The online safety Bill as drafted covers only user-to-user content. It does not cover all commercial pornographic websites. It is unacceptable that protection for children online should be diminished in any new law.

Whatever form the online safety Bill takes when it receives Royal Assent, one abiding problem remains: what are we going to do now? Five years is already too long to wait for protections to be put in place, given that the online safety Bill is unlikely to receive Royal Assent before 2023, and the Government tell us that it may take 18 months for Ofcom to be ready to assume the role of regulator. What do the Government propose that we do now and for the next few years? We face the prospect of almost 10 years having passed between age verification having first been raised in the Conservative manifesto as a way of protecting children online and that vital safety measure being put in place on commercial pornography websites.

A child who was eight years old when this proposal was first put forward in 2016 will be an adult by the time the protections are finally in place; they will have gone through their formative years and been exposed to untold harm online. Potentially, they will be in the grip of addiction by the time this protection is made a legal requirement, if it is at all—yet it could have been avoided. If the Government had only done what they were supposed to do by law, that child, who will be an adult by the time the online safety Bill is implemented, could have been protected.

The charity Naked Truth helps adults facing this reality. One person they have helped is Jack from Manchester. Jack says that he was first exposed to pornography when he was 11. His story is a common one: a group of boys looking online and coming across pornographic material. By the age of 16, Jack was addicted to online pornography. He says, “Some of the things that I saw made me excited, yet others shocked and disgusted me and made me feel almost sick. Yet as I explored this world more and more, I found that the things that first made me gasp in shock and disbelief slowly started to become attractive.” He became desensitised to what he was seeing online.

As Jack entered adult life, his addiction had taken hold and he could not stop. It infected his entire life. Jack continued, “It took me over 10 years to rid myself of pornography addiction that started at a young age. Its effect on me, my mental health and attitude to women has ruined my life as a teenager and young adult and still deeply affects these aspects of my life to this day. I wish only that there were a way to stop and protect children—like I once was—from pornography, so they would not make the same mistakes I made and have the inappropriate exposure that I was first exposed to.”

There are hundreds of thousands of children like Jack across the UK. According to research by DCMS, 80% of children aged six to 12 have viewed something harmful online, while over 50% of teenagers believe that they have accessed illegal content online. Jack and the millions of children like him are the ones that Part 3 of the DEA was enacted to help, yet they have been failed. We cannot allow children to continue to be let down, especially when there is legislation on the statute book right now that would protect them. If passed, this Bill before the House today would ensure that protection was in place this year.

It is important to understand that it is not just a matter of waiting for the online safety Bill to come into force. The position is not that the online safety Bill will do all that Part 3 would have done—albeit a number of years late—but that the substance of the Bill as currently drafted is considerably less than what Part 3 of the DEA would deliver in a number of respects.

First, commercial pornography sites are not captured by the current draft of the online safety Bill. The Joint Committee scrutinising the Bill has recommended in its report that the Bill be amended to include pornography—a position supported by the Digital, Culture, Media and Sport Committee in the other place in its report published on Monday. We await the Government’s response, but the Bill at present would allow many pornographic sites to continue operating in a non-regulated manner.

Secondly, how age verification will operate and to which parts of the online world it will apply are unknown. The Bill documents state:

“The proportion of businesses required to employ age assurance controls and the type of controls required are unknown at this stage, this will be set out in future codes of practice.”


The Joint Committee proposes that the age-assurance design code be utilised to cover age verification, but we do not know how that will work. The design code relates to the processing of data and the Information Commissioner is clear that they do not believe that it covers content. So we are as yet uncertain about how age verification will continue—and even whether it will operate at all. That clarity and certainty can be delivered by Part 3 of the DEA.

Thirdly, and in a similar vein, there is no requirement to block extreme pornographic websites. The Government’s 2021 Tackling Violence Against Women and Girls strategy states:

“Through the new Online Safety Bill, companies will need to take swift and effective action against illegal content targeted at women … The Government will work with stakeholders and Parliamentarians to identify priority illegal harms which will be specified in secondary legislation and may include those of particular relevance to women, such as ‘revenge porn’, extreme pornography”.


Again, we have a lack of certainty about how women will be protected from online pornography, despite the strategy saying:

“The Call for Evidence showed a widespread consensus about the harmful role violent pornography can play in violence against women and girls, with most respondents to the open public surveys and many respondents to the nationally representative survey agreeing that an increase in violent pornography has led to more people being asked to agree to violent sex acts”.


Fourthly, it is not clear whether the wide list of actions that are considered enforceable under the draft online safety Bill will be effective in preventing harm to children or violence to women. Part 3 of the DEA relies on the regulator asking ancillary services to block services or requiring ISPs to block websites to enforce the provisions. Under the draft online safety Bill, only in rare situations will a court—rather than the regulator, Ofcom—direct an ancillary service to take action against an ISP or other service to block access to a provider, and it is not clear how proactive Ofcom will be in ensuring that websites are implementing the duty of care as set out in that Bill. There is so much uncertainty surrounding the online safety Bill, yet we have sure and certain legislation on the statute book right now. My Bill would ensure it was brought into force this year.

It is disappointing that the Government have continually ignored pleas from across this House and the other place to implement this legislation. If the Government continue to be unwilling to implement Part 3 of the DEA, what then is the alternative? Are there any other measures that they plan to bring forward in the interim to ensure that children are protected? Speaking in the other place on 10 June last year, responding to the Ofsted review on sexual abuse in schools and colleges, the then Parliamentary Under-Secretary of State for Education stated:

“The Online Safety Bill will deliver a groundbreaking system of accountability and oversight of tech companies and make them accountable to an independent regulator. The strongest protections in the new regulatory framework will be for children, and companies will need to take steps to ensure that children cannot access services that pose the highest risk of harm, such as online pornography. In addition, the Secretary of State for Education and the Secretary of State for Digital, Culture, Media and Sport have asked the Children’s Commissioner to start looking immediately at how we can reduce children and young people’s access to pornography and other harmful content. That work will identify whether there are actions that can be taken more quickly to protect children before the Online Safety Bill comes into effect.”—[Official Report, Commons, 10/6/21; col. 1162.]


That was seven months ago and, apart from some press reports stating the views of the Children’s Commissioner for England and Wales, we do not have any clear indication of what interim measures the Government are going to take to reduce access by children to pornographic and other harmful content online.

When Part 3 was delayed, the Government said that preventing children’s access to pornography is a critically urgent issue. The slow pace of responding to the Ofsted report suggests that the Government think otherwise. The Government say they are going to identify action that can be taken to protect children, yet the one action they can take, legislative action—namely, the implementation of Part 3 of the DEA—is the one thing they continue to refuse to do. The Joint Committee, when reporting on the online safety Bill, stated that age assurance needed to be in place within six months of the Bill receiving Royal Assent. I understand that the Government believe it could take two years for a regulator to be in place to properly administer age verification. If it will indeed take two years for Ofcom to get ready to be the regulator and consult on its role and legal powers, surely the Government should start the process now.

I appreciate that there is doubt about the steps Ofcom can take now to speed up implementation of whatever new regulations will be required to give effect to the online safety Bill’s provision in respect of pornographic sites. Mindful of that, an opportunity exists for the Secretary of State to utilise the Digital Economy Act 2017 to allow Ofcom to start work now. Ofcom could be designated under Section 17 of the DEA as the regulator. This would, at the very least, give it legal cover to undertake research into the size, shape and nature of the online pornography market in the UK, the readiness of the industry to respond to any new laws on age verification, the relevant technologies and related matters. Once that preparatory work is complete, it would have a clear idea of how it would need to respond when its responsibilities under the online safety Bill become clear and would be able to prepare a consultation process to ensure that regulation begins as soon as possible after Royal Assent.

Let me be clear: this is far from the preferred option but does, at the very least, represent a way forward and allows the Government to use Part 3 of the DEA to ensure that protections are delivered without delay once the online safety Bill is law. I ask the House to support this Bill and send a message to children and women across this nation that we value them. Real lives are being affected by the current lack of protection. We must take action to prevent more people like Jack falling into the addiction that has had a devastating effect on his life. We cannot allow another tragedy like that which happened to Sarah Everard.

While Part 3 of the Digital Economy Act will not solve all the issues in relation to online pornography, one thing that is certain is that the landscape will be much safer with those protections than without them. It is not just Members of this House who believe that; the general public want action on this issue now. According to BBFC research, 83% of parents across the UK want the Government to act and bring in age-verification measures now. It is within the Government’s gift to provide protection. That is why I ask that the Government take this Bill seriously and that, if they do not support it, they set out urgently their alternative proposals to ensure that children and women are protected to the same level as envisaged by Part 3 of the DEA while the online safety Bill makes its way through the House. I beg to move.

10:34
Baroness Benjamin (LD)

My Lords, I congratulate the noble Lord, Lord Morrow, on tabling this Bill to debate this important issue and I declare an interest as a vice-president of Barnardo’s. It is frustrating that once again we find ourselves in this House debating the need to protect children and women from the dangers of online pornography, because this matter was settled in 2017 with Part 3 of the Digital Economy Act, which should have been implemented by 2019. It goes without saying that children must not have easy access to harmful material online, and women and girls should no longer be placed at risk of abuse, violence and harm resulting from the ever-increasing deluge of online pornography. It is shocking that legislation exists that could alleviate some of those risks, yet it has not been brought into force.

Barnardo’s believes that many children in the UK are developing a view of sex and relationships that is void of context, especially in relation to consent and violence, which will stay with them well into adulthood. Research shows that pornography has wide impacts on the development of children and young people, including poor mental health, low self-esteem, sexual aggression, violence, child-on-child sexual abuse and the shaping of future sexual behaviour. Another generation of young people will face these same issues unless we act now.

It is not just children and young people who will be protected by Part 3 of the DEA; it also seeks to protect women and girls from the harmful influence of violent and extreme pornography on men and boys. All the research highlights that male sexual violence against women and the distortion of sexual relationships have their roots in extreme and violent pornography. This is why, on 7 May last year, I and a group of other parliamentarians, women’s organisations, head teachers and children’s charities wrote to the Prime Minister expressing our concerns regarding this issue. This letter was written because of the tragic death of Sarah Everard, who was murdered at the hands of a man addicted to extreme pornography. Her death prompted women from all over the UK to share their stories. Listening to these stories, it is clear that pornography plays a significant role in increasing violence by men against women.

I raised these issues almost a year ago, in debates in this House on the Domestic Abuse Bill. We know that Part 3 of the DEA will not end sexual violence or protect all children from harm, but it is the very minimum we can expect and trust our Government to do. At the beginning of last year, I wrote to the Government to ask why they have failed to implement Part 3 of the DEA and to encourage them to implement it as an interim measure while the online safety Bill is considered by Parliament. I was informed that implementing Part 3 as an interim measure would not be possible because of the time it would take to designate a regulator under Section 17 of the DEA; that it would take two years for a regulator to be designated and for the relevant consultation to conclude. I was given no information as to why this would take two years—that was two years ago. The BBFC, which was designated and then de-designated under the Act, could have been reappointed as the regulator under Section 17.

In March 2021, had the Government redesignated the BBFC as regulator, interim protections for children and women could have been in place within 40 days. Protection would be in place pending the online safety Bill. Had the Government acted at that time, commercial pornography sites could have been regulated and age verification put in place by the summer of last year. It is still open to the Government to follow this course and I so wish that they would.

Here is the reality: the Bill we are debating today is not actually needed. All that is required is that the Government do what they are obligated to do by law and implement Part 3 of the Digital Economy Act. If the Government refuse to do this, it will take time for Ofcom to be designated as the regulator once the online safety Bill is eventually passed. But the Government do not need to wait until the online safety Bill is in place: they could designate Ofcom now.

Last March, the Government wrote to me stating that it would take around two years for Ofcom to be designated as a regulator. If they had acted last year and designated Ofcom under Section 17 of the DEA, it could have laid the regulations and guidance before Parliament this autumn. By the end of the year, Ofcom would have begun its work as regulator, pending passage of the online safety Bill through Parliament. If, as the Government claim, it will take two years for Ofcom to be designated as a regulator under the online safety Bill, it could be 2025 or later before age verification and curbs on extreme pornography are in place. This is totally unacceptable.

While it is preferable for the Government to implement Part 3 of the DEA immediately, the suggestion of the noble Lord, Lord Morrow, that Ofcom be designated now under Section 17 of the DEA and that it commence work to prepare to be the regulator is reasonable. It would be shameful if the Government further delayed action on age verification and protecting women and girls from the harm of violent pornography by failing to act now. Children and women have waited far too long for these protections. The Government should act now to alleviate any more harm and suffering.

A mother wrote to me telling me that her four-year-old daughter was sexually abused by a 10-year-old boy, who told her, “I am going to rape you and you are going to like it”. Now when the daughter hears the word “rape” on the news, she asks her mother, “Did she like it, mummy?” It makes me weep to tell this story, because childhood lasts a lifetime. This is why I support the Bill of the noble Lord, Lord Morrow. It is a moral issue.

10:42
The Earl of Erroll (CB)

My Lords, I almost wonder what I can add to what the two previous speakers have said. I entirely agree with the noble Lord, Lord Morrow, in bringing this Bill; it is long overdue. I just do not know what went wrong inside the Civil Service and the Executive for Part 3 of the Digital Economy Act not to have been implemented. I entirely agree with the analysis of the noble Baroness, Lady Benjamin, that this could have been done very much more quickly. I do not understand what went wrong.

I will make two principal points. First, the online safety Bill does not even begin to cover adequately the point covered by Part 3 of the Digital Economy Act. There are some suggestions from the Select Committee as to how it might be done through the age-appropriate design code, but I do not think that has the teeth of Part 3—its financial sanctions are the only thing that will really bite and make people comply properly. I will come back to that in a moment.

The second point is the constitutional issue, which I find quite intriguing. Parliament is supposedly sovereign; it legislates and sets the rules, and the executive departments carry them out. How, constitutionally, is it correct for the Executive to overrule Parliament? I am quite worried about the balance of power between the two arms of government. The delaying tactics used by DCMS were exasperating. About a year earlier, it had also made a mess of informing the EU that the regulations were coming into force. It knew it had to do that, so was that deliberate? I do not really need an answer—we can guess.

Then we have the BBFC being appointed the regulator, as the noble Baroness, Lady Benjamin, just said; it went off at a tangent and produced regulations about data protection and GDPR, which was not its job but the job of the Information Commissioner’s Office. Its job was age verification and making sure that people were not accessing the sites. We spent a huge amount of time on Part 3 because we recognised its purpose and importance. On the whole, we did a very good job—we realised that effective sanctions had to be in there or it would not work.

What really worries me is not just extreme pornography, which has quite rightly been mentioned, but the stuff you can access for free—what you might call the “teaser” stuff to get you into the sites. It normalises a couple of sexual behaviours which are not how to go about wooing a woman. Most of the stuff you see up front is about men almost attacking women. It normalises—to be absolutely precise about this, because I think people pussyfoot around it—anal sex and blowjobs. I am afraid I do not think that is how you go about starting a relationship. Starting children off, at the age of 10 or 11—goodness knows when they start watching this stuff—thinking that this is how you should treat a girl when you first start going out with her, probably in your early teens, is not a good idea. We could have stopped it. For some reason, the Executive decided not to. I would love to know who kept blocking it, because there are funny people in there—it really worries me.

We had support for it from the major porn providers. I wondered why, and discovered that quite a few of the directors had teenage children. They did not like teenagers looking at this stuff either. They would have gone along with it as long as it was done universally, to stop the smaller ones grabbing their market. It could have been done, and they were going to help police it. We even produced BSI PAS 1296 on how to do anonymous age verification, to make sure that people who wanted to browse a website—for instance, a parliamentarian—could do so anonymously but would be age-verified in doing it. I chaired the working group on it, which is why I know quite a lot about that bit. It worked, and quite a few age-verification and attribute providers worked on how to do it. There are various examples out there; we even had demonstrations for whoever wanted to go and see it, including members of the Executive, the Home Office, DCMS, the BBFC, parliamentarians and everyone. They worked.

The real problem is that the online safety Bill, which we were promised would deal with this, does not. It deals with the intermediaries; the bit that Part 3 of the DEA could not cover because, I suspect, it was lobbied not to by Facebook, Google and all that lot. Quite rightly, it attacks that part of it. However, by repealing Part 3, the Government are dropping the ultimate sanctions you need to sort out the sites you go to in order to see this stuff. That is where you need the protections. It is all very well people being checked half-way down a Google search or whatever, but by the time they have got to where the search goes, that age check will not be there. You must have it at the front end of the porn sites themselves. That is what we had in the DEA; I have no idea why the Government do not want it. Again, I wonder what the motives are of those inside government doing this.

What else can I say? I do not understand why they will not just implement it. If they need to pass this Bill to do it, please do so. If they just want to get on with implementing Part 3 of the Digital Economy Act, that is probably quicker, as we would not have to go through all the stages of this Bill in this and the other House. My plea to the Government is this: get on with it.

10:48
Lord McColl of Dulwich (Con)

My Lords, I also thank the noble Lord, Lord Morrow, for picking up the baton from Baroness Howe and continuing to bring this important subject before your Lordships’ House.

I want to quote what the Government said in response to the Ofsted Review of Sexual Abuse in Schools and Colleges which was published last June, as has been mentioned. A Statement was made in the other place on 10 June and in this House on 17 June which said:

“There is another thing that is not okay: the ease of access to and increasing violence of online pornography. This increasingly accessible online content, which often portrays extremely violent sex, can give young people warped views of sex and deeply disturbing views on consent.”—[Official Report, 17/6/21; col. 2071.]


That is why we are debating this Bill today. The Government appear to have been taken by surprise by the findings of the report last year. But it is not a surprise. The Conservative manifesto of 2015 acknowledged the impact of pornography on young people and promised action to stop children accessing this material.

In 2016, the Government introduced proposals for age verification by stating their concerns in terms very similar to the words used last year. The consultation document said:

“Pornography has never been more easily accessible online, and material that would previously have been considered extreme has become a part of mainstream online pornography. When young people access this material, it risks normalising behaviour that might be harmful to their future emotional and psychological development.”


The consultation document was issued in February 2016. Here I stand, nearly six years later, dismayed that the promised action has not materialised.

I do not doubt the Government’s good intentions with respect to the online safety Bill. I met the Minister with the noble Baroness, Lady Benjamin, last year to discuss the new proposals. But I said at the time—on Report on the domestic violence Bill—that I remained just as baffled after that meeting as to why the Government choose not to implement Part 3. I am not suggesting that Part 3 is a complete answer to all the issues around online pornography, but it is what we have available to us now. Doing nothing in the interim—before the online safety Bill comes into effect—is leaving our children and youth without protection from material that the Government acknowledge can lead to real harm.

It is not only children who are impacted by pornography. In our debates on the domestic violence Bill last year, it was clear that—as I said in Committee—sexual violence is an important part of domestic violence. During the debates, we heard about the links between pornography and acts of rough sex. The subsequent call for evidence to inform the tackling violence against women and girls strategy

“showed a widespread consensus about the harmful role that violent pornography can play in violence against women and girls”.

On 17 November 2021, when asked about the regulation of pornography, the Prime Minister said that

“people are coarsened and degraded by this stuff”.

I could not agree more.

I am left wondering why the Government do not use the legislation that is on the statute book to act now, as has been mentioned before. I look forward to hearing what the Minister has to say, but I hope his remarks will be more than the stock statements about how the Government are working on the online safety Bill. This was the position in March 2021, and we have not yet started debating the Bill in Parliament.

I started my speech by referring to the Government’s response to the Ofsted report. The response included the announcement that

“the Secretary of State for Education and the Secretary of State for Digital, Culture, Media and Sport have asked the Children’s Commissioner to start looking immediately at how we can reduce children and young people’s access to pornography and other harmful content. That work will identify whether there are actions that can be taken more quickly to protect children before the online safety Bill comes into effect.”—[Official Report, 17/6/21; col. 2071.]

I hope the Minister will set out clearly today the Government’s short-term plans to achieve these objectives, and how they will also ensure that the plans include actions to protect women from the consequences of violent pornography. I urge the Minister—as I have done before—to include the implementation of Part 3 as part of the interim measures.

10:54
Baroness Kidron (CB)

My Lords, I too thank the noble Lord, Lord Morrow, for securing the debate and add my frustration and exasperation to the voices of noble Lords who have already spoken. I declare my interests, particularly that of chair of the 5Rights Foundation and as a member of the Joint Committee on the Draft Online Safety Bill.

Here we are again, discussing our collective failure to protect children from violent misogynistic pornography and the negative impact it has on body image, self-esteem, sex and relationships. That failure can be measured, in part, by the need of schoolchildren to set up their own website, Everyone’s Invited, so that they could bear witness to their tales of epidemic sexual abuse in our schools—about which we have done nothing. It can also be measured by the statistic released this week by the IWF that last year 27,000 seven to 10 year-olds in the UK posted self-generated sexual abuse images—a threefold increase on the year before. In a chilling conversation with colleagues a couple of days ago, the IWF explained how it hears in the voices of these young children language taken directly from the pornography they are mirroring. It is unacceptable to hear the despair of schoolchildren and fail to act. It is tragic to imagine even one child—of any age—mimicking porn for the pleasure of a paedophile, let alone 27,000 seven to 10 year-olds.

We will undoubtedly hear from the Minister that Part 3 of the DEA will be superseded and addressed through the forthcoming online safety Bill. However, that does not account for the failure of Government to implement legislation already fought for in this House, and already in law for five years. Nor does the fact that this measure is absent from the draft online safety Bill, or the recent refusal to accept my Private Member’s Bill on age assurance, give me any confidence that this is a priority for the Government.

Frictionless access to online pornography is not equivalent to hazy memories of men who once read a soft-porn mag behind the cricket shed. It is a multi-billion industry delivering eye-watering violence towards women and girls, run by a tech sector proven to be driven by profit and with a wilful disregard for children’s safety and well-being. It is worth noting that 60% of pornography accessed by children aged 11 to 13 is not actually searched for—it is unintentional, often delivered algorithmically as content they might like. Waiting for the online safety Bill is no longer an option, and access to pornography is not the only issue.

Ofcom’s own research shows that 42% of five to 12 year-olds in the UK use social media services—most of which have a minimum user age of 13. Ofcom’s chief executive, Dame Melanie Dawes, in her evidence to the Joint Committee, said that any code of practice under the Bill would take a minimum of 18 months to produce. This does not take into account the year of transition by which it becomes law. This is simply not good enough for an issue of this urgency. A child of 11 getting their first smartphone today will be 14 or 15 before they benefit from the online safety Bill, and a child who was 11 when Part 3 was agreed will be an adult. Nor is it acceptable, as I have been told by Ministers and officials, that what is currently proposed by DCMS is a voluntary standard of age assurance rather than a statutory code of practice. Voluntary standards require volunteers, and we have seen repeatedly that the sector will not act unless mandated to do so. Age assurance, which is any system of estimating or establishing age, must be subject to rules of the road so that we know that, whatever the technical approach—and believe me, there are many—both third-party providers that offer age checks and services that operate their own age-assurance systems are doing so to a set of agreed principles appropriate to the risk.

On 19 November last year, we had the Second Reading of the Age Assurance (Minimum Standards) Bill, which would have given Ofcom the power to create a mandatory code of age assurance. On that occasion, I set out the arguments for a proportionate, flexible, secure, accurate, privacy-preserving regime for age assurance—one that would finally deal with the issue of underage access to pornography but also support the age-appropriate design code, with its landmark safety and privacy advances—as well as the further safeguards that we anticipate will be brought forward in the online safety Bill. I will not repeat in full what I said on that occasion, but rather refer the noble Lord to that debate and urge him to understand that this is not a question of technology but one of governance.

If government sets the principles of privacy, security, purpose limitation and fairness, we know that the technology is there. We have a vibrant safety tech sector in the UK, and it too has asked that the Government create mandatory standards so that it can be seen to meet them. This is not a zero-sum game. The sector is already checking age, but very badly. It is already taking excessive data from both children and adults, with little oversight over how it is used and with whom it is shared. We must now set a higher bar.

Royal Assent has been granted for age assurance. The Government have promised age assurance. Parents are desperate for age assurance and children will never be safe without it. Waiting for the online safety Bill is to condemn yet another generation of young people to a digital world that fails to protect them. That means it is government that now bears the responsibility for the failure to act.

In the name of the thousands of seven to 10 year-olds who will copycat porn for predators in the meantime and the young people who repeatedly ask that their digital lives be safer, kinder and more equitable, I ask the Minister himself to act and, when he stands to speak, to make a commitment that he personally will put this case, in full, to the Secretary of State and ask that she give government support to the Private Member’s Bill that is sitting at the ready and, in doing so, swiftly fulfil the ambition of Part 3 of the DEA.

11:02
Lord Browne of Belmont (DUP)

My Lords, I add my thanks to my noble friend Lord Morrow for bringing this Bill before the House. It is disappointing that due to government inaction, my noble friend has been forced to move this Bill today. This is an issue that everyone across the House believed to be settled in 2017, when Part 3 of the Digital Economy Act was passed.

The Government tell us that this matter will finally be dealt with through the online safety Bill, but that Bill will not become law for at least 18 months, and provisions relating to age verification and pornography will not be in force until potentially a further two years after that. Indeed, it could be another four years before provisions agreed in 2017 come into force.

Even if the online safety Bill is enacted, there is no guarantee that the provisions or even the spirit of Part 3 of the Digital Economy Act will be implemented. The draft online safety Bill does not extend to all commercial pornography websites and, unlike in Part 3 of the Digital Economy Act, pornography is not listed as an online harm. There is no guarantee that the online safety Bill will come anywhere close to providing the protection afforded under Part 3 of the DEA. That is why the Bill before the House today in the name of my noble friend Lord Morrow is critical.

It is clear that this protection is needed now. In 2016, prior to the introduction of the then Digital Economy Bill, the Government said:

“Pornography has never been more easily accessible online, and material that would previously have been considered extreme has become part of mainstream online pornography. When young people access this material it risks normalising behaviour that might be harmful to their future emotional and psychological development.”


That is why the DEA was enacted, and those risks have not receded. Young people are still at risk from online harm.

There is a substantial body of evidence which suggests that exposure to pornography is harmful to children and young people. Many have spoken in this debate already about the harm carried into adult life, which has a damaging impact on young people’s view of sex and relationships. For many young men, an addiction to pornography that starts in the teenage years can often lead to women being dehumanised and treated as objects.

Evidence published by the Government in January 2021 which reported the experiences of front-line professionals working with clients who had either exhibited harmful sexual behaviours towards women or were at risk of doing so said that for young people, pornography is seen as

“providing a template for what sex and sexual relationships should look like”.

One worker is quoted as saying:

“‘Porn comes up in probably eighty or ninety percent of my cases … what they’ve done is influenced by what they’ve seen … For them, the internet is fact.’”


Pornography is becoming a young person’s main reference point for sex, and there is no conversation about issues such as consent. That is why Part 3 of the DEA was enacted.

Part 3 of the DEA was not just about protecting children. It also reflected the concerns of Parliament that women and girls should be protected from violence. Too often, pornography is a contributing factor in violence against women. In 2018, the Women and Equalities Committee reported on pornography’s impact on women and girls in public places and concluded that:

“There is significant research suggesting that there is a relationship between the consumption of pornography and sexist attitudes and sexually aggressive behaviours, including violence.”


The Government’s 2020 literature review into the use of legal pornography and its influence on harmful behaviours and attitudes towards women and girls reports that

“there is substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women.”

While the report recognises that pornography is one among several potential factors associated with these attitudes and behaviours,

“it is clear that a relationship does exist and this is especially true for the use of violent pornography.”

The Government’s 2021 Tackling Violence Against Women and Girls Strategy reported that most respondents to the representative survey agreed that an increase in violent pornography has led to more people being asked to agree to violent sex and to a rise in sexual assaults.

It is clear that Part 3 of the DEA is needed today as much as it was in 2017. Children, young people, women and girls should not have to wait until the online safety Bill becomes law before they are protected from online harm. Indeed, there is no guarantee that the online safety Bill will be as far-reaching as Part 3 of the DEA. It is time for the Government to meet their obligations and to bring these provisions into force. I support the Bill.

11:08
Lord Alton of Liverpool (CB)

My Lords, the whole House should be grateful to the noble Lord, Lord Morrow, for tabling a Bill that should not be needed but patently is. As my noble friend Lord Erroll said earlier, Parliament has been clear: children should be protected from viewing harmful content online through robust age verification, and women and girls should be protected from the effects of violent pornography being viewed on the internet and the associated risks to their safety that it brings.

Part 3 of the Digital Economy Act 2017 was passed to ensure that what was illegal offline was illegal online, and that violent and extreme pornography could be blocked. But more than that, the legislation sought to ensure that the protections for children offline were robustly put in place for the online sphere as well.

In an analogue world, before the internet, the shopkeeper stood between children and the top shelf or the purchase of adult video content. Part 3 of the Digital Economy Act sought to put in place a mechanism of age verification to ensure that protection was extended to the digital world. In 2019, it was extremely disappointing for those of us across this House who had argued for Part 3 of the Digital Economy Act that the Government decided to abandon implementing that legislation, stating that new measures would be brought forward.

I remind noble Lords of some of the statements made by the noble Baroness, Lady Barran, who was the Minister when the government U-turn was announced. She said that

“children are exposed to harmful pornography every day”,

and:

“Shocking things are going on”—[Official Report, 17/10/19; col. 170.]


Yet here we are without any new protections for children or women, despite her assurance that the Government would demonstrate “urgency” in tackling the issue.

This emphatically is not about consenting acts between adults; it is about protecting children. Protecting children from the harm of films is something that has occupied some of my political life. While in the other place, serving a constituency in Liverpool—I am particularly pleased to see the right reverend Prelate the Bishop of Liverpool in his place today—along with the rest of the nation I was greatly affected by the death of James Bulger. It is hard to believe that it will be 29 years next month since that tragedy occurred. In response, I tabled an amendment to the then Criminal Justice Bill. It set out to make it an offence to show gratuitously violent videos to children. That amendment was not supported by the Government, but had cross-party support, with some 80 Conservative Members supporting it. With that support, including the then shadow Home Secretary, Tony Blair, the amendment made it into law as Section 4A of the Video Recordings Act 1984.

This amendment was based on evidence from leading child psychologists at the time. Their research focused on the impact of harmful videos on the development of young people. In the Bulger case, it was not pornographic content but a horror movie that impacted negatively on the children involved in his killing, but the issue remains the same. Children’s brains are not developed to be able to cope with the harm that such content poses to them. That is why they need to be protected by legislation.

Section 4A of the Video Recordings Act has become known as the “harms test”. The test allows for a video not to be classified by the BBFC on the basis of the harm it may cause. If the video is likely to be viewed by children or would be of interest to children, the BBFC can take that into account when making its determination about what, if any, classification a video should receive. The harms test is not confined to children but applies to adults as well.

In 1994, when I proposed my amendment, the internet was relatively new, and no one could have foreseen the explosion in harmful content that is unregulated and available at the click of a button today. As my noble friend Lady Kidron told the House earlier in an excellent speech, this is not about people behind the bicycle sheds reading soft porn magazines. During lockdown, Ofcom reported that more people accessed Pornhub than watched BBC or Sky News, and it is freely available, readily accessible and open to children and young people. The British Board of Film Classification reported in 2020 that it was the most popular website for children viewing pornography.

It is a sad reflection that the world may have moved on, but much has stayed the same. In 1994, I tabled my amendment because psychologists stated that the development of children and young people is harmed by content that they access online. It matters because, as Ofsted reported in June 2021, it deeply affects children’s attitudes. As the noble Baroness, Lady Benjamin, told us in her excellent speech, it affects our relationships.

As with the James Bulger case, it is often a tragedy that focuses minds on the problem. Last year, as the noble Lord, Lord Morrow, reminded us when he opened our debate on his Bill, there was the tragic case of Sarah Everard. He highlighted the link between violent pornography and violence against women such as Sarah. Just before her tragic murder, in January 2021 the Government published research on the use of legal pornography and its influence on harmful behaviours and attitudes towards women and girls. It said that

“there is substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women … it is clear that a relationship does exist and this is especially true for the use of violent pornography.”

Sarah Everard’s killer, as we know, was addicted to such material, and that begs the question: how many more tragedies have to take place before we take action?

When the Minister replies, no doubt he will point to the online safety Bill and suggest that it will take care of all the concerns that have been raised in the House today. Noble Lords cannot wait: how many more children have to be harmed; how many more acts of violence against women and girls will there be? The Bill of the noble Lord, Lord Morrow, means that we do not have to wait two or three years before we act. I hope the Minister will tell us exactly what the Government’s expectation is for the timeframe for that Act of Parliament. Legislation stands on the statute book, and that legislation, Part 3 of the Digital Economy Act, should be brought into force as this House, as Parliament, intended. At the very least, this should be undertaken as an interim measure until the online safety Bill is passed. It is for all those reasons that I am very happy to support the Bill and thank the noble Lord, Lord Morrow, for introducing it and laying it before us.

11:16
Baroness Brinton (LD) [V]

My Lords, I, too, thank the noble Lord, Lord Morrow, for laying this Private Member’s Bill, so eloquently introduced by him and supported by all noble Lords who have spoken so far.

As others have said, we should not need to have a Bill to start Part 3 of the Digital Economy Act 2017, which restricts use of porn sites to the over-18s, who must use age verification. The noble Lord, Lord Alton, made the critical point that Parliament approved that legislation five years ago. As we have heard, the law was originally due to come into force in April 2018 but, after repeated delays, the Government announced in 2019 that they would not go ahead with it. The Government are hiding behind the proposed online safety Bill, saying that they will make sure that protection of children from online pornography will be covered. But as both the noble Lords, Lord Morrow and Lord Browne of Belmont, said, it looks as if the online safety Bill is very much less clear and effective than Part 3 of the DEA. Further, it now looks as if it will be at least five and possibly up to 10 years before that proposed legislation comes into practical force, and there is no guarantee that a Minister might not block parts of the online safety Bill, just as happened with the DEA.

My noble friend Lady Benjamin and others made important and critical points about the safety of women. Many of us speaking today have expressed those concerns during the passage of the Domestic Abuse Bill and the Police, Crime, Sentencing and Courts Bill. As the noble Baroness, Lady Kidron, said, the core of this Private Member’s Bill is the protection of children now, and the key to making Part 3 work is a reliable age-verification process. It was moving to hear her say that the Internet Watch Foundation can attest to the need for it in the conversations it is having with children who have been groomed and abused.

I am glad that the noble Lord, Lord Alton, also referred to film classification and the prevention of harm to children. In its role as the age-verification regulator, the British Board of Film Classification reports that a significant and growing evidence base supports the case for preventing children’s access to online pornography. Recently, the BBFC carried out research into children’s exposure to online pornography. The findings include that more than half of 11 to 13 year-olds have seen porn, some being as young as 7 or 8. The majority of young people’s first time watching pornography is accidental: 62% of 11 to 13 year-olds who have seen porn stumbled across it unintentionally. Children see violent content that they find upsetting and disturbing, including content that they feel normalises rape. This comes back to the comments made by noble Lords about the murder of Sarah Everard.

The BBFC would refuse to classify any of this type of content on DVDs. Children do see porn on social media, but most of their viewing is done on dedicated sites. We know that there is now significant public support for the introduction of age verification, with 83% of parents and 56% of 11 to 13 year-olds in favour—so the young people want protection too. The BBFC would have been responsible for making sure that porn sites comply and would have been able to fine or block sites that did not, but it would be up to each site to decide its own system for age verification. It is good to hear that the BBFC has agreed a memorandum of understanding with the Information Commissioner’s Office and the National Cyber Security Centre at GCHQ to ensure that strict privacy rules are maintained.

However, this is all voluntary at the moment—for providers, that is. The BBFC’s voluntary age-certification scheme expects all providers to undergo an audit. One was carried out recently by the NCC Group, which ensured that there was no handover of any personal data. This is absolutely key, because it is vital that age verification is run separately and independently from online providers. Helpfully, over the passage of the past five years, despite the delay in the implementation of Part 3, age verification has improved. The noble Earl, Lord Erroll, talked about anonymous age verification but, for me, it is the independence of the age-verification process from online pornography providers that protects the process from unscrupulous providers and thereby also protects the individual, particularly children.

This would counter previous concerns that too many children could circumvent age verification via a VPN or proxy. It would also reduce the risk of child sexual exploitation and abuse by predators who groom children by acting as gatekeepers to pornography. So can the Minister say whether he believes that, if made into law, the proposals from the BBFC, ICO and NCC Group would also protect the privacy of individuals? I ask that because one of the other concerns is about the private act of watching legal and consensual pornography that can at the moment be passed on and sold by one of the parties—known as revenge porn. Would this be covered under these age-verification rules?

We also need to ensure that there is no undue targeting of sexual minorities, especially people whose sexual preferences are secret; this includes many younger members of the LGBT community, including young adults. The potential for the hacking of personal data has been very harmful to many people in the past, so we need proposals to make sure that they are protected.

There is one thing that I have not heard anybody else mention in the debate so far. This matter is not just about legislation; it is also about education. Schools need to teach children about the dangers, and how to use the internet and social media safely and responsibly. Parents, whose own education in online data usage is often way behind that of their children, must be empowered to protect their children online, including through digital literacy education and advice and support for parents on best practice.

There is far too much illegal activity online, including child porn, extreme porn and revenge porn. Our existing laws must be properly enforced, which requires more officers, resources and training for police and prosecutors. Keeping children safe online is more difficult than ever, but it is also more important than ever. The Government, social media companies and online providers must do more to protect children from harmful content.

I agree with the noble Lord, Lord McColl. We on these Benches hope and expect that the Minister will give us a frank response, not the stock response, on why Part 3 has not yet been commenced. However, the one message from all the speakers so far is that time is of the essence and regulators need teeth. Voluntary arrangements do not always work, especially because two things have happened since the passage of the DEA: evidence shows that high numbers of much younger children are encountering online pornography; and age-assurance and verification processes have improved significantly and can now be managed independently of online pornography providers. We cannot wait.

11:24
Baroness Merron (Lab)

My Lords, I thank the noble Lord, Lord Morrow, for bringing forward this Bill and for laying out the arguments and rationale for it so clearly, as well as the background.

I agree that it is quite remarkable that this Bill is even needed. However, it is needed because we find a continuing situation in which children have unfettered access to damaging pornography, which affects how they see, experience and behave in the world. Noble Lords have made important contributions and put challenges to the Minister, who, I fear, will be put in the position, in his response, of defending the indefensible: the fact that we still await protections. I hope that he will be able to make some positive comments in response to the debate that has taken place not just today but on many previous occasions.

As noble Lords have said, this situation is entirely baffling because it seems that, although legislation is in place and much of the preparatory work has been done, the department for some reason will not make a move swiftly to introduce age verification for online pornography. As the noble Earl, Lord Erroll, and the noble Lord, Lord Alton, said, there has been a catalogue of errors for some reason and a failure to implement legislation that has been agreed. I hope that the Minister will give us some explanation as to why that is the case.

As we discuss this today, I feel that there is an even greater imperative because of the circumstances in which we are having this debate. Since the Government announced in October 2019 that Part 3 of the DEA would not be commenced, we have lived through several lockdowns during which internet usage has soared. Yes, the internet is a great source of education and positive entertainment, but, as we have heard so clearly today, it can also have a very negative impact if the proper online protections are not there. Without them, many children and young people find it all too easy to come across pornography that they have not even sought out. This is disturbing and harmful, with potential long-term and unhealthy consequences for their future adult relationships and their current relationships as children and young people.

It is interesting that, when Ofsted did a rapid review of sexual abuse and harassment in schools in June 2021, it reported that

“easy access to pornography had … set unhealthy expectations of sexual relationships and shaped children and young people’s perceptions of women and girls.”

In response to Ofsted’s findings, the Government asked the Children’s Commissioner to investigate actions that could be taken in the short term. I am interested in this point because this Bill gives the perfect opportunity for some action to be taken in the short term. As the letter sent by the noble Baroness, Lady Benjamin, and 60 others—including MPs, Peers, head teachers and non-governmental organisations—said, interim action can and should include the implementation of Part 3 of the DEA.

However, there is a further reason why there is a greater imperative at present. As we have heard several times in this debate, there is great concern about the obvious impact of pornography on violent acts committed against women, which have hit the headlines since the death of Sarah Everard. That will be on our minds for ever, and I hope that the Government will see that the need for action is ever more pressing.

This is of course an ongoing cross-party initiative, and today we have a Bill seeking to implement a settlement reached by representatives of all parties and none. I thank the noble Baroness, Lady Kidron—it is right to do so regularly—and others, including the noble Baroness, Lady Benjamin, who deserve credit for their relentless focus on this subject. As the noble Baroness, Lady Kidron, reminded us, we are dealing with a multi-billion-pound industry with a wilful disregard for the impact that it has on children. As she said, voluntary standards require volunteers, and, regrettably, voluntary co-operation is not forthcoming as it should be. I pay tribute to my noble friend Lord Stevenson of Balmacara. While he is no longer on the Front Bench, he played a significant role during the passage of the Digital Economy Act and has since lent his expertise to the Joint Committee on the Draft Online Safety Bill.

Given that we agree wholeheartedly with what the noble Lord, Lord Morrow, seeks to achieve with this Bill, and knowing the Minister’s likely argument as we do, I will make two points to draw out the main issues. First, while the Government may say that this matter can be better dealt with in the online safety Bill, the fact is that the worst offenders and the most harmful material will not, as things stand, fall within the scope of that legislation. Secondly, even if we were satisfied that the online safety Bill will address these concerns, enactment will take several years, so there is no reason why Part 3 of the DEA cannot be enacted as an interim measure. Children’s lives will not be put on hold while we delay putting legislative provision in place. The Government have repeatedly promised to keep young people safe; that is a solemn promise, as I know the Minister understands. However, if this Bill is opposed today by the Government, I regret to say that that solemn promise will have been broken.

11:32
Lord Sharpe of Epsom (Con)

My Lords, I apologise for my slightly tardy arrival this morning.

I join others in thanking the noble Lord, Lord Morrow, for introducing this Bill, and thank all noble Lords who have taken part in this very powerful debate. I acknowledge the valuable work done by the Joint Committee scrutinising the draft online safety Bill, and in particular the noble Baroness, Lady Kidron. It is clear how much time and careful thought the Joint Committee has put into its scrutiny, and I hope I can give some positive comments in answer to the noble Baroness.

The Government share the concerns raised in both Houses, by parents and by those advocating on behalf of children’s safety online that a large amount of pornography is available on the internet with little or no protection to ensure that those accessing it are old enough to do so. While preventing children accessing online pornography is a key priority for the Government, I am afraid that the Government do not support this Private Member’s Bill, on the following grounds.

First, this is an unusual use of a Private Member’s Bill from a procedural perspective. The Bill introduces a new, stand-alone duty to commence regulations through pre-existing primary legislation. On its ordinary reading, this new duty would supersede the existing discretionary power that the Secretary of State has in that primary legislation to introduce commencement regulations. The Bill does not, however, make any amendment to that discretionary power, nor does it make any attempt to update the previous legislation to take account of the new statutory obligation that would have a significant effect on it.

Secondly, the Government have already taken the decision, announced in October 2019, that they would not be commencing Part 3 of the Digital Economy Act 2017. We will instead repeal these provisions and deliver the objective of protecting children from online pornography through the forthcoming online safety Bill.

The proposed measures in the online safety Bill will mean that platforms will have clear legal responsibilities for keeping their users safe online. Services which are likely to be accessed by children will be required to protect them from harmful content on their sites, including pornography. Priority categories of harmful material to children will be set out in secondary legislation, so that all companies and users are clear on what companies need to protect children from.

The online safety Bill will deliver more comprehensive protections for children online than the Digital Economy Act. The draft Bill goes further than the Digital Economy Act, protecting children from a broader range of harmful content on a wider range of services. The Digital Economy Act was criticised for not covering social media companies, where a considerable quantity of pornographic material is accessible, and which research suggests children use to access pornography. The online safety framework will cover many of the most visited pornography sites, social media, video-sharing platforms, forums and search engines, thereby capturing sites through which a large proportion of children access pornography. We expect Ofcom to take a robust approach to sites that pose the highest risk of harm to children, including sites hosting online pornography.

A number of noble Lords, including the noble Lord, Lord Morrow, have addressed the issue of violence against women and girls. Of course violence against women is abhorrent. The Government Equalities Office commissioned research into the relationship between pornography use and harmful sexual behaviours, to better understand whether there are connections, as referenced by the noble Lord, Lord Alton. The noble Earl, Lord Erroll, also made powerful points about this. The online safety Bill will impose legal duties on companies to address damaging content online. This will include removing illegal and extreme pornography, as well as applying the Bill’s duties to legal pornography on major platforms. It will also mean that platforms in scope will need to protect children from accessing the most harmful material, such as pornography.

A number of noble Lords also referenced extreme pornography. The duty set out in the Bill for illegal content will apply to instances of extreme pornography. For all content that amounts to a relevant offence, platforms will be required to ensure that they have the systems and processes in place to quickly take down such content once it has been reported. Under the Bill, a limited number of criminal offences that pose the greatest risk of harm online will be listed in legislation as priority offences. For priority offences, platforms will be required to implement systems and processes to minimise the uploading and sharing of such content. This new approach will be more robust than the Digital Economy Act, as it will capture extreme pornography, as well as other illegal pornography, including non-photographic child sexual abuse content that is not included in the definition of extreme pornography referred to in the Digital Economy Act.

The noble Baroness, Lady Brinton, asked what these new laws will mean for revenge pornography. This is already a crime under Section 33 of the Criminal Justice and Courts Act 2015, and platforms will need to take action to prevent such explicit illegal content circulating or face enforcement action. In addition, Section 69 of the Domestic Abuse Act 2021 widened the offence to include threats to disclose intimate images with the intention of causing distress. These provisions came into effect on 29 June 2021 and are not retrospective; the extension of the offence applies to England and Wales.

Our ambition is to ensure that we are fully equipped to respond to the changing nature of violence against women and girls and, most importantly, to continue to put victims and survivors at the heart of this approach. As the noble Lord, Lord Browne, has highlighted, the Government published a new Tackling Violence Against Women and Girls Strategy last July.

The Government recognise the concerns that have been raised, including from the Joint Committee scrutinising the draft online safety Bill, about protecting children from online pornography on services which do not currently fall within the scope of the online safety Bill, as referenced by the noble Baroness, Lady Merron. The Secretary of State said during her evidence session to the Joint Committee scrutinising the draft Bill that the Government are exploring ways to provide through the Bill wider protections for children to prevent them accessing online pornography, including on sites that are not currently within the draft Bill’s scope.

It is worth quoting at length what the Secretary of State said on 4 November last year:

“I do not believe that the Bill goes far enough in preventing children from accessing commercial pornography. That is tied into age verification and there are elements of that that I have asked officials, subject to parliamentary counsel and write-around, to look at further, to see whether we can do more. I realise that there is a gap. I am not going to call it a loophole. There is a gap, and I think we need to close that gap somehow if we can.”


The noble Baroness, Lady Kidron, talked about algorithms in reference to this subject, and how we will be dealing with them. Companies will fulfil their duties under the proposed new law by assessing the risks of harm to users from their services and putting in place systems and processes to mitigate them. As the noble Baroness said, algorithms play a very important part in how many companies operate their services, and they need to consider how they could cause harm and take steps to mitigate it. The regulator will set out steps that companies can take to fulfil their duties in codes of practice.

The use of age-verification technology is key to this. We will expect companies to use age-verification technologies to prevent children accessing online pornography or to demonstrate to Ofcom that the approach they are taking delivers the same level of protection for children.

It is important that the Bill be future-proofed—I hope this goes some way to answering the questions raised by the noble Earl, Lord Erroll, and the noble Baroness, Lady Brinton—so it will not mandate that companies use specific technologies to comply with their new duties. This is similar to the requirement in the Digital Economy Act, which did not mandate the use of a particular technology. Where age-verification technologies are used, it is important that they are robust, effective and privacy-preserving. This is needed to ensure that children are appropriately protected and that the public have trust in these solutions.

The Government take data security and privacy extremely seriously, which is why both Ofcom and in-scope companies will have duties under the Bill relating to user privacy which will apply to the use of age-assurance technologies. Standards also have an important role to play here by creating consistency and providing transparency for regulators. The Bill has been designed such that Ofcom will be able to set out expectations for the use of age-verification technologies in its codes of practice and accompanying guidance. This includes referencing relevant standards or principles. Companies will need to adopt these standards or demonstrate clearly to Ofcom that they have achieved an equivalent outcome.

So we have given Ofcom, as the regulator, a broader range of enforcement powers than were available under Part 3 of the Digital Economy Act to take action against companies that fail to act. Ofcom can issue fines and require companies to take specific steps to come into compliance or remedy their breach, and it can set deadlines for action to be taken.

Ofcom will have a suite of enforcement powers available to use against companies, which include imposing substantial fines up to the greater of either £18 million or 10% of qualifying annual revenue. Under the Digital Economy Act, it was £250,000 or 5%. Ofcom can also require companies to make improvements and, in the most serious cases, pursue business disruption measures, including blocking. There will also be criminal sanctions for senior managers in tech companies if regulated providers do not take their responsibilities seriously. The new regime will apply to companies that provide services to UK users, wherever they are located. We consider this approach necessary given the global nature of the online world, and the Government expect Ofcom to prioritise enforcement action where children’s safety has been compromised.

Ofcom has been mentioned a lot today, particularly with regard to time and preparation, so I am going to digress briefly to talk a little about what we have been doing with Ofcom in preparation. We have achieved a positive outcome through the challenging spending review, securing continued funding across online safety and allowing for the delivery of the Government’s commitment to make the UK the safest place to be online.

As we know, the online safety Bill represents the largest and highest-profile expansion in Ofcom’s remit since its inception. Ofcom is currently recruiting a significant number of staff to ensure it has the necessary expertise to implement the framework as intended and act as the online harms regulator. To support the DCMS effectively in taking the Bill through Parliament, Ofcom will work to create a strong evidence base to help inform its regulatory strategy and framework. It will also oversee the implementation of key operations and processes within its organisation as it steadily expands in preparation for the regime going live.

Commencing Part 3 of the Digital Economy Act would create a confusing and fragmented regulatory landscape that tackles individual concerns in a piecemeal fashion. It would also subject businesses to two different enforcement regimes, with potentially different regulators.

In answer to the question from the noble Baroness, Lady Benjamin, our analysis indicates that it would take a minimum of just under two years to implement the provisions of Part 3 of the Digital Economy Act, so a commencement date of 20 June 2022, as set out in this Bill, would be impracticable even if desirable. The Government would need to designate a new regulator, that regulator would need to produce and consult on statutory guidance and the Government would then need to lay regulations before Parliament ahead of any new regime coming into force.

I am going to digress once more to talk about the interim measures we have taken. I hope this reassures noble Lords that there is not some sort of void in children’s safety at the moment. We have a comprehensive programme of work planned to ensure we maintain momentum on child online safety, until the legislation is ready. Ahead of the online safety Bill, the video-sharing platform and video-on-demand regimes are already in force, with Ofcom as the regulator. They include requirements for some UK services to protect children from harmful content online, such as pornography. In addition, the Government have published an interim code of practice for providers to tackle online child sexual exploitation and abuse. This code sets out steps that companies can take voluntarily to tackle this type of abuse.

In July 2021, the Government published our Online Media Literacy Strategy. The strategy supports the empowerment of users, including young people, with the skills and knowledge they need to make safe and informed decisions online. In addition, the new relationships, sex and health education curriculum is clear that, by the end of secondary school, pupils should have been taught about the impact that viewing harmful content such as pornography can have. This covers both the way that people see themselves in relation to others and how pornography can negatively affect how they behave towards sexual partners.

As part of the interim measures, in response to the Ofsted review following the Everyone’s Invited website and campaign, we funded the NSPCC to launch a dedicated helpline and we moved to strengthen the delivery of the new relationships, sex and health education curriculum with additional support and briefings for teachers. That subject came up in a Question earlier this week.

On timing, we are committed to introducing the online safety Bill as soon as possible in this parliamentary Session. It is therefore reasonable to assume that the online safety Bill will receive Royal Assent within the time it takes to implement the Digital Economy Act, making any benefits of an interim regime minimal at best. The Joint Committee that scrutinised the draft online safety Bill published its report in December, and we are carefully considering its recommendations.

It is worth reiterating that our intention is to have the regime operational as soon as possible after Royal Assent. In the meantime, as I have just outlined, we are working closely with Ofcom to ensure that the implementation of the framework takes as short a time as possible following passage of the legislation.

I am aware that I have not answered the question from the noble Baroness, Lady Merron, about delays to the Digital Economy Act. I will have to write to her on that; I am afraid I do not know the answer. I have tried to answer all other noble Lords’ questions. I will study Hansard carefully and, if I have failed to answer or missed any, I will write.

I reiterate that, today, we heard some powerful arguments for and accounts of the urgent need to increase protections for children online. We will be able to deliver the strongest possible protections through the online safety Bill, rather than Part 3 of the Digital Economy Act. In answer to the noble Baroness, Lady Brinton, we are not hiding behind this Bill.

Finally, and to answer the noble Baroness, Lady Kidron, on a personal note and as a parent—the father of a daughter and a son—I will reflect the tone and tenor of this debate to the Secretary of State. The noble Baroness made it clear that she is also very committed to this legislation and to enacting it at speed. Also speaking personally, I rather agree with the noble Lord, Lord Alton: a return to an analogue world is quite appealing.

11:49
Lord Morrow (DUP)

My Lords, I will be brief in my few closing remarks. I have listened intently to what the Minister said. I thank him for his comments, but I must be frank and honest: I am disappointed, but perhaps not surprised. I will leave that comment there.

I am grateful to all noble Lords who have taken the time to speak on this issue and to support the Bill. It is heartening that there is still a great degree of unanimity across the House on the way forward on this issue. I also thank the Minister but, although I have listened very carefully, I will reserve judgment until a later date.

The noble Baroness, Lady Benjamin, helpfully reminded us of the timescale and the opportunities that the Government have missed over the last five years. The history of age verification implementation is a catalogue of government delays. The process simply repeats itself: the Government promise that something will be done, great hope is placed in new legislation and in imminent change, but all it amounts to is another delay.

The Government first delayed the implementation of Part 3 of the DEA in 2019, promising new legislation. However, given how long it took for the Government to bring forward the online safety Bill, they could have acted at any time since 2019 to bring in Part 3. That has not happened and, having listened to the Minister, I am still concerned. If they had done so, age verification and protections against extreme pornography could have been in force by now.

At the start of last year, the Government wrote to the noble Baroness, Lady Benjamin, indicating that it would take 22 months to designate Ofcom under the DEA and have it regulate the legislation’s provisions. If they had only commenced Part 3 of the DEA on that date, it would have meant that, even by the Government’s timetable, age verification would have been operational in this country before the end of this year. I fear that the online safety Bill will mean yet more delays. It could be four more years before the protections that Parliament enacted in Part 3 of the DEA come into force.

If I were to do them justice, I would name everyone who has spoken today. What an array of speeches we have heard, made with passion and commitment. I thank all speakers most sincerely. We were reminded by many of them that a generation of children could grow up without benefiting from protection online. That is all the more shocking when legislation already exists that could be brought into force to protect them. The noble Lord, Lord Alton, who has worked tirelessly over many years to help protect children and young people from harm, both offline and online, reminded the House of what can be achieved with cross-party support. It is clear that cross-party support exists for Part 3 of the DEA.

I echo what the noble Lord, Lord McColl, said: Part 3 is not the complete answer to all the issues concerning online pornography—we have accepted that from day one—but it is what we have available now. Clearly, this is a provision that could be brought in now, and built on and improved through the online safety Bill.

As my noble friend Lord Browne reminded us, there is little or no defence for government inaction in this matter. The Government’s own research highlights the harm from online pornography and the issues that arise when children are exposed to it. The Government have the data, and they know the harm that is carried into adult life and the devastation it brings to families and society. That is why it is time for them to act.

Three years on from the Government announcing that they would not progress Part 3 of the DEA, the problems have not gone away. The internet is still an unregulated place for our children. It is surely time for action.

Bill read a second time and committed to a Committee of the Whole House.