Lords Chamber
My Lords, I support the noble Baroness, Lady Benjamin, in bringing the need for consistent regulation of pornographic content to your Lordships’ attention and have added my name in support of Amendment 185. I also support Amendments 123A, 142, 161, 183, 184 and 306 in this group.
There should not be separate regimes for how pornographic content is regulated in this country. I remember discussions about this on Report of the Digital Economy Bill around six years ago. The argument for not making rules for the online world consistent with those for the offline world was that the CPS was no longer enforcing laws on offline use anyway. Then as now, this seems simply to be geared towards letting adults continue to have unrestricted access to an internet awash with pornographic material that depicts and/or promotes child sexual abuse, incest, trafficking, torture, and violent or otherwise harmful sexual acts: adult freedoms trumping all else, including the integrity of the legal process. In the offline world, this material is illegal or prohibited for very good reason.
The reason I am back here, arguing again for parity, is that, since 2017, an even deeper seam of academic research has developed which fatally undermines the case for untrammelled cyber-libertarianism. It has laid bare the far-reaching negative impacts that online pornography has had on individuals and relationships. One obvious area is the sharp rise in mental ill-health, especially among teenagers. Research from CEASE, the Centre to End All Sexual Exploitation, found that over 80% of the public would support new laws to limit free and easy access.
Before they get ensnared—and some patients of the Laurel Centre, a private pornography addiction clinic, watch up to 14 hours of pornography a day—few would have been aware that sexual arousal chained to pornography can make intimate physical sex impossible to achieve. Many experience pornography-induced erectile dysfunction and Psychology Today reports that
“anywhere from 17% to 58% of men who self-identify as heavy/compulsive/addicted users of porn struggle with some form of sexual dysfunction”.
As vice-chair of the APPG on Issues Affecting Men and Boys, I am profoundly concerned that very many men and boys are brutalised by depictions of rape, incest, violence and coercion, which are not niche footage on the dark web but mainstream content freely available on every pornography platform that can be accessed online with just a few clicks.
The harms to their growing sons, which include an inability to relate respectfully to girls, should concern all parents enough to dial down drastically their own appetite for porn. There is enormous peer pressure on teenage boys and young men to consume it, and its addictive nature means that children and young people, with their developing brains, are particularly susceptible. One survey of 14 to 18 year-olds found almost a third of boys who used porn said it had become a habit or addiction and a third had enacted it. Another found that the more boys watched porn and were sexually coercive, the less respect they had for girls.
Today’s headlines exposed the neurotoxins in some vaping products used by underage young people. There are neurotoxins in all the porn that would be caught by subsection 368E(2) of the Communications Act 2003, if it were offline—hence the need for parity. Just like the vapes, children as well as adults will continue to be exposed. Trustworthy age verification will stop children stumbling across it or finding it in searches, but adults who are negligent, or determined to despoil children’s innocence, will facilitate their viewing it if it remains available online. This Bill will not make the UK the safest place in the world for children online if we continue to allow content that should be prohibited, for good reason, to flood into our homes.
Helen Rumbelow, writing in the Times earlier this month, said the public debate—the backdrop to our own discussions in this Bill—is “spectacularly ill-informed” because we only talk about porn’s side-effects and not what is enacted. So here goes. Looking at the most popular pages of the day on Pornhub, she found that 12 out of 32 showed men physically abusing women. One-third of these showed what is known as “facial abuse”, where a woman’s airway is blocked by a penis: a porn version of waterboarding torture. She described how
“in one a woman is immobilised and bound by four straps and a collar tightened around her neck. She ends up looking like a dead body found in the boot of a car. In another a young girl, dressed to look even younger in a pair of bunny ears and pastel socks, is held down by an enormous man pushing his hand on her neck while she is penetrated. The sounds that came from my computer were those you might expect from a battle hospital: cries of pain, suction and “no, no, no”. I won’t tell you the worst video I saw as you may want to stop reading now. I started to have to take breaks to go outside and look at the sky and remember kindness”.
Turning briefly to the other amendments, I thank my noble friend Lord Bethell for his persistence in raising the need for the highest standard of age verification for pornography. I also commend the noble Baroness, Lady Kidron, for her continued commitment to protecting children from harmful online content and for representing so well the parents who have lost children, in the most awful of circumstances, because of online harms. I therefore fully support the package of amendments in this group tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell.
This Bill should be an inflection point in history, and future generations will judge us on the decisions we make now. It is highly likely they will say, “Shame on them”. To argue that we cannot put the genie back in the bottle is defeatist and condemns many of our children and grandchildren to the certainty of a dystopian relational future. I say “certainty” because it is the current reality of so many addicted adults who wish they could turn back the clock. Therefore, it is humane and responsible, not quaint or retrogressive, to insist that this Government act decisively to make online and offline laws consistent and reset the dial.
My Lords, I will speak to my Amendment 232, as well as addressing issues raised more broadly by this group of amendments. I want to indicate support from these Benches for the broader package of amendments spoken to so ably by the noble Baroness, Lady Kidron. I see my noble friend Lord Clement-Jones has returned to check that I am following instructions during my temporary occupation of the Front Bench.
The comments I will make are going to focus on an aspect which I think we have not talked about so much in the debate, which is age assurance in the context of general purpose, user-to-user and search services, so-called Part 3, because we like to use confusing language in this Bill, rather than the dedicated pornography sites about which other noble Lords have spoken so powerfully. We have heard a number of contributions on that, and we have real expertise in this House, not least from my noble friend Lady Benjamin.
In the context of age assurance more generally, I start with a pair of propositions that I hope will be agreed to by all participants in the debate and build on what I thought was a very balanced and highly informative introduction from the noble Baroness, Lady Kidron. The first proposition is that knowledge about the age of users can help all online platforms develop safer services than they could absent that information—a point made by the right reverend Prelate the Bishop of Oxford earlier. The second is that there are always some costs to establishing age, including to the privacy of users and through some of the friction they encounter when they wish to use a service. The task before us is to create mechanisms for establishing age that maximise the safety benefits to users while minimising the privacy and other costs. That is what I see laid out in the amendment that the noble Baroness, Lady Kidron, has put before us.
My proposed new clause seeks to inform the way that we construct that balance by tasking Ofcom with carrying out regular studies into a broad range of approaches to age assurance. This is exactly the type of thinking that is complementary to that in Amendment 142; it is not an alternative but complementary to it. We may end up with varying views on exactly where that balance should be struck. Again, I am talking about general purpose services, many of which seek to prohibit pornography—whether they succeed 100% is a separate question—and a different set of arguments applies to them from those that apply to services which are explicitly dedicated to pornography. We may come to different views about where we eventually strike the balance, but I think we probably have a good, shared understanding of the factors that should be in play. I certainly appreciate the conversations I have had with the noble Baroness, Lady Kidron, and others about that, and think we have a common understanding of what we should be considering.
If we can get this formulation right, age assurance may be one of the most significant measures in the Bill in advancing online safety, but if we get it wrong, I fear we may create a cookie banner scenario, such as the one I warned about at Second Reading. This is my shorthand for a regulatory measure that brings significant costs without delivering its intended benefits. However keen we are to press ahead, we must always keep in mind that we do not want to create legislation that is well-intended but does not have the beneficial effect that we all in this Committee want.
Earlier, the noble Baroness, Lady Harding, talked about the different roles that we play. I think mine is to try to think about what will actually work, and whether the Bill will work as intended, and to try to tease out any grit in it that may get in the way. I want in these remarks to flag what I think are four key considerations that may help us to deliver something that is actually useful and avoid that cookie banner outcome, in the context of these general purpose, Part 3 services.
First, we need to recognise that age assurance is useful for enabling as well as disabling access to content—a point that the noble Baroness, Lady Kidron, rightly made. We rightly focus on blocking access to bad content, but other things are also really important. For example, knowing that a user is very young might mean that the protocol for the reporting system gets to that user’s report within one hour, rather than 24 hours for a regular report. Knowing that a user is young and is being contacted by an older user may trigger what is known as a grooming protocol. Certainly at Facebook we had that: if we understood that an older user was regularly contacting younger users, that enabled us to trigger a review of those accounts to understand whether something problematic was happening—something that the then Child Exploitation and Online Protection unit in the UK encouraged us to implement. A range of different things can then be enabled. The provision of information in terms that a 13 year-old would understand can be triggered if you know the age of that user.
Equally, perfectly legitimate businesses, such as alcohol and online gambling businesses, can use age assurance to make sure that they exclude people who should not be part of that. We in this House are considering measures such as junk food advertising restrictions, which again depend on age being known to ensure that junk food which can be legitimately marketed to older people is not marketed to young people. In a sense, that enables those businesses to be online because, absent the age-gating, they would struggle to meet their regulatory obligations.
Secondly, we need to focus on outcomes, using the risk assessment and transparency measures that the Bill creates for the first time. We should not lose sight of those. User-to-user and search services will have to do risk assessments and share them with Ofcom, and Ofcom now has incredible powers to demand information from them. Rather than asking, “Have you put in an age assurance system?”, we can ask, “Can you tell us how many 11 year-olds or 15 year-olds you estimate access the wrong kind of content?”, and, “How much pornography do you think there is on your service despite the fact that you have banned it?” If the executives of those companies mislead Ofcom or refuse to answer, there are criminal sanctions in the Bill.
The package for user-to-user and search services enables us to really focus on those outcomes and drill down. In many cases, that will be more effective. I do not care whether they have age-assurance type A or type B; I care whether they are stopping 99.9% of 11 year-olds accessing the wrong kind of content. Now, using the framework in the Bill, Ofcom will be able to ask those questions and demand the answers, for the first time ever. I think that a focus on outcomes rather than inputs—the tools that they put in place—is going to be incredibly powerful.