Online Safety Bill Debate
My Lords, one of our clergy in the diocese of Guildford has been campaigning for more than a decade, as have others in this Committee, on children’s access to online pornography. With her, I support the amendments in the names of the noble Baronesses, Lady Kidron and Lady Harding.
Her concerns eventually made their way to the floor of the General Synod of the Church of England in a powerful debate in July last year. The synod voted overwhelmingly in favour of a motion, which said that we
“acknowledge that our children and young people are suffering grave harm from free access to online pornography”
and urged us to
“have in place age verification systems to prevent children from having access to those sites”.
It asked Her Majesty’s Government to use their best endeavours to secure the passage and coming into force of legislation requiring age-verification systems preventing access by people under the age of 18. It also recommended more social and educational programmes to increase awareness of the harms of pornography, including self-generated sexually explicit images.
Introducing the motion, my chaplain, Reverend Jo Winn-Smith, said that age verification
“ought to be a no-brainer … Exposure to sexualised material is more likely to lead to young people engaging in more sexualised behaviour and to feel social pressure to have sex”,
as well as normalising sexual violence against girls and women. A speech from the chaplain-general of the Prison Service towards the end of the debate highlighted just where such behaviours and pressures could lead in extreme circumstances.
One major theme that emerged during the debate is highlighted by the amendments this afternoon: that access to online pornography goes far beyond materials that fall into what the Bill defines as Part 5 services. Another is highlighted in a further group of amendments: age assurance needs to be both mandatory and effective beyond reasonable doubt.
Members also commented on how long this whole area has taken to get on to the statute book, given David Cameron’s proposals as far back as 2013 and further legislation proposed in 2018 that was never enacted. Talk of secondary legislation to define harmful content in that regard is alarming, as a further amendment indicates, given the foot-dragging that has now gone on for more than a decade. That is a whole generation of children and young people.
In an imaginative speech in the synod debate, the most reverend Primate the Archbishop of York, Archbishop Stephen, reminded us that the internet is not a platform; it is a public space, where all the rights and norms you would expect in public should apply. In the 1970s, he continued, we famously put fluoride in the water supply, because we knew it would be great for dental health; now is the opportunity to put some fluoride into the internet. I add only this: let us not water down the fluoride to a point where it becomes feeble and ineffective.
My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.
As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.
The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.
While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data, perhaps on the assumption that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House statistics on how children are accessing pornography and the harm it causes. The Children’s Commissioner has also recently highlighted these concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be so that children are protected online.
I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.
Children must be kept safe wherever they are online. This Bill must have the widest possible scope to keep children safe, but ensuring online-offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far-reaching as possible and consistently applied across the online and offline worlds. These are the reasons why I support the amendments in this group.
My Lords, I will lend my support to Amendments 19 and 22. It is a pleasure to speak after the noble Baroness, Lady Benjamin. I may be one of those people in your Lordships’ House who relies significantly on the British Board of Film Classification for movie watching, as I am one of the faint-hearted.
In relation to app stores, it is not just children under 18 for whom parents need age verification. If you are the parent of a child who has a significant learning delay, the internet is a wonderful place where they can access material and develop in ways that they might not ordinarily have done. But, of course, turning 17 or 18 is not the threshold for them. I have friends whose children have significant learning delays. Having that assurance, so that they know which apps are which in the app store, matters well beyond the age of 18 for them. Obviously it will not be a numerical equivalent for their child—now a young adult—but it is important for them to know that the content they get on a free app or an app purchased from the app store is suitable.
I just wanted to raise that with noble Lords, as children and some vulnerable adults—not all—would benefit from the kind of age verification that we have talked about. I appreciate the points that the noble Lord, Lord Allan, raised about where the Bill has ended up conceptually and the framework that Ofcom will rely on. Like him, I am a purist sometimes but, pragmatically, I think that the third concept raised by the noble Baroness, Lady Kidron, about protection and putting this in the app store and bringing it parallel with things such as classification for films and other video games is really important.
My Lords, I violently agree with my noble friend Lord Moylan that the grouping of this amendment is unfortunate. For that reason I am not going to plunge into the issue in huge detail, but there are a couple of things on which I would like to reassure my noble friend, and I have a question for the Minister.
The noble Baroness, Lady Kidron, said there is a package of amendments around age verification and that we will have a lot of time to dive into this, and I think that is probably the right format for doing it. However, I reassure my noble friend Lord Moylan that he is absolutely right. The idea is not in any way to shut off the town square from everyone simply because there might be something scary there.
Clause 11(3) refers to priority content, which the noble Lord will know is to do with child abuse and fraudulent and severely violent content. This is not just any old stuff; this is hardcore porn and the rest. As in the real world, that content should be behind an age-verification barrier. At the moment we have a situation on the internet where, because it has not been well-managed for a generation, this content has found itself everywhere: on Twitter and Reddit, and all sorts of places where really it should not be because there are children there. We envisage a degree of tidying up of social media and the internet to make sure that the dangerous content is put behind age verification. What we are not seeking to do, and what would not be a benign or positive action, is to put the entire internet behind some kind of age-verification boundary. From that point of view, I completely agree with my noble friend.
My Lords, as might be expected, I will speak against Amendment 26 and will explain why.
The children’s charity Barnardo’s—here I declare an interest as vice-president—has said, as has been said several times before, that children are coming across pornographic content from as young as seven. Often they stumble across the content accidentally, searching for terms such as “sex” or “porn” without knowing what they mean. The impact this is having on children is huge. It is harming their mental health and distorting their perception of healthy sexual relationships and consent. That will go with them into adulthood.
Age verification for pornography and age assurance to protect children from other harms are crucial to protect children from this content. In the offline world, children are rightly not allowed to buy pornographic DVDs in sex shops but online they can access this content at the click of a button. This is why I will be supporting the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, and am fully supportive of their age assurance and age verification schedule.
My Lords, to go back to more than just the age question, the noble Lord, Lord Allan of Hallam, reminded us that community-led moderation is not just Wikipedia. As I tried to hint at earlier, it is one of the most interesting, democratic aspects of the online world, and we should protect it.
We often boast that we are a self-regulating House and that this makes us somehow superior to those up the road—we are all so mature because we self-regulate; people do behave badly, but we decide. It is a lesson in democracy that you have a self-regulating House, and there are parts of the online world that self-regulate. Unless we think that the citizens of the UK are less civilised than Members of the House of Lords, which I would refute, we should say that it is positive that there are self-moderating, self-regulating online sites. If you can say something and people can object and have a discussion about it, and things can be taken down, to me that is the way we should deal with speech that is inappropriate or wrong. The bulk of these amendments—I cannot remember how many there are now—are right.
I was glad that the noble Lord, Lord Moylan, said he could not understand why this grouping had happened, which is what I said earlier. I had gone through a number of groupings thinking: “What is that doing there? Am I missing something? Why is that in that place?” I think we will come back to the age verification debate and discussion.
One thing to note is that one of the reasons organisations such as Wikipedia would be concerned about age verification—and they are—is anonymity. It is something we have to consider: what is going to happen to anonymity? It is so important for journalists, civil liberties activists and whistleblowers. Many Wikipedia editors are anonymous, perhaps because they are editing pages on politically controversial issues. Imagine being a Wikipedia editor from Russia at the moment—you would not want to have to say who you are. We will come back to this, but it is important to understand that those behind Amendment 26, and those who say that we should look again at the question of age verification, are not doing so because they do not care about children or are not interested in protecting them. However, the dilemmas that any age-gating or age verification poses for adult civil liberties have to be considered. We have to worry that, because of an emphasis on checking age, some websites will decide to sanitise what they allow to be published to make it suitable for children, just in case children come across it. Again, that will have a detrimental impact on adult access to all knowledge.
These will be controversial issues, and we will come back to them, but it is good to have started the discussion.