Online Safety Bill Debate
Baroness Ritchie of Downpatrick (Labour, Life peer) — debate with the Department for Digital, Culture, Media & Sport (1 year, 6 months ago)
Lords Chamber

My Lords, it is such a privilege to follow the noble Baroness, Lady Benjamin. I pay tribute to her years of campaigning on this issue and the passion with which she spoke today. It is also a privilege to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, in supporting all the amendments in this group. They are vital to this Bill, as all sides of this Committee agree. They all have my full support.
When I was a child, my grandparents’ home, like most homes, was heated by a coal fire. One of the most vital pieces of furniture in any house where there were children in those days was the fireguard. It was there to prevent children getting too near to the flame and the smoke, either by accident or by design. It needed to be robust, well secured and always in position, to prevent serious physical harm. You might have had to cut corners on various pieces of equipment for your house, but no sensible family would live without the best possible fireguard they could find.
We lack any kind of fireguard at present and the Bill currently proposes an inadequate fireguard for children. A really important point to grasp on this group of amendments is that children cannot be afforded the protections that the Bill gives them unless they are identified as children. Without that identification, the other protections fail. That is why age assurance is so foundational to the safety duties and mechanisms in the Bill. Surely, I hope, the Minister will acknowledge both that we have a problem and that the present proposals offer limited protection. We have a faulty fireguard.
These are some of the consequences. Three out of five 11 to 13 year-olds have unintentionally viewed pornography online. That is most of them. Four out of five 12 to 15 year-olds say they have had a potentially harmful experience online. That is almost universal. Children as young as seven are accessing pornographic content and three out of five eight to 11 year-olds—you might want to picture a nine year-old you know—have a social media profile, when they should not access those sites before the age of 13. That profile enables them to view adult content. The nation’s children are too close to the fire and are being harmed.
There is much confusion about what age assurance is. As the noble Baroness, Lady Kidron, has said, put simply it is the ability to estimate or verify an individual’s age. There are many different types of age assurance, from facial recognition to age verification, which all require different levels of information and can give varying levels of assurance. At its core, age assurance is a tool which allows services to offer age-appropriate experiences to their users. The principle is important, as what might be appropriate for a 16 year-old might be inappropriate for a 13 year-old. That age assurance is absolutely necessary to give children the protections they deserve.
Ofcom’s research shows that more than seven out of 10 parents of children aged 13 to 17 were concerned about their child seeing age-inappropriate content or adult or sexual content online. Every group I have spoken to about the Bill in recent months has shared this concern. Age assurance would enable services to create age-appropriate experiences for children online and can help prevent children’s exposure to this content. The best possible fireguard would be in place.
Different levels of age assurance are appropriate in different circumstances. Amendments 161 and 142 establish that services which use age assurance must do so in line with the basic rules of the road. They set out that age assurance must be proportionate to the level of risk of a service. For high-risk services, such as pornography, sites must establish the age of their users beyond reasonable doubt. Equally, a service which poses no risk may not need to use age assurance or may use a less robust form of age assurance to engage with children in an age-appropriate manner—for example, serving them the terms and conditions in a video format.
As has been said, age assurance must be privacy-preserving. It must not be used as an excuse for services to use the most intrusive technology for data-extractive purposes. These are such common-sense amendments, but vital. They will ensure that children are prevented from accessing the most high-risk sites, enable services to serve their users age-appropriate experiences, and ensure that age assurance is not used inappropriately in a way that contravenes a user’s right to privacy.
As has also been said, there is massive support for this more robust fireguard in the country at large, across this House and, I believe, in the other place. I have not yet been able to understand, or begin to understand, the Government’s reasons for not providing the best protection for our children, given the aim of the Bill. Better safeguards are technically possible and eminently achievable. I would be grateful if the Minister could attempt to explain what exactly he and the Government intend to do, given the arguments put forward today and the ongoing risks to children if these amendments are not adopted.
My Lords, it is a pleasure to follow the right reverend Prelate the Bishop of Oxford. He used an interesting analogy of the fireguard; what we want in this legislation is a strong fireguard to protect children.
Amendments 183ZA and 306 are in my name, but Amendment 306 also has the name of the noble Lord, Lord Morrow, on it. I want to speak in support of the general principles raised by the amendments in this group, which deal with five specific areas, namely: the definition of pornography; age verification; the consent of those participating in pornographic content; ensuring that content which is prohibited offline is also prohibited online; and the commencement of age verification. I will deal with each of these broad topics in turn, recognising that we have already dealt with many of the issues raised in this group during Committee.
As your Lordships are aware, the fight for age verification has been a long one. I will not relive that history but I remind the Committee that when the Government announced in 2019 that they would not implement age verification, the Minister said:
“I believe we can protect children better and more comprehensively through the online harms agenda”.—[Official Report, Commons, 17/10/19; col. 453.]
Four years later, the only definition for pornography in the Bill is found in Clause 70(2). It defines pornographic content as
“produced solely or principally for the purpose of sexual arousal”.
I remain to be convinced that this definition is more comprehensive than that in the Digital Economy Act 2017.
Amendment 183ZA is a shortened version of the 2017 definition. I know that the Digital Economy Act is out of vogue but it behoves us to have a debate about the definition, since what will be considered as pornography is paramount. If we get that wrong, age verification will be meaningless. Everything else about the protections we want to put in place relies on a common understanding of when age verification will be required. Put simply, we need to know what it is we are subjecting to age verification, and it needs to be clear. The Minister stated at Second Reading that he believed the current definition is adequate. He suggested that it ensured alignment across different pieces of legislation and other regulatory frameworks. In reviewing other legislation, the only clear thing is this: there is no standard definition of pornography across the legislative framework.
For example, Section 63 of the Criminal Justice and Immigration Act 2008 uses the definition in the Bill, but it requires a further test to be applied: meeting the definition of “extreme” material. Section 368E of the Communications Act 2003 regulates online video on demand services. That section uses the objective tests of “prohibited material”, meaning material too extreme to be classified by the British Board of Film Classification, and “specially restricted material”, covering R18 material, while also using a subjective test that covers material that
“might impair the physical, mental or moral development”
of under-18s.