Online Safety Bill Debate
Baroness Ritchie of Downpatrick: My Lords, I am very happy to move Amendment 29 and to speak to Amendments 83 and 103, which are also in my name. We have just had a debate about the protection of children online, and this clearly follows on from that.
The intention of the Bill is to set general parameters through which different content types can be regulated. The problem with that approach, as the sheer number of amendments highlights, is this: not all content and users are the same, so they cannot all be treated in the same way. Put simply, not all online content should be legislated for in the same way. That is why the amendments in this group are needed.
Pornography is a type of content that cannot be regulated in general terms; it needs specific provisions. I realise that some of these issues were raised in the debate last Tuesday on amendments in my name, and again on Thursday when we discussed harms to children. I recognise too that, in his response to Thursday’s debate, the Minister made a welcome announcement on primary priority content, which I hope will be set out in the Bill, as we have been asking throughout this debate. While we wait to see the detail of that announcement, I think it safe to assume that pornography will be one of the harms named in the Bill, which makes discussion of these amendments that bit more straightforward.
Given that context, under Clause 11(3), user-to-user services that fall within the scope of Part 3 of the Bill have a duty to prevent children from accessing primary priority content. This duty is repeated in Clause 25(3) for search services. That duty is, however, qualified by the words,
“using proportionate systems and processes”.
It is the word “proportionate”, and how it would apply to the regulation of pornography, that is at the heart of the issue.
Generally speaking, acting in a proportionate way is a sensible approach to legislation and regulation. For the most part, regulation and safeguards should ensure that a duty is not onerous and does not place a disproportionate cost on the service provider that might make their business unviable. While that is the general principle, proportionality is not an appropriate consideration for every policy decision.
In the offline world, legislation and regulation are not always proportionate. This is even more stark when regulating for children. The noble Lord, Lord Bethell, raised the example of the corner shop last Tuesday, and it is apt to my point today. We do not take a proportionate approach to the sale of alcohol or cigarettes: we do not treat a corner shop differently from a supermarket. It would be absurd if I were to suggest that a small shop should apply different age checks for children when selling alcohol from those we expect a large supermarket to apply. In the same way, we already decline to apply proportionality to some online activities. Gambling, for example, is age-verified so that children cannot take part. Indeed, gambling companies are not allowed to make their product attractive to children and must advertise in a regulated way to avoid harm to children and young people. The harm caused to children by gambling is significant, so the usual policy considerations of proportionality do not apply. Clearly, both online and offline, there are some goods and services to which a proportionality test is not applied; there is no subjectivity. A child cannot buy alcohol or gamble, and should not be able to access pornography.
In the UK, there is a proliferation of online gambling sites. It would be absurd to argue that the size of a gambling company, or the revenue it makes, should be a consideration in whether it must use age verification to prevent children from placing a bet. In the same way, it would be absurd to argue that the size or revenue of a pornographic website could override a duty to employ age verification to ensure that children do not access that website.
This is not a grey area. It is beyond doubt that exposing children to pornography damages their health and development. The Children’s Commissioner’s report from this year has been much quoted already in Committee, but it is worth reminding your Lordships what she found: pornography is “widespread and normalised”, to the extent that children cannot opt out. The average age at which children first see pornography is 13; 10% had seen it by age nine, 27% by age 11 and half by age 13. The report also found that frequent users of pornography are, sadly, more likely to engage in physically aggressive sex acts.
There is nothing proportionate about the damage done by pornographic content. A service’s size, number of visitors, financial budget or technical know-how must not be a consideration in whether to deploy age checks. If a platform is incapable, for any reason, of protecting children from harmful exposure to pornography, it must remove that content. The Bill should be clear: if there is pornography on a website, it must use age verification. We know that pornographic websites will do all they can to evade age verification. In France and Germany, which are ahead of us in legislating to protect minors from pornography, regulators are tangled up in court action as the pornographic sites they first targeted for enforcement argue against the law.
We must also anticipate the response of websites that are not dedicated exclusively to pornography, especially social media, a point we touched on during Tuesday’s debate. Reuters reported last year that an internal Twitter presentation stated that 13% of tweets were pornographic. Indeed, the Children’s Commissioner has found that Twitter is the platform where young people are most likely to encounter pornographic content. I know that some of your Lordships are concerned about age-gating social media. No one is suggesting that social media should exclude children, a point that has been made already. What I am suggesting is that pornography on such platforms should be subject to age verification. The capability to do this already exists. New accounts on Twitter have to opt in to view pornographic content; why can that opt-in function not be age-gated? Twitter is moving to subscription content; why can it not make pornographic content subscription-based, with the subscription age-verified? The solutions exist.
The Minister may seek to reassure the House that the Bill as drafted would not allow any website or search facility regulated under Part 3 that hosts pornographic content to evade its duties because of size, capacity or cost. But, as we have seen in France, these terms will be subject to court action. I therefore trust that the Government will bring forward an amendment to ensure that any platform that hosts pornographic content will employ age verification, regardless of any other factors. Perhaps the Minister in his wind-up can provide us with some detail or a hint of a future amendment at Report. I look forward to hearing and considering the Minister’s response. I beg to move.
Baroness Benjamin (Liberal Democrat): My Lords, I wish to speak in support of Amendments 29, 83 and 103 in the name of the noble Baroness, Lady Ritchie. I am extremely pleased that the Minister said last Tuesday that pornography will be within primary priority content; he then committed on Thursday to naming primary priority content in the Bill. This is good news. We also know that pornography will come within the child safety duties in Clause 11. This makes me very happy.
In the document it produced for the Government in January 2021, the BBFC said that there were millions of pornographic websites; I repeat, millions. Many of these will come within Part 3 of the Bill because they allow users to upload videos, comment on content and chat with other users. Some of these millions of websites will be very large, which means by definition that we expect them to come within the scope of the Bill. Under Clause 11(3), user-to-user services have a duty to prevent children from accessing primary priority content. The duty is qualified by the phrase
“using proportionate systems and processes”.
The factors for deciding what is proportionate are set out in Clause 11(11): the potential harm of the content, based on the children’s risk assessment, and the size and capacity of the provider of the service. Amendments 29, 83 and 103 tackle the issue of size and capacity.