My Lords, first, I will address Amendments 12BA, 183A and 183B, tabled by the noble Baroness, Lady Ritchie of Downpatrick, with whom I was grateful to discuss them earlier today, and the noble Lord, Lord Morrow; I am grateful to his noble friend, the noble Lord, Lord Browne of Belmont, for speaking to them on his behalf.
These amendments seek to apply the duties in Part 5 of the Bill, which are focused on published pornographic content, to user-generated pornography as well. Amendments 183A and 183B are focused particularly on making sure that children are protected from user-to-user pornography in the same way as from published pornography, including through the use of age verification. I reassure the noble Baroness and the noble Lord that the Government share their concerns; there is clear evidence about the impact of pornography on young people and the need to protect children from it.
This is where I come to the questions posed earlier by the noble Lord, Lord McCrea of Magherafelt and Cookstown. The research we commissioned from the British Board of Film Classification assessed the functionality of and traffic to the UK’s top 200 most visited pornographic websites. The findings indicated that 128 of the top 200 most visited pornographic websites—that is just under two-thirds, or 64%—would have been captured by the proposed scope of the Bill at the time of the Government’s initial response to the online harms White Paper, and that represents 85% of the traffic to those 200 websites.
Since then, the Bill’s scope has been broadened to include search services and pornography publishers, meaning that children will be protected from pornography wherever it appears online. The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk to children, such as online pornography. Age-assurance technologies and other measures will be used to provide children with an age-appropriate experience on their service.
As noble Lords know, the Bill does not mandate that companies use specific approaches or technologies when keeping children safe online as it is important that the Bill is future-proofed: what is effective today might not be so effective in the future. Moreover, age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties under the Bill. For instance, if a user-to-user service, such as a social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. That would allow content to be better detected and removed, instead of restricting children from a service that is designed to be appropriate for their use—as my noble friend Lady Harding of Winscombe puts it, avoiding the situation where children are removed from these services altogether.
While I am sympathetic to the aims of these amendments, I assure noble Lords that the Bill already has robust, comprehensive protections in place to keep children safe from all pornographic content, wherever or however it appears online. These amendments are therefore unnecessary because they duplicate the existing provisions for user-to-user pornography in the child safety duties in Part 3.
It is important to be clear that, wherever they are regulated in the Bill, companies will need to ensure that children cannot access pornographic content online. This is made clear, for user-to-user content, in Clause 11(3); for search services, in Clause 25(3); and for published pornographic content in Clause 72(2). Moving the regulation of pornography from Part 3 to Part 5 would not be a workable or desirable option because the framework is effective only if it is designed to reflect the characteristics of the services in scope.
Part 3 has been designed to address the particular issues arising from the rapid growth in platforms that allow the sharing of user-generated content but are not the ones choosing to upload that content. The scale and speed of dissemination of user-generated content online demands a risk-based and proportionate approach, as Part 3 sets out.
It is also important that these companies understand the risks to children in the round, rather than focusing on one particular type of content. Risks to children will often be a consequence of the design of these services—for instance, through algorithms, which need to be tackled holistically.
I know that the noble Baroness is concerned about whether pornography will indeed be designated as primary priority content for the purposes of the child safety duties in Clauses 11(3) and 25(3). The Government fully intend this to be the case, which means that user-to-user services will need to have appropriate systems to prevent children accessing pornography, as defined in Clause 70(2).
The approach taken in Part 3 is very different from services captured under Part 5, which are publishing content directly, know exactly where it is located on their site and already face legal liability for the content. In this situation the service has full control over its content, so a risk-based approach is not appropriate. It is reasonable to expect that service to prevent children accessing pornography. We do not therefore consider it necessary or effective to apply the Part 5 duties to user-to-user pornographic content.
I also assure the noble Baroness and the noble Lord that, in a case where a provider of user-to-user services is directly publishing pornographic content on its own service, it will already be subject to the Part 5 duties in relation to that particular content. Those duties in relation to that published pornographic content will be separate from and in addition to their Part 3 duties in relation to user-generated pornographic content.
This means that, no matter where published pornographic content appears, the obligation to ensure that children are not normally able to encounter it will apply to all in-scope internet service providers that publish pornographic content. This is made clear in Clause 71(2) and is regardless of whether they also offer user-to-user or search services.
I am sorry, but can the Minister just clarify that? Is he saying that it is not possible to be covered by both Part 3 and Part 5, so that where a Part 5 service has user-generated content it is also covered by Part 3? Can he clarify that you cannot just escape Part 5 by adding user-generated content?
Yes, that is correct. I was trying to address the points raised by the noble Baroness, but the noble Lord is right. On the question of whether people might try to be treated differently by allowing comments or reviews on their content, they would be treated in the same way; I think that concern is the motivation behind the noble Baroness's amendment, which seeks to narrow the definition. There is no risk that a publisher of pornographic content could evade its Part 5 duties by enabling comments or reviews on its content. That would be the case whether or not those reviews contained words, non-verbal indications that a user liked something, emojis or any other form of user-generated content.
That is because the Bill has been designed to confer duties on different types of content. Any service with provider pornographic content will need to comply with the Part 5 duties to ensure that children cannot normally encounter such content. If they add user-generated functionality—
I am sorry to come back to the same point, but let us take the Twitter example. As a publisher of pornography, does Twitter then inherit Part 5 responsibilities in as much as it is publishing pornography?
Twitter is covered in the Bill. I am not quite sure what my noble friend is asking me. The harms that he is worried about are covered in different ways. Twitter or another social medium that hosts such content would be hosting it, not publishing it, so would be covered by Part 3 in that instance.
Maybe my noble friend the Minister could write to me to clarify that point, because it is quite a significant one.
Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.
I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.
My Lords, this has been a very wide-ranging debate, concentrating not only on the definition of pornography but on noble Lords' views on how, and indeed whether, it should be regulated: regulated across the board, as the noble Baroness, Lady Kidron, the noble Lords, Lord Bethell and Lord Browne, and I myself believe, or through a graduated response, which seems to be the view of the noble Lords, Lord Allan and Lord Clement-Jones.
I believe that all pornography should be treated the same. There is no graduated response. It is something that is pernicious and leads to unintended consequences for many young people, so therefore it needs to be regulated in all its forms. I think that is the point that the noble Lord, Lord Bethell, was making. I believe that these amendments should have been debated along with those of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, because then we could have had an even wider-ranging debate, and I look forward to that in the further groups in the days to come. The focus should be on the content, not on the platform, and the content is about pornography.
I agree with the noble Baroness, Lady Kidron, that porn is not the only harm, and I will be supporting her amendments. I believe that they should be in the Bill because if we are serious about dealing with these issues, they have to be in there.
I do not think my amendments suggest that children will be removed from social media. I agree that it is a choice to remove pornography or to age-gate; Twitter is moving to subscriber content anyway, so it can do it, and the technology to do so is already available. I believe you simply age-gate the pornographic content, not the whole site. As I said in my original contribution, I agree with the noble Lord, Lord Clement-Jones, that these amendments should have been debated in conjunction with those of the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, as the amendments in this group are complementary to theirs.
I found the Minister’s response interesting. Obviously, I would like time to read Hansard. I think certain undertakings were given, but I want to see clearly spelled out where they are and to discuss with colleagues across the House where we take these issues and what we come back with on Report.
I believe that these issues will be debated further in Committee when the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, are debated. I hope that in the intervening period the Minister will have time to reflect on the issues raised today about Parts 3 and 5 and the issue of pornography, and that he will be able to help us in further sessions in assuaging the concerns that we have raised about pornography. There is no doubt that these issues will come back. The only way that these issues and pornography can be dealt with, and that all our children throughout the UK can be protected, is through proper regulation.
I think we all need further reflection. I will see, along with colleagues, whether it is possible to come back on Report. In the meantime, I beg leave to withdraw the amendment.