Online Pornography (Commercial Basis) Regulations 2018 Debate
Baroness Benjamin (Liberal Democrat, Life peer)
Department for Digital, Culture, Media & Sport
Lords Chamber

My Lords, I know that the Minister has carefully considered the definition of “commercial pornography”, and I am grateful that he has engaged with my comments on previous drafts of the regulations and that we have met in person to discuss these. Further to those conversations, I am happy to say that I support the regulations and the guidance, and certainly encourage other noble Lords to do the same, although I have a number of concerns I would like to highlight.
First, I note that it has taken more than 18 months since Third Reading to get to the point where this House and the other place are considering the regulations to determine what is deemed commercial pornography and the regulator’s guidance on age verification. I hope the Minister can assure us that the full implementation of age verification for pornographic websites is now very close. Indeed, it would be even better if he could tell the House when he expects it to be operational.
Secondly, I note that in its report on the Bill, Sub-Committee B of the Secondary Legislation Scrutiny Committee said that the measures available to the BBFC, as the age-verification regulator, should be applied “fairly and transparently”. I certainly hope that they will be. To this end, I ask the Minister to place a letter in the Library nine months after age verification goes live, with an update on the number of websites with age verification in place and how many enforcement actions have taken place. I hope that that will be possible.
Thirdly, I cannot address the regulations and guidance that will help give effect to Part 3 of the Digital Economy Act without reflecting on the fact that, thanks to amendments introduced by your Lordships’ House, Part 3 will no longer address some very serious problems as effectively as it would have done. When Part 3, as amended, is implemented, there will be nothing in it to prevent non-photographic and animated child sex abuse images, which are illegal to possess under Section 62 of the Coroners and Justice Act 2009, being accessed behind age verification. This is a serious problem. In 2017, 3,471 reports of alleged non-photographic images of child sexual abuse were made to the Internet Watch Foundation, but since none of these images was hosted in the UK, it was unable to act.
Of course I appreciate that technically the amendments to the Digital Economy Bill, which removed from the regulator the power to take action against such material when it is behind age verification, did not have the effect of legalising possession of this material. The 2009 Act remains in place. However, as virtually all this material is beamed into the UK from other jurisdictions, the arrival of the Digital Economy Bill in your Lordships’ House meant that for the first time we had a credible means of enforcing that law online. There is no need for a regulator to be in the same jurisdiction as a website that it determines to block.
As I said at the time, greeting the first really credible means of enforcing that law online by removing the relevant enforcement mechanism from key parts of the Bill inevitably called into question our commitment to the law. I appreciate that there is arguably a mechanism for trying to enforce the law: the National Crime Agency can work with overseas agencies if websites with this material are identified. However, the mechanism is slow and expensive, and it remains unclear how it can have any effect if the domestic laws of the countries in question permit non-photographic child sex abuse images. To this extent, it was no surprise to me that in response to a Written Parliamentary Question in September 2018, the Government were unable to say whether the NCA had taken action against any websites, or whether any sites had been removed by overseas jurisdictions. ComRes polling published in the summer shows that 71% of MPs think that the regulator should be empowered to block these sites. Only 5% disagree.
The other loophole, of course, relates to all but the most extreme forms of violent pornography. Given that under the Video Recordings Act 1984 it is not legal to supply this material, it was entirely proper that the Digital Economy Bill, as it entered your Lordships’ House, did not accommodate such material. However, amendments were introduced in this House to allow it behind age verification. As I observed at the time, this sent out the message loud and clear that violence against women—unless it is “grotesque”, to quote the Minister on Report, at col. 1093—is, in some senses, acceptable.
My concerns about the impact of such material remain and have been mirrored by those of the Women and Equalities Select Committee in its report, which I referred to earlier. Of great importance, it states:
“There is significant research suggesting that there is a relationship between the consumption of pornography and sexist attitudes and sexually aggressive behaviour, including violence. The Government’s approach to pornography is not consistent: it restricts adults’ access to offline pornography to licensed premises and is introducing age-verification of commercial pornography online to prevent children’s exposure to it, but it has no plans to address adult men’s use of mainstream online pornography”.
I appreciate that we cannot deal with these problems today. The Government must, however, urgently prioritise how to address them. They could deal with the matter very quickly if they were to make time for my very short two-clause Digital Economy Act amendment Bill, which addresses the matter in full. With these caveats, I warmly welcome the regulations and the guidance.
My Lords, I welcome the Government’s decision finally to lay this guidance and the regulations for the House’s approval. It has not come a moment too soon. As the Minister knows, I have been concerned for some time that we should progress implementation of Part 3 of the Digital Economy Act and stop dragging our feet while harm is being done to our children. Almost every week, I hear of cases of children as young as four experiencing the traumatic horror of accidentally discovering pornographic material online. This can be devastating for young minds, causing them anxiety and depression.
This is ground-breaking child protection legislation and we should be proud, because it will be the first of its kind in the world. The UK is leading the way in online safety and setting an example for other countries that are looking to introduce similar controls. We can demonstrate that it is possible to regulate the internet to ensure that children can be protected from online pornographic material that we would never let them near in the offline world.
There is an abundance of evidence showing how harmful this material can be and, significantly, that children often do not seek it out but stumble across it. Research by the NSPCC found that children are as likely to stumble across pornography by accident as to search for it deliberately. Also significantly, the NSPCC reports that children themselves support age verification. Eighty per cent of young people felt that age verification is needed for sites that contain adult content.
The age-verification regulator, the British Board of Film Classification, has been working on implementing the legislation for a number of months and has kept me briefed on its progress. I am confident that it will successfully deliver age verification in the UK to prevent children stumbling across and accessing pornography. Its guidance sets out principle-based standards which will encourage even more innovation and allow for new means of age-verifying consumers in the future. This is important because if this regime is to work, age verification needs to be robust and privacy must be protected.
My concern, as always, is with child protection, but I recognise the need to ensure that this regime is robust enough to leave no commercial incentive to avoid compliance. For this reason, I am pleased that the BBFC has said in the annex to the guidance that it intends to introduce a voluntary scheme to bring in a higher privacy standard than the GDPR—which is already of a high standard.
I would like the Minister to reassure us that this scheme will be in place shortly and that the Government will fully support it. It is most important that, as the age-verification regulator, the BBFC will have a range of enforcement powers, including requesting ancillary service providers and payment service providers to withdraw their services from non-compliant websites, and instructing internet service providers to block them. These powers should be highly effective in achieving the legislation’s objectives and should be used as swiftly as possible to encourage compliance. I ask the Minister: how will the Government encourage ancillary service providers, which can only be “requested” to take action, to co-operate fully with the BBFC? I have been told by the BBFC that PayPal, Visa and MasterCard have already indicated that they will withdraw services where there is non-compliance. I also welcome the support that I understand will be given by the ISPs and mobile network operators. Their role will be crucial.