Online Pornography (Commercial Basis) Regulations 2018

(Limited Text - Ministerial Extracts only)

Tuesday 11th December 2018

Lords Chamber

Moved by
Lord Ashton of Hyde

That the draft Regulations laid before the House on 10 October be approved.

Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 38th Report, 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, the Digital Economy Act 2017 introduced a requirement for commercial providers of online pornography to have robust age-verification controls in place to prevent children and young people under 18 accessing pornographic material. Section 14(2) of the Act states:

“The Secretary of State may make regulations specifying … circumstances in which material is or is not to be regarded as made available on a commercial basis”.


In a sense, this is a small part of the legislative jigsaw needed to implement age verification: indeed, it is the last piece. I therefore beg to move that the draft regulations and the guidance published by the British Board of Film Classification, which is the designated regulator in respect of these measures, on age-verification arrangements and ancillary service providers be approved.

I bring to the attention of the House the concerns of the Joint Committee on Statutory Instruments and the Secondary Legislation Scrutiny Committee and thank them for their work. I will address their concerns in a moment and the Motion to Regret later, but before considering the specific points related to this debate, I want to remind the House of why the Government introduced this requirement.

In the offline world, there are strict rules to prevent children accessing adult content. This is not true of the online world. A large amount of pornography is available on the internet in the UK, often for free, with little or no protection to ensure that those accessing it are old enough to do so. This is changing the way that young people understand healthy relationships, sex and consent. A 2016 report commissioned by the Children’s Commissioner and the NSPCC makes that clear. More than half of the children sampled had been exposed to online pornography by the age of 15 and nearly half of boys thought pornography was “realistic”. Just under half wished to emulate what they had seen. The introduction of a requirement for age-verification controls is a necessary step to tackle these issues and contributes towards our commitment to making the UK the safest place in the world to be online. I urge noble Lords, in the ensuing debate, to bear this primary objective in mind and help us ensure the commencement of age verification as soon as possible.

The draft Online Pornography (Commercial Basis) Regulations set out the basis on which pornographic material is to be regarded as made available on a commercial basis. The regulations cover material on websites and applications that charge for access and they also cover circumstances where a person makes pornographic material available on the internet for free but that person receives other payment or reward in connection with doing so, for example through advertising revenue. It was clear from debates during the passage of the Digital Economy Act that it was not Parliament’s intention that social media sites on which pornography is only part of the overall content should be required to have age verification. That is reflected in the draft regulations we are debating today. We have set a threshold to ensure proportionality where material is made available free of charge. Thus there is an exemption for people making pornographic material available where it is less than one-third of the content on the website or application on which it is made available. This will ensure that websites that do not derive a significant proportion of their overall commercial benefit from pornography are not regarded as commercial pornographic websites. However, should such a website or app be marketed as making pornographic material available, a person making pornographic material available on that website or app will be considered to be making it available on a commercial basis, even if it constitutes less than one-third of the total.
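
For illustration only, the tests described above can be summarised as a simple decision rule. The sketch below is not part of the regulations; the function and parameter names are assumptions made purely for exposition, and the regulations themselves should be consulted for the authoritative wording.

def regarded_as_commercial(charges_for_access, receives_other_payment,
                           pornographic_fraction, marketed_as_pornographic):
    # Illustrative sketch only: parameter names are assumptions for
    # exposition, not terms defined in the regulations.
    # Material behind a paywall is made available on a commercial basis.
    if charges_for_access:
        return True
    # Free material is in scope only where the person receives some other
    # payment or reward (for example advertising revenue) in connection
    # with making it available.
    if not receives_other_payment:
        return False
    # A site or app marketed as making pornography available is treated as
    # commercial regardless of the one-third threshold.
    if marketed_as_pornographic:
        return True
    # Otherwise the exemption applies where pornographic material makes up
    # less than one-third of the content on the site or app.
    return pornographic_fraction >= 1.0 / 3.0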

This is a proportionate way of introducing a new policy. I am confident that these measures represent the most effective way of commencing this important new policy, but my department will, of course, keep it under review. Indeed, the Secretary of State must report on the regulatory framework within 12 to 18 months of commencement. In addition, the upcoming online harms White Paper will give us an opportunity to review the wider context of this policy.

We have also laid two pieces of BBFC guidance: the Guidance on Age-verification Arrangements and the Guidance on Ancillary Service Providers. The guidance on AV arrangements sets out the criteria by which the BBFC will assess that a person has met the requirements of Section 14 of the Digital Economy Act to ensure that pornographic material is not normally accessible by those under 18. The criteria mandate: an effective control mechanism at the point of access to verify that a user is aged 18 or over; strict requirements on age-verification data; a requirement to ensure that “revisits” do not allow automatic re-entry; and prevention of non-human operators exercising the age-verification regime. The BBFC has also provided examples of non-compliant features to help interested companies. The latter guidance provides a list of ancillary service providers that the BBFC will consider; the list is deliberately non-exhaustive, to ensure that this policy remains flexible to future developments. The BBFC published draft versions of both pieces of guidance and ran a four-week public consultation on their content. The draft guidance laid before this House takes account of comments received from affected companies and others.

I turn to the views of the JCSI, to which I referred earlier. We have been clear that although it will be a major step forward, age verification is not a complete answer to preventing children from viewing online pornography, and we know that we are doing something difficult. Indeed, we are the first country anywhere in the world to introduce such a measure. We have considered the JCSI’s concerns carefully. We do not believe that the variation in the language of the legislation, between “met” and “applied”, will be difficult for a court to interpret. As for the content threshold, the committee anticipates difficulty with the application and interpretation of the regulation. As I have already said, the regulation will not apply in a case where it is reasonable for the age-verification regulator to assume—those words are important—that pornographic material makes up less than one-third of the content. As is stated in the BBFC guidance, the BBFC will seek to engage and work with a person who may be in contravention of the requirement before commencing enforcement action.

I am aware that the committee has also drawn the special attention of both Houses to these two draft pieces of guidance, because in its view they fail to contain the guidance required by Section 25(1) of the 2017 Act and contain material that should not have been included. Section 3, paragraph 5 of the Guidance on Age-verification Arrangements sets out the criteria by which the regulator will treat age-verification arrangements as compliant. The guidance then goes on, in paragraph 6, to give examples of features which, in isolation, do not comply with the age-verification requirement. This ensures fairness: the guidance is product-neutral and, rather than recommending a particular solution, sets out principles to encourage innovation. The ancillary service providers’ guidance provides, in Section 3, paragraph 3, a list of classes of providers which the age-verification regulator may consider as within scope; the list is deliberately non-exhaustive, to ensure that this policy remains flexible for future developments. Where new classes of ancillary services appear in the future, the BBFC’s guidance explains the process by which these services will be informed.

The guidance includes additional material because this is a new policy and the regulator considered it important for stakeholders that its guidance set out the wider context in which it will carry out regulation. This includes valuable guidance on matters such as the BBFC’s approach and powers, and material on data protection. We find it somewhat perverse that the regulator should be prevented from including helpful guidance simply because it was not specifically mentioned in the Act.

We are also aware of the Secondary Legislation Scrutiny Committee’s special interest report. That committee raised some concerns similar to those of the JCSI; for example, on the content threshold and the requirements in the BBFC’s guidance. The responses to the concerns of the SLSC on these points are the same as the responses we have just given to the JCSI reports.

However, the SLSC also suggested that the House might want to ask what action the Government would take to tackle pornographic material that is available but does not fall within the criteria set out in the regulations. I appreciate that some pornography is available by means not covered by our regulations. This was the subject of extensive discussion during the passage of the Act. In particular, concern has been expressed about social media platforms. We expect those platforms to enforce their own terms and conditions and to protect children from harmful content. Indeed, the Government have been clear that online platforms must do more to protect users from such harmful content. In the forthcoming online harms White Paper, we will set out our plans for new legislation to ensure that companies make their platforms safer.

I recognise that age verification is not a complete answer but I am proud that this Government are leading the way internationally in our actions to protect children online. I beg to move.

The Lord Bishop of Chester

My Lords, I am pleased to speak in general support of the regulations and guidance. They relate to matters which I and others raised during the passage of the Digital Economy Bill in 2017 and, more broadly, to issues debated by the House a couple of years ago in a balloted debate that I introduced. The subject of that debate was the impact of pornography on our society. While there was some disagreement over the impact of pornography on adults, there was virtual unanimity that children needed to be protected from pornography—as far as this could reasonably be achieved. I seem somehow, by default, to have become the episcopal expert on pornography. I am trying to live that down. It is just the way it has fallen—although I often find myself talking from these Benches about things I have not had much experience of.

The regulations deal with protecting children through the introduction of robust age-verification procedures for accessing at least some pornographic sites. I welcome them but I note that there remains good evidence for believing that adult access to pornography is also often harmful. The recent report on sexual harassment by the Women and Equalities Select Committee in the other place made this point in a new context, particularly in relation to violent pornography. My welcome of the regulations and guidance is also tempered by some questions which they pose, and which I would like to put to the Minister.

My main concern relates to access to pornography on websites that do not charge for access. Provided their pornographic content amounts to less than one-third of their total content, they are exempt from the regulations. They may not charge, but they may make money from advertising and other sources. What is the rationale for choosing one-third and not, say, 10%? Parents really do not want their children to stumble across online pornography, and arguably children are more likely to do so on a website that does not charge in the first place. Why is it one-third? I realise that enforcement against every site would be a challenge, but surely the obligation to restrict access through age verification should fall on all sites which promote pornography. What we need is a culture change in relation to child protection, not a partial, piecemeal and limited approach, which I fear these regulations, in some respects, provide.

--- Later in debate ---
Lord Ashton of Hyde

My Lords, I thank noble Lords for their contributions and for the myriad questions which I will try to answer, in a slightly random order. It is important that we take a bit of time to discuss these; as many noble Lords have said, this is the start of something quite complicated. As I said at the beginning, we ought to bear in mind that we are trying to protect children. In the debates during the passage of the Digital Economy Bill, the Government always acknowledged that they would not have a complete solution, as many noble Lords said and as I mentioned during my opening remarks. We will take on board noble Lords’ comments. Indeed, we have shown—this is a partial answer to the question of why it has taken so long—that we have consulted quite widely; we have discussed the wording of the regulations themselves and the guidelines; and the Secretary of State’s guidelines to the BBFC, which the noble Lord, Lord Clement-Jones, mentioned, were available during the passage of the Digital Economy Bill.

We have tried to involve people, which is right given that we are at the beginning of something unique in the world. When we come to talk—I put a certain amount of emphasis on this—about social media and some of the areas that we do not cover in these regulations, we will look at those either in the review to come within 12 to 18 months or in the online harms White Paper. We are still discussing that White Paper and are still open to ideas about what it should include. I am pleased to say that the Secretary of State will make a meeting available to all Peers to discuss what they think should be in the White Paper. We will do that as soon as we can; I will let Peers know about it in due course.

--- Later in debate ---
Lord Stevenson of Balmacara

I am sure that the noble Baroness, Lady Howe, was about to leap to her feet but, to save her doing so, I mention to the Minister that he did not answer the question which she posed, and which was picked up by the noble Baroness, Lady Benjamin, about whether he would find time for the excellent two-paragraph Bill which she has in process and which would solve many of these problems.

Lord Ashton of Hyde

I had not forgotten that. It would obviously be difficult for me to commit to finding the necessary time but I will take that back to the department. I am not sure that it is currently within the plans of the Chief Whip to bring forward that legislation but I will ask. I understand the point that is being made but, as I said, the issue may well be covered within the review. I am afraid I cannot go any further than that tonight.

As for ancillary service providers, the BBFC and the DDCMS have been engaging with several companies. They have already agreed to act, as doing so is in line with their current terms of service. Therefore, we are optimistic that the voluntary approach will work, and of course that will be reviewed.

The right reverend Prelate, the noble Earl, Lord Erroll, and others talked about the rationale for choosing one-third of content as the appropriate threshold. During the passage of the Bill, it was established that the focus should be on commercial pornography sites and not on social media. There were good reasons for that but I do not want to revisit them—that is what was decided. The one-third threshold was regarded as proportionate in introducing this new policy where sites make pornography available free of charge. However, websites that market themselves as pornographic will also be required to have age verification, even if less than a third of the content is pornographic.

A third is an arbitrary amount. It was discussed and consulted on, and we think that it is a good place to start on a proportionate basis. We will keep this matter under review and, as I said, it will be one of the obvious things to be taken into account during the 12 to 18-month review. The noble Lord, Lord Morrow, asked how it will be measured. It will be measured by assessing the number of pieces of content rather than the length of individual videos. It will include all pornographic images, videos and individual bits of content, but the point to remember is that the threshold is there so that a decision can be made on whether it is reasonable for the regulator to assume that pornographic content makes up more than one-third of the entire content. This will be done by sampling the various sites.
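
For illustration only, the counting approach described above, in which individual pieces of content are sampled and the pornographic share is compared with the one-third threshold, might be sketched as follows. The names are assumptions for exposition; the BBFC’s actual sampling methodology is not set out here.

def exceeds_one_third_threshold(sampled_items):
    # sampled_items is a list of booleans, one per sampled piece of content
    # (image, video and so on), True where the item is pornographic.
    if not sampled_items:
        return False
    pornographic = sum(1 for item in sampled_items if item)
    return pornographic / len(sampled_items) >= 1.0 / 3.0

# For example, a sample of 12 pieces of content of which 5 are pornographic:
sample = [True] * 5 + [False] * 7
print(exceeds_one_third_threshold(sample))  # True, since 5/12 is above 1/3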

The noble Earl, Lord Erroll, asked about ISP blocking and suggested that everyone would try to game the system to get out of meeting the requirements. That is not what we believe. The BBFC has already engaged with ISPs and we are confident that this will be an effective sanction. The wording in the guidance indicates that the regulator should take a “proportionate approach”. However, we are grateful for the noble Earl’s help. I am sure that he will also help during the review and later in the process when it comes to online harms. I see that he wants to help now.

The Earl of Erroll

It is not the ISPs that I am worried about; it is the websites that will game the system on notification, appeals and so on. That is the bit that will take a long time.

Lord Ashton of Hyde

We are confident it will work, but we will have to see when it comes to the review. It is an arbitrary figure that we came to by consensus. I will leave it at that.

The noble Lord, Lord Paddick, talked about education for children about sex and relationships. We are extending that by making relationships education compulsory in all primary schools, relationships and sex education compulsory in all secondary schools, and health education compulsory in primary and secondary schools. We understand that it is important. Together with the protection of children we are introducing today, we will have to keep an eye on it. I notice that the DCMS committee in the other place is launching an inquiry into, among other things, the effects of social media on people’s attitudes, including those of children. In a sense, we are all learning as we go, because the technology is developing. It is something we are aware of and keeping an eye on, and we take the point.

As for the big issue of the evening, and why social media sites are not in scope, that was a decision taken after a debate during the passage of the Digital Economy Bill. We did not want to undermine the benefits of social media sites. But I confirm to the noble Lord, Lord Stevenson, that we will consider that in the online harms White Paper. Noble Lords will be welcome to add their thoughts on that very soon—either just before or after Christmas.

As noble Lords have mentioned, there is a memorandum of understanding that clarifies the role of the ICO and the powers it will have, as distinct from those of the BBFC. The BBFC will administer the voluntary certification scheme that will hold AV services to the highest standards of privacy protection and cybersecurity. We expect the vast majority of AV services to seek accreditation. Furthermore, the BBFC will inform the ICO of any non-certified age-verification solutions it finds, and the ICO will be able to examine them. Even if providers do not want to apply for voluntary certification, the ICO will make sure they are subject to the full rigours of the GDPR.

I have covered most of the main points; I will look at Hansard and write to noble Lords if I have not covered any. I think it is evident from all the contributions from across the House that this is a complex and novel policy that requires sensitive handling. Having listened to all the contributions and heard support for the regulations as they stand, albeit limited and with some suggestions for improvement, I remain of the view that these regulations set out clearly what will fall within their scope. I think the guidance from the BBFC sets out clearly how it will assess the requirements of Section 14 and clarifies the BBFC’s approach to payment and ancillary service providers.

We are on the verge of doing something important that has the potential to make a real difference to the experience children have online and to make the internet a safer place for them, so I finish where I began. We are here to protect children, and for that reason I ask the noble Lord, Lord Stevenson, to withdraw his Motion, or indeed not to move it, and respectfully ask the House to approve the two pieces of guidance and the statutory instrument.

Motion agreed.