Online Pornography (Commercial Basis) Regulations 2018

Debate between Lord Ashton of Hyde and Lord Bishop of Chester
Tuesday 11th December 2018

Lords Chamber
The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, the Digital Economy Act 2017 introduced a requirement for commercial providers of online pornography to have robust age-verification controls in place to prevent children and young people under 18 accessing pornographic material. Section 14(2) of the Act states:

“The Secretary of State may make regulations specifying … circumstances in which material is or is not to be regarded as made available on a commercial basis”.


In a sense, this is a small part of the legislative jigsaw needed to implement age verification: indeed, it is the last piece. I therefore beg to move that the draft regulations and the guidance published by the British Board of Film Classification, which is the designated regulator in respect of these measures, on age-verification arrangements and ancillary service providers be approved.

I bring to the attention of the House the concerns of the Joint Committee on Statutory Instruments and the Secondary Legislation Scrutiny Committee and thank them for their work. I will address their concerns in a moment and the Motion to Regret later, but before considering the specific points related to this debate, I want to remind the House of why the Government introduced this requirement.

In the offline world, there are strict rules to prevent children accessing adult content. This is not true of the online world. A large amount of pornography is available on the internet in the UK, often for free, with little or no protection to ensure that those accessing it are old enough to do so. This is changing the way that young people understand healthy relationships, sex and consent. A 2016 report commissioned by the Children’s Commissioner and the NSPCC makes that clear. More than half of the children sampled had been exposed to online pornography by the age of 15 and nearly half of boys thought pornography was “realistic”. Just under half wished to emulate what they had seen. The introduction of a requirement for age-verification controls is a necessary step to tackle these issues and contributes towards our commitment to making the UK the safest place in the world to be online. I urge noble Lords, in the ensuing debate, to bear this primary objective in mind and help us ensure the commencement of age verification as soon as possible.

The draft Online Pornography (Commercial Basis) Regulations set out the basis on which pornographic material is to be regarded as made available on a commercial basis. The regulations cover material on websites and applications that charge for access and they also cover circumstances where a person makes pornographic material available on the internet for free but that person receives other payment or reward in connection with doing so, for example through advertising revenue. It was clear from debates during the passage of the Digital Economy Act that it was not Parliament’s intention that social media sites on which pornography is only part of the overall content should be required to have age verification. That is reflected in the draft regulations we are debating today. We have set a threshold to ensure proportionality where material is made available free of charge. Thus there is an exemption for people making pornographic material available where it is less than one-third of the content on the website or application on which it is made available. This will ensure that websites that do not derive a significant proportion of their overall commercial benefit from pornography are not regarded as commercial pornographic websites. However, should such a website or app be marketed as making pornographic material available, a person making pornographic material available on that website or app will be considered to be making it available on a commercial basis, even if it constitutes less than one-third of the total.

This is a proportionate way of introducing a new policy. I am confident that these measures represent the most effective way of commencing this important new policy, but my department will, of course, keep it under review. Indeed, the Secretary of State must report on the regulatory framework within 12 to 18 months of commencement. In addition, the upcoming online harms White Paper will give us an opportunity to review the wider context of this policy.

We have also laid two pieces of BBFC guidance: the Guidance on Age-verification Arrangements and the Guidance on Ancillary Service Providers. The guidance on age-verification arrangements sets out the criteria by which the BBFC will assess that a person has met the requirements of Section 14 of the Digital Economy Act to ensure that pornographic material is not normally accessible by those under 18. The criteria mandate: an effective control mechanism at the point of access to verify that a user is aged 18 or over; strict requirements on age-verification data; a requirement to ensure that “revisits” do not allow automatic re-entry; and the prevention of non-human operators exercising the age-verification regime. The BBFC has also provided examples of non-compliant features to help interested companies. The guidance on ancillary service providers sets out a non-exhaustive list of the providers that the BBFC will consider; the list is deliberately left open to ensure that this policy remains flexible to future developments. The BBFC published draft versions of both pieces of guidance and ran a four-week public consultation on their content. The draft guidance laid before this House takes account of comments received from affected companies and others.

I turn to the views of the JCSI, to which I referred earlier. We have been clear that although it will be a major step forward, age verification is not a complete answer to preventing children viewing online pornography, and we know that we are doing something difficult. Indeed, we are the first country anywhere in the world to introduce such a measure. We have considered the JCSI’s concerns carefully. We do not believe that the variation in the language of the legislation, between “met” and “applied”, will be difficult for a court to interpret. As for the content threshold, the committee anticipates difficulty with the application and interpretation of the regulation. As I have already said, the regulation will not apply in a case where it is reasonable for the age-verification regulator to assume—those words are important—that pornographic material makes up less than one-third of the content. As is stated in the BBFC guidance, the BBFC will seek to engage and work with a person who may be in contravention of the requirement before commencing enforcement action.

I am aware that the committee has also drawn the special attention of both Houses to these two draft pieces of guidance, because in its view they fail to contain the guidance required by Section 25(1) of the 2017 Act and contain material that should not have been included. Section 3, paragraph 5 of the Guidance on Age-verification Arrangements sets out the criteria for age-verification arrangements which the regulator will treat as complying with the age-verification requirement. The guidance then goes on, in paragraph 6, to give examples of features which, in isolation, do not comply with that requirement. This approach ensures fairness: it is product-neutral and, rather than recommending a particular solution, sets out principles to encourage innovation. The Guidance on Ancillary Service Providers, in Section 3, paragraph 3, provides a non-exhaustive list of classes of providers which the age-verification regulator may consider to be within scope. However, in order to ensure that this policy remains flexible to future developments, it is necessary that this list remains non-exhaustive. Where new classes of ancillary services appear in the future, the BBFC’s guidance explains the process by which those services will be informed.

The guidance includes additional material because this is a new policy and the regulator considered it important for stakeholders that its guidance set out the wider context in which it will carry out regulation. This includes valuable guidance on matters such as the BBFC’s approach and powers, and material on data protection. We find it somewhat perverse that the regulator should be prevented from including helpful guidance simply because it was not specifically mentioned in the Act.

We are also aware of the Secondary Legislation Scrutiny Committee’s special interest report. That committee raised some concerns similar to those of the JCSI; for example, on the content threshold and the requirements in the BBFC’s guidance. Our responses to the SLSC’s concerns on these points are the same as those we have just given to the JCSI’s reports.

However, the SLSC also suggested that the House might want to ask what action the Government would take to tackle pornographic material that does not fall within the criteria set out in the regulations. I appreciate that some pornography is available by means not covered by our regulations. This was the subject of extensive discussion during the passage of the Act. In particular, concern has been expressed about social media platforms. We expect those platforms to enforce their own terms and conditions and to protect children from harmful content. Indeed, the Government have been clear that online platforms must do more to protect users from such harmful content. We will set out, in the forthcoming online harms White Paper, our plans for new legislation to ensure that companies make their platforms safer.

I recognise that age verification is not a complete answer but I am proud that this Government are leading the way internationally in our actions to protect children online. I beg to move.

The Lord Bishop of Chester

My Lords, I am pleased to speak in general support of the regulations and guidance. They relate to matters which I and others raised during the passage of the Digital Economy Bill in 2017 and, more broadly, to issues debated by the House a couple of years ago in a balloted debate that I introduced. The subject of that debate was the impact of pornography on our society. While there was some disagreement over the impact of pornography on adults, there was virtual unanimity that children needed to be protected from pornography—as far as this could reasonably be achieved. I seem somehow, by default, to have become the episcopal expert on pornography. I am trying to live that down. It is just the way it has fallen—although I often find myself talking from these Benches about things I have not had much experience of.

The regulations deal with protecting children through the introduction of robust age-verification procedures for accessing at least some pornographic sites. I welcome them but I note that there remains good evidence for believing that adult access to pornography is also often harmful. The recent report on sexual harassment by the Women and Equalities Select Committee in the other place made this point in a new context, particularly in relation to violent pornography. My welcome of the regulations and guidance is also tempered by some questions which they pose, and which I would like to put to the Minister.

My main concern relates to access to pornography on websites that do not charge for access. Provided their pornographic content makes up less than one-third of their total content, they are exempt from the regulations. They may not charge, but they may make money from advertising and other sources. What is the rationale for choosing one-third and not, say, 10%? Parents really do not want their children to stumble across online pornography, and arguably children are more likely to do so on a website that does not charge in the first place. Why is it one-third? I realise that enforcement against every site would be a challenge, but surely the obligation to control access through age verification should fall on all sites which promote pornography. What we need is a culture change in relation to child protection, not a partial, piecemeal and limited approach, which I fear these regulations, in some respects, provide.

Gambling: Fixed-odds Betting Terminals

Debate between Lord Ashton of Hyde and Lord Bishop of Chester
Tuesday 10th July 2018

Lords Chamber
Lord Ashton of Hyde

No, my Lords, I did not say that.

The Lord Bishop of Chester

My Lords, the Minister has referred several times to the need to engage with the industry in order to mitigate the impact on employment. I should like to know exactly what form of mitigation the Government have in mind.

Lord Ashton of Hyde

My Lords, an inter-ministerial group drawn from different departments will engage in discussions about what the effect on employment will be in different parts of the country, and we will produce a plan. There is a limited amount that we can do, but, as I say, over the summer we will produce a plan to deal with that. When we have a plan, I will be able to tell the House about it.