Claire Perry
Claire Perry (Conservative, Devizes)
Public Bill Committees

First, I thank my hon. Friend the Member for Sheffield, Heeley for making such a clear and cogent argument for why the Bill needs further amendment. As I think she said—I am sure that she will correct me if I am wrong—we want to ensure that the Government stick to their manifesto commitment to protect children from all forms of online pornography. That will take consistency and a depth of modesty about the extent of our various levels of knowledge about how the internet works.
The hon. Member for Devizes made a good speech, and I am grateful to her for making the argument about on-demand films, as my hon. Friend the Member for Sheffield, Heeley also did, but the hon. Lady said—please correct me if I am wrong—that there were not many providers of free online pornography. I must respectfully disagree. Given the existence of peer-to-peer sharing and other forms of availability—my hon. Friend mentioned Tumblr and other social media websites—I am afraid that it is incredibly easy, as my nephews and nieces have confirmed, sadly, for a young person to access free online pornographic content in ways that most of us here might not even understand.
I am happy to clarify. My focus was on the Government’s intention to capture free and commercial pornography. The hon. Lady is absolutely right that there is a plethora of free stuff out there, and she is right to focus on the harm that it causes.
I thank the hon. Lady for that clarification. I understand from an intervention made by my hon. Friend the Member for Cardiff West that the reason why we were not allowed to remove the words “on a commercial basis” was that they were deemed out of scope. As I understand it, the word “economy”, if we stick to the letter of it, includes transactions for which there is no financial payment. There are transactions involved, and the word “digital” is in the title of the Bill, so I think it unfortunate that the amendment was not agreed to. Taking out the words “on a commercial basis” would have done a great deal to make consistent across all platforms and all forms of pornographic content available online the restrictions that we are placing on commercial ones.
I support the amendments proposed by my hon. Friend to the wording of clause 15(5)(a) and (6), for reasons that have already been given, and I want to add to the arguments. Hon. Friends and Members may have read the evidence from Girlguiding. As a former Guide, I pay tribute to the movement for the excellent work that it has done. It has contributed a profound and well-evidenced understanding of what young women are saying about online pornography. I will pick out a couple of statistics, because they make arguments to which I will refer in interventions on later clauses. That will make my speeches less long.
In the 2016 girls’ attitudes survey, half of the girls said that sexism is worse online than offline. In the 2014 survey, 66%, or two thirds, of young women said that they often or sometimes see or experience sexism online. It is a place where young women routinely experience sexism, and part of that sexism is the ubiquity of pornography. In 2015, the survey found that 60% of girls aged 11 to 21 see boys their age—admittedly, some of those are over the age of 18, but they are still the girls’ peers—viewing pornography on mobile devices or tablets. In contrast, only 27% of girls say that they see girls their age viewing pornography. The majority of those young women say from their experience that children can access too much content online and that it should be for adults only. In the survey, we see a certain degree of concord among young women in the Girlguiding movement, Opposition Members and the Government manifesto, which pledged, as my hon. Friend said, to exclude children from all forms of online pornography.
The 2015 Girlguiding survey also found that those young women felt that pornography was encouraging sexist stereotyping and harmful views, and that the proliferation of pornography is having a negative effect on women in society more generally. Those young women are the next generation of adults.
I have worked with young men who have already abused their partners. In my former job working with domestic violence perpetrators, I worked with young men of all ages; for the men my age, their pornography had come from the top shelf of a newsagent, but the younger men knew about forms of pornography that those of us of a certain age had no understanding of whatever. They were using pornography in ways that directly contribute to the abuse of women and girls, including pornography that is filmed abuse. I shall come back to that point later, but we need to recognise that young men are getting their messages about what sex and intimacy are from online pornography. If we do not protect them from online pornography under the age of 18, we are basically saying that there are no holds barred.
The hon. Member for Devizes and my hon. Friend the Member for Sheffield, Heeley mentioned loopholes. When we leave loopholes, it creates a colander or sieve for regulation. Yes, the internet is evolving and, yes, we in this Committee Room probably do not know every single way in which it already provides pornography, and certainly not how it will in future, but that is a good reason to provide a strong regulatory framework when we have the chance. We have that chance now, and we should take it. If it remains the case that removing the words “on a commercial basis” is deemed outside our scope, which I find very sad—I think it is a missed opportunity, and I hope the House can return to it at some point and regulate the free content—we must definitely ensure that we are putting everything else that we possibly can on a level playing field. That means that the regulation of video on demand has to be consistent and that we have to close any other loophole we can spot over the next few days.
I hope Opposition amendments will make the Government think about the manifesto commitment they rightly made—I am happy to put on the record that I support it—and take the opportunity to stick to it. Young women want that; young men need it, because my experience of working with young men who have abused their partners and ex-partners is that they felt that they were getting those messages from pornography; and we as a society cannot afford to ignore this problem any longer. We have a chance to do something about it, so let us take that opportunity.
The principle is that there is a distinction between those who are making money by targeting and are indifferent to potential harm and those whose services facilitate the provision of porn to those who are under age. I think it is a reasonable distinction. We are trying to deal with the mass of the problem. By its nature, it is very difficult to get to 100%. I think that leaving the Bill in this way, with flexibility for the regulator to act, has a big advantage over being overly prescriptive in primary legislation and too specific about the way in which the regulator acts, not least because disrupting the business model is the goal of trying to provide enforcement.
I support the Minister’s point about over-prescription, but perhaps he could help me by talking about a particular case. Let us take Tumblr hosting a stream of content which is rated 18. Who would the regulator target if it issued an enforcement notice? Would it be the content provider, or would it be the social media platform that is hosting that content?
In that case, the platform—I do not want to get into individual platforms, but I am happy to take my hon. Friend’s example—would likely be an ancillary service provider and therefore captured. This is a very important distinction. There is a difference between somebody who is actively putting up adult material and choosing not to have age verification, and a platform where others put up adult material, where it is not necessarily impossible but much harder to have control over the material. There is an important distinction here. If we try to pretend that everybody putting material onto a platform, for example, the one that my hon. Friend mentions, should be treated in the same way as a porn-providing website, we will be led into very dangerous territory and it will make it harder to police this rather than easier. That is my argument.
On the specific amendments, I understand entirely where the argument on on-demand services is coming from. I want to give an assurance which I hope will mean that these clauses will not be pushed to the vote. On-demand audio-visual media services under UK jurisdiction are excluded from part 3 of the Bill because they are regulated by Ofcom under part 4A of the Communications Act 2003. As my hon. Friend the Member for Devizes said, other on-demand services that are not currently regulated in the UK will be caught by the Bill regime.
Quite a lot of clarification is needed, and I hope it will come during the Bill’s passage. I do not think that the distinction between Ofcom and the BBFC is clear in this part of the Bill or in later clauses on enforcement. However, given that it states elsewhere in the Bill that the proposal is subject to further parliamentary scrutiny, and as the BBFC has not yet officially been given the regulator role—as far as I am aware—I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 66, in clause 15, page 18, line 24, at end insert
“or an internet service provider.”.
This amendment and amendment 67 ensure that the requirement to implement age verification does not fall on ISPs but on commercial sites or applications offering pornographic material; and defines internet service providers.
With this it will be convenient to discuss the following:
Amendment 90, in clause 22, page 23, line 29, leave out
“or ancillary service provider”
and insert
“, ancillary service provider, or internet service provider.”.
Amendment 77, in clause 22, page 24, line 23, at end insert “or
(c) an internet service provider.”.
This amendment and amendment 78 ensure that the definition of an ancillary service provider would include ISPs; and defines internet service providers.
Amendment 91, in clause 22, page 24, line 23, at end insert—
“(6A) In this section an “ancillary service provider” includes, but is not limited to, domain name registrars, social media platforms, internet service providers, and search engines.”.
Amendment 67, in clause 25, page 26, line 2, at end insert—
““internet service provider” has the same meaning as in section 124N of the Communications Act 2003 (interpretation);”.
See the explanatory statement for amendment 66.
New clause 8—Duty to provide a service that excludes adult-only content—
“(1) This section applies to internet service providers who supply an internet access service to subscribers.
(2) For the purposes of subsection (1), “subscribers” includes—
(a) domestic subscribers;
(b) schools; and
(c) organisations that allow a person to use an internet access service in a public place.
For the purposes of the conditions in subsections (3) and (4), if the subscriber is a school or organisation a responsible person within the school or organisation shall be regarded as the subscriber.
(3) A provider to whom subsection (1) applies must provide to subscribers an internet access service which excludes adult-only content unless all of the conditions listed in subsection (4) have been fulfilled.
(4) The conditions are—
(a) the subscriber “opts in” to subscribe to a service that includes online adult-only content;
(b) the subscriber is aged 18 or over; and
(c) the provider of the service has an age verification scheme which meets the standards set out by OFCOM in subsection (4) and which has been used to confirm that the subscriber is aged 18 or over before a user is able to access adult-only content.
(5) It shall be the duty of OFCOM, to set, and from time to time to review and revise, standards for the—
(a) filtering of adult content in line with the standards set out in Section 319 of the Communications Act 2003;
(b) age verification policies to be used under subsection (4) before a user is able to access adult content; and
(c) filtering of content by age or subject category by providers of internet access services.
(6) The standards set out by OFCOM under subsection (5) must be contained in one or more codes.
(7) Before setting standards under subsection (5), OFCOM must publish, in such a manner as they think fit, a draft of the proposed code containing those standards.
(8) After publishing the draft code and before setting the standards, OFCOM must consult relevant persons and organisations.
(9) It shall be the duty of OFCOM to establish procedures for the handling and resolution of complaints in a timely manner about the observance of standards set under subsection (5), including complaints about incorrect filtering of content.
(10) OFCOM may designate any body corporate to carry out its duties under this section in whole or in part.
(11) OFCOM may not designate a body under subsection (10) unless, as respects that designation, they are satisfied that the body—
(a) is a fit and proper body to be designated;
(b) has consented to being designated;
(c) has access to financial resources that are adequate to ensure the effective performance of its functions under this section; and
(d) is sufficiently independent of providers of internet access services.
(12) It shall be a defence to any claims, whether civil or criminal, for a provider to whom subsection (1) applies to prove that at the relevant time they were—
(a) following the standards and code set out in subsection (5); and
(b) acting in good faith.
(13) Nothing in this section prevents any providers to whom subsection (1) applies from providing additional levels of filtering of content.
(14) In this section—
“adult-only content” means material that contains offensive and harmful material from which persons under the age of 18 are protected;
“age verification scheme” is a scheme to establish the age of the subscriber;
“internet access service” and “internet service provider” have the same meaning as in section 124N of the Communications Act 2003 (interpretation);
“material from which persons under the age of 18 are protected” means material specified in the OFCOM standards under section 2;
“OFCOM” has the same meaning as in Part 1 of the Communications Act 2003;
“offensive and harmful material” has the same meaning as in section 3 of the Communications Act 2003 (general duties of OFCOM); and
“subscriber” means a person who receives the service under an agreement between the person and the provider of the service.”.
This new clause places a statutory requirement on internet service providers to limit access to adult content by persons under 18. It would give Ofcom a role in determining the age verification scheme and how material should be filtered. It would ensure that ISPs were able to continue providing family friendly filtering once the net neutrality rules come into force in December 2016.
New clause 11—Power to make regulations about blocking injunctions preventing access to locations on the internet—
“(1) The Secretary of State may by regulations make provision about the granting by a court of a blocking injunction in respect of a location on the internet which the court is satisfied has been, is being or is likely to be used for or in connection with an activity that is contravening, or has contravened, section 15(1) of this Act.
(2) “Blocking injunction” means an injunction that requires an internet service provider to prevent its service being used to gain access to a location on the internet.
(3) Regulations introduced under subsection (1) above may, in particular—
(a) make provision about the type of locations against which a blocking injunction should be granted;
(b) make provision about the circumstances in which an application can be made for a blocking injunction;
(c) outline the type of circumstances in which the court will grant a blocking injunction;
(d) specify the type of evidence, and other factors, which the court must take into account in determining whether or not to grant a blocking injunction;
(e) make provision about the notice, and type of notice, including the form and means, by which a person must receive notice of an application for a blocking injunction made against them; and
(f) make provision about any other such matters as the Secretary of State considers are necessary in relation to the granting of a blocking injunction by the court.
(4) Regulations under this subsection must be made by statutory instrument.
(5) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before, and approved by a resolution of, each House of Parliament.
(6) In this Part—
“internet service provider” has the same meaning as in section 16 of the Digital Economy Act 2010.
In the application of this Part to Scotland “injunction” means interdict.”.
This new Clause empowers the Secretary of State to introduce regulations in relation to the granting of a backstop blocking injunction by a court. The injunction would require an internet service provider to prevent access to a site or sites which do not comply with the age-verification requirements. This would only be used where the other enforcement powers (principally fines) had not been effective in ensuring that sites put in place effective age-verification.
I welcome the Minister’s previous comments, which gave me some real assurances on the parity of content and regulator. I also reassure him of how popular he will be when the Bill finally passes—the Centre for Gender Equal Media said that, in its most recent survey, 86% of people support a legal requirement on companies to prevent children’s access to pornography. We are moving in the right direction.
Amendment 66 seeks to pick through slightly more carefully who is responsible and is captured by the Bill’s language. There are four internet service providers in the UK through which the majority of broadband internet traffic travels, and they have come a long way. Five years ago, they accepted none of our proposals, be it single click protection for all devices in the home or the implementation of a filtering system that required selection—we could not select whether or not the filters were on. They have gone from that to the position now whereby, in some cases, we have ISPs that provide their services with the filters already on as default—something that we were told was absolutely unimaginable. With that regime, the level of complaints is very low and the level of satisfaction is very high.
Amendment 67 is consequential on amendment 66 and both seek to clarify the scope of who exactly would be covered under the wording of clause 15(1), which states:
“A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18.”
The Government have made it quite clear in the consultation, and the Minister clarified in his previous remarks, that the proposals apply to companies running websites aimed specifically at providing pornographic content for commercial gain, and that they want those who profit from such material being made available online to act in a legal, socially responsible way. It could be argued that ISPs both profit from the material being made available online and also make pornographic material available online, even though they are not the original source of the material. We also heard from the Minister that he is minded to consider social media platforms in that same category. In my view, the regulator must also publish guidance under clause 15(3) about
“circumstances in which the regulator will treat an internet site or other means of accessing the internet as operated or provided on a commercial basis”.
It is my concern that that could also be read as applying to ISPs. The amendments are intended to clarify that. In fact, I can quote from an article from July, which said:
“Internet access providers are likely to feel left in an uncertain position at the moment as, while the Bill does not reference them in this context, the definition of ‘makes pornographic material available’ could be argued as incorporating companies which provide connectivity to servers used for the making available of pornographic material”,
and piping that material into the home.
Paragraph 22 of the explanatory notes makes reference to “commercial providers of pornography”, and that obviously appears to place the onus of this suite of measures firmly on the content providers, but an optimal approach would be to improve the drafting to make the legislative attempt clear. I know we will have further discussions about the role of ISPs, but ISPs have done what we have asked them to do in introducing family friendly filters.
I am trying to understand why the hon. Lady believes that ISPs should not have this responsibility.
Because various other aspects of the Bill capture ISPs. My concern is that the Bill focuses on the commercial content providers where they are. The amendment is intended to probe the Government about how they are thinking about ISPs vis-à-vis commercial content providers in the drafting of the clause.
Our amendments are designed to enable the regulator to ask the internet service provider to block offending sites. This goes back to the point we made earlier on the differences between sites operated “on a commercial basis” and social media sites and ancillary sites. The proposals as they stand do not give the regulator sufficient powers to enforce the mechanisms proposed in the Bill.
Broadening the definition of “ancillary service provider” specifically to include internet service providers would require the regulator to notify them of non-compliant sites. That will put ISPs in the same bracket as payment service providers, which will be required to withdraw their services if other measures have been exhausted. In the case of ISPs, they would be required to block offending sites.
The amendments would create a simple backstop power where enforcement through the Government’s proposals had not achieved its intended objective and commercial providers had not withdrawn their services, either because the fine does not act as a deterrent or because, due to their international status, they do not need to comply. If pornography providers continued to provide content without age verification restrictions, the regulator would then have the power to require ISPs to take down the content.
We believe that, without amendment, the proposals will not achieve the Bill’s aim, as non-compliant pornographers would not be absolutely assured of payment services being blocked. First, the proposals do not send anywhere near a strong enough signal to the porn industry that the Government are serious about the proposals and their enforcement. Giving the regulator the power but not the stick suggests that we are not all that bothered about whether sites comply. Secondly, we can have no reassurance that sites will be shut down within any kind of timeframe if there is non-compliance. As drafted in the explanatory notes, “on an ongoing basis” could mean yearly, biannually or monthly, but it makes a mockery of the proposals if sites could be non-compliant for two years or more before payment services may or may not act. That does not provide much of an incentive to the industry to act.
Throughout the evidence sessions we heard that there are significant difficulties with the workability of this entire part of the Bill. For instance, many sites will hide their contact details, and a substantial number will simply not respond to financial penalties. Indeed, an ability already exists in law for ISPs to be compelled to block images that portray, for example, child sex abuse. There is also an ability to block in the case of copyright infringement. It therefore seems eminently reasonable that in the event of non-compliance, the regulator has a clear backstop power. We believe that even just legislating for such a power will help speed up enforcement. If providers know that they cannot simply circumvent the law by refusing to comply with notices, they will comply more efficiently. That will surely help the age verifier to pass the real-world test, which is integral to the Bill’s objectives.
On a point of order, Mr Stringer. Perhaps this shows my ignorance of doing Committees from the Back Benches, but I intended to go on in my speech to discuss new clause 8, which I have tabled and which defines more clearly what I expect internet service providers to do. Would it be in order for me to deliver those remarks, or have I lost my opportunity?
Let me be clear: we are considering amendment 66 to clause 15, amendments 90, 77, 91 and 67, and new clauses 8 and 11. Members can speak more than once in Committee if they wish to. The hon. Lady has the right to discuss her new clause.
May I please rise again, then? Apologies to the Committee—[Interruption.] I am so sorry; the hon. Member for Bristol West was speaking.
I defer to the hon. Lady. She mentioned something she is going to say in due course; I look forward to hearing it. Nevertheless, I stand by my comments. We need to be clear about whether we are going to fail to require ISPs to do something that we already require them to do for copyright infringement and other forms of pornography involving children. I fail to see what the problem is. Having a blocking injunction available to the regulator would give them another tool to achieve the aim that we have all agreed we subscribe to, which is being able to block pornography from being seen by children and young people.
Mr Stringer, I assume that, like me, you sometimes have the feeling that you have sat down before you have finished what you are saying. I apologise to the Committee. I am rarely short of words, but in this case I was.
I want to respond to the point made by the hon. Member for Bristol West and clarify exactly what we have asked and should be asking internet service providers to do. In doing so, I shall refer to the new EU net neutrality regulations, which, despite the Brexit vote, are due to come into force in December. They cause many of us concerns about the regime that our British internet service providers have put in place, which I believe leads the world—or, at least, the democratic free world; other countries are more draconian—in helping families to make these choices. We do not want all that good work to be unravelled.
Our current regime falls foul of the regime that the European Union is promoting, and unless the Government make a decision or at least give us some indication relatively quickly that they will not listen to that, we may have an issue in that all the progress that we have made may run out by December 2016. I would be grateful if the Minister told us what the Government are doing to get the new legislation on the statute book in line with the schedule set out by his colleague Baroness Shields last December.
We have an effective voluntary filtering arrangement. I believe—I think that this point is in the scope of ancillary service providers—that we intend to capture internet service providers as part of the general suite of those responsible for implementing over-18 verification, but I want the Government to make crystal clear that they are aware of the responsibilities of internet service providers and intend for the regulator to include them in the basket of those that they will investigate and regulate.
The big missing link in all this has been getting content providers that provide material deemed to be pornographic to do anything with that material. The difference is that content providers of, say, gambling sites have always been required to have age-verification machinery sitting on their sites.
The hon. Member for Bristol West is quite right that we want ISPs to be captured under this regulatory regime, but I am keen to hear from the Minister that all the work that we have done with ISPs that have voluntarily done the socially and morally responsible thing and brought forward family-friendly filters will not be undone by December 2016, when the EU net neutrality regulations are intended to come into place.
Quite a lot of points have been raised, and I seek to address them all. Clause 22 is an important provision containing the powers at the heart of the new regime to enable the age-verification regulator to notify payment service providers and ancillary service providers that a person using their services is providing pornographic material in contravention of clause 15 or making prohibited material available on the internet to persons in the UK.
Amendments 66, 67, 77, 78, 90 and 91 would provide that the requirement to implement age verification does not fall on ISPs and further clarify that ISPs are to be considered ancillary service providers. Amendment 91 would clarify that as well as ISPs, domain name registrars, social media platforms and search engines are all to be considered ancillary service providers for the purposes of clause 22, which makes provision for the meaning of “ancillary service provider”.
This is a fast-moving area, and the BBFC, in its role as regulator, will be able to publish guidelines for the circumstances in which it will treat services provided in the course of business as either enabling or facilitating, as we discussed earlier. Although it will be for the regulator to consider on a case-by-case basis who is an ancillary service provider, it would be surprising if ISPs were not designated as ancillary service providers.
New clause 8 would impose a duty on internet service providers to provide a service that excludes adult-only content unless certain conditions are met. As I understand it, that measure is intended to protect the position of parental filters under net neutrality. However, it is our clear position that parental filters, where they can be turned off by the end user—that is, where they are a matter of user choice—are allowed under the EU regulation. We believe that the current arrangements are working well. They are based on a self-regulatory partnership and they are allowed under the forthcoming EU open internet access regulations.
I think I understand the Minister to be saying that in cases where companies have introduced filters that are on by default, the fact that the users can choose to turn those filters off in the home means that they would not be captured by the net neutrality rules. Is that correct?
David Austin of the BBFC said:
“We see this Bill as a significant step forward in terms of child protection.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 42, Q94.]
We think, on balance, that the regulator will have enough powers—for example, through the provisions on ancillary service providers—to take effective action against non-compliant sites. For that reason, I think this is the appropriate balance and I ask my hon. Friend the Member for Devizes to withdraw her amendment.
I think that we are running through two definitions of ISPs: one relating to ancillary service providers and the other to enforcement and blocking. If we include ISPs in the definition of ancillary service providers, we want to make sure that they are captured, either explicitly or as a service provider. Is the Minister saying that he is comfortable with the enforcement regime without blocking? Would it require further legislation for blocking to be carried out if the regulator felt it was an appropriate measure? Are we ruling that out in this legislation?
Order. The hon. Lady is making a speech. If the Minister wants to intervene, he may.
I apologise. I would like to conclude my speech by inviting the Minister to respond.
I thank my hon. Friend for giving way. I would like to provide a point of clarity on the speech she has made. Treatment of an ASP will not lead to blocking. I think that is the answer to her question.
I thank the Minister for that intervention, and for clarifying some of the murkiness around definitions in the Bill. We will return to this subject in a series of amendments around clause 20. I want to ask him and his team, though, to consider what his colleague said, which goes back to the net neutrality point.
I accept what the Minister says about the spirit being absolutely clear, that our current filtering regime will not be captured, but Baroness Shields did say that we needed to legislate to make our filters regime legal. I did not hear from the Minister that that legislation is something that the Department is preparing or planning to introduce.
We very much share the hon. Lady’s concerns that the legislation has explicitly excluded the ability of internet service providers to block. We simply cannot understand why the Government have ruled out that final backstop power. We appreciate that it is not perfect, but it would give the regulator that final power. We will return to new clause 11 at the end of the Bill and will push it to a vote when we come to it.
I thank the hon. Lady for making her intentions clear. I am prepared to withdraw or not push my new clause to a vote on the basis of what the Minister said, but I would love to get his assurances—perhaps he will write to me—to be crystal clear on the fact that he believes the Government do not have to legislate in order to push back on the net neutrality regime.
Before the hon. Lady sits down, she did mention the view of Baroness Shields that there should be new legislation. Notwithstanding our remarks about the number of Government amendments, does the hon. Lady believe this Bill could be a useful vehicle to achieve that?
Given the Brexit vote, I would be inclined to accept a letter from the Minister suggesting that we will absolutely resist any attempt to make EU net neutrality apply to what is a very fine, though not perfect, voluntary regime. On that basis, I accept the Minister’s assurances that that is what he intends to do. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 15 ordered to stand part of the Bill.
Clause 16 ordered to stand part of the Bill.
Clause 17
The age-verification regulator: designation and funding
Question proposed, That the clause stand part of the Bill.
In this and related clauses, we seek to strengthen the proposals that the Government have put forward. We have said that the regulation needs to be beefed up to require internet service providers to be notified about non-compliance. We would like to see an injunction power to take down any content which a court is satisfied is in breach of the age-verification legislation, as soon as possible, at the start of the four-tier regulation process the Government have identified in their amendments and letters published to the Committee last week.
That would require a regulator with sufficient enforcement expertise and the ability to apply that injunction and push enforcement at an early stage. As we are aware, however, the BBFC heads of agreement with the Government do not cover enforcement. Indeed, it made perfectly clear that it would not be prepared to enforce the legislation in clauses 20 and 21 as they stand, which is stage 4 of that enforcement process, giving the power to issue fines. The BBFC is going to conduct stages 1, 2 and 3 of the notification requirements, presumably before handing over to a regulator with sufficient enforcement expertise, but that has not been made clear so far.
While we welcome the role of the BBFC and the expertise it clearly brings to classification, we question whether it is unnecessarily convoluted to require a separate regulator, so far not mentioned in the legislation, to take enforcement action that will effectively have been begun by the BBFC. This goes back to the point my hon. Friend the Member for Cardiff West made earlier about the two separate regimes for on-demand programme services.
As I understand it, although it is not clear, the BBFC will be taking on stage 3 of the regulation, meaning it will be involved in the first stage of enforcement—notification. That is fine, but it will then have to hand over the second stage of enforcement to another regulator—presumably Ofcom. The enforcement process is already incredibly weak, and this two-tiered approach involving two separate regulators risks further delays in enforcement against non-compliant providers who refuse to protect children or take down material that is in breach of the law. In evidence to the Committee, the BBFC said:
“Our role is focused much more on notification. We think we can use the notification process and get some quite significant results.”—[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q83.]
We do not doubt it, but confusion will arise when the BBFC identifies a clearly non-compliant site that is brazenly flouting the law but does not have the power to enforce quickly and will have to hand the matter over.
We would also like to hear when the Government are planning to announce the regulator for the second stage and how they intend to work with the BBFC. As far as I can see, this will require further amendments to the Bill. If it is Ofcom, it would have been helpful to have heard its views on what further enforcement powers it would like to see in the Bill, rather than being asked to fill in after the Bill has passed through Parliament. There is a clear danger that the enforcement regulator could be asked to take over enforcement of age verification, which it thinks requires more teeth to be effective.
We therefore have very serious concerns about the process by which clause 17 will have effect. Although we will not vote against the clause, we want to make it very clear that we would have preferred an official announcement about who will carry out the enforcement provisions in the Bill before being asked to vote on it.
I would expect that to happen immediately. The question of the designation of the backstop enforcement regulator does not stop or preclude the BBFC from getting going on this. As we have heard, it is already working to put in place its own internal systems. As I have just said to the Committee, we have a new commitment that we expect to commence the provisions in terms of getting the system up and running within 12 months of Royal Assent; after that, if the BBFC has designated that there is a problem, I would expect action to be immediate, because I expect the BBFC to ensure through good relations that systems are in place.
I see enforcement very much as a back-up to good behaviour. As we have seen with the taking down of child pornography and material related to terrorism, many providers and platforms respond rapidly when such material is identified. It will be far better if the system works without having to resort to enforcement. We will set out in due course who is best placed to be the regulator for enforcement, but the system is new, and the approach provides the level of flexibility that we need to get it right. I have every confidence in the BBFC’s ability and enthusiasm to deliver on these aims, so I commend the clause to the Committee.
Question put and agreed to.
Clause 17 accordingly ordered to stand part of the Bill.
Clauses 18 and 19 ordered to stand part of the Bill.
Clause 20
Enforcement of sections 15 and 19
I beg to move amendment 68, in clause 20, page 21, line 5, at beginning insert
“If the person in contravention of section 15(1) is resident in the United Kingdom,”.
This amendment and amendments 69, 70, 71, 72, 73 and 74 place a requirement on the age-verification regulator to impose fines where a UK person has contravened clause 15(1), unless the contravention has ceased; or to issue an enforcement notice to a person outside of the UK who has contravened clause 15(1).
With this it will be convenient to discuss the following:
Amendment 69, in clause 20, page 21, line 5, leave out “may” and insert “must”.
See the explanatory statement for amendment 68.
Amendment 70, in clause 20, page 21, line 7, after “15(1)”, insert “, unless subsection (5) applies”.
See the explanatory statement for amendment 68.
Amendment 71, in clause 20, page 21, line 10, at beginning insert
“If the person in contravention of section 15(1) is not resident in the United Kingdom,”.
See the explanatory statement for amendment 68.
Amendment 72, in clause 20, page 21, line 10, leave out “may” and insert “must”.
See the explanatory statement for amendment 68.
Amendment 73, in clause 20, page 21, line 16, leave out subsection (4).
See the explanatory statement for amendment 68.
Amendment 74, in clause 20, page 21, line 42, leave out “may” and insert “must”.
See the explanatory statement for amendment 68.
This is a series of consequential and investigatory amendments intended to probe the Minister’s thinking about what the regulator can actually do. At the moment, enforcement operates through a series of financial penalties, which we can discuss further when we debate clause 21, or of enforcement notices. We heard clearly last week from David Austin that the challenge is that almost none of the content-producing sites that we are discussing are based in the UK; in fact, I think he said that all the top 50 sites that the regulator will rightly target are based overseas.
The challenge is how the Government intend to carry out enforcement. I know that the BBFC’s current enforcement role is not carried out through its own designated powers; it is carried out through various other agencies, and the Bill makes further provision for financial penalties. I tabled the amendments to press the Minister on the point that it would be clearer to specify that where a site, or the company that owns a site, is based in the UK, a financial penalty can and will be applied.
For overseas sites, enforcing a financial penalty, if one can even get to grips with what the financial accounts look like, may be difficult, hence the enforcement notice and then a series of other potential backstop actions; I know that the Minister is aware that I do not feel that we have exhausted the debate on blocking. I am trying to probe the Government on whether there is a way to use the Bill to reflect the reality that content providers are unlikely to be based primarily in the UK, and that perhaps a different approach is needed for those based offshore.
We completely support the hon. Lady’s amendments, which propose a sensible toughening up of the requirements on the age-verification regulator. We particularly welcome the measures to require the regulator to issue enforcement notices to people outside the UK if they do not comply. That is an attempt to close a large hole in the current proposals. How will the BBFC tackle providers outside the UK?
At the evidence session last week, David Austin said that
“you are quite right that there will still be gaps in the regime, I imagine, after we have been through the notification process, no matter how much we can achieve that way, so the power to fine is essentially the only real power the regulator will have, whoever the regulator is for stage 4”;
we are not yet certain.
He continued:
“For UK-based websites and apps, that is fine, but it would be extremely challenging for”
the BBFC, Ofcom or whoever the regulator is for stage 4
“to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator’s arsenal.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q91.]
That is precisely why we will return to the amendment on ISP blocking, because if we are to pursue foreign-based providers, the ability to block will be integral to that strategy.
I am thankful for the opportunity to respond. I will actually respond to the points made about these amendments, which were tabled by my hon. Friend the Member for Devizes, rather than the reiteration of the blocking debate, which we have had and will no doubt have again on further clauses.
First, clause 17 clearly makes provision for the Secretary of State to designate more than one person as a regulator. Secondly—a crucial point—the complexity in regulation is deciding who is satisfying the rules and who is not, and that is for the BBFC to determine, whereas issuing fines is essentially a matter of execution and could be fulfilled by a variety of bodies. We will come forward with more detail on that in due course.
I think the whack-a-mole analogy inadvertently made the point, which is that when we are trying to deal with a problem on the internet, where people can move about, we can deal with the mainstream of the problem, which comes from reliable providers of adult material, who are already engaged and want to ensure they comply with the law. In future, once this measure becomes law, refusing to put age verification on adult material will be illegal, so we will be dealing with illegal activity. That will mean that the vast majority of people will comply with the law, and we heard that very clearly in the evidence session. The question then is how to deal with non-compliance and on the internet we know that that is very difficult. The proposals are to deal with non-compliance by disrupting business models and by imposing financial penalties.
I understand what my hon. Friend is trying to do. She is trying to strengthen the imposition of financial controls. Inadvertently, however, her amendments would reduce the regulator’s discretion by obliging it to apply sanctions whenever they are available, and they would remove the power to apply financial penalties to non-UK residents.
We want to be able to fine non-UK residents—difficult as that is—and there are international mechanisms for doing so. They do not necessarily reach every country in the world, but they reach a large number of countries. For instance, Visa and other payment providers are already engaged in making sure that we will be able to follow this illegal activity across borders.
Therefore, while I entirely understand where my hon. Friend is coming from, the amendments would inadvertently have the effect of removing the ability to apply an enforcement notice to a UK resident, although I am certain that that is not what she intended. So I resist the amendment but I give her the commitment that we have drafted the clause in such a way as to make it as easy as possible for the enforcement regulator to be able to take the financial route to enforcement.
On the point made by the hon. Member for Berwickshire, Roxburgh and Selkirk, the provisions do extend to Scotland, with necessary modifications to Scottish law. I am sure that he, like me, will have seen clause 17(5) and clause 20(11)(b), which refer to modifications needed to be consistent with Scottish law. On the basis of that information, I hope that my hon. Friend will withdraw the amendment.
I thank the Minister for that clarification and for the mention of support. The intention was to help to provide a practical solution rather than to cut across his aims. He has persuaded me that I do not need to press the amendment to a vote. Although I take the point about shared regulation, I would ask him, in setting up the BBFC as the primary regulator, to consider that while it is working reasonably well in the video-on-demand world, this may have it straying into a new sphere of expertise in finding, identifying and sending out enforcement notices or penalties, particularly for foreign-based companies. I think the whack-a-mole analogy is entirely apt—such companies will shut their doors and reopen in another jurisdiction almost overnight. Given the anonymity principles, it is sometimes almost impossible to know where they actually are. If the Minister is assuring us that everyone is aware of the problem, that he believes the powers allow the regulator to be flexible, and that it is something his Department will consider, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 86, in clause 20, page 21, line 40, leave out paragraph (b) and insert—
“(b) during the initial determination period fix the date for ending the contravention of section 15(1) as the initial enforcement date.”.
I will not test the Committee’s patience further by going over arguments that we have already had, but there is one further area of clause 20 that we wish to touch on—the lack of an appeals process in the legislation. The Minister may expect the regulator to build that appeals process in: it would be helpful to have some clarity from him on that.
As I understand it, the BBFC will use analytics to identify sites that should have age verification. Analytics are not foolproof, so obviously an appeals mechanism will be needed for websites incorrectly prevented from operating. Previous such systems have wrongly filtered out websites such as breast cancer charities or forums for gay and transgender people. That is incredibly important: let us put ourselves in the shoes of a young gay man or woman, growing up in a religious household perhaps, who does not know where to turn to ask the questions that would plague any teenager coming to terms with their sexuality and who seeks refuge and solace in internet forums with other people going through the same issues. As risky as the internet can be, it can also be an incredibly empowering, transformative space that can literally save lives in such situations. Such lifelines must absolutely not be filtered out by ASPs or made subject to age verification; the Bill should include a mechanism that allows for correction when they have been mistakenly identified.
We also need clarification on who will develop the analytics, the data they will be based on and whether it will be done in consultation with the tech industry. We can only assume that this is an oversight that will be corrected when working out how the regulator is to proceed.
The hon. Lady raises an important point about access to information about sex education, sexuality, abortion and all sorts of things that are incredibly valuable. She is right to draw attention to safe forums. I reassure her that many of the same issues came up with respect to the question of voluntary filtering and, despite what some of those giving evidence said, the incidence of false blocking of such valuable sites is incredibly low. The BBFC as regulator is really good: it is not in the business of defining based on imagery, and it has fairly detailed algorithms. I share her concern, but I want to offer some comfort.
I am grateful. I heard the BBFC or the Open Rights Group say that the incidence was very low, but it would do no harm to build an appeals process into the legislation to ensure that where sites that should not be blocked or require age verification have fallen through the cracks, that can be resolved at the behest of the regulator.
The hon. Lady is absolutely correct that there needs to be an appeals process. That process is provided for in clause 17(4):
“The Secretary of State must not make a designation under this section unless satisfied that arrangements will be maintained by the age-verification regulator for appeals”.
I agree with everything else she said. It is worth remarking on the recent announcement that gay and bisexual men will now be pardoned over abolished sexual offences—that is not in the Bill, so that remark was completely out of order, but I still think it was worth making. Appeals are important; I hope she is satisfied that they are provided for.
Question put and agreed to.
Clause 20 accordingly ordered to stand part of the Bill.
Clause 21 ordered to stand part of the Bill.
Clause 22
Age-verification regulator’s power to give notice of contravention to payment service providers and ancillary service providers
I beg to move amendment 75, in clause 22, page 23, line 28, at end insert “; and
(c) the person has been the subject of an enforcement notice under section 20(2) and the contravention has not ceased.”
With this it will be convenient to discuss the following:
Amendment 76, in clause 22, page 23, line 29, leave out “may” and insert “must”
This amendment places a requirement on the age-verification regulator to give notice to payment or ancillary service providers that a person has contravened clause 15(1) or is making prohibited material available on the internet to persons in the United Kingdom.
Amendment 79, in clause 22, page 24, line 24, leave out “may” and insert “must”
This amendment places a requirement on the age-verification regulator to issue guidance about the services that it determines are enabling or facilitating the making available of pornographic or prohibited content.
New clause 6—
“Requirement to cease services to non-complying persons
‘(1) Where the age-verification regulator has given notice to a payment-services provider or ancillary service provider under section 22(1), the payment-services provider or ancillary service provider must cease the service provided to the person making pornographic material available in the United Kingdom.
(2) A payment-services provider or ancillary service provider who fails to comply with a requirement imposed by subsection (1) commits an offence, subject to subsection (3).
(3) No offence is committed under subsection (2) if the payment-services provider or ancillary service provider took all reasonable steps and exercised all due diligence to ensure that the requirement would be complied with.
(4) A payment-services provider or ancillary service provider guilty of an offence under subsection (2) is liable, on summary conviction, to a fine.
(5) In this section “payment-services provider” and “ancillary service provider” have the same meaning as in section 22.”
This new clause requires payment and ancillary services to block payments or cease services made to pornography websites that do not offer age-verification if they have received a notice of non-compliance under section 22(1). This provision would only apply to websites outside of the UK. This would enhance the enforcement mechanisms that are available under the Bill.
New clause 18—Approval of age-verification providers—
‘(1) Age-verification providers must be approved by the age-verification regulator.
(2) In this section an “age-verification provider” means a person who appears to the age-verification regulator to provide, in the course of a business, a service used by a person to ensure that pornographic material is not normally accessible by persons under the age of 18.
(3) The age-verification regulator must publish a code of practice to be approved by the Secretary of State and laid before Parliament.
(4) The code will include provisions to ensure that age-verification providers—
(a) perform a Data Protection Impact Assessment and make this publicly available,
(b) take full and appropriate measures to ensure the accuracy, security and confidentiality of the data of their users,
(c) minimise the processing of personal information to that which is necessary for the purposes of age-verification,
(d) do not disclose the identity of individuals verifying their age to persons making pornography available on the internet,
(e) take full and appropriate measures to ensure that their services do not enable persons making pornography available on the internet to identify users of their sites or services across differing sites or services,
(f) do not create security risks for third parties or adversely impact security systems or cyber security,
(g) comply with a set standard of accuracy in verifying the age of users.
(5) Age-verification Providers must comply with the code of practice.
(6) To the extent that a term of a contract purports to prevent or restrict the doing of any act required to comply with the Code, that term is unenforceable.”
We promised to return to the topic of enforcement and blocking, and we have reached it today. That is very good; it suggests that our progress on the Bill is excellent.
The purpose of these amendments and new clause 6 is to clarify and strengthen the enforcement process. We have already discussed fruitfully how clause 20 will be used, particularly for sites based overseas, and I was reassured by what the Minister said, but I want to turn to the “what ifs”. What happens if the regulator acts, with clarity about whether it is imposing a fine or an enforcement notice, and nothing actually happens—none of the sanctions in the current regime leads to a website imposing age verification? I welcome what the Bill says about involving a direct relationship not just between the regulator and the platform or website, but with the payment providers. As the Minister said, cutting off the business model—the cash flow—is a very effective way of making enforcement happen.
I have a series of questions relating to the process. First, it is not clear when the regulator will inform providers that such a contravention is happening. Some questions were asked about how long it will be and what the time period will be, but when does the regulator actually issue a notice? Amendment 75 states that the regulator has a power to issue a notice under clause 22 when an enforcement notice has been issued and the contravention has not yet ceased. I think websites ought to be given the opportunity to respond to the regulator’s intervention before the payment providers and ancillary services are involved. That process should be very clear. It is the same if we have an issue with service provision at home: we know what our rights are, what period of time we have to complain and what happens when that period expires.
Secondly, as I read the Bill—I am in no way setting myself up as somebody who understands every aspect of the legal jargon—there appears to be no requirement for the regulator to inform the payment providers and ancillary services of a contravention. It may just be implicit, but amendment 76 would make it mandatory for the regulator to inform the payment providers and ancillary services if there were a contravention. I would be interested to hear the Minister’s views on that.
I am pleased that we have returned to enforcement and compliance, and I hope we are going to spend more time on blocking. The hon. Lady’s amendment uses the term “ancillary service provider”, to which she referred earlier. I would be very grateful if she spent some time spelling out in a bit more detail what an ancillary service provider is. Does it include ISPs? I think she alluded to that earlier, but I am not sure. Can she help clear up the confusion with some detail, please?
I apologise if I have caused any confusion. I will let the Minister specify exactly what he thinks. In tabling these amendments, I wanted to ensure that as wide a group of people and companies as possible is involved in doing something we all think is very valuable—implementing these age verification mechanisms. As I read the Bill as drafted, it does not contain a clear distinction between ISPs and ancillary service providers; they are included in the same bucket. I want to clarify that I think that both ISPs and ancillary service providers—in my mind, ancillary service providers are the platforms that we discussed by name earlier—have a duty and a legal responsibility to ensure that the age-verification mechanisms are in place.
The hon. Lady will have to forgive me. We are going to hear from the Minister shortly, but I would like to know if, in her amendment, ancillary service providers definitely include internet service providers. I know it is a difference of just one word, but I would be grateful for her clarification.
I share some of the hon. Lady’s uncertainty—I was going to say confusion, but it is not—about the terminology. Would the definition include, for example, telecoms providers over whose networks the services are provided?
I am perhaps going to let the Minister spell that out exactly. The hon. Gentleman raises a very important point: we all know now that access to internet services is often done entirely over a mobile network. I can again give some comfort on this issue. The BBFC, which is an excellent choice, has worked for many years with the mobile service providers—a witness gave evidence to this effect—so they already offer a blocking service based on the BBFC’s definition of 18-plus and 18-minus material. It is essentially an opt-in service. Someone has to say that they are under 18 and checks are carried out. The providers already offer the service, and it seems to work reasonably effectively.
I apologise for inadvertently misleading the Committee—perhaps it reflects some of the confusion in the wording—and I want to be very clear about who we are trying to capture with the amendments. We would all support the idea of spreading the net as widely as possible in ensuring the right behaviour, but it is important to make clear that ISPs are expected, and legally mandated, to carry out the same checks.
Another point I wanted to make with amendment 79 was to ask the regulator to issue guidance on the sort of businesses that will be considered to be ancillary services. The reason for putting that in the Bill is that, as we debated extensively in earlier sittings, the world changes. We had very good debates about why 10 megabits per second might not be appropriate in a couple of years’ time and why the USO as originally construed was laughably small. We all try to do the right thing, but of course the world changes. The reference by the hon. Member for City of Chester to whack-a-mole was interesting. What will the consequences be of implementing the Bill? The UK is a very substantial revenue stream for many websites, and new service models might arise. Someone might be scrutinising the letter of the law and thinking, “We are not captured by this, so we are not captured by these regulations.” Asking the regulator to issue guidance on the types of businesses that will be considered to be ancillary services could future-proof some of the Bill.
I am grateful for the hon. Lady again allowing me to intervene. I apologise for interrupting her sentence; that was not my intention. I am pleased to see her amendments. This discussion is helping me and perhaps all of us to come to some form of understanding. I have a little metaphor in mind. If a cinema was allowing children to see pornography, we would hold the ticket seller responsible, as well as the organisation running the cinema, but not the bus driver who drove the bus the child took to get to the cinema. Does that metaphor help?
It depends whether the bus driver was paid for by the cinema. That is the point. Businesses pop up. There might be a bespoke Odeon cinema. My point is that we need to ensure that the regulator has as much flexibility as possible to respond to changing definitions. The current definition of an ancillary service provider is quite clear, although I would like the Minister to clarify it, but my amendment would try to future-proof the definition.
In raising the issue of whether the bus driver was paid for by the cinema, the hon. Lady has helped me to hit on something else. Are we not considering the role of search engines in this matter and whether they are driving things or complicit? I do not know the answer to that question. She has raised a helpful analogy in response to my analogy.
How long has the Committee got to hear about search engines? The hon. Lady raises a fascinating point. It was through a very strong cross-party effort and with the leadership of the former Prime Minister that we got the search engines to do some compelling things. Let me give her an example. It was clear that search engines in Europe were happy to allow terms to be typed in that could only lead to sexual images of child abuse being returned. I had the important but unenviable job, as the Prime Minister’s special adviser on the issue, of sitting down with the parents of April Jones, the little girl murdered in Wales, and trying to explain to them why, when their daughter’s killer typed in “naked little girls in glasses”, they received an image. It took many levels of conversation, including a personal conversation between me and the head of Google Europe, saying, “How do you as a parent feel about this? I don’t care about you saying ‘We serve up everything at all times’; I don’t care that the search terms themselves are not illegal. What I care about is your duty. You have a duty to do no evil, and in my view, you are breaching that.”
This is why I am so proud of what the Government have done. With all that effort and by recruiting Baroness Shields, who has been a worthy addition, we got the search engine providers not only to stop returning illegal imagery but, with the help of experts, to return nothing at all in response to a whole series of search terms that were found to be used by paedophiles in particular. I am sure that the hon. Lady will have seen that the Government then went further. It all comes down to what is legal. Your porn is my Saturday night viewing. [Laughter.] Theoretically.
I urge the hon. Lady to consider re-wording what she just said, for my sake and for hers.
I may have come up with a Daily Mirror headline. My point is that the whole debate about pornographic material has always ended in the cul-de-sac of freedom of speech. That is why we worked with internet service providers, saying, “Let parents choose. Let’s use the BBFC guidelines. They have years of experience defining this stuff based on algorithms.” It is not for the hon. Lady or me to decide what people should not be viewing; we quite properly have an independent agency that says, “This is appropriate; this is not.”
However, the hon. Lady has eloquently raised the point that for too long, we have treated the internet as a separate form of media. We accept in cinemas, whether or not the bus driver is working for them, that if a film is R18, we are pretty negligent if we take our kids to see it, but we are helped to see that. We do not let our kids wander into the cinema and watch the R18 stuff with nobody stopping them along the way, but for too long, that has been the situation with the internet. The hon. Lady has raised a good point about search engines. I can assure her that the world has changed significantly, certainly in the UK, although other jurisdictions may not have been so influenced.
I should probably declare that prior to becoming an MP, I worked at Google. Does my hon. Friend agree that this is where it becomes complex? A search engine, to use another analogy, is a bit like a library. The books are still on the shelves, but the search engine is like the library index: it can be removed and changed, but the content is still there. That is why we need to do much more than just removing things from the search engine: the content is still there, and people can find alternative ways to get to it. We must do much more.
I defer to my hon. Friend’s knowledge. Of course we all agree that certain instances of countries taking things down are utterly abhorrent; I am thinking of information about human rights in China, or about female driving movements in Saudi Arabia. We do not want to be in the business of over-specifying what search engines can deliver. We have not even touched on Tor, the dark web or the US State Department-sponsored attempts to circumvent the public internet and set up some rather difficult places to access, which have increasingly been used for trafficking illegal material.
We need to keep hold of the search engine issue for a moment, because search engines are part of the process. To restate the bus driver analogy, a search engine is also like a sign saying to adults, and children, “You can go here to see pornography”.
I think we will let the Minister talk about that. Again, think about the practical series of keystrokes. Let us take gambling for a moment. It is quite a good analogy, because we mandated in the Gambling Act 2005 that there should be age verification. The search engine host provides access to a site, and users must go through an age verification mechanism. Age verification is incumbent on the site, and the service provider is legally responsible. I shall let the Minister discuss search engines in his speech.
I rise to speak to new clause 18, which stands in my name and that of my hon. Friend the Member for Cardiff West. I also support the amendments tabled by the hon. Member for Devizes. The Government’s proposals really do rely on an awful lot of good will among all the stakeholders involved in the legislation. It makes sense to create a backstop power for the regulator to require payment services to act should they not do so in the first instance.
New clause 18 comes from a slightly different perspective. It would oblige the age-verification regulator to ensure that all age verification providers—the companies that put the tools on websites to ensure compliance—are approved by the regulator; to perform a data protection impact assessment that they make publicly available; and to perform an array of other duties as well.
The new clause is designed to address some of the concerns about the practicality of age-verification checks, ensuring that only minimal data are required, and kept secure; that individuals’ privacy and liberties are protected; and that there is absolutely no possibility of data being commercialised by pornographers. We raise the latter as a potential risk because the proposals were drafted with the input of the pornography industry. That is understandable, but the industry would have a significant amount to gain from obtaining personal data from customers that might not currently be collected.
As we said earlier, we have full confidence in the BBFC as regulator, but, as with the proposals in part 5 of the Bill, it is vital that some basic principles—although certainly not the minutiae—are put on the face of the Bill. We are certainly not asking anything that is unreasonable of the regulator or the age-verification providers. The principles of privacy, anonymity and proportionality should all underpin the age-verification tool, but as far as I am aware they have not featured in any draft guidance, codes of practice, or documents accompanying the Bill.
The Information Commissioner agrees. The Information Commissioner’s Office’s response to the Department for Culture, Media and Sport’s consultation on age verification for pornography raised the concern
“that any solution implemented must be compliant with the requirements of the DPA and PECR”—
the Data Protection Act 1998, and the Privacy and Electronic Communications (EC Directive) Regulations 2003 that sit alongside it. It continues:
“The concept of ‘privacy by design’ would seem particularly relevant in the context of age verification—that is, designing a system that appropriately respects individuals’ privacy whilst achieving the stated aim… In practical terms, this would mean only collecting and recording the minimum data required in the circumstances, having assessed what that minimum was. It would also mean ensuring that the purposes for which any data is used are carefully and restrictively defined, and that any activities keep to those restricted purposes…In the context of preventing children from accessing online commercial pornography, there is a clear attribute which needs to be proven in each case—that is, whether an individual’s age is above the required threshold. Any solution considered needs to be focussed on proving the existence or absence of that attribute, to the exclusion of other more detailed information (such as actual date of birth).”
The Commissioner made it clear that she would have
“significant concerns about any method of age verification that requires the collection and retention of documents such as a copy of passports, driving licences or other documents (of those above the age threshold) which are vulnerable to misuse and/or attractive to disreputable third parties. The collection and retention of such information multiplies the information risk for those individuals, whether the data is stored in one central database or in a number of smaller databases operated by different organisations in the sector.”
I understand that the Adult Provider Network exhibited some of the potential tools that could be used to fulfil that requirement. From the summary I read of that event, none of them seemed particularly satisfactory. My favourite was put forward by a provider called Yoti, and the summary I read describes the process for using it as follows:
“install the Yoti App…use the app to take a selfie to determine that you are a human being…use the app to take a picture of Government ID documents”—
passport or driving licence, I imagine—
“the app sends both documents to Yoti…Yoti (the third party) now send both pictures to a fourth party; it was unclear whether personal data (e.g. passport details) is stripped before sending to the fourth party…Fourth party tells Yoti if the images (selfie, govt ID) match…Yoti caches various personal data about user”
to confirm that they are over 18. The user can then visit the porn site—whatever porn site they would like to visit at that time—and then the
“porn site posts a QR-like code on screen…user loads Yoti app…user has to take selfie (again) to prove that it is (still) them…not a kid using the phone…user scans the on-screen QR-code, is told: ‘this site wants to know if you are >18yo, do you approve?’…User accepts…Yoti app backchannel informs porn site…that user >18yo”
and then the user can see the pornography.
I do not know whether any Committee members watch online pornography; I gather that the figure is more than 50% of the general population, and I am not convinced that hon. Members are more abstinent than that. I ask Members to consider whether they would like to go through a process as absurd as the one suggested.
The hon. Lady has got ahead of the potential Daily Mail headline when the freedom of information request comes in for her Google search history.
I am not convinced that anybody would want to go through a process such as the one I have just described, or even one significantly less convoluted. I suggest that instead they would seek entertainment on a site that did not impose such hurdles. The BBFC in its evidence made the telling point that the majority of the viewing population get their content from the top 50 sites, so it is very easy to target those—we see that entrenched in clause 23. The problem with that, as my hon. Friend the Member for City of Chester pointed out, is that targeting those sites may push viewers to the next 50 sites, and so on. We therefore need to ensure that the process is as straightforward and as minimal as possible.
That is absolutely right, and I will come to that point. We heard evidence from the BBFC that it intended potentially to use age-verified mobile telephony to ensure that sites are properly age verified, but I am afraid that that approach is also flawed. First, there is the obvious issue that there is nothing to stop an underage child using the information attached to that phone—be it the phone number or the owner’s name—to log on and falsely verify. Equally, there are enormous privacy issues with the use of mobile-verified software to log on.
The BBFC said clearly that it was interested not in identity but merely in the age of the individual attempting to access online pornography, but as we all know, our smartphones contain a wealth of information that can essentially be used to create a virtual clone. They are loaded with our internet conversations, financial data, health records, and in many cases the location of our children. There is a record of calls made and received, text messages, photos, contact lists, calendar entries and internet browsing history—the hon. Member for Devizes may want to take note of that—and they allow access to email accounts, banking institutions and websites such as Amazon, Facebook, Twitter and Netflix. Many people instruct their phones to remember passwords for those apps so they can quickly be opened, which means that they are available to anyone who gets into the phone.
All that information is incredibly valuable—it has been said that data are the new oil—and I imagine that most people would not want it to be obtained, stored, sold or commercialised by online pornography sites. The risks of creating databases that potentially contain people’s names, locations, credit card details—you name it—alongside their pornographic preferences should be quite clear to anyone in the room and at the forefront of people’s minds given the recent Ashley Madison hack. I am not condoning anyone using that website to look for extramarital affairs, nor am I privileging the preferences or privacy of people who wish to view online pornography over the clearly vastly more important issue of child protection. However, one consequence of that hack was the suicide of at least three individuals, and we should proceed with extreme caution before creating any process that would result in the storing of data that could be leaked, hacked or commercialised and would otherwise be completely private and legitimate.
That is the reasoning behind our reasonable and straightforward amendment, which would place a series of duties on the age-verification regulator to ensure that adequate privacy safeguards were provided, any data obtained or stored were not for commercial use, and security was given due consideration. The unintended consequences of the Government’s proposals will not end at preferences, privacy or security issues, but will include pushing users on to illegal or at the very least non-compliant sites. We are walking a tightrope between making age verification so light-touch as to be too easily bypassed by increasingly tech-savvy under-18s and making it far too complicated and intrusive, thereby pushing viewers on to either sites that do not use age verification but still offer legitimate content or completely illegal sites that stray into much more damaging realms. These provisions clearly require a lot more consultation with the industry, and I am confident that the BBFC will do just that, but the Opposition would feel a lot more confident and assured if the regulator was required to adhere to these basic principles, which we should all hold dear: privacy, proportionality and safety.
The hon. Lady rightly gets to the great concern that somehow, in doing something good, an awful lot of concern can be created, and I am sympathetic to her points. I remind her that it is not as if these sites do not know who is visiting them anyway. One of the great conundrums on the internet is that every single keystroke we make is tracked and registered. Indeed, that is why shopping follows us around the internet after we have clicked on a particular site. Unless people are very clever with private browsing, the same is the case for commercial providers.
I absolutely support the Government’s intention here. We just want to ensure it is done in the right way and balances both sides of the argument. I think it is absolutely right that internet service providers are offering this filter, but does the hon. Lady share my concern that very few families take it up and very many families turn it off?
There are Ofcom data. One of the requirements we asked for was for Ofcom to monitor. Take-up improved, and, as I said, some internet service providers now have an automatic “on” system, whereby a person has to intervene to take the filters off. I am told that only about 30% of families choose to do so. Here is the savvy thing: we all know that people live in households with multiple ages and multiple requirements on the internet, so many ISPs now offer a service that enables people to disable the filters for a period and automatically reinstate them the following day. They do not have to do anything if they want the filters to be in place, but they might want to access over-18 content as an adult.
I want to discuss some of the other issues that have come up in this conversation, in the process of finally speaking about these amendments. Is it in order to do so, Mr Stringer?
It is if it is covered by the amendments and new clauses 6 and 18, but I cannot tell until you start speaking.
Then I will carry on, because it definitely is. I think I misspoke at the beginning when I talked about new clause 7. I was actually referring to new clause 6; it was just my note-taking.
I was trying to ensure that we put in place a series of protections, including enforcement notices that are acted upon, financial penalties that make a difference and the ability to stop income streams moving from the payment providers to the various content providers. I want to press the Minister on the question of blocking, because it comes back to the issue of why anyone would care. If somebody does not respond to an enforcement notice—if, for example, the fine is not sufficient to make them stop—how can it be that we are not considering blocking? Of course, we do that for other sites. I know it is not applicable to every form of illegal content, but I am very struck by copyright infringement, which generates take-down notices very swiftly and on which internet service providers and ancillary services act. I would be really interested to hear from the Minister why blocking has been rejected so far. Could it be put in place as a backstop power? I worry that, without it, all this amazing progress will not have teeth.
It is sometimes said that Parliament skates over matters and does not get under the skin of things, but in the discussion we have just had Committee members displayed a great deal of analysis, experience and wisdom, and our debate on the Bill has been enriched by it. I am very grateful to hon. Members on both sides of the Committee who made very good contributions to help us get this right.
Exactly as the hon. Member for Sheffield, Heeley said, getting this right involves walking a tightrope between ensuring adequate enforcement and allowing appropriate access for those for whom it is legally perfectly reasonable to access adult content. We must get that balance right. With that in mind, we have drafted the clauses, particularly clause 22, to allow the regulator to operate with some freedom, because we also need to make sure that, over time, this remains a good system and is not overly prescriptive. It was ironic that in a speech about privacy, the hon. Lady started to speculate about which MPs enjoyed watching porn. I am definitely not going to do that.
The truth is that age verification technology is developing all the time. Online personal identity techniques are developing all the time, and indeed the British Government are one of the leading lights in developing identity-verification software that minimises the data needed for that verification and does not rely on especially large state databases, and therefore does it in a relatively libertarian way, if I can put it that way. Providing for verification of identity, or of age, is an incredibly important issue, because age without named identity is what is really being sought here, although it is difficult to achieve. A huge amount of resource is going into getting that right globally, and it ties closely to cyber security and the data protection requirements of any data.
The UK Data Protection Act has a broad consensus behind it and follows the simple principle that within an institution data can be shared, but data must not be shared between institutions. The institution that holds the data is responsible for their safekeeping and significant fines may be imposed for their inadvertent loss. The forthcoming General Data Protection Regulation increases those fines. Rather than reinventing data protection law for the purposes of age verification in this one case, it is better to rest on the long-established case law of data protection on which the Information Commissioner is the lead.
We had a very informed debate on the role of search engines. The regulator will be able to consider whether a search engine is an ancillary service provider. Although we do not specify it, I would expect ISPs to be regarded as ancillary service providers, but that will be for the regulator.
On the question of payment providers, we already have engagement, rather than enforced engagement, from Visa, MasterCard, the UK Cards Association and the Electronic Money Association, and clearly there are a lot more organisations that can and should be engaged.
I thank the Minister for that response. I would have liked to hear him say a little bit more about how the payment service providers are involved in the game and whether we are relying on them to do the right thing because they are large corporate companies, or whether, as new clause 6 proposed, there was an opportunity to strengthen the wording of the Bill.
I apologise; there were so many interesting points made that I did not get to that one.
The provision of pornography without age verification in the UK will become illegal under this Bill. There is a vast panoply of financial regulation requiring that financial organisations do not engage with organisations that commit illegal activities, and it is through that well-embedded, international set of regulations that we intend to ensure that payment service providers do not engage with those who do not follow what is set out in the Bill. Rather than inventing a whole new system, we are essentially piggybacking on a very well-established financial control system.
That is a very reassuring reply and I thank the Minister for it. We have had a very good debate. I know that his officials will be listening and thinking hard about what has been said, and I do not think it would serve the Committee any purpose to press my amendments or my new clause to a vote.
I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
It was interesting to hear the Minister refer to financial regulations. I was not present on Second Reading because I was not then in the position that I occupy now, but having read that debate I do not believe that there was any such reference. So we would like some clarity on who will be the regulator of the payment service providers and what work has already been done with the Financial Conduct Authority—I assume it will be with the FCA in this circumstance—to ensure that it will be regulating those providers, to make sure that they act with speed and due diligence on receiving notification from the age verification regulator under clause 15.
It is disappointing that the Government do not consider new clause 18 necessary to amend the Bill. I appreciate that the BBFC has been given powers to establish a code of practice, but given the very serious consequences that could result from that not being done correctly, some basic principles need to be embedded into the process, based on the issues that I raised earlier in our discussion.
I will just add that we will return to this issue on Report.
We have been engaging directly with payment service providers, and engagement with the financial authorities will no doubt follow as and when necessary. Payment service providers can withdraw services from illegal activity under their existing terms and conditions, so the provision is already there for the measures to take effect.
Question put and agreed to.
Clause 22 accordingly ordered to stand part of the Bill.
Clause 23
Exercise of functions by the age-verification regulator
I beg to move amendment 80, in clause 23, page 25, line 1, at end insert—
‘(3) The age-verification regulator must consult with any persons it considers appropriate, about the option to restrict the use of its powers to large pornography websites only.’
This amendment requires the age-verification regulator to consult on whether, in the exercising of its function, it should restrict its powers to large pornography websites only.
With this it will be convenient to discuss new clause 12—Code of practice by age verification regulator—
‘(1) The age verification regulator must issue a code of practice giving practical guidance as to the requirements of any provision under this Part of the Act.
(2) The following persons must, in exercising their functions under this Part and in the design and delivery of their products and services, adhere to the code of practice, and ensure that the safety and wellbeing of children is paramount—
(a) relevant persons;
(b) internet service providers;
(c) ancillary service providers;
(d) payment-service providers; and
(e) any such other persons to whom the code of practice applies.
(3) Any code of practice issued by the age verification regulator under subsection (1) above must include standards in relation to the following—
(a) how content is managed on a service, including the control of access to online content that is inappropriate for children, and the support provided by the service for child safety protection tools and solutions;
(b) the assistance available for parents to limit their child’s exposure to potentially inappropriate content and contact;
(c) how the persons specified in subsection (2) above shall deal with abuse and misuse, including the provision of clear and simple processes for the reporting and moderation of content or conduct which may be illegal, harmful, offensive or inappropriate, and for the review of such reports;
(d) the action which must be taken in response to child sexual abuse content or illegal contact, including but not limited to, the co-operation with the appropriate law enforcement authorities;
(e) the action to be taken by the persons specified in subsection (2) above to comply with existing data protection and advertising rules and privacy rights that address the specific needs and requirements of children; and
(f) the provision of appropriate information, and the undertaking of relevant activities, to raise awareness of the safer use of connected devices and online services in order to safeguard children, and to promote their health and wellbeing.
(4) The age verification regulator may from time to time revise and re-issue the code of practice.
(5) Before issuing or reissuing the code of practice the age verification regulator must consult—
(a) the Relevant Minister;
(b) the Information Commissioner;
(c) the Scottish Ministers;
(d) the Welsh Ministers;
(e) the Northern Ireland Executive Committee;
(f) the persons specified in subsection (2) above;
(g) children;
(h) organisations and agencies working for and on behalf of children; and
(i) such other persons as the age verification regulator considers appropriate.
(6) As soon as is reasonably practicable after issuing or reissuing the code of practice the age verification regulator must lay a copy of it before—
(a) Parliament,
(b) the Scottish Parliament,
(c) the National Assembly for Wales, and
(d) the Northern Ireland Assembly.
(7) The age verification regulator must—
(a) publish any code of practice issued under subsection (1) above; and
(b) when it revises such a code, publish—
(i) a notice to that effect, and
(ii) a copy of the revised code; and
(c) when it withdraws such a code, publish a notice to that effect.
(8) The Secretary of State may by regulations make consequential provision in connection with the effective enforcement of the minimum standards in subsection (3).
(9) Regulations under subsection (8)—
(a) must be made by statutory instrument;
(b) may amend, repeal, revoke or otherwise modify the application of this Act;
(c) may make different provision for different purposes;
(d) may include incidental, supplementary, consequential, transitional, transitory or saving provision.
(10) A statutory instrument containing regulations under subsection (8) (whether alone or with other provisions) which amend, repeal or modify the application of primary legislation may not be made unless a draft of the instrument has been laid before and approved by a resolution of each House of Parliament.
(11) In this Part—
“ancillary service provider” has the meaning given by section 22(6);
“child” means an individual who is less than 18 years old.
“Information Commissioner” has the meaning given by section 18 of the Freedom of Information Act 2000
“Internet service provider” has the same meaning as in section 16 of the Digital Economy Act 2010.
“Northern Ireland Executive Committee” has the meaning given by section 20 of the Northern Ireland Act 1998
“payment-service providers” has the meaning given by section 22(5) “relevant Minister” has the meaning given by section 47(1)
“relevant persons” has the meaning given by section 19(3)
“Scottish Ministers” has the meaning given by section 44(2) of the Scotland Act 1998
“Welsh Ministers” has the meaning given by section 45 of the Government of Wales Act 2006.’
This new Clause gives the power to the age verification regulator to introduce a code of practice for internet content providers. The code of practice would be based on existing industry and regulatory minimum standards (such as the BBFC classification system) and require providers to ensure that the safety and wellbeing of children is paramount in the design and delivery of their products and services.
I promise this will be the last time I speak today. I am afraid I have had a slight change of heart. I tabled this amendment in response to the many points that have been raised today about the difficulty of focusing the BBFC’s efforts, given that much of this traffic is not simply going to the larger websites. As we have heard, many other free sites are providing such material. However, on re-reading my amendment, I have decided that it amounts almost to a vote of no confidence in the BBFC’s ability to be flexible, and I would therefore like to withdraw it.
New clause 12 would give the power to the age verification regulator to introduce another code of practice—the Opposition are very fond of them—for internet content providers. [Interruption.] And reviews, we are very fond of reviews.
We have made it clear throughout that we want enforcement to be as tough as possible and for all loopholes to be closed, but we also want to ensure that children are as safe in the online world as they are offline. There absolutely needs to be that parity of protection. That is one reason why we are disappointed, as I mentioned, that these measures came forward in a Digital Economy Bill, where it was incredibly difficult to look at the issues of child protection online in a thoroughly comprehensive way.
The new clause proposes that the regulator should work with industry to create a statutory code of practice, based on BBFC guidelines for rating films and the principles of the ICT Coalition for Children Online. The code would establish a set of minimum standards that would apply consistently to social networks, internet service providers, mobile telecommunication companies and other communication providers that provide the space and content where children interact online.
This is not intended to be an aggressive, regulatory process. We envisage that it will be the beginning of a much broader debate and conversation between regulators and content providers about just how we keep our children safe on the web. This debate will encompass not only ideas such as panic buttons, but education about the online world, which must run in parallel for any process to be effective.
A statutory code would work with providers to lay out how content is managed on a service and ensure that clear and transparent processes are in place to make it easy both for children and parents to report problematic content. It would also set out what providers should do to develop effective safeguarding policies—a process that the National Society for the Prevention of Cruelty to Children has supported.
As I said, this will clearly be a staged process. We envisage that in order to be effective, the development of a code of practice must involve industry, child protection organisations such as the NSPCC and, crucially, the children and families who use online services. But this code of practice would be based on existing industry and regulatory minimum standards and would require providers to ensure that the safety and wellbeing of children is paramount in the design and delivery of their products and services. The new clause would also empower the Secretary of State to make regulations to ensure effective enforcement of the minimum standards in the code of practice.
The online world can be an enormously positive force for good for our children and young people. It makes available a scale of information unimaginable before the internet existed and there is compelling evidence that that constant processing of information will lead to the most informed generation of children the world has known, but it needs to be made safe to realise that potential. The new clause would give assurance to Opposition Members that we will enable that to happen.
I do recognise that. My point is that making non-statutory guidance statutory will not help in that space, but there is clearly much more to do. I hope that, with that assurance, my hon. Friend the Member for Devizes will withdraw the amendment.
I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
This is a very curious clause, which renders much of the well-informed—as the Minister said—and useful discussion that we have had today about enforcement, targeting smaller providers and restricting access across the web, completely and utterly redundant. If the clause as I read it goes forward unamended, it will give the regulator the ability to target only the largest providers of online pornography, perhaps even limiting it to targeting them alone.
As we have discussed at length, this is an incredibly difficult area to police, which I appreciate. It is obviously going to be far easier to tackle the 50 largest providers, not least because I assume many of them are already providing some level of age verification and are probably more at the responsible end of online pornography content providers. I would remind the Committee of the Conservative party’s manifesto, which said:
“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material”.
That does not make any reference to commercial providers, or to whether the provider has a large or small turnover, or is on WordPress, Tumblr, Twitter, Facebook or Snapchat. Today's debate has very much suggested that the role of the regulator will be to focus on those sites that are operated on a commercial basis. Given the Minister's reluctance to implement internet service provider blocking, I do not believe that the manifesto commitment will be achieved.
Part of my reason for withdrawing my amendment was that I was encouraged by the word "principally" on line 35 of this page. It is not a restriction; the regulator certainly has the power under the clause to go further. My issue is that there is a worry, although not with this regulator, that success will be defined by the number of websites or the number of enforcement notices issued. It is not about the number of websites; it is about the number of eyeballs going to them, so it is absolutely right that the regulator focuses on larger sites first. The wording of the Bill allows the regulator discretion to go after any site.
On the basis that I also agree with that explanation, I commend the clause to the Committee.
Question put, That the clause stand part of the Bill.