Claire Perry (Conservative - Devizes)
Public Bill Committees

Amendments have been tabled by Members on both sides of the Committee. The argument that we should not table amendments in Committee is an argument for having Bills come out of the parliamentary process in exactly the same form as they go in. Even the Government would not make that case. The central point here is that we offered plenty of time, which was agreed on a cross-party basis, and the Labour party has asked to reduce that time. In considering whether there has been enough time in Committee, those who read the transcript in the weeks and months to come ought to recognise that the Government have been as accommodating as possible, but that we had to give way to the Labour party’s request for less time and scrutiny in Committee.
We have a Minister who is engaging with the nuts and bolts of a Bill that was prepared long before he came to office. I, for one, am delighted that we have an active Minister who is determined to make this exceptionally important Bill as good as it can be. I do not accept this criticism. It is excellent that the Government are tabling these amendments and allowing time to consider them.
Yes—although I have had no discussions with them at a ministerial level about the amendments, I understand that discussions have taken place between officials. The effect of the amendments will be to make the law work better, so I hope they will have cross-party support.
Amendment 8 agreed to.
Amendments made: Government amendment 9, in clause 14, page 17, line 18, leave out “Subsections (3A) and (3B)” and insert
“Section 41(7) and subsection (3B) above”.
Subsection (3C), inserted in section 107 of the Wireless Telegraphy Act 2006 by the clause, lists enactments displaced by the time limits mentioned in subsections (3A) and (3B). Subsection (3A) merely refers to section 41(7), and the amendment substitutes a direct reference to that provision for the reference to subsection (3A).
Government amendment 10, in clause 14, page 17, line 26, at end insert—
“(3D) In relation to proceedings in Scotland, subsection (3) of section 136 of the Criminal Procedure (Scotland) Act 1995 (date when proceedings deemed to be commenced for the purposes of that section) applies also for the purposes of section 41(7) and subsection (3B) above.”.
The amendment adds provision about when proceedings in Scotland are deemed to be commenced for the purposes of the time limits in section 41(7) and new subsection (3B) of section 107 of the Wireless Telegraphy Act 2006.
Government amendment 11, in clause 14, page 17, line 31, at end insert—
“() for subsection (8) substitute—
“(8) For further provision about prosecutions see section 107.””.—(Matt Hancock.)
Existing section 41(8) of the Wireless Telegraphy Act 2006 applies to section 41(7) and is superseded by section 107(3C) inserted by the clause (see amendment 9). Amendment 10 also inserts provision applying to section 41(7) into section 107. Amendment 11 therefore substitutes a subsection referring the reader to section 107.
Clause 14, as amended, ordered to stand part of the Bill.
Clause 15
Internet pornography: requirement to prevent access by persons under the age of 18
I beg to move amendment 65, in clause 15, page 18, line 15, at end insert—
“(d) how persons can make a report to the age-verification regulator about pornographic material available on the internet on a commercial basis that is not complying with subsection (1).”.
This amendment places a requirement on the age-verification regulator to provide guidance as to how persons can report non-compliant pornography websites to the age-verification regulator.
I am extremely glad to have tabled a series of amendments to the vital provisions in part 3 of the Bill. As I said on Second Reading, we have come such a long way, and the enormous cross-party consensus to make the internet safer for young people has been crucial to that. We have seen some very effective sponsorship and responses from the previous Minister and his Department under the leadership of the last Prime Minister. Without his championing of this issue, we would not be where we are today.
My intention in tabling the amendments was to make provisions that are already good somewhat better, in the spirit of trying to encourage the Government to think hard about the line-by-line drafting. It has been made clear to me in meetings with organisations such as the British Board of Film Classification that there are ways to enhance the role of a regulator. I am delighted that the BBFC has been given the role, because it is truly a trusted brand; it is innovative and it does brilliant work to define age-rating boundaries. I have listened carefully to it.
What I am looking for is a clearer understanding of how the Government envisage the process of regulating websites and apps that provide access to material defined as pornographic in the UK. In his evidence session last week, David Austin referred to
“stages 1 to 3 of the regulation.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 39, Q84.]
I would be interested to hear the Minister’s explanation of how those different stages might work and to understand better how the enforcement element will work in practice—perhaps we will touch on that today but return to it in a later sitting.
I was struck by evidence given by those who do not support the changes; they feel that the issue is important but they argue that we should not be bringing in the new rules because we will not be able to make them stick. I must also mention my gratitude to the many organisations that have provided information and support on part 3 of the Bill. In particular, I note the contributions of Christian Action Research and Education, the Digital Policy Alliance, the National Society for the Prevention of Cruelty to Children and the Centre for Gender Equal Media.
My first amendment is to clause 15, which sets out the extremely welcome requirement that age verification should be introduced by websites and apps that are making commercial pornography available in the UK. Amendment 65 would add a new paragraph to clause 15(3) to strengthen enforcement by allowing the public and industry to provide intelligence to the regulator about the sites that do not have age verification.
I have always been struck by what we do not know about the internet. We all know that there is a massive proliferation of sites. I do accept what is said about much of the pornographic traffic concentrating around particular sites, but it grows like a Hydra every day. One of the BBFC’s most effective acts has been to enable effective self-regulation by allowing people to report and comment on a particular posting, which is, if you like, a sort of self-rating scheme. That would be extremely valuable. Clearly, the regulator cannot be expected to scrutinise the entire world of sites. Allowing members of the public and industry to notify the regulator of sites that should be regulated would be helpful.
I note that the Digital Policy Alliance recommended in one of its parliamentary briefings back in April that this power should be available. It would be an excellent way to ensure that the public can feel involved in protecting their children. One of the messages I have heard over the past few years is how much families feel disempowered in the process of keeping their children safe. Of course, people accept the notion of parental responsibility and of course schools have become involved in this process, but we have made it uniquely difficult for families effectively to keep their children safe on a digital platform.
We have other rules and regulations around broadcast and written media that make it much easier for families wanting to be involved in that process. The amendment, allowing the BBFC to provide notice that these referrals can be made, would be very helpful. I note that David Austin of the BBFC said last week that he does intend to take referrals from the public.
Will the Minister please confirm that it is also the Government’s intention to promote the involvement of the whole community in championing online targeted child protection, and how this referral mechanism can be guaranteed? I hope he will consider this small change to the Bill.
Our intention is to establish a new regulatory framework and new regulatory powers tackling the viewing of adult content by minors. I pay tribute to the work of my hon. Friend over many years in getting us to this point. It has already ensured that there is voluntary activity, and that there are now legislative proposals is in many ways thanks to her campaigning. I am delighted that we have reached this point.
I am also delighted that, as we heard last week, the British Board of Film Classification will be designated as the age verification regulator. That is undoubtedly the best body in the land to do that job. It has the capability, as we heard at the evidence session. It will be responsible for identifying and notifying infringing sites. That will enable payment providers and other ancillary services to withdraw services from those providers that do not comply as soon as possible. Proceeding in that way will allow us to work quickly and effectively with all parts of the industry to ensure that they are fully engaged—indeed, that engagement has already started. We need to ensure the system is robust but fair and the providers of pornographic material are encouraged to be compliant by the processes in place.
I have every confidence, as I think we all should, in the BBFC’s ability to deliver on this. We heard from David Austin, the chief executive, in evidence that he is already working on this. He said that the BBFC would create a reporting mechanism, as it has done with mobile operators. I think that its commitment to enable members of the public and organisations such as the NSPCC to report a particular website is the best way forward. That is a sensible approach for the regulator to take.
We should take a proportionate approach to the regulator’s role and allow the BBFC to do the job at which it is expert. We have required the regulator to issue guidance where that allows the subjects of regulation to understand how the regime applies to them, but I think that going further and requiring this level of specification is not necessary, given the BBFC’s commitment and the uncontroversial nature of the need. That will give us flexibility as well as a clear commitment to make this happen. I hope that, given that explanation, my hon. Friend will withdraw her amendment.
I am pleased to hear that the Minister shares the view that the BBFC should be given a permissive regime to do some of the things it does well, rather than the Government specifying too much. With that assurance, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 85, in clause 15, page 18, line 20, leave out subsection (5)(a).
The amendments all explicitly include on-demand programme services in the age verification measures proposed by the Government. Given the rise in the use of mobile devices and tablets in the past decade, the case for appropriate online pornography enforcement has increased. We commend the Government’s intention in the proposals. I also put on the record our thanks and congratulations to the hon. Member for Devizes, who has campaigned on this issue for many years along with many other hon. Members, not least my hon. Friend the Member for Bristol West.
The ultimate goal is to seek parity of protection for children between the online and offline worlds, but how that is done in practice is fraught with issues. I hope that we can improve the proposals before us. Teens have an emerging right to independent communication with friends and family, and we recognise and respect that. We must not fall back on outdated means of protection such as blanket parental permissions. We need to empower and protect young people in ways that make sense to them and that they can and will use.
As the Committee knows, the effects of online pornography on unhealthy attitudes to sex and relationships are only just starting to be explored, but the research indicates a troubling trend. The NSPCC study of more than 1,000 young people aged 11 to 18 found that over half the sample had been exposed to online pornography, and nearly all of that group—94%—had seen it by age 14. Just over half the boys believed that the pornography that they had seen was realistic, and a number of girls said that they worried about how it would make boys see girls and the possible impact on attitudes to sex and relationships. One respondent said:
“Because you don’t get taught how to go on the internet and keep yourself safe, there are loads of tricks to get you to give away or to go on a bad website.”
Crucially, in research by Barnardo’s, four fifths of teenagers agreed that it was too easy for young people to see pornography online by accident.
Adult products and spaces, including gambling shops, sex shops and nightclubs, are restricted in the offline sphere. Content such as film and television, advertising and pornography is also limited, with penalties ranging from fines to custodial sentences available against proprietors who do not comply. It is a transparent, accountable process overseen by regulators and licensing bodies such as Ofcom, the BBFC and the Gambling Commission to ensure that children are protected from age-inappropriate content and experiences.
Labour is happy to support the Government’s efforts to introduce age verification, but we must ensure that enforcement is strong enough. Our amendment speaks to that broad aim of the Opposition, which I know is supported by Government Back Benchers, given the other amendments tabled today. However, the measure cannot be seen as a silver bullet, which is why tacking this manifesto commitment on to a Digital Economy Bill is inadequate. First, slotting it into a Bill on the digital economy gives the impression, however unintentional, that the measure is designed to deal only with commercial providers of pornography, those who exploit data or benefit from advertising or subscription services—those who are, in short, part of the digital economy, rather than all providers of pornography online.
Although we are aware that most pornography providers operate on a commercial basis, many do not. Peer-to-peer networks and Usenet groups, however difficult to police, would presumably not be in the scope of the Bill. That is on top of pornography available through apps that are commercial enterprises, such as Twitter and Tumblr, or free webpages, such as WordPress, where the provision of pornography is incidental or provides no income to the overall business, or is not used for commercial purposes at all. Under clause 15 as it stands, it is by no means clear that all pornography available on the internet will be subject to age verification requirements.
Allow me to remind the Minister what the Conservative party manifesto said on the matter in 2015. It stated that
“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material”.
There is no prevarication or equivocation there, and I commend the wording in the manifesto. Unfortunately, between that time and the legislation being drawn up, a rogue adjective has been added to the commitment, which seemed perfectly clear in the manifesto. One could easily argue that if a site such as Tumblr does not make pornography available on a commercial basis, then it is exempt, which would leave that manifesto commitment in some difficulty. Can we therefore have a commitment from the Minister that the regulator will be able to go after all sites containing pornographic material and not just those operating on a commercial basis, however broadly we may want to define “commercial”? The word seems at best unnecessary, and at worst a breach of the manifesto commitment.
Slotting age verification into the Bill gives Members nothing like the scope needed to tackle the effect of under-age viewing of pornography, which is surely the intention behind its implementation, because the measure is not enough to protect children. For a start, the regulator should also be responsible for ensuring that services undertake self-audits and collect mandatory reports in relation to child abuse images, online grooming and malicious communication involving children. To ensure that services are working to consistent principles and to best support the collection and utilisation of data, the regulator should also be responsible for developing a definition of child abuse.
We need to improve reporting online. Children and young people are ill served by the currently inadequate and unreliable reporting systems when they experience online abuse. Reporting mechanisms need to be standardised, visible and responsive, and must act rapidly to address issues. Every reporting mechanism must be designed in ways children say they can and will use. The NSPCC found that 26% of children and young people who used the report button saw no action whatever taken in response to their complaint; and of those who did get a response, 16% were dissatisfied with it. The Government should include independent mediation and monitoring of responses to complaints.
Clearly, we need compulsory sex education in our schools. Compulsory age-appropriate lessons about healthy relationships and sex are vital to keeping children safe on and offline. We know that children are exposed to pornography, sometimes in an extreme or violent form. Alongside regulation to limit access to these materials, building resilience and instilling an early understanding of healthy relationships can help to mitigate the impact of that exposure.
On that point, we are incredibly keen to ensure that legislation is as clear as possible and that any potential loopholes are closed. One such loophole is clause 15(5)(a), which for reasons that are unclear excludes on-demand programme services. Explicitly excluding any on-demand programme service available on the internet in the Bill—although we are aware that they are regulated by Ofcom—risks on-demand programme services being subject to a much looser age verification requirement than the Bill would enforce on other pornography providers. We do not believe that the legislation intends to create two standards of age verification requirements for online content, regardless of whether it is separately regulated. The amendment is intended to close that loophole.
I will speak to amendments 85 and 87. I raised a question with David Austin last week about the regulation of video on demand. He confirmed that the intention of the Bill as it stands is to maintain the regulation of UK video on demand with Ofcom under the Communications Act 2003. That seems totally reasonable to me because Ofcom has done a good job. I think the issue is that the framework only requires age verification for R18 material.
I am not trying to give everyone a lesson—by the way, this is why we are so grateful to the BBFC; it gives very clear definitions of the material—but R18 is effectively hardcore porn. It contains restricted scenes that we would all consider to be pornography. Since 2010, the 18-certificate guidelines permit the depiction of explicit sex in exceptional justifying circumstances, so it is perfectly feasible for children to view 18-rated content that we would all consider to be pornographic. I fully agree with the sentiment behind amendments 85 and 87 to provide a level playing field for all online media, but we must ensure that all R18 and 18 content accessed through video-on-demand services is included in the provisions. However, removing clauses 15(5)(a) and 16(6) would cause a fair amount of confusion, as video-on-demand services would be regulated by Ofcom for the majority of the time but for age verification matters would be regulated by the BBFC and Ofcom, which raises the question of who has precedence and how enforcement would work.
I have therefore tabled new clause 7, which would meet the same objective in a slightly different way by amending the current regulatory framework for video on demand to ensure that children are protected from 18-rated as well as R18-rated on-demand material. The relevant section of the Communications Act 2003, section 368E, was amended by the Audiovisual Media Services Regulations 2014 to specify that R18 material should be subject to age verification to protect children. It is not a big step to require 18-rated pornographic material, which is the subject of much of this part of the Bill, to be included within the scope of that section. That would effectively create a legal level playing field. It would remove the issue of parity and precedence and would give us parity on the fundamental issue of the protection of children.
I agree with much of what the hon. Member for Sheffield, Heeley said. Ofcom’s latest figures on children and the media show that 51% of 12 to 15-year-olds watched on-demand services in 2015. The viewing of paid-for on-demand content has gone up and accounts for 20% of viewing time for young people aged 16 to 24. They can view content rated 18 or R18 that would be prohibited for some of them if they were to purchase it in the offline world. With new clause 7, I recommend that the Government should try to ensure parity between the online and offline worlds. This Bill is a brilliant way to ensure that there is parity in the way that pornographic content is accessed.
On the point that my hon. Friend the Member for Sheffield, Heeley made about the wording of the clause and how it talks about material that is made available “on a commercial basis”, does the hon. Member for Devizes have any concerns that that might be a definitional problem that could create a loophole?
The hon. Gentleman raises a challenge. The explanatory notes make it clear that the Government intend to capture both commercial and freely provided material, which gets to the root of his concern. If someone is benefiting from the viewing of such material, the Government intend to capture that within the definition. I commend both the Minister and his Department for asking the BBFC to take on the role of regulator, because I have a high level of faith in its ability to do just that.
I take the hon. Lady’s point that the Government have said that they would like to capture such material, but my hon. Friend the Member for Sheffield, Heeley said that they might not capture everything. We tabled a probing amendment to take out the words “on a commercial basis” to test that, but it was ruled out of scope because the Bill is about the digital economy. So it has to be material that is made available on a commercial basis only, otherwise it is out of the scope of the Bill.
The hon. Gentleman is splitting hairs. The Government have issued clear guidance that the definition of “commercial” includes free content. There are very few altruistic providers of this material. Free content tends to be provided as a taster for commercial sites.
Well, I accept that is true of streaming and on-demand, which is why this provision is important. It would capture material that is rated 18, not just restricted-18, and put it on a level playing field with restricted-18 material. The on-demand video content that the hon. Member for Sheffield, Heeley mentioned would be covered by the changes. I am interested to hear the Minister’s response to my proposed new clause 7, which would support parity of both content and regulator.
Ordered, That the debate be now adjourned.—(Graham Stuart.)