(7 years, 7 months ago)
Commons Chamber
Does the hon. Lady agree, however, that in setting out these definitions on a spectrum ranging from prohibited material to extreme pornography—I will speak to this later—we have left ourselves in something of a quandary, as material that she and I would probably agree is completely unacceptable can in theory be viewed behind age filters? I heard that the Minister was prepared to consider this unfinished business. Will the hon. Lady, on behalf of her party, commit to trying to work out these definitions in the next Parliament to ensure that we arrive at a better place?
That was exactly why we pushed for an amendment in the Lords and it is why we are so pleased that the Government have accepted it. We need consultation, as well as a clear definition of extreme pornography and prohibited material. My understanding of the legislation is that nothing extreme, prohibited or otherwise will be able to be viewed behind age verification filters. If something is deemed to be pornography and analysed as such by the British Board of Film Classification, it will be required to be behind such filters.
The hon. Lady is right, but she will know that the original definition referred to five statutes. We now have a definition that is much tighter, specifically because items that were included under the broader definition are now deemed not to be obscene—I agree with that. The problem is that there is material that, according to 85% of people who have viewed it, should not be accessible on the internet for any age group. Such material could be accessible behind those filters for anyone to see. That is the problem that we need collectively to solve.
The hon. Lady is absolutely right. It is true that such material is currently available without any AV filters, so we have made substantial and welcome progress in this area, but the consultation in the next Parliament will be crucial. We look forward to participating in that debate and ensuring that we get the best possible regime for online pornography.
Several Government amendments on age verification were tabled in the Lords. We understand why technology cannot be dictated in legislation or even guidance, but the effectiveness of AV measures will obviously be determined by the technology that is used. If we are not careful, we could end up with age verification that is so light-touch as to be too easily bypassed by increasingly tech-savvy under-18s, or that is far too complicated and intrusive. That could push viewers on to sites that do not use age verification but still offer legitimate content, or completely illegal sites that stray into much more damaging realms. Equally, we must ensure that privacy and proportionality are at the heart of the proposals, so I push the Minister to say more about that.
The BBFC has intimated that its likely preference is age-verified mobile telephony, but there are significant privacy issues with that approach. We should proceed with extreme caution before creating any process that would result in the storing of data that could be leaked, hacked or commercialised when that would otherwise be completely private and legitimate. Concerns have been raised about whether the BBFC is appropriate to be the AV regulator, not least in relation to its conduct in lobbying Members of this House and of the other place. I am grateful that the Minister has listened to those concerns and that guidance will now be produced by the Secretary of State and then issued to the regulator, meaning that there is proper accountability. I want to ensure that the report that the Secretary of State produces on the effectiveness of the regulation covers the regulator itself, so I would be grateful for clarification about that from the Minister.
On the social media code of conduct, we are delighted that the Government have taken a decisive step in the right direction. Amendment (a) in lieu of Lords amendment 40 requires the Secretary of State to issue a code of practice for online social media platforms in relation to bullying, insulting or other behaviour likely to intimidate or humiliate. It is difficult to overstate the importance of tackling bullying and offensive behaviour online. Although social media has brought about transformative and significant changes for the good, it has also facilitated an exponential increase in bullying. It is estimated that seven in 10 young people have experienced cyber-bullying, with 37% of those people experiencing it frequently. Cyber-bullying can lead to anxiety, depression and even suicide.
This is the first time that social media providers will be subject to legislation on this issue. They will be required to have processes in place for reporting and responding to complaints about bullying. As the Minister said, some providers have taken steps to address these issues, but the pace of change has to keep up with the scale of the problem. It is absolutely right that the Government have taken decisive legislative action to make the internet a safer place for its users. I would be grateful if the Minister would confirm that there will be full public consultation when drafting the code of conduct.
On public service broadcasting prominence, we are happy to support Government amendment (a) in lieu of Lords amendment 242, which requires Ofcom regularly to review electronic programme guides in relation to public service broadcasting and the implications of changing technology for it. We are pleased that the Minister has confirmed that any necessary powers will be transferred to Ofcom, should it be required to intervene.
We are delighted that, after many years of campaigning, not least by my hon. Friend the Member for Washington and Sunderland West (Mrs Hodgson), significant progress has been made on efforts to tackle abuses in the secondary ticket market. Fans across the country will be thanking her, the Minister and all those involved in the campaign, but we recognise there is still more to do and that the Waterson review must be implemented in full in the next Parliament. We are pleased that the Minister has again seen sense by accepting Lords amendments on e-lending and on-demand accessibility.
The Bill has been improved significantly and it has been a privilege to enter negotiations with the Government. It has also been a privilege to negotiate with the Minister, as he said it had been to negotiate with me. However, I must say that this Bill is not legislation for the digital economy. The tech sector waited eagerly for well over a year for the Government’s strategy and vision for this crucial area of our economy. To say that it was disappointed with the lack of ambition and strategic direction in the Bill and the Government’s eventual strategy would be a gross understatement. Our burgeoning digital economy is the largest in the world, growing at a rate that we could hardly have expected even a decade ago, but after seven years of a Conservative Government, 12 million people still lack basic digital skills.
Some 3 million homes and businesses do not have access to superfast broadband. Britain does not even feature on the fibre broadband league table, and our 4G mobile coverage lags firmly behind that of our major competitors. Too often, workers find themselves overworked, underpaid and exploited by bosses they never meet who do not even fulfil their basic duties as an employer. People across the country suffer from digital exclusion because our infrastructure is second-rate and our digital skills programme is well behind the times. Now should have been the moment to lay the foundations for not just a world-leading digital sector, but a truly world-leading economy with digital inclusion at its heart. Those foundations must be built on the responsibilities of employers towards the burgeoning workforce, of the digital giants to their users, and of the Government to create the environment in which digital can transform the economy.
Although the Bill undoubtedly brings forward some welcome changes, it has revealed an alarming lack of ambition for the country and a worrying indication of the Government’s priorities in relation to tech as we Brexit. I can assure the House that come 9 June, when I will be preparing to take the Minister’s place, it will be the Labour party that will have the ambition and vision on infrastructure, skills and finance, and that will champion this sector, which is essential to the UK’s ability to thrive post-Brexit and for us to deliver the high-skilled, well-paid jobs that areas of the country such as mine so desperately cry out for. We welcome the improvements that have been made in the Bill, but I hope that, however the next Parliament looks, our digital economy will be given far greater prominence and priority.
(8 years ago)
Public Bill Committees
I would like to put on the record again that this Bill was clearly not ready for Committee. We have just seen another example of an amendment that was completely uncalled for. In the last part, amendments that were incorrect had to be withdrawn. I hope that the proposals are properly examined in the Lords and that this is not a recurring theme throughout future legislation that this Government introduce. It is very disappointing to see the lack of preparation for this Bill.
The hon. Lady is doing a marvellous job for her Front-Bench team, but having sat through several Bill Committees, I assure her that this situation is not particularly unusual. What is important is getting the Bill absolutely right and making sure that we use this opportunity to scrutinise it. We should proceed in the spirit of us all wanting the best thing and stop taking pops at the drafting team.
I am assured by my hon. Friend the Member for Cardiff West that this was not common practice under the last Labour Government, and I am horrified to hear that it has been common practice over the past couple of years.
(8 years, 1 month ago)
Public Bill Committees
I am pleased to hear that the Minister shares the view that the BBFC should be given a permissive regime to do some of the things it does well, rather than the Government specifying too much. With that assurance, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 85, in clause 15, page 18, line 20, leave out subsection (5)(a).
The amendments all explicitly include on-demand programme services in the age verification measures proposed by the Government. Given the rise in the use of mobile devices and tablets in the past decade, the case for appropriate online pornography enforcement has increased. We commend the Government’s intention in the proposals. I also put on the record our thanks and congratulations to the hon. Member for Devizes, who has campaigned on this issue for many years along with many other hon. Members, not least my hon. Friend the Member for Bristol West.
The ultimate goal is to seek parity of protection for children between the online and offline worlds, but how that is done in practice is fraught with issues. I hope that we can improve the proposals before us. Teens have an emerging right to independent communication with friends and family, and we recognise and respect that. We must not fall back on outdated means of protection such as blanket parental permissions. We need to empower and protect young people in ways that make sense to them and that they can and will use.
As the Committee knows, the effects of online pornography on unhealthy attitudes to sex and relationships are only just starting to be explored, but the research indicates a troubling trend. The NSPCC study of more than 1,000 young people aged 11 to 18 found that over half the sample had been exposed to online pornography, and nearly all of that group—94%—had seen it by age 14. Just over half the boys believed that the pornography that they had seen was realistic, and a number of girls said that they worried about how it would make boys see girls and the possible impact on attitudes to sex and relationships. One respondent said:
“Because you don’t get taught how to go on the internet and keep yourself safe, there are loads of tricks to get you to give away or to go on a bad website.”
Crucially, in research by Barnardo’s, four fifths of teenagers agreed that it was too easy for young people to see pornography online by accident.
Adult products and spaces, including gambling shops, sex shops and nightclubs, are restricted in the offline sphere. Content such as film and television, advertising and pornography is also limited, with penalties ranging from fines to custodial sentences available against proprietors who do not comply. It is a transparent, accountable process overseen by regulators and licensing authorities such as Ofcom, the BBFC and the Gambling Commission to ensure that children are protected from age-inappropriate content and experiences.
Labour is happy to support the Government’s efforts to introduce age verification, but we must ensure that enforcement is strong enough. Our amendment speaks to that broad aim of the Opposition, which I know is supported by Government Back Benchers, given the other amendments tabled today. However, the measure cannot be seen as a silver bullet, which is why tacking this manifesto commitment on to a Digital Economy Bill is inadequate. First, slotting it into a Bill on the digital economy gives the impression, however unintentional, that the measure is designed to deal only with commercial providers of pornography, those who exploit data or benefit from advertising or subscription services—those who are, in short, part of the digital economy, rather than all providers of pornography online.
Although we are aware that most pornography providers operate on a commercial basis, many do not. Peer-to-peer networks and Usenet groups, however difficult to police, would presumably not be in the scope of the Bill. That is on top of pornography available through apps that are commercial enterprises, such as Twitter and Tumblr, or free webpages, such as WordPress, where the provision of pornography is incidental or provides no income to the overall business, or is not used for commercial purposes at all. Under clause 15 as it stands, it is by no means clear that all pornography available on the internet will be subject to age verification requirements.
Allow me to remind the Minister what the Conservative party manifesto said on the matter in 2015. It stated that
“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material”.
There is no prevarication or equivocation there, and I commend the wording in the manifesto. Unfortunately, between that time and the legislation being drawn up, a rogue adjective has been added to the commitment, which seemed perfectly clear in the manifesto. One could easily argue that if a site such as Tumblr does not make pornography available on a commercial basis, then it is exempt, which would leave that manifesto commitment in some difficulty. Can we therefore have a commitment from the Minister that the regulator will be able to go after all sites containing pornographic material and not just those operating on a commercial basis, however broadly we may want to define “commercial”? The word seems at best unnecessary, and at worst a breach of the manifesto commitment.
Slotting age verification into the Bill gives Members nothing like the scope needed to tackle the effect of under-age viewing of pornography, which is surely the intention behind its implementation, because the measure is not enough to protect children. For a start, the regulator should also be responsible for ensuring that services undertake self-audits and collect mandatory reports in relation to child abuse images, online grooming and malicious communication involving children. To ensure that services are working to consistent principles and to best support the collection and utilisation of data, the regulator should also be responsible for developing a definition of child abuse.
We need to improve reporting online. Children and young people are ill served by the currently inadequate and unreliable reporting systems when they experience online abuse. Reporting mechanisms need to be standardised, visible and responsive, and they must act rapidly to address issues. Every reporting mechanism must be designed in ways children say they can and will use. The NSPCC found that 26% of children and young people who used the report button saw no action whatever taken in response to their complaint; and of those who did get a response, 16% were dissatisfied with it. The Government should include independent mediation and monitoring of responses to complaints.
Clearly, we need compulsory sex education in our schools. Compulsory age-appropriate lessons about healthy relationships and sex are vital to keeping children safe on and offline. We know that children are exposed to pornography, sometimes in an extreme or violent form. Alongside regulation to limit access to these materials, building resilience and instilling an early understanding of healthy relationships can help to mitigate the impact of that exposure.
On that point, we are incredibly keen to ensure that legislation is as clear as possible and that any potential loopholes are closed. One such loophole is clause 15(5)(a), which for reasons that are unclear excludes on-demand programme services. Explicitly excluding any on-demand programme service available on the internet in the Bill—although we are aware that they are regulated by Ofcom—risks on-demand programme services being subject to a much looser age verification requirement than the Bill would enforce on other pornography providers. We do not believe that the legislation intends to create two standards of age verification requirements for online content, regardless of whether it is separately regulated. The amendment is intended to close that loophole.
I will speak to amendments 85 and 87. I raised a question with David Austin last week about the regulation of video on demand. He confirmed that the intention of the Bill as it stands is to maintain the regulation of UK video on demand with Ofcom under the Communications Act 2003. That seems totally reasonable to me because Ofcom has done a good job. I think the issue is that the framework only requires age verification for R18 material.
I am not trying to give everyone a lesson—by the way, this is why we are so grateful to the BBFC; it gives very clear definitions of the material—but R18 is effectively hardcore porn. It contains restricted scenes that we would all consider to be pornography. Since 2010, the 18-certificate guidelines have permitted the depiction of explicit sex where it is exceptionally justified by context, so it is perfectly feasible for children to view 18-rated content that we would all consider to be pornographic. I fully agree with the sentiment behind amendments 85 and 87 to provide a level playing field for all online media, but we must ensure that all R18 and 18 content accessed through video-on-demand services is included in the provisions. However, removing clauses 15(5)(a) and 16(6) would cause a fair amount of confusion, as video-on-demand services would be regulated by Ofcom for the majority of the time but, for age verification matters, by both the BBFC and Ofcom, which raises the question of who has precedence and how enforcement would work.
I have therefore tabled new clause 7, which would meet the same objective in a slightly different way by amending the current regulatory framework for video on demand to ensure that children are protected from 18-rated as well as R18-rated on-demand material. The relevant section of the Communications Act 2003, section 368E, was amended by the Audiovisual Media Services Regulations 2014 to specify that R18 material should be subject to age verification to protect children. It is not a big step to require 18-rated pornographic material, which is the subject of much of this part of the Bill, to be included within the scope of that section. That would effectively create a legal level playing field. It would remove the issue of parity and precedence and would give us parity on the fundamental issue of the protection of children.
I agree with much of what the hon. Member for Sheffield, Heeley said. Ofcom’s latest figures on children and the media show that 51% of 12 to 15-year-olds watched on-demand services in 2015. The viewing of paid for on-demand content has gone up and accounts for 20% of viewing time for young people aged 16 to 24. They can view content rated 18 or R18 that would be prohibited for some of them if they were to purchase it in the offline world. With new clause 7, I recommend that the Government should try to ensure parity between the online and offline worlds. This Bill is a brilliant way to ensure that there is parity in the way that pornographic content is accessed.
(8 years, 1 month ago)
Public Bill Committees
Quite a lot of clarification is needed, and I hope it will come during the Bill’s passage. I do not think that the distinction between Ofcom and the BBFC is clear in this part of the Bill or in later clauses on enforcement. However, given that it states elsewhere in the Bill that the proposal is subject to further parliamentary scrutiny, and as the BBFC has not yet officially been given the regulator role—as far as I am aware—I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 66, in clause 15, page 18, line 24, at end insert
“or an internet service provider.”.
This amendment and amendment 67 ensure that the requirement to implement age verification does not fall on ISPs but commercial sites or applications offering pornographic material; and defines internet service providers.
Because various other aspects of the Bill capture ISPs. My concern is that the Bill focuses on the commercial content providers, wherever they are. The amendment is intended to probe the Government about how they are thinking about ISPs vis-à-vis commercial content providers in the drafting of the clause.
Our amendments are designed to enable the regulator to ask the internet service provider to block offending sites. This goes back to the point we made earlier on the differences between sites operated “on a commercial basis” and social media sites and ancillary sites. The proposals as they stand do not give the regulator sufficient powers to enforce the mechanisms proposed in the Bill.
Broadening the definition of “ancillary service provider” specifically to include internet service providers would require the regulator to notify them of non-compliant sites. That will put ISPs in the same bracket as payment service providers, which will be required to withdraw their services if other measures have been exhausted. In the case of ISPs, they would be required to block offending sites.
The amendments would create a simple backstop power where enforcement through the Government’s proposals had not achieved its intended objective and commercial providers had not withdrawn their services, either because the fine does not act as a deterrent or because, due to their international status, they do not need to comply. If pornography providers continued to provide content without age verification restrictions, the regulator would then have the power to require ISPs to take down the content.
We believe that, without amendment, the proposals will not achieve the Bill’s aim, as non-compliant pornographers would not be absolutely assured of payment services being blocked. First, the proposals do not send anywhere near a strong enough signal to the porn industry that the Government are serious about the proposals and their enforcement. Giving the regulator the power but not the stick suggests that we are not all that bothered about whether sites comply. Secondly, we can have no reassurance that sites will be shut down within any kind of timeframe if there is non-compliance. As drafted in the explanatory notes, “on an ongoing basis” could mean yearly, biannually or monthly, but it makes a mockery of the proposals if sites could be non-compliant for two years or more before payment services may or may not act. That does not provide much of an incentive to the industry to act.
Throughout the evidence sessions we heard that there are significant difficulties with the workability of this entire part of the Bill. For instance, many sites will hide their contact details, and a substantial number will simply not respond to financial penalties. Indeed, an ability already exists in law for ISPs to be compelled to block images that portray, for example, child sex abuse. There is also an ability to block in the case of copyright infringement. It therefore seems eminently reasonable that in the event of non-compliance, the regulator has a clear backstop power. We believe that even just legislating for such a power will help speed up enforcement. If providers know that they cannot simply circumvent the law by refusing to comply with notices, they will comply more efficiently. That will surely help the age verifier to pass the real-world test, which is integral to the Bill’s objectives.
I thank the Minister for that intervention. We will return to this subject in a series of amendments around clause 20. I want to thank the Minister for clarifying some of the murkiness around definitions in the Bill. I want to ask him and his team, though, to consider what his colleague had said, which goes back to the net neutrality point.
I accept what the Minister says about the spirit being absolutely clear, that our current filtering regime will not be captured, but Baroness Shields did say that we needed to legislate to make our filters regime legal. I did not hear from the Minister that that legislation is something that the Department is preparing or planning to introduce.
We very much share the hon. Lady’s concerns that the legislation has explicitly excluded the ability of internet service providers to block. We simply cannot understand why the Government have ruled out that final backstop power. We appreciate it is not perfect but it would give the regulator that final power. We will return to new clause 11 at the end of the Bill and be pushing it to a vote when we come to it.
I thank the hon. Lady for making her intentions clear. I am prepared to withdraw or not push my new clause to a vote on the basis of what the Minister said, but I would love to get his assurances—perhaps he will write to me—to be crystal clear on the fact that he believes the Government do not have to legislate in order to push back on the net neutrality regime.
Given the Brexit vote, I would be inclined to accept a letter from the Minister suggesting that we will absolutely resist any attempt to make EU net neutrality apply to what is a very fine, though not perfect, voluntary regime. On that basis, I accept the Minister’s assurances that that is what he intends to do. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 15 ordered to stand part of the Bill.
Clause 16 ordered to stand part of the Bill.
Clause 17
The age-verification regulator: designation and funding
Question proposed, That the clause stand part of the Bill.
In this and related clauses, we seek to strengthen the proposals that the Government have put forward. We have said that the regulation needs to be beefed up to require internet service providers to be notified about non-compliance. We would like to see an injunction power to take down any content which a court is satisfied is in breach of the age-verification legislation, as soon as possible, at the start of the four-tier regulation process the Government have identified in their amendments and letters published to the Committee last week.
That would require a regulator with sufficient enforcement expertise and the ability to apply that injunction and push enforcement at an early stage. As we are aware, however, the BBFC heads of agreement with the Government do not cover enforcement. Indeed, the BBFC made it perfectly clear that it would not be prepared to enforce the legislation in clauses 20 and 21 as they stand, which form stage 4 of that enforcement process, giving the power to issue fines. The BBFC is going to conduct stages 1, 2 and 3 of the notification requirements, presumably before handing over to a regulator with sufficient enforcement expertise, but that has not been made clear so far.
While we welcome the role of the BBFC and the expertise it clearly brings on classification, we question whether it is unnecessarily convoluted to require a separate regulator—one that has so far not been mentioned in the legislation—to take enforcement action that will effectively have been begun by the BBFC. This goes back to the point my hon. Friend the Member for Cardiff West made earlier about the two separate regimes for on-demand programme services.
As I understand it, although it is not clear, the BBFC will be taking on stage 3 of the regulation, meaning that it will be involved in the first stage of enforcement—notification. That is fine, but it will then have to hand over the second stage of enforcement to another regulator—presumably Ofcom. The enforcement process is already incredibly weak, and this two-tiered approach involving two separate regulators risks further delays in enforcement against non-compliant providers who fail to put protections in place or to take down material that is in breach of the law. In evidence to the Committee, the BBFC said:
“Our role is focused much more on notification. We think we can use the notification process and get some quite significant results.”—[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q83.]
We do not doubt it, but confusion will arise when the BBFC identifies a clearly non-compliant site that is brazenly flouting the law, and it does not have the power to enforce quickly but will have to hand the matter over.
We would also like to hear when the Government are planning to announce the regulator for the second stage and how they intend to work with the BBFC. As far as I can see, this will require further amendments to the Bill. If it is Ofcom, it would have been helpful to have heard its views on what further enforcement powers it would like to see in the Bill, rather than being asked to fill in after the Bill has passed through Parliament. There is a clear danger that the enforcement regulator could be asked to take over enforcement of age verification, which it thinks requires more teeth to be effective.
We therefore have very serious concerns about the process by which clause 17 will have effect. Although we will not vote against the clause, we want to make it very clear that we would have preferred to see an official announcement about who will carry out the enforcement provisions in the Bill before being asked to vote on it.
This is a series of consequential and investigatory amendments intended to probe the Minister’s thinking about what the regulator can actually do. At the moment, enforcement operates through a series of financial penalties, which we can discuss further when we debate clause 21, or of enforcement notices. We heard clearly last week from David Austin that the challenge is that almost none of the content-producing sites that we are discussing are based in the UK; in fact, I think he said that all the top 50 sites that the regulator will rightly target are based overseas.
The challenge is how the Government intend to carry out enforcement. I know that the BBFC’s current enforcement role is not carried out through its own designated powers; it is carried out through various other agencies, and the Bill makes further provision for financial penalties. I tabled the amendments to press the Minister on the point that it would be clearer to specify that where a site, or the company that owns a site, is based in the UK, a financial penalty can and will be applied.
For overseas sites, enforcing a financial penalty, if one can even get to grips with what the financial accounts look like, may be difficult, hence the enforcement notice and then a series of other potential backstop actions; I know that the Minister is aware that I do not feel that we have exhausted the debate on blocking. I am trying to probe the Government on whether there is a way to use the Bill to reflect the reality that content providers are unlikely to be based primarily in the UK, and that perhaps a different approach is needed for those based offshore.
We completely support the hon. Lady’s amendments, which propose a sensible toughening up of the requirements of the age verification regulator. We particularly welcome the measures to require the regulator to issue enforcement notices to people outside the UK if they do not comply. That is an attempt to close a large hole in the current proposals. How will the BBFC tackle providers outside the UK?
At the evidence session last week, David Austin said that
“you are quite right that there will still be gaps in the regime, I imagine, after we have been through the notification process, no matter how much we can achieve that way, so the power to fine is essentially the only real power the regulator will have, whoever the regulator is for stage 4”;
we are not yet certain.
He continued:
“For UK-based websites and apps, that is fine, but it would be extremely challenging for”
the BBFC, Ofcom or whoever the regulator is for stage 4
“to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator’s arsenal.”––[Official Report, Digital Economy Public Bill Committee, 11 October 2016; c. 41, Q91.]
That is precisely why we will return to the amendment on ISP blocking, because if we are to pursue foreign-based providers, the ability to block will be integral to that strategy.
I thank the Minister for that clarification and for the mention of support. The intention was to help to provide a practical solution rather than to cut across his aims. He has persuaded me that I do not need to press the amendment to a vote. Although I take the point about shared regulation, I would ask him to consider, in setting up the BBFC as the primary regulator, that while it works reasonably well in the video-on-demand world, this may have it stray into a new sphere of expertise in finding, identifying and sending out enforcement notices or penalties, particularly to foreign-based companies. I think the whack-a-mole analogy is entirely apt—they will shut their doors and reopen in another jurisdiction almost overnight. Given the anonymity principles, it is sometimes almost impossible to know where they actually are. If the Minister is assuring us that everyone is aware of the problem, that he believes the powers allow the regulator to be flexible, and that this is something his Department will consider, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 86, in clause 20, page 21, line 40, leave out paragraph (b) and insert—
“(b) during the initial determination period fix the date for ending the contravention of section 15(1) as the initial enforcement date.”.
I will not test the Committee’s patience further by going over arguments that we have already had, but there is one further area of clause 20 that we wish to touch on—the lack of an appeals process in the legislation. The Minister may expect the regulator to build that appeals process in: it would be helpful to have some clarity from him on that.
As I understand it, the BBFC will use analytics to identify sites that should have age verification. Analytics are not foolproof, so obviously an appeals mechanism will be needed for websites incorrectly prevented from operating. Previous such systems have wrongly filtered out websites such as breast cancer charities or forums for gay and transgender people. That is incredibly important: let us put ourselves in the shoes of a young gay man or woman, growing up in a religious household perhaps, who does not know where to turn to ask the questions that would plague any teenager coming to terms with their sexuality and who seeks refuge and solace in internet forums with other people going through the same issues. As risky as the internet can be, it can also be an incredibly empowering, transformative space that can literally save lives in such situations. Such lifelines must absolutely not be filtered out by ASPs or made subject to age verification; the Bill should include a mechanism that allows for correction when they have been mistakenly identified.
We also need clarification on who will develop the analytics, the data they will be based on and whether it will be done in consultation with the tech industry. We can only assume that this is an oversight that will be corrected when working out how the regulator is to proceed.
The hon. Lady raises an important point about access to information about sex education, sexuality, abortion and all sorts of things that are incredibly valuable. She is right to draw attention to safe forums. I reassure her that many of the same issues came up with respect to the question of voluntary filtering and, despite what some of those giving evidence said, the incidence of false blocking of such valuable sites is incredibly low. The BBFC as regulator is really good: it is not in the business of defining based on imagery, and it has fairly detailed algorithms. I share her concern, but I want to offer some comfort.
I am grateful. I heard the BBFC or the Open Rights Group say that the incidence was very low, but it would do no harm to build an appeals process into the legislation to ensure that where sites that should not be blocked or made subject to age verification have fallen through the cracks, the error can be resolved at the behest of the regulator.
I rise to speak to new clause 18, which stands in my name and that of my hon. Friend the Member for Cardiff West. I also support the amendments tabled by the hon. Member for Devizes. The Government’s proposals really do rely on an awful amount of good will among all the stakeholders involved in the legislation. It makes sense to create a backstop power for the regulator to require payment services to act should they not do so in the first instance.
New clause 18 comes from a slightly different perspective. It would oblige the age-verification regulator to ensure that all age verification providers—the companies that put the tools on websites to ensure compliance—are approved by the regulator; to perform a data protection impact assessment that they make publicly available; and to perform an array of other duties as well.
The new clause is designed to address some of the concerns about the practicality of age-verification checks, ensuring that only minimal data are required and kept secure; that individuals’ privacy and liberties are protected; and that there is absolutely no possibility of data being commercialised by pornographers. We raise the latter as a potential risk because the proposals were drafted with the input of the pornography industry. That is understandable, but the industry would have a significant amount to gain from obtaining personal data from customers that might not currently be collected.
As we said earlier, we have full confidence in the BBFC as regulator, but, as with the proposals in part 5 of the Bill, it is vital that some basic principles—although certainly not the minutiae—are put on the face of the Bill. We are certainly not asking anything that is unreasonable of the regulator or the age-verification providers. The principles of privacy, anonymity and proportionality should all underpin the age-verification tool, but as far as I am aware they have not featured in any draft guidance, codes of practice, or documents accompanying the Bill.
The Information Commissioner agrees. The Information Commissioner’s Office’s response to the Department for Culture, Media and Sport’s consultation on age verification for pornography raised the concern
“that any solution implemented must be compliant with the requirements of the DPA and PECR”—
the Data Protection Act 1998, and the Privacy and Electronic Communications (EC Directive) Regulations 2003 that sit alongside it. It continues:
“The concept of ‘privacy by design’ would seem particularly relevant in the context of age verification—that is, designing a system that appropriately respects individuals’ privacy whilst achieving the stated aim… In practical terms, this would mean only collecting and recording the minimum data required in the circumstances, having assessed what that minimum was. It would also mean ensuring that the purposes for which any data is used are carefully and restrictively defined, and that any activities keep to those restricted purposes…In the context of preventing children from accessing online commercial pornography, there is a clear attribute which needs to be proven in each case—that is, whether an individual’s age is above the required threshold. Any solution considered needs to be focussed on proving the existence or absence of that attribute, to the exclusion of other more detailed information (such as actual date of birth).”
The Commissioner made it clear that she would have
“significant concerns about any method of age verification that requires the collection and retention of documents such as a copy of passports, driving licences or other documents (of those above the age threshold) which are vulnerable to misuse and/or attractive to disreputable third parties. The collection and retention of such information multiplies the information risk for those individuals, whether the data is stored in one central database or in a number of smaller databases operated by different organisations in the sector.”
I understand that the Adult Provider Network exhibited some of the potential tools that could be used to fulfil that requirement. From the summary I read of that event, none of them seem particularly satisfactory. My favourite was put forward by a provider called Yoti, and the summary I read describes the process for using it as follows:
“install the Yoti App…use the app to take a selfie to determine that you are a human being…use the app to take a picture of Government ID documents”—
passport or driving licence, I imagine—
“the app sends both documents to Yoti…Yoti (the third party) now send both pictures to a fourth party; it was unclear whether personal data (e.g. passport details) is stripped before sending to the fourth party…Fourth party tells Yoti if the images (selfie, govt ID) match…Yoti caches various personal data about user”
to confirm that they are over 18. The user can then visit the porn site—whatever porn site they would like to visit at that time—and then the
“porn site posts a QR-like code on screen…user loads Yoti app…user has to take selfie (again) to prove that it is (still) them…not a kid using the phone…user scans the on-screen QR-code, is told: ‘this site wants to know if you are >18yo, do you approve?’…User accepts…Yoti app backchannel informs porn site…that user >18yo”
and then the user can see the pornography.
I do not know whether any Committee members watch online pornography; I gather that the figure is more than 50% of the general population, and I am not convinced that hon. Members are more abstinent than that. I ask Members to consider whether they would like to go through a process as absurd as the one suggested.
The hon. Lady has got ahead of the potential Daily Mail headline when the freedom of information request comes in for her Google search history.
I am not convinced that anybody would want to go through a process such as the one I have just described, or even one significantly less convoluted. I suggest that instead they would seek entertainment on a site that did not impose such hurdles. The BBFC in its evidence made the telling point that the majority of the viewing population get their content from the top 50 sites, so it is very easy to target those—we see that entrenched in clause 23. The problem with that, as my hon. Friend the Member for City of Chester pointed out, is that targeting those sites may push viewers to the next 50 sites, and so on. We therefore need to ensure that the process is as straightforward and as minimal as possible.
That is absolutely right, and I will come to that point. We heard evidence from the BBFC that it intended potentially to use age-verified mobile telephony to ensure that sites are properly age verified, but I am afraid that that approach is also flawed. First, there is the obvious issue that there is nothing to stop an underage child using the information attached to that phone—be it the phone number or the owner’s name—to log on and falsely verify. Equally, there are enormous privacy issues with the use of mobile-verified software to log on.
The BBFC said clearly that it was interested not in identity but merely in the age of the individual attempting to access online pornography, but as we all know, our smartphones contain a wealth of information that can essentially be used to create a virtual clone. They are loaded with our internet conversations, financial data, health records, and in many cases the location of our children. There is a record of calls made and received, text messages, photos, contact lists, calendar entries and internet browsing history—the hon. Member for Devizes may want to take note of that—and they allow access to email accounts, banking institutions and websites such as Amazon, Facebook, Twitter and Netflix. Many people instruct their phones to remember passwords for those apps so they can quickly be opened, which means that they are available to anyone who gets into the phone.
All that information is incredibly valuable—it has been said that data are the new oil—and I imagine that most people would not want it to be obtained, stored, sold or commercialised by online pornography sites. The risks of creating databases that potentially contain people’s names, locations, credit card details—you name it—alongside their pornographic preferences should be quite clear to anyone in the room and at the forefront of people’s minds given the recent Ashley Madison hack. I am not condoning anyone using that website to look for extramarital affairs, nor am I privileging the preferences or privacy of people who wish to view online pornography over the clearly vastly more important issue of child protection. However, one consequence of that hack was the suicide of at least three individuals, and we should proceed with extreme caution before creating any process that would result in the storing of data that could be leaked, hacked or commercialised and would otherwise be completely private and legitimate.
That is the reasoning behind our reasonable and straightforward amendment, which would place a series of duties on the age-verification regulator to ensure that adequate privacy safeguards were provided, any data obtained or stored were not for commercial use, and security was given due consideration. The unintended consequences of the Government’s proposals will not be limited to issues of preferences, privacy or security, but will include pushing users on to illegal or at the very least non-compliant sites. We are walking a thin tightrope between making age verification so light-touch as to be too easily bypassed by increasingly tech-savvy under-18s and making it far too complicated and intrusive, thereby pushing viewers on to either sites that do not use age verification but still offer legitimate content or completely illegal sites that stray into much more damaging realms. These provisions clearly require a lot more consultation with the industry, and I am confident that the BBFC will do just that, but the Opposition would feel a lot more confident and assured if the regulator were required to adhere to these basic principles, which we should all hold dear: privacy, proportionality and safety.
The hon. Lady rightly gets to the great concern that somehow, in doing something good, an awful lot of concern can be created, and I am sympathetic to her points. I remind her that it is not as if these sites do not know who is visiting them anyway. One of the great conundrums on the internet is that every single keystroke we make is tracked and registered. Indeed, that is why shopping follows us around the internet after we have clicked on a particular site. Unless people are very clever with their private browsing history, the same is the case for commercial providers.
I absolutely support the Government’s intention here. We just want to ensure it is done in the right way and balances both sides of the argument. I think it is absolutely right that internet service providers are offering this filter, but does the hon. Lady share my concern that very few families take it up and very many families turn it off?
There are Ofcom data. One of the requirements we asked for was for Ofcom to monitor. Take-up improved, and, as I said, some internet service providers now have an automatic “on” system, whereby a person has to intervene to take the filters off. I am told that only about 30% of families choose to do so. Here is the savvy thing: we all know that people live in households with multiple ages and multiple requirements on the internet, so many ISPs now offer a service that enables people to disable the filters for a period and automatically reinstate them the following day. They do not have to do anything if they want the filters to be in place, but they might want to access over-18 content as an adult.
I want to discuss some of the other issues that have come up in this conversation, in the process of finally speaking about these amendments. Is it in order to do so, Mr Stringer?
That is a very reassuring reply and I thank the Minister for it. We have had a very good debate. I know that his officials will be listening and thinking hard about what has been said, and I do not think it would serve the Committee any purpose to press my amendments or my new clause to a vote.
I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
It was interesting to hear the Minister refer to financial regulations. I was not present on Second Reading because I was not then in the position that I occupy now, but having read that debate I do not believe that there was any such reference. So we would like some clarity on who will be the regulator of the payment service providers and what work has already been done with the Financial Conduct Authority—I assume it will be with the FCA in this circumstance—to ensure that it will be regulating those providers, to make sure that they act with speed and due diligence on receiving notification from the age verification regulator under clause 15.
It is disappointing that the Government do not consider new clause 18 necessary to amend the Bill. I appreciate that the BBFC has been given powers to establish a code of practice, but given the very serious consequences that could result from that not being done correctly, some basic principles need to be embedded into the process, based on the issues that I raised earlier in our discussion.
I will just add that we will return to this issue on Report.
I promise this will be the last time I speak today. I am afraid I have had a slight change of heart. I tabled this amendment because of the many points that have been raised today about the difficulty of focusing the BBFC’s efforts, given that much of this traffic is not simply going to the larger websites. As we have heard, many other free sites are providing such material. However, on re-reading my amendment, I have decided that it is almost a vote of no confidence in the BBFC’s ability to be flexible, and I would therefore like to withdraw it.
New clause 12 would give the power to the age verification regulator to introduce another code of practice—the Opposition are very fond of them—for internet content providers. [Interruption.] And reviews, we are very fond of reviews.
We have made it clear throughout that we want enforcement to be as tough as possible and for all loopholes to be closed, but we also want to ensure that children are as safe in the online world as they are offline. There absolutely needs to be that parity of protection. That is one reason why we are disappointed, as I mentioned, that these measures came forward in a Digital Economy Bill, where it was incredibly difficult to look at the issues of child protection online in a thoroughly comprehensive way.
The new clause proposes that the regulator should work with industry to create a statutory code of practice, based on BBFC guidelines for rating films and the principles of the ICT Coalition for Children Online. The code would establish a set of minimum standards that would apply consistently to social networks, internet service providers, mobile telecommunication companies and other communication providers that provide the space and content where children interact online.
This is not intended to be an aggressive, regulatory process. We envisage that it will be the beginning of a much broader debate and conversation between regulators and content providers about just how we keep our children safe on the web. This debate will encompass not only ideas such as panic buttons, but education about the online world, which must run in parallel for any process to be effective.
A statutory code would work with providers to lay out how content is managed on a service and ensure that clear and transparent processes are in place to make it easy both for children and parents to report problematic content. It would also set out what providers should do to develop effective safeguarding policies—a process that the National Society for the Prevention of Cruelty to Children has supported.
As I said, this will clearly be a staged process. We envisage that in order to be effective, the development of a code of practice must involve industry, child protection organisations such as the NSPCC and, crucially, the children and families who use online services. But this code of practice would be based on existing industry and regulatory minimum standards and would require providers to ensure that the safety and wellbeing of children is paramount in the design and delivery of their products and services. The new clause would also empower the Secretary of State to make regulations to ensure effective enforcement of the minimum standards in the code of practice.
The online world can be an enormously positive force for good for our children and young people. It makes available a scale of information unimaginable before the internet existed and there is compelling evidence that that constant processing of information will lead to the most informed generation of children the world has known, but it needs to be made safe to realise that potential. The new clause would give assurance to Opposition Members that we will enable that to happen.
I do recognise that. My point is that making non-statutory guidance statutory will not help in that space, but there is clearly much more to do. I hope that, with that assurance, my hon. Friend the Member for Devizes will withdraw the amendment.
I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
This is a very curious clause, which renders much of the well-informed—as the Minister said—and useful discussion that we have had today about enforcement, targeting smaller providers and restricting access across the web completely and utterly redundant. If the clause as I read it goes forward unamended, it will provide the regulator with the ability to target only the largest providers of online pornography, perhaps even limiting it to targeting only them.
As we have discussed at length, this is an incredibly difficult area to police, which I appreciate. It is obviously going to be far easier to tackle the 50 largest providers, not least because I assume many of them are already providing some level of age verification and are probably more at the responsible end of online pornography content providers. I would remind the Committee of the Conservative party’s manifesto, which said:
“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material”.
That makes no reference to commercial providers, or to whether the provider has a large or small turnover or is on WordPress, Tumblr, Twitter, Facebook or Snapchat. Today's debate has very much suggested that the role of the regulator will be to focus on those sites that are operated on a commercial basis. Given the Minister's reluctance to implement internet service provider blocking, I do not believe that the manifesto commitment will be achieved.
(8 years, 1 month ago)
Public Bill Committees
Q Baroness Harding, should the USO not have been an open tender process? If it had been, would it not have been right for it to have gone to more than one contractor, given the differences between the problems in inner city areas and those in rural areas?
Baroness Harding: Yes, maybe. I presume that you refer to the BDUK process that has taken place. I am actually very supportive of a universal service obligation. I do not agree with Sean Williams that 10 megabits will be sufficient as we look forward; it is very dangerous to try to set that number through primary legislation because technology is moving so fast. I fear that the rural communities who are furious that they do not have 10 meg today will be furious that they do not have 1 gigabit in three or four years’ time. I think you should be more ambitious, otherwise the political problem will never go away.
In terms of how then to get value for money for any form of Government subsidy, taxpayers’ money or levy going towards the final few per cent., I agree with the premise of your question. The more competition there is, the better, and it is a huge shame that there was none in the last process. To be fair to the Government of the time, I do not think that was because of how it was designed. The good news is that the market has changed quite a lot since then, and there are now a number of quite small providers building proper fibre-to-the-premises 1 gig services in rural areas, such as Gigaclear. I would be much more hopeful that, looking forward, it will be possible to design a process that is not reliant on one large incumbent.
Q As you know, I represent a very rural constituency. I support what has happened; it is clearly far better than it was five years ago. However, what happens if no USO provider is willing to come forward to deal with the last 500 houses in the Devizes constituency? What should happen then?
Q Why should we be limiting ourselves to something that is barely sufficient now? What changes could we see in the Bill that would give us anything like the connectivity that Mr Wheeldon just mentioned?
Paul Morris: You have to make sure that the USO does not get in the way of future ambition. We have to think about how we move from what we have today, which is largely a copper and fibre mix, with the exception of Virgin. We still have telephone lines running broadband, essentially; as David says, we have to move on and be more ambitious. The point is to make sure that the USO does not get in the way of that ambition to do better and to use fibre for homes and businesses. We should make sure that the smaller networks have an option to be involved in the USO, and, if they have the ambition, that they know that a USO provider is not going to over-build them.
There is a lot to be done outside the legislation, and clearly we do not need to repeat the mistakes of BDUK. We need to know where the assets are, who can do the work and where the green cabinets are. It needs to make sense, and we need to have some kind of register. We need a practical approach, and money needs to follow results—not the other way round, which was the other issue with BDUK. We can learn from the issues of the past, and we need to make sure that this USO does not get in the way of what we need to do next, which is to have much more fibre in the ground across the whole country.
Q I represent a fairly rural constituency and I was interested to know what would happen if no USO provider came forward to do the right thing. What should happen in that case? How will the Government be able to mandate that provision?
Daniel Butler: We are not convinced that that situation will arise. Mr Williams from BT just outlined that BT is willing to enter into a legal obligation to be the national provider for a universal service obligation. That is how it works today under the fixed telephony USO. Up to a relatively high cost threshold, BT is not allowed to pick and choose which areas and premises it connects and which it does not; it has a legal obligation to fulfil. The model does not need to change radically as we move to a broadband USO.
Paul Morris: Basically, you have to remember that most of these premises will have a telephone line—although not all, I grant you. That is a good start. It is about how we use what is already there well, and how we upgrade it.
(8 years, 1 month ago)
Public Bill Committees
Q And is that the only form of age verification that you have so far looked into?
David Austin: The only form of age verification that we, as the BBFC, have experience of is age verification on mobile phones, but there are other methods and there are new methods coming on line. The Digital Policy Alliance, which I believe had a meeting here yesterday to demonstrate new types of age verification, is working on a number of initiatives.
Q May I say what great comfort it is to know that the BBFC will be involved in the regulatory role? It suggests that this will move in the right direction. We all feel very strongly that the Bill is a brilliant step forward: things that were considered inconceivable four or five years ago can now be debated and legislated for.
The fundamental question for me comes down to enforcement. We know that it is difficult to enforce anything against offshore content providers; that is why in the original campaign we went for internet service providers that were British companies, for whom enforcement could work. What reassurance can you give us that enforcement, if you have the role of enforcement, could be carried out against foreign entities? Would it not be more appropriate to have a mandatory take-down regime if we found that a company was breaking British law by not asking for age verification, as defined in the Bill?
David Austin: The BBFC's heads of agreement with the Government do not cover enforcement. We made it clear that we would not be prepared to enforce the legislation in clauses 20 and 21 as they currently stand. Our role is focused much more on notification; we think we can use the notification process to get some quite significant results.
We would notify any commercially operated pornographic website or app that we found acting in contravention of the law and ask it to comply. We believe that some will comply and some probably will not, so, as a second backstop, we would then be able to contact and notify payment providers and ancillary service providers and request that they withdraw their services from those pornographic websites. So it is a two-tier process.
We have indications from some major players in the adult industry that they want to comply—PornHub, for instance, is on record with BBC News as saying that it is prepared to comply. But you are quite right that, I imagine, there will still be gaps in the regime after we have been through the notification process, no matter how much we can achieve that way, so the power to fine is essentially the only real power the regulator will have, whoever the regulator is for stage 4.
For UK-based websites and apps, that is fine, but it would be extremely challenging for any UK regulator to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator’s arsenal. We think that that would be effective.
Q It discusses the transfer of data. It does not talk about your accessing data. It does not mention the technology through which you would do it. There are no codes of practice setting out how it would happen. It is very broad and explicitly talks about data sharing in certain areas.
Hetan Shah: I think I said this earlier, but in case I was not clear I shall repeat it. For statistical and research purposes, statisticians and researchers are interested only in aggregates; they are not interested in us as individuals. It is a key point that the relevant clauses are quite different from some of the other parts of the Bill. Others have indicated in their evidence that this area should be seen as slightly different.
It is also worth noting that there are safeguards that have been tried and tested over many years. There is the security surrounding the data—the ONS will not even let me into the vault where they hold the data. You need to be accredited and to sign something saying that you will not misuse the data; if you do, you will go to jail. The trick that has been missed is not spelling all of that out, because it is almost assumed that that is how the ONS works. My suggestion is that if you want to strengthen that part of the Bill, you should simply lay out the safeguards that are already common practice in the ONS.
Q Thank you both for setting out some very factual and helpful arguments as to why the provisions are a good thing, particularly when it comes to aggregate statistics. I was struck by a quote in your report published in March, Professor Sir Charles. You mentioned the
“cumbersome nature of the present legal framework”,
which the Bill will clearly help to solve, and you also said that there was a
“cultural reluctance on the part of some departments and officials to data sharing”
and, in many ways, to working together, as we know from experience. How do we solve that problem and get Departments to realise how helpful some of these datasets might be?
Professor Sir Charles Bean: A key thing about the Bill is that it shifts the onus: there is a presumption of access unless there is a good reason not to grant it (comply or explain, if you like), as opposed to the current arrangement, in which the data owner holds the data and you have to ask, "Can you please let us have a look at it?" There is also civil service caution. I was a civil servant very early in my career, so I am aware of how civil servants think. Inevitably, you are always worried about something going wrong or being misused, and that plays into this as well.
In the review I said there are really three elements and I think they are mutually reinforcing. There is the current legal framework, which is not as conducive as it could be; there is this innate caution on the part of some civil service Departments, or even perhaps on the part of their Ministers on occasion; and then the ONS has not been as pushy as it might have been. It is partly that if you know it is very difficult to get in—people are not very co-operative at the other end and the legal frameworks are very cumbersome—you are less inclined to put the effort in, and you think, “Oh, well, let’s just use the surveys, as we’ve always done.” So I think you need to act on the three things together, but they are potentially mutually reinforcing if you get the change right.
Hetan Shah: This is one area where I think the Bill could be strengthened. At the moment, the ONS has the right to request data; similarly, the researchers have the right to request data. The Department can still say, “No”, and in a sense the only comeback is that there is a sort of name-and-shame element of, “Parliament will note this”, as it were. My worry, given the cultural problems that have been seen in the past, is that that may not be enough. So why do we not do what Canada does? It just says, “The ONS requests”, and the Department gives.