Maria Miller's contributions to the Digital Economy Act 2017

Digital Economy Bill

Maria Miller Excerpts
3rd reading: House of Commons & Legislative Grand Committee: House of Commons & Programme motion No. 3: House of Commons & Report stage: House of Commons
Monday 28th November 2016

Commons Chamber
Amendment Paper: Consideration of Bill Amendments as at 28 November 2016
Matt Hancock

It is incredibly important to get the framework that operates in that sort of space right, as is the case for terrorist material and child protection online. The system that we have in place—it is essentially non-statutory, although it is underpinned by online and offline offences—is working well. Social media organisations’ collaboration with the police and others is incredibly important, and I urge them to collaborate with the police whenever they are asked to do so. We have taken the view that the effective and rigorous enforcement of rules relating to age verification is an important step to get that system up and running. The existing system is working well, with 220,000 take-downs since 2010, so we want to leave it in place. In all such instances, there might be difficult individual cases, but overall the system is working effectively. That is why we have taken different approaches for the two different areas.

New clause 10 would introduce some very specific requirements around online education. I maintain that the measure is not necessary, because e-safety is already covered at all stages in the new computing curriculum that was introduced in September 2014. From primary school, children are taught how to use technology safely, respectfully and responsibly, how to keep personal information private, how to recognise acceptable and unacceptable behaviour, and how to report a range of concerns. As hon. Members will see, we care deeply about protecting children online both through direct rules for the internet and through education. The new clause is not necessary, and I worry that putting in place a more static system would risk making the task at hand harder.

When it comes to broader protection, we expect social media and interactive services to have in place robust processes that can quickly address inappropriate content and abusive behaviour on their sites. It would be difficult to make the sort of statutory code of practice proposed in new clause 13 work, as there is not a one-size-fits-all solution. The way in which to deal properly with inappropriate content and abuse will vary by service and by incident. Technological considerations might differ by platform as innovation changes the way in which the internet operates. Legislating in this area is difficult because of the pace of change, and users will benefit most if companies develop a bespoke approach for reporting tools and in-house processes. Existing arrangements and the action taken by social media companies provide the best approach to tackling this problem.

Mrs Maria Miller (Basingstoke) (Con)

Will the Minister tell us which companies and sectors already have a code of practice in place, and how he is monitoring whether such codes of practice are being brought up to date?

Matt Hancock

We are working on codes of practice in a series of different areas. About 10 days ago, as my right hon. Friend will have seen, Twitter—one of the main players in this space—brought forward work towards a code of practice on online abuse. There is more to do in this area, but it is better that we have codes of practice that the organisations themselves can buy into and that can change with the times as the usage of social media changes. My goodness, we all know how social media changes over time—not always in a good way—so we need to make sure that we keep pace with that. I worry that putting something static into legislation would get in the way of such efforts. However, I agree with my right hon. Friend that it is incumbent on social media companies to play their part in establishing and rigorously enforcing norms and social responsibility in this area if we decide not to go down, or not yet to go down, the legislative route.

--- Later in debate ---
Matt Hancock

The age verification requirements apply to the commercial provision of pornography. That covers not only paid-for material but also material that is provided for a commercial return. There is a difference between websites that provide commercial pornography and platforms on which others can upload images. Getting this right with regard to that second group is much harder than it is with regard to the first. We are therefore proposing to put forward the measures in the Bill to deal with the mainstay of the problem, get them working properly and then see how they are working.

I appreciate that there is a big challenge in stopping those who really want to access porn online, but all the evidence suggests that children’s first interaction is often by accident. We are legislating to prevent as much of that inadvertent viewing as possible by those who are not actively seeking it out. I appreciate that the Bill is not a utopia, but it is a very important step forward. I hope my right hon. Friend will accept that.

Mrs Miller

The Minister is being very generous with his time. Is it not fair to say that four years ago providers such as Twitter told us it was impossible to take down visual images of children being sexually abused, but now, as he says, there is quite rightly a code of practice in place? Surely where there is a will there is a way. He has already proved that he can make significant progress, so should he not put more pressure on organisations like Twitter?

Matt Hancock

Yes is the short answer. The Bill does so, and we will best achieve that pressure by delivering on its proposals and then working with the platforms on the issue of platform-based pornography, because that is a much more difficult technical nut to crack.

--- Later in debate ---
With an 800% increase in the number of children contacting the NSPCC about online abuse, it is clear this is becoming a real problem for today’s schoolchildren. They clearly need more support and more advice, and someone to turn to. Statutory online education would work in tandem with a code of conduct for social media providers to prevent online abuse.
Mrs Miller

I am attracted to the shadow Minister’s proposal because I, too, feel that more needs to be done to educate children in this area, but I am concerned that it deals with internet pornography in isolation and potentially will not tackle the problems that he has raised in his remarks, which are far broader than internet pornography alone.

Kevin Brennan

I would certainly welcome the right hon. Lady’s support for a wider amendment and for a wider change in Government policy in this area, because a problem does exist. Our proposals have had to be drawn up to be within the scope of the Digital Economy Bill. In Committee, we were unable to table an amendment that was in scope, so I am incredibly grateful that we have been able to get one in scope and within the confines of the Bill today.

--- Later in debate ---
Kevin Brennan

I welcome my hon. Friend’s intervention in support of our proposal for caps on mobile phone bills, and so that I do not exceed mine at this point, I will hang up, Madam Deputy Speaker.

Mrs Miller

It is a pleasure to follow the hon. Member for Cardiff West (Kevin Brennan), and I share his regret that it is not possible to address online abuse in this Bill. I hope that the Minister will show the Government’s determination on this issue, as Ministers have done regularly in response to questions on a number of other measures. I particularly noted his response to my intervention about codes of practice. He is right to say that the industry has been able to move swiftly and effectively to deal with issues relating to terrorism and child abuse, but I think issues relating to online abuse more broadly are just as worthy of their attention. I hope that he is clear about the Government’s priorities in this area, to make sure that the industry really does act.

It is an art form to draw the scope of a Bill, and the Minister should get a grade-A medal for drafting the scope of this Bill extremely tightly to make sure that a number of issues that many of us would have liked to draw to the attention of the House are not covered by it. That does not, however, mean that they are any the less important.

I really welcome Government new clauses 28 and 29 on the powers to block access to material where age verification is not sufficiently robust. That shows the Government’s intention. They have done well to reflect the intentions of my hon. Friend the Member for Devizes (Claire Perry) in her new clause 1 and of my hon. Friend the Member for Congleton (Fiona Bruce). It shows action and energy from the Government to try to clean up the internet so that it is safer for children to use. My amendments 27 to 34 raise the question of whether the Government could have gone further in that respect, although I acknowledge that they are very much adhering to the manifesto commitments we made at the general election.

We have heard from the Minister at length, and I listened carefully, particularly to his response to my amendments. With his usual elegance and wit, he attempted to explain how this Bill can be at odds with Government policy but people can be very happy with it—I may be being a little unkind. He often tells us at the Dispatch Box that what is illegal offline is illegal online too, but it is illegal for children under the age of 18 to view adult material—I refer not just to pornography; as he knows, “adult material” is drawn more broadly than pornography alone. It therefore seems a little arbitrary for us to introduce a new law that makes such a distinction. I do not understand why one needs to be made.

Mr Whittingdale

My right hon. Friend says it is illegal for children to view adult material, but she will be aware that vast amounts of adult material are broadcast by our national broadcasters after the watershed at 10 o’clock, and it is not illegal for children to watch that, although it may be undesirable. How does she propose to deal with BBC iPlayer, ITV Play and 4oD, which broadcast 18 material?

Mrs Miller

My right hon. Friend, the former Secretary of State, makes an extremely important point. I suppose that the advantage broadcasters have over the online world is that they can use a notional watershed, although, as he rightly says, that is clearly not the case when it comes to iPlayer. I shall come on to technology that is on our side. Technology has moved on and given us opportunities, which my right hon. Friend would welcome, to make sure that children do not view things that we have said in Parliament are inappropriate.

I gently urge the Minister to consider how he might embrace my amendments in future. The law makes it clear that adult material does not just mean pornography. In response to my right hon. Friend the former Secretary of State, that is the point that I am making. Whether it is extreme violence, beheadings, sadomasochism or other such behaviour or material, it is deemed adult material. However, for reasons that are unclear, it is excluded from the Bill. Perhaps the Minister can give me a little more information about why he decided to do that, and assure me that it will be dealt with in future.

I took the time to talk to some primary schoolchildren in my constituency about the sort of things that they came across on the internet. A group of them talked about viewing age-appropriate material—I think it was pictures of small kittens—but at the end material popped up that frightened them to their core. They were young children, and they were not deliberately looking for such material—it just popped up. Restrictions and parental controls could be put in place to catch that, but the Minister has an opportunity to make sure that organisations such as YouTube are more careful about advertisements linked to child-related material. That is an important point for him to consider further in relation to my amendments.

Ofcom has done a great deal of work in this area, and the Minister will be well and truly aware of that. It says that this is a significant problem, and that this year, one in 10 under-11s has seen something online that is “worrying, nasty or offensive”. Two thirds of young people think that sites should do more to protect them from that type of adult content. One of the guiding principles of the new regulator, the British Board of Film Classification, is to protect children from harmful media content. We protect them on television, albeit with the problems that my right hon. Friend the former Secretary of State has mentioned, and we protect them in the cinema. In one of the most uncontrolled environments—online—we allow them freely to view things that are far more difficult for us as parents to control. My amendments would help to draw those restrictions and website blocking more broadly where proper age verification procedures are not put in place, and it is worth the Government considering that further.

Ofcom was charged with looking at common media standards four or five years ago, so perhaps the Minister can update the House on the progress that has been made in that area. Can he explain how the new regulator will balance its narrow responsibilities to look solely at pornography with the organisation’s broader remit offline with regard to adult-related material? Organisations such as Childline have to deal daily with the aftermath when young people look at more broadly defined adult material online, as I have said before, in videos of extreme torture, violence, and—this is particularly upsetting—beheadings. My amendments, which have the full support of the National Society for the Prevention of Cruelty to Children, seek to put safeguards that we take for granted offline into the online world. Content that would require an 18 certificate in a film or video game would be subject to an age-verification system.

The technology exists to do that. We have an incredible IT sector in this country, and it has invented ways to verify age in an anonymised way online, particularly with the use of passport data and biometrics. Companies such as Yoti have developed facial recognition apps linked to passports so that they can make sure, using anonymous data, that individuals are the age that they say they are. These things exist; Parliament does not need to invent them.

Accepting that adult over-18 material should not be viewed by children does not undermine freedom of speech, because we insist on it offline. It does add to costs for businesses, but we accept that cost for offline businesses, and I believe we should accept it for online businesses too. Fundamental rights and freedoms have always been subject to limits within the law, and the amendments simply call for the law relating to adult material in general to apply online, and for children to be protected. People who choose to flout the law should be subject to the same action by the regulator as people who distribute pornography.

I should like briefly to touch on a couple of other amendments in this group. New clause 3, which was tabled by the hon. Member for Dwyfor Meirionnydd (Liz Saville Roberts), talks about the creation of personal accounts and removing anonymity on the internet. I sympathise with the measures that it proposes, but it is as important for non-commercial sites as for commercial sites to adopt such a measure, and I do not think that the Bill is the appropriate vehicle for such a change.

New clause 10 was discussed at length by the hon. Member for Cardiff West. As I said in an intervention, I sympathise with the point that he made, because the guidance on sex and relationships education is 16 years out of date. It does not quite pre-date the internet, but it is close to doing so, and it does not address issues such as pornography and the way in which it drives young people’s understanding of relationships—something that no one in the Chamber feels very comfortable with. I do not believe, however, that the Bill is the proper vehicle for him to achieve the objectives that he has set out, as he may well end up distorting the issue, because people might think that we have addressed it with his provision. However, we would not have done so, because the measure deals only with online pornography. He will agree, especially if he has read my Select Committee report on sexual harassment in schools, that any measure to address SRE and its improvement in schools should be drawn much more widely than the internet alone. I hope he will forgive me for not supporting that narrowly drawn provision, although I accept that he probably did not have any choice, given the scope of the Bill—he is absolutely right about that.

I urge the Minister to consider stronger undertakings than those he gave me in his opening statement, given the importance of prohibiting children from viewing adult material in the broader sense, rather than the narrow sense on which the Government have chosen to focus. He has a personal responsibility to children who use the internet day in, day out. We need to make sure that it is a safe place. He has done more than any other Minister to date in making the internet a safer place for children such as mine and his, but he needs to do more, so will he give me that undertaking today?

Several hon. Members rose—

--- Later in debate ---
Claire Perry

As the House knows, I welcomed part 3 of the Bill on Second Reading, but I did raise, as did many other right hon. and hon. Members, the question of enforcement. We considered the possibility of internet service providers being asked to block sites that disregarded the Government’s requirement for age verification, and I tabled a series of amendments on that point in Committee. I disagree with the hon. Member for Cardiff West (Kevin Brennan) because I think that Ministers absolutely were in listening mode about a manifesto commitment that they were clearly keen to deliver. Against that backdrop, I am delighted to speak on Report by welcoming new clause 28 and Government amendments 35 to 42, which address this critical concern.

The Government had argued for rather a long time that it was disproportionate to make provision for statutory IP blocking because that had been dealt with on a voluntary basis for child pornography—we are all aware of the wonderful work done by the Internet Watch Foundation—and with reference to terrorist material. There was perhaps a hope that internet service providers would voluntarily get involved in blocking sites in the absence of age verification. Many right hon. and hon. Members campaigned for years for the voluntary introduction of family-friendly filters by internet service providers. We have led the world by working across industry and across the Government to produce a sensible set of provisions. We now have online filters that are introduced—in some cases automatically—by ISPs and others on a voluntary basis, and they seem to be working well.

There were, however, significant problems in assuming that ISPs would operate voluntarily. It was not just me and other colleagues in the House who were concerned. Bodies such as Christian Action Research and Education, the Children’s Charities Coalition for Internet Safety, the NSPCC, the British Board of Film Classification, which is now the regulator, and the Digital Policy Alliance were concerned that this sensible provision for age verification would not stick unless there was a more robust enforcement regime.

I am delighted that new clause 1, which I tabled, has been co-signed by 34 colleagues from seven political parties. That demonstrates that although we might like to stand up and shout at each other, our best work is done when we work together on such vital issues. It is a testament to the power of this place that we can work together so effectively to get this done. I know that this is a difficult argument; we have only to look at some of our Twitter feeds to see that. I am no longer on Twitter, but we know from other parts of the internet how difficult these conversations are because they go right to the heart of issues surrounding the regulation of the internet, which grew up, very properly, in a regulation-free environment, and in many respects that environment contributed to its growth and its glory.

Are we asking Governments and companies to restrict legal material for adults? I would argue strongly that the new clause is not about censorship or the restriction of legal access for adults; it is about proving that those who are consuming the material are indeed over 18. The new clause simply puts in place the sort of Government regulation and advice, and corporate socially responsible behaviour, that has been seen in many other industries. Examples of that include the watershed in broadcasting, the fact that adult content often sits behind PINs on online media, and restrictions on what children can buy on the high street.

There is also a sense that the argument in relation to child sex abuse images and terrorist material is really not relevant. There is a strong global consensus that images or film material relating to either of those things should not be tolerated, so there is no need for statutory compulsion. However, the sites we are talking about, which offer material defined as pornographic, are quite different, because they provide a product that is generally entirely legal, and in many cases entirely reasonable, for adults to access; there is no sense in which this is an anti-pornography crusade. In that context, it is completely unsurprising that the ISPs made it clear that they would not block pornographic sites that lack statutorily defined age-verification checks. Indeed, in evidence given on 25 October to the Communications Committee in the other place, the director of policy at Sky said of IP blocking under part 3 of the Bill:

“If there is a desire for ISPs to be blocking access to those sites, then legislation is required…If you want ISPs to block, I think they will struggle to do so, unless they are compelled to, and not because they do not want to but because they would probably be breaking the law.”

Indeed, Ofcom gave the Committee a similar message a week later, saying:

“If ISPs were to take any action blocking non-compliant sites, they would do so on a voluntary basis…I think you…have heard from ISPs about the legal difficulties they…would face if they were to undertake voluntary blocking…it would raise issues in relation to net neutrality.”

The second point, which has been widely raised among colleagues, is that there is overwhelming support among the British public for introducing these age-verification measures robustly. Eight out of 10 people absolutely support this very good manifesto commitment and want it to work. Indeed, the BBFC, which the Minister has chosen to be the regulator—I think all of us absolutely support it as a trusted brand in this space; it is not me or anyone else deciding what is over-18 material, because that will be based on the BBFC’s tried and tested guidelines—said itself that it felt that the regulator needed this power if it was to carry out its work effectively.

Mrs Miller

My hon. Friend says that this power is consistent with the guidelines that the regulator uses already, but my point was that it is not. Its powers are far more broadly drawn with regard to adult material, over and above pornography alone.

Claire Perry

I do have great sympathy with the provisions my right hon. Friend has tabled; she is absolutely right to keep pushing on the issue. We defined the manifesto commitment and the Bill very tightly in terms of the online pornography space, and I wanted to achieve that first before we moved to broader definitions which, as she will be aware, quickly throw up many more questions about the scope of regulation. As she and I both know, there is a great desire in this space to make the perfect the enemy of the good, and with almost every advance we have made, we have been told, “Back off,” because something is not absolutely perfect. She, I and many other Members think that this is a process of iterative steps forward, and the Government are doing a great job in that respect.

The final argument for putting such blocking on a statutory basis is the precedent for IP blocking in the case of copyright infringement under the Copyright, Designs and Patents Act 1988. It would seem perverse for the House to argue that it was legal to instruct people to block sites that infringe copyright, but not those that infringe a legal requirement for age verification. It would be quite wrong for us to suggest that child protection is less important than protecting the interests of often very large commercial businesses.

I have two other quick points to make about why the case for change is so compelling. The first is that the BBFC has said that it will focus primarily on offshore sites, which are the main source of much of this material. Of course, as we know, it will be very difficult to enforce fines outside the UK jurisdiction. Secondly, we know that many sites are not reliant purely on financial transactions coming through the sorts of sites discussed in the Bill, given that there are systems such as Bitcoin and other forms of revenue generation.

I am absolutely delighted that the Government have tabled new proposals. I will not press my new clause and I will support their measures wholeheartedly. However, I want to probe the Minister—perhaps he will answer this question in a moment—about who will actually enforce the Bill. My understanding is that the BBFC does not currently have the enforcement powers required by new clause 28, which was why many of us assumed that Ofcom would be the enforcer of choice, as was set out very explicitly by my neighbour, my hon. Friend the Member for North West Hampshire (Kit Malthouse). We would therefore be keen to hear who will actually enforce the Bill, because we know that, without robust enforcement, there will be little incentive for websites to implement age verification, despite these new powers, and I think almost the whole House will support me in saying that we want this to be a great success.

--- Later in debate ---
Mrs Miller

We are seeing the internet come of age through this Bill. I very much welcome the change in the tone of Members on both Front Benches. The digital economy in this country is hugely important, but we need rules in this area just as we need them in other aspects of our lives. The acknowledgement that we need clear rules on content is welcomed across the board. I congratulate Ministers on the amendments that have been made to strengthen enforcement, particularly around harmful content, and I hope that when the other place considers the Bill, it will be able to look at some of the other points that right hon. and hon. Members have raised today. I wish the Bill well.