Debates between Lord Ashton of Hyde and Earl of Erroll during the 2015-2017 Parliament


Digital Economy Bill

Debate between Lord Ashton of Hyde and Earl of Erroll
Report: 2nd sitting (Hansard): House of Lords
Monday 20th March 2017

Lords Chamber
Amendment Paper: HL Bill 102-III Third marshalled list for Report (20 Mar 2017)

The Earl of Erroll

My Lords, this is an important point. Without enforcement, nothing will work. If you do not enforce age verification, no one will bother with it. For exactly the same reasons as the noble Lord, Lord Paddick, gave, I think that the notice and take-down—the blocking—is the only thing that will work. Fines will not work; it is probably a waste of time even trying them. The only thing that might work is to ask the credit card companies not to take payments for those sites, because they like to observe the law. I am concerned that the BBFC will not have the resources to do this properly, but even if it goes elsewhere the BBFC should still be able to notify ISPs to block sites. That bit must certainly be enforced.

Lord Ashton of Hyde

My Lords, I am grateful to everyone who has spoken in this brief debate. The introduction of a new law requiring appropriate age verification measures for online pornography will help protect young people and children from potential harms from online pornography. It will also rightly hold commercial providers of online pornography responsible for the material they provide and profit from.

The Government of course take the protection of children and young people very seriously. To provide effective protection it is important that we have a robust regulatory system in place. These amendments seek to limit the scope of the regulatory functions that may be fulfilled by the BBFC by seeking the requirement that the same regulator must not be responsible for both identifying a non-compliant site and taking enforcement action against it. I shall first explain why, in identifying the BBFC as the preferred regulator, we think we have made the right choice.

The Government’s intention is that, subject to parliamentary approval, the BBFC will be the regulator responsible for identifying websites that do not have adequate age verification or are hosting extreme pornography, and then to give notice to the appropriate persons, be they payment service providers, ancillary service providers or ISPs. It is not intended that the BBFC will be designated as the regulator responsible for issuing financial penalties. That will be a role for a separate body, yet to be determined, but which will be approved by Parliament.

We are pleased to be working with the British Board of Film Classification as the intended age verification regulator, again subject to parliamentary approval. To respond to the remarks of the noble Baroness, Lady Jones, on structure, the BBFC is an independent, not-for-profit company that has a proven track record of interpreting and implementing legislation as the statutory authority for age rating videos under the Video Recordings Act. It has unparalleled expertise in classifying content and it is committed to delivering the aims of age verification. It is the expert on editorial judgments over pornographic and other content.

The BBFC has been classifying cinema films since it was set up in 1912 and videos and DVDs since the Video Recordings Act was passed in 1984. It continuously has to make judgments on classification, openly and transparently. These decisions relate to a multimillion-pound industry and are subject to challenge. The BBFC’s work with mobile network operators on the self-regulatory regime for mobile content is a good example of where it successfully sets content standards, implements them and adjudicates transparently and accountably.

The BBFC will not operate without oversight. It must have regard to the statutory guidance from the Secretary of State to the regulator. This will provide a further opportunity to ensure that the regulator fulfils its duties in the way Parliament sees fit. As I said earlier, we are seeking views on this guidance before a final version is laid. Ultimately, the regulator’s decision-making process will be subject to oversight by the courts as there is the possibility of challenge by way of judicial review. This prevents it acting arbitrarily.

In our view, these amendments are unnecessary for the following reasons. First, Clause 17 already enables the Government to designate a person, or any two or more persons jointly, as age verification regulators. The importance of getting this measure right means that the Government remain open-minded and retain flexibility as to how best to respond to changing circumstances. If the BBFC proves unable to deliver certain regulatory functions, the legislation has the flexibility to overcome these problems.

Secondly, splitting the regulatory functions in the Bill so that the same regulator cannot identify non-compliant sites and enforce against them unnecessarily creates a middleman in the process. The BBFC will have to give notice to a second regulator, which will then pass that notice on to an ISP or other appropriate body. This is just red tape for no benefit. It makes sense that the body that makes the original determination should also be responsible for notifying relevant parties affected by that determination and for ensuring that that notification action is effective in achieving compliance.

Thirdly, our ambition is to have the age verification regime in place by spring 2018. We are determined to stick to that timetable. The NSPCC has set out the scale of the problem we face and we need to get on with protecting children as quickly as we can. If we need to invent an additional regulator, that can only delay the result.

Digital Economy Bill

Debate between Lord Ashton of Hyde and Earl of Erroll
Committee: 2nd sitting (Hansard): House of Lords
Thursday 2nd February 2017

Lords Chamber
Amendment Paper: HL Bill 80-III Third marshalled list for Committee (2 Feb 2017)

The Earl of Erroll (CB)

My Lords, I will make some brief points. First, on this set of amendments I am afraid I disagree with the noble Baroness: we must get on with this. It will not be perfect on day one but the sooner we get moving the better. We have talked about this for a very long time. That is why I am not really pro these amendments.

On Amendment 55, I agree entirely with my noble friend Lady Howe. She is absolutely right to spot this lacuna: the BBFC will look at this stuff and age verification, but who will enforce it? That is a problem and I was going to raise it later anyway. She was absolutely spot on there. My noble friend Lady Kidron was also absolutely spot on about these sites. Twitter could be classified as commercial because it takes money from pornography sites to promote them. I can get evidence of that. It would be difficult for it to say that it does not promote them.

Very quickly on what the Minister said, I was going to raise under the group starting with Amendment 57 the issue of including prohibited material with the age verification stuff. We should separate protecting children from protecting adults or it will confuse things. The big danger is that if we start using this to protect adults from stuff that they should not see—in other words, some of the adult prohibited material, of which there is quite a lot out there—we run the risk of challenges in court. Everything that the BBFC does not classify because it falls into certain categories is automatically prohibited material. It is not allowed to classify certain acts. I should probably not tell noble Lords about those now as they are pretty unpleasant but they are fairly prevalent in the hardcore pornography out there. If the pornography sites are blocked from supplying adults with what they want, they will just move offshore and get round this. If they do that, there will be no point in doing age verification and we will not protect our children. That will create the first major loophole in the entire thing.

I have this from the pornographers themselves. They know what they are doing. However, they are very happy—and would like—to protect children. If we leave them alone and argue through the Obscene Publications Act and other such things as to what they must stop adults seeing, they will help block children. They are very keen on that. Children just waste their time as they do not have money to spend. At the end of the day, the pornographers want to extract money from people.

I am advised that the real problem is that prohibited material includes content that would be refused a BBFC R18 certificate. The Crown Prosecution Service charging practice is apparently out of sync with recent obscenity case law in the courts. Most non-UK producers and distributors work on common global compliance standards based on Visa and Mastercard’s brand-protection guidelines. Maybe we should start to align with that. We should deal with that separately under the Obscene Publications Act. It will be very easy for the BBFC, the regulator or the enforcer to tell what does not have age verification on the front. That is yes/no—it is very simple. The trouble is that if we get into prohibited material, it will end up before the courts. We will have to go through court procedures and it will take much longer to block the sites. I would remove that from here. I shall leave my other comments to a later stage.

Lord Ashton of Hyde

My Lords, I am grateful for those contributions. They address some very important issues, some of which we will deal with now and some of which we will deal with later during the progress of the Bill. To start at the end, the noble Earl, Lord Erroll, made some interesting points regarding the statement that I made. We absolutely acknowledge some of them. I have listened to his suggestions. Our focus here is to protect children. That is what this Bill is for. That is what our manifesto commitment was. When he sees our suggestions, I hope that he will be able to contribute to the debate on Report—but I have noted everything he said.

The introduction of a new law requiring appropriate age verification measures for online pornography is a bold new step. It represents the first stage of ensuring that commercial providers of online pornographic material are rightly held responsible for what they provide and profit from.

Amendment 54B would require the regulator to publish guidance about the overarching duty of care on internet service providers and ancillary service providers, and their responsibility to ensure that all reasonable steps are taken to ensure the safety of a child or young person involved in activities or interaction for which the service provider is responsible. The purpose of our measures is to protect children from pornographic material. Seeking to stretch the framework further to regulate companies on a different basis risks the delivery of our aim. However, that is not to say that we want to ignore the issue. We take the issue of child safety online seriously and engage intensively with the industry through the UK Council for Child Internet Safety to ensure that robust protections are in place.

The Government expect industry to play a leading role in internet safety provisions, as it is best placed to offer safety and protection to children and young people. We know that it is already doing this and has default protections for under-18s, including parental controls, tools to allow users to flag content, protections for user privacy, and information and advice to educate users on staying safe. We will have further opportunities to discuss the role of the industry, including social media and internet service provider filters, later in Committee.

Amendment 54D seeks to introduce a new clause with the requirement that the Secretary of State must consult on the role of the age verification regulator. The clause further seeks that the Secretary of State must lay before each House of Parliament a report on the results of the consultation and the Secretary of State’s conclusions, with any appointments to be subject to approval in each House. The introduction of the measures requiring appropriate age verification for online pornography follows public consultation. We asked about the powers that a regulator should have and there was strong support for a number of responsibilities that we have introduced. The passage of this Bill has provided an important opportunity for debate on this and we have seen the introduction of an important new blocking power for the regulator, which we shall discuss later.

We are grateful to the DPRRC and the Constitution Committee for their reports, which a number of noble Lords mentioned. They made a number of recommendations about the designation of the regulator and how the regulator should fulfil its role. We are carefully considering those and will publish our response before Report.

Amendment 55, in the name of the noble Baroness, Lady Howe, would specify that the Secretary of State is to designate the British Board of Film Classification as the age verification regulator. As the Committee will know, Clauses 17 and 18 provide for the designation of the regulator and we intend to designate the BBFC to carry out most—as the noble Baroness, Lady Howe, reminded us—of the functions of the regulator. Indeed, some noble Lords may have seen the BBFC’s recent presentation to the Children’s Media and the Arts APPG.

Digital Economy Bill

Debate between Lord Ashton of Hyde and Earl of Erroll
Committee: 2nd sitting (Hansard - continued): House of Lords
Thursday 2nd February 2017

Lords Chamber
Amendment Paper: HL Bill 80-III Third marshalled list for Committee (2 Feb 2017)

The Earl of Erroll

My Lords, it has been suggested to me that this group of amendments could also be used in the code of practice and the safety responsibilities could also be drawn up to include non-age-verified pornography.

Lord Ashton of Hyde

My Lords, the Government take the harm caused by online abuse and harassment very seriously, and we will continue to invest in law enforcement capabilities to ensure that all online crime is dealt with properly.

Amendment 70 would require the Government to carry out a review of online abuse and lay a report before Parliament within six months of Royal Assent. We do not believe that it is necessary to include provision for a review in primary legislation. As part of the ending violence against women and girls strategy, we have established an official government working group to map out the current issues, prevalence, initiatives and barriers to addressing gendered online abuse and to produce an action plan.

We are absolutely clear that abusive and threatening behaviour is totally unacceptable in any form, either offline or online. As the Committee will be aware, any action that is illegal when committed offline is also illegal if committed online. Current legislation, some of which was passed before the digital age, has shown itself to be flexible and capable of catching and punishing offenders, whether their crimes were committed by digital means or otherwise. The Protection from Harassment Act 1997 was amended to introduce two new stalking offences to cover conduct that takes place online as well as offline. In addition, the Government will be introducing a new civil stalking protection order to protect victims further.

We will continue to take action where we find gaps in the legislation, just as we did with cyberstalking, harassment and the perpetrators of grossly offensive, obscene or menacing behaviour, and of course we introduced a new law making the fast-growing incidence of revenge porn a specific criminal offence.

The Law Commission recently consulted on including a review of the law covering online abuse as part of its 13th programme of law reform, which will launch later this year. It is expected to confirm with Ministers shortly which projects it proposes should be included.

We are also working to tackle online abuse in schools and have invested £1.6 million to fund a number of anti-bullying organisations.

In addition, we are working to improve the enforcement response to online abuse and harassment so that it can respond to changing technologies. The Home Office has also allocated £4.6 million for a digital transformation programme to equip forces with the tools to police the digital age effectively and to protect the victims of digital crime, including online abuse and harassment. Police and prosecutors evidence offences carried out digitally, non-digitally or both. The CPS Guidelines on Prosecuting Cases Involving Communications Sent via Social Media make clear the range of criminal law which can be brought to bear on offences committed through social media. Moreover, since April 2015, police forces have been recording online instances of crimes, including stalking and harassment.

I shall talk about the next three amendments together, as they all cover the duties of social media sites. Amendment 71AA seeks to make it a requirement for all social media sites to carry out a safety impact assessment. Amendment 71AB seeks to require Ministers to issue a code of practice to ensure that commercial social media platform providers make a consistent and robust response to online abuse on their sites by identifying and assessing online abuse. Amendment 233A seeks to impose a duty on social media services to respond to reports posted on their sites of material which passes the criminal test—that is, that the content would, if published by other means or communicated in person, cause a criminal offence to be committed.

The Government expect social media and interactive services to have robust processes in place that can quickly address inappropriate content and abusive behaviour on their sites. On the point made by the noble Baroness, Lady O’Neill, it is incumbent on all social media companies to provide an effective means for users to report content and perform the actions that they say they will take to deal with this. We believe a statutory code of practice is unworkable because there is no one-size-fits-all solution. Dealing properly with inappropriate content and abuse will vary by service and incident. Technological considerations might differ by platform and as innovation develops. Users will benefit most if companies develop their own bespoke approach for reporting tools and in-house processes.

Social media companies take down content that is violent or incites violence if it breaches their terms and conditions. We expect them to inform the police where they identify significant threats or illegal activity happening on their sites. It is, however, extremely difficult to identify where the threat has come from and whether it is serious. We work closely with companies to flag terrorist-related content and have so far secured the voluntary removal of over 250,000 pieces of content since 2010.

I can assure the Committee that we share the sentiments expressed in these amendments. At the moment, though, they are not practical or necessary, so I hope on that basis noble Lords will not press their amendments.