Digital Economy Bill Debate


Baroness Kidron (Crossbench - Life peer)
Committee: 2nd sitting (Hansard): House of Lords
Thursday 2nd February 2017


Lords Chamber
Amendment Paper: HL Bill 80-III, Third marshalled list for Committee (2 Feb 2017)
Lord Paddick (LD)

My Lords, I apologise to the Committee for not taking part in Second Reading. Having led on the Investigatory Powers Bill and the Policing and Crime Bill I was hoping for some time off for good behaviour, but apparently a policeman’s lot is not a happy one, even when he has retired.

My noble friend Lord Clement-Jones and I have Amendment 55B in this group. The first thing to say is that we on these Benches believe that everything that can be demonstrated to be effective should be done to restrict children’s access to adult material online. We also believe that everything should be done to ensure that adults can access websites containing material that it is legal for them to view. That is why Amendment 55B would require the age-verification regulator to produce an annual report on how effective the measures in the Bill have been in general in reducing the number of children accessing adult material online, and how effective each enforcement mechanism has been. We also share the concerns expressed by the noble Baronesses, Lady Jones of Whitchurch and Lady Howe of Idlicote, that these provisions were introduced somewhat at the last minute and may not have been completely thought through.

The aims of the Bill and the other amendments in the group are laudable. The ideal that children should have the same protection online as they do offline is a good one, but it is almost impossible to achieve through enforcement alone. We have to be realistic about how relatively easy it is to prevent children accessing physical material sold in physical locations and how relatively difficult, if not impossible, it is to prevent determined children accessing material on the internet, much of which is free. An increasing proportion of adult material is not commercially produced.

That is not to say that we should not do all we can to prevent underage access to adult material, but we must not mislead by suggesting that doing all we can to prevent access is sufficient to stop children accessing adult material online, a point I will come to in detail in subsequent amendments. Of course internet service providers and ancillary service providers should do all they can to protect children, but there are also issues around freedom of expression that need to be taken into account.

Baroness Kidron (CB)

My Lords, in light of these and some later amendments, I want to raise the matter of ancillary service providers. My understanding is that social media platforms continue to argue that they do not fall within the definition of ancillary service providers and are seeking confirmation from government that they have no role to play in preventing children accessing pornography online.

I am aware that the Minister stated at Second Reading:

“The Government believe that services, including Twitter, can be classified by regulators as ancillary service providers where they are enabling or facilitating the making available of pornographic or prohibited material”.—[Official Report, 13/12/16; col. 1228.]


I was pleased to hear him say that, but I would like confirmation that it remains the Government’s position. Unless such platforms are included, I simply do not understand what Part 3 of the Bill hopes to achieve.

I am unconvinced that it is possible to remove all adult content from the purview of children, but it is imperative to make it clear to young people that viewing adult sexual content is a transgressive act and not a cultural norm, so, at a minimum, it should be as difficult as reaching the top shelf in a newsagent or being served under age in a pub. That is essential for reasons I set out in great detail at Second Reading, so I will not repeat them here but simply say that children and young people are turning in large numbers to pornography to learn about sex, with unhappy consequences. Often violent, mainly misogynistic, unrealistic adult male fantasy is not a good starting point for a healthy, happy, consensual sex life.

I would have preferred the age-verification system to be fully thought out, prototyped and beta-tested before it came to the House in the form of legislation. None the less, I agree that Part 3 is a valiant attempt to stem the flow of adult material into the hands and lives of children. In the absence of a better, more thought-out plan, I support it. But if this is the path we are taking, we must be clear in our message: this material is unsuitable for those under the age of 18.

The BBFC says that it intends to take a proportionate approach to its new role and will target the top 50 adult websites as accessed by viewers in the UK. Its research shows that 70% of all those who access such sites in the UK visit the top 50. Among children, concentration on those top sites is even higher. In that respect, I understand that age-verifying the sites that account for 70% of such traffic sends a clear message.

However, a brief search on Twitter, which has a joining age of 13, shows that commercial pornography is readily available, with popular accounts attracting hundreds of thousands of followers. Many of those who access pornographic social media accounts do not publicly follow them, so it is more than likely that the follower figures are dwarfed by the number of actual viewers. In the case of younger viewers, such platforms, if accessed via an app, leave no browser footprint that might be discovered by parents—a very attractive proposition.

If social media companies provide alternative access to the same or similar pornographic material with no restriction, surely the regulator should be entitled to take the same proportionate approach and target pornographic social media accounts with similar viewer numbers to those for adult websites. For most young people, social media platforms are the gateway to the internet. Unless they are to be included within the definition of ASPs, neither the problem of young people accessing pornography nor the ambition of setting a social norm that puts adult sexual material beyond the easy reach of children and young people will be achieved. It will simply migrate.

I note that social media platforms are not homogeneous and that some, including Facebook and Instagram, already take steps to prevent pornography being posted and act quickly to take it down when it does go up. It is disappointing that not all platforms take this approach. I do not want to focus on Twitter, but noble Lords might like to go to the account, @gspot1177, with its 750,000 public followers, which has been publishing pornography with impunity since 2009. Surely it is necessary to bring this within the scope of the regulator. Nobody is claiming that the measures set out in the Bill will prevent 100% of pornography being seen by children, and I understand Ministers’ arguments that doing something is better than doing nothing, but I am concerned that behind the lack of clarity about what does and does not fall within the definition of ASP there may lie a lack of political will to hold certain stakeholders to account.

I would love to hear from the Minister whether major social media platforms including Tumblr and Twitter have confirmed to the Government how they would respond to requests from the BBFC to withdraw services from a non-compliant site—and whether his statement at Second Reading that social media platforms may be considered ASPs by the regulator still stands.