Online Harms Consultation

Wednesday 16th December 2020


Lords Chamber
The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Baroness Barran) (Con)

I thank both noble Lords for welcoming this full response to the consultation. I am happy to echo them both in their thanks, in particular to Carnegie UK and the important work it has done. We hope very much that the Bill will bring us into an age of accountability for big tech.

In response to the point made by the noble Lord, Lord Stevenson, what is illegal in the real world should indeed be illegal in the digital world. This Bill, when it comes, will help us move towards that. He raised a question about the focus on individuals. Obviously, the level of harm, in terms of how many individuals are impacted, will be relevant to the sanctions that Ofcom can enforce. But he also raised a wider and very important point about trust in our institutions; clearly, social media and big tech platforms are institutions where the level of trust has been tremendously eroded in recent years. We want to restore that, so that what the big tech platforms say they will do is actually what happens in practice.

Both noble Lords asked about category 1 companies, how those are defined and whether we will miss important actors as a result of that definition. Category 1 designation will be based on the size of a platform's audience but also on the functionality that it offers. For example, the ability to share content widely or to contact users anonymously, which are obviously higher-risk characteristics, could put a platform with a smaller audience into category 1. Ofcom will publish the thresholds for these factors, assess companies against those thresholds and then publish a list of them. To be clear, all companies working in this area with user-generated content have to tackle all illegal content, and they have to protect children in relation to legal but harmful content. We are building safety by design into our approach from the get-go.

The noble Lord, Lord Stevenson, asked about criminal liability; we are not shying away from it. Indeed, the powers to introduce criminal liability for directors are, as he knows, being included in the Bill and can be introduced via secondary legislation. We would just rather give the technology companies a chance to get their house in order. The significant fines that can be levied—up to 10% of the turnover of the parent company or £18 million, whichever is higher—are obviously, for the larger tech companies, very substantial sums of money. We think that those fines will help to focus their minds.

The noble Lord, Lord Clement-Jones, talked about legal but harmful content. This is a very important and delicate area. We need to protect freedom of expression; we cannot dictate that legal content should automatically be taken down. That is why we agree with him that a duty of care is the right way forward. He questioned whether this would be sufficient to protect children. Our aim, and our number one priority, throughout this is clearly the protection of children.

The noble Lord, Lord Clement-Jones, asked a number of questions about Ofcom. I might not have time to answer them all now, but we believe that the Bill will give Ofcom the tools it needs to understand how to address the harms that need addressing through transparency reports, and to take action if needed. Ofcom will have extensive powers in order to achieve this. He also mentioned international co-ordination. We are clearly very open to working with other countries and regulators and are keen to do so.

Both noble Lords questioned whether the shift from age verification to age assurance is in some way a step backwards. We really do not believe that this is the case. We think that when the Bill comes, its scope will be very broad. We expect companies to use age-assurance or age-verification technologies to prevent children accessing services that pose the highest risk of harm to them, such as online pornography. The legislation will not mandate the use of specific technological approaches because we want it to be future-proofed. The emphasis will be on the duty of care and the undiluted responsibility of the tech companies to provide sufficient protection to children. We are therefore tech neutral in our approach, but we expect the regulator to be extremely robust towards those sites that pose the highest risk of harm to children.

The noble Lord, Lord Clement-Jones, also asked about our media literacy strategy, which we are working on at the moment.

The Deputy Speaker (Lord Faulkner of Worcester) (Lab)

My Lords, we now come to the 20 minutes allocated to Back-Bench questions. I urge noble Lords who wish to participate to keep their questions short, so that we can get in as many of the 16 who have asked to participate as possible.

--- Later in debate ---
The Deputy Speaker (Lord Faulkner of Worcester) (Lab)

I call the noble Lord, Lord McNally, again.

Lord McNally (LD) [V]

My Lords, not guilty, but happy to get in. Earlier this year, the noble Lord, Lord Puttnam, chaired a committee of this House which produced the report Digital Technology and the Resurrection of Trust, about the damage caused to our political and democratic system by online harm. The Government are choosing to ignore this. Does that not leave a massive stable door open in the legislation? Will she assure me that the noble Lord, Lord Puttnam, will be able to give evidence during pre-legislative scrutiny to make the case for action in this area?