Paul Scully

Ofcom will assess services that are close to meeting the category 1 threshold conditions and will publish a list of those emerging high-risk services. A service would have to meet two conditions to be added to the emerging services list: it would need at least 75% of the user number set out in any category 1 threshold condition, and either at least one functionality of a category 1 threshold condition or one specified combination of a functionality and a characteristic or factor of a category 1 threshold condition.

Ofcom will monitor the emergence of new services. If it becomes apparent that a service has grown sufficiently to meet the category 1 threshold, Ofcom will be required to add that service to the register. The new clause and the consequential amendments take into account the possibility of rapid growth.

Following the removal of the “legal but harmful” duties, category 1 services will be subject to new transparency, accountability and free speech duties, as well as duties relating to the protection of journalistic content and content of democratic importance. Requiring all companies to comply with that full range of category 1 duties would place a disproportionate regulatory burden on smaller companies that do not exert the same influence on public discourse, and could divert those companies’ resources away from vital tasks.

Damian Collins (Folkestone and Hythe) (Con)

Will my hon. Friend confirm that the risk assessments for illegal content—the priority illegal offences; the worst kind of content—apply to all services, whether or not they are category 1?

Paul Scully

My hon. Friend is absolutely right. All companies will still have to carry out the risk assessment, and will have to remove illegal content. We are talking about the additional duties, which could take a disproportionate amount of resource away from the core functions, such as child protection, that we all want to see.

Paul Scully

Absolutely. The Department has techniques for dealing with misinformation and disinformation as well, but we will absolutely push Ofcom to work as quickly as possible. As my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, has said, once an election is done, it is done and it cannot be undone.

Damian Collins

Could the Minister also confirm that the provisions of the National Security Bill read across to the Online Safety Bill? Where disinformation is disseminated by networks operated by hostile foreign states, particularly Russia, as has often been the case, that is still in scope. That will still require a risk assessment for all platforms, whether or not they are category 1.

Paul Scully

Indeed. We need to take a wide-ranging, holistic view of disinformation and misinformation, especially around election times. There is a suite of measures available to us, but it is still worth pushing Ofcom to make sure that it works as quickly as possible.

Amendment 48 agreed to.

Amendment made: 49, in clause 82, page 72, line 23, after “conditions” insert

“or the conditions in section (List of emerging Category 1 services)(2)”.—(Paul Scully.)

This is a technical amendment ensuring that references to assessments of user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.

Clause 82, as amended, ordered to stand part of the Bill.

Schedule 11

Categories of regulated user-to-user services and regulated search services: regulations

--- Later in debate ---
Alex Davies-Jones

I rise briefly to support everything the hon. Member for Aberdeen North just said. We have long called for the Bill to take a harm-led approach; indeed, the Government initially agreed with us, since in its first iteration the Bill was called the Online Harms Bill rather than the Online Safety Bill. Addressing harm must be a central focus of the Bill, as we know extremist content is perpetuated on smaller, high-harm platforms; this is something that the Antisemitism Policy Trust and Hope not Hate have long called for with regard to the Bill.

I want to put on the record our huge support for the amendment. Should the hon. Lady be willing to push it to a vote—I recognise that we are small in number—we will absolutely support her.

Damian Collins

I want to speak briefly to the amendment. I totally understand the reasons that the hon. Member for Aberdeen North has tabled it, but in reality the kinds of activities she describes would be captured anyway, because most would fall within the remit of the priority illegal harms that all platforms and user-to-user services have to address. If there were occasions when they did not, being included in category 1 would mean that those services were subject to the additional transparency duties on terms of service, but the smaller platforms that allow extremist behaviour are likely to have extremely limited terms of service. We would be relying on the priority illegal activity to set the minimum safety standards, which Ofcom would be able to do.

It would also be an area where we would want to move at pace. Even if we wanted to bring in extra risk assessments on terms of service that barely exist, the time it would take to do that would not give a speedy resolution. It is important that in the way Ofcom exercises its duties, it does not just focus on the biggest category 1 platforms but looks at how risk assessments for illegal activity are conducted across a wide range of services in scope, and that it has the resources needed to do that.

Even within category 1, it is important that that is done. We often cite TikTok, Instagram and Facebook as the biggest platforms, but I recently spoke to a teacher in a larger secondary school who said that by far the worst platform they have to deal with in terms of abuse, bullying, intimidation and even the sharing of intimate images between children is Snapchat. We need to ensure that those services get the full scrutiny they should have, because at the moment they are operating well below their stated terms of service, and in contravention of the priority areas of illegal harm.

--- Later in debate ---
Paul Scully

I am glad that we are all in agreement on the need for a review. It is important that we have a comprehensive and timely review of the regulatory regime and how it is built into legislation, and that we understand whether the legislation is having the impact that we intend.

The legislation clearly sets out what the review must consider: how Ofcom is carrying out its role, and whether the legislation is effective in protecting children, which, as the hon. Lady rightly says, is its core purpose. We have struck a balance by specifying two to five years after the regime comes into force, because that gives future Ministers a degree of flexibility to judge when the review should happen. None the less, I take the hon. Lady’s point that technology is developing. That is why this legislation is a front-footed first move, with other countries looking at what we are doing; because of its less prescriptive approach to technologies, it can remain flexible and adapt to emerging technologies. Inevitably, this will not be the last word. Some of the things in the Digital Economy Act 2017, for example, are already out of date, as is some of the other legislation put in place in the early 2000s. We will inevitably come back to this, but I think we have the right balance at the moment in terms of the timing.

I do not think we need to bed in whom we consult, but wider consultation will none the less be necessary to ascertain the effectiveness of the legislation.

Damian Collins

I am following carefully what the Minister says, but I would say briefly that a lot of the debate we have had at all stages of the Bill has rested on how we believe Ofcom will use the powers it has been given, and we need to make sure that it does that. We need to ensure that it is effective and that it has the resources it needs. The hon. Member for Aberdeen North (Kirsty Blackman) makes an important point that it may not be enough to rely on a Select Committee of the Lords or the Commons having the time to do that in the detail we would want. We might need to consider either a post-legislative scrutiny Committee or some other mechanism to ensure that there is the necessary level of oversight.

Paul Scully

My hon. Friend is absolutely right. The report as it stands obviously has to be laid before Parliament and will form part of the package of parliamentary scrutiny. But, yes, we will consider how we can utilise the expertise of both Houses in post-legislative scrutiny. We will come back on that.

Question put and agreed to.

Clause 155, as amended, accordingly ordered to stand part of the Bill.

Clause 169

Individuals providing regulated services: liability

Amendment made: 57, in clause 169, page 143, line 15, at end insert—

“(fa) Chapter 2A of Part 4 (terms of service: transparency, accountability and freedom of expression);”.—(Paul Scully.)

Clause 169 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6, so that individuals may be jointly and severally liable for the duties imposed by that Chapter.

Clause 169, as amended, ordered to stand part of the Bill.

Clause 183 ordered to stand part of the Bill.

Schedule 17

Video-sharing platform services: transitional provision etc

Amendments made: 94, in schedule 17, page 235, line 43, leave out paragraph (c).

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 95, in schedule 17, page 236, line 27, at end insert—

“(da) the duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (terms of service);”.—(Paul Scully.)

This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by NC3 and NC4 during the transitional period.

Question proposed, That the schedule, as amended, be the Seventeenth schedule to the Bill.

--- Later in debate ---
Paul Scully

The clause provides legal certainty about the meaning of the terms used in the Bill, such as “content”, “encounter”, “taking down” and “terms of service”. That is what the clause is intended to do. The distinction is intentional, for the reasons the hon. Lady said. “Oral” means speech and speech only. “Aural” means speech and other sounds, which is what can be heard on voice calls; that includes music as well. One is speech only; the other is the whole gamut.

Damian Collins

I am intrigued, because the hon. Member for Aberdeen North makes an interesting point. It is not one I have heard made before. Does the Minister think there is a distinction between oral and aural, where oral is live speech and aural is pre-recorded material that might be played back? Are those two considered distinct?

Paul Scully

My knowledge is being tested, so I will write to the hon. Member for Aberdeen North and make that available to the Committee. Coming back to the point about oral and aural that she made on Tuesday in relation to another clause on the exclusions: as I said, we have a narrow exemption to ensure that traditional phone calls are not subject to regulation. That does mean that if a service such as Fortnite, which she spoke about previously, enables adults and children to have one-to-one oral calls, companies will still need to address the functionality around how that happens, because enabling it might cause harm, for example if an adult can contact an unknown child. That is still captured within the Bill.

--- Later in debate ---
Damian Collins

I want to briefly speak on this amendment, particularly as my hon. Friend the Member for Don Valley referenced the report by the Joint Committee, which I chaired. As he said, the Joint Committee considered the question of systematic abuse. A similar provision exists in the data protection legislation, whereby any company that is consistently in breach could be considered to have failed in its duties under the legislation and there could be criminal liability. The Joint Committee considered whether that should also apply with the Online Safety Bill.

As the Bill has gone through its processes, the Government have brought forward the commencement of criminal liability for information offences, whereby if a company refuses to respond to requests for information or data from the regulator, that is a breach of its duties and invokes criminal liability for a named individual. However, I think the question of a failure to meet the safety duties set out in the Bill really needs to be framed along the lines of a systematic and persistent breach, as the Joint Committee recommended. If, for example, a company was prepared to ignore requests from Ofcom, use lawyers to evade liability for as long as possible and consistently pay fines for serious breaches without ever taking responsibility for them, what would we do then? Would there be some liability at that point?

The amendment drafted by my hon. Friend the Member for Stone (Sir William Cash) is based on other existing legislation, and on there being knowledge—with “consent or connivance”. We can see how that would apply in cases such as the diesel emissions concerns raised at Volkswagen, where there was criminal liability, or maybe the LIBOR bank rate rigging and the serious failures there. In those cases, what was discovered was senior management’s knowledge and connivance; they were part of a process that they knew was illegal.

With the amendment as drafted, the question we would have is: could it apply to any failure? Where management could say, “We have created a system to resolve this, but it has not worked on this occasion”, would that trigger it? Or is it something broader and more systematic? These failures will be more about the failure to design a regime that takes into account the stated duties, rather than a particular individual act, such as the rigging of the LIBOR rates or the giving of false public information on diesel emissions, decisions that could only be made at corporate level.

When I chaired the Joint Committee, we raised the question, “What about systematic failure, given that we have that as an offence in data protection legislation?” I still think that would be an interesting question to consider when the Bill goes to the other place. However, I have concerns that the current drafting would not fit quite as well in the online safety regime as it does in other industries. It would really need to reflect consistent, persistent failures on the part of a company that go beyond the criminal liabilities that already exist in the Bill around information offences.

The Chair

Just to be clear, it is new clause 9 that we are reading a Second time, not an amendment.

Damian Collins

Forgive me, Dame Angela.

Caroline Ansell (Eastbourne) (Con)

I rise to recognise the spirit and principle behind new clause 9, while, of course, listening carefully to the comments made by my hon. Friend the Member for Folkestone and Hythe. He is right to raise those concerns, but my question is: is there an industry-specific way in which the same responsibility and liability could be delivered?

I recognise too that the Bill is hugely important. It is a good Bill that has child protection at its heart. It also contains far more significant financial penalties than we have previously seen: as I understand it, up to 10% of qualifying worldwide revenue or £18 million, whichever is greater. That will drive some change, but it comes against the backdrop of multi-billion-pound technology companies.

I would be interested to understand whether a double lock of board-level responsibility might further protect children from some of the harrowing and harmful content we see online. What we need is nothing short of transformation and significant culture change. Even today, The Guardian published an article about TikTok and a study by the Centre for Countering Digital Hate, which found that teenagers who demonstrated an interest in self-harm and eating disorders had that content pushed on to them by algorithms within minutes. That is most troubling.

We need significant, serious and sustained culture change. There is precedent in other sectors, as has been mentioned, and there was a previous recommendation, so clearly there is merit in this. My understanding is that there is strong public support, because the public recognise that this new responsibility cannot be strengthened by anything other than liability. If there is board-level liability, that will drive priorities and resources, which will bring about the kind of change we are looking for. I look forward to what the Minister might share today, as this has been a good opportunity to bring these issues into further consideration, and they might then be carried over into subsequent stages of this excellent Bill.