Committee stage
Tuesday 7th June 2022


Public Bill Committees
Online Safety Act 2023

Amendment Paper: Public Bill Committee Amendments as at 7 June 2022
The Chair

Good morning, ladies and gentlemen. If anybody wishes to take their jacket off, they are at liberty to do so when I am in the Chair—my co-Chairman is joining us, and I am sure she will adopt the same procedure. I have a couple of preliminary announcements. Please make sure that all mobile phones are switched off. Tea and coffee are not allowed in the Committee, I am afraid. I think they used to be available outside in the corridor, but I do not know whether that is still the case.

We now start line-by-line consideration of the Bill. The selection and grouping list for the sitting is available on the table in the room for anybody who does not have it. It shows how the clauses and selected amendments have been grouped for debate. Grouped amendments are generally on the same subject or a similar issue.

Now for a slight tutorial to remind me and anybody else who is interested, including anybody who perhaps has not engaged in this arcane procedure before, of the proceedings. Each group has a lead amendment, and that amendment is moved first. The other grouped amendments may be moved later, but they are not necessarily voted on at that point, because some of them relate to matters that appear later in the Bill. Do not panic; that does not mean that we have forgotten them, but that we will vote on them—if anybody wants to press them to a Division—when they are reached in order in the Bill. However, if you are in any doubt and feel that we have missed something—occasionally I do; the Clerks never do—just let us know. I am relaxed about this, so if anybody wants to ask a question about anything that they do not understand, please interrupt and ask, and we will endeavour to confuse you further.

The Member who has put their name to the lead amendment, and only the lead amendment, is usually called to speak first. At the end of the debate, the Minister will wind up, and the mover of the lead amendment—that might be the Minister if it is a Government amendment, or it might be an Opposition Member—will indicate whether they want a vote on that amendment. We deal with that first, then we deal with everything else in the order in which it arises. I hope all that is clear, but as I said, if there are any questions, please interrupt and ask.

We start consideration of the Bill with clause 1, to which there are no amendments. Usually, the Minister would wind up at the end of each debate, but as there are no amendments to clause 1, the Minister has indicated that he would like to say a few words about the clause.

Clause 1

Overview of Act

Question proposed, That the clause stand part of the Bill.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Thank you, Sir Roger; it is a pleasure to serve under your chairmanship once again. It may be appropriate to take this opportunity to congratulate my right hon. Friend the Member for Basingstoke on her damehood in the Queen’s birthday honours, which was very well deserved indeed.

This simple clause provides a high-level overview of the different parts of the Bill and how they come together to form the legislation.

The Chair

The Minister was completely out of order in congratulating the right hon. Lady, but I concur with him. I call the shadow Minister.

--- Later in debate ---
Dan Carden

This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?

Chris Philp

Let me briefly speak to the purpose of these clauses and then respond to some of the points made in the debate.

As the shadow Minister, the hon. Member for Pontypridd, touched on, clauses 2 and 3 define some of the key terms in the Bill, including “user-to-user services” and “search services”—key definitions that the rest of the Bill builds on. As she said, schedule 1 and clause 4 contain specific exemptions where we believe the services concerned present very low risk of harm. Schedule 2 sets out exemptions relating to the new duties that apply to commercial providers of pornography. I thank the shadow Minister and my right hon. Friend the Member for Basingstoke for noting the fact that the Government have substantially expanded the scope of the Bill to now include commercial pornography, in response to widespread feedback from Members of Parliament across the House and the various Committees that scrutinised the Bill.

The shadow Minister is quite right to say that the range of platforms to which the Bill applies is very wide. [Interruption.] Bless you—or bless my hon. Friend the Member for North West Durham, I should say, Sir Roger, although he is near sanctified already. As I was saying, we are necessarily trying to protect UK users, and with many of these platforms not located in the UK, we are seeking to apply these duties to those companies as well as ones that are domestically located. When we come to discuss the enforcement powers, I hope the Committee will see that those powers are very potent.

The shadow Minister, the hon. Member for Liverpool, Walton, and others asked about future technologies and whether the Bill will accommodate technologies that we cannot even imagine today. The metaverse is a good example: it did not exist when the Bill was first contemplated and the White Paper produced. Actually, I think Snapchat did not exist when the White Paper that preceded the Bill was first conceived. For that reason, the Bill is tech agnostic. We do not talk about specific technologies; we talk about the duties that apply to companies and the harms they are obligated to prevent.

The whole Bill is tech agnostic because we as parliamentarians today cannot anticipate future developments. When those future developments arise, as they inevitably will, the duties under the Bill will apply to them as well. The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic. That is an extremely important point to make.

The hon. Member for Aberdeen North asked about gaming. Parents are concerned because lots of children, including quite young children, use games. My own son has started playing Minecraft even though he is very young. To the extent that those games have user-to-user features—for example, user-to-user messaging, particularly where those messages can be sent widely and publicly—those user-to-user components are within the scope of the Bill.

The hon. Member for Aberdeen North also asked about the App Store. I will respond quickly to her question now rather than later, to avoid leaving the Committee in a state of tingling anticipation and suspense. The App Store, or app stores generally, are not in the scope of the Bill, because they are not providing, for example, user-to-user services, and the functionality they provide to basically buy apps does not count as a search service. However, any app that is purchased in an app store, to the extent that it has either search functionality, user-to-user functionality or purveys or conveys pornography, is in scope. If an app that is sold on one of these app stores turns out to provide a service that breaks the terms of the Bill, that app will be subject to regulatory enforcement directly by Ofcom.

The hon. Members for Aberdeen North and for Liverpool, Walton touched on media literacy, noting that there has been a change to the Bill since the previous version. We will probably debate this later, so I will be brief. The Government published a media literacy strategy, backed by funding, to address this point. It was launched about a year ago. Ofcom also has existing statutory duties—arising under the Communications Act 2003, I believe. The critical change made since the previous draft of the Bill—it was made in December last year, I believe—is that Ofcom published an updated set of policy intentions around media literacy that went even further than we had previously intended. That is the landscape around media literacy.

--- Later in debate ---
Chris Philp

I am sure we will discuss this topic a bit more as the Bill progresses.

I will make a few points on disinformation. The first is that, non-legislatively, the Government have a counter-disinformation unit, which sits within the Department for Digital, Culture, Media and Sport. It basically scans for disinformation incidents. For the past two years it has been primarily covid-focused, but in the last three or four months it has been primarily Russia/Ukraine-focused. When it identifies disinformation being spread on social media platforms, the unit works actively with the platforms to get it taken down. In the course of the Russia-Ukraine conflict, and as a result of the work of that unit, I have personally called in some of the platforms to complain about the stuff they have left up. I did not have a chance to make this point in the evidence session, but when the person from Twitter came to see us, I said that there was some content on Russian embassy Twitter accounts that, in my view, was blatant disinformation—denial of the atrocities that have been committed in Bucha. Twitter had allowed it to stay up, which I thought was wrong. Twitter often takes down such content, but in that example, wrongly and sadly, it did not. We are doing that work operationally.

Secondly, to the extent that disinformation can cause harm to an individual, which I suspect includes a lot of covid disinformation—drinking bleach is clearly not very good for people—that would fall under the terms of the legal but harmful provisions in the Bill.

Thirdly, when it comes to state-sponsored disinformation of the kind that we know Russia engages in on an industrial scale via the St Petersburg Internet Research Agency and elsewhere, the Home Office has introduced the National Security Bill—in fact, it had its Second Reading yesterday afternoon, when some of us were slightly distracted. One of the provisions in that Bill is a foreign interference offence. It is worth reading, because it is very widely drawn and it criminalises foreign interference, which includes disinformation. I suggest the Committee has a look at the foreign interference offence in the National Security Bill.

Alex Davies-Jones

I am grateful for the Minister’s intervention in bringing in the platforms to discuss disinformation put out by hostile nation states. Does he accept that if Russia Today had put out some of that disinformation, the platforms would be unable to take such content down as a result of the journalistic exemption in the Bill?

Chris Philp

We will no doubt discuss in due course clauses 15 and 50, which are the two that I think the shadow Minister alludes to. If a platform is exempt from the duties of the Bill owing to its qualification as a recognised news publisher under clause 50, that exemption removes the obligation to act under the Bill, but it does not prevent action. Social media platforms can still choose to act. Also, it is not a totally straightforward matter to qualify as a recognised news publisher under clause 50. We saw the effect of sanctions: when Russia Today was sanctioned, it was removed from many platforms as a result of the sanctioning process. There are measures outside the Bill, such as sanctions, that can help to address the shocking disinformation that Russia Today was pumping out.

The last point I want to pick up on was rightly raised by my right hon. Friend the Member for Basingstoke and the hon. Member for Aberdeen North. It concerns child sexual exploitation and abuse images, and particularly the ability of platforms to scan for those. Many images are detected as a result of scanning messages, and many paedophiles or potential paedophiles are arrested as a result of that scanning. We saw a terrible situation a little while ago, when—for a limited period, owing to a misconception of privacy laws—Meta, or Facebook, temporarily suspended scanning in the European Union; as a result, loads of images that would otherwise have been intercepted were not.

I agree with the hon. Member for Aberdeen North that privacy concerns, including end-to-end encryption, should not trump the ability of organisations to scan for child sexual exploitation and abuse images. Speaking as a parent—I know she is, too—there is, frankly, nothing more important than protecting children from sexual exploitation and abuse. Some provisions in clause 103 speak to this point, and I am sure we will debate those in more detail when we come to that clause. I mention clause 103 to put down a marker as the place to go for the issue being raised. I trust that I have responded to the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 2 accordingly ordered to stand part of the Bill.

Clause 3 ordered to stand part of the Bill.

Schedules 1 and 2 agreed to.

Clause 4 ordered to stand part of the Bill.

The Chair

Before we move on, the issue of the live feed has been raised. The audio will be online later today. There is a problem with the feed—it is reaching the broadcasters, but it is not being broadcast at the moment.

As we are not certain we can sort out the technicalities between now and this afternoon, the Committee will move to Committee Room 9 for this afternoon’s sitting to ensure that the live stream is available. Mr Double, if Mr Russell intends to be present—he may not; that is up to you—it would be helpful if you would let him know. Ms Blackman, if John Nicolson intends to be present this afternoon, would you please tell him that Committee Room 9 will be used?

It would normally be possible to leave papers and other bits and pieces in the room, because it is usually locked between the morning and afternoon sittings. Clearly, because we are moving rooms, you will all need to take your papers and laptops with you.

Clause 5

Overview of Part 3

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Dan Carden

I want to add my voice to the calls for ways to monitor the success or failure of this legislation. We are starting from a position of self-regulation where companies write the rules and regulate themselves. It is right that we are improving on that, but with it come further concerns around the powers of the Secretary of State and the effectiveness of Ofcom. As the issues are fundamental to freedom of speech and expression, and to the protection of vulnerable and young people, will the Minister consider how we better monitor whether the legislation does what it says on the tin?

Chris Philp

Clause 5 simply provides an overview of part 3 of the Bill. Several good points have been raised in the course of this discussion. I will defer replying to the substance of a number of them until we come to the relevant clause, but I will address two or three of them now.

The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.

The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.

My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.

There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.

Chris Philp

We are in agreement on that. I can confirm that the Government have tabled amendments 116 to 126—the Committee will consider them in due course—to place equivalent Scottish offences, which the hon. Member for Aberdeen North asked about, in the Bill. We have done that in close consultation with the Scottish Government to ensure that the relevant Scottish offences equivalent to the England and Wales offences are inserted into the Bill. If the Scottish Parliament creates any new Scottish offences that should be inserted into the legislation, that can be done under schedule 7 by way of statutory instrument. I hope that answers the question.

The other question to which I will briefly reply was about parliamentary scrutiny. The Bill already contains a standard mechanism that provides for the Bill to be reviewed after a two to five-year period. That provision appears at the end of the Bill, as we would expect. Of course, there are the usual parliamentary mechanisms—Backbench Business debates, Westminster Hall debates and so on—as well as the DCMS Committee.

I heard the points about a standing Joint Committee. Obviously, I am mindful of the excellent prelegislative scrutiny work done by the previous Joint Committee of the Commons and the Lords. Equally, I am mindful that standing Joint Committees, outside the regular Select Committee structure, are unusual. The only two that spring immediately to mind are the Intelligence and Security Committee, which is established by statute, and the Joint Committee on Human Rights, chaired by the right hon. and learned Member for Camberwell and Peckham (Ms Harman), which is established by Standing Orders of the House. I am afraid I am not in a position to make a definitive statement about the Government’s position on this. It is of course always open to the House to regulate its own business. There is nothing I can say today from a Government point of view, but I know that hon. Members’ points have been heard by my colleagues in Government.

We have gone somewhat beyond the scope of clause 5. You have been extremely generous, Sir Roger, in allowing me to respond to such a wide range of points. I commend clause 5 to the Committee.

Question put and agreed to.

Clause 5 accordingly ordered to stand part of the Bill.

Clause 6

Providers of user-to-user services: duties of care

The Chair

Before we proceed, perhaps this is the moment to explain what should happen and what is probably going to happen. Ordinarily, a clause is taken with amendments. This Chairman takes a fairly relaxed view of stand part debates. Sometimes it is convenient to have a very broad-ranging debate on the first group of amendments because it covers matters relating to the whole clause. The Chairman would then normally say, “Well, you’ve already had your stand part debate, so I’m not going to allow a further stand part debate.” It is up to hon. Members to decide whether to confine themselves to the amendment under discussion and then have a further stand part debate, or whether to go free range, in which case the Chairman would almost certainly say, “You can’t have a stand part debate as well. You can’t have two bites of the cherry.”

This is slightly more complex. It is a very complex Bill, and I think I am right in saying that it is the first time in my experience that we are taking other clause stand parts as part of the groups of amendments, because there is an enormous amount of crossover between the clauses. That will make it, for all of us, slightly harder to regulate. It is for that reason—the Minister was kind enough to say that I was reasonably generous in allowing a broad-ranging debate—that I think we are going to have to do that with this group.

I, and I am sure Ms Rees, will not wish to be draconian in seeking to call Members to order if you stray slightly outside the boundaries of a particular amendment. However, we have to get on with this, so please try not to be repetitive if you can possibly avoid it, although I accept that there may well be some cases where it is necessary.

--- Later in debate ---
Alex Davies-Jones

That is a huge concern for us. As was brought up in our evidence sessions with Ofcom, it is recruiting, effectively, a fundraising officer for the regulator. That throws into question the potential longevity of the regulator’s funding and whether it is resourced effectively to properly scrutinise and regulate the online platforms. If that long-term resource is not available, how can the regulator effectively scrutinise and bring enforcement to bear against companies for enabling illegal activity?

Chris Philp

Just to reassure the shadow Minister and her hon. Friend the Member for Liverpool, Walton, the Bill confers powers on Ofcom to levy fees and charges on the sector that it is regulating—so, on social media firms—to recoup its costs. We will debate that in due course—I think it is in clause 71, but that power is in the Bill.

Alex Davies-Jones

I am grateful to the Minister for that clarification and I look forward to debating that further as the Bill progresses.

Returning to the senior managers and certification regime in the financial services industry: under that regime, senior managers must be preapproved by the regulator, have their responsibilities set out in a statement of responsibilities and be subject to enhanced conduct standards. Those in banks are also subject to regulatory requirements on their remuneration. Again, it baffles me that we are not asking the same for child safety from online platforms and companies.

The money laundering regulations also use the threat of criminal offences to drive culture change. Individuals can be culpable for failure of processes, as well as for intent. I therefore hope that the Minister will carefully consider the need for the same to apply to our online space to make children safe.

Amendment 70 is a technical amendment that we will be discussing later on in the Bill. However, I am happy to move it in the name of the official Opposition.

Dan Carden

I congratulate my own Front Bench on this important amendment. I would like the Minister to respond to the issue of transparency and the reason why only the regulator would have sight of these risk assessments. It is fundamental that civil society groups and academics have access to them. Her Majesty’s Revenue and Customs is an example of where that works very well. HMRC publishes a lot of its data, which is then used by academics and researchers to produce reports and documents that feed back into the policy making processes and HMRC’s work. It would be a missed opportunity if the information and data gathered by Ofcom were not widely available for public scrutiny.

I would reinforce the earlier points about accountability. There are too many examples—whether in the financial crash or the collapse of companies such as Carillion—where accountability was never there. Without this amendment and the ability to hold individuals to account for the failures of companies that are faceless to many people, the legislation risks being absolutely impotent.

Finally, I know that we will get back to the issue of funding in a later clause but I hope that the Minister can reassure the Committee that funding for the enforcement of these regulations will be properly considered.

Chris Philp

Let me start by speaking to clauses 6, 7, 21 and 22 stand part. I will then address the amendments moved by the shadow Minister.

The Chair

Order. I apologise for interrupting, Minister, but the stand part debates on clauses 7, 21 and 22 are part of the next grouping, not this one. I am fairly relaxed about it, but just be aware that you cannot have two debates on this.

Chris Philp

The grouping sheet I have here suggests that clause 7 stand part and clauses 21 and 22 stand part are in this grouping, but if I have misunderstood—

The Chair

No, there are two groups. Let me clarify this for everyone, because it is not as straightforward as it normally is. At the moment we are dealing with amendments 69 and 70. The next grouping, underneath this one on your selection paper, is the clause stand part debates—which is peculiar, as effectively we are having the stand part debate on clause 6 now. For the convenience of the Committee, and if the shadow Minister is happy, I am relaxed about taking all this together.

The Chair

The hon. Lady can be called again. The Minister is not winding up at this point.

Chris Philp

In the interests of simplicity, I will stick to the selection list and adapt my notes accordingly to confine my comments to amendments 69 and 70, and then we will come to the stand part debates in due course. I am happy to comply, Sir Roger.

Speaking of compliance, that brings us to the topic of amendments 69 and 70. It is worth reminding ourselves of the current enforcement provisions in the Bill, which are pretty strong. I can reassure the hon. Member for Liverpool, Walton that the enforcement powers here are far from impotent. They are very potent. As the shadow Minister acknowledged in her remarks, we are for the first time ever introducing senior management liability, which relates to non-compliance with information notices and offences of falsifying, encrypting or destroying information. It will be punishable by a prison sentence of up to two years. That is critical, because without that information, Ofcom is unable to enforce.

We have had examples of large social media firms withholding information and simply paying a large fine. There was a Competition and Markets Authority case a year or two ago where a large social media firm did not provide information repeatedly requested over an extended period and ended up paying a £50 million fine rather than providing the information. Let me put on record now that that behaviour is completely unacceptable. We condemn it unreservedly. It is because we do not want to see that happen again that there will be senior manager criminal liability in relation to providing information, with up to two years in prison.

In addition, for the other duties in the Bill there are penalties that Ofcom can apply for non-compliance. First, there are fines of up to 10% of global revenue. For the very big American social media firms, the UK market is somewhere just below 10% of their global revenue, so 10% of their global revenue is getting on for 100% of their UK revenue. That is a very significant financial penalty, running in some cases into billions of pounds.

In extreme circumstances—if those measures are not enough to ensure compliance—there are what amount to denial of service powers in the Bill, where essentially Ofcom can require internet service providers and others, such as payment providers, to disconnect the companies in the UK so that they cannot operate here. Again, that is a very substantial measure. I hope the hon. Member for Liverpool, Walton would agree that those measures, which are in the Bill already, are all extremely potent.

The question prompted by the amendment is whether we should go further. I have considered that issue as we have been thinking about updating the Bill—as hon. Members can imagine, it is a question that I have been debating internally. The question is whether we should go further and say there is personal criminal liability for breaches of the duties that go beyond information provision. There are arguments in favour, which we have heard, but there are arguments against as well. One is that if we introduce criminal liability for those other duties, that introduces a risk that the social media firms, fearing criminal prosecution, will become over-zealous and just take everything down because they are concerned about being personally liable. That could end up having a chilling effect on content available online and goes beyond what we in Parliament would intend.

--- Later in debate ---
Kirsty Blackman

Will the Minister give way?

Chris Philp

In a moment.

For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for information provision, with fines of 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK. If people worried that they might go to prison too readily, it might deter them from locating here. I fully recognise that there is a balance to strike. I feel that the balance is being struck in the right place.

I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.

Kirsty Blackman

I appreciate that the Minister says he has been swithering on this point—he has been trying to work out the correct place to draw the line. Given that we do not yet have a commitment for a standing committee—again, that is potentially being considered—we do not know how the legislation is going to work. Will the Minister, rather than accepting the amendment, give consideration to including the ability to make changes via secondary legislation so that there is individual criminal liability for different breaches? That would allow him the flexibility in the future, if the regime is not working appropriately, to add through secondary legislation individual criminal liability for breaches beyond those that are currently covered.

Chris Philp

I have not heard that idea suggested. I will think about it. I do not want to respond off the cuff, but I will give consideration to the proposal. Henry VIII powers, which are essentially what the hon. Lady is describing—an ability through secondary legislation effectively to change primary legislation—are obviously viewed askance by some colleagues if too wide in scope. We do use them, of course, but normally in relatively limited circumstances. Creating a brand new criminal offence via what amounts to a Henry VIII power would be quite a wide application of the power, but it is an idea that I am perfectly happy to go away and reflect on. I thank her for mentioning the idea.

A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.

Question put, That the amendment be made.

Division 1

Ayes: 6

Labour: 5
Scottish National Party: 1

Noes: 9

Conservative: 9

--- Later in debate ---
Alex Davies-Jones

On clause 7, as I have previously mentioned, we were all pleased to see the Government bring in more provisions to tackle pornographic content online, much of which is easily accessible and can cause harm to those viewing it and potentially to those involved in it.

As we have previously outlined, a statutory duty of care for social platforms online has been missing for far too long, but we made it clear on Second Reading that such a duty will only be effective if we consider the systems, business models and design choices behind how platforms operate. For too long, platforms have been abuse-enabling environments, but it does not have to be this way. The amendments that we will shortly consider are largely focused on transparency, as we all know that the duties of care will only be effective if platforms are compelled to proactively supply their assessments to Ofcom.

On clause 21, the duty of care approach is one that the Opposition support and it is fundamentally right that search services are subject to duties including illegal content risk assessments, illegal content assessments more widely, content reporting, complaints procedures, duties about freedom of expression and privacy, and duties around record keeping. Labour has long held the view that search services, while not direct hosts of potentially damaging content, should have responsibilities that see them put a duty of care towards users first, as we heard in our evidence sessions from HOPE not hate and the Antisemitism Policy Trust.

It is also welcome that the Government have committed to introducing specific measures for regulated search services that are likely to be accessed by children. However, those measures can and must go further, so we will be putting forward some important amendments as we proceed.

Labour does not oppose clause 22, either, but I would like to raise some important points with the Minister. We do not want to be in a position whereby those designing, operating and using a search engine in the United Kingdom are subject to a second-rate internet experience. We also do not want to be in a position where we are forcing search services to choose what is an appropriate design for people in the UK. It would be worrying indeed if our online experience vastly differed from that of, let us say, our friends in the European Union. How exactly will clause 22 ensure parity? I would be grateful if the Minister could confirm that before we proceed.

Chris Philp

The shadow Minister has already touched on the effect of these clauses: clause 6 sets out duties applying to user-to-user services in a proportionate and risk-based way; clause 7 sets out the scope of the various duties of care; and clauses 21 and 22 do the same in relation to search services.

In response to the point about whether the duties on search will end up providing a second-rate service in the United Kingdom, I do not think that they will. The duties have been designed to be proportionate and reasonable. Throughout the Bill, Members will see that there are separate duties for search and for user-to-user services. That is reflected in the symmetry—which appears elsewhere, too—of clauses 6 and 7, and clauses 21 and 22. We have done that because we recognise that search is different. It indexes the internet; it does not provide a user-to-user service. We have tried to structure these duties in a way that is reasonable and proportionate, and that will not adversely impair the experience of people in the UK.

I believe that we are ahead of the European Union in bringing forward this legislation and debating it in detail, but the European Union is working on its Digital Services Act. I am confident that there will be no disadvantage to people conducting searches in United Kingdom territory.

Question put and agreed to.

Clause 6 accordingly ordered to stand part of the Bill.

Clause 7 ordered to stand part of the Bill.

Clause 8

Illegal content risk assessment duties

Alex Davies-Jones

I beg to move amendment 10, in clause 8, page 6, line 33, at end insert—

“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.

--- Later in debate ---
Chris Philp

Clause 8 sets out the risk assessment duties for illegal content, as already discussed, that apply to user-to-user services. Ofcom will issue guidance on how companies can undertake those. To comply with those duties, companies will need to take proportionate measures to mitigate the risks identified in those assessments. The clause lists a number of potential risk factors the providers must assess, including how likely it is that users will encounter illegal content, as defined later in the Bill,

“by means of the service”.

That phrase is quite important, and I will come to it later, when discussing some of the amendments, because it does not necessarily mean just on the service itself; as a cross-platform point, it can also cover other sites where users might find themselves via the service. That phrase is important in the context of some of the reasonable queries about cross-platform risks.

Moving on, companies will also need to consider how the design and operation of their service may reduce or increase the risks identified. Under schedule 3, which we will vote on, or at least consider, later on, companies will have three months to carry out risk assessments, which must be kept up to date so that fresh risks that may arise from time to time can be accommodated. Therefore, if changes are made to the service, the risks can be considered on an ongoing basis.

Amendment 10 relates to the broader question that the hon. Member for Liverpool, Walton posed about transparency. The Bill already contains obligations to publish summary risk assessments on legal but harmful content. That refers to some of the potentially contentious or ambiguous types of content for which public risk assessments would be helpful. The companies are also required to make available those risk assessments to Ofcom on request. That raises a couple of questions, as both the hon. Member for Liverpool, Walton mentioned and some of the amendments highlighted. Should companies be required to proactively serve up their risk assessments to Ofcom, rather than wait to be asked? Also, should those risk assessments all be published—probably online?

In considering those two questions, there are a couple of things to think about. The first is Ofcom’s capacity. As we have discussed, 25,000 services are in scope. If all those services proactively delivered a copy of their risk assessment, even if they are very low risk and of no concern to Ofcom or, indeed, any of us, they would be in danger of overwhelming Ofcom. The approach contemplated in the Bill is that, where Ofcom has a concern or the platform is risk assessed as being significant—to be clear, that would apply to all the big platforms—it will proactively make a request, which the platform will be duty bound to meet. If the platform does not do that, the senior manager liability and the two years in prison that we discussed earlier will apply.

Alex Davies-Jones

The Minister mentioned earlier that Ofcom would be adequately resourced and funded to cope with the regulatory duty set out in the Bill. If Ofcom is not able to receive risk assessments for all the platforms potentially within scope, even if those platforms are not deemed to be high risk, does that not call into question whether Ofcom has the resource needed to actively carry out its duties in relation to the Bill?

--- Later in debate ---
Chris Philp

Of course, Ofcom is able to request any of them if it wants to—if it feels that to be necessary—but receiving 25,000 risk assessments, including from tiny companies that basically pose pretty much no risk at all and hardly anyone uses, would, I think, be an unreasonable and disproportionate requirement to impose. I do not think it is a question of the resources being inadequate; it is a question of being proportionate and reasonable.

Dan Carden

The point I was trying to get the Minister to think about was the action of companies in going through the process of these assessments and then making that information publicly available to civil society groups; it is about transparency. It is what the sector needs; it is the way we will find and root out the problems, and it is a great missed opportunity in this Bill.

Chris Philp

To reassure the hon. Member on the point about doing the risk assessment, all the companies have to do the risk assessment. That obligation is there. Ofcom can request any risk assessment. I would expect, and I think Parliament would expect, it to request risk assessments either where it is concerned about risk or where the platform is particularly large and has a very high reach—I am thinking of Facebook and companies like that. But hon. Members are talking here about requiring Ofcom to receive and, one therefore assumes, to consider, because what is the point of receiving an assessment unless it considers it? Receiving it and just putting it on a shelf without looking at it would be pointless, obviously. Requiring Ofcom to receive and look at potentially 25,000 risk assessments strikes me as a disproportionate burden. We should be concentrating Ofcom’s resources—and it should concentrate its activity, I submit—on those companies that pose a significant risk and those companies that have a very high reach and large numbers of users. I suggest that, if we imposed an obligation on it to receive and to consider risk assessments for tiny companies that pose no risk, that would not be the best use of its resources, and it would take away resources that could otherwise be used on those companies that do pose risk and that have larger numbers of users.

Kim Leadbeater

Just to be clear, we are saying that the only reason why we should not be encouraging the companies to do the risk assessment is that Ofcom might not be able to cope with dealing with all the risk assessments. But surely that is not a reason not to do it. The risk assessment is a fundamental part of this legislation. We have to be clear that there is no point in the companies having those risk assessments if they are not visible and transparent.

Chris Philp

All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.

Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. In relation to comprehensive public disclosure, there are legitimate questions about public disclosure and about getting to the heart of what is going on in these companies in the way in which Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.

Kirsty Blackman

If the rules are so bad that people can circumvent them, they are not good enough anyway and they need to be updated, but I have a specific question on this. The Minister says that Ofcom will be taking in the biggest risk assessments, looking at them and ensuring that they are adequate. Will he please give consideration to asking Ofcom to publish the risk assessments from the very biggest platforms? Then they will all be in one place. They will be easy for people to find and people will not have to rake about in the bottom sections of a website. And it will apply only in the case of the very biggest, most at risk platforms, which should be regularly updating their risk assessments and changing their processes on a very regular basis in order to ensure that people are kept safe.

Chris Philp

I thank the hon. Lady for her intervention and for the—

The Chair

Order. I am sorry to interrupt the Minister, but I now have to adjourn the sitting until this afternoon, when the Committee will meet again, in Room 9 and with Ms Rees in the Chair.