Kirsty Blackman debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Online Safety Bill (Thirteenth sitting)

Kirsty Blackman Excerpts
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship once again, Ms Rees, and I congratulate Committee members on evading this morning’s strike action.

I am delighted that the shadow Minister supports the intent behind these clauses, and I will not speak at great length given the unanimity on this topic. As she said, clause 118 allows Ofcom to impose a financial penalty for failure to take specified steps by a deadline set by Ofcom. The maximum penalty that can be imposed is the greater of £18 million or 10% of qualifying worldwide revenue. In the case of large companies, it is likely to be a much larger amount than £18 million.
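
The maximum penalty the Minister describes is simply the larger of the two figures. The short sketch below illustrates that calculation only; the function and variable names are illustrative rather than drawn from the Bill.

```python
def maximum_penalty(qualifying_worldwide_revenue: float) -> float:
    """Return the statutory cap: the greater of GBP 18 million or
    10% of qualifying worldwide revenue (illustrative sketch only)."""
    return max(18_000_000.0, 0.10 * qualifying_worldwide_revenue)

# Example: a provider with GBP 5 billion of qualifying worldwide revenue
# faces a cap of GBP 500 million, well above the GBP 18 million floor.
print(maximum_penalty(5_000_000_000))  # 500000000.0
```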

Clause 119 enables Ofcom to impose financial penalties if the recipient of a section 103 notice does not comply by the deadline. It is very important to ensure that section 103 has proper teeth. Government amendments 154 to 157 make changes that allow Ofcom to recover not only the cost of running the service once the Bill comes into force and into the future but also the preparatory cost of setting up for the Bill to come into force.

As previously discussed, £88 million of funding is being provided to Ofcom in this financial year and next. We believe that something like £20 million of costs that predate these financial years have been funded as well. That adds up to around £108 million. However, the amount that Ofcom recovers will be the actual cost incurred. The figure I provided is simply an indicative estimate. The actual figure would be based on the real costs, which Ofcom would be able to recoup under these measures. That means that the taxpayer—our constituents—will not bear any of the costs, including the set-up and preparatory cost. This is an equitable and fair change to the Bill.

Clause 120 sets out that some regulated providers will be required to pay a regulatory fee to Ofcom, as set out in clause 71. Clause 120 allows Ofcom to impose a financial penalty if a regulated provider does not pay its fee by the deadline it sets. Finally, clause 121 sets out the information that needs to be included in these penalty notices issued by Ofcom.

Kirsty Blackman (Aberdeen North) (SNP)

I have questions about the management of the fees and the recovery of the preparatory cost. Does the Minister expect that the initial fees will be higher as a result of having to recoup the preparatory cost and will then reduce? How quickly will the preparatory cost be recovered? Will Ofcom recover it quickly or over a longer period of time?

Chris Philp

The Bill provides a power for Ofcom to recover those costs. It does not specify over what time period. I do not think they will be recouped over a period of years. Ofcom can simply recoup the costs in a single hit. I would imagine that Ofcom would seek to recover these costs pretty quickly after receiving these powers. The £108 million is an estimate. The actual figure may be different once the reconciliation and accounting is done. It sounds like a lot of money, but it is spread among a number of very large social media firms. It is not a large amount of money for them in the context of their income, so I would expect that recouping to be done on an expeditious basis—not spread over a number of years. That is my expectation.

Question put and agreed to.

Clause 118 accordingly ordered to stand part of the Bill.

Clause 119 ordered to stand part of the Bill.

Clause 120

Non-payment of fee

Amendments made: 154, in clause 120, page 102, line 20, after “71” insert:

“or Schedule (Recovery of OFCOM’s initial costs)”.

This amendment, and Amendments 155 to 157, ensure that Ofcom have the power to impose a monetary penalty on a provider of a service who fails to pay a fee that they are required to pay under NS2.

Amendment 155, in clause 120, page 102, line 21, leave out “that section” and insert “Part 6”.

Amendment 156, in clause 120, page 102, line 26, after “71” insert—

“or Schedule (Recovery of OFCOM’s initial costs)”

Amendment 157, in clause 120, page 103, line 12, at end insert—

“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)

Clause 120, as amended, ordered to stand part of the Bill.

Clause 121 ordered to stand part of the Bill.

Clause 122

Amount of penalties etc

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones

With your permission, Ms Rees, I will speak to clause stand part and clauses 124 to 127 at the same time. Labour supports clause 123, which outlines the powers that Ofcom will have when applying to the court for business disruption measures. Business disruption measures are court orders that require third parties to withdraw services or block access to non-compliant regulated services. It is right that Ofcom has these tools at its disposal, particularly if it is going to be able to regulate effectively against the most serious instances of user harm. However, the Bill will be an ineffective regime if Ofcom is forced to apply for separate court orders when trying to protect people across the board from the same harms. We have already waited too long for change. Labour is committed to giving Ofcom the powers to take action, where necessary, as quickly as possible. That is why we have tabled amendments 50 and 51, which we feel will go some way in tackling these issues.

Amendment 50 would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for—and/or appeal through the courts against any—orders to block access or support services. The Bill currently requires Ofcom to seek a separate court order for each service against which it wishes to take enforcement action in the form of blocking access or services. That is the only effective mechanism for overseas websites. UK-based services will be subject to enforcement notices and financial penalties that can be enforced without having to go to court. That creates a disadvantage for UK sites, which can be more easily enforced against.

Given that there are 4 million to 5 million pornographic websites, for example, the requirement for separate court orders will prevent Ofcom from taking action at scale and creating a level playing field for all adult sites. Under the Bill, Ofcom must take action against each offending website or social media company individually. While we acknowledge that the Government have stated that enforcement action can be taken against multiple offending content providers, in our opinion that is not made clear in the Bill.

Moreover, we are concerned that some pornography websites would seek to avoid the Bill’s requirements by changing their domain name—domain hopping. That was threatened last year when Germany moved to issue a blocking order against major providers of internet pornography. That is why Ofcom must be granted clear enforcement powers to take swift action against multiple websites and content providers in one court action or order.

This group of amendments would also provide clarity and ease of enforcement for internet service providers, which will be expected to enforce court orders. Labour wants the Bill to be genuinely effective, and amendments 50 and 51 could ensure that Ofcom has the tools available to it to take action at pace. We urge the Minister to accept these small concessions, which could have a hugely positive impact.

Amendment 51 would give Ofcom the ability to take action against a schedule of non-compliant sites, while preserving the right of those sites to oppose an application for an order to block access or support services, or to appeal through the courts against any such order.

It will come as no surprise that Labour supports clause 124, which sets out the circumstances in which Ofcom may apply to the courts for an interim service restriction order. We particularly support the need for Ofcom to be able to take action when time is not on its side, or where, put plainly, the level of harm being caused means that it would be inappropriate to wait for a definite failure before taking action.

However, we hope that caution is exercised if Ofcom ever needs to consider such an interim order; we must, of course, get the balance right in our approach to internet regulation more widely. I would therefore be grateful if the Minister could outline his understanding of the specifics of when these orders may be applied. More broadly, Labour agrees that Ofcom should be given the power to act when time demands it, so we have not sought to amend clause 124 at this stage.

Labour also supports the need for Ofcom to have the power to apply to the courts for an access restriction order, as outlined in clause 125. It is vital that Ofcom is given the power to prevent, restrict or deter individuals in the UK from accessing a service from a non-compliant provider. We welcome the specific provisions on access via internet service providers and app stores. We all know from Frances Haugen’s testimony that harmful material can often be easily buried, so it is right and proper that those are considered as “access facilities” under the clause. Ultimately, we support the intentions of clause 125 and, again, have not sought to amend it at this stage.

We also support clause 126, which sets out the circumstances in which Ofcom may apply to the courts for an interim access restriction order. I will not repeat myself: for the reasons I have already outlined, it is key that Ofcom has sufficient powers to act, particularly on occasions when it is inappropriate to wait for a failure to be established.

We welcome clause 127, which clarifies how Ofcom’s enforcement powers can interact. We particularly welcome clarification that, where Ofcom exercises its power to apply to the courts for a business disruption order under clauses 123 to 126, it is not precluded from taking action under its other enforcement powers. As we have repeatedly reiterated, we welcome Ofcom’s having sufficient power to reasonably bring about positive change and increase safety measures online. That is why we have not sought to amend clause 127.

Kirsty Blackman

Thank you for chairing this morning’s sitting, Ms Rees.

I agree with the hon. Member for Pontypridd that these clauses are necessary and important, but I also agree that the amendments are important. It seems like this is a kind of tidying-up exercise, to give Ofcom the ability to act in a way that will make its operation smoother. We all want this legislation to work. This is not an attempt to break this legislation—to be fair, none of our amendments have been—but an attempt to make things work better.

Amendments 50 and 51 are fairly similar to the one that the National Society for the Prevention of Cruelty to Children proposed to clause 103. They would ensure that Ofcom could take action against a group of sites, particularly if they were facing the same kind of issues, they had the same kind of functionality, or the same kind of concerns were being raised about them.

--- Later in debate ---
Chris Philp

I repeat the point I made to the hon. Member for Liverpool, Walton a moment ago. This is simply an obligation to consult. The clause gives the Secretary of State an opportunity to offer an opinion, but it is just that—an opinion. It is not binding on Ofcom, which may take that opinion into account or not at its discretion. This provision sits alongside the requirement to consult the Information Commissioner’s Office. I respectfully disagree with the suggestion that it represents unwarranted and inappropriate interference in the operation of a regulator. Consultation between organs of state is appropriate and sensible, but in this case it does not fetter Ofcom’s ability to act at its own discretion. I respectfully do not agree with the shadow Minister’s analysis.

Kirsty Blackman

Apologies, Ms Rees, for coming in a bit late on this, but I was not aware of the intention to vote against the clause. I want to make clear what the Scottish National party intends to do, and the logic behind it. The inclusion of Government amendment 7 is sensible, and I am glad that the Minister has tabled it. Clause 129 is incredibly important, and the requirement to publish guidance will ensure that there is a level of transparency, which we and the Labour Front Benchers have been asking for.

The Minister has been clear about the requirement for Ofcom to consult the Secretary of State, rather than to be directed by them. As a whole, this Bill gives the Secretary of State far too much power, and far too much ability to intervene in the workings of Ofcom. In this case, however, I do not have an issue with the Secretary of State being consulted, so I intend to support the inclusion of this clause, as amended by Government amendment 7.



Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones

Clause 130 sets up a committee to advise Ofcom on misinformation and disinformation, which is the only direct reference to misinformation and disinformation in the entire Online Safety Bill. However, the Bill gives the committee no identifiable powers or active role in tackling harmful misinformation and disinformation, meaning that it has limited practical purpose. It is also unclear how the advisory committee will fit with Ofcom’s wider regulatory functions.

The remaining provisions in the Bill are limited and do not properly address harmful misinformation and disinformation. If tackling harmful misinformation and disinformation is left to this clause, the Bill will fail both to tackle harm properly, and to keep children and adults safe.

The clause risks giving a misleading impression that action is being taken. If the Government and Ofcom proceed with creating the committee, we need to see that its remit is strengthened and clarified, so that it more effectively tackles harmful disinformation and misinformation. That should include advising on Ofcom’s research, reporting on drivers of harmful misinformation and disinformation, and proportionate responses to them. There should also be a duty on Ofcom to consult the committee when drafting relevant codes of practice.

That is why we have tabled amendment 57. It would change the period by which the advisory committee must report from 18 months to six. This is a simple amendment that encourages scrutiny. Once again, the Minister surely has little reason not to accept it, especially as we have discussed at length the importance of the advisory committee having the tools that it needs to succeed.

Increasing the regularity of these reports from the advisory committee is vital, particularly given the ever-changing nature of the internet. Labour has already raised concerns about the lack of futureproofing in the Bill more widely, and we feel that the advisory committee has an important role and function to play in areas where the Bill itself is lacking. We are not alone in this view; the Minister has heard from his Back Benchers about just how important this committee is.

Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. Again, this amendment will come as no surprise to the Minister, given the concerns that Labour has repeatedly raised about the lack of provisions relating to disinformation in the Bill. It seems like an obvious omission that the Bill has failed to consider a specific code of practice around reducing disinformation, and the amendment would be a simple way to ensure that Ofcom actively encourages services to reduce disinformation across their platforms. The Minister knows that this would be a welcome step, and I urge him to consider supporting the amendment.

Kirsty Blackman

I want to briefly agree with the sentiments of the Opposition Front Bench, especially about the strength of the committee and the lack of teeth that it currently has. Given that the Government have been clear that they are very concerned about misinformation and disinformation, it seems odd that they are covered in the Bill in such a wishy-washy way.

The reduction of the time from 18 months to six months would also make sense. We would expect the initial report that the committee publishes in six months not to be as full as the ones it would publish after that. I do not see any issue with it being required to produce a report as soon as possible to assess how the Act is bedding in and beginning to work, rather than having to wait to make that assessment until the Act is potentially already properly working. We want to be able to pick up any teething problems that the Act might have.

We want the committee to be able to say, “Actually, this is not working quite as we expected. We suggest that Ofcom operates in a slightly different way or that the interaction with providers happens in a slightly different way.” I would rather that problems with the Act were tackled as early as possible. We will not know about problems with the Act, because there is no proper review mechanism. There is no agreement on the committee, for example, to look at how the Act is operating. This is one of the few parts of the Bill where we have got an agreement to a review, and it would make sense that it happen as early as possible.

We agree that misinformation and disinformation are very important matters that really need to be tackled, but there is just not enough clout in the Bill to allow Ofcom to properly tackle these issues that are causing untold harm.

Dan Carden

When I spoke at the very beginning of the Committee’s proceedings, I said that the legislation was necessary, that it was a starting point and that it would no doubt change and develop over time. However, I have been surprised at how little, considering all of the rhetoric we have heard from the Secretary of State and other Ministers, the Bill actually deals with the general societal harm that comes from the internet. This is perhaps the only place in the Bill where it is covered.

I am thinking of the echo chambers that are created around disinformation and the algorithms that companies use. I really want to hear from the Minister where he sees this developing and why it is so weak and wishy-washy. While I welcome that much of the Bill seeks to deal with the criminality of individuals and the harm and abuse that can be carried out over the internet, overall it misses a great opportunity to deal with the harmful impact the internet can have on society.

--- Later in debate ---
Chris Philp

Clearly, resourcing of the upper tribunal is a matter decided jointly by the Lord Chancellor and the Secretary of State for Justice, in consultation with the Lord Chief Justice, and, in this case, the Senior President of Tribunals. Parliament would expect the resourcing of that part of the upper tribunal to be such that cases could be heard in an expedited manner. Particularly where cases concern the safety of the public—and particularly of children—we expect that to be done as quickly as it can.

Question put and agreed to.

Clause 138 accordingly ordered to stand part of the Bill.

Clause 139 ordered to stand part of the Bill.

Clause 140

Power to make super-complaints

Kirsty Blackman

I beg to move amendment 143, in clause 140, page 121, line 1, after “services” insert “, consumers”.

The Chair

With this it will be convenient to discuss the following:

Amendment 144, in clause 140, page 121, line 2, after “users” insert “, consumers”.

Amendment 145, in clause 140, page 121, line 4, after “services” insert “, consumers”.

Amendment 146, in clause 140, page 121, line 5, after “users” insert “, consumers”.

Amendment 147, in clause 140, page 121, line 6, at end insert “, consumers”.

Amendment 148, in clause 140, page 121, line 7, after “users” insert “, consumers”.

Amendment 149, in clause 140, page 121, line 14, after “service” insert “, consumers”.

Amendment 150, in clause 140, page 121, line 18, at end insert “, consumers”.

Amendment 151, in clause 140, page 121, line 19, after “users” insert “, consumers”.

Amendment 152, in clause 140, page 121, line 25, at end insert—

"'consumers' means individuals in the United Kingdom acting for purposes that are wholly or mainly outside the trade, business, craft or profession of the individuals concerned."

Kirsty Blackman

The Committee has been flexible about grouping clauses should it make sense to do so. I ask that the Committee allow me to speak to this set of amendments alone. It does not make sense for me to discuss these amendments and amendment 77 at the same time. If I could separately discuss amendment 77, as it says on the Order Paper, then I would appreciate that.

This group of amendments specifically relate to consumer protection. It is the case that online fraud facilitated through social media platforms and search engines is one of the most prevalent forms of crime today. Reported incidents increased significantly during the pandemic, and often resulted in victims losing life-changing amounts of money. In addition to the financial impact of being scammed, there is the emotional and physical impact. We know it has a significant effect on people’s mental health. I am glad that the Government listened to the Joint Committee and the Culture, Media and Sport Committee, and changed the legislation to include fraud.

Amendment 143 is about expanding who can make super-complaints, in order to reflect the expansion of the Bill to include fraud. The Bill does not leave a lot of the details around super-complaints to be made in secondary legislation. These amendments specifically allow groups that are acting on behalf of consumers, or those who are making requests on behalf of consumers, to make super-complaints. I am not sure that if somebody is acting on behalf of consumers that fits into the definitions of users of the service and people representing users of the service. Perhaps the Minister can convince me otherwise. If consumers are losing significant amounts of money, or where there is risk of significant numbers of people losing significant amounts of money—for example, where a search engine allows fraudulent advertising to be the top result—including “consumers” in the Bill will allow organisations acting on behalf of consumers to take action. It may be that the Minister can give me some comfort in this, and let us know that organisations acting on behalf of consumers would potentially—if they meet other criteria—be able to put forward a super-complaint.

I understand that there are other methods of complaining—it is possible for other complaints to be made. However, given the significant increase in the risk to consumers in the past few years, it would seem sensible that the Minister give some consideration to whether this is adequately covered in the Bill, and whether consumers are adequately protected in this section of the Bill, as well as in the additional fraud clauses that the Minister added between publication of the original draft Bill and the Bill that we have before us today.

--- Later in debate ---
Chris Philp

Clearly, we want the super-complaint function to be as effective as possible and for groups of relevant people, users or members of the public to be able to be represented by an eligible entity to raise super-complaints. I believe we are all on the same page in wanting to do that. If I am honest, I am a little confused as to what the addition of the term “consumers” will add. The term “users” is defined quite widely, via clause 140(6), which then refers to clause 181, where, as debated previously, a “user” is defined widely to include anyone using a service, whether registered or not. So if somebody stumbles across a website, they count as a user, but the definition being used in clause 140 about bringing super-complaints also includes “members of the public”—that is, regular citizens. Even if they are not a user of that particular service, they could still be represented in bringing a complaint.

Given that, by definition, “users” and “members of the public” already cover everybody in the United Kingdom, I am not quite sure what the addition of the term “consumers” adds. By definition, consumers are a subset of the group “users” or “members of the public”. It follows that in seeking to become an eligible entity, no eligible entity will purport to act for everybody in the United Kingdom; they will always be seeking to define some kind of subset of people. That might be children, people with a particular vulnerability or, indeed, consumers, who are one such subset of “members of the public” or “users”. I do not honestly understand what the addition of the word “consumers” adds here when everything is covered already.

Kirsty Blackman

Will the Minister explicitly say that he thinks that an eligible entity, acting on behalf of consumers, could, if it fulfils the other criteria, bring a super-complaint?

Chris Philp

Yes, definitely. That is the idea of an eligible entity, which could seek to represent a particular demographic, such as children or people from a particular marginalised group, or it could represent people who have a particular interest, which would potentially include consumers. So I can confirm that that is the intention behind the drafting of the Bill. Having offered that clarification and made clear that the definition is already as wide as it conceivably can be—we cannot get wider than “members of the public”—I ask the hon. Member for Aberdeen North to consider withdrawing the amendments, particularly as there are so many. It will take a long time to vote on them.

Kirsty Blackman

I thank the Minister for the clarification. Given that he has explicitly said that he expects that groups acting on behalf of consumers could, if they fulfil the other criteria, be considered as eligible entities for making super-complaints, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendment proposed: 66, in clause 140, page 121, line 8, at end insert—

“(d) causing harm to any human or animal.”

This amendment ensures groups are able to make complaints regarding animal abuse videos.—(Alex Davies-Jones.)

Division 42

Ayes: 5 (Labour: 4; Scottish National Party: 1)

Noes: 9 (Conservative: 9)

Kirsty Blackman

I beg to move amendment 77, in clause 140, page 121, line 9, leave out subsection (2).

This amendment removes the tests that complaints have to be of particular importance in order to be admissible.

When I first read clause 140, subsection (2) raised a significant number of red flags for me. The subsection might be reasonable if we did not have giant companies—social media platforms particularly—that significant numbers of people across the UK use regularly. Facebook might be counted as a “single regulated service”, but 85% of UK residents—57.1 million people—had a Facebook account earlier this year. Twitter is used by 28% of people living in the UK, which is 19 million users. TikTok is at 19%, which is significantly less, but still a very high number of people—13 million users. I can understand the decision that a super-complaint picking on one certain company might be a bit extreme, but it does not make sense when we are considering the Facebooks of this world.

If someone is making a complaint about a single regulated service and that service is Facebook, Twitter, TikTok or another large platform—or a new, yet-to-be-created platform—that significant numbers of people use, there is no justification for treating that complaint differently just because it is against a single entity. When a complaint is made against Facebook—I am picking on Facebook because 85% of the UK public are members of it; it is an absolute behemoth—I would like there to be no delay in its being taken to Ofcom. I would like Ofcom not to have to check and justify that the complaint is “of particular importance”.

Subsection (2)(a) states that one of the tests of the complaint should be that it “is of particular importance” or, as subsection (2)(b) notes, that it

“relates to the impacts on a particularly large number of users of the service or members of the public.”

I do not understand what

“large number of users of the service”

would mean. Does a large number of the users of Facebook mean 50% of its users? Does it mean 10%? What is a large number? Is that in percentage terms, or is it something that is likely to impact 1 million people? Is that a large number? The second part—

“large number…of members of the public”—

is again difficult to define. I do not think there is justification for this additional hoop just because the complaint relates to a single regulated service.

Where a complaint relates to a very small platform that is not causing significant illegal harm, I understand that Ofcom may want to consider whether it will accept, investigate and give primacy and precedence to that. If the reality is that the effect is non-illegal, fairly minor and impacts a fairly small number of people, in the order of hundreds instead of millions, I can understand why Ofcom might not want to give that super-complaint status and might not want to carry out the level of investigation and response necessary for a super-complaint. But I do not see any circumstances in which Ofcom could justify rejecting a complaint against Facebook simply because it is a complaint against a single entity. The reality is that if something affects one person on Facebook, it will affect significantly more than one person on Facebook because of Facebook’s absolutely massive user base. Therefore this additional hoop is unrealistic.

Paragraph (a), about the complaint being “of particular importance”, is too woolly. Does it relate only to complaints about things that are illegal? Does it relate only to things that are particularly urgent—something that is happening now and that is having an impact today? Or is there some other criterion that we do not yet know about?

I would very much appreciate it if the Minister could give some consideration to amendment 77, which would simply remove subsection (2). If he is unwilling to remove that subsection, I wonder whether we could meet halfway and whether, let us say, category 1 providers could all be excluded from the “single provider” exemption, because they have already been assessed by Ofcom to have particular risks on their platforms. That group is wider than the three names that I have mentioned, and I think that that would be a reasonable and realistic decision for the Government—and direction for Ofcom—to take. It would be sensible.

If the Government believe that there is more information—more direction—that they could add to the clause, it would be great if the Minister could lay some of that out here and let us know how he intends subsection (2) to operate in practice and how he expects Ofcom to use it. I get that people might want it there as an additional layer of protection, but I genuinely do not imagine that it can be justified in the case of the particularly large providers, where there is significant risk of harm happening.

I will illustrate that with one last point. The Government specifically referred earlier to when Facebook—Meta—stopped proactively scanning for child sexual abuse images because of an issue in Europe. The Minister mentioned the significant amount of harm and the issues that were caused in a very small period. And that was one provider—the largest provider that people use and access. That massive amount of harm can be caused in a very small period. I do not support allowing Meta or any other significantly large platform to have a “get out of jail” card. I do not want them to be able to go to Ofcom and say, “Hey, Ofcom, we’re challenging you on the basis that we don’t think this complaint is of particular importance” or “We don’t think the complaint relates to the impacts on a particularly large number of users of the service or members of the public.” I do not want them to have that ability to wriggle out of things because this subsection is in the Bill, so any consideration that the Minister could give to improving clause 140 and subsection (2) would be very much appreciated.

Barbara Keeley

We support the SNP’s amendment 77, moved by the hon. Member for Aberdeen North. The super-complaints mechanism introduced by clause 140 is a useful device for reporting numerous, widespread concerns about the harm caused by multiple or single services or providers. Subsection (1) includes the conditions on the subjects of super-complaints, which can relate to one or more services. However, as the hon. Member has pointed out, that is caveated by subsection (2), under which a super-complaint that refers to a single service or provider must prove, as she has just outlined, that it is “of particular importance” or

“relates to the impacts on a particularly large number of users of the service or members of the public.”

Given the various hoops through which a super-complaint already has to jump, it is not clear why the additional conditions are needed. Subsection (2) significantly muddies the waters and complicates the provisions for super-complaints. For instance, how does the Minister expect Ofcom to decide whether the complaint is of particular importance? What criteria does he expect the regulator to use? Why include it as a metric in the first place when the super-complaint has already met the standards set out in subsection (1)?

--- Later in debate ---
Chris Philp

I think the Committee, and the House, are pretty unanimous in agreeing that the power to make super-complaints is important. As we have discussed, there are all kinds of groups, such as children, under-represented groups and consumers, that would benefit from being represented where systemic issues are not being addressed and that Ofcom may have somehow overlooked or missed in the discharge of its enforcement powers.

I would observe in passing that one of the bases on which super-complaints can be made—this may be of interest to my hon. Friend the Member for Don Valley—is where there is a material risk under clause 140(1)(b) of

“significantly adversely affecting the right to freedom of expression within the law of users of the services or members of the public”.

That clause is another place in the Bill where freedom of expression is expressly picked out and supported. If freedom of expression is ever threatened in a way that we have not anticipated and that the Bill does not provide for, there is a particular power here for a particular free speech group, such as the Free Speech Union, to make a super-complaint. I hope that my hon. Friend finds the fact that freedom of expression is expressly laid out there reassuring.

Let me now speak to the substance of amendment 77, tabled by the hon. Member for Aberdeen North. It is important to first keep in mind the purpose of the super-complaints, which, as I said a moment ago, is to provide a basis for raising issues of widespread and systemic importance. That is the reason for some of the criteria in subsection (1)(a), (b) and (c), and why we have subsection (2)—because we want to ensure that super-complaints are raised only if they are of a very large scale or have a profound impact on freedom of speech or some other matter of particular importance. That is why the tests, hurdles and thresholds set out in clause 140(2) have to be met.

If we were to remove subsection (2), as amendment 77 seeks to, that would significantly lower the threshold. We would end up having super-complaints that were almost individual in nature. We set out previously why we think an ombudsman-type system or having super-complaints used for near-individual matters would not be appropriate. That is why the clause is there, and I think it is reasonable that it is.

The hon. Lady asked a couple of questions about how this arrangement might operate in practice. She asked whether a company such as Facebook would be caught if it alone were doing something inappropriate. The answer is categorically yes, because the condition in clause 140(2)(b)—

“impacts on a particularly large number of users”,

which would be a large percentage of Facebook’s users,

“or members of the public”—

would be met. Facebook and—I would argue—any category 1 company would, by definition, be affecting large numbers of people. The very definition of category 1 includes the concept of reach—the number of people being affected. That means that, axiomatically, clause 140(2)(b) would be met by any category 1 company.

The hon. Lady also raised the question of Facebook, for a period of time in Europe, unilaterally ceasing to scan for child sexual exploitation and abuse images, which, as mentioned, led to huge numbers of child sex abuse images and, consequently, huge numbers of paedophiles not being detected. She asks how these things would be handled under the clause if somebody wanted to raise a super-complaint about that. Hopefully, Ofcom would stop them happening in the first place, but if it did not the super-complaint redress mechanism would be the right one. These things would categorically be caught by clause 140(2)(a), because they are clearly of particular importance.

In any reasonable interpretation of the words, the test of “particular importance” is manifestly met when it comes to stopping child sexual exploitation and abuse and the detection of those images. That example would categorically qualify under the clause, and a super-complaint could, if necessary, be brought. I hope it would never be necessary, because that is the kind of thing I would expect Ofcom to catch.

Having talked through the examples from the hon. Lady, I hope I have illustrated how the clause will ensure that either large-scale issues affecting large numbers of people or issues that are particularly serious will still qualify for super-complaint status with subsection (2) left in the Bill. Given those assurances, I urge the hon. Member to consider withdrawing her amendment.

Kirsty Blackman

I welcome the Minister’s fairly explicit explanation that he believes that every category 1 company would be in scope, even if there was a complaint against one single provider. I would like to push the amendment to a vote on the basis of the comments I made earlier and the fact that each of these platforms is different. We have heard concerns about, for example, Facebook groups being interested in celebrating eight-year-olds’ birthdays. We have heard about the amount of porn on Twitter, which Facebook does not have in the same way. We have heard about the kind of algorithmic stuff that takes people down a certain path on TikTok. We have heard all these concerns, but they are all specific to that one provider. They are not a generic complaint that could be brought toward a group of providers.

Chris Philp

Would the hon. Lady not agree that in all those examples—including TikTok and leading people down dark paths—the conditions in subsection (2) would be met? The examples she has just referred to are, I would say, certainly matters of particular importance. Because the platforms she mentions are big in scale, they would also meet the test of scale in paragraph (b). In fact, only one of the tests has to be met—it is one or the other. In all the examples she has just given, not just one test—paragraph (a) or (b)— would be met, but both. So all the issues she has just raised would make a super-complaint eligible to be made.

Kirsty Blackman

I am glad the Minister confirms that he expects that that would be the case. I am clearer now that he has explained it, but on my reading of the clause, the definitions of “particular importance” or

“a particularly large number of users…or members of the public”

are not clear. I wanted to ensure that this was put on the record. While I do welcome the Minister’s clarification, I would like to push amendment 77 to a vote.

Question put, That the amendment be made.

Online Safety Bill (Twelfth sitting)

Kirsty Blackman Excerpts
Alex Davies-Jones (Pontypridd) (Lab)

Under this chapter, Ofcom will have the power to direct companies to use accredited technology to identify child sexual exploitation and abuse content, whether communicated publicly or privately by means of a service, and to remove that content quickly. Colleagues will be aware that the Internet Watch Foundation is one group that assists companies in doing that by providing them with “hashes” of previously identified child sexual abuse material in order to prevent the upload of such material to their platforms. That helps stop the images of victims being recirculated again and again. Tech companies can then notify law enforcement of the details of who has uploaded the content, and an investigation can be conducted and offenders sharing the content held to account.

Those technologies are extremely accurate and, thanks to the quality of our datasets, ensure that companies are detecting only imagery that is illegal. There are a number of types of technology that Ofcom could consider accrediting, including image hashing. A hash is a unique string of letters and numbers that can be applied to an image and matched every time a user attempts to upload a known illegal image to a platform.
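
The hash-matching approach just described can be illustrated with a short sketch. This is an editorial simplification that uses an exact cryptographic hash over the uploaded bytes and an in-memory set standing in for a database of known hashes; real deployments such as PhotoDNA, discussed below, use perceptual hashes so that resized or re-encoded copies still match, and the names here are assumptions rather than any platform's actual API.

```python
import hashlib

# Illustrative stand-in for a hash list supplied by a body such as the
# Internet Watch Foundation; the entries are placeholders, not real values.
KNOWN_IMAGE_HASHES = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

def is_known_illegal_image(upload_bytes: bytes) -> bool:
    """Return True if the upload's hash matches a known entry.

    SHA-256 is used here for simplicity; an exact hash only catches
    byte-identical copies, which is why production systems rely on
    perceptual hashing instead.
    """
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES

# A platform would run this check at upload time, block the upload on a
# match, and notify law enforcement as described above.
```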

PhotoDNA is another type, created in 2009 in a collaboration between Microsoft and Professor Hany Farid of the University of California, Berkeley. PhotoDNA is a vital tool in the detection of CSEA online. It enables law enforcement, charities, non-governmental organisations and the internet industry to find copies of an image even when it has been digitally altered. It is one of the most important technical developments in online child protection. It is extremely accurate, with a failure rate of one in 50 billion to 100 billion. That gives companies a high degree of certainty that what they are removing is illegal, and a firm basis for law enforcement to pursue offenders.

Lastly, there is webpage blocking. Most of the imagery that the Internet Watch Foundation removes from the internet is hosted outside the UK. While it is waiting for removal, it can disable public access to an image or webpage by adding it to its webpage blocking list. That can be utilised by search providers to de-index known webpages containing CSAM. I therefore ask the Minister, as we continue to explore this chapter, to confirm exactly how such technologies can be utilised once the Bill receives Royal Assent.

Labour welcomes clause 105, which confirms, in subsection (2), that where a service provider is already using technology on a voluntary basis but it is ineffective, Ofcom can still intervene and require a service provider to use a more effective technology, or the same technology in a more effective way. It is vital that Ofcom is given the power and opportunity to intervene in the strongest possible sense to ensure that safety online is kept at the forefront.

However, we do require some clarification, particularly on subsections (9) and (10), which explain that Ofcom will only be able to require the use of tools that meet the minimum standards for accuracy for detecting terrorism and/or CSEA content, as set out by the Secretary of State. Although minimum standards are of course a good thing, can the Minister clarify the exact role that the Secretary of State will have in imposing these minimum standards? How will this work in practice?

Once again, Labour does not oppose clause 106 and we have not sought to amend it at this stage. It is vital that Ofcom has the power to revoke a notice under clause 103(1) if there are reasonable grounds to believe that the provider is not complying with it. Only with these powers can we be assured that service providers will be implored to take their responsibilities and statutory duties, as outlined in the Bill, seriously.

Kirsty Blackman (Aberdeen North) (SNP)

I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.

The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.

Queries have been raised on this matter by external organisations—I am particularly thinking about the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning and ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into action.

During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.

The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.

If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.

I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?

I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Let me start by addressing some questions raised by hon. Members, beginning with the last point made by the hon. Member for Aberdeen North. She sought reconfirmation that the Bill will keep up with future developments in accredited technology that are not currently contemplated. The answer to her question can be found in clause 105(9), in which the definition of accredited technology is clearly set out, as technology that is

“accredited (by OFCOM or another person appointed by OFCOM) as meeting minimum standards of accuracy”.

That is not a one-off determination; it is a determination, or an accreditation, that can happen from time to time, periodically or at any point in the future. As and when new technologies emerge that meet the minimum standards of accuracy, they can be accredited, and the power in clause 103 can be used to compel platforms to use those technologies. I hope that provides the reassurance that the hon. Member was quite rightly asking for.

The shadow Minister, the hon. Member for Pontypridd, asked a related question about the process for publishing those minimum standards. The process is set out in clause 105(10), which says that Ofcom will give advice to the Secretary of State on the appropriate minimum standards, and the minimum standards will then be

“approved…by the Secretary of State, following advice from OFCOM.”

We are currently working with Ofcom to finalise the process for setting those standards, which of course will need to take a wide range of factors into account.

Let me turn to the substantive clauses. Clause 103 is extremely important, because as we heard in the evidence sessions and as Members of the Committee have said, scanning messages using technology such as hash matching, to which the shadow Minister referred, is an extremely powerful way of detecting CSEA content and providing information for law enforcement agencies to arrest suspected paedophiles. I think it was in the European Union that Meta—particularly Facebook and Facebook Messenger—stopped using this scanner for a short period of time due to misplaced concerns about privacy laws, and the number of referrals of CSEA images and the number of potential paedophiles who were referred to law enforcement dropped dramatically.

A point that the hon. Member for Aberdeen North and I have discussed previously is that it would be completely unacceptable if a situation arose whereby these messages—I am thinking particularly about Facebook Messenger—did not get scanned for CSEA content in a way that they do get scanned today. When it comes to preventing child sexual exploitation and abuse, in my view there is no scope for compromise or ambiguity. That scanning is happening at the moment; it is protecting children on a very large scale and detecting paedophiles on quite a large scale. In my view, under no circumstances should that scanning be allowed to stop. That is the motivation behind clause 103, which provides Ofcom with the power to make directions to require the use of accredited technology.

As the hon. Member for Aberdeen North signalled in her remarks, given the importance of this issue the Government are of course open to thinking about ways in which the Bill can be strengthened if necessary, because we do not want to leave any loopholes. I urge any social media firms watching our proceedings never to take any steps that degrade or reduce the ability to scan for CSEA content. I thank the hon. Member for sending through the note from the NSPCC, which I have received and will look at internally.

--- Later in debate ---
Alex Davies-Jones

We welcome clause 104, but have tabled some important amendments that the Minister should closely consider. More broadly, the move away from requiring child sexual exploitation and abuse content to be prevalent and persistent before enforcement action can be taken is a positive one. It is welcome that Ofcom will have the opportunity to consider a range of factors.

Despite this, Labour—alongside the International Justice Mission—is still concerned about the inclusion of prevalence as a factor, owing to the difficulty in detecting newly produced CSEA content, especially livestreamed abuse. Amendments 35, 36, 39 and 40 seek to address that gap. Broadly, the amendments aim to capture the concern about the Bill’s current approach, which we feel limits its focus to the risk of harm faced by individuals in the UK. Rather, as we have discussed previously, the Bill should recognise the harm that UK nationals cause to people around the world, including children in the Philippines. The amendments specifically require Ofcom to consider the presence of relevant content, rather than its prevalence.

Amendment 37 would require Ofcom’s risk assessments to consider risks to adults and children through the production, publication and dissemination of illegal content—an issue that Labour has repeatedly raised. I believe we last mentioned it when we spoke to amendments to clause 8, so I will do my best to not repeat myself. That being said, we firmly believe it is important that video content, including livestreaming, is captured by the Bill. I remain unconvinced that the Bill as it stands goes far enough, so I urge the Minister to closely consider and support these amendments. The arguments that we and so many stakeholders have already made still stand.

Kirsty Blackman

I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.

Chris Philp

I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing the duty on companies to remove every single piece of illegal content that has ever appeared online, because that is requesting the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in that way that makes them realistically achievable.

As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.

We mutually understand the intention behind these amendments, but we think the significant powers to compel companies to adopt certain technology contained in section 103 should be engaged only where there is a reasonable level of risk. For example, if a single piece of content was present on a platform, it may not be reasonable or proportionate to force the company to adopt certain new technologies, where indeed they do not do so at the moment. The use of “prevalence” ensures that the powers are used where necessary.

It is clear—there is no debate—that in the circumstances where scanning technology is currently used, which includes on Facebook Messenger, there is enormous prevalence of material. To elaborate on a point I made in a previous discussion, anything that stops that detection happening would be unacceptable and, in the Government’s view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning for child sexual exploitation images.

However, we think adopting the amendment and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or where there was no significant risk, because a single example would be enough to trigger the amendment, as drafted. Although I understand the spirit of the amendment, it moves away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.

Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding to use section 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider

“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.

Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.

Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already proposes to require Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is clearly already in the Bill.

I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.

--- Later in debate ---
None Portrait The Chair
- Hansard -

The Question is—

None Portrait The Chair
- Hansard -

I beg your pardon; I am trying to do too many things at once. I call Kirsty Blackman.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you very much, Sir Roger. I do not envy you in this role, which cannot be easy, particularly with a Bill that is 190-odd clauses long.

None Portrait The Chair
- Hansard -

It goes with the job.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I have a quick question for the Minister about the timelines in relation to the guidance and the commitment that Ofcom gave to producing a road map before this coming summer. When is that guidance likely to be produced? Does that road map relate to the guidance in this clause, as well as the guidance in other clauses? If the Minister does not know the answer, I have no problem with receiving an answer at a later time. Does the road map include this guidance as well as other guidance that Ofcom may or may not be publishing at some point in the future?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 sets out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers the interpretations.

The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which in evidence it committed to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. In order to avoid me giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.

It will be fair to say that the Committee’s feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.

Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.

Question put and agreed to.

Clause 107 accordingly ordered to stand part of the Bill.

Clauses 108 and 109 ordered to stand part of the Bill.

Clause 110

Provisional notice of contravention

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour welcomes this important clause, which lists the enforceable requirements. Failure to comply with those requirements can trigger enforcement action. However, the provisions could go further, so we urge the Minister to consider our important amendments.

Amendments 52 and 53 make it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment. We cannot rely solely on Ofcom to act as problems arise, when new issues could be spotted early by experts elsewhere. The entire regime depends on how bright a light we can shine into the black box of the tech companies, but only minimal data can be accessed.

The amendments would require Ofcom simply to produce a code of practice on access to data. We have already heard that without independent researchers accessing data on relevant harm, the platforms have no real accountability for how they tackle online harms. Civil society and researchers work hard to identify online harms from limited data sources, which can be taken away by the platforms if they choose. Labour feels that the Bill must require platforms, in a timely manner, to share data with pre-vetted independent researchers and academics. The EU’s Digital Services Act does that, so will the Minister confirm why such a provision is missing from this supposedly world-leading Bill?

Clause 136 gives Ofcom two years to assess whether access to data is required, and it “may”, but not “must”, publish guidance on how its approach to data access might work. The process is far too slow and, ultimately, puts the UK behind the EU, whose legislation makes data access requests possible immediately. Amendment 52 would change the “may” to “must”, and would ultimately require Ofcom to explore how access to data works, not if it should happen in the first place.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Frances Haugen’s evidence highlighted quite how shadowy a significant number of the platforms are. Does the hon. Member agree that that hammers home the need for independent researchers to access as much detail as possible so that we can ensure that the Bill is working?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I agree 100%. The testimony of Frances Haugen, the Facebook whistleblower, highlighted the fact that expert researchers and academics will need to examine the data and look at what is happening behind social media platforms if we are to ensure that the Bill is truly fit for purpose and world leading. That process should be carried out as quickly as possible, and Ofcom must also be encouraged to publish guidance on how access to data will work.

Ultimately, the amendments make a simple point: civil society and researchers should be able to access data, so why will the Minister not let them? The Bill should empower independently verified researchers and civil society to request tech companies’ data. Ofcom should be required to publish guidance as soon as possible—within months, not years—on how data may be accessed. That safety check would hold companies to account and make the internet a safer and less divisive space for everyone.

The process would not be hard or commercially ruinous, as the platforms claim. The EU has already implemented it through its Digital Services Act, which opens up the secrets of tech companies’ data to Governments, academia and civil society in order to protect internet users. If we do not have that data, researchers based in the EU will be ahead of those in the UK. Without more insight to enable policymaking, quality research and harm analysis, regulatory intervention in the UK will stagnate. What is more, without such data, we will not know Instagram’s true impact on teen mental health, nor the reality of violence against women and girls online or the risks to our national security.

We propose amending the Bill to accelerate data sharing provisions while mandating Ofcom to produce guidance on how civil society and researchers can access data, not just on whether they should. As I said, that should happen within months, not years. The provisions should be followed by a code of practice, as outlined in the amendment, to ensure that platforms do not duck and dive in their adherence to transparency requirements. A code of practice would help to standardise data sharing in a way that serves platforms and researchers.

The changes would mean that tech companies can no longer hide in the shadows. As Frances Haugen said of the platforms in her evidence a few weeks ago:

“The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 188, Q320.]

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

This data is a little different—the two domains do not directly correspond. In the health area, there has been litigation—an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules—so even in that long-established area, there is uncertainty and recent, or perhaps even current, litigation.

We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The Minister has committed to Ofcom being fully resourced to do what it needs to do under the Bill, but he has spoken about time constraints. If Ofcom were to receive 25,000 risk assessments, for example, there simply would not be enough people to go through them. Does he agree that, in cases in which Ofcom is struggling to manage the volume of data and to do the level of assessment required, it may be helpful to augment that work with the use of independent researchers? I am not asking him to commit to that, but to consider the benefits.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, I would agree that bona fide academic independent researchers do have something to offer and to add in this area. The more we have highly intelligent, experienced and creative people looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. It is an area that does not exist already—at the moment, there is no concept of independent researchers getting access to the innards of social media companies’ data vaults—so we need to make sure that it is done in the right way, which is why it is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. I hope we end up in the same place—well, the same place, but a better place. The process may be slightly slower, but we may also end up in a better place for the consideration and thought that will have to be given.

--- Later in debate ---
Oversight is required to ensure that human resources processes clearly identify the role and provide content descriptions, as well as information on possible occupational hazards. Currently, the conditions of the work are unregulated and rely on the business relationship between two parties focused on the bottom line. Platforms do not release any due diligence on the employment conditions of those contractors, if they conduct it at all. If there is to be any meaningful oversight of the risks inherent in the content moderation supply chain, it is imperative to mandate transparency around the conditions for content moderators in contracted entities. As long as that relationship is self-regulated, the wellness of human moderators will be at risk. That is why we urge the Minister to support this important amendment and new clause: there is a human element to all this. We urge him to do the right thing.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I thank the hon. Member for Pontypridd for laying out her case in some detail, though nowhere near the level of detail that these people have to experience while providing moderation. She has given a very good explanation of why she is asking for the amendment and new clause to be included in the Bill. Concerns are consistently being raised, particularly by the Labour party, about the impact on the staff members who have to deal with this content. I do not think the significance of this issue for those individuals can be overstated. If we intend the Bill to have the maximum potential impact and reduce harm to the highest number of people possible, it makes eminent sense to accept this amendment and new clause.

There is a comparison with other areas in which we place similar requirements on other companies. The Government require companies that provide annual reports to undertake an assessment in those reports of whether their supply chain uses child labour or unpaid labour, or whether their factories are safe for people to work in—if they are making clothes, for example. It would not be an overly onerous request if we were to widen those requirements to take account of the fact that so many of these social media companies are subjecting individuals to trauma that results in them experiencing PTSD and having to go through a lengthy recovery process, if they ever recover. We have comparable legislation, and that is not too much for us to ask. Unpaid labour, or people being paid very little in other countries, is not that different from what social media companies are requiring of their moderators, particularly those working outside the UK and the US in countries where there are less stringent rules on working conditions. I cannot see a reason for the Minister to reject the provision of this additional safety for employees who are doing an incredibly important job that we need them to be doing, in circumstances where their employer is not taking any account of their wellbeing.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.

I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.

In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:

“You do not have the figures, so you cannot tell me.”

Richard Earley replied:

“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”

Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.

The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:

“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]

But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.

There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

Platforms, in particular those supporting user-to-user generated content, employ those services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”

One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is not within the ambit of the regulator because it is part of a supply chain presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY, supported by an entirely separate company, would argue that it was providing a business-to-business service that is not user-generated content but content designed and delivered at arm’s length and provided to the user-to-user service to deploy for its users.

I suggest that dealing with this issue would involve a lengthy, costly and unhelpful legal process during which systems were not being effectively regulated—the same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel’s case—and complex contract law was invoked.

We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for or on the company’s behalf is found culpable for specific actions. These issues on supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.

Online Safety Bill (Eleventh sitting)

Kirsty Blackman Excerpts
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 80, in schedule 10, page 192, line 19, at end insert—

“(c) the assessed risk of harm arising from that part of the service.”

This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

Amendment 81, in schedule 10, page 192, line 39, after “functionality” insert—

“and at least one specified condition about the assessed risk of harm”

This amendment is linked to Amendment 80.

Amendment 82, in schedule 10, page 192, line 41, at end insert—

‘(4A) At least one specified condition about the assessed risk of harm must provide for a service assessed as posing a very high risk of harm to its users to meet the Category 1 threshold.”

This amendment is linked to Amendment 80; it widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

That schedule 10 be the Tenth schedule to the Bill.

Clause 81 stand part.

Clause 82 stand part.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Thank you for your efforts in chairing our meeting today, Sir Roger. My thoughts are with the hon. Member for Batley and Spen and her entire family on the anniversary of Jo Cox’s murder; the SNP would like to echo that sentiment.

I want to talk about my amendment, and I start with a quote from the Minister on Second Reading:

“A number of Members…have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered.”—[Official Report, 19 April 2022; Vol. 712, c. 133.]

I appreciate that the Minister may still be thinking about that. He might accept all of our amendments; that is entirely possible, although I am not sure there is any precedent. The possibility is there that that might happen.

Given how strong I felt that the Minister was on the issue on Second Reading, I am deeply disappointed that there are no Government amendments to this section of the Bill. I am disappointed because of the massive risk of harm caused by some very small platforms—it is not a massive number—where extreme behaviour and radicalisation are allowed to thrive. It is not just about the harm to those individuals who spend time on those platforms and who are radicalised, presented with misinformation and encouraged to go down rabbit holes and become more and more extreme in their views. It is also about the risk of harm to other people as a result of the behaviour inspired in those individuals. We are talking about Jo Cox today; she is in our memories and thoughts. Those small platforms are the ones that are most likely to encourage individuals towards extremely violent acts.

If the Bill is to fulfil its stated aims and take the action we all want to see to prevent the creation of those most heinous, awful crimes, it needs to be much stronger on small, very high-risk platforms. I will make no apologies for that. I do not care if those platforms have small amounts of profits. They are encouraging and allowing the worst behaviours to thrive on their platforms. They should be held to a higher level of accountability. It is not too much to ask to class them as category 1 platforms. It is not too much to ask them to comply with a higher level of risk assessment requirements and a higher level of oversight from Ofcom. It is not too much to ask because of the massive risk of harm they pose and the massive actual harm that they create.

Those platforms should be punished for that. It is one thing to punish and criminalise the behaviour of users on those platforms—individual users create and propagate illegal content or radicalise other users—but the Bill does not go far enough in holding those platforms to account for allowing that to take place. They know that it is happening. Those platforms are set up as an alternative place—a place where people are allowed to be far more radical than they are on Twitter, YouTube, Twitch or Discord. None of those larger platforms have much moderation, but the smaller platforms encourage such behaviour. Links are put on other sites pointing to those platforms. For example, when people read vaccine misinformation, there are links posted to more radical, smaller platforms. I exclude Discord because, given its number of users, I think it would be included in one of the larger-platform categories anyway. It is not that there is not radical behaviour on Discord—there is—but I think the size of its membership excludes it, in my head certainly, from the category of the very smallest platforms that pose the highest risk.

We all know from our inboxes the number of people who contact us saying that 5G is the Government trying to take over their brains, or that the entire world is run by Jewish lizard people. We get those emails on a regular basis, and those theories are propagated on the smallest platforms. Fair enough—some people may not take any action as a result of the radicalisation they have experienced and the very extreme views they have developed. But some people will take action, and that action may be enough simply to harm their friends or family; it may be enough to exclude them and drag them away from the society or community that they were previously members of; or it might, in really tragic cases, be far more extreme. It might lead people to cause physical or mental harm to others intentionally as a result of the beliefs that have been created and fostered in them on those platforms.

That is why we have tabled the amendments. This is the one area that the Government have most significantly failed in writing this Bill, by not ensuring that the small, very high-risk platforms are held to the highest level of accountability and are punished for allowing these behaviours to thrive on their platforms. I give the Minister fair warning that unless he chooses to accept the amendments, I intend to push them to a vote. I would appreciate it if he gave assurances, but I do not believe that any reassurance that he could give would compare to having such a measure in the Bill. As I say, for me the lack of this provision is the biggest failing of the entire Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I echo the comments of the hon. Member for Aberdeen North. I completely agree with everything she has just said and I support the amendments that she has tabled.

The Minister knows my feelings on the Government’s approach to categorisation services; he has heard my concerns time and time again. However, it is not just me who believes that the Government have got their approach really wrong. It is also stakeholders far and wide. In our evidence sessions, we heard from HOPE not hate and the Antisemitism Policy Trust specifically on this issue. In its current form, the categorisation process is based on size versus harm, which is a fundamentally flawed approach.

The Government’s response to the Joint Committee that scrutinised the draft Bill makes it clear that they consider that reach is a key and proportional consideration when assigning categories and that they believe that the Secretary of State’s powers to amend those categories are sufficient to protect people. Unfortunately, that leaves many alternative platforms out of category 1, even if they host large volumes of harmful material.

The duty of care approach that essentially governs the Bill is predicated on risk assessment. If size allows platforms to dodge the entry criteria for managing high risk, there is a massive hole in the regime. Some platforms have already been mentioned, including BitChute, Gab and 4chan, which host extreme racist, misogynistic, homophobic and other extreme content that radicalises people and incites harm. And the Minister knows that.

I take this opportunity to pay tribute to my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), who has campaigned heavily on the issue since the horrendous and tragic shooting in Keyham in his constituency. One of my big concerns about the lack of focus on violence against women and girls in the Bill, which we have mentioned time and time again, is the potential for the rise of incel culture online, which is very heavily reported on these alternative platforms—these high-harm, high-risk platforms.

I will just give one example. A teacher contacted me about the Bill. She talked about the rise of misogyny and trying to educate her class on what was happening. At the end of the class, a 15-year-old boy—I appreciate that he is under 18 and is a child, so would come under a different category within the Bill, but I will still give the example—came up to her and said: “Miss, I need to chat to you. This is something I’m really concerned about. All I did was google, ‘Why can’t I get a girlfriend?’” He had been led down a rabbit hole into a warren of alternative platforms that tried to radicalise him with the most extreme content of incel culture: women are evil; women are the ones who are wrong; it is women he should hate; it is his birthright to have a girlfriend, and he should have one; and he should hate women. That is the type of content on those platforms that young, impressionable minds are being pointed towards. They are being radicalised, and it is sadly leading to incredibly tragic circumstances, so I really want to push the Minister on the subject.

We share the overarching view of many others that this crucial risk needs to be factored into the classification process that determines which companies are placed in category 1. Otherwise, the Bill risks failing to protect adults from substantial amounts of material that causes physical and psychological harm. Schedule 10 needs to be amended to reflect that.

--- Later in debate ---
There are therefore important public health reasons to minimise the discussion of dangerous and effective suicide methods and avoid discussion of them in the public domain. Addressing the most dangerous suicide-related content is an area where the Bill could really save lives. It is therefore inexplicable that a Bill intended to increase online safety does not seek to do that.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I appreciate the shadow Minister’s bringing that issue up. Would she agree that, given we have constraints on broadcast and newspaper reporting on suicide for these very reasons, there can be no argument against including such a measure in the Bill?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree. Those safeguards are in place for that very reason. It seems a major omission that they are not also included in the Online Safety Bill if we are truly to save lives.

The Bill’s own pre-legislative scrutiny Committee recommended that the legislation should

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model.”

The Government replied that they

“want the Bill to be targeted and proportionate for businesses and Ofcom and do not wish to impose disproportionate burdens on small companies.”

It is, though, entirely appropriate to place a major regulatory burden on small companies that facilitate the glorification of suicide and the sharing of dangerous methods through their forums. It is behaviour that is extraordinarily damaging to public health and makes no meaningful economic or social contribution.

Amendment 82 is vital to our overarching aim of having an assessed risk of harm at the heart of the Bill. The categorisation system is not fit for purpose and will fail to capture so many of the extremely harmful services that many of us have already spoken about.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of their user size, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.

It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.

It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.

I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.

I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.

We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I thank the Minister for his comments. I still think that such platforms are too dangerous not to be subject to more stringent legislation than similar-sized platforms. For the Chair’s information, I would like to press amendment 80 to a vote. If it falls, I will move straight to pressing amendment 82 to a vote, missing out amendment 81. Does that make sense, Chair, and is it possible?

None Portrait The Chair
- Hansard -

No, I am afraid it is not. We will deal with the amendments in order.

Question put and agreed to.

Clause 80 accordingly ordered to stand part of the Bill.

Schedule 10

Categories of regulated user-to-user services and regulated search services: regulations

None Portrait The Chair
- Hansard -

Now we come to those amendments, which have not yet been moved. The problem is that amendment 82 is linked to amendment 80. I think I am right in saying that if amendment 80 falls, amendment 82 will fall. Does the hon. Lady want to move just amendment 82?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you for your advice, Chair. I will move amendment 80. Should it be accepted, I would be keen to move the other two.

Amendment proposed: 80, in schedule 10, page 192, line 19, at end insert—

“(c) the assessed risk of harm arising from that part of the service.”—(Kirsty Blackman.)

This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On a point of order, Sir Roger. My understanding was that it was previously the case that amendments could not be moved again on Report, but that the practice in the House in the past few years has been that amendments that have been pushed to a vote in Committee are then allowed to be resubmitted on Report, whether or not the Minister has indicated that this is the case.

None Portrait The Chair
- Hansard -

The hon. Lady is correct. I am advised that, actually, the ruling has changed, so it can be. We will see—well, I won’t, but the hon. Lady will see what the Minister does on Report.

Schedule 10 agreed to.  

Clauses 81 and 82 ordered to stand part of the Bill.  

Clause 83

OFCOM’s register of risks, and risk profiles, of Part 3

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour supports clause 85, which gives Ofcom the power to require the provision of any information it requires in order to discharge its online safety functions. We strongly believe that, in the interests of transparency, Ofcom as the regulator must have sufficient power to require a service provider to share its risk assessment in order to understand how that service provider is identifying risks. As the Minister knows, we feel that that transparency should go further, and that the risk assessments should be made public. However, we have already had that argument during a previous debate, so I will not repeat those arguments—on this occasion, at least.

Labour also supports clause 86, and we particularly welcome the clarification that Ofcom may require the provision of information in any form. If we are to truly give Ofcom the power to regulate and, where necessary, investigate service providers, we must ensure that it has sufficient legislative tools to rely on.

The Bill gives some strong powers to Ofcom. We support the requirement in clause 87 to name a senior manager, but again, we feel those provisions should go further. Both users and Ofcom must have access to the full range of tools they need to hold the tech giants to account. As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator, and even then, those measures might not come in until two years after the Bill is in place. Surely the top bosses at social media companies should be held criminally liable for systemic and repeated failures to ensure online safety as soon as the Bill comes into force, so can the Minister explain the reasons for the delay?

The Minister will be happy to hear that Labour supports clause 88. It is important to have an outline on the face of the Bill of the circumstances in which Ofcom can require a report from a skilled person. It is also important that Ofcom has the power to appoint, or give notice to a provider requiring them to appoint, a skilled person, as Labour fears that without those provisions in subsections (3) and (4), the ambiguity around defining a so-called skilled person could be detrimental. We therefore support the clause, and have not sought to amend it at this stage.

Again, Labour supports all the intentions of clause 89 in the interests of online safety more widely. Of course, Ofcom must have the power to force a company to co-operate with an investigation.

Again, we support the need for clause 90, which gives Ofcom the power to require an individual to attend an interview. That is particularly important in the instances outlined in subsection (1), whereby Ofcom is carrying out an investigation into the failure or possible failure of a provider of a regulated service to comply with a relevant requirement. Labour has repeatedly called for such personal responsibility, so we are pleased that the Government are ensuring that the Bill includes sufficient powers for Ofcom to allow proper scrutiny.

Labour supports clause 91 and schedule 11, which outlines in detail Ofcom’s powers of entry, inspection and audit. I did not think we would support this much, but clearly we do. We want to work with the Government to get this right, and we see ensuring Ofcom has those important authorisation powers as central to it establishing itself as a viable regulator of the online space, both now and for generations to come. We will support and have not sought to amend the clauses or schedule 11 for the reasons set out.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am delighted by the strong support that these clauses have received from across the aisle. I hope that proves to be a habit-forming development.

On the shadow Minister’s point about publishing the risk assessments, to repeat the point I made a few days ago, under clause 64, which we have already debated, Ofcom has the power—indeed, the obligation—to compel publication of transparency reports that will make sure that the relevant information sees the light of day. I accept that publication is important, but we believe that objective is achieved via the transparency measures in clause 64.

On the point about senior management liability, which again we debated near the beginning of the Bill, we believe—I think we all agree—that this is particularly important for information disclosure. We had the example, as I mentioned at the time, of one of the very large companies refusing to disclose information to the Competition and Markets Authority in relation to a competition matter and simply paying a £50 million fine rather than complying with the duties. That is why criminal liability is so important here in relation to information disclosure.

To reassure the shadow Minister, on the point about when that kicks in, it was in the old version of the Bill, but potentially did not commence for two years. In this new version, updated following our extensive and very responsive listening exercise—I am going to get that in every time—the commencement of this particular liability is automatic and takes place very shortly after Royal Assent. The delay and review have been removed, for the reason the hon. Lady mentioned, so I am pleased to confirm that to the Committee.

The shadow Minister described many of the provisions. Clause 85 gives Ofcom powers to require information, clause 86 gives the power to issue notices and clause 87 the important power to require an entity to name that relevant senior manager, so they cannot wriggle out of their duty by not providing the name. Clause 88 gives the power to require companies to undergo a report from a so-called skilled person. Clause 89 requires full co-operation with Ofcom when it opens an investigation, where co-operation has been sadly lacking in many cases to date. Clause 90 requires people to attend an interview, and the introduction to schedule 11 allows Ofcom to enter premises to inspect or audit the provider. These are very powerful clauses and will mean that social media companies can no longer hide in the shadows from the scrutiny they so richly deserve.

Question put and agreed to.

Clause 85 accordingly ordered to stand part of the Bill.

Clauses 86 to 91 ordered to stand part of the Bill.

Schedule 11

OFCOM’s powers of entry, inspection and audit

Amendment made: 4, in schedule 11, page 202, line 17, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”.—(Chris Philp.)

Schedule 11, as amended, agreed to.

Clause 92

Offences in connection with information notices

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am delighted that support for the Government’s position on the clauses continues and that cross-party unanimity is taking an ever stronger hold. I am sure the Whips Office will find that particularly reassuring.

The shadow Minister asked a question about clause 100. Clause 100 amends section 24B of the Communications Act 2003, which allows Ofcom to provide information to the Secretary of State to assist with the formulation of policy. She asked me to clarify what that means, which I am happy to do. In most circumstances, Ofcom will be required to obtain the consent of providers in order to share information relating to their business. This clause sets out two exceptions to that principle. If the information required by the Secretary of State was obtained by Ofcom to determine the proposed fees threshold, or in response to potential threats to national security or to the health or safety of the public, the consent of the business is not required. In those instances, it would obviously not be appropriate to require the provider’s consent.

It is important that users of regulated services are kept informed of developments around online safety and the operation of the regulatory framework.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

This specifically relates to the Secretary of State, but would the Minister expect both Ofcom and his Department to be working with the Scottish Government and the Northern Ireland Executive? I am not necessarily talking about sharing all the information, but where there are concerns that it is very important for those jurisdictions to be aware of, will he try to ensure that he has a productive relationship with both devolved Administrations?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for her question. Where the matter being raised or disclosed touches on matters of devolved competence—devolved authority—then yes, I would expect that consultation to take place. Matters concerning the health and safety of the public are entirely devolved, I think, so I can confirm that in those circumstances it would be appropriate for the Secretary of State to share information with devolved Administration colleagues.

The shadow Minister has eloquently, as always, touched on the purpose of the various other clauses in this group. I do not wish to try the patience of the Committee, particularly as lunchtime approaches, by repeating what she has ably said already, so I will rest here and simply urge that these clauses stand part of the Bill.

Question put and agreed to.

Clause 97 accordingly ordered to stand part of the Bill.

Clauses 98 to 102 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Ninth sitting)

Kirsty Blackman Excerpts
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedure gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Before the Minister moves off the point about exceptional circumstances, it was the case previously that an amendment of the law resolution was always considered with Finance Bills. In recent years, that has stopped on the basis of it being exceptional circumstances because a general election was coming up. Then the Government changed that, and now they never table an amendment of the law resolution because they have decided that that is a minor change. Something has gone from being exceptional to being minor, in the view of this Government.

The Minister said that he envisions that this measure will be used only in exceptional circumstances. Can he commit himself to it being used only in exceptional circumstances? Can he give the commitment that he expects that it will be used only in exceptional circumstances, rather than simply envisioning that it will be used in such circumstances?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have made clear how we expect the clause to be used. I am slightly hesitant to be more categorical simply because I do not want to make comments that might unduly bind a future Secretary of State—or, indeed, a future Parliament, because the measure is subject to the affirmative procedure—even were that Secretary of State, heaven forbid, to come from a party other than mine. Circumstances might arise, such as the pandemic, in which a power such as this needs to be exercised for good public policy reasons—in that example, public health. I would not want to be too categorical, which the hon. Lady is inviting me to be, lest I inadvertently circumscribe the ability of a future Parliament or a future Secretary of State to act.

The power is also limited in the sense that, in relation to matters that are not to do with national security or terrorism or CSEA, the power to direct can be exercised only at the point at which the code is submitted to be laid before Parliament. That cannot be done at any point. The power cannot be exercised at a time of the Secretary of State’s choosing. There is one moment, and one moment only, when that power can be exercised.

I also want to make it clear that the power will not allow the Secretary of State to direct Ofcom to require a particular regulated service to take a particular measure. The power relates to the codes of practice; it does not give the power to intrude any further, beyond the code of practice, in the arena of regulated activity.

I understand the points that have been made. We have listened to the Joint Committee, and we have made an important change, which is that to the affirmative procedure. I hope my explanation leaves the Committee feeling that, following that change, this is a reasonable place for clauses 40 and 41 to rest. I respectfully resist amendment 84 and new clause 12, and urge the Committee to allow clauses 40 and 41 to stand part of the Bill.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I can see that that is the most popular thing I have said during the entire session—when you say, “And finally,” in a speech and the crowd cheers, you know you are in trouble.

Regulated user-to-user and search services will have duties to keep records of their risk assessments and the measures they take to comply with their safety duties, whether or not those are the ones recommended in the codes of practice. They must also undertake a children’s access assessment to determine whether children are likely to access their service.

Clause 48 places a duty on Ofcom to produce guidance to assist service providers in complying with those duties. It will help to ensure a consistent approach from service providers, which is essential in maintaining a level playing field. Ofcom will have a duty to consult the Information Commissioner prior to preparing this guidance, as set out in clause 48(2), in order to draw on the expertise of the Information Commissioner’s Office and ensure that the guidance is aligned with wider data protection and privacy regulation.

Question put and agreed to.

Clause 48 accordingly ordered to stand part of the Bill.

Clause 49

“Regulated user-generated content”, “user-generated content”, “news publisher content”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I beg to move amendment 89, in clause 49, page 45, line 16, leave out subsection (e).

This amendment would remove the exemption for comments below news articles posted online.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 43, in clause 49, page 45, line 19, at end insert—

“(2A) Subsection (2)(e) does not apply in respect of a user-to-user service which is operated by an organisation which—

(a) is a relevant publisher (as defined in section 41 of the Crime and Courts Act 2013); and

(b) has an annual UK turnover in excess of £100 million.”

This amendment removes comments sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Ms Rees, for your hard work in chairing the Committee this morning; we really appreciate it. Amendment 89 relates to below-the-line comments on newspaper articles. For the avoidance of doubt, if we do not get amendment 89, I am more than happy to support the Labour party’s amendment 43, which has a similar effect but covers slightly fewer—or many fewer—organisations and places.

Below-the-line comments in newspaper articles are infamous. They are places that everybody fears to go. They are worse than Twitter. In a significant number of ways, below-the-line comments are an absolute sewer. I cannot see any reasonable excuse for them to be excluded from the Bill. We are including Twitter in the Bill; why are we not including below-the-line comments for newspapers? It does not make any sense to me; I do not see any logic.

We heard a lot of evidence relating to freedom of speech and a free press, and I absolutely, wholeheartedly agree with that. However, the amendment would not stop anyone writing a letter to the editor. It would not stop anyone engaging with newspapers in the way that they would have in the print medium. It would still allow that to happen; it would just ensure that below-the-line comments were subject to the same constraints as posts on Twitter. That is the entire point of amendment 89.

I do not think that I need to say much more, other than to add one more point about the way comments direct people to other, more radical and extreme pieces or bits of information. It is sometimes the case that the comments on a newspaper article will direct people to even more extreme views. The newspaper article itself may be only slightly derogatory, while some of the comments may have links or references to other pieces, and to other places on the internet, where people can find a more radical point of view. That is exactly what happens on Twitter, and is exactly some of the stuff that we are trying to avoid—sending people down an extremist rabbit hole. I do not understand how the Minister thinks that the clause, which excludes below-the-line newspaper comments, is justifiable or acceptable.

Having been contacted by a number of newspapers, I understand and accept that some newspapers have moderation policies for their comments sections, but that is not strong enough. Twitter has a moderation policy, but that does not mean that there is actually any moderation, so I do not think that subjecting below-the-line comments to the provisions of the Bill is asking too much. It is completely reasonable for us to ask for this to happen, and I am honestly baffled as to why the Minister and the Government have chosen to make this exemption.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Before I address the amendments, I will speak to clause 49 more broadly.

Labour has concerns about a number of subsections of the clause, including subsections (2) and (8) to (10)—commonly known as the news publisher content exemption, which I have spoken about previously. We understand that the intention of the exemption is to shield broadcasters and traditional newspaper publishers from the Bill’s regulatory effects. Clause 50(2) defines a “recognised news publisher” as a regulated broadcaster or any other publisher that publishes news, has an office, and has a standards code and complaints process. There is no detail about the latter two requirements, thus enabling almost any news publishing enterprise to design its own code and complaints process, however irrational, and so benefit from the exemption. “News” is also defined broadly, and may include gossip. There remains a glaring omission, which amendment 43 addresses and which I will come to.

During an earlier sitting of the Committee, in response to comments made by my hon. Friend the Member for Liverpool, Walton as we discussed clause 2, the Minister claimed that

“The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic.”––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 204.]

Clause 49 exempts one-to-one live aural communications from the scope of regulation. Given that much interaction in virtual reality is live aural communication, including between two users, it is hard to understand how that would be covered by the Bill.

There is also an issue about what counts as content. Most standard understandings would define “content” as text, video, images and audio, but one of the worries about interactions in VR is that behaviour such as physical violence will be able to be replicated virtually, with psychologically harmful effects. It is very unclear how that would be within the scope of the current Bill, as it does not clearly involve content, so could the Minister please address that point? As he knows, Labour advocates for a systems-based approach, and for risk assessments and systems to take place in a more upstream and tech-agnostic way than under the current approach. At present, the Bill would struggle to be expanded effectively enough to cover those risks.

Amendment 43 removes comments sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content. If the Bill is to be effective in protecting the public from harm, the least it must accomplish is a system of accountability that covers all the largest platforms used by British citizens. Yet as drafted, the Bill would exempt some of the most popular social media platforms online: those hosted on news publisher websites, which are otherwise known as comments sections. The amendment would close that loophole and ensure that the comments sections of the largest newspaper websites are subject to the regime of regulation set out in the Bill.

Newspaper comments sections are no different from the likes of Facebook and Twitter, in that they are social media platforms that allow users to interact with one another. This is done through comments under stories, comments in response to other comments, and other interactions—for example, likes and dislikes on posts. In some ways, their capacity to cause harm to the public is even greater: for example, their reach is in many cases larger than even the biggest of social media platforms. Whereas there are estimated to be around 18 million users of Twitter in the UK, more than twice that number of British citizens access newspaper websites every month, and the harm perpetuated on those platforms is severe.

In July 2020, the rapper Wiley posted a series of antisemitic tweets, which Twitter eventually removed after an unacceptable delay of 48 hours, but under The Sun newspaper’s coverage of the incident, several explicitly antisemitic comments were posted. Those comments contained Holocaust denial and alleged a global Jewish conspiracy to control the world. They remained up and accessible to The Sun’s 7 million daily readers for the best part of a week. If we exempt comments sections from the Bill’s proposed regime and the duties that the Bill sets for platforms, we will send the message that that kind of vicious, damaging and harmful racism is acceptable.

Similarly, after an antisemitic attack in the German city of Halle, racist comments followed in the comments section under the coverage in The Sun. There are more examples: Chinese people being described as locusts and attacked with other racial slurs; 5G and Bill Gates conspiracy theories under articles on the Telegraph website; and of course, the most popular targets for online abuse, women in public life. Comments that described the Vice-President of the United States as a “rat” and “ho” appeared on the MailOnline. A female union leader has faced dozens of aggressive and abusive comments about her appearance, and many such comments remain accessible on newspaper comments sections to this day. Some of them have been up for months, others for years.

Last week, the Committee was sent a letter from a woman who was the victim of comments section abuse, Dr Corinne Fowler. Dr Fowler said of the comments that she received:

“These comments contained scores of suggestions about how to kill or injure me. Some were general ideas, such as hanging, but many were gender specific, saying that I should be burnt at the stake like a witch. Comments focused on physical violence, one man advising that I should be slapped hard enough to make my teeth chatter”.

She added:

“I am a mother: without me knowing, my son (then 12 years old) read these reader comments. He became afraid for my safety.”

Without the amendment, the Bill cannot do anything to protect women such as Dr Fowler and their families from this vile online abuse, because comments sections will be entirely out of scope of the Bill’s new regime and the duties designed to protect users.

As I understand it, two arguments have been made to support the exemption. First, it is argued that the complaints handlers for the press already deal with such content, but the handler for most national newspapers, the Independent Press Standards Organisation, will not act until a complaint is made. It then takes an average of six months for a complaint to be processed, and it cannot do anything if the comments have not been moderated. The Opposition do not feel that that is a satisfactory response to the seriousness of the harms that we know occur, and which I have described. IPSO does not even have a code to deal with cases such as the antisemitic abuse that appeared in the comments section of The Sun. IPSO’s record speaks for itself in the examples that I have given, and the many more, and it has proven to be no solution to the severity of harms that appear in newspaper comments sections.

The second argument for an exemption is that publishers are legally responsible for what appears on comments sections, but that is only relevant for illegal harms. For everything else, from disinformation to racial prejudice and abuse, regulation is needed. That is why it is so important that the Bill does the job that we were promised. To keep the public safe from harm online, comments sections must be covered under the Bill.

The amendment is a proportionate solution to the problem of comments section abuse. It would protect users’ freedom of expression and, given that it is subject to a turnover threshold, ensure that duties and other requirements do not place a disproportionate burden on smaller publishers such as locals, independents and blogs.

I have reams and reams and reams of examples from comments sections that all constitute incredibly harmful abuse and should be covered by the Bill. I could be here for hours reading them all out, and while I do not think that anybody in Committee would like me to, I urge Committee members to take a look for themselves at the types of comments under newspaper articles and ask themselves whether those comments should be covered by the terms of the Bill. I think they know the answer.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On a point of order, Ms Rees. Are we considering clause 49 now? I know that it is supposed to be considered under the next set of amendments, but I just wondered, because I have separate comments to make on that clause that I did not make earlier because I spoke purely to the amendment.

None Portrait The Chair
- Hansard -

I did not want to stop Alex Davies-Jones in full flow. I was going to ask for additional comments when we came to consideration of clause 49, but it is for the Committee to decide whether it is content with that, or would like the opportunity to elaborate on that clause now.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am happy to speak on clause 49 now—I can see the Minister is nodding. I really appreciate it, Ms Rees, because I did not want to lose the opportunity to raise concerns about this matter. I have not tabled an amendment but I would appreciate it if the Minister gave consideration to my following comments.

My concern relates to subsection (5) of clause 49, which exempts one-to-one live aural communications in relation to user-to-user services, and specifically to child sexual abuse and grooming. I am worried that exempting those one-to-one live aural communications gives bad actors—people who are out to attack children—a loophole to do so. We know that on games such as Fortnite, one-to-one aural communication happens.

I am not entirely sure how communication happens on Roblox and whether there is an opportunity for that there. However, we also know that a number of people who play online games have communication on Discord at the same time. Discord is incredibly popular, and we know that there is an opportunity for, and a prevalence of, grooming on there. I understand why the clause is there, but I am concerned that exempting these communications creates a loophole for people to attack children in exactly the way that the Minister is trying to prevent with the Bill.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that are at risk of leading to illegal activity, and things that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the legal but harmful to adult duties only apply to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the earlier provisions that we debated, about “have regard to free speech”, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill have a sense of balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance. We just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.

In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.

I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I have a question on anonymity. On social media there will be a requirement to verify users’ identities, so if somebody posts on Twitter that they want to lynch me, it is possible to find out who that is, provided they do not have an anonymous account. There is no such provision for newspaper comment sections, so I assume it would be much more difficult for the police to find the person responsible, or for me to avoid seeing anonymous comments below the line of newspaper articles that threaten my safety—comments that are just as harmful as those that threaten my safety on social media. Can the Minister convince me otherwise?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady is correct in her analysis, I can confirm. Rather similar to the previous point, because of the interaction with freedom of the press—the argument that the newspapers and broadcasters have advanced—and because this is an inherently less viral environment, we have drawn the balance where we have. She is right to highlight a reasonable risk, but we have struck the balance in the way we have for that reason.

The shadow Minister, the hon. Member for Pontypridd, asked whether very harmful or illegal interactions in the metaverse would be covered or whether they have a metaphorical “get out of jail free” card owing to the exemption in clause 49(2)(d) for “one-to-one live aural communications”. In essence, she is asking whether, in the metaverse, if two users went off somewhere and interacted only with each other, that exemption would apply and they would therefore be outwith the scope of the Bill. I am pleased to tell her they would not, because the definition of live one-to-one aural communications goes from clause 49(2)(d) to clause 49(5), which defines “live aural communications”. Clause 49(5)(c) states that the exemption applies only if it

“is not accompanied by user-generated content of any other description”.

The actions of a physical avatar in the metaverse do constitute user-generated content of any other description. Owing to that fact, the exemption in clause 49(2)(d) would not apply to the metaverse.

I am happy to provide clarification on that. It is a good question and I hope I have provided an example of how, even though the metaverse was not conceived when the Bill was conceived, it does have an effect.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On that point, when it comes to the definition of content, we have tabled an amendment about “any other content”. I am not convinced that the definition of content adequately covers what the Minister stated, because it is limited, does not include every possible scenario in which content is user generated, and is not future-proofed enough. When we get to that point, I would appreciate it if the Minister would look at the amendment and ensure that what he intends is what happens.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful to the hon. Lady for thinking about that so carefully. I look forward to her amendment. For my information, which clause does her amendment seek to amend?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I will let the Minister know in a moment.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful. It is an important point.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The amendment that the Minister is asking about is to clause 189, which states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

It is amendment 76 that, after “including”, would insert “but not limited to”, in order that the Bill is as future-proofed as it can be.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for that intervention. First, clearly if something illegal is said online about someone, they would have the normal redress to go to the police and the police could seek to exercise their powers to investigate the offence, including requesting the company that hosts the comments—in this case, it would be a newspaper’s or broadcaster’s website—to provide any relevant information that might help to identify the person involved; they might have an account, and if they do not they might have a log-on or IP address. So, the normal criminal investigatory procedures would obviously apply.

Secondly, if the content was defamatory—I realise that only people like Arron Banks can sue for libel—there is obviously civil recourse for libel. I think there are powers in the civil procedure rules that allow court orders to be made requiring organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are obviously the voluntary steps that the news publisher might take to remove content. News publishers say that they do that; obviously, their implementation, as we know, is patchy. Nevertheless, there is that voluntary route.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls such as Skype conversations—one-to-one conversations that are essentially low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That is helpful for understanding the rationale, but in the light of how people communicate online these days, although exempting telephone conversations makes sense, exempting what I am talking about does not. I would appreciate it if the Minister came back to me on that, and he does not have to give me an answer now. It would also help if he explained the difference between “aural” and “oral”, which are mentioned at different points in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will certainly come back with a more complete analysis of the point about protecting children—as parents, that clearly concerns us both. The literal definitions are that “aural” means “heard” and “oral” means “spoken”. They occur in different places in the Bill.

This is a difficult issue and legitimate questions have been raised, but as I said in response to the hon. Member for Batley and Spen, in this area as in others, there are balances to strike and different considerations at play—freedom of the press on the one hand, and the level of risk on the other. I think that the clause strikes that balance in an appropriate way.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Briefly, as with earlier clauses, the Labour party recognises the challenge in finding the balance between freedom of expression and keeping people safe online. Our debate on the amendment has illustrated powerfully that the exemptions as they stand in the Bill are hugely flawed.

First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many in my earlier contributions, and I have paid for that online.

Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions. I recall that Kyle from FairVote made that point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not itself be sufficient reason to treat such content differently from that which is posted by private citizens.

Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, and yet there is no parallel stipulation for non-broadcast media to be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media, the requirements are not rational, exclude many independent publishers and leave room for ambiguity.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I have a couple of questions that were probably too long for interventions. The Minister said that if comments on a site are the only user-generated content, they are not in scope. It would be really helpful if he explained what exactly he meant by that. We were talking about services that do not fall within the definition of “recognised news publishers”, because we were trying to add them to that definition. I am not suggesting that the Minister is wrong in any way, but I do not understand where the Bill states that those comments are excluded, and how this all fits together.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

With your permission, Ms Rees, I will speak to clause 52 before coming to amendment 61. Illegal content is defined in clause 52(2) as

“content that amounts to a relevant offence.”

However, as the Minister will know from representations from Carnegie UK to his Department—we share its concerns—the illegal and priority illegal regimes may not be able to operate as intended. The Bill requires companies to decide whether content “amounts to” an offence, with limited room for movement. We share concerns that that points towards decisions on an item-by-item basis; it means detecting intent for each piece of content. However, such an approach does not work at the scale on which platforms operate; it is bad regulation and poor risk management.

There seem to be two different problems relating to the definition of “illegal content” in clause 52. The first is that it is unclear whether we are talking about individual items of content or categories of content—the word “content” is ambiguous because it can be singular or plural—which is a problem for an obligation to design and run a system. Secondly, determining when an offence has taken place will be complex, especially bearing in mind mens rea and defences, so the providers are not in a position to get it right.

The use of the phrase “amounts to” in clause 52(2) seems to suggest that platforms will be required to identify accurately, in individual cases, where an offence has been committed, without any wriggle room drafted in, unlike in the draft Bill. As the definition now contains no space for error either side of the line, it could be argued that there are more incentives to avoid false negatives than false positives—providers can set higher standards than the criminal law—and that leads to a greater risk of content removal. That becomes problematic, because it seems that the obligation under clause 9(3) is then to have a system that is accurate in all cases, whereas it would be more natural to deal with categories of content. This approach seems not to be intended; support for that perspective can be drawn from clause 9(6), which recognises that there is a distinction between categories of content and individual items, and that the application of terms of service might specifically have to deal with individual instances of content. Critically, the “amounts to” approach cannot work in conjunction with a systems-based approach to harm reduction. That leaves victims highly vulnerable.

This problem is easily fixed by a combination of reverting to the draft Bill’s language, which required reasonableness, and using concepts found elsewhere in the Bill that enable a harm mitigation system to operate for illegal content. We also remind the Minister that Ofcom raised this issue in the evidence sessions. I would be grateful if the Minister confirmed whether we can expect a Government amendment to rectify this issue shortly.

More broadly, as we know, priority illegal content, which falls within illegal content, includes,

“(a) terrorism content,

(b) CSEA content, and

(c) content that amounts to an offence specified in Schedule 7”,

as set out in clause 52(7). Such content attracts a greater level of scrutiny and regulation. Situations in which user-generated content will amount to “a relevant offence” are set out in clause 52(3). Labour supports the inclusion of a definition of illegal content as outlined in the grouping; it is vital that service providers and platforms have a clear indication of the types of content that they will have a statutory duty to consider when building, or making changes to the back end of, their business models.

We have also spoken about the importance of parity between the online and offline spaces—what is illegal offline must be illegal online—so the Minister knows we have more work to do here. He also knows that we have broad concerns around the omissions in the Bill. While we welcome the inclusion of terrorism and child sexual exploitation content as priority illegal content, there remain gaps in addressing violence against women and girls content, which we all know is hugely detrimental to many online.

The UK Government stated that their intention for the Online Safety Bill was to make the UK the safest place to be online in the world, yet the Bill does not mention online gender-based violence once. More than 60,000 people have signed the Glitch and End Violence Against Women Coalition’s petition calling for women and girls to be included in the Bill, so the time to act is now. We all have a right to not just survive but thrive, engage and play online, and not have our freedom of expression curtailed or our voices silenced by perpetrators of abuse. The online space is just as real as the offline space. The Online Safety Bill is our opportunity to create safe digital spaces.

The Bill must name the problem. Violence against women and girls, particularly those who have one or multiple protected characteristics, is creating harm and inequality online. We must actively and meaningfully name this issue and take an intersectional approach to ending online abuse to ensure that the Bill brings meaningful change for all women. We also must ensure that the Bill truly covers all illegal content, whether it originated in the UK or not.

Amendment 61 brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content. The aim of the amendment is to clarify whether the Bill covers content created overseas that would be illegal if what was shown in the content took place in the UK. For example, animal abuse and cruelty content is often filmed abroad. The same can be said for dreadful human trafficking content and child sexual exploitation. The optimal protection would be if the Bill’s definition of illegal content covered matter that would be illegal in either the UK or the country it took place in, regardless of whether it originated in the UK.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I do not intend to make a speech, but I want to let the hon. Lady know that we wholeheartedly support everything that she has said on the clause and amendment 61.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.

The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about its overwhelming practical challenges. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:

“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]

That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.

Online Safety Bill (Seventh sitting)

Kirsty Blackman Excerpts
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I will talk about this later, when we come to a subsequent clause to which I have tabled some amendments—I should have tabled some to this clause, but unfortunately missed the chance to do so.

I appreciate the Minister laying out why he has designated the people covered by this clause; my concern is that “affected” is not wide enough. My logic is that, on the strength of these provisions, I might not be able to report racist content that I come across on Twitter if I am not the subject of that content—if I am not a member of a group that is the subject of the content or if I am not caring for someone who is the subject of it.

I appreciate what the Minister is trying to do, and I get the logic behind it, but I think the clause unintentionally excludes some people who would have a reasonable right to expect to be able to make reports in this instance. That is why I tabled amendments 78 and 79 to clause 28, about search functions, but those proposals would have worked reasonably for this clause as well. I do not expect a positive answer from the Minister today, but perhaps he could give consideration to my concern. My later amendments would change “affected person” to “any other person”. That would allow anyone to make a report, because if something is illegal content, it is illegal content. It does not matter who makes the report, and it should not matter that I am not a member of the group of people targeted by the content.

I report things all the time, particularly on Twitter, and a significant amount of it is nothing to do with me. It is not stuff aimed at me; it is aimed at others. I expect that a number of the platforms will continue to allow reporting for people who are outwith the affected group, but I do not want to be less able to report than I am currently, and that would be the case for many people who see concerning content on the internet.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Lady is making a really important point. One stark example that comes to my mind is when English footballers suffered horrific racist abuse following the penalty shootout at the Euros last summer. Hundreds of thousands of people reported the abuse that they were suffering to the social media platforms on their behalf, in an outcry of solidarity and support, and it would be a shame if people were prevented from doing that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I absolutely agree. I certainly do not think I am suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or anything that starts up after this legislation comes into force, could limit the ability to report on the basis of these clauses.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

Good morning, Ms Rees.

It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.

Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.

It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.

One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few if any routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The revenge porn helpline reported that its case load more than doubled between 2019 and 2020.

These clauses should mean that people can easily report content that they consider to be either illegal, or harmful to children, if it is hosted on a site likely to be accessed by children, or, if it is hosted on a category 1 platform, harmful to adults. However, the Minister needs to clarify how these service complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?

In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action can take place. The hon. Member for Aberdeen North has just told us how often she reports on various platforms, but what action has taken place? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.

Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.

Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to be changed. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child-specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.

Jane Stevenson Portrait Jane Stevenson (Wolverhampton North East) (Con)
- Hansard - - - Excerpts

I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill, it would allow users “and” affected persons, rather than users “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.

Will the Minister also clarify the use of human judgment in these decisions? Many algorithms are not taking down some content at the moment, so I would be grateful if he clarified that there is a need for platforms to provide a genuine human judgment on whether content is harmful.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?

That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me try to address some of the questions that have been raised in this short debate, starting with the question that the hon. Member for Aberdeen North quite rightly asked at the beginning. She posed the question, “What if somebody who is not an affected person encountered some content and wanted to report it?” For example, she might encounter some racist content on Twitter or elsewhere and would want to be able to report it, even though she is not herself the target of it or necessarily a member of the group affected. I can also offer the reassurance that my hon. Friend the Member for Wolverhampton North East asked for.

The answer is to be found in clause 17(2), which refers to

“A duty to operate a service using systems and processes that allow users and”—

I stress “and”—“affected persons”. As such, the duty to offer content reporting is to users and affected persons, so if the hon. Member for Aberdeen North was a user of Twitter but was not herself an affected person, she would still be able to report content in her capacity as a user. I hope that provides clarification.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I appreciate that. That is key, and I am glad that this is wider than just users of the site. However, taking Reddit as an example, I am not signed up to that site, but I could easily stumble across content on it that was racist in nature. This clause would mean that I could not report that content unless I signed up to Reddit, because I would not be an affected person or a user of that site.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Lady for her clarificatory question. I can confirm that in order to be a user of a service, she would not necessarily have to sign up to it. The simple act of browsing that service, of looking at Reddit—not, I confess, an activity that I participate in regularly—regardless of whether or not the hon. Lady has an account with it, makes her a user of that service, and in that capacity she would be able to make a content report under clause 17(2) even if she were not an affected person. I hope that clears up the question in a definitive manner.

The hon. Lady asked in her second speech about the accessibility of the complaints procedure for children. That is strictly a matter for clause 18, which is the next clause, but I will quickly answer her question. Clause 18 contains provisions that explicitly require the complaints process to be accessible. Subsection (2)(c) states that the complaints procedure has to be

“easy to access, easy to use (including by children) and transparent”,

so the statutory obligation that she requested is there in clause 18.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Can the Minister explain the logic in having that phrasing for the complaints procedure but not for the content-reporting procedure? Surely it would also make sense for the content reporting procedure to use the phrasing

“easy to access, easy to use (including by children) and transparent.”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There is in clause 17(2)

“a duty to operate a service that allows users and affected persons to easily report content which they consider to be content of a…kind specified below”,

which, of course, includes services likely to be accessed by children, under subsection (4). The words “easily report” are present in clause 17(2).

I will move on to the question of children reporting more generally, which the shadow Minister raised as well. Clearly, a parent or anyone with responsibility for a child has the ability to make a report, but it is also worth mentioning the power in clauses 140 to 142 to make super-complaints, which the NSPCC strongly welcomed in its evidence. An organisation that represents a particular group—an obvious example is the NSPCC representing children, but it would apply to loads of other groups—has the ability to make super-complaints to Ofcom on behalf of those users, if it feels they are not being well treated by a platform. A combination of the parent or carer being able to make individual complaints, and the super-complaint facility, means that the points raised by Members are catered for. I commend the clause to the Committee.

Question put and agreed to.

Clause 17 accordingly ordered to stand part of the Bill.

Clause 18

Duties about complaints procedures

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I rise to contribute to the stand part debate on clauses 18 and 28. It was interesting, though, to hear the debate on clause 17, because it is right to ask how the complaints services will be judged. Will they work in practice? When we start to look at how to ensure that the legislation works in all eventualities, we need to ensure that we have some backstops for when the system does not work as it should.

It is welcome that there will be clear duties on providers to have operational complaints procedures—complaints procedures that work in practice. As we all know, many of them do not at the moment. As a result, we have a loss of faith in the system, and that is not going to be changed overnight by a piece of legislation. For years, people have been reporting things—in some cases, very serious criminal activity—that have not been acted on. Consumers—people who use these platforms—are not going to change their mind overnight and suddenly start trusting these organisations to take their complaints seriously. With that in mind, I hope that the Minister listened to the points I made on Second Reading about how to give extra support to victims of crimes or people who have experienced things that should not have happened online, and will look at putting in place the right level of support.

The hon. Member for Worsley and Eccles South talked about the idea of an ombudsman; it may well be that one should be in place to deal with situations where complaints are not dealt with through the normal processes. I am also quite taken by some of the evidence we received about third-party complaints processes run by other organisations. We heard a bit about the revenge porn helpline, which was set up a few years ago when we first recognised in law that revenge pornography was a crime. The Bill will mean that many more people are recognised as victims of crime, but we are not yet hearing clearly how the support systems will adequately help that massively increased number of victims to get the help they need.

I will probably talk in more detail about this issue when we reach clause 70, which provides an opportunity to look at the—unfortunately—probably vast fines that Ofcom will be imposing on organisations and how we might earmark some of that money specifically for victim support, whether by funding an ombudsman or helping amazing organisations such as the revenge porn helpline to expand their services.

We must address this issue now, in this Bill. If we do not, all those fines will go immediately into the coffers of the Treasury without passing “Go”, and we will not be able to take some of that money to help those victims directly. I am sure the Government absolutely intend to use some of the money to help victims, but that decision would be at the mercy of the Treasury. Perhaps we do not want that; perhaps we want to make it cleaner and easier and have the money put straight into a fund that can be used directly for people who have been victims of crime or injustice or things that fall foul of the Bill.

I hope that the Minister will listen to that and use this opportunity, as we do in other areas, to directly passport fines for specific victim support. He will know that there are other examples of that that he can look at.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

As the right hon. Member for Basingstoke has mentioned the revenge porn helpline, I will mention the NSPCC’s Report Remove tool for children. It does exactly the same thing, but for younger people—the revenge porn helpline is specifically only for adults. Both those tools together cover the whole gamut, which is massively helpful.

The right hon. Lady’s suggestion about the hypothecation of fines is a very good one. I was speaking to the NSPCC yesterday, and one of the issues that we were discussing was super-complaints. Although super-complaints are great and I am very glad that they are included in the Bill, the reality is that some of the third-sector organisations that are likely to be undertaking super-complaints are charitable organisations that are not particularly well funded. Given how few people work for some of those organisations and the amazing amount of work they do, if some of the money from fines could support not just victims but the initial procedure for those organisations to make super-complaints, it would be very helpful. That is, of course, if the Minister does not agree with the suggestion of creating a user advocacy panel, which would fulfil some of that role and make that support for the charitable organisations less necessary—although I am never going to argue against support for charities: if the Minister wants to hypothecate it in that way, that would be fantastic.

I tabled amendments 78 and 79, but the statement the Minister made about the definition of users gives me a significant level of comfort about the way that people will be able to access a complaints procedure. I am terribly disappointed that the Minister is not a regular Reddit user. I am not, either, but I am well aware of what Reddit entails. I have no desire to sign up to Reddit, but knowing that even browsing the site I would be considered a user and therefore able to report any illegal content I saw, is massively helpful. On that basis, I am comfortable not moving amendments 78 and 79.

On the suggestion of an ombudsman—I am looking at new clause 1—it feels like there is a significant gap here. There are ombudsman services in place for many other areas, where people can put in a complaint and then go to an ombudsman should they feel that it has not been appropriately addressed. As a parliamentarian, I find that a significant number of my constituents come to me seeking support to go to the ombudsman for whatever area it is in which they feel their complaint has not been appropriately dealt with. We see a significant number of issues caused by social media companies, in particular, not taking complaints seriously, not dealing with complaints and, in some cases, leaving illegal content up. Particularly in the initial stages of implementation—in the first few years, before companies catch up and are able to follow the rules put in place by the Bill and Ofcom—a second-tier complaints system that is removed from the social media companies would make things so much better than they are now. It would provide an additional layer of support to people who are looking to make complaints.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I am sure the hon. Lady will agree with me that it is not either/or—it is probably both. Ultimately, she is right that an ombudsman would be there to help deal with what I think will be a lag in implementation, but if someone is a victim of online intimate image abuse, in particular, they want the material taken down immediately, so we need to have organisations such as those that we have both mentioned there to help on the spot. It has to be both, has it not?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I completely agree. Both those helplines do very good work, and they are absolutely necessary. I would strongly support their continuation in addition to an ombudsman-type service. Although I am saying that the need for an ombudsman would likely be higher in the initial bedding-in years, it will not go away—we will still need one. With NHS complaints, the system has been in place for a long time, and it works pretty well in the majority of cases, but there are still cases it gets wrong. Even if the social media companies behave in a good way and have proper complaints procedures, there will still be instances of them getting it wrong. There will still be a need for a higher level. I therefore urge the Minister to consider including new clause 1 in the Bill.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me address some of the issues raised in the debate. First, everyone in the House recognises the enormous problem at the moment with large social media firms receiving reports about harmful and even illegal content that they just flagrantly ignore. The purpose of the clause, and indeed of the whole Bill and its enforcement architecture, is to ensure that those large social media firms no longer ignore illegal and harmful content when they are notified about it. We agree unanimously on the importance of doing that.

The requirement for those firms to take the proper steps is set out in clause 18(2)(b), at the very top of page 18 —it is rather depressing that we are on only the 18th of a couple of hundred pages. That paragraph creates a statutory duty for a social media platform to take “appropriate action”—those are the key words. If the platform is notified of a piece of illegal content, or content that is harmful to children, or of content that it should take down under its own terms and conditions if harmful to adults, then it must do so. If it fails to do so, Ofcom will have the enforcement powers available to it to compel—ultimately, escalating to a fine of up to 10% of global revenue or even service disconnection.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Will the Minister give way?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I wanted to ask specifically about the resourcing of Ofcom, given the abilities that it will have under this clause. Will Ofcom have enough resource to be able to be that secondary line of defence?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Does the Minister not think this is going to work? He is creating this systems and processes approach, which he suggests will reduce the thousands of complaints—complaints will be made and complaints procedures will be followed. Surely, if it is going to work, in 10 years’ time we are going to need an ombudsman to adjudicate on the individual complaints that go wrong. If this works in the way he suggests, we will not have tens of millions of complaints, as we do now, but an ombudsman would provide individual redress. I get what he is arguing, but I do not know why he is not arguing for both things, because having both would provide the very best level of support.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is a fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike with the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set expectations about whether risk assessments are of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient or that reach an inappropriate or unreasonable assessment of how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that end-to-end encryption is important for free expression, and privacy could justify any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative connotations.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I confirm what the hon. Lady has just said. In response to the hon. Member for Worsley and Eccles South, it is important to say that the duty in clause 19 is “to have regard”, which simply means that a balancing exercise must be performed. It is not determinative; it is not as if the rights in the clause trump everything else. They simply have to be taken into account when making decisions.

To repeat what we discussed on Tuesday, I can explicitly and absolutely confirm to the hon. Member for Aberdeen North that in my view and the Government’s, concerns about freedom of expression or privacy should not trump platforms’ ability to scan for child sexual exploitation and abuse images or protect children. It is our view that there is nothing more important than protecting children from exploitation and sexual abuse.

We may discuss this further when we come to clause 103, which develops the theme a little. It is also worth saying that Ofcom will be able to look at the risk assessments and, if it feels that they are not of an adequate standard, take that up with the companies concerned. We should recognise that the duty to have regard to freedom of expression is not something that currently exists. It is a significant step forward, in my view, and I commend clauses 19 and 29 to the Committee.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I definitely call Kirsty Blackman this time.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I would have been quite happy to move the amendment, but I do not think the Opposition would have been terribly pleased with me if I had stolen it. I have got my name on it, and I am keen to support it.

As I have said, I met the NSPCC yesterday, and we discussed how clause 31(3) might work, should the Minister decide to keep it in the Bill and not accept the amendment. There are a number of issues with the clause, which states that the child user condition is met if

“a significant number of children”

are users of the service, or if the service is

“likely to attract a significant number of users who are children”.

I do not understand how that could work. For example, a significant number of people who play Fortnite are adults, but a chunk of people who play it are kids. If some sort of invisible percentage threshold is applied in such circumstances, I do not know whether that threshold will be met. If only 20% of Fortnite users are kids, and that amounts only to half a million children, will that count as enough people to meet the child access assessment threshold?

Fortnite is huge, but an appropriate definition is even more necessary for very small platforms and services. With the very far-right sites that we have mentioned, it may be that only 0.5% of their users are children, and that may amount only to 2,000 children—a very small number. Surely, because of the risk of harm if children access these incredibly damaging and dangerous sites that groom people for terrorism, they should have a duty to meet the child access requirement threshold, if only so that we can tell them that they must have an age verification process—they must be able to say, “We know that none of our users are children because we have gone through an age verification process.” I am keen for children to be able to access the internet and meet their friends online, but I am keen for them to be excluded from these most damaging sites. I appreciate the action that the Government have taken in relation to pornographic content, but I do not think that this clause allows us to go far enough in stopping children accessing the most damaging content that is outwith pornographic content.

The other thing that I want to raise is about how the number of users will be calculated. The Minister made it very clear earlier on, and I thank him for doing so, that an individual does not have to be a registered user to be counted as a user of a site. People can be members of TikTok, for example, only if they are over 13. TikTok has some hoops in place—although they are not perfect—to ensure that its users are over 13, and to be fair, it does proactively remove users that it suspects are under 13, particularly if they are reported. That is a good move.

My child is sent links to TikTok videos through WhatsApp, however. He clicks on the links and is able to watch the videos, which will pop up in the WhatsApp mini-browser thing or in the Safari browser. He can watch the videos without signing up as a registered user of TikTok and without using the platform itself—the videos come through Safari, for example, rather than through the app. Does the Minister expect that platforms will count those people as users? I suggest that the majority of people who watch TikTok by those means are doing so because they do not have a TikTok account. Some will not have accounts because they are under 13 and are not allowed to by TikTok or by the parental controls on their phones.

My concern is that, if the Minister does not provide clarity on this point, platforms will count just the number of registered users, and will say, “It’s too difficult for us to look at the number of unregistered users, so in working out whether we meet the criteria, we are not even going to consider people who do not access our specific app or who are not registered users in some way, shape or form.” I have concerns about the operation of the provisions and about companies using that “get out of jail free” card. I genuinely believe that the majority of those who access TikTok other than through its platform are children and would meet the criteria. If the Minister is determined to keep subsection (3) and not accept the amendment, I feel that he should make it clear that those users must be included in the counting by any provider assessing whether it needs to fulfil the child safety duties.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I agree with the hon. Lady’s important point, which feeds into the broader question of volume versus risk—no matter how many children see something that causes harm and damage, one is one too many—and the categorisation of service providers into category 1, category 2A and category 2B. The depth of the risk is the problem, rather than the number of people who might be affected. The hon. Lady also alluded to age verification—I am sure we will come to that at some point—which is another can of worms. The important point, which she made well, is about volume versus risk. The point is not how many children see something; even if only a small number of children see something, the damage has been done.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.

I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.

I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention behind the provision is followed through, rather than being an intention that is not actually translated into law?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Colleagues have spoken eloquently to the purpose and effect of the various clauses and schedule 3—the stand part component of this group. On schedule 3, the shadow Minister, the hon. Member for Worsley and Eccles South, asked about timing. The Government share her desire to get this done as quickly as possible. In its evidence a couple of weeks ago, Ofcom said it would be publishing its road map before the summer, which would set out the timetable for moving all this forward. We agree that that is extremely important.

I turn to one or two questions that arose on amendment 22. As always, the hon. Member for Aberdeen North asked a number of very good questions. The first was whether the concept of a “significant number” applied to a number in absolute terms or a percentage of the people using a particular service, and which is looked at when assessing what is significant. The answer is that it can be either—either a large number in absolute terms, by reference to the population of the whole United Kingdom, or a percentage of those using the service. That is expressed in clause 31(4)(a). Members will note the “or” there. It can be a number in proportion to the total UK population or the proportion using a service. I hope that answers the hon. Member’s very good question.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

My concern is about services that meet neither of those criteria—they do not meet the “significant number” criterion in percentage terms because, say, only 0.05% of their users are children, and they do not meet it in population terms because they are a pretty small platform and have only, say, 1,000 child users—but whose child users are at very high risk because of the nature of the platform or the service provided. My concern is for those at highest risk, where neither of the criteria is met and the service does not have to bother conducting any sort of age verification or access requirements.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

Online Safety Bill (Sixth sitting)

Kirsty Blackman Excerpts
None Portrait The Chair
- Hansard -

Welcome back. I have a few announcements. I have been reassured that we will have no transmission problems this afternoon, and apparently the audio of this morning’s sitting is available if Members want to listen to it. I have no objections to Members taking their jackets off, because it is rather warm this afternoon. We are expecting a Division in the main Chamber at about 4 o’clock, so we will suspend for 15 minutes if that happens.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I am sorry, Ms Rees, but I am afraid that I cannot hear you very well.

None Portrait The Chair
- Hansard -

I will shout a bit in that case.

Clause 8

Illegal content risk assessment duties

Amendment proposed (this day): 10, in clause 8, page 6, line 33, at end insert—

“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”—(Alex Davies-Jones.)

This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.

Question again proposed, That the amendment be made.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.

Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that provision.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Specifically on that, I understand the point the hon. Gentleman is making and appreciate his clarification. However, on something such as Snapchat, if somebody takes a photo, it is sent to somebody else, then disappears immediately, because that is what Snapchat does—the photo is no longer present. It has been produced and created there, but it is not present on the platform. Can the Minister consider whether the Bill adequately covers all the instances he hopes are covered?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady raises an interesting point about time. However, clause 8(5)(d) uses the wording,

“the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content”

and so on. That presence can happen at any time, even fleetingly, as with Snapchat. Even when the image self-deletes after a certain period—so I am told; I have not actually used Snapchat—the presence has occurred. Therefore, that would be covered by clause 8(5)(d).

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The question of proof is a separate one, and that would apply however we drafted the clause. The point is that the clause provides that any presence of a prohibited image would fall foul of the clause. There are also duties on the platforms to take reasonable steps. In the case of matters such as child sexual exploitation and abuse images, there are extra-onerous duties that we have discussed before, for obvious and quite correct reasons.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Will the Minister stress again that in this clause specifically he is talking about facilitating any presence? That is the wording that he has just used. Can he clarify exactly what he means? If the Minister were to do so, it would be an important point for the Bill as it proceeds.

None Portrait The Chair
- Hansard -

Order. Minister, before you continue, before the Committee rose earlier today, there was a conversation about clause 9 being in, and then I was told it was out. This is like the hokey cokey; it is back in again, just to confuse matters further. I was confused enough, so that point needs to be clarified.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the opportunity to speak to amendments to clause 9 and to clauses 23 and 24, which I did not speak on earlier. I am also very grateful that we are being broadcast live to the world and welcome that transparency for all who might be listening.

On clause 9, it is right that the user-to-user services will be required to have specific duties and to take appropriate measures to mitigate and manage the risk of harm to individuals and their likelihood of encountering priority illegal content. Again, however, the Bill does not go far enough, which is why we are seeking to make these important amendments. On amendment 18, it is important to stress that the current scope of the Bill does not capture the range of ways in which child abusers use social networks to organise abuse, including to form offender networks. They post digital breadcrumbs that signpost to illegal content on third-party messaging apps and the dark web, and they share child abuse videos that are carefully edited to fall within content moderation guidelines. This range of techniques, known as child abuse breadcrumbing, is a significant enabler of online child abuse.

Our amendment would give the regulator powers to tackle breadcrumbing and ensure a proactive upstream response. The amendment would ensure that tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material will be brought into regulatory scope. It will not leave that ambiguous. The amendment will also ensure that companies must tackle child abuse at the earliest possible stage. As it stands, the Bill would reinforce companies’ current focus only on material that explicitly reaches the criminal threshold. Because companies do not focus their approach on other child abuse material, abusers can exploit this knowledge to post carefully edited child abuse images and content that enables them to connect and form networks with other abusers. Offenders understand and can anticipate that breadcrumbing material will not be proactively identified or removed by the host site, so they are able to organise and link to child abuse in plain sight.

We all know that child abuse breadcrumbing takes many forms, but techniques include tribute sites where users create social media profiles using misappropriated identities of known child abuse survivors. These are used by offenders to connect with likeminded perpetrators to exchange contact information, form offender networks and signpost child abuse material elsewhere online. In the first quarter of 2021, there were 6 million interactions with such accounts.

Abusers may also use Facebook groups to build offender groups and signpost to child abuse hosted on third-party sites. Those groups are thinly veiled in their intentions; for example, as we heard in evidence sessions, groups are formed for those with an interest in children celebrating their 8th, 9th or 10th birthdays. Several groups with over 50,000 members remained alive despite being reported to Meta, and algorithmic recommendations quickly suggested additional groups for those members to join.

Lastly, abusers can signpost to content on third-party sites. Abusers are increasingly using novel forms of technology to signpost to online child abuse, including QR codes, immersive technologies such as the metaverse, and links to child abuse hosted on the blockchain. Given the highly agile nature of the child abuse threat and the demonstrable ability of sophisticated offenders to exploit new forms of technology, this amendment will ensure that the legislation is effectively futureproofed. Technological change makes it increasingly important that the ability of child abusers to connect and form offender networks can be disrupted at the earliest possible stage.

Turning to amendment 21, we know that child abuse is rarely siloed on a single platform or app. Well-established grooming pathways see abusers exploit the design features of social networks to contact children before they move communication across to other platforms, including livestreaming sites, as we have already heard, and encrypted messaging services. Offenders manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with a large number of children. They can then use direct messages to groom them and coerce children into sending sexual images via WhatsApp. Similarly, as we heard earlier, abusers can groom children through playing videogames and then bringing them on to another ancillary platform, such as Discord.

The National Society for the Prevention of Cruelty to Children has shared details of an individual whose name has been changed, and whose case particularly highlights the problems that children are facing in the online space. Ben was 14 when he was tricked on Facebook into thinking he was speaking to a female friend of a friend, who turned out to be a man. Using threats and blackmail, he coerced Ben into sending abuse images and performing sex acts live on Skype. Those images and videos were shared with five other men, who then bombarded Ben with further demands. His mum, Rachel, said:

“The abuse Ben suffered had a devastating impact on our family. It lasted two long years, leaving him suicidal.

It should not be so easy for an adult to meet and groom a child on one site then trick them into livestreaming their own abuse on another app, before sharing the images with like-minded criminals at the click of a button.

Social media sites should have to work together to stop this abuse happening in the first place, so other children do not have to go through what Ben did.”

The current drafting of the Bill does not place sufficiently clear obligations on platforms to co-operate on the cross-platform nature of child abuse. Amendment 21 would require companies to take reasonable and proportionate steps to share threat assessments, develop proportionate mechanisms to share offender intelligence, and create a rapid response arrangement to ensure that platforms develop a coherent, systemic approach to new and emerging threats. Although the industry has developed a systemic response to the removal of known child abuse images, arrangements for sharing information on highly agile risk profiles remain largely ad hoc. The cross-platform nature of grooming and the interplay of harms across multiple services need to be taken into account. If that is not addressed explicitly in the Bill, we are concerned that companies may be able to cite competition concerns to avoid taking action.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On the topic of child abuse images, the hon. Member spoke earlier about livestreaming and those images not being captured. I assume that she would make the same point in relation to this issue: these live images may not be captured by AI tools scanning for them, so it is really important that they are included in the Bill in some way as well.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree with the hon. Member, and appreciate her intervention. It is fundamental for this point to be captured in the Bill because, as we are seeing, this is happening more and more. More and more victims are coming forward who have been subject to livestreaming that is not picked up by the technology available, and is then recorded and posted elsewhere on smaller platforms.

Legal advice suggests that cross-platform co-operation is likely to be significantly impeded by the negative interplay with competition law unless there is a clear statutory basis for enabling or requiring collaboration. Companies may legitimately have different risk and compliance appetites, or may simply choose to hide behind competition law to avoid taking a more robust form of action.

New and emerging technologies are likely to produce an intensification of cross-platform risks in the years ahead, and we are particularly concerned about the child abuse impacts in immersive virtual reality and alternative-reality environments, including the metaverse. A number of high-risk immersive products are already designed to be platform-agnostic, meaning that in-product communication takes place between users across multiple products and environments. There is a growing expectation that these environments will be built along such lines, with an incentive for companies to design products in this way in the hope of blunting the ability of Governments to pursue user safety objectives.

Separately, regulatory measures that are being developed in the EU, but are highly likely to impact service users in the UK, could result in significant unintended safety consequences. Although the interoperability provisions in the Digital Markets Act are strongly beneficial when viewed through a competition lens—they will allow competition between, and communication across, multiple platforms—they could, without appropriate safety mitigations, provide new means for abusers to contact children across multiple platforms, significantly increase the overall profile of cross-platform risk, and actively frustrate a broad number of current online safety responses. Amendment 21 will provide corresponding safety requirements that can mitigate the otherwise significant potential for unintended consequences.

The Minister referred to clauses 23 and 24 in relation to amendments 30 and 31. We think a similar consideration should apply for search services as well as for user-to-user services. We implore that the amendments be made, in order to prevent those harms from occurring.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.

Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the

“risk of individuals who are users of the service encountering…each kind of priority illegal content”.

If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The amendment specifically mentions the level and rates of those images. I did not quite manage to follow through all the things that the Minister just spoke about, but does the clause specifically talk about the level of those things, rather than individual incidents, the possibility of incidents or some sort of threshold for incidents, as in some parts of the Bill?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The risk assessments that clause 8 requires have to be suitable and sufficient; they cannot be perfunctory and inadequate in nature. I would say that suitable and sufficient means they must go into the kind of detail that the hon. Lady requests. More details, most of which relate to timing, are set out in schedule 3. Ofcom will be making sure that these risk assessments are not perfunctory.

Importantly, in relation to CSEA reporting, clause 59, which we will come to, places a mandatory requirement on in-scope companies to report to the National Crime Agency all CSEA content that they detect on their platforms, if it has not already been reported. Not only is that covered by the risk assessments, but there is a criminal reporting requirement here. Although the objectives of amendments 17 and 28 are very important, I submit to the Committee that the Bill delivers the intention behind them already, so I ask the shadow Minister to withdraw them.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.

On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.

I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.

That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I have a couple of comments, particularly about amendments 15 and 16, which the Minister has just spoken about at some length. I do not agree with the Government’s assessment that the governance subsection is adequate. It states that the risk assessment must take into account

“how the design and operation of the service (including the business model, governance, use of proactive technology…) may reduce or increase the risks identified.”

It is actually an assessment of whether the governance structure has an impact on the risks identified. The subsection has no bearing whatever on the level at which the risk assessment is approved or not approved; it is about the risks that the governance structure poses to children or adults, depending on which section of the Bill we are looking at.

The Minister should consider what is being asked in the amendment, which is about the decision-making level at which the risk assessments are approved. I know the Minister has spoken already, but some clarification would be welcome. Does he expect a junior tech support member of staff, or a junior member of the legal team, to write the risk assessment and then put it in a cupboard? Or perhaps they approve it themselves and then nothing happens with it until Ofcom asks for it. Does he think that Ofcom would look unfavourably on behaviour like that? If he was very clear with us about that, it might put our minds at rest. Does he think that someone in a managerial position or a board member, or the board itself, should take decisions, rather than a very junior member of staff? There is a big spread of people who could be taking decisions. If he could give us an indication of what Ofcom might look favourably on, it would be incredibly helpful for our deliberations.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am anxious about time, but I will respond to that point because it is an important one. The hon. Lady is right to say that clause 10(6)(h) looks to identify the risks associated with governance. That is correct—it is a risk assessment. However, in clause 11(2)(a), there is a duty to mitigate those risks, having identified what the risks are. If, as she hypothesised, a very junior person were looking at these matters from a governance point of view, that would be identified as a risk. If it were not, Ofcom would find that the risk assessment was not sufficient or suitable. That would breach clause 10(2), and the service would then be required to mitigate. If it did not mitigate the risks by having a more senior person take the decision, Ofcom would take enforcement action for its failure under clause 11(2)(a).

For the record, should Ofcom or lawyers consult the transcript to ascertain Parliament’s intention in the course of future litigation, it is absolutely the Government’s view, as I think it is the hon. Lady’s, that a suitable level of decision making for a children’s risk assessment would be a very senior level. The official Opposition clearly think that, because they have put it in their amendment. I am happy to confirm that, as a Minister, I think that. Obviously the hon. Lady, who speaks for the SNP, does too. If the transcripts of the Committee’s proceedings are examined in the future to ascertain Parliament’s intention, Parliament’s intention will be very clear.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

May I say—this might be a point of order—how my constituency name is pronounced? I get a million different versions, but it is Worsley, as in “worse”. It is an unfortunate name for a great place.

I will speak to all the amendments in the group together, because they relate to how levels of risk are assessed in relation to certain characteristics. The amendments are important because small changes to the descriptions of risk assessment will help to close a significant gap in protection.

Clauses 10 and 12 introduce a duty on regulated companies to assess harms to adults and children who might have an innate vulnerability arising from being a member of a particular group or having a certain characteristic. However, Ofcom is not required to assess harms to people other than children who have that increased innate vulnerability. Amendment 71 would require Ofcom to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups as part of its risk register. That would reduce the regulatory burden if companies had Ofcom’s risk assessment to base their work on.

Getting this right is important. The risk management regime introduced by the Bill should not assume that all people are at the same risk of harm—they are clearly not. Differences in innate vulnerability increase the incidence and impact of harm, such as by increasing the likelihood of encountering content or of that content being harmful, or heightening the impact of the harm.

It is right that the Bill emphasises the vulnerability of children, but there are other, larger groups with innate vulnerability to online harm. As we know, that often reflects structural inequalities in society.

For example, women will be harmed in circumstances where men might not be, and they could suffer some harms that have a more serious impact than they might for men. A similar point can be made for people with other characteristics. Vulnerability is then compounded by intersectional issues—people might belong to more than one high-risk group—and I will come to that in a moment.

The initial Ofcom risk assessment introduced by clause 83 is not required to consider the heightened risks to different groups of people, but companies are required to assess that risk in their own risk assessments for children and adults. They need to be given direction by an assessment by Ofcom, which amendment 71 would require.

Amendments 72 to 75 address the lack of recognition in these clauses of intersectionality issues. They are small amendments in the spirit of the Bill’s risk management regime. As drafted, the Bill refers to a singular “group” or “characteristic” for companies to assess for risk. However, some people are subject to increased risks of harm arising from being members of more than one group. Companies’ risk assessments for children and adults should reflect intersectionality, and not just characteristics taken individually. Including the plural of “group” and “characteristic” in appropriate places would achieve that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.

I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.

It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.

We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.

As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Once again, the Government recognise the intent behind these amendments and support the concept that people with multiple intersecting characteristics, or those who are members of multiple groups, may experience—or probably do experience—elevated levels of harm and abuse online compared with others. We completely understand and accept that point, as clearly laid out by the hon. Member for Aberdeen North.

There is a technical legal reason why the singular forms “characteristic” and “group” are used here. Section 6(c) of the Interpretation Act 1978 sets out how words in Bills and Acts are interpreted, namely that words in the singular also cover the plural. That means that references in the singular, such as

“individuals with a certain characteristic”

in clause 10(6)(d), also cover characteristics in the plural. A reference to the singular implies a reference to the plural.

Will those compounded risks, where they exist, be taken into account? The answer is yes, because the assessments must assess the risk in front of them. Where there is evidence that multiple protected characteristics or the membership of multiple groups produce compounded risks, as the hon. Lady set out, the risk assessment has to reflect that. That includes the general sectoral risk assessment carried out by Ofcom, which is detailed in clause 83, and Ofcom will then produce guidance under clause 84.

The critical point is that, because there is evidence of high levels of compounded risk when people have more than one characteristic, that must be reflected in the risk assessment, otherwise it is inadequate. I accept the point behind the amendments, but I hope that that explains, with particular reference to the 1978 Act, why the Bill as drafted covers that valid point.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I would be delighted to speak to the amendment, which would change the existing user empowerment duty in clause 14 to require category 1 services to enable adult users to see whether other users are verified. In effect, however, that objective already follows as a natural consequence of the duty in clause 14(6). When a user decides to filter out non-verified users, by definition such users will be able to see content only from verified users, so they could see from that who was verified and who was not. The effect intended by the amendment, therefore, is already achieved through clause 14(6).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am sorry to disagree with the Minister so vigorously, but that is a rubbish argument. It does not make any sense. There is a difference between wanting to filter out everybody who is not verified and wanting to actually see if someone who is threatening someone else online is a verified or a non-verified user. Those are two very different things. I can understand why a politician, for example, might not want to filter out unverified users but would want to check whether a person was verified before going to the police to report a threat.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

When it comes to police investigations, if something is illegal and merits a report to the police, users should report it, regardless of whether someone is verified or not—whatever the circumstances. I would encourage any internet user to do that. That effectively applies on Twitter already; some people have blue ticks and some people do not, and people should report others to the police if they do something illegal, whether or not they happen to have a blue tick.

Amendment 47 seeks to create a definition of identity verification in clause 189. In addition, it would compel the person’s real name to be displayed. I understand the spirit of the amendment, but there are two reasons why I would not want to accept it and would ask hon. Members not to press it. First, the words “identity verification” are ordinary English words with a clear meaning and we do not normally define in legislation ordinary English words with a clear meaning. Secondly, the amendment would add the new requirement that, if somebody is verified, their real name has to be displayed, but I do not think that that is the effect of the drafting as it stands. Somebody may be verified, and the company knows who they are—if the police go to the company, they will have the verified information—but there is no obligation, as the amendment is drafted, for that information to be displayed publicly. The effect of that part of the amendment would be to force users to choose between disclosing their identity to everyone or having no control over who they interact with. That may not have been the intention, but I am not sure that this would necessarily make sense.

New clause 8 would place requirements on Ofcom about how to produce guidance on user identity verification and what that guidance must contain. We already have provisions on that in clause 58, which we will no doubt come to, although probably not later on today—maybe on Thursday. Clause 58 allows Ofcom to include in its regulatory guidance the principles and standards referenced in the new clause, which can then assist service providers in complying with their duties. Of course, if they choose to ignore the guidelines and do not comply with their duties, they will be subject to enforcement action, but we want to ensure that there is flexibility for Ofcom, in writing those guidelines, and for companies, in following those guidelines or taking alternative steps to meet their duty.

This morning, a couple of Members talked about the importance of remaining flexible and being open to future changes in technology and a wide range of user needs. We want to make sure that flexibility is retained. As drafted, new clause 8 potentially undermines that flexibility. We think that the powers set out in clause 58 give Ofcom the ability to set the relevant regulatory guidance.

Clause 14 implements the proposals made by my hon. Friend the Member for Stroud in her ten-minute rule Bill and the proposals made, as the shadow Minister has said, by a number of third-party stakeholders. We should all welcome the fact that these new user empowerment duties have now been included in the Bill in response to such widespread parliamentary lobbying.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I rise to speak to amendments 105 and 106, in my name, on protecting democracy and democratic debate.

Within the Bill, there are significant clauses intended to prevent the spread of harm online, to protect women and girls against violence and to help prevent child sexual exploitation, while at the same time protecting the right of journalists to do their jobs. Although those clauses are not perfect, I welcome them.

The Bill is wide-ranging. The Minister talked on Second Reading about the power in clause 150 to protect another group—those with epilepsy—from being trolled with flashing images. That subject is close to my heart due to the campaign for Zach’s law—Zach is a young boy in my constituency. I know we will return to that important issue later in the Committee, and I thank the Minister for his work on it.

In protecting against online harm while preserving fundamental rights and values, we must also address the threats posed to those involved in the democratic process. Let me be clear: this is not self-serving. It is about not just MPs but all political candidates locally and nationally and those whose jobs facilitate the execution of our democratic process and political life: the people working on elections or for those elected to public office at all levels across the UK. These people must be defended from harm not only for their own protection, but to protect our democracy itself and, with it, the right of all our citizens to a political system capable of delivering on their priorities free from threats and intimidation.

Many other groups in society are also subjected to a disproportionate amount of targeted abuse, but those working in and around politics sadly receive more than almost any other people in this country, with an associated specific set of risks and harms. That does not mean messages gently, or even firmly, requesting us to vote one way or another—a staple of democratic debate—but messages of hate, abuse and threats intended to scare people in public office, grind them down, unfairly influence their voting intentions or do them physical and psychological harm. That simply cannot be an acceptable part of political life.

As I say, we are not looking for sympathy, but we have a duty to our democracy to try to stamp that out from our political discourse. Amendment 105 would not deny anybody the right to tell us firmly where we are going wrong—quite right, too—but it is an opportunity to draw the essential distinction between legitimately holding people in public life to account and illegitimate intimidation and harm.

The statistics regarding the scale of online abuse that MPs receive are shocking. In 2020, a University of Salford study found that MPs received over 7,000 abusive or hate-filled tweets a month. Seven thousand separate messages of harm a month on Twitter alone directed at MPs is far too many, but who in this room does not believe that the figure is almost certainly much higher today? Amnesty conducted a separate study in 2017 looking at the disproportionate amount of abuse that women and BAME MPs faced online, finding that my right hon. Friend the Member for Hackney North and Stoke Newington (Ms Abbott) was the recipient of almost a third of all the abusive tweets analysed, as alluded to already by the hon. Member for Edinburgh—

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Aberdeen North.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I knew that. [Laughter.]

Five years later, we continue to see significant volumes of racist, sexist and homophobic hate-filled abuse and threats online to politicians of all parties. That is unacceptable in itself, but we must ask whether this toxic environment helps to keep decent people in politics or, indeed, attracts good people into politics, so that our democracy can prosper into the future across the political spectrum. The reality we face is that our democracy is under attack online each and every day, and every day we delay acting is another day on which abuse becomes increasingly normalised or is just seen as part of the job for those who have put themselves forward for public service. This form of abuse harms society as a whole, so it deserves specific consideration in the Bill.

While elected Members and officials are not a special group of people deserving of more legal protections than anyone else, we must be honest that the abuse they face is distinct and specific to those roles and directly affects our democracy itself. It can lead to the most serious physical harm, with two Members of Parliament having been murdered in the last six years, and many others face death threats or threats of sexual or other violence on a daily basis. However, this is not just about harm to elected representatives; online threats are often seen first, and sometimes only, by their members of staff. They may not be the intended target, but they are often the people harmed most. I am sure we all agree that that is unacceptable and cannot continue.

All of us have probably reported messages and threats to social media platforms and the police, with varying degrees of success in terms of having them removed or the individuals prosecuted. Indeed, we sadly heard examples of that from my hon. Friend the shadow Minister. Often we are told that nothing can be done. Currently, the platforms look at their own rules to determine what constitutes freedom of speech or expression and what is hateful speech or harm. That fine line moves. There is no consistency across platforms, and we therefore urgently need more clarity and a legal duty in place to remove that content quickly.

Amendment 105 would explicitly include in the Bill protection and consideration for those involved in UK elections, whether candidates or staff. Amendment 106 would go further and place an obligation on Ofcom to produce a code of practice, to be issued to the platforms. It would define what steps platforms must take to protect those involved in elections and set out what content is acceptable or unacceptable to be directed at them.

--- Later in debate ---
Let us be honest: will this amendment solve the issue entirely? No. However, does more need to be done to protect our democracy? Yes. I am in constant conversation with people and organisations in this sector about what else could be brought forward to assist the police and the Crown Prosecution Service in prosecuting those who wish to harm those elected to public office—both online and offline. Directly addressing the duty of platforms to review content, remove harmful speech and report those who wish to do harm would, I believe, be a positive first step towards protecting our democratic debate and defending those who work to make it effective on behalf of the people of the United Kingdom.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I want to make a few comments on the amendment. As a younger female parliamentarian, I find that I am often asked to speak to young people about becoming an MP or getting involved in politics. I find it difficult to say to young women, “Yes, you should do this,” and most of the reason for that is what people are faced with online: a female MP cannot have a Twitter account without facing abuse. I am sure male MPs face abuse as well, but it tends to be worse for women.

We cannot engage democratically with constituents on social media platforms without receiving abuse and sometimes threats as well. Much of that abuse does not necessarily meet the threshold for illegality, but it makes those platforms pretty foul and toxic places to be. There have been times when I have deleted Twitter from my phone because I just need to get away from the vile abuse that is being directed towards me. I want, in good conscience, to be able to make an argument to people that this is a brilliant job, and it is brilliant to represent constituents and to make a difference on their behalf at whatever level of elected politics, but right now I do not feel that I am able to do that.

When my footballing colleague, the hon. Member for Batley and Spen, mentions “UK elections” in the amendment, I assume she means that in the widest possible way—elections at all levels.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

indicated assent.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Sometimes we miss out the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even if there were an election coming up, we would have a level of security protection and access that is much higher than for anybody else challenging a candidate or standing in a council or a Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember, and I assume this is why the amendment is drawn in a pretty broad way, that everybody standing for any sort of elected office faces a significant risk of harm—again, whether or not that meets the threshold for illegality.

There are specific things that have been mentioned. As has been said, epilepsy has been specifically mentioned as an area where particular harm occurs. Given the importance of democracy, which is absolutely vital, we need to have a democratic system where people are able to stand in elections and make their case. That is why we have election addresses and a system where the election address gets delivered through every single person’s door. There is an understanding and acceptance by people involved in designing democratic processes that the message of all candidates needs to get out there. If the message of all candidates cannot get out there because some people are facing significant levels of abuse online, then democracy is not working in the way that it should be. These amendments are fair and make a huge amount of sense. They protect the most important tenets of democracy and democratic engagement.

I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.

Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have listened very carefully to both the hon. Member for Batley and Spen and the hon. Member for Aberdeen North. I agree with both of them that abuse and illegal activity directed at anyone, including people running for elected office, is unacceptable. I endorse and echo the comments they made in their very powerful and moving speeches.

In relation to the technicality of these amendments, what they are asking for is in the Bill already but in different places. This clause is about protecting content of “democratic importance” and concerns stopping online social media firms deleting content through over-zealous takedown. What the hon. Members are talking about is different. They are talking about abuse and illegal activities, such as rape threats, that people get on social media, particularly female MPs, as they both pointed out. I can point to two other places in the Bill where what they are asking for is delivered.

First, there are the duties around illegal content that we debated this morning. If there is content online that is illegal—some of the stuff that the shadow Minister referred to earlier sounds as if it would meet that threshold—then in the Bill there is a duty on social media firms to remove that content and to proactively prevent it if it is on the priority list. The route to prosecution will exist in future, as it does now, and the user-verification measures, if a user is verified, make it more likely for the police to identify the person responsible. In the context of identifying people carrying out abuse, I know the Home Office is looking at the Investigatory Powers Act 2016 as a separate piece of work that speaks to that issue.

So illegal content is dealt with in the illegal content provisions in the Bill, but later we will come to clause 150, which updates the Malicious Communications Act 1988 and creates a new harmful communications offence. Some of the communications that have been described may not count as a criminal offence under other parts of criminal law, but if they meet the test of harmful communication in clause 150, they will be criminalised and will therefore have to be taken down, and prosecution will be possible. In meeting the very reasonable requests that the hon. Members for Batley and Spen and for Aberdeen North have made, I would point to those two parts of the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

But clause 150(5) says that if a message

“is, or is intended to be, a contribution to a matter of public interest”,

people are allowed to send it, which basically gives everybody a get-out clause in relation to anything to do with elections.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

No, it does not.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I know we are not discussing that part of the Bill, and if the Minister wants to come back to this when we get to clause 150, I have no problem with that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will answer the point now, as it has been raised. Clause 150 categorically does not give a get-out-of-jail-free card or provide an automatic excuse. Clearly, there is no way that abusing a candidate for elected office with rape threats and so on could possibly be considered a matter of public interest. In fact, even if the abuse somehow could be considered as possibly contributing to public debate, clause 150(5) says explicitly in line 32 on page 127:

“but that does not determine the point”.

Even where there is some potentially tenuous argument about a contribution to a matter of public interest, which most definitely would not be the case for the rape threats that have been described, that is not determinative. It is a balancing exercise that gets performed, and I hope that puts the hon. Lady’s mind at rest.

Online Safety Bill (Fifth sitting)

Kirsty Blackman Excerpts
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We do not oppose clauses 2, 3 or 4, or the intentions of schedules 1 and 2, and have not sought to amend them at this stage, but this is an important opportunity to place on record some of the Opposition’s concerns as the Bill proceeds.

The first important thing to note is the broadness in the drafting of all the definitions. A service has links to the UK if it has a significant number of users in the UK, if the UK users are a target market, or if

“there are reasonable grounds to believe there is a material risk of significant harm to individuals”

in the UK using the service. Thus, territorially, a very wide range of online services could be caught. The Government have estimated in their impact assessment that 25,100 platforms will be in scope of the new regime, which is perhaps a conservative estimate. The impact assessment also notes that approximately 180,000 platforms could potentially be considered in scope of the Bill.

The provisions on extraterritorial jurisdiction are, again, extremely broad and could lead to some international platforms seeking to block UK users in a way similar to that seen following the introduction of GDPR. Furthermore, as has been the case under GDPR, those potentially in scope through the extraterritorial provisions may vigorously resist attempts to assert jurisdiction.

Notably absent from schedule 1 is an attempt to include or define how the Bill and its definitions of services that are exempt may adapt to emerging future technologies. The Minister may consider that a matter for secondary legislation, but as he knows, the Opposition feel that the Bill already leaves too many important matters to be determined at a later stage via statutory instruments. Although it is good to see that the Bill has incorporated everyday internet behaviour such as a like or dislike button, as well as factoring in the use of emojis and symbols, it fails to consider how technologies such as artificial intelligence will sit within the framework as it stands.

It is quite right that there are exemptions for everyday user-to-user services such as email, SMS and MMS, and there is an all-important balance to strike between our fundamental right to privacy and keeping people safe online. That is where some difficult questions arise on platforms such as WhatsApp, which have end-to-end encryption embedded as a standard feature. Concerns have been raised about Meta’s plans to extend that feature to Instagram and Facebook Messenger.

The Opposition also have concerns about private messaging features more widely. Research from the Centre for Missing and Exploited Children highlighted the fact that a significant majority of online child abuse takes place in private messages. For example, 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Furthermore, recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people they have not met offline before. Nearly three quarters—74%—of cases of children contacted by someone they do not know initially take place by private message. We will address this issue further in new clause 20, but I wanted to highlight those exemptions early on, as they are relevant to schedule 1.

On a similar point, we remain concerned that emerging online systems such as the metaverse have had no consideration in the Bill as it stands. Only last week, colleagues will have read about a researcher from SumOfUs, a non-profit organisation that seeks to limit the power of large corporations, who claimed that she experienced sexual assault by a stranger in Meta’s virtual reality space, Horizon Worlds. The organisation’s report said:

“About an hour into using the platform, a SumOfUs researcher was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see—all while another user in the room watched and passed around a vodka bottle.”

There is currently no clear indication of how these very real technologies will sit within the Bill more widely. Even more worryingly, there has been no consideration of how artificial intelligence systems such as Horizon Worlds, with clear user-to-user functions, fit within the exemptions in schedule 1. If we are to see exemptions for internal business services or services provided by public bodies, along with many others, as outlined in the schedule, we need to make sure that the exemptions are fit for purpose and in line with the rapidly evolving technology that is widely available overseas. Before long, I am sure that reality spaces such as Horizon Worlds will become more and more commonplace in the UK too.

I hope that the Minister can reassure us all of his plans to ensure that the Bill is adequately future-proofed to cope with the rapid expansion of the online space. Although we do not formally oppose the provisions outlined in schedule 1, there is much work to be done to ensure that the current exemptions remain applicable to future technologies too.

Turning to schedule 2, the draft Bill was hugely lacking in provisions to tackle pornographic content, so it is a welcome step that we now see some attempts to tackle the rate at which pornographic content is easily accessed by children across the country. As we all know, the draft Bill only covered pornography websites that allow user-generated content such as OnlyFans. I am pleased to see that commercial pornography sites have now been brought within scope. This positive step forward has been made possible thanks to the incredible efforts of campaigning groups, of which there are far too many to mention, and from some of which we took evidence. I pay tribute to them today. Over the years, it is thanks to their persistence that the Government have been forced to take notice and take action.

Once again—I hate to repeat myself—I urge the Minister to consider how far the current definitions outlined in schedule 2 relating to regulated provider pornographic content will stretch to cover virtual technologies such as those I referred to earlier. We are seeing an increase in all types of pornographic and semi-pornographic content that draws on AI or virtual technology. An obvious example is the now thankfully defunct app that was making the rounds online in 2016 called DeepNude. While available, the app used neural networks to remove clothing from images of women, making them look realistically nude. The ramifications and potential for technology like this to take over the pornographic content space are essentially limitless.

I urge the Minister carefully to keep in mind the future of the online space as we proceed. More specifically, the regulation of pornographic content in the context of keeping children safe is an area where we can all surely get on board. The Opposition have no formal objection at this stage to the provisions outlined in schedule 2.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Thank you, Sir Roger, for chairing our sittings. It is a pleasure to be part of this Bill Committee. I have a couple of comments on clause 2 and more generally.

The Opposition spokesperson, the hon. Member for Pontypridd, made some points about making sure that we are future-proofing the Bill. There are some key issues where we need to make sure that we are not going backwards. That particularly includes private messaging. We need to make sure that the ability to use AI to find illegal content, such as child sexual abuse material, in private messages is retained as it works currently, and that the Bill does not accidentally bar those very important safeguards from continuing. That is one area where we need to be clear on the best way to go forward with the Bill.

Future-proofing is important—I absolutely agree that we need to ensure that the Bill either takes into account the metaverse and virtual reality or ensures that provisions can be amended in future to take into account the metaverse, virtual reality and any other emerging technologies that we do not know about and cannot even foresee today. I saw a meme online the other day that was somebody taking a selfie of themselves wearing a mask and it said, “Can you imagine if we had shown somebody this in 1995 and asked them what this was? They wouldn’t have had the faintest idea.” The internet changes so quickly that we need to ensure that the Bill is future-proofed, but we also need to make sure that it is today-proofed.

I still have concerns, which I raised on Second Reading, about whether the Bill adequately encompasses the online gaming world, where a huge number of children use the internet—and where they should use it—to interact with their friends in a safe way. A lot of online gaming is free from the bullying that can be seen in places such as WhatsApp, Snapchat and Instagram. We need to ensure that those safeguards are included for online gaming. Private messaging is a thing in a significant number of online games, but many people use oral communication—I am thinking of things such as Fortnite and Roblox, which is apparently a safe space, according to Roblox Corporation, but according to many researchers is a place where an awful lot of grooming takes place.

My other question for the Minister—I am not bothered if I do not get an answer today, as I would rather have a proper answer than the Minister try to come up with an answer right at this moment—is about what category the App Store and the Google Play Store fall into.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I am reluctant to do that. It is a technical fault and it is clearly undesirable, but I do not think we can suspend the Committee for the sake of a technical problem. Every member of the public who wishes to express an interest in these proceedings is able to be present if they choose to do so. Although I understand the hon. Lady’s concern, we have to continue. We will get it fixed as soon as we can.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

Will the hon. Lady give way?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Absolutely.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

You are making some really important points about the world of the internet and online gaming for children and young people. That is where we need some serious consideration of obligations on providers regarding media literacy for both children and grown-ups. Many people with children know that this is a really dangerous space for young people, but we are not quite sure we have enough information to understand what the threats, risks and harms are. That point about media literacy, particularly in regard to the gaming world, is really important.

None Portrait The Chair
- Hansard -

Order. Before we proceed, the same rules apply in Committee as on the Floor of the House to this extent: the Chair is “you”, and you speak through the Chair, so it is “the hon. Lady”. [Interruption.] One moment.

While I am on my feet, I should perhaps have said earlier, and will now say for clarification, that interventions are permitted in exactly the same way as they are on the Floor of the House. In exactly the same way, it is up to the Member who has the Floor to decide whether to give way or not. The difference between these debates and those on the Floor of the House is of course that on the Floor of the House a Member can speak only once, whereas in Committee you have the opportunity to come back and speak again if you choose to do so. Once the Minister is winding up, that is the end of the debate. The Chair would not normally admit, except under exceptional circumstances, any further speech, as opposed to an intervention.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Sir Roger.

I do not want to get sidetracked, but I agree that there is a major parental knowledge gap. Tomorrow’s parents will have grown up on the internet, so in 20 years’ time we will not have that knowledge gap, but today media literacy is lacking, particularly among parents as well as among children. In Scotland, media literacy is embedded in the curriculum; I am not entirely sure what the system is in the rest of the UK. My children are learning media literacy in school, but there is still a media literacy gap for parents. My local authority is doing a media literacy training session for parents tomorrow night, which I am very much looking forward to attending so that I can find out even more about how to keep my children safe online.

I was asking the Minister about the App Store and the Google Play Store. I do not need an answer today, but one at some point would be really helpful. Do the App Store, the Google Play Store and other stores of that nature fall under the definition of search engines or of user-to-user content? The reality is that if somebody creates an app, presumably they are a user. Yes, it has to go through an approval process by Apple or Google, but once it is accepted by them, it is not owned by them; it is still owned by the person who generated it. Therefore, are those stores considered search engines, in that they are simply curating content, albeit moderated content, or are they considered user-to-user services?

That is really important, particularly when we are talking about age verification and children being able to access various apps. The stores are the key gateways where children get apps. Once they have an app, they can use all the online services that are available on it, in line with whatever parental controls parents choose to put in place. I would appreciate an answer from the Minister, but he does not need to provide it today. I am happy to receive it at a later time, if that is helpful.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I want to pick up on two issues, which I hope the Minister can clarify in his comments at the end of this section.

First, when we took evidence, the Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Bill, so that it does not lose the ability to pick up child abuse images, as has already been referred to in the debate. The ability to scan end-to-end encrypted content is crucial. Will the Minister clarify whether that is in scope and whether the IWF will be able to continue its important work in safeguarding children?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

A number of people have raised concerns about freedom of speech in relation to end-to-end encryption. Does the right hon. Lady agree with me that there should not be freedom of speech when it comes to child sexual abuse images, and that it is reasonable for those systems to check for such images?

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

The hon. Lady is right to pick up on the nuance and the balance that we have to strike in legislation between freedom of speech and the protection of vulnerable individuals and children. I do not think there can be many people, particularly among those here today, who would want anything to trump the safeguarding of children. Will the Minister clarify exactly how the Bill works in relation to such important work?

Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software takes photographs only of women, because its database relates only to female figures, and makes them appear to be completely naked. Does that software fall in scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I want to just put it on the record that the irony is not lost on me that we are having tech issues relating to the discussion of the Online Safety Bill. The Opposition have huge concerns regarding clause 5. We share the frustrations of stakeholders who have been working on these important issues for many years and who feel the Bill has been drafted in an overly complex way. In its evidence, the Carnegie UK Trust outlined its concerns over the complexity of the Bill, which will likely lead to ineffective regulation for both service users and companies. While the Minister is fortunate to have a team of civil servants behind him, he will know that the Opposition sadly do not share the same level of resources—although I would like to place on the record my sincere thanks to my researcher, Freddie Cook, who is an army of one all by herself. Without her support, I would genuinely not know where I was today.

Complexity is an issue that crops up time and again when speaking with charities, stakeholders and civil society. We all recognise that the Bill will have a huge impact however it passes, but the complexity of its drafting is a huge barrier to implementation. The same can be said for the regulation. A Bill as complex as this is likely to lead to ineffective regulation for both service users and companies, who, for the first time, will be subject to specific requirements placed on them by the regulator. That being said, we absolutely support steps to ensure that providers of regulated user-to-user services and regulated search services have to abide by a duty of care regime, which will also see the regulator able to issue codes of practice.

I would also like to place on record my gratitude—lots of gratitude today—to Professor Lorna Woods and Will Perrin, who we heard from in evidence sessions last week. Alongside many others, they have been and continue to be an incredible source of knowledge and guidance for my team and for me as we seek to unpick the detail of this overly complex Bill. Colleagues will also be aware that Professor Woods and Mr Perrin originally developed the idea of a duty of care a few years ago now; their model was based on the idea that social media providers should be,

“seen as responsible for public space they have created, much as property owners or operators are in a physical world.”

It will come as no surprise to the Minister that Members of the Opposition fully fall behind that definition and firmly believe that forcing platforms to identify and act on harms that present a reasonable chance of risk is a positive step forward.

More broadly, we welcome moves by the Government to include specific duties on providers of services likely to be accessed by children, although I have some concerns about just how far they will stretch. Similarly, although I am sure we will come to address those matters in the debates that follow, we welcome steps to require Ofcom to issue codes of practice, but have fundamental concerns about how effective they will be if Ofcom is not allowed to remain fully independent and free from Government influence.

Lastly, on subsection 7, I imagine our debate on chapter 7 will be a key focus for Members. I know attempts to define key terms such as “priority content” will be a challenge for the Minister and his officials but we remain concerned that there are important omissions, which we will come to later. It is vital that those key terms are broad enough to encapsulate all the harms that we face online. Ultimately, what is illegal offline must be approached in the same way online if the Bill is to have any meaningful positive impact, which is ultimately what we all want.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I want to make a couple of brief comments. Unfortunately, my hon. Friend the Member for Ochil and South Perthshire is not here as, ironically, he is at the DCMS Committee taking evidence on the Online Safety Bill. That is a pretty unfortunate clash of timing, but that is why I am here solo for the morning.

I wanted to make a quick comment on subsection 7. The Minister will have heard the evidence given on schedule 7 and the fact that the other schedules, particularly schedule 6, have a Scotland-specific section detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which talks about the Scottish and Northern Irish legislation that may be different from England and Wales legislation, but will he give me some comfort that he does intend Scotland-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment on how to add them and a commitment that they will be added if necessary and if he feels that that will add something to the Bill. If he could commit that that will happen, I would appreciate that—obviously, in discussion with Scottish Ministers if amendment 126 is agreed. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

In many ways, clause 6 is the central meat of the Bill. It brings into play a duty of care, which means that people operating online will be subject to the same rules as the rest of us when it comes to the provision of services. But when it comes to the detail, the guidance and codes that will be issued by Ofcom will play a central role. My question for the Minister is: in the light of the evidence that we received, I think in panel three, where the providers were unable to define what was harmful because they had not yet seen codes of practice from Ofcom, could he update us on when those codes and guidance might be available? I understand thoroughly why they may not be available at this point, and they certainly should not form part of the Bill because they need to be flexible enough to be changed in future, but it is important that we know how the guidance and codes work and that they work properly.

Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether his legislation will work.

--- Later in debate ---
Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

The hon. Gentleman brings up an important point. We did hear about that in the evidence. I have no doubt the Secretary of State will not want to interfere in the workings of Ofcom. Having been in his position, I know there would be no desire for the Department to get involved in that, but I can understand why the Government might want the power to ensure things are working as they should. Perhaps the answer to the hon. Gentleman’s question is to have a standing committee scrutinising the effectiveness of the legislation and the way in which it is put into practice. That committee could be a further safeguard against what he implies: an unnecessary overreach of the Secretary of State’s powers.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Sir Roger, for allowing me to intervene again. I was not expecting the standing committee issue to be brought up at this point, but I agree that there needs to be a post-implementation review of the Bill. I asked a series of written questions to Departments about post-legislative review and whether legislation that the Government have passed has had the intended effect. Most of the Departments that answered could not provide information on the number of post-legislative reviews. Of those that could provide me with the information, none of them had managed to do 100% of the post-implementation reviews that they were supposed to do.

It is important that we know how the Bill’s impact will be scrutinised. I do not think it is sufficient for the Government to say, “We will scrutinise it through the normal processes that we normally use,” because it is clear that those normal processes do not work. The Government cannot say that legislation they have passed has achieved the intended effect. Some of it will have and some of it will not have, but we do not know because we do not have enough information. We need a standing committee or another way to scrutinise the implementation.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I thank the hon. Lady for raising this point. Having also chaired a Select Committee, I can understand the sensitivities around whether this might fall under the current DCMS Committee, but the reality is that the Bill’s complexity and other pressures on the DCMS Committee mean that this perhaps should be seen as an exceptional circumstance—in no way is that meant as disrespect to that Select Committee, which is extremely effective in what it does.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I completely agree. Having sat on several Select Committees, I am aware of the tight timescales. There are not enough hours in the day for Select Committees to do everything that they would like to do. It would be unfortunate and undesirable were this matter to be one that fell between the cracks. Perhaps DCMS will bring forward more legislation in future that could fall between the cracks. If the Minister is willing to commit to a standing committee or anything in excess of the normal governmental procedures for review, that would be a step forward from the position that we are currently in. I look forward to hearing the Minister’s views on that.

--- Later in debate ---
Our amendment 69 would require regulated companies to designate a senior manager as a safety controller who is legally responsible for ensuring that the service meets its illegality risk assessment and content safety duties and is criminally liable for significant and egregious failures to protect users from harms. Typically, senior executives in technology companies have not taken their safeguarding responsibilities seriously, and Ofcom’s enforcement powers remain poorly targeted towards delivering child safety outcomes. The Bill is an opportunity to promote cultural change within companies and to embed compliance with online safety regulations at board level but, as it stands, it completely fails to do so.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I do not intend to speak to this specific point, but I wholeheartedly agree and will be happy to back amendment 69, should the hon. Lady press it to a vote.

--- Later in debate ---
Secondly, providing information is pretty cut and dried. We say, “Give us that information. Have you provided it—yes or no? Is that information accurate—yes or no?” It is pretty obvious what the individual executive must do to meet that duty. When it comes to some of the other duties, the clarity that comes with information provision is not always there, which makes it harder to justify expanding criminal liability to those circumstances.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Will the Minister give way?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

In a moment.

For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for information provision, with fines of 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK; if people worried that they might go to prison too readily, that might deter them from locating here. I fully recognise that there is a balance to strike, and I feel that the balance is being struck in the right place.

I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I appreciate that the Minister says he has been swithering on this point—he has been trying to work out the correct place to draw the line. Given that we do not yet have a commitment for a standing committee—again, that is potentially being considered—we do not know how the legislation is going to work. Will the Minister, rather than accepting the amendment, give consideration to including the ability to make changes via secondary legislation so that there is individual criminal liability for different breaches? That would allow him the flexibility in the future, if the regime is not working appropriately, to add through secondary legislation individual criminal liability for breaches beyond those that are currently covered.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have not heard that idea suggested. I will think about it. I do not want to respond off the cuff, but I will give consideration to the proposal. Henry VIII powers, which are essentially what the hon. Lady is describing—an ability through secondary legislation effectively to change primary legislation—are obviously viewed askance by some colleagues if too wide in scope. We do use them, of course, but normally in relatively limited circumstances. Creating a brand new criminal offence via what amounts to a Henry VIII power would be quite a wide application of the power, but it is an idea that I am perfectly happy to go away and reflect on. I thank her for mentioning the idea.

A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.

Question put, That the amendment be made.

--- Later in debate ---
This matter is not addressed explicitly. We are concerned that companies might be able to cite competition worries to avoid considering that aspect of online abuse. That is unacceptable. We are also concerned that forthcoming changes to the online environment, such as the metaverse, will create new risks, such as the more seamless movement of abuse between different platforms.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I want to talk about a few different things relating to the amendments. Speaking from the Opposition Front Bench, the hon. Member for Pontypridd covered in depth amendment 20, which relates to being directed to other content. Although this seems like a small amendment, it would apply in a significant number of different situations. Particular mention was made of Discord for gaming, but also of things such as moving from Facebook to Messenger—all those different directions that can happen. A huge number of those are important for those who would seek to abuse children online by trying to move from the higher-regulation services or ones with more foot traffic to areas with perhaps less moderation so as to attack children in more extreme ways.

I grew up on the internet and spent a huge amount of time speaking to people, so I am well aware that people can be anyone they want to be on the internet, and people do pretend to be lots of different people. If someone tells us their age on the internet, we cannot assume that that is in any way accurate. I am doing what I can to imprint that knowledge on my children in relation to any actions they are taking online. In terms of media literacy, which we will come on to discuss in more depth later, I hope that one of the key things that is being told to both children and adults is that it does not matter if people have pictures on their profile—they can be anybody that they want to online and could have taken those pictures from wherever.

In relation to amendment 21 on collaboration, the only reasonable concern that I have heard is about an action taken by Facebook, which employed an outside company in the US to place stories in local newspapers about vile things that were supposedly happening on TikTok. Those stories were invented—they were made up—specifically to harm TikTok’s reputation. I am not saying for a second that collaboration is bad. The argument that some companies may make, that collaboration is bad because it causes them problems and their opponents may use it against them, actually proves the need for a regulator. The point of having a regulator is to ensure that any information sharing or collaboration that is required is done in such a way that, should a company decide to use it with malicious intent, the regulator can come down on them. The regulator ensures that the collaboration we need for emergent issues to be dealt with as quickly as possible happens in a way that does not harm people. If it does harm people, the regulator is there to take action.

I want to talk about amendments 25 and 30 on the production of images and child sexual abuse content. Amendment 30 should potentially have an “or” at the end rather than an “and”. However, I am very keen to support both of those amendments, and all the amendments relating to the production of child sexual abuse content. On the issues raised by the Opposition about livestreaming, for example, we heard two weeks ago about the percentage of self-generated child sexual abuse content. The fact is that 75% of that content is self-generated. That is absolutely huge.

If the Bill does not adequately cover the production of such content, whether by children and young people who have been coerced into producing it with their own cameras or in some other way, then it fails to adequately protect our children. Purely on the basis of that 75% statistic, which is so incredibly stark, it is completely reasonable that production is included. I would be happy to support the amendments in that regard; I think they are eminently sensible. Potentially, when the Bill was first written, production was not nearly so much of an issue. However, as things have moved on, it has become a huge issue and something that needs tackling. Like Opposition Members, I do not feel that the Bill covers production in as much detail as it should in order to provide protection for children.

Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

Amendment 10 would create a duty to publish the illegal content risk assessment, and proactively supply that to Ofcom. This is new legislation that is really a trial that will set international precedent, and a lot of the more prescriptive elements—which are necessary—are perhaps the most challenging parts of the Bill. The Minister has been very thoughtful on some of the issues, so I want to ask him, when we look at the landscape of how we look to regulate companies, where does he stand on transparency and accountability? How far is he willing to go, and how far does the Bill go, on issues of transparency? It is my feeling that the more companies are forced to publish and open up, the better. As we saw with the case of the Facebook whistleblower Frances Haugen, there is a lot to uncover. I therefore take this opportunity to ask the Minister how far the Bill goes on transparency and what his thoughts are on that.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.

Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. There are legitimate questions about comprehensive public disclosure and about getting to the heart of what is going on in these companies, in the way that Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

If the rules are so bad that people can circumvent them, they are not good enough anyway and they need to be updated, but I have a specific question on this. The Minister says that Ofcom will be taking in the biggest risk assessments, looking at them and ensuring that they are adequate. Will he please give consideration to asking Ofcom to publish the risk assessments from the very biggest platforms? Then they will all be in one place. They will be easy for people to find and people will not have to rake about in the bottom sections of a website. And it will apply only in the case of the very biggest, most at risk platforms, which should be regularly updating their risk assessments and changing their processes on a very regular basis in order to ensure that people are kept safe.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Lady for her intervention and for the—

Online Safety Bill (Fourth sitting)

Kirsty Blackman Excerpts
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 View all Online Safety Act 2023 Debates Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 26 May 2022 - (26 May 2022)
None Portrait The Chair
- Hansard -

I saw you nodding, Ms Perry. Do you wish to add anything?

Lynn Perry: I agree. The important thing, particularly from the perspective of Barnardo’s as a children’s charity, is the right of children to remain safe and protected online and in no way compromised by privacy or anonymity considerations online. I was nodding along at certain points to endorse the need to ensure that the right balance is struck for protections for those who might be most vulnerable.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Q Lynn, does the Bill ensure that children are kept as safe as possible online? If not, what improvements need to be made to it so that they are?

Lynn Perry: There are several things that we welcome as a children’s charity. One of them, age verification, has just been mentioned. We are particularly concerned and have written about children’s access to harmful and extreme pornography—they are sometimes only a couple of clicks away from harmful online commercial pornography—and we welcome the age-verification measures in the Bill. However, we are concerned about the length of time that it may take to implement those measures, during which children and young people will remain at risk and exposed to content that is potentially harmful to their development. We would welcome measures to strengthen that and to compel those companies to implement the measures earlier. If there were a commencement date for that, those provisions could be strengthened.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q How much of an impact will the Bill have on the likelihood of children being subjected to online grooming and predatory behaviour?

Lynn Perry: There are some contextual considerations that we have been reflecting on as a charity, influenced by what we have heard from children, young people, parents and carers. We know that more children have had access to digital devices and have spent more time online over the last couple of years in particular. In that sense, we are concerned that the Bill needs to be strengthened because of the volume of access, the age at which children and young people now access digital content, and the amount of time that they spend online.

There are some other contextual things in respect of grooming. We welcome the fact that offences are named on the face of the Bill, for example, but one of the things that is not currently included is the criminal exploitation of children. We think that there is another opportunity to name criminal exploitation, where young people are often targeted by organised criminal gangs. We have seen more grooming of that type during the pandemic period as offenders have changed the ways in which they seek to engage young people. That is another area that we would welcome some consideration of.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q In terms of online gaming, and predators moving children from more mainstream to less regulated platforms, do you think there are improvements in the Bill that relate to that, or do you think more can be done?

Lynn Perry: Grooming does happen within gaming, and we know that online video games offer some user-to-user interaction. Users sometimes have the ability to create content within platforms, which is in scope for the Bill. The important thing will be enforcement and compliance in relation to those provisions. We work with lots of children and young people who have been sexually exploited and abused, and who have had contact through gaming sites. It is crucial that this area is in focus from the perspective of building in, by design, safety measures that stop perpetrators being able to communicate directly with children.

Private messaging is another area for focus. We also consider it important for Ofcom to have regulatory powers to compel firms to use technology that could identify child abuse and grooming.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q If I could address one question to each witness, that would be fantastic. I do a lot of work with women in sport, including football. Obviously, we have the Women’s Euros coming up, and I have my Panini sticker album at the ready. Do you think the Bill could do more to address the pervasive issue of online threats of violence and abuse against women and girls, including those directed at women in sport, be they players, officials or journalists?

Sanjay Bhandari: I can see that there is something specific in the communications offences and that first limb around threatening communications, which will cover a lot of the things we see directed at female football pundits, like rape threats. It looks as though it would come under that. With our colleagues in other civil society organisations, particularly Carnegie UK Trust, we are looking at whether more should be done specifically about tackling misogyny and violence against women and girls. It is something that we are looking at, and we will also work with our colleagues in other organisations.

--- Later in debate ---
Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Poppy, do you have anything to add?

Poppy Wood: Yes. I think we could go much further on enforcement. One of the things that I really worry about is that if the platforms make an inadequate risk assessment, there is not much that Ofcom can do about it. I would really like to see powers for Ofcom to say, “Okay, your risk assessment hasn’t met the expectations that we put on you, so we want you to redo it. And while you’re redoing it, we may want to put you into a different category, because we may want to have higher expectations of you.” That way, you cannot start a process where you intentionally make an inadequate risk assessment in order to extend the process of you being properly regulated. I think that is one thing.

Then, going back to the point about categorisation, I think that Ofcom should be given the power to recategorise companies quickly. If you think that a category 2B company should be a category 1 company, what powers are there for Ofcom to do that? I do not believe that there are any for Ofcom to do that, certainly not to do it quickly, and when we are talking about small but high-risk companies, that is absolutely the sort of thing that Ofcom should be able to do—to say, “Okay, you are now acting like a category 1 company.” TikTok, Snapchat—they all started really small and they accelerated their growth in ways that we just could not have predicted. When we are talking about the emergence of new platforms, we need to have a regulator that can account for the scale and the pace at which these platforms grow. I think that is a place where I would really like to see Ofcom focusing.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I have a question for the Centre for Countering Digital Hate. I raised some of your stats on reporting with Meta—Facebook—when they were here, such as the number of reports that are responded to. They basically said, “This is not true any more; we’re now great”—I am paraphrasing, obviously. Could you please let us know whether the reporting mechanism on major platforms—particularly Facebook—is now completely fixed, or whether there are still lots of issues with it?

Eva Hartshorn-Sanders: There are still lots of issues with it. We recently put a report out on anti-Muslim hatred and found that 90% of the content that was reported was not acted on. That was collectively, across the platforms, so it was not just Facebook. Facebook was in the mid-90s, I think, in terms of its failure to act on that type of harmful content. There are absolutely still issues with it, and this regulation—this law—is absolutely necessary to drive change and the investment that needs to go into it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I have a quick question for Poppy, although I am afraid it might not have a quick answer. How much of an impact does the algorithmic categorisation of things—the way we are fed things on social media—have on our lives? Do you think it is steering people towards more and more extreme content? Or is it a totally capitalist thing that is not harmful, and just something that sells us things every so often?

Poppy Wood: I think it goes without saying that the algorithmic promotion of harmful content is one of the biggest issues with the model we have in big tech today. It is not the individual pieces of content in themselves that are harmful. It is the scale over which they spread out—the amplification of them; the targeting; the bombardment.

If I see one piece of flat-earth content, that does not necessarily harm me; I probably have other counter-narratives that I can explore. What we see online, though, is that if you engage with that one piece of flat-earth content, you are quickly recommended something else—"You like this, so you'll probably like that"—and then, before you know it, you are in a QAnon conspiracy theory group. I would absolutely say that the algorithmic promotion of harmful content is a real problem. Does that mean we ban algorithms? No. That would be like turning off the internet. You have to go back and ask how it is that that kind of harm is promoted, and how it is that human behaviour is being exploited. It is human nature to be drawn to things that we cannot resist. That is something that the Bill really needs to look at.

In the risk assessments, particularly for illegal content and content that is harmful to children, the Bill explicitly references algorithmic promotion and the business model. Those are two really big things that you touched on in the question. The business model is to make money from our time spent online, and the algorithms serve us up the content that keeps us online. That is accounted for very well in the risk assessments. The safety duties, though, do not necessarily account for it just because you have risk assessed for it: say you identify that your business model does promote harmful content; under the Bill, you do not have to mitigate that all the time. So I think there are questions around whether the Bill could go further on algorithmic promotion.

If you do not mind, I will quickly come back to the question you asked Eva about reporting. We just do not know whether reporting is really working because we cannot see—we cannot shine a light into these platforms. We just have to rely on them to tell us, “Hey, reporting is working. This many pieces of content were reported and this many pieces of content were taken down.” We just do not know if that is true. A big part of this regime has to be about transparency. It already is, but I think it could go much further in enabling Ofcom, Government, civil society and researchers to say, “Hey, you said that many pieces of content were reported and that many pieces of content were taken down, but actually, it turns out that none of that is true. We are still seeing that stuff online.” Transparency is a big part of the solution around understanding whether reporting is really working and whether the platforms are true to their word.
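
A purely illustrative sketch of the "you like this, so you'll probably like that" pattern Poppy Wood describes: a toy recommender built from co-engagement counts, in which one click on fringe content pulls related fringe content into the feed. The data, item names and function are invented for illustration and are not taken from any platform.

    # Toy co-engagement recommender: items most often engaged with by the
    # same users are recommended to each other. All data here is invented.
    from collections import Counter

    engagements = {                      # user -> items they engaged with
        "u1": ["flat_earth_video", "qanon_group"],
        "u2": ["flat_earth_video", "qanon_group"],
        "u3": ["flat_earth_video", "gardening_tips"],
    }

    def recommend(item, top_n=2):
        """Return the items most often co-engaged with `item` by other users."""
        counts = Counter()
        for items in engagements.values():
            if item in items:
                counts.update(i for i in items if i != item)
        return [i for i, _ in counts.most_common(top_n)]

    print(recommend("flat_earth_video"))   # ['qanon_group', 'gardening_tips']

Nothing in the toy model looks at what the content says; it only counts engagement, which is the point the witnesses make about amplification being content-neutral.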

Caroline Ansell Portrait Caroline Ansell (Eastbourne) (Con)
- Hansard - - - Excerpts

Q May I ask a follow-up question on that? Poppy, you referenced risk assessments. Would you value and welcome more specifics around quality standards and minimum requirements on risk assessments? My main question is about privacy and anonymity, but I would appreciate a word on risk assessments.

Poppy Wood: Absolutely. I know that children’s groups are asking for minimum standards for children’s risk assessments, but I agree that they should be across the board. We should be looking for the best standards that we can get. I really do not trust the platforms to do these things properly, so I think we have to be really tough with them about what we expect from them. We should absolutely see minimum standards.

Online Safety Bill (Second sitting)

Kirsty Blackman Excerpts
None Portrait The Chair
- Hansard -

One moment, please. I am conscious of the fact that we are going to run out of time. I am not prepared to allow witnesses to leave without feeling they have had a chance to say anything. Ms Foreman, Ms O’Donovan, is there anything you want to comment on from what you have heard so far? If you are happy, that is fine, I just want to make sure you are not being short-changed.

Becky Foreman: No.

Katie O'Donovan: No, I look forward to the next question.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Q Given the size of Facebook, a lot of our questions will be focused towards it—not that you guys do not have very large platforms, but the risks with social media are larger. You mentioned, Richard, that three in every 10,000 views are hate speech. If three in every 10,000 things I said were hate speech, I would be arrested. Do you not think that, given the incredibly high number of views there are on Facebook, there is much more you need to do to reduce the amount of hate speech?

Richard Earley: So, reducing that number—the prevalence figure, as we call it—is the goal that we set our engineers and policy teams, and it is what we are devoted to doing. On whether it is a high number, I think we are quite rare among companies of our size in providing that level of transparency about how effective our systems are, and so to compare whether the amount is high or low, you would require additional transparency from other companies. That is why we really welcome the part of the Bill that allows Ofcom to set standards for what kinds of transparency actually are meaningful for people.

We have alighted on the figure of prevalence, because we think it is the best way for you and the public to hold us to account for how we are doing. As I said, that figure of three in every 10,000 has declined from six in every 10,000 about 12 months ago. I hope the figure continues to go down, but it is not just a matter of what we do on our platform. It is about how all of us in society function and the regulations you will all be creating to help support the work we do.
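
The prevalence figure described here is a simple ratio: sampled views of violating content divided by total sampled views, scaled to a per-10,000 rate. A minimal sketch of that calculation follows; the function name and sample numbers are invented for illustration and are not Meta's actual methodology.

    # Illustrative prevalence calculation: violating views per 10,000 sampled views.
    def prevalence_per_10k(violating_views, total_sampled_views):
        """Return the estimated number of violating views per 10,000 content views."""
        if total_sampled_views == 0:
            raise ValueError("sample must contain at least one view")
        return 10_000 * violating_views / total_sampled_views

    print(prevalence_per_10k(3, 10_000))   # 3.0, the "three in every 10,000" figure
    print(prevalence_per_10k(6, 10_000))   # 6.0, the figure quoted for a year earlier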

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I would like to follow up with a question about responding to complaints. The complaints process is incredibly important. Reports need to be made and Facebook needs to respond to those reports. The Centre for Countering Digital Hate said that it put in 100 complaints and that 51 did not get any response from Facebook. It seems as though there is a systemic issue with a lack of response to complaints.

Richard Earley: I do not know the details of that methodological study. What I can tell you is that every time anyone reports something on Facebook or Instagram, they get a response into their support inbox. We do not put the response directly into your Messenger inbox or IG Direct inbox, because very often when people report something, they do not want to be reminded of what they have seen among messages from their friends and family. Unfortunately, sometimes people do not know about the support inbox and so they miss the response. That could be what happened there, but every time somebody reports something on one of our platforms, they get a response.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Does the response just say, “Thanks for your report”?

Richard Earley: No. I want to be very constructive here. I should say that some of the concerns that are raised around this date from several years ago. I will accept that five or 10 years ago, the experience on our platforms was not this comprehensive, but in the last few years, we have really increased the transparency we give to people. When you submit something and report it for a particular violation, we give you a response that explains the action we took. If we removed it, we would explain what piece of our community standards it broke. It also gives you a link to see that section of our policy so you can understand it.

That is one way we have tried to increase the transparency we give to users. I think there is a lot more we could be doing. I could talk about some of the additional transparency steps we are taking around the way that our algorithms recommend content to people. Those are, again, all welcome parts of the Bill that we look forward to discussing further with Ofcom.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q One of the things that has been recommended by a number of charities is increasing cross-platform and cross-company work to identify and take action on emerging threats. Do you think there would be the level of trust necessary for cross-platform co-operation with your competitors in the light of reports that, for example, Facebook employed somebody to put out negative things about TikTok in the US? Do you think that cross-platform working will work in that environment?

Richard Earley: Yes; I think it is already working, in fact. Others on the panel mentioned a few areas in which we have been collaborating in terms of open-sourcing some of the technology we have produced. A few years ago, we produced a grooming classifier—a technology that allows people to spot potentially inappropriate interactions between adults and children—and we open-sourced that and enabled it to be used and improved on by anyone else who is building a social media network.

A number of other areas, such as PhotoDNA, have already been mentioned. An obvious one is the Global Internet Forum to Counter Terrorism, which is a forum for sharing examples of known terrorist content so that those examples can be removed from across the internet. All those areas have been priorities for us in the past. A valuable piece of the Bill is that Ofcom—from what I can see from the reports that it has been asked to make—will do a lot of work to understand where there are further opportunities for collaboration among companies. We will be very keen to continue being involved in that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I have a question for Katie on the algorithms that produce suggestions when you begin to type. It has been raised with me and in the evidence that we have received that when you begin to type, you might get a negative suggestion. If somebody types in, “Jews are”, the algorithm might come up with some negative suggestions. What has Google done about that?

Katie O'Donovan: We are very clear that we want the auto-suggestion, as we call it, to be a helpful tool that helps you find the information that you are looking for quicker—that is the core rationale behind the search—but we really do not want it to perpetuate hate speech or harm for protected individuals or wider groups in society. We have changed the way that we use that auto-complete, and it will not auto-complete to harmful suggestions. That is a live process that we review and keep updated. Sometimes terminology, vernacular or slang change, or there is a topical focus on a particular group of people, so we keep it under review. But by our policy and implementation, those auto-suggestions should very much not be happening on Google search.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Would it be technically possible for all of the protected characteristics, for example, to have no auto-complete prompts come up?

Katie O'Donovan: That is an excellent question about where you do not want protections and safety measures to limit users or the information they can reach. Take the protected characteristic of Jewish people, for example: we see it as really important to remove the ability to auto-complete to hate speech. But if we removed every auto-completion, whether for a Jewish cookbook, Jewish culture or Jewish history, you would really minimise the amount of content that people could access.

The Bill is totally vital and will be incredibly significant for UK internet access, but that is where it is really important to get the balance and nuance right. Asking an algorithm to do something quite bluntly might look at first glance like it will really improve safety, but when you dig into it, you end up with the available information being much less sophisticated, less impactful and less full, which I think nobody really wants—either for the user or for those protected groups.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Would it not be easier to define all the protected characteristics and have a list of associated words than to define every possible instance of hate speech in relation to each?

Katie O'Donovan: The way we do it at the moment is through algorithmic learning. That is the most efficient way to do it because we have millions of different search terms, a large number of which we see for the very first time every single day on Google search. We rarely define things with static terms. We use our search rater guidelines—a guide of about 200 pages—to determine how those algorithms work and make sure that we have a dynamic ability to restrict them.

That means that you do not achieve perfection, and there will be changes and new topical uses that we perhaps did not anticipate—we make sure that we have enough data incoming to adjust to that. That is the most efficient way of doing it, and making sure that it has the nuance to stop the bad autocomplete but give access to the great content that we want people to get to.
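
As a hedged sketch of the difference described here between a static blocklist and a learned, dynamic restriction on auto-complete: the classifier below is a stub standing in for whatever scoring the real systems use, and the threshold, function names and example phrases are all invented.

    # Hypothetical sketch: gate auto-complete suggestions with a harm score
    # rather than a static banned-word list, so benign completions such as
    # "cookbook" survive while hateful ones are suppressed.
    def filter_suggestions(prefix, candidates, harm_score, threshold=0.8):
        """Keep only candidate completions whose harm score is below the threshold."""
        return [c for c in candidates if harm_score(prefix + " " + c) < threshold]

    def stub_harm_score(text):
        # Stand-in for a trained model; real systems learn these scores dynamically.
        return 0.95 if "are a threat" in text else 0.05

    print(filter_suggestions("jewish", ["cookbook", "history", "are a threat"],
                             stub_harm_score))
    # ['cookbook', 'history']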

None Portrait The Chair
- Hansard -

Thank you very much. Ms Foreman, do you want to add anything to that? You do not have to.

Becky Foreman: I do not have anything to add.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Ms Walker?

Janaya Walker: Some of these discussions almost reiterate what I was saying earlier about the problematic nature of this, in that so much of what companies are going to be directed to do will be tied only to the specific schedule 7 offences. There have been lots of discussions about how you respond to some harms that reach a threshold of criminality and others that do not, but that really contrasts with the best practice approach to addressing violence against women and girls, which is really trying to understand the context and all of the ways that it manifests. There is a real worry among violence against women and girls organisations about the minimal response to content that is harmful to adults and children, but will not require taking such a rigorous approach.

Having the definition of violence against women and girls on the face of the Bill allows us to retain those expectations on providers as technology changes and new forms of abuse emerge, because the definition is there. It is VAWG as a whole that we are expecting the companies to address, rather than a changing list of offences that may or may not be captured in criminal law.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Why is it important that we have this? Is this a big thing? What are you guys actually seeing here?

Jessica Eagelton: I can respond to that in terms of what we are seeing as a provider. Technology-facilitated domestic abuse is a growing form of domestic abuse: technology is providing perpetrators with ever more ways to abuse and harass survivors. What we are seeing on social media is constant abuse, harassment, intimate image abuse, monitoring and hacking of accounts. When it comes to the responses we are getting from platforms at the moment, while I acknowledge that there is some good practice, the majority experience of survivors is that platforms are not responding sufficiently to the tech abuse they are experiencing.

Our concern is that the Bill could be a really good opportunity for survivors of domestic abuse to have greater protections online that would mean that they are not forced to come offline. At the moment, some of the options being given to survivors are to block the perpetrator—which in some cases has a minimal impact when they can easily set up new fake accounts—or to come offline completely. First, that is not a solution for someone who should be able to maintain contact with others, stay online and take part in public debate. But secondly, it can actually escalate risk in some cases, because a perpetrator could resort to in-person forms of abuse. If we do not make some of these changes—I am thinking in particular about mandating a VAWG code of practice, and looking at schedule 7 and including controlling and coercive behaviour—the Bill is going to be a missed opportunity. Women and survivors have been waiting long enough, and we need to take this opportunity.

Janaya Walker: If I could add to that, as Jessica has highlighted, there is the direct harm to survivors in terms of the really distressing experience of being exposed to these forms of harm, or the harm they experience offline being exacerbated online, but this is also about indirect harm. We need to think about the ways in which the choices that companies are making are having an impact on the extent to which violence against women and girls is allowed to flourish.

As Jessica said, it impacts our ability to participate in online discourse, because we often see a mirroring online of what happens offline, in the sense that the onus is often on women to take responsibility for keeping themselves safe. That is the status quo we see offline, in terms of the decisions we make about what we are told to wear or where we should go as a response to violence against women and girls. Similarly, online, the onus is often on us to come offline or put our profiles on private, to take all those actions, or to follow up with complaints to various different companies that are not taking action. There is also something about the wider impact on society as a whole by not addressing this within the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q How does the proposed code of practice—or, I suppose, how could the Bill—tackle intersectionality of harms?

Janaya Walker: This is a really important question. We often highlight the fact that, as I have said, violence against women and girls often intersects with other forms of discrimination. For example, we know from research that EVAW conducted with Glitch during the pandemic that black and minoritised women and non-binary people experience a higher proportion of abuse. Similarly, research done by Amnesty International shows that black women experience harassment at a rate 84% higher than that experienced by their white counterparts. It is a real focal point. When we think about the abuse experienced, we see the ways that people’s identities are impacted and how structural discrimination emerges online.

What we have done with the code of practice is try to introduce requirements for the companies to think about things through that lens, so having an overarching human rights and equalities framework and having the Equality Act protected characteristics named as a minimum. We see in the Bill quite vague language when it comes to intersectionality; it talks about people being members of a certain group. We do not have confidence that these companies, which are not famed for their diversity, will interpret that in a way that we regard as robust—thinking very clearly about protected characteristics, human rights and equalities legislation. The vagueness in the Bill is quite concerning. The code of practice is an attempt to be more directive on what we want to see and how to think through issues in a way that considers all survivors, all women and girls.

Professor Clare McGlynn: I wholly agree. The code of practice is one way by which we can explain in detail those sorts of intersecting harms and what companies and platforms should do, but I think it is vital that we also write it into the Bill. For example, on the definitions around certain characteristics and certain groups, in previous iterations reference was made to protected characteristics. I know certain groups can go wider than that, but naming those protected characteristics is really important, so that they are front and centre and the platforms know that that is exactly what they have to cover. That will cover all the bases and ensure that that happens.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I have a quite specific question on something that is a bit tangential.

None Portrait The Chair
- Hansard -

Last one, please.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q If someone has consented to take part in pornography and they later change their mind and would like it to be taken down, do you think they should have the right to ask a porn website, for example, to take it down?

Professor Clare McGlynn: That is quite challenging not only for pornography platforms but for sex workers, in that if you could participate in pornography but at any time thereafter withdraw your consent, it is difficult to understand how a pornography company and the sex worker would be able to make a significant amount of money. The company would be reluctant to invest because it might have to withdraw the material at any time. In my view, that is quite a challenge. I would not go down that route, because what it highlights is that the industry can be exploitative and that is where the concern comes from. I think there are other ways to deal with an exploitative porn industry and other ways to ensure that the material online has the full consent of participants. You could put some of those provisions into the Bill—for example, making the porn companies verify the age and consent of those who are participating in the videos for them to be uploaded. I think that is a better way to deal with that, and it would ensure that sex workers themselves can still contract to perform in porn and sustain their way of life.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you very much—this is extremely interesting and helpful. You have covered a lot of ground already, but I wonder whether there is anything specific you think the Bill should be doing more about, to protect girls—under-18s or under-16s—in particular?

Janaya Walker: A lot of what we have discussed in terms of naming violence against women and girls on the face of the Bill includes children. We know that four in five offences of sexual communications with a child involved girls, and a lot of child abuse material is targeted at girls specifically. The Bill as a whole takes a very gender-neutral approach, which we do not think is helpful; in fact, we think it is quite harmful to trying to reduce the harm that girls face online.

This goes against the approach taken in the Home Office violence against women and girls strategy and its domestic abuse plan, as well as the gold-standard treaties the UK has signed up to, such as the Istanbul convention, which we signed and have recently committed to ratifying. The convention states explicitly that domestic laws, including on violence against women and girls online, need to take a very gendered approach. Currently, it is almost implied, with references to specific characteristics. We think that in addressing the abuse that girls, specifically, experience, we need to name girls. To clarify, the words “women”, “girls”, “gender” and “sex” do not appear in the Bill, and that is a problem.

Jessica Eagelton: May I add a point that is slightly broader than your question? Another thing that the Bill does not do at the moment is provide for specialist victim support for girls who are experiencing online abuse. There has been some discussion about taking a “polluter pays” approach; where platforms are not compliant with the duties, for example, a percentage of the funds that go to the regulator could go towards victim support services, such as the revenge porn helpline and Refuge’s tech abuse team, that provide support to victims of abuse later on.

Professor Clare McGlynn: I can speak to pornography. Do you want to cover that separately, or shall I do that now?

--- Later in debate ---
Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

Thank you.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q My first question is for Lulu. Do small tech companies have enough staff with technical expertise to be able to fulfil their obligations under the Bill?

Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert their staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, to try to understand the Bill as it is currently drafted—there are lots of gaps—they are coming to us and saying, "What does this mean in practice?" They do not have the answers, or the capability to identify that. The attendant regulatory costs are really fundamental: thinking about the staff you have and what they cost, about the need to divert people away from business development or whatever other work the business might be doing, and about making sure the regulation is proportionate to that.

Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.

Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.

The real challenge comes when someone labels themselves as a small business but they have millions of users across the globe—and sometimes actually quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to be spending an awful lot of money, when the small companies are actually directly competing with them. There is a challenge to understanding the definition of a small business and whether that is revenue-focused, employee-focused or about how many users it has—there may be other metrics.

Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there are a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small business.

If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Adam, you said a few moments ago that companies are starting to put safety at the core of what they do, which will be welcome to us all—maybe it should have happened a lot earlier. I know you have worked a lot in that area. Regulators and company owners will have to depend on an ethical culture in their organisations if they are going to abide by the new regulations, because they cannot micromanage and regulators cannot micromanage. Will the Bill do enough to drive that ethical culture? If not, what more could it do or could the industry do? I would be really interested in everybody’s answer to this one, but I will start with Adam.

Adam Hildreth: What we are seeing from the people that are getting really good at this and that really understand it is that they are treating this as a proper risk assessment, at a very serious level, across the globe. When we are talking about tier 1s, they are global businesses. When they do it really well, they understand risk and how they are going to roll out systems, technology, processes and people in order to address that. That can take time. Yes, they understand the risk, who it is impacting and what they are going to do about it, but they still need to train people and develop processes and maybe buy or build technology to do it.

We are starting to see that work being done really well. It is done almost in the same way that you would risk assess anything else: corporate travel, health and safety in the workplace—anything. It should really become one of those pillars. All those areas I have just gone through are regulated. Once you have regulation there, it justifies why someone is doing a risk assessment, and you will get businesses and corporates going through that risk assessment process. We are seeing others that do not do the same level of risk assessment and they do not have that same buy-in.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I have three Members and the Minister to get in before 5 o’clock, so I urge brief questions and answers please.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Is it technically possible—I do not need to know how—to verify the age of children who are under 16, for example?

Dr Rachel O'Connell: Yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q So technology exists out there for that to happen.

Dr Rachel O'Connell: Yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Once we have the verification of those ages, do you think it would be possible or desirable to limit children’s interactions to only with other children? Is that the direction you were going in?

Dr Rachel O'Connell: I will give an example. If you go to an amusement park, kids who are below four feet, for example, cannot get on the adult rides, so the equivalent would be that they should not be on an 18-plus dating site. The service can create it at a granular level so the kids can interact with kids in the same age group or a little bit older, but they can also interact with family. You can create circles of trust among verified people.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

For a game like Roblox, which is aimed at kids—it is a kids platform—if you had the age verification and if that worked, you could have a situation where a 13-year-old on Roblox could only interact with children who are between 12 and 14. Does the technology exist to make that work?

Dr Rachel O'Connell: You could do. Then if you were using it in esports or there was a competition, you could broaden it out. The service can set the parameters, and you can involve the parents in making decisions around what age bands their child can play with. Also, kids are really into esports and that is their future, so there are different circumstances and contexts that the technology could enable.
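
A minimal sketch of the kind of age-band rule described in this answer, assuming both ages have already been verified upstream; the band width, default values and family allow-list are invented for illustration.

    # Hypothetical check: a verified child may interact only with users whose
    # verified age falls within a configurable band, or with approved family.
    def can_interact(child_age, other_age, band=1, other_is_family=False):
        """Allow contact if the other user is approved family or within +/- band years."""
        if other_is_family:
            return True
        return abs(child_age - other_age) <= band

    print(can_interact(13, 12))                        # True  (within the 12-14 band)
    print(can_interact(13, 35))                        # False (adult, not family)
    print(can_interact(13, 35, other_is_family=True))  # True  (verified family member)

The band and the family list would be the parameters the service, and potentially the parents, set, as the answer above suggests.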

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Finally, do you think it would be desirable for Ofcom to consider a system with more consistency in parental controls, so that parents can always ensure that their children cannot talk to anybody outside their circle? Would that be helpful?

Dr Rachel O'Connell: There is a history of parental controls, and only 36% of parents use them. Ofcom research consistently says that it is 70%, but in reality, it is lower. When using age verification, the parents are removing the ability to watch everything. It is a platform; they are providing the digital playground. In the same way, when you go on swings and slides, there is bouncy tarmac because you know the kids are going to use them. It is like creating that health and safety environment in a digital playground.

When parents receive a notification that their child wants to access something, there could be a colour-coded nutrition-style thing for social media, livestreaming and so on, and the parents could make an informed choice. It is then up to the platform to maintain that digital playground and run those kinds of detection systems to see if there are any bad actors in there. That is better than parental controls because the parent is consenting and it is the responsibility of the platform to create the safer environment. It is not the responsibility of the parent to look over the child’s shoulder 24/7 when they are online.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q The age verification stuff is really interesting, so thank you to our witnesses. On violence against women and girls, clauses 150 to 155 set out three new communications offences. Do you think those offences will protect women from receiving offensive comments, trolling and threats online? What will the Bill mean for changing the way you manage those risks on your platforms?

Jared Sine: I do not know the specific provisions but I am familiar with the general concept of them. Any time you put something in law, it can either be criminalised or have enforcement behind it, and I think that helps. Ultimately, it will be up to the platforms to come up with innovative technologies or systems such as “Are You Sure?” and “Does This Bother You?” which say that although the law says x, we are going to go beyond that to find tools and systems that make it happen on our platform. Although I think it is clearly a benefit to have those types of provisions in law, it will really come down to the platforms taking those extra steps in the future. We work with our own advisory council, which includes the founder of the #MeToo movement, REIGN and others, who advise us on how to make platforms safer for those things. That is where the real bread gets buttered, so to speak.

--- Later in debate ---
Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q So what needs to change in the Bill to make sure that happens? I am not clear.

Susie Hargreaves: We just want to make sure that the ability to scan in an end-to-end encrypted environment is included in the Bill in some way.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q The ability to scan is there right now—we have got that—so you are just trying to make sure we are standing still, basically. Am I correct in my understanding?

Susie Hargreaves: I think with technology you can never stand still. We do not know what is coming down the line. We have to deal with the here and now, but we also need to be prepared to deal with whatever comes down the line. The answer, “Okay, we will just get people to report,” is not a good enough replacement for the ability to scan for images.

When the privacy directive was introduced in Europe and Facebook stopped scanning for a short period, we lost millions of images. What we know is that we must continue to have those safety mechanisms in place. We need to work collectively to do that, because it is not acceptable to lose millions of images of child sexual abuse and create a forum where people can safely share them without any repercussions, as Rhiannon says. One survivor we talked to in this space said that one of her images had been recirculated 70,000 times. The ability to have a hash of a unique image, go out and find those duplicates and make sure they are removed means that people are not re-victimised on a daily basis. That is essential.
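
A simplified sketch of the hash-matching idea described here. Production systems use perceptual hashes such as PhotoDNA, which tolerate resizing and re-encoding; the cryptographic hash below only catches byte-identical copies, but the matching logic against a list of known hashes is the same. The hash set and function names are placeholders, not the IWF's actual interface.

    # Simplified duplicate detection against a set of known image hashes.
    import hashlib

    def sha256_of(data):
        return hashlib.sha256(data).hexdigest()

    def is_known_image(data, known_hashes):
        """Return True if this exact image already appears in the known-hash set."""
        return sha256_of(data) in known_hashes

    # In practice the hash set would be supplied by a body such as the IWF;
    # this placeholder holds the hash of an empty byte string so the example runs.
    known_hashes = {sha256_of(b"")}
    print(is_known_image(b"", known_hashes))                 # True  (exact duplicate)
    print(is_known_image(b"new image bytes", known_hashes))  # False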

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Focusing on how to prevent grooming behaviour, does the Bill have enough in place to protect children from conversations that they may have with adults, or from facing grooming behaviour online?

Rhiannon-Faye McDonald: There is one specific point that I would like to raise about this. I am concerned about private communications. We know that many offenders identify and target children on more open platforms, and then very quickly move them to more private platforms to continue the grooming and abuse. We were very pleased to see that private communications were brought in scope. However, there is a difficulty in the code of practice. When that is drafted, Ofcom is not going to be able to require proactive tools to be used to identify grooming and abuse. That includes things like PhotoDNA and image-based and text-based classifiers.

So although we have tools that we can use currently, which can identify conversations where grooming is happening, we are not going to be using those immediately on private platforms, on private communications where the majority of grooming is going to happen. That means there will be a delay while Ofcom establishes that there is a significant problem with grooming on the platform and then issues a notice to require those tools to be used.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q You mentioned the reporting mechanisms that are in place, Susie. Yes, they are not the only tool, and should not be the only tool—many more things should be happening—but are the reporting mechanisms that will be in place once the Bill has come in and is being embedded sufficient, or do the requirements for platforms to have reporting mechanisms need to be improved as well?

Susie Hargreaves: An awful lot of work has already gone into this over the past few years. We have been working closely with Departments on the draft code of practice. We think that, as it stands, it is in pretty good shape. We need to work more closely with Ofcom as those codes are developed—us and other experts in the field. Again, it needs to be not too directive, in the sense that we do not want to limit people, and it needs to remain workable when technology changes in the future. It is looking in the right shape, but of course we will all be part of the consultation and of the development of those practices as they go. It requires people to scan their networks, to check for child sexual abuse and—I guess for the first time, the main thing—to report on it. It is going to be a regulated thing. In itself, that is a huge development, which we very much welcome.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I have one last question. Rhiannon, a suggestion was made earlier by Dr Rachel O’Connell about age verification and only allowing children to interact with other children whose age is verified within a certain area. Do you think that would help to prevent online grooming?

Rhiannon-Faye McDonald: It is very difficult. While I feel strongly about protecting children from encountering perpetrators, I also recognise that children need to have freedoms and the ability to use the internet in the ways that they like. I think if that was implemented and it was 100% certain that no adult could pose as a 13-year-old and therefore interact with actual 13-year-olds, that would help, but I think it is tricky.

Susie Hargreaves: One of the things we need to be clear about, particularly where we see children groomed—we are seeing younger and younger children—is that we will not ever sort this just with technology; the education piece is huge. We are now seeing children as young as three in self-generated content, and we are seeing children in bedrooms and domestic settings being tricked, coerced and encouraged into engaging in very serious sexual activities, often using pornographic language. Actually, a whole education piece needs to happen. We can put filters and different technology in place, but remember that the IWF acts after the event—by the time we see this, the crime has been committed, the image has been shared and the child has already been abused. We need to bump up the education side, because parents, carers, teachers and children themselves have to be able to understand the dangers of being online and be supported to build their resilience online. They are definitely not to be blamed for things that happen online. From Rhiannon's own story, how quickly it can happen, and how vulnerable children are at the moment—I don't know.

Rhiannon-Faye McDonald: For those of you who don’t know, it happened very quickly to me, within the space of 24 hours, from the start of the conversation to the perpetrator coming to my bedroom and sexually assaulting me. I have heard other instances where it has happened much more quickly than that. It can escalate extremely quickly.

Just to add to Susie's point about education, I strongly believe that education plays a huge part in this. However, we must be very careful in how we educate children, so that the focus is not on how to keep themselves safe, because that puts the responsibility on them, which in turn increases the feelings of responsibility when things do go wrong. That increased feeling of responsibility makes it less likely that they will disclose that something has happened to them, because they feel that they will be blamed. It will decrease the chance that children will tell us that something has happened.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Just to follow up on a couple of things, mainly with Susie Hargreaves. You mentioned reporting mechanisms and said that reporting will be a step forward. However, the Joint Committee on the draft Bill recommended that the highest-risk services should have to report quarterly data to Ofcom on the results of their child sexual exploitation and abuse removal systems. What difference would access to that kind of data make to your work?

Susie Hargreaves: We already work with the internet industry. They currently take our services and we work closely with them on things such as engineering support. They also pay for our hotline, which is how we find child sexual abuse. However, the difference it would make is that we hope then to be able to undertake work where we are directly working with them to understand the level of their reports and data within their organisations.

At the moment, we do not receive that information from them. It is very much that we work on behalf of the public and they take our services. However, if we were suddenly able to work directly with them—have information about the scale of the issue within their own organisations and work more directly on that—then that would help to feed into our work. It is a very iterative process; we are constantly developing the technology to deal with the current threats.

It would also help us by giving us more intelligence and by allowing us to share that information, on an aggregated basis, more widely. It would certainly also help us to understand that they are definitely tackling the problem. We do believe that they are tackling the problem, because it is not in their business interests not to, but it just gives a level of accountability and transparency that does not exist at the moment.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q So as a result of these exemptions, the Bill as it stands could make the internet less safe than it currently is.

Kyle Taylor: The Bill as it stands could absolutely make the internet less safe than it currently is.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q You have done a really good job of explaining the concerns about journalistic content. Thinking about the rest of the Bill for a moment, do you think the balance between requiring the removal of content and the prioritisation of content is right? Do you think it will be different from how things are now? Do you think there is a better way it could be done in the Bill?

Ellen Judson: The focus at the moment is too heavily on content. There is a sort of tacit equation of content removal—sometimes content deprioritisation, but primarily content removal—as the way to protect users from harm, and as the threat to freedom of expression. That is where the tension comes in with how to manage both those things at once. What we would want from a Bill that was taking more of a systems approach is thinking: where are platforms making decisions about how they are designing their services, and how they are operating their services at all levels? Content moderation policy is certainly included, but it goes back to questions of how a recommendation algorithm is designed and trained, who is involved in that process, and how human moderators are trained and supported. It is also about what functionality users are given and what behaviour is incentivised and encouraged. There is a lot of mitigation that platforms can put in place that does not talk about directly affecting user content.

I think we should have risk assessments that focus on the risks of harms to users, as opposed to the risk of users encountering harmful content. Obviously there is a relationship, but one piece of content may have very different effects when it is encountered by different users. It may cause a lot of harm to one user, whereas it may not cause a lot of harm to another. We know that when certain kinds of content are scaled and amplified, and certain kinds of behaviour are encouraged or incentivised, we see harms at a scale that the Bill is trying to tackle. That is a concern for us. We want more of a focus on some things that are mentioned in the Bill—business models, platform algorithms, platform designs and systems and processes. They often take a backseat to the issues of content identification and removal.

Kyle Taylor: I will use the algorithm as an example, because this word flies around a lot when we talk about social media. An algorithm is a calculation that is learning from people’s behaviour. If society is racist, an algorithm will be racist. If society is white, an algorithm will be white. You can train an algorithm to do different things, but you have to remember that these companies are for-profit businesses that sell ad space. The only thing they are optimising for in an algorithm is engagement.

What we can do, as Ellen said, through a system is force optimisation around certain things, or drive algorithms away from certain types of content, but again, an algorithm is user-neutral. An algorithm does not care what user is saying what; it is just “What are people clicking on?”, regardless of what it is or who said it. An approach to safety has to follow the same methodology and say, “We are user-neutral. We are focused entirely on propensity to cause harm.”

The second piece is all the mitigation measures you can take once a post is up. There has been a real binary of “Leave it up” and “Take it down”, but there is a whole range of stuff—the most common word used is “friction”—to talk about what you can do with content once it is in the system. You have to say to yourself, “Okay, we absolutely must have free speech protections that exceed the platform’s current policies, because they are not implemented equally.” At the same time, you can preserve someone’s free expression by demonetising content to reduce the incentive of the company to push that content or user through its system. That is a way of achieving both a reduction in harm and the preservation of free expression.
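
A toy sketch of the "friction" point made here: an engagement-optimised ranking score is scaled down in proportion to a harm estimate, so flagged content stays up and visible but is de-amplified and, by extension, demonetised. Every weight, field name and score below is invented.

    # Toy ranker: engagement score scaled down by a friction factor applied to
    # content flagged as likely harmful, de-amplifying rather than removing it.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_engagement: float   # e.g. expected clicks or shares
        harm_likelihood: float        # 0.0 to 1.0, from some upstream classifier

    def ranking_score(post, friction=0.9):
        """Engagement score reduced in proportion to predicted harm."""
        return post.predicted_engagement * (1.0 - friction * post.harm_likelihood)

    posts = [
        Post("holiday photos", predicted_engagement=5.0, harm_likelihood=0.0),
        Post("borderline conspiracy post", predicted_engagement=9.0, harm_likelihood=0.8),
    ]
    for p in sorted(posts, key=ranking_score, reverse=True):
        print(round(ranking_score(p), 2), p.text)
    # 5.0 holiday photos
    # 2.52 borderline conspiracy post (still available, just not amplified)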

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

May I just ask one more question, Chair?

None Portrait The Chair
- Hansard -

Briefly, because there are two other Members and the Minister wishing to ask questions.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Thanks. On the propensity to cause harm, we heard earlier that a company might create a great new feature and put it out, but then there is a period—a lag, if you like—before they realise the harm that is being caused. Do you trust that companies would have the ability to understand in advance of doing something what harm it may cause, and adequately to assess that?

Ellen Judson: I think there are a lot of things that companies could be doing. Some of these things are in research that they probably are conducting. As we have seen from the Facebook files, companies are conducting that sort of research, but we aren't privy to the results. I think there are a couple of things we want to see. First, we want companies to have to be more transparent about what kind of testing they have done, or, if not testing, about who they have consulted when designing these products. Are they consulting human rights experts? Are they consulting people who are affected by identity-based harm, or are they just consulting their shareholders? Even that would be a step in the right direction, and that is why it is really important.

We feel that there need to be stronger provisions in the Bill for independent researcher and civil society access to data. Companies will be able to do certain amounts of things, and regulators will have certain powers to investigate and do their own research, but it requires the added efforts of civil society properly to hold companies to account for the effects of certain changes they have made—and also to help them in identifying what the effects of those changes to design have been. I think that is really crucial.

None Portrait The Chair
- Hansard -

We are playing “Beat the clock”. I am going to ask for brief answers and brief questions, please. I will take one question from Kim Leadbeater and one from Barbara Keeley.

Online Safety Bill (First sitting)

Kirsty Blackman Excerpts
None Portrait The Chair
- Hansard -

Thank you. I intend to bring in the Minister at about 10 o’clock. Kirsty Blackman, Kim Leadbeater and Dean Russell have indicated that they wish to ask questions, so let us try to keep to time.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Q I have a few questions, but I will ask them in a short way, and hopefully the witnesses can answer them in a fairly short way too. The chief executive of Ofcom told the Joint Committee on the draft Bill that the Secretary of State’s powers were extremely open ended. You have already touched on this, but do you feel that this will impede Ofcom’s independence as a regulator?

Kevin Bakhurst: There is a particular area, the power for the Secretary of State to direct us on codes for reasons of public policy, that we have some concern about. It is more a concern about practicality than independence, but clearly for the platforms, and we have had a lot of discussions with them, the independence of a regulator—that is, of a regulatory regime that is essentially about content—is absolutely critical, and it is a priority for us to show that we are independent.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Do you feel that the Bill will allow you to adequately regulate online gaming, which is how an awful lot of young people use the internet, in a way that will keep them safer than they currently are?

Richard Wronka: Yes, we fully anticipate that gaming services, and particularly the messaging functionality that is often integrated into those services, will be captured within the scope of the regime. We do think that the Bill, on the whole, gives us the right tools to regulate those services.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q My last question is about future-proofing the Bill. Obviously, an awful lot of things will happen in the online world that do not currently happen there, and some of those we cannot foresee. Do you think the Bill is wide enough and flexible enough to allow changes to be made so that new and emerging platforms can be regulated?

Kevin Bakhurst: Overall, we feel that it is. By and large, the balance between certainty and flexibility in the Bill is probably about right and will allow some flexibility in future, but it is very hard to predict what other harms may emerge. We will remain as flexible as possible.

Richard Wronka: There are some really important updating tools in the Bill. The ability for the Secretary of State to introduce new priority harms or offences—with the approval of Parliament, of course—is really important.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that a code attempting to cover every single harm could be counterproductive and confusing for platforms, even those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage of all the issues that everyone is rightly concerned about, but to do so in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this; ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area but, as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

--- Later in debate ---
Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Is there capacity in the sector to deliver what you are talking about?

Dame Rachel de Souza: I think we need to make capacity. There is some—the NSPCC has its Childline and, as Children’s Commissioner, I have my own advocacy service for children in care. I think this should function in that way, with direct access. So I think that we can create it.

Andy Burrows: May I come in briefly? Our proposals for user advocacy reflect the clear “polluter pays” principle that we think should apply here—that the levy covering the direct costs of regulation should also fund really effective user advocacy, to help build and scale up that capacity. That is really important not only to give victims what they need in frontline services, but to ensure that there is a strong counterbalance to some of the largest companies in the world for our sector, which has clear ambition but self-evident constraints.

Dame Rachel de Souza: One of the concerns that has come to me from children—I am talking about hundreds of thousands of children—over the past year is that there is not strong enough advocacy for them and that their complaints are not being met. Girls in particular, following the Everyone’s Invited concerns, have tried so hard to get images down. There is this almost medieval bait-out practice of girls’ images being shared right across platforms. It is horrendous, and the tech firms are not acting quickly enough to get those down. We need proper advocacy and support for children, and I think that they would expect that of us in this groundbreaking Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q There has not been a huge amount of discussion of online gaming in the context of the Bill, despite the fact that for many young people that is the way in which they interact with other people online. Do you think the Bill covers online gaming adequately? A lot of interaction in online gaming is through oral communication—voice chat messages. Do you think that it is possible to properly regulate oral communications in gaming?

Dame Rachel de Souza: Good question. I applaud the Bill for what it does cover. We are looking at a Bill that, for the first time, is going to start protecting children’s rights online, so I am really pleased to see that. We have looked a bit at gaming in the past. In terms of harms, obviously the Bill does not cover gaming in full, but it does cover the safety aspects of children’s experience.

It is always good for us to be looking further. Gaming, we know, raises some extremely harmful and quite distinct issues, particularly around money and the potential for grooming and other safety risks. In terms of communications, one of the reasons I am so concerned about encryption and communications online is that so much of it happens through gaming. We need to make sure that those elements are really firm.

Andy Burrows: It is vitally important that the gaming sector is in scope. We know that there are high-risk gaming sites—for example, Twitch—and gaming-adjacent services such as Discord. To go back to my earlier point about the need for cross-platform provisions to apply here, in gaming we can see grooming pathways that take on a different character from those on social networks—for example, abuse pathways where grooming is taking place simultaneously across a game streaming service and a gaming-adjacent platform such as Discord, rather than sequentially. It is very important that the regulator is equipped to understand the dynamics of those harms and how they may apply differently on gaming services. That is a very strong and important argument for user advocacy.

I would say a couple of things on oral communications. One-to-one oral communications are excluded from the Bill’s scope—legitimately—but we should recognise that there is a grooming risk there, particularly when that communication is embedded in a platform with wider functionality. There is an argument for a platform having to consider all aspects of its functionality within the risk assessment process. Proactive scanning is a different issue.

There is a broader challenge for the Bill, and this takes us back to the fundamental objectives and the very welcome design based around systemic risk identification and mitigation. We know that right now, in respect of oral communications and livestream communications, the industry response is not as developed in terms of detecting and disrupting harm as it is for, say, text-based chat. In keeping with the risk assessment process, it should be clear that if platforms want to offer that functionality, they should have to demonstrate through the risk assessment process that they have high-quality, effective arrangements in place to detect and disrupt harm, and that should be the price of admission. If companies cannot demonstrate that, they should not be offering their services, because there is a high risk to children.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Do you think it would be reasonable for gaming companies in particular to have a setting whereby children or young people can choose to interact only with people in their friends list? Would that be helpful?

Andy Burrows: I think that aspect is certainly worthy of consideration, because the key objective is that platforms should be incentivised to deliver safety by design initiatives. One area in the Bill that we would like to be amended is the user empowerment mechanism. That gives adults the ability to screen out anonymous accounts, for example, but those provisions do not apply to children. Some of those design features that introduce friction to the user experience are really important to help children, and indeed parents, have greater ownership of their experience.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Finally, could you explain breadcrumbing a little further? What does it mean and does the Bill tackle it adequately?

Andy Burrows: Child abuse breadcrumbing is a major area of concern for us. The term captures a range of techniques whereby abusers are able to use social networks to facilitate the discovery and the dissemination of child sexual abuse. The activity does not meet the criminal threshold in and of itself, but it effectively enables abusers to use online services as a shop window to advertise their sexual interest in children.

I will give a couple of fairly chilling examples of what I mean by that. There is a phenomenon called “tribute sites”. Abusers open social media accounts in the guise of well-known survivors of child sexual abuse. To all of us in this room, that would look perfectly innocuous, but if you are an abuser, the purpose of those accounts is very clear. In the first quarter of last year, those types of accounts received 6 million interactions.

Another example is Facebook groups. We have seen evidence of Facebook refusing to take down groups that have a common interest in, for example, children celebrating their 8th, 9th and 10th birthdays. That is barely disguised at all; we can all see what the purpose is. Indeed, Facebook’s algorithms can see the purpose there, because research has shown that, within a couple of hours of use of the service, the algorithms identify the common characteristic of interest, which is child sexual abuse, and then start recommending accounts in multiple other languages.

We are talking about a significant way in which abusers are able to organise abuse and migrate it to encrypted chat platforms, to the dark web, and to offender fora, where it is, by definition, much harder to catch that activity, which happens after harm has occurred—after child abuse images have been circulated. We really want breadcrumbing to be brought unambiguously into the scope of the Bill. That would close off tens of millions of interactions with accounts that go on to enable abusers to discover and disseminate material and to form offender networks.

We have had some good, constructive relationships with the Home Office in recent weeks. I know that the Home Office is keen to explore how this area can be addressed, and it is vital that it is addressed. If we are going to see the Bill deliver the objective of securing a really effective upstream response, which I think is the clear legislative ambition, this is an area where we really need to see the Bill be amended.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q You mostly talked about Facebook. Is it mostly the largest social media platforms, or are we talking about some of the smaller ones, such as Discord, which you mentioned? Would you like to see those in scope as well, or is it just the very biggest ones?

Andy Burrows: Those provisions should apply broadly, but it is a problem that we see particularly on those large sites because of the scale and the potential for algorithmic amplification.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q I want to ask about the many tragic cases of teenagers who have died by suicide after viewing self-harm material online. Do you think coroners have sufficient powers to access digital data after the death of a child, and should parents have the right to access their children’s digital data following their death?

Dame Rachel de Souza: Baroness Kidron has done some fantastic work on this, and I really support her work. I want to tell you why. I am a former headteacher—I worked for 30 years in schools as a teacher and headteacher. Only in the last five or six years did I start seeing suicides of children and teenagers; I did not see them before. In the year just before I came to be Children’s Commissioner, there was a case of a year 11 girl from a vulnerable family who had a relationship with a boy, and it went all over the social media sites. She looked up self-harm material, went out to the woods and killed herself. She left a note that basically said, “So there. Look what you’ve done.”

It was just horrendous, having to pick up the family and the community of children around her, and seeing the long-term effects of it on her siblings. We did not see things like that before. I am fully supportive of Baroness Kidron and 5Rights campaigning on this issue. It is shocking to read about the enormous waiting and wrangling that parents must go through just to get their children’s information. It is absolutely shocking. I think that is enough from me.

Andy Burrows: I absolutely agree. One of the things we see at the NSPCC is the impact on parents and families in these situations. I think of Ian Russell, whose daughter Molly took her own life, and the extraordinarily protracted process it has taken to get companies to hand over her information. I think of the anguish and heartbreak that comes with this process. The Bill is a fantastic mechanism to be able to redress the balance in terms of children and families, and we would strongly support the amendments around giving parents access to that data, to ensure that this is not the protracted process that it currently all too often is.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Very briefly, Dame Rachel, I will build on what you were just saying, based on your experience as a headteacher. When I make my school visits, the teachers overwhelmingly tell me how, on a daily basis, they have to deal with the fallout from an issue that has happened online or on social media. On that matter, the digital media literacy strategy is being removed from the Bill. What is your thinking on that? How important do you see a digital media literacy strategy being at the heart of whatever policy the Government try to make regarding online safety for children?

Dame Rachel de Souza: There is no silver bullet. This is now a huge societal issue and I think that some of the things that I would want to say would be about ensuring that we have in our educational arsenal, if you like, a curriculum that has a really strong digital media literacy element. To that end, the Secretary of State for Education has just asked me to review how online harms and digital literacy are taught in schools—reviewing not the curriculum, but how good the teaching is and what children think about how the subject has been taught, and obviously what parents think, too.

I would absolutely like to see the tech companies putting some significant funding into supporting education of this kind; it is exactly the kind of thing that they should be working together to provide. So we need to look at this issue from many aspects, not least education.

Obviously, in a dream world I would like really good and strong digital media literacy in the Bill, but actually it is all our responsibility. I know from my conversations with Nadhim Zahawi that he is very keen that this subject is taught through the national curriculum, and very strongly.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Can I start by thanking the NSPCC and you, Dame Rachel, and your office for the huge contribution that you have made to the Bill as it has developed? A number of changes have been made as a result of your interventions, so I would just like to start by putting on the record my thanks to both of you and both your organisations for the work that you have done so far.

Could you outline for the Committee the areas where you think the Bill, as currently drafted, contains the most important provisions to protect children?

Dame Rachel de Souza: I was really glad to see, in the rewrite of the Online Safety Bill, a specific reference to the role of age assurance to prevent children from accessing harmful content. That has come across strongly from children and young people, so I was very pleased to see that. It is not a silver bullet, but for too long children have been using entirely inappropriate services. The No. 1 recommendation from the 16 to 21-year-olds, when asked what they wish their parents had known and what we should do, was age assurance, if you are trying to protect a younger sibling or are looking at children, so I was pleased to see that. Companies cannot hope to protect children if they do not know who the children are on their platforms, so I was extremely pleased to see that.

--- Later in debate ---
Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

Q In terms of the timing, once the Bill comes into law, there may be a period before it is enforced while everything is set up. Are both your platforms already gearing up to make sure you fulfil the requirements of the Bill from day one?

Katy Minshall: I am glad you asked that question. The problem with the Bill is that it depends on so many things that do not exist yet. We are looking at the Bill and thinking about how we can prepare and what is necessary, but in practice, content that is harmful to adults and harmful to children has not been set out yet. So much of the Bill depends on secondary legislation and codes of practice, and as I described earlier in response to the question from Alex Davies-Jones, there are such real workability questions around exemptions and ID verification that I worry there is a risk of substantial delays at the other end, which I do not think anyone wants to see.

Ben Bradley: It is the same from our perspective. We have our community guidelines and we are committed to enforcing those at the moment. A lot of the detail of the Bill will be produced in Ofcom’s codes of practice but I think it is important we think about operationalising the process, what it looks like in practice and whether it is workable.

Something Katy mentioned—the user empowerment duties, how prescriptive they would be and how they would work, not just for the platforms of today but for those of the future—is really important. To use a similar example on the user empowerment duties: on TikTok, the intent is that you discover content from all over the world. When you open the app, you are recommended content from all sorts of users, and there is no expectation that those users will be verified. If you have opted in to this proposed user empowerment duty, there is a concern that it could exacerbate the risk of filter bubbles, because you would only be receiving content from users within the UK who have verified themselves, and we work very hard to make sure there is a diverse range of recommendations. I think it is a fairly easy fix. Much as elsewhere in the Bill, where Ofcom has flexibility about whether to require specific recommendations, it could have that flexibility in this case as well, considering whether this type of power works for these types of platform.

To use the example of the metaverse, how would it work once the metaverse is up and running? The whole purpose of the metaverse is a shared environment in which users interact, and because the Bill is so prescriptive at the minute about how this user empowerment duty needs to be achieved, it is not clear, if you were verified and I were unverified and you had opted not to see my content but I moved something in the shared environment, like this glass, whether that would move for everyone. It is a small point, but it just goes to the prescriptiveness of how it is currently drafted and the importance of giving Ofcom the flexibility that it has elsewhere in the Bill, but in this section as well.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I have a few questions, starting with Twitter, in relation to young people using the platform. How do you currently make sure that under-13s do not use the platform? What actions do you take to ensure that happens? Going forward, will that change?

Katy Minshall: At present, we follow the industry standard of age self-declaration. How you manage and verify identity—whether using a real-name system or emerging technologies like blockchain or documentation—is at the heart of a range of industries, not just ours.

Technology will change and new products that we cannot even envisage today will come on to the market. In terms of what we would do in relation to the Bill, as I said, until we see the full extent of the definitions and requirements, we cannot really say what exact approach we would take.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q To follow up on that, you said that there is agreement internally and externally that your service is mostly used by over-18s. Does that mean that you do not think you will have a responsibility to undertake the child safety duties?

Katy Minshall: My understanding of the Bill is that if there is a chance a young person could access your service, you would be expected to undertake the child safety duties, so my understanding is that that would be the case.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Okay. Ben, for TikTok, how do you currently ensure that under-13s are not using your service, and how is that likely to change with the Bill coming in?

Ben Bradley: We are a strictly 13-plus platform. There are basically two approaches to preventing under-age access to our platform. The first is preventing under-13s from signing up. We are 12+ rated in the app stores, so if you have parental controls on those app stores, you cannot download the app. We also have a neutral age gate, which I think is similar to Twitter’s. We do not ask people to confirm whether they are over 13—we do not ask them to tick a box; instead we ask them to enter their date of birth. If they enter a date of birth that shows they are under 13, they are blocked from re-entering a date of birth, so they cannot just keep trying. We do not say that it is because they are under age; we just say, “TikTok isn’t right for you right now.” That is the first step.

Secondly, we proactively surface and remove under-age users. Whenever a piece of content is reported on TikTok, for whatever reason, the moderator will look at two things: the reason why it was reported and also whether the user is under 13. They can look at a range of signals to do that. Are they wearing a school uniform? Is there a birthday cake in their biography? Do they say that they are in a certain year of school? They can use those signals.

We actually publish every quarter how many suspected under-13s we remove from our platform. I think we are currently the only company to publish that on a quarterly basis, but we think it is important to be transparent about how we are approaching this, to give a sense of the efficacy of our interventions.

On what specifically might change, that is not clear; obviously, we have to wait for further guidance from Ofcom. However, we did carry out research last year with parents and young people in five countries across Europe, including the UK, where we tested different ideas of age assurance and verification, trying to understand what they would like to see. There was not really a single answer that everyone could get behind, but there were concerns raised around data protection and privacy if you were handing over this type of information to the 50 or 60 apps that might be on your phone.

One idea, which people generally thought was a good one, was that when you first get a device and first sign into the app store, you would verify your age there, and then that app store on that device could then pass an additional token to all the apps on your phone suggesting that you are of a certain age, so that we could apply an age-appropriate experience. Obviously that would not stop us doing everything that we currently do, but I think that would be a strong signal. If that were to move forward, we would be happy to explore that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Both of your sites work very heavily on algorithms for the content that is put towards people. If you are in the top tweets feed on Twitter, you get algorithmically derived or chosen content, and TikTok is even more heavily involved in algorithms. How will this Bill affect the algorithms that you use, particularly regarding some of the content that may get more and more extreme, for example, if people are going down that route? In terms of the legal but harmful stuff that is likely to come through, how will the Bill affect the algorithms that you use, and is it possible to do that? Does it work?

Ben Bradley: TikTok does not take a filter bubble approach. When you first open the app, you express areas of content that you are interested in and then we recommend content. Because it is short-form, the key to TikTok’s success is sending you diverse content, which allows you to discover things that you might never have previously expressed interest in. I use the example of Nathan Evans, a postman who went on to have a No. 1 song with “Wellerman”, or even Eurovision, for example. These are things that you would not necessarily express interest in, but when they are recommended to you, you are engaged. Because it is short-form content, we cannot show you the same type of material over and over again—you would not be interested in seeing 10 30-second videos on football, for example. We intentionally try to diversify the feed to express those different types of interests.

Katy Minshall: Our algorithms down-rank harmful content. If you want to see an example live on Twitter, send a tweet that gets loads of replies, and a chunk of them are automatically hidden at the bottom in a “view more replies” section. Our algorithm works in other ways as well to down-rank content that could be violating our rules. We also endeavour to amplify credible content. In the explore tab, which is the magnifying glass, we will typically be directing you to credible sources of information—news websites and so on.

In terms of how the Bill would affect that, my main hope is that the codes of practice go beyond a leave-up-or-take-down binary and beyond content moderation, and think about the role of algorithms. At present on Twitter, you can turn the algorithm off in the top right-hand corner of the app, via the sparkle icon. In the long term, I think what we will be aiming for is a choice in the range of algorithms that you could use on services like Twitter. I would hope that the code of practice enables that and does not preclude it as a solution to some of the legal but harmful content we may have in mind.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q Just one more question. We know that women and minorities face more abuse online than men do. Is that something that you have found in your experience, particularly Twitter? What are you doing to ensure that the intersectionality of harms is considered in the work that you are doing to either remove or downgrade content?

Katy Minshall: That is absolutely the case and it has been documented by numerous organisations and research. Social media mirrors society and society has the problems you have just described. In terms of how we ensure intersectionality in our policies and approaches, we are guided by our trust and safety council, which is a network of dozens of organisations around the world, 10 of which are here in the UK, and which represents different communities and different online harms issues. Alongside our research and engagement, the council ensures that when it comes to specific policies, we are constantly considering a range of viewpoints as we develop our safety solutions.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you, Chair, and thank you to the witnesses. I share your concerns about the lack of clarity regarding the journalistic content and democratic content exemptions. Do you think those exemptions should be removed entirely, or can you suggest what we might do to make them clearer in the Bill?

Katy Minshall: At the very least, there must be tighter definitions. I am especially concerned when it comes to the news publisher exemption. The Secretary of State has indicated an amendment that would mean that services like Twitter would have to leave such content up while an appeals process is ongoing. There is no timeline given. The definition in the Bill of a news publisher is, again, fairly vague. If Ben and I were to set up a news website, nominally have some standards and an email address where people could send complaints, that would enable it to be considered a news publisher under the Bill. If we think about some of the accounts that have been suspended from social media over the years, you can absolutely see them creating a news website and saying, “I have a case to come back on,” to Twitter or TikTok or wherever it may be.

Ben Bradley: We share those concerns. There are already duties to protect freedom of expression in clause 19. Those are welcome. It is the breadth of the definition of journalistic and democratic content that is a concern for us, particularly when it comes to things like the expedited and dedicated appeals mechanism, which those people would be able to claim if their content was removed. We have already seen people like Tommy Robinson on the far right present themselves as journalists or citizen journalists. Giving them access to a dedicated and expedited appeals mechanism is an area of concern.

There are different ways you could address that, such as greater clarity in those definitions and removing subjective elements. At the minute, the test is whether a user considers their content to be journalistic; it is not an objective criterion, but a matter of the user’s own belief about their content.

Also, if you look at something like the dedicated and expedited appeals mechanism, could you hold that in reserve, so that if a platform were found to be failing in its duties to journalistic content or in its freedom of expression duties, Ofcom could say, as it can in other areas of the Bill, “Okay, we believe that you need to create this dedicated mechanism, because you have failed to protect those duties”? That would, I think, minimise the risk of that mechanism being exploited.