The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 28 June 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
09:25
The Chair

Good morning, ladies and gentlemen. Please be kind enough to make sure that your mobile phones are switched off.

New Clause 4

Duty to disclose information to OFCOM

“(1) This section sets out the duties to disclose information to OFCOM which apply in relation to all regulated user-to-user services.

(2) A regulated user-to-user service must disclose to OFCOM anything relating to that service of which that regulator would reasonably expect notice.

(3) This includes —

(a) any significant changes to its products or services which may impact upon its performance of its safety duties;

(b) any significant changes to its moderation arrangements which may impact upon its performance of its safety duties;

(c) any significant breaches in respect of its safety duties.”—(Barbara Keeley.)

This new clause creates a duty to disclose information to Ofcom.

Brought up, and read the First time.

Barbara Keeley (Worsley and Eccles South) (Lab)

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. The new clause would require regulated companies to disclose proactively to the regulator material changes in their operations that may impact on safety, and any significant breaches of their safety duties. Category 1 services should be under regulatory duties to disclose proactively to the regulator matters about which it could reasonably expect to be informed. For example, companies should notify Ofcom about significant changes to their products and services, or to their moderation arrangements, that may impact on the child abuse threat and the company’s response to it. A similar proactive duty already applies in the financial services sector. The Financial Conduct Authority handbook states:

“A firm must deal with its regulators in an open and cooperative way, and must disclose to the FCA appropriately anything relating to the firm of which that regulator would reasonably expect notice.”

The scope of the duty we are suggesting could be drawn with sufficient clarity so that social media firms properly understand their requirements and companies do not face unmanageable reporting burdens. Such companies should also be subject to red flag disclosure requirements, whereby they would be required to notify the regulator of any significant lapses in, or changes to, systems and processes that compromise children’s safety or could put them at risk. For example, if regulation had been in place over the last 12 months, Facebook might reasonably have been expected to report on the technology and staffing issues to which it attributes its reduced detection of child abuse content.

Experience from the financial services sector demonstrates the importance of disclosure duties as a means of regulatory intelligence gathering. Perhaps more importantly, they provide a useful means of hard-wiring regulatory compliance into company decisions on the design and operation of their sites.

Kirsty Blackman (Aberdeen North) (SNP)

Thank you for chairing this meeting, Sir Roger. I have a quick question for the Minister that relates to the new clause, which is a reasonable request for a duty on providers to disclose information to Ofcom. We would hope that the regulator had access to that information, and if companies are making significant changes, it is completely reasonable that they should have to tell Ofcom.

I do not have any queries or problems with the new clause; it is good. My question for the Minister is—I am not trying to catch anyone out; I genuinely do not know the answer—if a company makes significant changes to something that might impact on its safety duties, does it have to do a new risk assessment at that point, or does it not have to do so until the next round of risk assessments? I do not know the answer, but it would be good if the direction of travel was that any company making drastic changes that massively affected security—for example, Snapchat turning on the geolocation feature when it did an update—would have to do a new risk assessment at that point, given that significant changes would potentially negatively impact on users’ safety and increase the risk of harm on the platform.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure, as always, to serve under your chairmanship, Sir Roger. As the hon. Member for Worsley and Eccles South said, the new clause is designed to introduce a duty on providers to notify Ofcom of anything of which Ofcom would reasonably expect notice.

The Bill already has extremely strong information disclosure provisions. I particularly draw the Committee’s attention to clause 85, which sets out Ofcom’s power to require information by provision of an information notice. If Ofcom provides an information notice—the particulars of which are set out in clause 86—the company has to abide by that request. As the Committee will recall, the strongest sanctions are reserved for the information duties, extending not only to fines of up to 10% or service discontinuation—unplugging the website, as it were—but also to personal criminal liability for named executives, with prison sentences of up to two years. We take those information duties extremely seriously, which is why the sanctions are as strong as they are.

The hon. Member for Aberdeen North asked what updates would occur if there were a significant design change. I draw the Committee’s attention to clause 10, which deals with children’s risk assessment duties, but there are similar duties in relation to illegal content and the safety of adults. The duty set out in clause 10(2), which cross-refers to schedule 3, makes this clear: the relevant words are “suitable and sufficient”. Clearly, if there were a massive design change that would, in this case, adversely affect children, the risk assessment would not be suitable and sufficient if it were not updated to reflect that design change. I hope that answers the hon. Lady’s question.

Turning to the particulars of the new clause, if we incentivise companies to disclose information they have not been asked for by Ofcom, there is a danger that they might, through an excessive desire to comply, over-disclose and provide a torrent of information that would not be very helpful. There might also be a risk that some companies that are not well intentioned would deliberately dump enormous quantities of data in order to hide things within it. The shadow Minister, the hon. Member for Worsley and Eccles South, mentioned an example from the world of financial services, but the number of companies potentially within the scope of the Bill is so much larger than even the financial services sector. Some 25,000 companies may be in scope, a number that is much larger—probably by one order of magnitude, and possibly by two—than the financial services sector regulated by the FCA. That disparity in scale makes a significant difference.

Given that there are already strong information provision requirements in the Bill, particularly clause 85, and because of the reasons of scale that I have mentioned, I will respectfully resist the new clause.

Barbara Keeley

We believe that the platforms need to get into disclosure proactively, and that this is a reasonable clause, so we will push it to a vote.

Question put, That the clause be read a Second time.

Division 53

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 5
Duty to distinguish paid-for advertisements
“(1) A provider of a Category 2A service must operate the service using systems and processes designed to clearly distinguish to users of that service paid-for advertisements from all other content appearing in or via search results of the service.
(2) The systems and processes described under subsection (1)—
(a) must include clearly displaying the words “paid-for advertisement” next to any paid-for advertisement appearing in or via search results of the service, and
(b) may include measures such as but not limited to the application of colour schemes to paid-for advertisements appearing in or via search results of the service.
(3) The reference to paid-for advertisements appearing “in or via search results of a search service” does not include a reference to any advertisements appearing as a result of any subsequent interaction by a user with an internet service other than the search service.
(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.
(5) The duties set out in this section extend to the design, operation and use of a Category 2A service that hosts paid-for advertisements targeted at users of that service in the United Kingdom.
(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).
(7) For the meaning of “paid-for advertisement”, see section 189 (interpretation: general).”—(Alex Davies-Jones.)
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 54

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 6
Duty to verify advertisements
“(1) A provider of a Category 2A service must operate an advertisement verification process for any relevant advertisement appearing in or via search results of the service.
(2) In this section, “relevant advertisement” means any advertisement for a service or product to be designated in regulations made by the Secretary of State.
(3) The verification process under subsection (1) must include a requirement for advertisers to demonstrate that they are authorised by a UK regulatory body.
(4) In this section, “UK regulatory body” means a UK regulator responsible for the regulation of a particular service or product to be designated in regulations made by the Secretary of State.
(5) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.
(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).
(7) Regulations under this section shall be made by statutory instrument.
(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”—(Alex Davies-Jones.)
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 55

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 7
Report on duties to protect content of democratic importance and journalistic content
“(1) The Secretary of State must publish a report which—
(a) reviews the extent to which Category 1 services have fulfilled their duties under—
(i) Clause 15; and
(ii) Clause 16;
(b) analyses the effectiveness of Clauses 15 and 16 in protecting against—
(i) foreign state actors;
(ii) extremist groups and individuals; and
(iii) sources of misinformation and disinformation.
(2) The report must be laid before Parliament within one year of this Act being passed.”—(Alex Davies-Jones.)
This new clause would require the Secretary of State to publish a report reviewing the effectiveness of Clauses 15 and 16.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 56

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 8
OFCOM’s guidance about user identity verification
“(1) OFCOM must produce guidance for providers of Category 1 services on how to comply with the duty set out in section 57(1).
(2) In producing the guidance (including revised or replacement guidance), OFCOM must have regard to—
(a) ensuring providers offer forms of identity verification which are likely to be accessible to vulnerable adult users and users with protected characteristics under the Equality Act 2010,
(b) promoting competition, user choice, and interoperability in the provision of identity verification,
(c) protection of rights, including rights to privacy, freedom of expression, safety, access to information, and the rights of children,
(d) alignment with other relevant guidance and regulation, including with regards to Age Assurance and Age Verification.
(3) In producing the guidance (including revised or replacement guidance), OFCOM must set minimum standards for the forms of identity verification which Category 1 services must offer, addressing—
(a) effectiveness,
(b) privacy and security,
(c) accessibility,
(d) time-frames for disclosure to Law Enforcement in case of criminal investigations,
(e) transparency for the purposes of research and independent auditing,
(f) user appeal and redress mechanisms.
(4) Before producing the guidance (including revised or replacement guidance), OFCOM must consult—
(a) the Information Commissioner,
(b) the Digital Markets Unit,
(c) persons whom OFCOM consider to have technological expertise relevant to the duty set out in section 57(1),
(d) persons who appear to OFCOM to represent the interests of users including vulnerable adult users of Category 1 services, and
(e) such other persons as OFCOM considers appropriate.
(5) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Alex Davies-Jones.)
This new clause would require Ofcom to set a framework of principles and minimum standards for the User Verification Duty.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 57

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 9
Risk assessments: submission to OFCOM and publication
“Whenever a Category 1 service carries out any risk assessment pursuant to Part 3 of this Act, the service must—
(a) submit the risk assessment to OFCOM; and
(b) publish the risk assessment on the service’s website.”—(Barbara Keeley.)
This new clause requires any risk assessment carried out by a Category 1 service under Part 3 to be submitted to Ofcom and published.
Brought up, and read the First time.
Barbara Keeley

I beg to move, That the clause be read a Second time.

Throughout these debates, it has been clear that both sides agree that the Online Safety Bill must establish a regime that promotes the highest levels of transparency. This will ensure that platforms can be held accountable for their systems and processes. Like other regulated industries, they must be open and honest with the regulator and the public about how their products work and how they keep users safe.

As we know, platforms duck and dive to avoid sharing information that could make life more difficult for them or cast them in a dim light. The Bill must give them no opportunity to shirk their responsibilities. The Bill enables the largest platforms to carry out a risk assessment safe in the knowledge that it may never see the light of day. Ofcom can access such information if it wants, but only following a lengthy process and as part of an investigation. This creates no incentive for platforms to carry out thorough and proper risk assessments. Instead, platforms should have to submit these risk assessments to Ofcom not only on request but as a matter of course. Limiting this requirement to only the largest platforms will not overload Ofcom, but will give it the tools and information it needs to oversee an effective regime.

In addition, the public have a right to know the risk profile of the services they use. This happens in all other regulated industries, with consumers having easy access to the information they need to make informed decisions about the products they use. At present, the Bill does not give users the information they deserve about what to expect online. Parents in particular will be empowered by information about the risk level of platforms their children use. Therefore, it is imperative that risk assessments are made publicly available, as well as submitted to the regulator as a matter of course.

Kirsty Blackman

I have a couple of comments on the point about parental empowerment. I have been asked by my children for numerous apps. I have a look at them and think, “I don’t know anything about this app. I have never seen or heard of it before, and I have no idea the level of user-to-user functionality in this app.” Nowhere is there a requirement for this information to be set out. There is nowhere that parents can easily find this information.

With iPhones, if a kid wants an app, they have to request it from their parent, and the parent needs to approve whether or not they get it. I find myself baffled by some of them because they are not ones that I have ever heard of or come across. To find out whether they have that level of functionality, I have to download and use the app myself in the way that, hopefully, my children would use it, in order to work out whether it is safe for them.

A requirement for category 1 providers to be up front and explain the risks and how they manage them, and even how people interact with their services, would increase the ability of parents to be media literate. We can be as media literate as we like, but if the information is not there and we cannot find it anywhere, we end up having to make incredibly restrictive decisions in relation to our children’s ability to use the internet, which we do not necessarily want to make. We want them to be able to have fun, and the information being there would be very helpful, so I completely agree on that point.

My other point is about proportionality. The Opposition moved new clause 4, relating to risk assessments, and I did not feel able to support it on the basis of the arguments that the Minister made about proportionality. He made the case that Ofcom would receive 25,000 risk assessments and would be swamped by the number that it might receive. This new clause balances that, and has the transparency that is needed.

It is completely reasonable for us to put the higher burden of transparency on category 1 providers and not on other providers because they attract the largest market share. A huge percentage of the risk that might happen online happens with category 1 providers, so I am completely happy to support this new clause, which strikes the right balance. It answers the Minister’s concerns about Ofcom being swamped, because only category 1 providers are affected. Asking those providers to put the risk assessment on their site is the right thing to do. It will mean that there is far more transparency and that people are better able to make informed decisions.

Chris Philp

I understand the intention behind the new clause, but I want to draw the Committee’s attention to existing measures in the Bill that address this matter. I will start with the point raised by the hon. Member for Aberdeen North, who said that as a parent she would like to be able to see a helpful summary of what the risks are prior to her children using a new app. I am happy to say to her that that is already facilitated via clause 13(2), which appears at the top of page 13. There is a duty there

“to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service”,

including the levels of risk, and the nature and severity of those risks. That relates specifically to adults, but there is an equivalent provision relating to children as well.

Kirsty Blackman

I just gently say that if there is a requirement for people to sign up or begin to go through the sign-up process in order to see the terms of service, that is not as open and transparent. That is much more obstructive than it could be. A requirement for providers to make their terms of service accessible to any user, whether or not they were registered, would assist in the transparency.

Chris Philp

I think the terms of service are generally available to be viewed by anyone. I do not think people have to be registered users to view the terms of service.

In addition to the duty to summarise the findings of the most recent risk assessment in relation to adults in clause 13(2), clause 11 contains obligations to specify in the terms of service, in relation to children, where children might be exposed to risks using that service. I suggest that a summary in the terms of service, which is an easy place to look, is the best way for parents or anybody else to understand what the risks are, rather than having to wade through a full risk assessment. Obviously, the documents have not been written yet, because the Bill has not been passed, but I imagine they would be quite long and possibly difficult to digest for a layperson, whereas a summary is more readily digestible. Therefore, I think the hon. Lady’s request as a parent is met by the duties set out in clause 11, and the duties for adults are set out in clause 13.

09:44
On transparency and disclosure more generally, beyond the summaries that will be published, I would point to the transparency duties in clause 64, which we have discussed previously. Ofcom must specify what it requires to be published publicly and the platforms will then have to comply with that. That is a good mechanism for Ofcom to force publication of what it thinks needs to be brought into the light of day to meet the wider public interest, and the interests of users and parents. I hope that I have set out how, in clauses 11, 13 and 64, the transparency and disclosure obligations are met. In addition, clause 136 will require Ofcom to produce a report about providing researchers with access to information, which is important.
So what are the issues with the new clause? First, for the reasons that I have set out, the Bill already addresses the point. However, exposing the entire risk assessment publicly also carries some risks itself. For example, if the risk assessment identifies weaknesses or vulnerabilities in the service—ways that malfeasant people could exploit it to get at children or do something else that we would consider harmful—then exposing to everybody, including bad actors, the ways of beating the system and doing bad things on the service would not necessarily be in the public interest. A complete disclosure could help those looking to abuse and exploit the systems. That is why the transparency duties in clause 64 and the duties to publish accessible summaries in clauses 11 and 13 meet the objectives—the quite proper objectives—of the shadow Minister, the hon. Member for Worsley and Eccles South, and the hon. Member for Aberdeen North, without running the risks that are inherent in new clause 9, which I would therefore respectfully and genuinely resist.
Barbara Keeley

The Minister seems to be resisting so many measures that have been put forward that would improve transparency, particularly by making information publicly available. As I made clear, the public have a right to know the risk profile of the services they use. We have debated this issue reasonably exhaustively now. Therefore, I will press the new clause to a Division.

Question put, That the clause be read a Second time.

Division 58

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 10
Special circumstances
“(1) This section applies where OFCOM has reasonable grounds for believing that circumstances exist that present a threat—
(a) to the health or safety of the public, or
(b) to national security.
(2) OFCOM may, in exercising their media literacy functions, give priority for a specified period to specified objectives designed to address the threat presented by the circumstances mentioned in subsection (1).
(3) OFCOM may give a public statement notice to—
(a) a specified provider of a regulated service, or
(b) providers of regulated services generally.
(4) A “public statement notice” is a notice requiring a provider of a regulated service to make a publicly available statement, by a date specified in the notice, about steps the provider is taking in response to the threat presented in the circumstances mentioned in subsection (1).
(5) OFCOM may, by a public statement notice or a subsequent notice, require a provider of a regulated service to provide OFCOM with such information as they may require for the purpose of responding to that threat.
(6) If OFCOM takes any of the steps set out in this Chapter, they must publish their reasons for doing so.
(7) In subsection (2) “media literacy functions” means OFCOM’s functions under section 11 of the Communications Act (duty to promote media literacy), so far as functions under that section relate to regulated services.”—(Alex Davies-Jones.)
This new clause gives Ofcom the power to take particular steps where it considers that there is a threat to the health and safety of the public or to national security, without the need for a direction from the Secretary of State.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 59

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 12
Secretary of State’s powers to suggest modifications to a code of practice
“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.
(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.
(3) The Secretary of State may only write to OFCOM twice under this section for each code.
(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.
(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”—(Alex Davies-Jones.)
This new clause gives the Secretary of State powers to suggest modifications to a code of practice, as opposed to the powers of direction proposed in clause 40.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 60

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 13
Liability for companies associated with regulated services
“(1) A relevant regulated entity (“C”) is liable for penalties set out in the Bill where a person or company (“A”) associated with C and considered by a user to be a component of C does not comply with the duties established in the Bill.
(2) Subsection (1) applies whether or not C has made A aware of the duties established in the Bill.
(3) But it is a defence for C to prove that C had in place adequate procedures designed to prevent persons associated with C from undertaking such conduct.
(4) In this section a “relevant regulated entity” means a regulated service as defined in section 3(4) of this Act.
(5) For the purposes of this section, A is associated with C if A is a person who performs services for or on behalf of C notwithstanding—
(a) the capacity in which A performs services for or on behalf of C;
(b) whether or not A is an employee, agent or subsidiary of C.
(6) Whether or not A is a person who performs services for or on behalf of C is to be determined by reference to all the relevant circumstances and not merely by reference to the nature of the relationship between A and C.
(7) If A is an employee of C, it is to be presumed unless the contrary is shown that A is a person who performs services for or on behalf of C.”—(Alex Davies-Jones.)
Brought up, and read the First time.
Alex Davies-Jones (Pontypridd) (Lab)

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. As my hon. Friend the Member for Worsley and Eccles South mentioned when speaking to new clause 11, Labour has genuine concerns about supply chain risk assessment duties. That is why we have tabled new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties, drawing on existing legislation.

As we know, platforms, particularly those supporting user-to-user generated content, often employ services from third parties. At our evidence sessions we heard from Danny Stone of the Antisemitism Policy Trust that this has included Twitter explaining that racist GIFs were not its own but were provided by another service. The hands-off approach that platforms have managed to get away with for far too long is exactly what the Bill is trying to fix, yet without this important new clause we fear there will be very little change.

We have already raised issues with the reliance on third party providers more widely, particularly content moderators, but the same problems also apply to some types of content. Labour fears a scenario in which a company captured by the regulatory regime established by the Bill will argue that an element of its service is not within the ambit of the regulator simply because it is part of a supply chain, represented by, but not necessarily the responsibility of, the regulated services.

The contracted element, supported by an entirely separate company, would argue that it is providing business-to-business services. That is not user-to-user generated content per se but content designed and delivered at arm’s length, provided to the user-to-user service to deploy to its users. The result would likely be a lengthy, costly and unhelpful legal process during which systems could not be effectively regulated. The same may apply in relation to moderators, where complex contract law would need to be invoked.

We recognise that in UK legislation there are concerns and issues around supply chains. The Bribery Act 2010, for example, says that a company is liable if anyone performing services for or on the company’s behalf is found culpable of specific actions. We therefore strongly urge the Minister to consider this new clause. We hope he will see the extremely compelling reasons why liability should be introduced for platforms failing to ensure that associated parties, considered to be a part of a regulated service, help to fulfil and abide by relevant duties.

Chris Philp

The new clause seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill. The provider would be liable regardless of whether it has any control over the service in question. We take the view this would impose an unreasonable burden on businesses and cause confusion over which companies are required to comply with the duties in the Bill.

As drafted, the Bill ensures legal certainty and clarity over which companies are subject to duties. Clause 180 makes it clear that the Bill’s duties fall on companies with control over the regulated service. The point about who is in control is very important, because the liability should follow the control. These companies are responsible for ensuring that any third parties, such as contractors or individuals involved in running the service, are complying with the Bill’s safety duties, so that they cannot evade their duties in that way.

Companies with control over the regulated service are best placed to keep users safe online, assess risk, and put in place systems and processes to minimise harm, and therefore bear the liability if there is a transgression under the Bill as drafted. Further, the Bill already contains robust provisions in clause 161 and schedule 14 that allow Ofcom to hold parent and subsidiary companies jointly liable for the actions of other companies in a group structure. These existing mechanisms promote strong compliance within groups of companies and ensure that the entities responsible for breaches are the ones held responsible. That is why we feel the Bill as drafted achieves the relevant objectives.

Question put, That the clause be read a Second time.

Division 61

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 14
Duty to promote media literacy: regulated user-to-user services and search services
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;
(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;
(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—
(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;
(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;
(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub paras (2)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);
(e) to promote better coordination within the media literacy sector.
(4) OFCOM may prepare such guidance about the matters referred to in subsection (2) as it considers appropriate.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”—(Alex Davies-Jones.)
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
Brought up, and read the First time.
Alex Davies-Jones

I beg to move, That the clause be read a Second time.

The Chair

With this it will be convenient to discuss the following:

New clause 15—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause requires Ofcom to publish a strategy related to their duty to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 16—Media literacy strategy: progress report

“(1) OFCOM must report annually on the delivery of the strategy required under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The report must include—

(a) a description of the steps taken in accordance with the strategy during the year to which the report relates; and

(b) an assessment of the extent to which those steps have had an effect on the media literacy of the public in that year.

(3) The assessment referred to in subsection (2)(b) must be made in accordance with the approach set out by OFCOM in the strategy (see section (Duty to promote media literacy: regulated user-to-user services and search services) (2)(d).

(4) OFCOM must—

(a) publish the progress report in such manner as they consider appropriate; and

(b) send a copy of the report to the Secretary of State who must lay the copy before Parliament.”

This new clause is contingent on NC15.

Alex Davies-Jones

The UK has a vast media literacy skills and knowledge gap, which leaves the population at risk of harm. Indeed, research from Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information. Similarly, about 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so.

Good media literacy is our first line of defence against bad information online. It can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and wellbeing, social cohesion and democracy. Clause 103 of the draft Bill proposed a new media literacy duty for Ofcom to replace the one in section 11 of the Communications Act 2003, but sadly the Government scrapped it from the final Bill.

Media literacy initiatives in the Online Safety Bill are now mentioned only in the context of risk assessments, but there is no active requirement for internet companies to promote media literacy. The draft Bill’s media literacy provision needed to be strengthened, not cut. New clauses 14, 15 and 16 would introduce a new, stronger media literacy duty on Ofcom, with specific objectives. They would require the regulator to produce a statutory strategy for delivering on it and then to report on progress made towards increasing media literacy under the strategy. There is no logical reason for the Minister not to accept these important new clauses or work with Labour on them.

Over the past few weeks, we have debated a huge range of issues that are being perpetuated online as we speak, from vile, misogynistic content about women and girls to state-sponsored disinformation. It is clear that the lessons have not been learned from the past few years, when misinformation was able to significantly undermine public health, most notably throughout the pandemic. Harmful and, more importantly, false statistics were circulated online, which caused significant issues in encouraging the uptake of the vaccine. We have concerns that, without a robust media literacy strategy, the consequences of misinformation and disinformation could go further.

The issues that Labour has raised about the responsibility of those at the top—the Government—have been well documented. Only a few weeks ago, we spoke about the Secretary of State actually contributing to the misinformation discourse by sharing a picture of the Labour leader that was completely out of context. How can we be in a position where those at the top are contributing to this harmful discourse? The Minister must be living in a parallel universe if he cannot see the importance of curbing these harmful behaviours online as soon as possible. He must know that media literacy is at the very heart of the Bill’s success more widely. We genuinely feel that a strengthened media literacy policy would be a huge step forward, and I sincerely hope that the Minister will therefore accept the justification behind these important new clauses.

Kirsty Blackman

I agree entirely on these new clauses. Although the Bill will make things safer, it will do that properly only if supported by proper media literacy and the upskilling of everybody who spends any portion of their lives online. They all need better media literacy, and I am not excluding myself from that. Everybody, no matter how much time they have spent online, can learn more about better ways to fact-check and assess risk, and about how services use our data.

I pay tribute to all those involved in media literacy—all the educators at all levels, including school teachers delivering it as part of the curriculum, school teachers delivering it not as part of the curriculum, and organisations such as CyberSafe Scotland in my constituency, which is working incredibly hard to upskill parents and children about the internet. They also include organisations such as the Silver City Surfers in Aberdeen, where a group of young people teaches groups of elderly people how to use the internet. All those things are incredibly helpful and useful, but we need to ensure that Ofcom is at the top of that, producing materials and taking its duties seriously. It must produce the best possible information and assistance for people so that up-to-date media literacy training can be provided.

As we have discussed before, Ofcom’s key role is to ensure that when threats emerge, it is clear and tells people, “This is a new threat that you need to be aware of,” because the internet will grow and change all the time, and Ofcom is absolutely the best placed organisation to be recognising the new threats. Obviously, it would do that much better with a user advocacy panel on it, but given its oversight and the way it will be regulating all the providers, Ofcom really needs to take this issue as seriously as it can. It is impossible to overstate the importance of media literacy, so I give my wholehearted backing to the three new clauses.

10:00
Kim Leadbeater (Batley and Spen) (Lab)

I rise to speak in favour of new clauses 14 to 16, on media literacy. As we have discussed in Committee, media literacy is absolutely vital to ensure that internet users are aware of the tools available to protect themselves. Knowledge and understanding of the risks online, and how to protect against them, are the first line of defence for us all.

We all know that the Bill will not eliminate all risk online, and it will not entirely clean up the internet. Therefore, ensuring that platforms have robust tools in place, and that users are aware of them, is one of the strongest tools in the Bill to protect internet users. As my hon. Friend the Member for Pontypridd said, including the new clauses in the Bill would help to ensure that we all make decisions based on sound evidence, rather than on poorly informed opinions that can harm not just individuals but democracy itself. The new clauses, which would place a duty on Ofcom to promote media literacy and publish a strategy, are therefore crucial.

I am sure we all agree about the benefits of public health information that informs us of the role of a healthy diet and exercise, and of ways that we can adopt a healthier lifestyle. I do not want to bring up the sensitive subject of the age of members of the Committee, as it got me into trouble with some of my younger colleagues last week, but I am sure many of us will remember the Green Cross Code campaign, the stop smoking campaigns, the anti-drink driving ads, and the powerful campaign to promote the wearing of seatbelts—“Clunk click every trip”. These were publicly funded and produced information campaigns that have stuck in our minds and, I am sure, protected thousands of lives across the country. They laid out the risks and clearly stated the actions we all need to take to protect ourselves.

When it comes to online safety, we need a similar mindset to inform the public of the risks and how we can mitigate them. Earlier in Committee, the right hon. Member for Basingstoke, a former Secretary of State for Digital, Culture, Media and Sport, shared her experience of cyber-flashing and the importance of knowing how to turn off AirDrop to prevent such incidents from occurring in the first place. I had no idea about this simple change that people can make to protect themselves from such an unpleasant experience. That is the type of situation that could be avoided with an effective media literacy campaign, which new clauses 14 to 16 would legislate for.

I completely agree that platforms have a significant duty to design and implement tools for users to protect themselves while using platforms’ services. However, I strongly believe that only a publicly funded organisation such as Ofcom can effectively promote their use, explain the dangers of not using them and target such information at the most vulnerable internet users. That is why I wholeheartedly support these vital new clauses.

Chris Philp

The Government obviously recognise and support the intent behind the new clause, which is to make sure that work is undertaken by Ofcom specifically, and the Government more widely, on media literacy. That is important for the reasons laid out by the hon. Members for Aberdeen North and for Batley and Spen.

Ofcom already has a statutory duty to promote media literacy in relation to electronic media, which includes everything in scope of the Bill and more beyond. That is set out in the Communications Act 2003, so the statutory duty exists already. The duty proposed in new clause 14 is actually narrower in scope than the existing statutory duty on Ofcom, and I do not think it would be a very good idea to give Ofcom an online literacy duty with a narrower scope than the one it has already. For that reason, I will resist the amendment, because it narrows the duties rather than widens them.

I would also point out that a number of pieces of work are being done non-legislatively. The campaigns that the hon. Member for Batley and Spen mentioned—dating often, I think, back to the 1980s—were of course done on a non-legislative basis and were just as effective for it. In that spirit, Ofcom published “Ofcom’s approach to online media literacy” at the end of last year, which sets out how Ofcom plans to expand, and is expanding, its media literacy programmes, which cover many of the objectives specified in the new clause. Therefore, Ofcom itself has acted already—just recently—via that document.

Finally, I have two points about what the Government are doing. First, about a year ago the Government published their own online media literacy strategy, which has been backed with funding and is being rolled out as we speak. When it comes to disinformation more widely, which we have debated previously, we also have the counter-disinformation unit working actively on that area.

Therefore, through the Communications Act 2003, the statutory basis exists already, and on a wider basis than in these new clauses; and, through the online media literacy strategy and Ofcom’s own approach, as recently set out, this important area is well covered already.

Alex Davies-Jones

We feel that we cannot have an online safety Bill without a core digital media literacy strategy. We are disappointed that clause 103 of the draft Bill was dropped. We do not feel that the current regime, under the Communications Act 2003, is robust enough. Clearly, the Government do not think it is robust enough, which is why they tried to replace it in the first place. We are sad to see that replacement now dropped altogether. We fully support these new clauses.

Question put, That the clause be read a Second time.

Division 62

Ayes: 6
Labour: 5
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 17
Algorithmic prompts: prohibition of protected characteristics
“(1) A search service which uses an algorithm to suggest search terms to users, an “algorithmic prompt”, must not apply any algorithm where any of the words in the search term relate to any protected characteristic as defined in the Equality Act 2010.
(2) If the word relating to a protected characteristic is not the first word input, the algorithmic prompt must cease as soon as the word relating to a protected characteristic is input by the user.”—(Kirsty Blackman.)
This new clause removes the ability of search services to allow their algorithms to create prompts in relation to protected characteristics. This removes entirely the possibility that a prompt would contain discriminatory language toward an individual or group with protected characteristics.
Brought up, and read the First time.
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I tabled new clause 17, in relation to protected characteristics, because of some of the points made by Danny Stone. I missed the relevant evidence session because, unfortunately, I was in the Chamber at the time, responding to the Chancellor of the Exchequer, but the points he made in that session concerned the algorithmic prompts that there are in search functions.

We have an issue with search functions, and in particular with the algorithmic prompts they offer. There is a problem if someone puts in something potentially derogatory, or something relating to someone with a protected characteristic. For example, if someone were to type “Jews are”, the algorithmic prompts that come up can be overwhelmingly racist, overwhelmingly antisemitic, overwhelmingly discriminatory. The algorithm should not be pushing those things.

To give organisations like Google some credit, if something like that is highlighted to them, they will address it. Some of them take a long time to sort it, but they will have a look at it, consider sorting it and, potentially, sort it. But that is not good enough. By that point, the damage is done. By that point, the harm has been put into people’s minds. By that point, someone who is from a particular group and has protected characteristics has already seen that Google—or any other search provider—is pushing derogatory terms at people with protected characteristics.

I know that the prompts work like that because of artificial intelligence; firms are not intentionally writing these terms in order to push them towards people, but the AI allows that to happen. If such companies are going to be using artificial intelligence—some kind of software algorithm—they have a responsibility to make sure that none of the content they are generating on the basis of user searches is harmful. I asked Google about this issue during one of our evidence sessions, and the response they gave was, “Oh, algorithmic prompts are really good, so we should keep them”—obviously I am paraphrasing. I do not think that is a good enough argument. I do not think the value that is added by algorithmic prompts is enough to counter the harm that is caused by some of those prompts.

As such, the new clause specifically excludes protected characteristics from any algorithm that is used in a search engine. The idea is that if a person starts to type in something about any protected characteristic, no algorithmic prompt will appear, and they will just be typing in whatever they were going to type in anyway. They will not be served with any negative, harmful, discriminatory content, because no algorithmic prompt will come up. The new clause would achieve that across the board for every protected characteristic term. Search engines would have to come up with a list of such terms and exclude all of them from the work of the algorithm in order to provide that layer of protection for people.

I do not believe that that negative content could be in any way balanced by the potential good that could arise from somebody being able to type “Jews are” and getting a prompt that says “funny”. That would be a lovely, positive thing for people to see, but the good that could be caused by those prompts is outweighed by the negativity, harm and pain that is caused by the prompts we see today, which platforms are not quick enough to act on.

As I say, the harm is done by the time the report is made; by the time the concern is raised, the harm has already happened. New clause 17 would prevent that harm from ever happening. It would prevent anybody from ever being injured in any way by an algorithmic prompt from a search engine. That is why I have tabled that new clause, in order to provide a level of protection for any protected characteristic as defined under the Equality Act 2010 when it comes to search engine prompts.

Barbara Keeley

The problem underlying the need for this new clause is that under the Bill, search services will not have to address or risk assess legal harm to adults on their sites, while the biggest user-to-user services will. As Danny Stone of the Antisemitism Policy Trust told us in evidence, that includes sites such as Google and Microsoft Bing, and voice search assistants including Amazon’s Alexa and Apple’s Siri. Search services rightly highlight that the content returned by a search is not created or published by them, but as the hon. Member for Aberdeen North has said, algorithmic indexing, promotion and search prompts provided in the search bar are their responsibility. As she has pointed out, and as we have heard in evidence sessions, those algorithms can cause significant harm.

Danny Stone told us on 26 May:

“Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 130, Q207.]

The hon. Member for Aberdeen North mentioned the examples from Microsoft Bing that Danny gave in his evidence—“Jews are” and “gays are”. He gave other examples of answers that were returned by search services, such as using Amazon Alexa to search, “Is George Soros evil?” The response was, “Yes, he is.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The issue is that the search prompts that the hon. Member has talked about are problematic, because just one person giving an answer to Amazon could prompt that response. The second one, about the White Helmets, was a comment on a website that was picked up. Clearly, that is an issue.

Danny Stone’s view is that it would be wise to have something that forces search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently. It is not reasonable to exempt major international and ubiquitous search services from risk assessing and having a policy to address the harms caused by their algorithms. We know that leaving it up to platforms to sort this out themselves does not work, which is why Labour is supporting the new clause proposed by our SNP colleague.

10:15
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

It is important to make clear how the Bill operates, and I draw the Committee’s attention in particular to clauses 23 to 26, which deal with the risk assessment and safety duties for search services. I point in particular to clause 23(5)(a), which deals with the risk assessment duties for illegal content. The provision makes it clear that those risk assessments have to be carried out

“taking into account (in particular) risks presented by algorithms used by the service”.

Clause 25 relates to children’s risk assessment duties, and subsection (5)(a) states that children’s risk assessment duties have to be carried out

“taking into account (in particular) risks presented by algorithms”.

The risks presented by algorithms are expressly accounted for in clauses 23 and 25 in relation to illegal acts and to children. Those risk assessment duties flow into safety duties as we know.

By coincidence, yesterday I met with Google’s head of search, who talked about the work Google is doing to ensure that its search work is safe. Google has the SafeSearch work programme, which is designed to make the prompts better constructed.

In my view, the purpose of the new clause is covered by existing provisions. If we were to implement the proposal—I completely understand and respect the intention behind it, by the way—there could be an unintended consequence in the sense that it would ban any reference in the prompts to protected characteristics, although people looking for help, support or something like that might find such prompts helpful.

Through a combination of the existing duties and the list of harms, which we will publish in due course, as well as legislating via statutory instrument, we can ensure that people with protected characteristics, and indeed other people, are protected from harmful prompts while not, as it were, throwing the baby out with the bathwater and banning the use of certain terms in search. That might cause an unintended negative consequence for some people, particularly those from marginalised groups who were looking for help. I understand the spirit of the new clause, but we shall gently resist it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The Minister has highlighted clauses 23 and 25. Clause 25 is much stronger than clause 23, because clause 23 includes only illegal content and priority illegal content, whereas clause 25 goes into non-designated content that is harmful to children. Some of the things that we are talking about, which might not be on the verge of illegality but which are wrong and discriminatory, might not fall into the categories of illegal or priority illegal content unless the search service, which an organisation such as Google presumably is, has a children’s risk assessment duty. Such organisations are getting a much easier ride in that regard.

I want to make the Minister aware of this. If he turns on Google SafeSearch, which excludes explicit content, and googles the word “oral” and looks at the images that come up, he will see that those images are much more extreme than he might imagine. My point is that, no matter the work that the search services are trying to do, they need to have the barriers in place before that issue happens—before people are exposed to that harmful or illegal content. The existing situation does not require search services to have enough in place to prevent such things happening. The Minister was talking about moderation and things that happen after the fact in some ways, which is great but which does not protect people from the harm that might occur. I very much wish to press the new clause to the vote.

Question put, That the clause be read a Second time.

Division 63

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 18
Identification of information incidents by Ofcom
“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.
(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—
(a) identifying, and assessing the severity of, actual or potential information incidents; and
(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).
(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—
(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and
(b) publish such recommendations or other information that OFCOM considers appropriate.
(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.
(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—
(a) the matters it will take into account in determining whether an information incident has arisen;
(b) the matters it will take into account in determining the severity of an incident; and
(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.
(6) For the purposes of this section—
‘harmful misinformation or disinformation’ means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;
‘information incident’ means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”—(Kirsty Blackman.)
This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.
Brought up, and read the First time.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss new clause 45—Sharing of information relating to counter-disinformation

“(1) The Secretary of State must produce a report setting out any steps the Secretary of State has taken to tackle the presence of disinformation on Part 3 services.

(2) The purpose of the report is to assist OFCOM in carrying out its regulatory duties under this Act.

(3) The first report must be submitted to OFCOM and laid before Parliament within six months of this Act being passed.

(4) Thereafter, the Secretary of State must submit an updated report to OFCOM and lay it before Parliament at least once every three months.”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

My hon. Friend the Member for Ochil and South Perthshire is not present, although he had intended to move this new clause. If the Committee does not mind, I will do more reading and look at my notes more than I would normally when giving a speech.

Misinformation and disinformation arise during periods of uncertainty, either acutely, such as during a terror attack, or over a long period, as with the pandemic. That often includes information gaps and a proliferation of inaccurate claims that spread quickly. Where there is a vacuum of information, we can have bad actors or the ill-informed filling it with false information.

Information incidents are not dealt with effectively enough in the Bill, which is focused on regulating the day-to-day online environment. I accept that clause 146 gives the Secretary of State powers of direction in certain special circumstances, but their effectiveness in real time would be questionable. The Secretary of State would have to ask Ofcom to prioritise its media literacy function or to make internet companies report on what they are doing in response to a crisis. That is just too slow, given the speed at which such incidents can spread.

The new clause might involve Ofcom introducing a system whereby emerging incidents could be reported publicly and different actors could request the regulator to convene a response group. The provision would allow Ofcom to be more proactive in its approach and, in what I hope would be rare moments, to provide clear guidance. That is why the new clause is a necessary addition to the Bill.

Many times, we have seen horrendous incidents unfold on the internet, in a very different way from how they ever unfolded in newspapers, on news websites or among people talking. We have seen the untold and extreme harm that such information incidents can cause, as significant, horrific events can be spread very quickly. We could end up in a situation where an incident happens and, for example, a report spreads that a Muslim group was responsible when there is absolutely no basis of truth to that. A vacuum can be created and bad actors step into it in order to spread discrimination and lies, often about minority groups who are already struggling. That is why we move the new clause.

For the avoidance of doubt, new clause 45, which was tabled by Labour, is also to be debated in this group. I am more than happy to support it.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

As we know, the new clause would give Ofcom a proactive role in identifying and responding to misinformation incidents that can occur in a moment of crisis. As we have discussed, there are huge gaps in the Bill’s ability to sufficiently arm Ofcom with the tools it will likely need to tackle information incidents in real time. It is all very well that the Bill will ensure that things such as risk assessments are completed, but, ultimately, if Ofcom is not able to proactively identify and respond to incidents in a crisis, I have genuine concerns about how effective this regulatory regime will be in the wider sense. Labour is therefore pleased to support the new clause, which is fundamental to ensuring that Ofcom can be the proactive regulator that the online space clearly needs.

The Government’s methods of tackling disinformation are opaque, unaccountable and may not even work. New clause 45, which would require reporting to Parliament, may begin to address this issue. When Ministers are asked how they tackle misinformation or disinformation harms, they refer to some unaccountable civil service team involved in state-based interference in online media.

I thank those at Carnegie UK Trust for their support when researching the following list, and for supporting my team and me to make sense of the Bill. First, we have the counter-disinformation unit, which is based in the Department for Digital, Culture, Media and Sport and intends to address mainly covid issues that breach companies’ terms of service and, recently, the Russia-Ukraine conflict. In addition, the Government information cell, which is based in the Foreign, Commonwealth and Development Office, focuses on war and national security issues, including mainly Russia and Ukraine. Thirdly, there is the so-called rapid response unit, which is based in the Cabinet Office, and mainly tackles proactive counter-messaging.

Those teams appear to nudge service providers in different ways where there are threats to national security or the democratic process, or risks to public health, yet we have zero record of their effectiveness. The groups do not publish logs of action to any external authority for oversight of what they raise with companies using the privilege authority of Her Majesty’s Government, nor do they publish the effectiveness of their actions. As far as we know, they are not rooted in expert independent external advisers. That direct state interference in the media is very worrying.

In our recent debate on amendment 83, which calls on the Government to include health misinformation and disinformation in the Bill, the Minister clearly set out why he thinks the situation is problematic. He said,

“We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.”––[Official Report, Online Safety Public Bill Committee, 14 June 2022; c. 408.]

Until we know more about those units, the boundary between their actions and that of a press office remains unclear. In the new regulatory regime, Ofcom needs to be kept up to date on the issues they are raising. The Government should reform the system and bring those units out into the open. We support Carnegie’s longer term strategic goal to set up a new external oversight body and move the current Government functions under Ofcom’s independent supervision. The forthcoming National Security Bill may tackle that, but I will leave that for the Minister to consider.

There must be a reporting system that requires the Government to set out their operational involvement with social media companies to address misinformation and disinformation, which is why we have tabled new clause 45. I hope the Minister will see that the current efforts in these units are hugely lacking in transparency, which we all want and which we have learned is fundamental to keeping us all safe online.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We agree that it is important that the Bill contains measures to tackle disinformation and misinformation that may emerge during serious information incidents, but the Bill already contains measures to address those, including the powers vested in the Secretary of State under clause 146, which, when debated, provoked some controversy. Under that clause, the Secretary of State will have the power to direct Ofcom when exercising its media literacy functions in the context of an issue of public health or safety or national security.

Moreover, Ofcom will be able to require platforms to issue a public statement about the steps they are taking to respond to a threat to public health or safety or to national security. As we discussed, it is appropriate that the Secretary of State will make those directions, given that the Government have the access to intelligence around national security and the relevant health information. Ofcom, as a telecoms regulator, obviously does not have access to that information, hence the need for the Secretary of State’s involvement.

10:30
It is also worth saying that under the existing framework, companies will have to address harmful disinformation that could spread during information incidents, such as the recent pandemic. The Government have already committed to designating some forms of harmful health mis- and disinformation as priority harmful content in secondary legislation, which further supports the point.

Ofcom already has reporting duties under the Bill’s framework to carry out reviews of the prevalence and severity of content harmful to children and adults on regulated services. Under clause 135, Ofcom must also produce its own transparency report, in addition to which there will be an advisory committee on dis- and misinformation, set out in clause 130, to provide advice to Ofcom about how these issues can be addressed.

The shadow Minister, the hon. Member for Pontypridd, has already made reference to DCMS’s counter-disinformation unit. She has quoted me extensively—I thank her for that—setting out the work it has been doing. She asked about further reporting in terms of oversight of that counter-disinformation unit. Obviously, setting out the full details of what it does could provide inappropriately detailed information to hostile states, such as Russia, that are trying to pump out that disinformation. However, the activities of the CDU are of course open to parliamentary scrutiny in the usual way, whether that is through oral questions, Backbench Business and Opposition day debates, or scrutiny by Select Committees, just as every other area of Government activity is open to parliamentary scrutiny using any of the means available.

On the regular reports sought through new clause 45, we think the work of the CDU is already covered in the way I have just set out. It would not be appropriate to lift up the hood to the point that the Russians and others can see exactly what is going on. Ofcom is already required to consult with the Secretary of State and relevant experts when developing its codes of practice, which gives the Secretary of State an appropriate mechanism.

I have been brief in the interest of time, but I hope I have set out how the Bill as drafted already provides a response to mis- and disinformation. I have also pointed out the existing parliamentary scrutiny to which the Government in general and the CDU in particular are subject. I therefore ask the hon. Member for Aberdeen North to withdraw the new clause.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I do not think the urgency and speed that are needed for these incidents are adequately covered by the Bill, so I would like to push new clause 18 to a vote.

Question put, That the clause be read a Second time.

Division 64

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 19
Research conducted by regulated services
“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.
(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—
(a) a specific piece of research held by the service, or
(b) all research the service holds on a topic specified by OFCOM.”—(Kirsty Blackman.)
Brought up, and read the First time.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

I think you are probably getting fed up with me, Sir Roger, so I will try my best not to speak for too long. The new clause is one of the most sensible ones we have put forward. It simply allows Ofcom to ask regulated services to submit to Ofcom

“a specific piece of research held by the service”

or

“all research the service holds”

on a specific topic. It also allows Ofcom to produce a report into

“how regulated services commission, collate, publish and make use of research.”

The issues that we heard raised by Frances Haugen about the secretive nature of these very large companies gave us a huge amount of concern. Providers will have to undertake risk assessments on the basis of the number of users they have, the risk of harm to those users and what percentage of their users are children. However, Ofcom is just going to have to believe the companies when they say, “We have 1 million users,” unless it has the ability to ask for information that proves the risk assessments undertaken are adequate and that nothing is being hidden by those organisations. In order to find out information about a huge number of the platforms, particularly ones such as Facebook, we have had to have undercover researchers posing as other people, submitting reports and seeing how they come out.

We cannot rely on these companies, which are money-making entities. They exist to make a profit, not to make our lives better. In some cases they very much do make our lives better—in some cases they very much do not—but that is not their aim. Their aim is to try to make a profit. It is absolutely in their interests to underplay the number of users they have and the risk faced by people on their platforms. It is very much in their interest to underplay how the algorithms are firing content at people, taking them into a negative or extreme spiral. It is also in their interests to try to hide that from Ofcom, so that they do not have to put in the duties and mitigations that keep people safe.

We are not asking those companies to make the information public, but if we require them to provide to Ofcom their internal research, whether on the gender or age of their users, or on how many of their users are viewing content relating to self-harm, it will raise their standards. It will raise the bar and mean that those companies have to act in the best interests—or as close as they can get to them—of their users. They will have to comply with what is set out in the Bill and the directions of Ofcom.

I see no issue with that. Ofcom is not going to share the information with other companies in a way that could subvert competition law. Ofcom is a regulator; it literally does not do that. Our proposal would mean that Ofcom has the best, and the most, information in order to take sensible decisions to properly regulate the platforms. It is not a difficult provision for the Minister to accept.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The transparency requirements set out in the Bill are welcome but limited. Numerous amendments have been tabled by the Opposition and by our colleagues in the SNP to increase transparency, so that we can all be better informed about the harms around us, and so that the regulator can determine what protections are needed for existing and emerging harms. This new clause is another important provision in that chain and I speak in support of it.

We know that there is research being undertaken all the time by companies that is never published—neither publicly nor to the regulator. As the hon. Member for Aberdeen North said, publishing research undertaken by companies is an issue championed by Frances Haugen, whose testimony last month the Committee will remember. A few years ago, Frances Haugen brought to the public’s attention the extent to which research is held by companies such as Facebook—as it was called then—and never reaches the public realm.

Billions of members of the public are unaware that they are being tracked and monitored by social media companies as subjects in their research studies. The results of those studies are only published when revealed by brave whistleblowers. However, their findings could help charities, regulators and legislators to recognise harms and help to make the internet a safer place. For example, Frances Haugen leaked one Facebook study that found that a third of teenage girls said Instagram made them feel worse about their bodies. Facebook’s head of safety, Antigone Davis, fielded questions on this issue from United States Senators last September. She claimed that the research on the impact of Instagram and Facebook on children’s health was “not a bombshell”. Senator Richard Blumenthal responded:

“I beg to differ with you, Ms Davis, this research is a bombshell. It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”

It is this kind of cover-up that new clause 19 seeks to prevent.

I remind the Committee of one more example that Frances Haugen illustrated to us in her evidence last month. Meta conducts frequent analyses of the estimated age of its users, which is often different from the ages they submit when registering, both among adults and children. Frances told us that Meta does this so that adverts can be targeted more effectively. However, if Ofcom could request this data, as the new clause would require, it would give an important insight into how many under-13s were in fact creating accounts on Facebook. Ofcom should be able to access such information, so I hope hon. Members and the Minister will support the new clause as a measure to increase transparency and support greater protections for children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by saying that I completely agree with the premise of the new clause. First, I agree that these large social media companies are acting principally for motives of their own profit and not the public good. Secondly, I agree with the proposition that they are extremely secretive, and do not transparently and openly disclose information to the public, the Government or researchers, and that is a problem we need to solve. I therefore wholeheartedly agree with the premise of the hon. Member for Aberdeen North’s new clause and her position.

However, I am honestly a bit perplexed by the two speeches we have just heard, because the Bill sets out everything the hon. Members for Aberdeen North and for Worsley and Eccles South asked for in unambiguous, black and white terms on the face of the Bill—or black and green terms, because the Bill is published on green paper.

Clause 85 on page 74 outlines the power Ofcom has to request information from the companies. Clause 85(1) says very clearly that Ofcom may require a person

“to provide them with any information”—

I stress the word “any”—

“that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom can already request anything of these companies.

For the avoidance of doubt, clause 85(5) lists the various purposes for which Ofcom can request information, and clause 85(5)(l)—on page 75, line 25—includes

“the purpose of carrying out research, or preparing a report, in relation to online safety matters”.

Ofcom can request anything, expressly including requesting information to carry out research, which is exactly what the hon. Member for Aberdeen North quite rightly asks for.

The hon. Lady then said, “What if they withhold information or, basically, lie?” Clause 92 on page 80 sets out the situation when people commit an offence. The Committee will see that clause 92(3)(a) states that a person “commits an offence” if

“the person provides information that is false in a material respect”.

Again, clause 92(5)(a) states that a person “commits an offence” if

“the person suppresses, destroys or alters, or causes or permits the suppression, destruction or alteration of, any information required to be provided.”

In short, if the person or company who receives the information request lies, or falsifies or destroys information, they are committing an offence that will trigger not only civil sanctions—under which the company can be fined up to 10% of global revenue or be disconnected—but also a personal offence that is punishable by up to two years in prison.

I hope I have demonstrated that clauses 85 and 92 already clearly contain the powers for Ofcom to request any information, and that if people lie, destroy information or suppress information, as they do at the moment, as the hon. Member for Aberdeen North rightly says they do, that will be a criminal offence with full sanctions available. I hope that demonstrates to the Committee’s satisfaction that the Bill does this already, and that it is important that it does so for the reasons that the hon. Lady set out.

10:45
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a question for the Minister that hopefully, given the Committee’s work, he might be able to answer. New clause 19(2)(b) would give Ofcom the power to require services to submit to it

“all research the service holds on a topic specified by OFCOM.”

Ofcom could say, “We would like all the research you have on the actual age of users.”

My concern is that clause 85(1) allows Ofcom to require companies to provide it

“with any information that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom might not know what information the company holds. I am concerned that Ofcom is able to say, as it is empowered to do by clause 85(1), “Could you please provide us with the research piece you did on under-age users or on the age of users?”, instead of having a more general power to say, “Could you provide us with all the research you have done?” I am worried that the power in clause 85(1) is more specific.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

If the Minister holds on for two seconds, he will get to make an actual speech. I am worried that the power is not general enough. I would very much like to hear the Minister confirm what he thinks.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am not going to make a full speech. I have conferred with colleagues. The power conferred by clause 85(1) is one to require any information in a particular domain. Ofcom does not have to point to a particular research report and say, “Please give me report X.” It can ask for any information that is relevant to a particular topic. Even if it does not know what specific reports there may be—it probably would not know what reports there are buried in these companies—it can request any information that is at all relevant to a topic and the company will be obliged to provide any information relevant to that request. If the company fails to do so, it will be committing an offence as defined by clause 92, because it would be “suppressing”, to use the language of that clause, the information that exists.

I can categorically say to the hon. Lady that the general ability of Ofcom is to ask for any relevant information—the word “any” does appear—and even if the information notice does not specify precisely what report it is, Ofcom does have that power and I expect it to exercise it and the company to comply. If the company does not, I would expect it to be prosecuted.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Given that clarification, I will not press the new clause. The Minister has made the case strongly enough and has clarified clause 85(1) to my satisfaction. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 23

Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women or girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).” —(Alex Davies-Jones.)

This new clause applies provisions to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

Brought up, and read the First time.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

This new clause would apply provisions applied to priority illegal content also to content that constitutes, encourages or promotes violence against women and girls. As it stands, the Bill is failing women and girls. In an attempt to tackle that alarming gap, the new clause uses the Istanbul convention definition of VAWG, given that the Home Secretary has so recently agreed to ratify the convention—just a decade after it was signed.

The Minister might also be aware that GREVIO—the Group of Experts on Action against Violence against Women and Domestic Violence—which monitors the implementation of the Istanbul convention, published a report in October 2021 on the digital dimension of violence against women and girls. It stated that domestic laws are failing to place the abuse of women and girls online

“in the context of a continuum of violence against women that women and girls are exposed to in all spheres of life, including in the digital sphere.”

The purpose of naming VAWG in the Bill is to require tech companies to be responsible for preventing and addressing VAWG as a whole, rather than limiting their obligations only to specific criminal offences listed in schedule 7 and other illegal content. It is also important to note that the schedule 7 priority list was decided on without any consultation with the VAWG sector. Naming violence against women and girls will also ensure that tech companies are held to account for addressing emerging forms of online hate, which legislation is often unable to keep up with.

We only need to consider accounts from survivors of online violence against women and girls, as outlined in “VAWG Principles for the Online Safety Bill”, published in September last year, to really see the profound impact that the issue is having on people’s lives. Ellesha, a survivor of image-based sexual abuse, was a victim of voyeurism at the hands of her ex-partner. She was filmed without her consent and was later notified by someone else that he had uploaded videos of her to Pornhub. She recently spoke at an event that I contributed to—I believe the right hon. Member for Basingstoke and others also did—on the launch of the “Violence Against Women and Girls Code of Practice”. I am sure we will come to that code of practice more specifically on Report. Her account was genuinely difficult to listen to.

This is an issue that Ellesha, with the support of EVAW, Glitch, and a huge range of other organisations, has campaigned on for some time. She says:

“Going through all of this has had a profound impact on my life. I will never have the ability to trust people in the same way and will always second guess their intentions towards me. My self confidence is at an all time low and although I have put a brave face on throughout this, it has had a detrimental effect on my mental health.”

Ellesha was informed by the police that they could not access the websites where her ex-partner had uploaded the videos, so she was forced to spend an immense amount of time trawling through all of the videos uploaded to simply identify herself. I can only imagine how distressing that must have been for her.

Pornhub’s response to the police inquiries was very vague in the first instance, and it later ignored every piece of following correspondence. Eventually the videos were taken down, likely by the ex-partner himself when he was released from the police station. Ellesha was told that Pornhub had only six moderators at the time—just six for the entire website—and it and her ex-partner ultimately got away with allowing the damaging content to remain, even though the account was under his name and easily traced back to his IP address. That just is not good enough, and the Minister must surely recognise that the Bill fails women in its current form.

If the Minister needs any further impetus to genuinely consider the amendment, I point him to a BBC report from last week that highlighted how much obscene material of women and girls is shared online without their consent. The BBC’s Angus Crawford investigated Facebook accounts and groups that were seen to be posting pictures and videos of upskirting. Naturally, Meta—Facebook’s owner—said that it had a grip on the problem and that those accounts and groups had all been removed, yet the BBC was able to find thousands of users sharing material. Indeed, one man who posted videos of himself stalking schoolgirls in New York is now being investigated by the police. This is the reality of the internet; it can be a powerful, creative tool for good, but far too often it seeks to do the complete opposite.

I hate to make this a gendered argument, but there is a genuine difference between the experiences of men and women online. Last week the Minister came close to admitting that when I queried whether he had ever received an unsolicited indecent picture. I am struggling to understand why he has failed to consider these issues in a Bill proposed by his Department.

The steps that the Government are taking to tackle violence against women and girls offline are broadly to be commended, and I welcome a lot of the initiatives. The Minister must see sense and do the right thing by also addressing the harms faced online. We have a genuine opportunity in the Bill to prevent violence against women and girls online, or at least to diminish some of the harms they face. Will he please do the right thing?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister is right to raise the issue of women and girls being disproportionately—one might say overwhelmingly—the victims of certain kinds of abuse online. We heard my right hon. Friend the Member for Basingstoke, the shadow Minister and others set that out in a previous debate. The shadow Minister is right to raise the issue.

Tackling violence against women and girls has been a long-standing priority of the Government. Indeed, a number of important new offences have already been and are being created, with protecting women principally in mind—the offence of controlling or coercive behaviour, set out in the Serious Crime Act 2015 and amended in the Domestic Abuse Act 2021; the creation of a new stalking offence in 2012; a revenge porn offence in 2015; and an upskirting offence in 2019. All of those offences are clearly designed principally to protect women and girls who are overwhelmingly the victims of those offences. Indeed, the cyber-flashing offence created by clause 156—the first time we have ever had such an offence in this jurisdiction—will, again, overwhelmingly benefit women and girls who are the victims of that offence.

All of the criminal offences I have mentioned—even if they are not mentioned in schedule 7, which I will come to in a moment—will automatically flow into the Bill via the provisions of clause 52(4)(d). Criminal offences where the victim is an individual, which these clearly all are, automatically flow into the provisions of the Bill, including the offences I just listed, which have been created particularly with women in mind.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I hope that my hon. Friend will discuss the Law Commission’s recommendations on intimate image abuse. When I raised this issue in an earlier sitting, he was slightly unsighted by the fact that the recommendations were about to come out—I can confirm again that they will come out on 7 July, after some three years of deliberation. It is unfortunate that will be a week after the end of the Committee’s deliberations, and I hope that the timing will not preclude the Minister from mopping it up in his legislation.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for her question and for her tireless work in this area. As she says, the intimate image abuse offence being worked on is an extremely important piece in the jigsaw puzzle to protect women, particularly as it has as its threshold—at least in the previous draft—consent, without any test of intent, which addresses some points made by the Committee previously. As we have discussed before, it is a Ministry of Justice lead, and I am sure that my right hon. Friend will make representations to MOJ colleagues to elicit a rapid confirmation of its position on the recommendations, so that we can move to implement them as quickly as possible.

I remind the Committee of the Domestic Abuse Act 2021, which was also designed to protect women. Increased penalties for stalking and harassment have been introduced, and we have ended the automatic early release of violent and sex offenders from prison—something I took through Parliament as a Justice Minister a year or two ago. Previously, violent and sex offenders serving standard determinate sentences were often released automatically at the halfway point of their sentence, but we have now ended that practice. Rightly, a lot has been done outside the Bill to protect women and girls.

Let me turn to what the Bill does to further protect women and girls. Schedule 7 sets out the priority offences—page 183 of the Bill. In addition to all the offences I have mentioned previously, which automatically flow into the illegal safety duties, we have set out priority offences whereby companies must not just react after the event, but proactively prevent the offence from occurring in the first place. I can tell the Committee that many of them have been selected because we know that women and girls are overwhelmingly the victims of such offences. Line 21 lists the offence of causing

“intentional harassment, alarm or distress”.

Line 36 mentions the offence of harassment, and line 37 the offence of stalking. Those are obviously offences where women and girls are overwhelmingly the victims, which is why we have picked them out and put them in schedule 7—to make sure they have the priority they deserve.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The Minister is making a good speech about the important things that the Bill will do to protect women and girls. We do not dispute that it will do so, but I do not understand why he is so resistant to putting this on the face of the Bill. It would cost him nothing to do so, and it would raise the profile. It would mean that everybody would concentrate on ensuring that there are enhanced levels of protection for women and girls, which we clearly need. I ask him to reconsider putting this explicitly on the face of the Bill, as he has been asked to do by us and so many external organisations.

11:04
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I completely understand and accept the point that there are groups of people in society who suffer disproportionate harms, as we have debated previously, and that obviously includes women and girls. There are of course other groups as well, such as ethnic minorities or people whose sexual orientation makes them the target of completely unacceptable abuse in a way that other groups do not suffer.

I accept the point about having this “on the face of the Bill”. We have debated this. That is why clauses 10 and 12 use the word “characteristic”—we debated this word previously. The risk assessment duties, which are the starting point for the Bill’s provisions, must specifically and expressly—it is on the face of the Bill—take into account characteristics, first and foremost gender, but also racial identity, sexual orientation and so on. Those characteristics must be expressly addressed by the risk assessments for adults and for children, in order to make sure that the special protections or vulnerabilities or the extra levels of abuse people with those characteristics suffer are recognised and addressed. That is why those provisions are in the Bill, in clauses 10 and 12.

A point was raised about platforms not responding to complaints raised about abusive content that has been put online—the victim complains to the platform and nothing happens. The hon. Members for Pontypridd and for Aberdeen North are completely right that this is a huge problem that needs to be addressed. Clause 18(2) places a duty—they have to do it; it is not optional—on these platforms to operate a complaints procedure that is, in paragraph (c),

“easy to access, easy to use (including by children)”

and that, in paragraph (b),

“provides for appropriate action to be taken”.

They must respond. They must take appropriate action. That is a duty under clause 18. If they do not comply with that duty on a systemic basis, they will be enforced against. The shadow Minister and the hon. Member for Aberdeen North are quite right. The days of the big platforms simply ignoring valid complaints from victims have to end, and the Bill will end them.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I am extremely impressed by the Minister’s knowledge of the Bill, as I have been throughout the Committee’s sittings. It is admirable to see him flicking from page to page, finding where the information about violence against women and girls is included, but I have to concur with the hon. Member for Aberdeen North and my Front-Bench colleagues. There is surely nothing to be lost by specifically including violence against women and girls on the face of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I hope I have made very clear in everything I have said, which I do not propose to repeat, that the way the Bill operates, in several different areas, and the way the criminal law has been constructed over the past 10 years, building on the work of previous Governments, is that it is designed to make sure that the crimes committed overwhelmingly against women and girls are prioritised. I think the Bill does achieve the objective of providing that protection, which every member of this Committee wishes to see delivered. I have gone through it in some detail. It is woven throughout the fabric of the Bill, in multiple places. The objective of new clause 23 is more than delivered.

In conclusion, we will be publishing a list of harms, including priority harms for children and adults, which will then be legislated for in secondary legislation. The list will be constructed with the vulnerability of women and girls particularly in mind. When Committee members see that list, they will find it reassuring on this topic. I respectfully resist the new clause, because the Bill is already incredibly strong in this important area as it has been constructed.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Bill is strong, but it could be stronger. It could be, and should be, a world-leading piece of legislation. We want it to be world-leading and we feel that new clause 23 would go some way to achieving that aim. We have cross-party support for tackling violence against women and girls online. Placing it on the face of the Bill would put it at the core of the Bill—at its heart—which is what we all want to achieve. With that in mind, I wish to press the new clause to a vote.

Question put, That the clause be read a Second time.

Division 65

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 24
Civil claims for breach of duty
“A user may bring civil proceedings against the provider of a regulated service in respect of a breach by a provider of any of its duties under Part 3 of this Act.”—(Barbara Keeley.)
This new clause would enable users to bring civil proceedings against providers when providers fail to meet their duties under Part 3.
Brought up, and read the First time.
Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 24 would enable users to bring civil proceedings against providers when they fail to meet their duties under part 3 of the Bill. As has been said many times, power is currently skewed significantly against individuals and in favour of big corporations, leading people to feel that they have no real ability to report content or complain to companies because, whenever they do, there is no response and no action. We have discussed how the reporting, complaints and super-complaints mechanisms in the Bill could be strengthened, as well as the potential merits of an ombudsman, which we argued should be considered when we debated new clause 1.

In tabling this new clause, we are trying to give users the right to appeal through another route—in this case, the courts. As the Minister will be aware, that was a recommendation of the Joint Committee, whose report stated:

“While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.”

The Government’s response to that recommendation was that the Bill would not change the current situation, which allows individuals to

“seek redress through the courts in the event that a company has been negligent or is in breach of its contract with the individual.”

It went on to note:

“Over time, as regulatory precedent grows, it will become easier for individuals to take user-to-user services to court when necessary.”

That seems as close as we are likely to get to an admission that the current situation for individuals is far from easy. We should not have to wait for the conclusion of the first few long and drawn-out cases before it becomes easier for people to fight companies in the courts.

Some organisations have rightly pointed out that a system of redress based on civil proceedings in the courts risks benefiting those with the resources to sue—as we know, that is often the case. However, including that additional redress system on the face of the Bill should increase pressure on companies to fulfil their duties under part 3, which will hopefully decrease people’s need to turn to the redress mechanism.

If we want the overall system of redress to be as strong as possible, individuals must have the opportunity to appeal failures of a company’s duty of care as set out in the Bill. The Joint Committee argued that the importance of the issues dealt with by the Bill requires that users have a right of redress in the courts. The Government did not respond to that criticism in their formal response, but it is a critical argument. A balancing act between proportionate restrictions and duties versus protections against harms is at the heart of this legislation, and has been at the heart of all our debates. Our position is in line with that of the Joint Committee: these issues are too important to deny individuals the right to appeal failures of duty by big companies through the courts.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I agree with the shadow Minister’s point that it is important to make sure social media firms are held to account, which is the entire purpose of the Bill. I will make two points in response to the proposed new clause, beginning with the observation that the first part of its effect is essentially to restate an existing right. Obviously, individuals are already at liberty to seek redress through the courts where a company has caused that individual to suffer loss through negligence or some other behaviour giving rise to grounds for civil liability. That would, I believe, include a breach of that company’s terms of service, so simply restating in legislation a right that already exists as a matter of law and common law is not necessary. We do not do declaratory legislation that just repeats an existing right.

Secondly, the new clause creates a new right of action that does not currently exist, which is a right of individual action if the company is in breach of one of the duties set out in part 3 of the Bill. Individuals being able to sue for a breach of a statutory duty that we are creating is not the way in which we are trying to construct enforcement under the Bill. We will get social media firms to comply through Ofcom acting as the regulator, rather than via individuals litigating these duties on a case-by-case basis. A far more effective way of dealing with the problems, as we discussed previously when we debated the ombudsman, is to get Ofcom to deal with this on behalf of the whole public on a systemic basis, funded not by individual litigants’ money, which is what would happen, at least in the first instance, if they had to proceed individually. Ofcom should act on behalf of us all collectively—this should appeal to socialists—using charges levied from the industry itself.

That is why we want to enforce against these companies using Ofcom, funded by the industry and acting on behalf of all of us. We want to fix these issues not just on an individual basis but systemically. Although I understand the Opposition’s intent, the first part simply declares what is already the law, and the second bit takes a different route from the one that the Bill takes. The Bill’s route is more comprehensive and will ultimately be more effective. Perhaps most importantly of all, the approach that the Bill takes is funded by the fees charged on the polluters—the social media firms—rather than requiring individual citizens, at least in the first instance, to put their hand in their own pocket, so I think the Bill as drafted is the best route to delivering these objectives.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I will say a couple of things in response to the Minister. It is individuals who are damaged by providers breaching their duties under part 3 of the Bill. I understand the point about—

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Yes, but it is not systems that are damaged; it is people. As I said in my speech, the Government’s response that, as regulatory precedent grows, it will become easier over time for individuals to take user-to-user services to court where necessary clearly shows that the Government think it will happen. What we are saying is: why should it wait? The Minister says it is declaratory, but I think it is important, so we will put the new clause to a vote.

Question put, That the clause be read a Second time.

Division 66

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 25
Annual reporting by OFCOM to Parliament
“(1) OFCOM must publish and lay before Parliament an annual report on the operation of its regulatory functions under this Act.
(2) The report must include—
(a) an overall assessment of the continued effectiveness of this Act in reducing harm online;
(b) figures of the volume of content removed by category 1 services in compliance with their duties under this Act;
(c) details of the exercise of any powers by OFCOM under Chapter 4, Part 7 of this Act, including—
(i) the number of times each power has been exercised, and
(ii) the service providers subject to the power;
(d) the number of reports received by OFCOM from regulated services in compliance with their duties under this Act, including details of the type of content that the reports concern.”—(Kim Leadbeater.)
This new clause would require Ofcom to publish and lay before Parliament an annual report on the operation of its regulatory functions under the Act.
Brought up, and read the First time.
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 25 would place an obligation on Ofcom to report annually to Parliament with an update on the effectiveness of the Online Safety Bill, which would also indicate Ofcom’s ability to implement the measures in the Bill to tackle online harms.

As we have discussed, chapter 7 of the Bill compels Ofcom to compile and issue reports on various aspects of the Bill as drafted. Some of those reports are to be made public by Ofcom, and others are to be issued to the Secretary of State, who must subsequently lay them before Parliament. However, new clause 25 would place a direct obligation on Ofcom to be transparent to Parliament about the scale of harms being tackled, the type of harms encountered and the effectiveness of the Bill in achieving its overall objectives.

The current proposal in clause 135 for an annual transparency report is not satisfactory. Those transparency reports are not required to be laid before Parliament. The clause places vague obligations on reporting patterns, and it will not give Parliament the breadth of information needed to allow us to assess the Online Safety Bill’s effectiveness.

Clause 149 is welcome. It will ensure that a review conducted by the Secretary of State in consultation with Ofcom is placed before Parliament. However, that review is a one-off that will provide just a small snapshot of the Bill’s effectiveness. It may not fully reflect Ofcom’s concerns as the regulator, and most importantly it will not disclose the data and information that Parliament needs to accurately assess the impact of the Bill.

Kirsty Blackman

Does the hon. Member agree with me that there is no point in having world-leading legislation if it does not actually work?

11:15
Kim Leadbeater

I agree with the hon. Member wholeheartedly. It should be Parliament that is assessing the effectiveness of the Bill. The Committee has discussed many times how groundbreaking the Bill could be, how difficult it has been to regulate the internet for the first time, the many challenges encountered, the relationship between platforms and regulator and how other countries will be looking at the legislation as a guide for their own regulations. Once this legislation is in place, the only way we can judge how well it is tackling harm in the UK is with clear public reports detailing information on what harms have been prevented, who has intervened to remove that harm, and what role the regulator—in this case Ofcom—has had in protecting us online.

New clause 25 will place a number of important obligations on Ofcom to provide us with that crucial information. First, Ofcom will report annually to Parliament on the overall effectiveness of the Act. That report will allow Ofcom to explore fully where the Act is working, where it could be tightened and where we have left gaps. Throughout the Bill we are heaping considerable responsibility on to Ofcom, and it is only right that Ofcom is able to feed back publicly and state clearly where its powers allow it to act, and where it is constrained and in need of assistance.

Secondly, new clause 25 will compel Ofcom to monitor, collate and publish figures relating to the number of harms removed by category 1 services, which is an important indicator of the scale of the issue and of whether the Act is working.

Thirdly, we need to know how often Ofcom is intervening, compared with how often the platforms themselves are acting. That crucial figure will allow us to assess the balance of regulation, which assists not only us in the UK but countries looking at the legislation as a guide for their own regulation.

Finally, Ofcom will detail the harms removed by type to identify any areas where the Act may be falling short, and where further attention may be needed.

I hope the Committee understands why this information is absolutely invaluable, given that we have previously discussed our concern that this groundbreaking legislation will need constant monitoring. I hope it will also understand why the information needs to be transparent in order to instil trust in the online space, to show the zero-tolerance approach to online harms, and to show countries across the globe that the online space can be effectively regulated to protect citizens online. Only Parliament, as the legislature, can be an effective monitor of that information. I hope I can count on the Government’s support for new clause 25.

Barbara Keeley

I speak in support of new clause 25. As my hon. Friend has argued, transparency is critical to the Bill. It is too risky to leave information and data about online harms unpublished. That is why we have tabled several amendments to the Bill to increase reporting, both to the regulator and publicly.

New clause 25 is an important addition that would offer an overview of the effectiveness of the Bill and act as a warning bell for any unaddressed historical or emerging harms. Not only would such a report benefit legislators, but the indicators included in the report would be helpful for both Ofcom and user advocacy groups. We cannot continue to attempt to regulate the internet blind. We must have the necessary data and analysis to be sure that the provisions in the Bill are as effective as they can be. I hope the Minister can support this new clause.

Chris Philp

The idea that a report on Ofcom’s activities be delivered to Parliament so that it can be considered is an excellent one. In fact, it is such an excellent idea that it has been set out in statute since 2002: the Office of Communications Act 2002 already requires Ofcom to provide a report to the Secretary of State on the carrying out of all of its functions, which will include the new duties we are giving Ofcom under the Bill. The Secretary of State must then lay that report before each House of Parliament. That is a well-established procedure for Ofcom and for other regulatory bodies. It ensures the accountability of Ofcom to the Department and to Parliament.

I was being slightly facetious there, because the hon. Member for Batley and Spen is quite right to raise the issue. However, the duty she is seeking to create via new clause 25 is already covered by the duties in the Office of Communications Act. The reports that Ofcom publishes under that duty will cover its new duties under the Bill. Having made that clear, I trust that new clause 25 can be withdrawn.

Kim Leadbeater

I would like to press new clause 25 to a Division. It is important that it is included in the Bill.

Question put, That the clause be read a Second time.

Division 67

Ayes: 5
Labour: 4
Scottish National Party: 1

Noes: 9
Conservative: 9

New Clause 26
Report on synthetic media content harms
“(1) The Secretary of State must publish and lay before Parliament a report on the harms caused to users by synthetic media content appearing on regulated services.
(2) The report must contain analysis of the harms caused specifically to individuals working in the entertainment industry, including, but not limited to, infringements of their intellectual property rights.
(3) The report must be published within six months of this Act being passed.
(4) In this section, ‘synthetic media content’ means any content that has been produced or modified by automated means.”—(Alex Davies-Jones.)
This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content (also known as “deepfakes”). The report must contain particular reference to the harms caused to those working in the entertainment industry.
Brought up, and read the First time.
Alex Davies-Jones

I beg to move, That the clause be read a Second time.

This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content, also known as deepfakes. The report must contain particular reference to the harms caused to those working in the entertainment industry.

The Government define artificial intelligence as

“technologies with the ability to perform tasks that would otherwise require human intelligence, such as visual perception, speech recognition, and language translation”.

That kind of technology has advanced rapidly in recent years, and commercial AI companies can be found across all areas of the entertainment industries, including voice, modelling, music, dance, journalism and gaming—the list goes on.

One key area of development is AI-made performance synthetisation, which is the process of creating a synthetic performance. That has a wide range of applications, including automated audiobooks, interactive digital avatars and “deepfake” technology, which often, sadly, has more sinister implications. Innovation for the entertainment industry is welcome and, when used ethically and responsibly, can have various benefits. For example, AI systems can create vital sources of income for performers and creative workers. From an equalities perspective, it can be used to increase accessibility for disabled workers.

However, deepfake technology has received significant attention globally due to its often-malicious application. Deepfakes have been defined as,

“realistic digital forgeries of videos or audio created with cutting-edge machine learning techniques.”

An amalgamation of artificial intelligence, falsification and automation, deepfakes use deep learning to replicate the likeness and actions of real people. Over the past few years, deepfake technology has become increasingly sophisticated and accessible. Various apps can be downloaded for free, or at a low cost, to utilise deepfake technology.

Deepfakes can cause short-term and long-term social harms to individuals working in the entertainment industry, and to society more broadly. Currently, deepfakes are mostly used in pornography, inflicting emotional and reputational damage, and in some cases violence, on the individuals depicted, who are mainly women. The US entertainment union, the Screen Actors Guild, estimates that 96% of deepfakes are pornographic and depict women, and 99% of deepfake subjects are from the entertainment industry.

However, deepfakes used without consent pose a threat in other key areas. For example, deepfake technology has the power to alter the democratic discourse. False information about institutions, policies, and public leaders, powered by a deepfake, can be exploited to spin information and manipulate belief. For example, deepfakes have the potential to sabotage the image and reputation of a political candidate and may alter the course of an election. They could be used to impersonate the identities of business leaders and executives to facilitate fraud, and also have the potential to accelerate the already declining trust in the media.

Alongside the challenges presented by deepfakes, there are issues around consent for performers and creative workers. In a famous case, the Canadian voiceover artist Bev Standing won a settlement after TikTok synthesised her voice without her consent and used it for its first ever text-to-speech voice function. Many artists in the UK are also having their image, voice or likeness used without their permission. AI systems have also started to replace jobs for skilled professional performers because using them is often perceived to be a cheaper and more convenient way of doing things.

Audio artists are particularly concerned by the development of digital voice technology for automated audiobooks, which uses the same technology as digital voice assistants such as Siri and Alexa. It is estimated that within one or two years, high-end synthetic voices will have reached human levels. Equity recently conducted a survey on this topic, which found that 65% of performers who responded thought that the development of AI technology poses a threat to employment opportunities in the performing arts sector. That figure rose to 93% for audio artists. Pay is another key issue; it is common for artists not to be compensated fairly, and sometimes not to be paid at all, when engaging with AI. Many artists have also been asked to sign non-disclosure agreements without being provided with the full information about the job they are taking part in.

Government policy making is non-existent in this space. In September 2021 the Government published their national AI strategy, outlining a 10-year plan to make Britain a global AI superpower. In line with that strategy, the Government have delivered two separate consultations looking at our intellectual property system in relation to AI.

The Chair

Order. I am sorry, but I must interrupt the hon. Lady to adjourn the sitting until this afternoon, when Ms Rees will be in the Chair.

Before we leave the room, my understanding is that it is hoped that the Bill will report this afternoon. That is a matter for the usual channels; it is nothing to do with the Chair. However, of course, it is an open-ended session, so if you are getting close to the mark, you may choose to go on. If that poses a problem for Ms Rees, I am prepared to take the Chair again to see it through if we have to. On the assumption that I do not, thank you all very much indeed for the courtesy you have shown throughout this session, which has been exemplary. I also thank the staff; thank you very much.

11:25
The Chair adjourned the Committee without Question put (Standing Order No. 88).
Adjourned till this day at Two o’clock.