Debates between Chris Philp and Baroness Keeley during the 2019-2024 Parliament


Online Safety Bill (Sixteenth sitting)

Debate between Chris Philp and Baroness Keeley
Committee stage
Tuesday 28th June 2022


Public Bill Committees
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure, as always, to serve under your chairmanship, Sir Roger. As the hon. Member for Worsley and Eccles South said, the new clause is designed to introduce a duty on providers to notify Ofcom of anything that Ofcom could reasonably be expected to be notified of.

The Bill already has extremely strong information disclosure provisions. I particularly draw the Committee’s attention to clause 85, which sets out Ofcom’s power to require information by provision of an information notice. If Ofcom issues an information notice—the particulars of which are set out in clause 86—the company has to abide by that request. As the Committee will recall, the strongest sanctions are reserved for the information duties, extending not only to fines of up to 10% of global revenue and service discontinuation—unplugging the website, as it were—but to personal criminal liability for named executives, with prison sentences of up to two years. We take those information duties extremely seriously, which is why the sanctions are as strong as they are.

The hon. Member for Aberdeen North asked what updates would occur if there were a significant design change. I draw the Committee’s attention to clause 10, which deals with children’s risk assessment duties, but there are similar duties in relation to illegal content and the safety of adults. The duty set out in clause 10(2), which cross-refers to schedule 3, makes the position clear. The relevant words are “suitable and sufficient”. Clearly, if there were a massive design change that would, in this case, adversely affect children, the risk assessment would not be suitable and sufficient if it were not updated to reflect that design change. I hope that answers the hon. Lady’s question.

Turning to the particulars of the new clause, if we incentivise companies to disclose information they have not been asked for by Ofcom, there is a danger that they might, through an excessive desire to comply, over-disclose and provide a torrent of information that would not be very helpful. There might also be a risk that some companies that are not well intentioned would deliberately dump enormous quantities of data in order to hide things within it. The shadow Minister, the hon. Member for Worsley and Eccles South, mentioned an example from the world of financial services, but the number of companies potentially within the scope of the Bill is so much larger than even the financial services sector. Some 25,000 companies may be in scope, a number that is much larger—probably by one order of magnitude, and possibly by two—than the financial services sector regulated by the FCA. That disparity in scale makes a significant difference.

Given that there are already strong information provision requirements in the Bill, particularly clause 85, and because of the reasons of scale that I have mentioned, I will respectfully resist the new clause.

Barbara Keeley

We believe that the platforms need to get into disclosure proactively, and that this is a reasonable clause, so we will push it to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley

The transparency requirements set out in the Bill are welcome but limited. Numerous amendments have been tabled by the Opposition and by our colleagues in the SNP to increase transparency, so that we can all be better informed about the harms around us, and so that the regulator can determine what protections are needed for existing and emerging harms. This new clause is another important provision in that chain and I speak in support of it.

We know that there is research being undertaken all the time by companies that is never published—neither publicly nor to the regulator. As the hon. Member for Aberdeen North said, publishing research undertaken by companies is an issue championed by Frances Haugen, whose testimony last month the Committee will remember. A few years ago, Frances Haugen brought to the public’s attention the extent to which research is held by companies such as Facebook—as it was called then—and never reaches the public realm.

Billions of members of the public are unaware that they are being tracked and monitored by social media companies as subjects in their research studies. The results of those studies are only published when revealed by brave whistleblowers. However, their findings could help charities, regulators and legislators to recognise harms and help to make the internet a safer place. For example, Frances Haugen leaked one Facebook study that found that a third of teenage girls said Instagram made them feel worse about their bodies. Facebook’s head of safety, Antigone Davis, fielded questions on this issue from United States Senators last September. She claimed that the research on the impact of Instagram and Facebook on children’s health was “not a bombshell”. Senator Richard Blumenthal responded:

“I beg to differ with you, Ms Davis, this research is a bombshell. It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”

It is this kind of cover-up that new clause 19 seeks to prevent.

I remind the Committee of one more example that Frances Haugen illustrated to us in her evidence last month. Meta conducts frequent analyses of the estimated age of its users, which is often different from the ages they submit when registering, both among adults and children. Frances told us that Meta does this so that adverts can be targeted more effectively. However, if Ofcom could request this data, as the new clause would require, it would give an important insight into how many under-13s were in fact creating accounts on Facebook. Ofcom should be able to access such information, so I hope hon. Members and the Minister will support the new clause as a measure to increase transparency and support greater protections for children.

Chris Philp

Let me start by saying that I completely agree with the premise of the new clause. First, I agree that these large social media companies are acting principally for motives of their own profit and not the public good. Secondly, I agree with the proposition that they are extremely secretive, and do not transparently and openly disclose information to the public, the Government or researchers, and that is a problem we need to solve. I therefore wholeheartedly agree with the premise of the hon. Member for Aberdeen North’s new clause and her position.

However, I am honestly a bit perplexed by the two speeches we have just heard, because the Bill sets out everything the hon. Members for Aberdeen North and for Worsley and Eccles South asked for in unambiguous, black and white terms on the face of the Bill—or black and green terms, because the Bill is published on green paper.

Clause 85 on page 74 outlines the power Ofcom has to request information from the companies. Clause 85(1) says very clearly that Ofcom may require a person

“to provide them with any information”—

I stress the word “any”—

“that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom can already request anything of these companies.

For the avoidance of doubt, clause 85(5) lists the various purposes for which Ofcom can request information, and clause 85(5)(l)—on page 75, line 25—includes

“the purpose of carrying out research, or preparing a report, in relation to online safety matters”.

Ofcom can request anything, expressly including requesting information to carry out research, which is exactly what the hon. Member for Aberdeen North quite rightly asks for.

The hon. Lady then said, “What if they withhold information or, basically, lie?” Clause 92 on page 80 sets out when a person commits an offence. The Committee will see that clause 92(3)(a) states that a person “commits an offence” if

“the person provides information that is false in a material respect”.

Again, clause 92(5)(a) states that a person “commits an offence” if

“the person suppresses, destroys or alters, or causes or permits the suppression, destruction or alteration of, any information required to be provided.”

In short, if the person or company who receives the information request lies, or falsifies or destroys information, they are committing an offence that will trigger not only civil sanctions—under which the company can pay a fine of up to 10% of global revenue or be disconnected—but a personal offence that is punishable by up to two years in prison.

I hope I have demonstrated that clauses 85 and 92 already clearly contain the powers for Ofcom to request any information, and that if people lie, destroy information or suppress information—as the hon. Member for Aberdeen North rightly says they do at the moment—that will be a criminal offence with full sanctions available. I hope that demonstrates to the Committee’s satisfaction that the Bill does this already, and that it is important that it does so for the reasons that the hon. Lady set out.

--- Later in debate ---
Barbara Keeley

I beg to move, That the clause be read a Second time.

New clause 24 would enable users to bring civil proceedings against providers when they fail to meet their duties under part 3 of the Bill. As has been said many times, power is currently skewed significantly against individuals and in favour of big corporations, leading people to feel that they have no real ability to report content or complain to companies because, whenever they do, there is no response and no action. We have discussed how the reporting, complaints and super-complaints mechanisms in the Bill could be strengthened, as well as the potential merits of an ombudsman, which we argued should be considered when we debated new clause 1.

In tabling this new clause, we are trying to give users the right to appeal through another route—in this case, the courts. As the Minister will be aware, that was a recommendation of the Joint Committee, whose report stated:

“While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.”

The Government’s response to that recommendation was that the Bill would not change the current situation, which allows individuals to

“seek redress through the courts in the event that a company has been negligent or is in breach of its contract with the individual.”

It went on to note:

“Over time, as regulatory precedent grows, it will become easier for individuals to take user-to-user services to court when necessary.”

That seems as close as we are likely to get to an admission that the current situation for individuals is far from easy. We should not have to wait for the conclusion of the first few long and drawn-out cases before it becomes easier for people to fight companies in the courts.

Some organisations have rightly pointed out that a system of redress based on civil proceedings in the courts risks benefiting those with the resources to sue—as we know, that is often the case. However, including that additional redress system on the face of the Bill should increase pressure on companies to fulfil their duties under part 3, which will hopefully decrease people’s need to turn to the redress mechanism.

If we want the overall system of redress to be as strong as possible, individuals must have the opportunity to appeal failures of a company’s duty of care as set out in the Bill. The Joint Committee argued that the importance of the issues dealt with by the Bill requires that users have a right of redress in the courts. The Government did not respond to that criticism in their formal response, but it is a critical argument. A balancing act between proportionate restrictions and duties versus protections against harms is at the heart of this legislation, and has been at the heart of all our debates. Our position is in line with that of the Joint Committee: these issues are too important to deny individuals the right to appeal failures of duty by big companies through the courts.

Chris Philp

I agree with the shadow Minister’s point that it is important to make sure social media firms are held to account, which is the entire purpose of the Bill. I will make two points in response to the proposed new clause, beginning with the observation that the first part of its effect is essentially to restate an existing right. Obviously, individuals are already at liberty to seek redress through the courts where a company has caused that individual to suffer loss through negligence or some other behaviour giving rise to grounds for civil liability. That would, I believe, include a breach of that company’s terms of service, so simply restating in legislation a right that already exists as a matter of law and common law is not necessary. We do not do declaratory legislation that just repeats an existing right.

Secondly, the new clause creates a new right of action that does not currently exist, which is a right of individual action if the company is in breach of one of the duties set out in part 3 of the Bill. Individuals being able to sue for a breach of a statutory duty that we are creating is not the way in which we are trying to construct enforcement under the Bill. We will get social media firms to comply through Ofcom acting as the regulator, rather than via individuals litigating these duties on a case-by-case basis. A far more effective way of dealing with the problems, as we discussed previously when we debated the ombudsman, is to get Ofcom to deal with this on behalf of the whole public on a systemic basis, funded not by individual litigants’ money, which is what would happen, at least in the first instance, if they had to proceed individually. Ofcom should act on behalf of us all collectively—this should appeal to socialists—using charges levied from the industry itself.

That is why we want to enforce against these companies using Ofcom, funded by the industry and acting on behalf of all of us. We want to fix these issues not just on an individual basis but systemically. Although I understand the Opposition’s intent, the first part simply declares what is already the law, and the second bit takes a different route from the one that the Bill takes. The Bill’s route is more comprehensive and will ultimately be more effective. Perhaps most importantly of all, the approach that the Bill takes is funded by the fees charged on the polluters—the social media firms—rather than requiring individual citizens, at least in the first instance, to put their hand in their own pocket, so I think the Bill as drafted is the best route to delivering these objectives.

Barbara Keeley

I will say a couple of things in response to the Minister. It is individuals who are damaged by providers breaching their duties under part 3 of the Bill. I understand the point about—

Chris Philp

Systems.

Barbara Keeley

Yes, but it is not systems that are damaged; it is people. As I said in my speech, the Government’s response that, as regulatory precedent grows, it will become easier over time for individuals to take user-to-user services to court where necessary clearly shows that the Government think it will happen. What we are saying is: why should it wait? The Minister says it is declaratory, but I think it is important, so we will put the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley

I speak in support of new clause 25. As my hon. Friend has argued, transparency is critical to the Bill. It is too risky to leave information and data about online harms unpublished. That is why we have tabled several amendments to the Bill to increase reporting, both to the regulator and publicly.

New clause 25 is an important addition that would offer an overview of the effectiveness of the Bill and act as a warning bell for any unaddressed historical or emerging harms. Not only would such a report benefit legislators, but the indicators included in the report would be helpful for both Ofcom and user advocacy groups. We cannot continue to attempt to regulate the internet blind. We must have the necessary data and analysis to be sure that the provisions in the Bill are as effective as they can be. I hope the Minister can support this new clause.

Chris Philp

The idea that a report on Ofcom’s activities be delivered to Parliament so that it can be considered is an excellent one. In fact, it is such an excellent idea that it has been set out in statute since 2002: the Office of Communications Act 2002 already requires Ofcom to provide a report to the Secretary of State on the carrying out of all of its functions, which will include the new duties we are giving Ofcom under the Bill. The Secretary of State must then lay that report before each House of Parliament. That is a well-established procedure for Ofcom and for other regulatory bodies. It ensures the accountability of Ofcom to the Department and to Parliament.

I was being slightly facetious there, because the hon. Member for Batley and Spen is quite right to raise the issue. However, the duty she is seeking to create via new clause 25 is already covered by the duties in the Office of Communications Act. The reports that Ofcom publishes under that duty will cover its new duties under the Bill. Having made that clear, I trust that new clause 25 can be withdrawn.

Online Safety Bill (Seventeenth sitting)

Debate between Chris Philp and Baroness Keeley
Committee stage
Tuesday 28th June 2022


Public Bill Committees
Barbara Keeley

Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.

New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.

Chris Philp

I thank the hon. Member for Aberdeen North for her, as ever, thoughtful comments on the new clause. She has already referred to the user empowerment duties for adults set out in clause 57, and is right to say that those apply only to adults, as is made clear in the very first line of subsection (1) near the bottom of page 52.

As always, the hon. Lady’s analysis of the Bill is correct: the aim of those empowerment duties is to give adults more control over the content they see and the people with whom they interact online. One of the reasons why those empowerment duties have been crafted specifically for adults is that, as we discussed in a freedom of expression context, the Bill does not ultimately censor free speech regarding content that is legal but potentially harmful. Platforms can continue to display that information if their policies allow, so we felt it was right to give adults more choice over whose content they see, given that it could include content that is harmful but falls on the right side of the legal threshold.

As Members would expect, the provisions of the Bill in relation to children are very different from the provisions for adults. There are already specific provisions in the Bill that relate to children, requiring all social media companies whose platforms are likely to be accessed by children—not just the big ones—to undertake comprehensive risk assessments and protect children from any kind of harmful activity. If we refer to the children’s risk assessment duties in clause 10, and specifically clause 10(6)(e), we see that those risk assessments include an assessment looking at the content that children will encounter and—critically—who they might encounter online, including adults.

To cut to the chase and explain why user empowerment has been applied to adults but not children, the view was taken that children are already protected far more than adults through the child risk assessment duties and child safety duties. They therefore do not need the user empowerment provisions, because all of them—regardless of whether they choose to be verified—are already protected from harmful content by the much stronger provisions in the Bill relating to children. That is why it was crafted as it is.

--- Later in debate ---
Barbara Keeley

I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.

At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.

The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.

Chris Philp

I thank the hon. Member for Aberdeen North once again for the thoughtfulness with which she has moved her new clause. To speak first to the existing references to accessibility in the Bill, let me start with user empowerment in clause 14.

Clause 14(4) makes it clear that the features included in “a service in compliance” with the duty in this clause must be made available to all adult users. I stress “all” because, by definition, that includes people with learning disabilities or others with characteristics that mean they may require assistance. When it comes to content reporting duties, clause 17(2)—line 6 of page 17—states that it has to be easy for any “affected persons” to report the content. They may be people who are disabled or have a learning difficulty or anything else. Clause 17(6)(d) further makes it clear that adults who are “providing assistance” to another adult are able to raise content reporting issues.

There are references in the Bill to being easy to report and to one adult assisting another. Furthermore, clause 18(2)(c), on page 18, states that the complaints system has to be

“easy to use (including by children)”.

It also makes it clear through the definition of “affected person”, which we have spoken about, that an adult assisting another adult is allowed to make a complaint on behalf of the second adult. Those things have been built into the structure of the Bill.

Furthermore, to answer the question from the hon. Member for Aberdeen North, I am happy to put on record that Ofcom, as a public body, is subject to the public sector equality duty, so by law it must take into account the ways in which people with certain characteristics, such as learning disabilities, may be impacted when performing its duties, including writing the codes of practice for user empowerment, redress and complaints duties. I can confirm, as the hon. Member requested, that Ofcom, when drafting its codes of practice, will have to take accessibility into account. It is not just a question of my confirming that to the Committee; it is a statutory duty under the Equality Act 2010 and the public sector equality duty that flows from it.

I hope that the words of the Bill, combined with that statutory public sector equality duty, make it clear that the objectives of new clause 29 are met.

Barbara Keeley

The Minister mentioned learning difficulties. That is not what we are talking about. Learning difficulties are things such as dyslexia and attention deficit hyperactivity disorder. Learning disabilities are lifelong intellectual impairments and very different things—that is what we are talking about.

Chris Philp

I am very happy to accept the shadow Minister’s clarification. The way that clauses 14, 17 and 18 are drafted, and the public sector equality duty, include the groups of people she referred to, but I am happy to acknowledge and accept her clarification.

Barbara Keeley

That is fine, but I have a further point to make. The new clause would be very important to all those who support people with learning disabilities. So many of the services that people use take no account of learning disabilities. I have done a huge amount of work over the years to try to support people with learning disabilities. This is a very important issue to me.

There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.

We on the Opposition Benches say repeatedly that some things are not included by an all-encompassing grouping. That is certainly the case here. Some things need to be said for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.

--- Later in debate ---
Barbara Keeley

As we heard from the hon. Member for Ochil and South Perthshire, new clauses 38 to 40 would align the duties on pornographic content so that both user-to-user sites and published pornography sites are subject to robust duties that are relevant to the service. Charities have expressed concerns that many pornography sites might slip through the net because their content does not fall under the definition of “pornographic content” in clause 66. The new clauses aim to address that. They are based on the duties placed on category 1 services, but they recognise the unique harms that can be caused by pornographic content providers, some of which the hon. Member graphically described with the titles that he gave. The new clauses also contain some important new duties that are not currently in the Bill, including the transparency arrangements in new clause 39 and important safeguards in new clause 40.

The Opposition have argued time and again for publishing duties when it comes to risk assessments. New clause 39 would introduce a duty to summarise in the terms of service the findings of the most recent adult risk assessments of a service. That is an important step towards making risk assessments publicly accessible, although Labour’s preference would be for them to be published publicly and in full, as I argued in the debate on new clause 9, which addressed category 1 service risk assessments.

New clause 40 would introduce measures to prevent the upload of illegal content, such as by allowing content uploads only from verified content providers, and by requiring all uploaded content to be reviewed. If the latter duty were accepted, there would need to be proper training and support for any human content moderators. We have heard during previous debates about the awful circumstances of human content moderators. They are put under such pressure for that low-paid work, and we do not want to encourage that.

New clause 40 would also provide protections for those featured in such content, including the need for written consent and identity and age verification. Those are important safeguards that the Labour party supports. I hope the Minister will consider them.

Chris Philp

I thank the hon. Member for Ochil and South Perthshire for raising these issues with the Committee. It is important first to make it clear that websites providing user-to-user services are covered in part 3 of the Bill, under which they are obliged to protect children and prevent illegal content, including some forms of extreme pornography, from circulating. Such websites are also obliged to prevent children from accessing those services. For user-to-user sites, those matters are all comprehensively covered in part 3.

New clauses 38, 39 and 40 seek to widen the scope of part 5 of the Bill, which applies specifically to commercial pornography sites. Those are a different part of the market. Part 5 is designed to close a loophole in the original draft of the Bill that was identified by the Joint Committee, on which the hon. Member for Ochil and South Perthshire and my hon. Friend the Member for Watford served. Protecting children from pornographic content on commercial porn sites had been wrongly omitted from the original draft of the Bill. Part 5 of the Bill as currently drafted is designed to remedy that oversight. That is why the duties in part 5 are narrowly targeted at protecting children in the commercial part of the market.

A much wider range of duties is placed by part 3 on the user-to-user part of the pornography market. The user-to-user services covered by part 3 are likely to include the largest sites with the least control; as the content is user generated, there is no organising mind—whatever gets put up, gets put up. It is worth drawing the distinction between the services covered in part 3 and part 5 of the Bill.

In relation to part 5 services publishing their own material, Parliament can legislate, if it chooses to, to make some of that content illegal, as it has done in some areas—some forms of extreme pornography are illegal. If Parliament thinks that the line is drawn in the wrong place and needs to be moved, it can legislate to move that line as part of the general legislation in this area.

I emphasise most strongly that user-to-user sites, which are probably what the hon. Member for Ochil and South Perthshire was mostly referring to, are comprehensively covered by the duties in part 3. The purpose of part 5, which was a response to the Joint Committee’s report, is simply to stop children viewing such content. That is why the Bill has been constructed as it has.

Question put, That the clause be read a Second time.

Online Safety Bill (Fifteenth sitting)

Debate between Chris Philp and Baroness Keeley
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship, Sir Roger. Clause 168 is a short and straightforward clause. Ofcom will be required to publish a variety of documents under the Online Safety Bill. The clause simply requires that this be done in a way that is appropriate and likely to bring the material to the attention of any audience affected by it. Ofcom is already familiar with this type of statutory obligation through existing legislation, such as the Digital Economy Act 2017, which places similar obligations on Ofcom. Ofcom is well versed in publishing documents in a way that is publicly accessible. Clause 168 puts the obligation on a clear statutory footing.

Barbara Keeley (Worsley and Eccles South) (Lab)

As the Minister said, clause 168 rightly sets out that the material the Bill requires of Ofcom is published in a way that will bring it to the attention of any audience likely to be affected by it. It will be important that all the guidance is published in a way that is easily available and accessible, including for people who are not neurotypical or who experience digital exclusion. I think we would all agree, after the work we have done on the Bill, that the subjects are complex and the landscape is difficult to understand. I hope Ofcom will make its documents as accessible as possible.

Question put and agreed to.

Clause 168 accordingly ordered to stand part of the Bill.

Clause 169

Service of notices

Question proposed, That the clause stand part of the Bill.

Chris Philp

Clause 169 sets out the process for the service of any notice under the Bill, including notices to deal with child sexual exploitation and abuse or terrorism content, information notices, enforcement notices, penalty notices and public statement notices to providers of regulated services both within and outside the United Kingdom. The clause sets out that Ofcom may give a notice to a person by handing it to them, leaving it at the person’s last known address, sending it by post to that address or sending it by email to the person’s email address. It provides clarity regarding who Ofcom must give notice to in respect of different structures. For example, notice may be given to an officer of a body corporate.

Barbara Keeley

As the Minister said, clause 169 sets out the process of issuing notices or decisions by Ofcom. It mostly includes provisions about how Ofcom is to contact the company, which seem reasonable. The Opposition do not oppose clause 169.

Question put and agreed to.

Clause 169 accordingly ordered to stand part of the Bill.

Clause 170

Repeal of Part 4B of the Communications Act

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to consider clauses 171 and 172.

Chris Philp

Clause 170 repeals the video-sharing platform regime. While the VSP and online safety regimes have similar objectives, the new framework in the Bill will be broader and will apply to a wider range of online platforms. It is for this reason that we will repeal the VSP regime and transition the entities regulated as VSPs across to the online safety regime, which is broader and more effective in its provisions. The clause simply sets out the intention to repeal the VSP regime.

Clause 171 repeals part 3 of the Digital Economy Act 2017. As we have discussed previously, the Online Safety Bill now captures all online sites that display pornography, including commercial pornography sites, social media sites, video sharing platforms, forums and search engines. It will provide much greater protection to children than the Digital Economy Act. The Digital Economy Act was criticised for not covering social media platforms, which this Bill does cover. By removing that section from the Digital Economy Act, we are laying the path to regulate properly and more comprehensively.

Finally, in this group, clause 172 amends section 1B of the Protection of Children Act 1978 and creates a defence to the offence of making an indecent photograph of a child for Ofcom, its staff and those assisting Ofcom in exercising its online safety duties. Clearly, we do not want to criminalise Ofcom staff while they are discharging their duties under the Bill that we are imposing on them, so it is reasonable to set out that such a defence exists. I hope that provides clarity to the Committee on the three clauses.

Barbara Keeley

The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented, which means five years during which children could have been better protected from the harms of pornographic content.

When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his

“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]

in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?

Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.

The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.

Kirsty Blackman (Aberdeen North) (SNP)

Thank you, Sir Roger, for chairing this meeting this morning. I agree with the Opposition’s points about the timing issue. If an Act is to repeal another, it needs to ensure that there is no gap in the middle: if the repeal takes place on one day, the Bill’s provisions that relate to it must be in force and working on the same day, rather than leaving a set-up time gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, if the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and there may be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how it could be improved? We do not want to see things getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in and Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

--- Later in debate ---
Chris Philp

So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.

Question put and agreed to.

New clause 42 accordingly read a Second time, and added to the Bill.

New Clause 43

Payment of sums into the Consolidated Fund

“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.

(2) In subsection (1), after paragraph (i) insert—

‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;

(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’

(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.

(4) After subsection (3) insert—

‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’

(5) In the heading, omit ‘licence’.”—(Chris Philp.)

This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Establishment of Advocacy Body

“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.

(2) A ‘child user’—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) ‘enforceable requirements’ relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)

This new clause creates a new advocacy body for child users of regulated internet services.

Brought up, and read the First time.

Barbara Keeley

I beg to move, That the clause be read a Second time.

New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.

Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:

“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

A children’s advocacy body would be able to support children with redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.

Online Safety Bill (Thirteenth sitting)

Debate between Chris Philp and Baroness Keeley
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
Chris Philp

In this clause we are specifically talking about access to information for researchers. Obviously, the transparency matters were covered in clauses 64 and 135. There is consensus across both parties that access to information for bona fide academic researchers is important. The clause lays out a path to take us in the direction of providing that access by requiring Ofcom to produce a report. We debated the matter earlier. The hon. Member for Worsley and Eccles South—I hope I got the pronunciation right this time—

Chris Philp

The hon. Lady made some points about the matter in an earlier sitting, as the shadow Minister just said. It is an area we are giving some careful thought to, because it is important that it is properly academically researched. Although Ofcom is being well resourced, as we have discussed, with lots of money and the ability to levy fees, we understand that it does not have a monopoly on wisdom—as good a regulator as it is. It may well be that a number of academics could add a great deal to the debate by looking at some of the material held inside social media firms. The Government recognise the importance of the matter, and some thought is being given to these questions, but at least we can agree that clause 136 as drafted sets out a path that leads us in this important direction.

Question put and agreed to.

Clause 136 accordingly ordered to stand part of the Bill.

Clause 137

OFCOM’s reports

--- Later in debate ---
The Chair

With this, it will be convenient to consider clause 139 stand part.

Barbara Keeley

Good morning, Ms Rees. It is a pleasure to serve on the Committee with you in the Chair. Clause 138 allows companies to make appeals against Ofcom’s decisions regarding the categorisation of services within categories 1, 2A or 2B.

We have argued, many times, that we believe the Government’s size-based approach to categorisation is flawed. Our preference for an approach based on risk is backed up by the views of multiple stakeholders and the Joint Committee. It was encouraging to hear last week of the Minister’s intention to look again at the issues of categorisation, and I hope we will see movement on that on Report.

Clause 138 sets out that where a regulated provider has filed an appeal, they are exempt from carrying out the duties in the Bill that normally apply to services designated as category 1, 2A or 2B. That is concerning, given that there is no timeframe in which the appeals process must be concluded.

While the right to appeal is important, it is feasible that many platforms will raise appeals about their categorisation to delay the start of their duties under the Bill. I understand that the platforms will still have to comply with the duties that apply to all regulated services, but for a service that has been classified by Ofcom as high risk, it is potentially dangerous that none of the risk assessments on measures to assess harm will be completed while the appeal is taking place. Does the Minister agree that the appeals process must be concluded as quickly as possible to minimise the risk? Will he consider putting a timeframe on that?

Clause 139 allows for appeals against decisions by Ofcom to issue notices about dealing with terrorism and child sexual abuse material, as well as a confirmation decision or a penalty notice. As I have said, in general the right to appeal is important. However, would an appeals system work if, for example, a company were appealing against a notice under clause 103? In what circumstances does the Minister imagine that a platform would appeal a notice by Ofcom requiring the platform to use accredited technology to identify child sexual abuse content and swiftly take down that content? It is vital that appeals processes are concluded as rapidly as possible, so that we do not risk people being exposed to harmful or dangerous content.

Chris Philp

The shadow Minister has set out the purpose of the clauses, which provide, in clause 138, for appeal rights against decisions relating to registration under clause 81 and, in clause 139, for appeals against Ofcom notices.

I agree that it is important that judicial decisions in this area get made quickly. I note that the appeals are directly to the relevant upper tribunal, which is a higher tier of the tribunal system and tends to be a little less congested than the first-tier tribunal, which often gets used for some first-instance matters. I hope that appeals going to the upper tribunal, directly to that more senior level, provides some comfort.

On putting in a time limit, the general principle is that matters concerning listing are reserved to the judiciary. I recall from my time as a Minister in the Ministry of Justice that the judiciary guards its independence fiercely. Whether it is the Senior President of Tribunals or the Lord Chief Justice, they consider listing matters to be the preserve of the judiciary, not the Executive or the legislature. Compelling the judiciary to hear a case in a certain time might well be considered to infringe on such principles.

We can agree, however—and I hope the people making those listing decisions hear that we believe, that Parliament believes—that it is important to do this quickly, in particular where there is a risk of harm to individuals. Where there is a risk to individuals, especially children, but more widely as well, those cases should be heard very expeditiously indeed.

The hon. Member for Worsley and Eccles South also asked about the basis on which appeals might be made and decided. I think that is made fairly clear. For example, clause 139(3) makes it clear that, in deciding an appeal, the upper tribunal will use the same principles as would be applied by the High Court to an application for judicial review—so, standard JR terms—which in the context of notices served or decisions made under clause 103 might include whether the power had been exercised in conformity with statute. If the power were exercised, or purported to be exercised, in a manner not authorised by statute, that would be one ground for appeal; if a decision were considered so grossly unreasonable that no reasonable decision maker could make it, that might be grounds for appeal as well.

I caution the Committee, however: I am not a lawyer and my interpretation of judicial review principles should not be taken as definitive. Lawyers will advise their clients when they come to apply the clause in practice and they will not take my words in Committee as definitive when it comes to determining “standard judicial review principles”—those are well established in law, regardless of my words just now.

Barbara Keeley

There is a concern that platforms might raise appeals about their categorisation in order to delay the start of their duties under the Bill. How would the Minister act if that happened—if a large number of appeals were pending and the duties under the Bill therefore did not commence?

Chris Philp

Clearly, resourcing of the upper tribunal is a matter decided jointly by the Lord Chancellor and the Secretary of State for Justice, in consultation with the Lord Chief Justice and, in this case, the Senior President of Tribunals. Parliament would expect the resourcing of that part of the upper tribunal to be such that cases could be heard in an expedited manner. Particularly where cases concern the safety of the public—and particularly of children—we expect that to be done as quickly as it can.

Question put and agreed to.

Clause 138 accordingly ordered to stand part of the Bill.

Clause 139 ordered to stand part of the Bill.

Clause 140

Power to make super-complaints

--- Later in debate ---
Barbara Keeley

The Bill currently specifies that super-complaints can be made to Ofcom by bodies representing users or members of the public. The addition of consumer representatives through the amendments is important. Consumer representatives are a key source of information about harms to users of online services, which are widespread and would be regulated by this legislation. We support the amendments, which would add consumers to the list of entities eligible to make super-complaints.

Chris Philp

Clearly, we want the super-complaint function to be as effective as possible and for groups of relevant people, users or members of the public to be able to be represented by an eligible entity to raise super-complaints. I believe we are all on the same page in wanting to do that. If I am honest, I am a little confused as to what the addition of the term “consumers” will add. The term “users” is defined quite widely, via clause 140(6), which then refers to clause 181, where, as debated previously, a “user” is defined widely to include anyone using a service, whether registered or not. So if somebody stumbles across a website, they count as a user, but the definition being used in clause 140 about bringing super-complaints also includes “members of the public”—that is, regular citizens. Even if they are not a user of that particular service, they could still be represented in bringing a complaint.

Given that, by definition, “users” and “members of the public” already cover everybody in the United Kingdom, I am not quite sure what the addition of the term “consumers” adds. By definition, consumers are a subset of the group “users” or “members of the public”. It follows that in seeking to become an eligible entity, no eligible entity will purport to act for everybody in the United Kingdom; they will always be seeking to define some kind of subset of people. That might be children, people with a particular vulnerability or, indeed, consumers, who are one such subset of “members of the public” or “users”. I do not honestly understand what the addition of the word “consumers” adds here when everything is covered already.

Online Safety Bill (Fourteenth sitting)

Debate between Chris Philp and Baroness Keeley
Committee stage
Tuesday 21st June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
Barbara Keeley

Part 10 of the Bill sets out three new offences involving harmful, false or threatening communications. Clause 156 includes a new offence of cyber-flashing, to which my hon. Friend the Member for Pontypridd will speak shortly.

For many years, charities have been calling for an update to the offences included in the Malicious Communications Act 1988 and the Communications Act 2003. Back in 2018, the Law Commission pointed out that using the criminal law to deal with harmful online conduct was hindered by several factors, including limited law enforcement capacity to pursue the scale of abusive communications, what the commission called a “persistent cultural tolerance” of online abuse, and difficulties in striking a balance between protecting people from harm and maintaining rights of freedom of expression—a debate that we keep coming to in Committee and one that is still raging today. Reform of the legislation governing harmful online communications is welcome—that is the first thing to say—but the points laid out by the Law Commission in 2018 still require attention if the new offences are to result in the reduction of harm.

My hon. Friend the Member for Batley and Spen spoke about the limited definition of harm, which relates to psychological harm but does not protect against all harms resulting from messages received online, including those that are physical. We also heard from the hon. Member for Ochil and South Perthshire about the importance of including an offence of encouraging or assisting self-harm, which we debated last week with schedule 7. I hope that the Minister will continue to consider the merits of new clause 36 when the time comes to vote on it.

Those are important improvements about what should constitute an offence, but we share the concerns of the sector about the extent to which the new offences will result in prosecution. The threshold for committing one of the offences in clause 150 is high. When someone sends the message, there must be

“a real and substantial risk that it would cause harm to a likely audience”,

and they must have

“no reasonable excuse for sending the message.”

The first problem is that the threshold of having to prove the intention to cause distress is an evidential threshold. Finding evidence to prove intent is notoriously difficult. Professor Clare McGlynn’s oral evidence to the Committee was clear:

“We know from the offence of non-consensual sending of sexual images that it is that threshold that limits prosecutions, but we are repeating that mistake here with this offence.”

Professor McGlynn highlighted the story of Gaia Pope. With your permission, Ms Rees, I will make brief reference to it, in citing the evidence given to the Committee. In the past few weeks, it has emerged that shortly before Gaia Pope went missing, she was sent indecent images through Facebook, which triggered post-traumatic stress disorder from a previous rape. Professor McGlynn said:

“We do not know why that man sent her those images, and I guess my question would be: does it actually matter why he sent them? Unfortunately, the Bill says that why he sent them does matter, despite the harm it caused, because it would only be a criminal offence if it could be proved that he sent them with the intention of causing distress or for sexual gratification and being reckless about causing distress.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 58, Q105.]

The communications offences should be grounded upon consent rather than the motivation of the perpetrator. That is a clear omission in the Bill, which my hon. Friend the Member for Pontypridd will speak more about in relation to our amendments 41 and 42 to clause 156. The Government must act or risk missing a critical opportunity to tackle the harms resulting from communications offences.

We then come to the problem of the “reasonable excuse” defence and the “public interest” defence. Clause 150(5) sets out that the court must consider

“whether the message is, or is intended to be, a contribution to a matter of public interest”.

The wording in the clause states that this should not “determine the point”. If that is the case, why does the provision exist? Does the Minister recognise that there is a risk of the provision being abused? In response to a question from the hon. Member for Aberdeen North, the Minister previously said:

“Clause 150…does not give a get-out-of-jail-free card”.––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 275.]

Could he lay out what the purpose of this “matter of public interest” defence is? Combined with the reasonable excuse defence in subsection (1), the provisions risk sending the wrong message when it comes to balancing harms, particularly those experienced by women, of which we have already heard some awful examples.

There is a difference in the threshold of harm between clause 150, on harmful communications offences, and clause 151, on false communications offences. To constitute a false communications offence, the message sender must have

“intended the message, or the information in it, to cause non-trivial psychological or physical harm to a likely audience”.

To constitute a harmful communications offence, the message sender must have

“intended to cause harm to a likely audience”

and there must have been

“a real and substantial risk that it would cause harm to a likely audience”.

Will the Minister set out the Government’s reasoning for that distinction? We need to get these clauses right because people have been let down by inadequate legislation and enforcement on harmful online communications offences for far too long.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me start by saying that many of these clauses have been developed in careful consultation with the Law Commission, which has taken a great deal of time to research and develop policy in this area. It is obviously quite a delicate area, and it is important to make sure that we get it right.

The Law Commission is the expert in this kind of thing, and it is right that the Government commissioned it, some years ago, to work on these provisions, and that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as would be expected. The clauses replace previous offences—for example, those in the Malicious Communications Act 1988—and update and improve those provisions in the form we see them in the Bill.

The shadow Minister, the hon. Member for Worsley and Eccles South, asked a number of questions about the drafting of the clauses and the thresholds that have to be met for an offence to be committed. We are trying to strike a balance between criminalising communications that deserve to be criminalised and not criminalising those that people would consider fall below the criminal threshold. We do not want to infringe free speech by going too far and subjecting legitimate criticism and debate to criminal sanctions. The balance to be struck is between public protection and where the criminal law sits, on the one hand, and free speech and people expressing themselves, on the other. That is why clause 150 is constructed as it is, on the advice of the Law Commission.

As the hon. Member set out, the offence is committed only where there is a “real and substantial risk” that the likely audience would suffer harm. Harm is defined as

“psychological harm amounting to at least serious distress.”

Serious distress is quite a high threshold—it is a significant thing, not something trivial. It is important to make that clear.

The second limb is that there is an intention to cause harm. Intention can in some circumstances be difficult to prove, but some acts are so obviously malicious that there can be no conceivable motivation or intention other than to cause harm. In those cases, establishing intent is not difficult.

In a number of specific areas, such as intimate image abuse, my right hon. Friend the Member for Basingstoke and others have powerfully suggested that establishing intent is an unreasonably high threshold, and that the bar should be set simply at consent. For the intimate image abuse offence, the bar is set at the consent level, not at intent. That is being worked through by the Law Commission and the Ministry of Justice, and I hope that it will be brought forward as soon as possible, in the same way as the epilepsy trolling offence that we discussed a short while ago. That work on intimate image abuse is under way, and consent, not intent, is the test.

For the generality of communications—the clause covers any communications; it is incredibly broad in scope—it is reasonable to have the intent test to avoid criminalising what people would consider to be an exercise of free speech. That is a balance that we have tried to strike. The intention behind the appalling communications that we have heard in evidence and elsewhere is clear: it is inconceivable that there was any motivation or intention other than to cause harm.

There are some defences—well, not defences, but conditions to be met—in clause 150(1)(c). The person must have “no reasonable excuse”. Subsection (5) makes it clear that

“In deciding whether a person has a reasonable excuse…one of the factors that a court must consider (if it is relevant in a particular case) is whether the message is, or is intended to be, a contribution to a matter of public interest (but that does not determine the point)”

of whether there is a reasonable excuse—it simply has to be taken into account by the court and balanced against the other considerations. That qualification has been put in for reasons of free speech.

There is a delicate balance to strike between criminalising what should be criminal and, at the same time, allowing reasonable free speech. There is a line to draw, and that is not easy, but I hope that, through my comments and the drafting of the clause, the Committee will see that that line has been drawn and a balance struck in a carefully calibrated way. I acknowledge that the matter is not straightforward, but we have addressed it with advice from the Law Commission, which is expert in this area. I commend clause 150 to the Committee.

The other clauses in this group are a little less contentious. Clause 151 sets out a new false communication offence, and I think it is pretty self-explanatory as drafted. The threatening communications offence in clause 152 is also fairly self-explanatory—the terms are pretty clear. Clause 153 contains interpretative provisions. Clause 154 sets out the extra-territorial application, and clause 155 sets out the liability of corporate officers. Clause 157 repeals some of the old offences that the new provisions replace.

Those clauses—apart from clause 150—are all relatively straightforward. I hope that, in following the Law Commission’s advice, we have struck a carefully calibrated balance in the right place.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I would like to take the Minister back to the question I asked about the public interest defence. There is a great deal of concern that a lot of the overlaying elements create loopholes. He did not answer specifically the question of the public interest defence, which, combined with the reasonable excuse defence, sends the wrong message.

Chris Philp Portrait Chris Philp
- Hansard - -

The two work together. On the reasonable excuse condition, for the offence to have been committed, it has to be established that there was no reasonable excuse. The matter of public interest condition—I think the hon. Lady is referring to subsection (5)—simply illustrates one of the ways in which a reasonable excuse can be established, but, as I said in my remarks, it is not determinative. It does not mean that someone can say, “There is public interest in what I am saying,” and automatically have a reasonable excuse—it does not work like that. That is why in brackets at the end of subsection (5) it says

“but that does not determine the point”.

That means that if a public interest argument was mounted, a magistrate or a jury, in deciding whether the condition in subsection (1)(c)—the “no reasonable excuse” condition—had been met, would balance the public interest argument, but it would not be determinative. A balancing exercise would be performed. I hope that provides some clarity about the way that will operate in practice.

--- Later in debate ---
Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

We have argued that changes to the legislation are long overdue to protect people from the harms caused by online communications offences. The clause and schedule 13 include necessary amendments to the legislation, so we do not oppose them standing part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

The clause cross-references schedule 13, which contains amendments to existing legislation consequential on the communications offences in part 10. Divided broadly into two parts, the schedule makes various changes to the Sexual Offences Act 2003, amends the Regulatory Enforcement and Sanctions Act 2008 in relation to the Malicious Communications Act 1988, and makes various other changes, all flowing from the clauses we have just debated. I therefore commend clause 158 and its associated schedule 13 to the Committee.

Question put and agreed to.

Clause 158 accordingly ordered to stand part of the Bill.

Schedule 13 agreed to.

Clause 159

Providers that are not legal persons

Question proposed, That the clause stand part of the Bill.

Online Safety Bill (Eleventh sitting)

Debate between Chris Philp and Baroness Keeley
Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Before the Minister gets past this point—I think he has reached the point of my question—the fees do not kick in for two years. The figure is £88 million, but the point I was making is that the scope of the Bill has already increased. I asked about this during the evidence session with Ofcom. Fraudulent advertising was not included before, so there are already additional powers for Ofcom that need to be funded. I was questioning whether the original estimate will be enough for those two years.

Chris Philp Portrait Chris Philp
- Hansard - -

I assume that the hon. Lady is asking about the £88 million.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

indicated assent.

Chris Philp Portrait Chris Philp
- Hansard - -

That covers the preparatory work rather than the actual enforcement work that will follow. For the time being, we believe that it is enough, but of course we always maintain an active dialogue with Ofcom.

Finally, there was a question from my right hon. Friend the Member for Basingstoke, who asked how victims will be supported and compensated. As she said, Ofcom will always pay attention to victims in its work, but we should make it clear that the fees we are debating in these clauses are designed to cover only Ofcom’s costs, not those of third parties. The costs of victim support are funded separately via the Ministry of Justice, which leads in this area. I believe that a victims Bill is being prepared that will significantly enhance the protections and rights of victims—something that I am sure all of us will support.

Question put and agreed to.

Clause 70 accordingly ordered to stand part of the Bill.

Clauses 71 to 76 ordered to stand part of the Bill.

Clause 77

General duties of OFCOM under section 3 of the Communications Act

Question proposed, That the clause stand part of the Bill.

Online Safety Bill (Twelfth sitting)

Debate between Chris Philp and Baroness Keeley
Chris Philp Portrait Chris Philp
- Hansard - -

I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.

Researchers’ access to information is covered in clause 136, which the amendments seek to amend. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. In casual conversation it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and of the confidentiality obligations that may apply.

Baroness Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

This is all sorted in the health environment because of the personal data involved—there is no data more personal than health data—and a trusted and safe environment has been created for researchers to access personal data.

Chris Philp Portrait Chris Philp
- Hansard - -

This data is a little different—the two domains do not directly correspond. Even in the health area, which is long established, there is uncertainty and recent, or perhaps even current, litigation: an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules.

We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.

--- Later in debate ---
Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.

I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.

In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:

“You do not have the figures, so you cannot tell me.”

Richard Earley replied:

“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”

Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.

The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:

“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]

But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.

There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

Platforms, in particular those supporting user-to-user generated content, employ those services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”

One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is not within the ambit of the regulator because it is part of a supply chain presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY, supported by an entirely separate company, would argue that it was providing a business-to-business service that is not user-generated content but content designed and delivered at arm’s length and provided to the user-to-user service to deploy for its users.

I suggest that dealing with this issue would involve a lengthy, costly and unhelpful legal process, during which systems were not being effectively regulated and complex contract law was invoked—the same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel’s case.

We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for or on the company’s behalf is found culpable for specific actions. These issues on supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.

Chris Philp Portrait Chris Philp
- Hansard - -

May I first say a brief word about clause stand part, Sir Roger?

--- Later in debate ---
Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The Minister has not commented on the problem I raised of the contracted firm in the supply chain not being covered by the regulations under the Bill—the problem of Twitter and the GIFs, whereby the GIFs exist and are used on Twitter, but Twitter says, “We’re not responsible for them; it’s that firm over there.” That is the same thing, and new clause 11 would cover both.

Chris Philp Portrait Chris Philp
- Hansard - -

I am answering slightly off the cuff, but I think the point the hon. Lady is raising—about where some potentially offensive or illegal content is produced on one service and then propagated or made available by another—is one we debated a few days ago. I think the hon. Member for Aberdeen North raised that question, last week or possibly the week before. I cannot immediately turn to the relevant clause—it will be found in Hansard, in our early discussions on the beginning of the Bill—but I think the Bill makes it clear that where content is accessed through another platform, as in the example that the hon. Member for Worsley and Eccles South just gave, the platform through which the content is made available is within the scope of the Bill.

Question put, That the amendment be made.

Online Safety Bill (Ninth sitting)

Debate between Chris Philp and Baroness Keeley
Chris Philp Portrait Chris Philp
- Hansard - -

Given that the clause is clearly uncontentious, I will be extremely brief.

Online Safety Bill (Tenth sitting)

Debate between Chris Philp and Baroness Keeley
Committee stage
Tuesday 14th June 2022

(2 years, 5 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 View all Online Safety Act 2023 Debates Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 - (14 Jun 2022)
Chris Philp Portrait Chris Philp
- Hansard - -

Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.

Question put and agreed to.

Clause 58 accordingly ordered to stand part of the Bill.

Clause 59

Requirement to report CSEA content to the NCA

Question proposed, That the clause stand part of the Bill.

Baroness Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

You are really moving us at pace, Sir Roger. It is a pleasure to serve in Committee with you in the Chair.

It is welcome that regulated services will have to report all child sexual exploitation and abuse material that they detect on their platform. The Government’s decision to move away from the approach of a regulatory code of practice to a mandatory reporting requirement is an important improvement to the draft Bill.

For companies to report child sexual exploitation and abuse material correctly to the mandatory reporting body, they will need access to accurate datasets that determine whether something that they intend to report is child sexual exploitation and abuse content. What guidance will be made available to companies so that they can proactively detect CSEA, and what plans are in place to assist companies to identify potential CSEA that has not previously been identified?

The impact assessment mentions that, for example, BT is planning to use the Internet Watch Foundation’s hash list, which is compliant with UK law enforcement standards, to identify CSEA proactively. Hashing is a technology used to prevent access to known CSEA: a hash is a unique string of letters and numbers derived from an image, which can then be matched every time a user attempts to upload a known illegal image to a platform. It relies, however, on the CSEA already having been detected. What plans are in place to assist companies to identify CSEA that has not yet been detected?
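[The hash-matching process described here can be sketched as follows. This is a minimal illustration only: the hash list below is hypothetical, and real deployments, such as the IWF list the impact assessment mentions, use perceptual hashes (for example PhotoDNA) that tolerate resizing and re-encoding, rather than the exact cryptographic hash used here.]

```python
import hashlib

# Hypothetical list of hashes of known illegal images. Real lists are
# distributed to platforms under strict controls by bodies such as the IWF.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_of(image_bytes: bytes) -> str:
    """Derive the unique string of letters and numbers for an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_illegal(image_bytes: bytes) -> bool:
    """Check an attempted upload against the known-image hash list."""
    return hash_of(image_bytes) in KNOWN_HASHES

# The key limitation noted above: only images already on the list are
# caught; a new, previously undetected image passes the check.
blocked = is_known_illegal(b"test")    # bytes whose hash is on the list
allowed = is_known_illegal(b"other")   # unknown bytes: no match
```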

Finally, it is important that the introduction of mandatory reporting does not impact on existing international reporting structures. Many of the largest platforms in the scope of the Bill are US-based and are required under US law to report CSEA material detected on their platforms to the National Center for Missing & Exploited Children, which ensures that information relevant to UK law enforcement is referred on for investigation.

Chris Philp Portrait Chris Philp
- Hansard - -

To answer the shadow Minister’s question about the duty to detect CSEA proactively—because, as she says, we have to detect it before we can report it—I confirm that the Bill already contains duties to prevent and detect CSEA proactively: CSEA is a priority offence in the schedule 6 list of child exploitation and abuse offences, and companies have a duty to prevent those offences proactively. In doing so, they will by definition identify them. That part of her question is well covered.

The hon. Lady also asked about the technologies available to those companies, including hash matching—comparing images against a known database of child sexual exploitation images. A lot of technology is being developed that can proactively spot child sexual exploitation in new images that are not in the hash-matching database. For example, some technology combines age identification with nude-image identification; by putting the two together, we can identify sexual exploitation of children in images that are new and not yet in the database.

To ensure that such new technology can be used, we have the duties under clause 103, which gives Ofcom the power to mandate—to require—the use of certain accredited technologies in fighting not just CSEA, but terrorism. I am sure that we will discuss that more when we come to that clause. Combined, the requirement to proactively prevent CSEA and the ability to specify technology under clause 103 will mean that companies will know about the content that they now, under clause 59, have to report to the National Crime Agency. Interestingly, the hon. Member for Worsley and Eccles South mentioned that that duty already exists in the USA, so it is good that we are matching that requirement in our law via clause 59, which I hope that the Committee will agree should stand part of the Bill.

Question put and agreed to.

Clause 59 accordingly ordered to stand part of the Bill.

Clause 60

Regulations about reports to the NCA

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

The hon. Member for Worsley and Eccles South asks about the prioritisation of reports made to the NCA under the new statutory provisions. The prioritisation of investigations is an operational matter for the NCA, acting as a law enforcement body. I do not think it would be right for me as a Minister, or for Parliament as a legislative body, to specify how the NCA should conduct its operational activities. I imagine that it would pursue the most serious cases as a matter of priority, and that if there were evidence of any systemic abuse it would prioritise that too, but it really is a matter for the NCA, as an operationally independent police force, to decide for itself. The scope of matters to be contained in these regulations is, I think, clearly comprehensive, as one would expect.

On the questions raised by the hon. Member for Aberdeen North, the Secretary of State might consult Scottish Ministers under clause 63(6)(c), particularly those with responsibility for law enforcement in Scotland, and the same would apply to other jurisdictions. On whether an amendment is required to cover any matters to do with the procedures in Scotland equivalent to the matter covered in clause 61, we do not believe that any equivalent change is required to devolved Administration law. However, in order to be absolutely sure, we will get the hon. Lady written confirmation on that point.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I am not sure that the Minister has answered my question on clause 60. I think we all agree that law enforcement agencies can decide their own priorities, quite rightly, but clause 60(2)(d) sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

I asked the Minister what that would look like.

Also, we think it is pretty important that the National Crime Agency, the Internet Watch Foundation and Ofcom work together on mandatory reporting. I asked him how he envisaged them working together to share information, because the better they do that, the more children are protected.

Chris Philp Portrait Chris Philp
- Hansard - -

I apologise for missing those two points. On working together, the hon. Lady is right that agencies such as the Internet Watch Foundation should co-operate closely with law enforcement and others—indeed, they already seem to be well networked together and collaborating well. It is appropriate to put on the record that Parliament, through this Committee, thinks that that co-operation should continue. That communication, and the sharing of information on particular images, is obviously critical.

As the clause states, the regulations can set out expedited timeframes in cases of particular urgency. I understand that to mean cases where there might be an immediate risk to a child’s safety, or where somebody might be at risk in real time, as opposed to something historic—for example, an image that might have been made some time ago. In cases where it is believed abuse is happening at the present time, there is an expectation that the matter will be dealt with immediately or very close to immediately. I hope that answers the shadow Minister’s questions.

Question put and agreed to.

Clause 60 accordingly ordered to stand part of the Bill.

Clause 61 ordered to stand part of the Bill.

Clause 62

Offence in relation to CSEA reporting

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes

“the place where the content was published, generated, uploaded or shared.”

The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

What the Minister has said is helpful, but the question I asked was about what guidance and support will be made available to regulated services. We all want this to work, because it is one of the most important aspects of the Bill—many aspects are important. He made it clear to us that the definitions are quite wide, both in general and for “UK-linked” content. The point of the question was, given the possible difficulties in some circumstances, what guidance and support will be made available?

Chris Philp Portrait Chris Philp
- Hansard - -

I anticipate that the National Crime Agency will issue best practice guidance. A fair amount of information about the requirements will also be set out in the regulations that the Secretary of State will issue under clause 60, which we have already debated. So it is a combination of those regulations and National Crime Agency best practice guidance. I hope that answers the question.

Finally, on companies being taken over, if a company ceases to be UK-linked, we would expect it to continue to discharge its reporting duties, which might include reporting not just in the UK but to its domestic reporting agency—we have already heard the US agency described and referenced.

I hope that my answers demonstrate that the clause is intended to be comprehensive and effective. It should ensure that the National Crime Agency gets all the information it needs to investigate and prosecute CSEA in order to keep our children safe.

Question put and agreed to.

Clause 62, as amended, accordingly ordered to stand part of the Bill.

Clause 63 ordered to stand part of the Bill.

Clause 64

Transparency reports about certain Part 3 services

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move amendment 54, in clause 64, page 56, line 29, leave out “Once” and insert “Twice”.

This amendment would change the requirement for transparency report notices from once a year to twice a year.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

To start with, it is worth saying that clause 64 is extremely important. In debating earlier clauses, Opposition Members rightly and repeatedly emphasised how important it is that social media platforms are compelled to publish information. The testimony that Frances Haugen gave to the Joint Committee, and to this Committee a few weeks ago, demonstrates why. Social media platforms are secretive, not open; they seek to disguise what is going on, even though what they do has a global effect. The transparency power in clause 64 is therefore a critical part of the Bill: it will dramatically transform the insights available to parliamentarians, the wider public, civil society campaigners and academics, and open up the sense of what is going on inside these companies.

Amendment 54 seeks to increase the frequency of transparency reporting from once a year to twice a year. We do not want to require reporting unreasonably frequently, and our sense is that once a year, rather than twice a year, is the right frequency, so we do not support the amendment. However, Members will notice that clause 64(12) gives the Secretary of State the power, by regulation, to

“amend subsection (1) so as to change the frequency of the transparency reporting process.”

If it turns out in due course that once a year is not enough and we would like to do it more frequently—for example, twice a year—there is the power for those regulations to be used so that the reporting occurs more frequently. The frequency is not set in stone.

I turn to amendment 55, which sets out a number of topics that would be included in reporting. It is important to say that, as a quick glance at schedule 8 shows, the remit of the reports is already extremely wide in scope. Hon. Members will see that paragraph 5 specifies that reports can cover

“systems and processes for users to report content which they consider to be illegal”

or “harmful”, and so on. Paragraph 6 mentions:

“The systems and processes that a provider operates to deal with illegal content, content that is harmful to children”,

and so on. Therefore, the topics that amendment 55 speaks to are already covered by the schedule, and I would expect such things to be reported on. We have given Ofcom the explicit powers to do that and, rather than prescribe such details in the Bill, we should let Ofcom do its job. It certainly has the powers to do such things—that is clearly set out in the schedule—and I would expect, and obviously the Opposition would expect, that it will do so. On that basis, I will gently resist amendments 54 and 55.

Barbara Keeley

On amendment 55, I want to come back to the Minister on two points about languages that were made by the hon. Member for Aberdeen North. I think most people would be shocked to discover that safety systems do not operate in every language, so people speaking a language other than English will not be protected. I also think that people will be shocked about, as I outlined, the employment of moderators and how badly they are paid and trained. There are factories full of people doing that important task.

I recommend that the Minister thinks again about requiring Ofcom to provide details on human moderators who are employed or engaged and how they are trained and supported. It is a bit like when we find out about factories producing various items under appalling conditions in other parts of the world—we need transparency on these issues to make people do something about it. These platforms will not do anything about it. Under questioning from my hon. Friend the Member for Pontypridd, Richard Earley admitted that he had no idea how many human moderators were working for Facebook. That is appalling and we must do something about it.

Chris Philp

I obviously have sympathy with the objectives, but the topics covered in schedule 8, which include the systems and processes for responding to illegal and harmful content and so on, give Ofcom the power to do what the hon. Member requires. On the language point, the risk assessments that companies are required to do are hard-edged duties in the Bill, and they will have to include an assessment of languages used in the UK, which is a large number of languages—obviously, it does not include languages spoken outside the UK. So the duty to risk-assess languages already exists. I hope that gives the hon. Member reassurance. She is making a reasonable point, and I would expect that, in setting out transparency requirements, Ofcom will address it. I am sure that it will look at our proceedings to hear Parliament’s expectations, and we are giving it those powers, which are clearly set out in schedule 8.

Barbara Keeley

I will just make a final point. The Bill gives Ofcom powers when it already has so much to do. We keep returning to the point of how much will ride on Ofcom’s decisions. Our amendments would make clear the requirement for transparency reporting relating to the language issue, as well as the employment of human moderators and how they are trained and supported. If we do not point that out to Ofcom, it really has enough other things to be doing, so we are asking for these points to be drawn out specifically. As in so many of our amendments, we are just asking for things to be drawn out so that they happen.

Question put, That the amendment be made.

Online Safety Bill (Seventh sitting)

Debate between Chris Philp and Baroness Keeley
Chris Philp

Let me develop the point before I give way. Our first line of defence is Ofcom enforcing the clause, but we have a couple of layers of additional defence. One of those is the super-complaints mechanism, which I have mentioned before. If a particular group of people, represented by a body such as the NSPCC, feel that their legitimate complaints are being infringed systemically by the social media platform, and that Ofcom is failing to take the appropriate action, they can raise that as a super-complaint to ensure that the matter is dealt with.

Barbara Keeley

Will the Minister give way?

Chris Philp

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

Chris Philp

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

Barbara Keeley

This is the point about individual redress again: by talking about super-complaints, the Minister seems to be agreeing that it is not there. As I said earlier, for super-complaints to be made to Ofcom, the issue has to be of particular importance or to impact a particularly large number of users, but that does not help the individual. We know how much individuals are damaged; there must be a system of external redress. The point about internal complaints systems is that we know that they are not very good, and we require a big culture change to change them, but unless there is some mechanism thereafter, I cannot see how we are giving the individual any redress—it is certainly not through the super-complaints procedure.

Chris Philp

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

--- Later in debate ---
Chris Philp

Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.

When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.

Barbara Keeley

Will the Minister give way?

Chris Philp

I was about to sit down, but of course I will give way.

Barbara Keeley

The Minister said that the Opposition had not tabled an amendment to bring in an ombudsman.

Chris Philp

On this clause.

Barbara Keeley

On this clause. What we have done, however—we are debating it now—is to table a new clause to require a report on redress for individual complaints. The Minister talks about clause 149 and a process that will kick in between two and five years away, but we have a horrendous problem at the moment. I and various others have described the situation as the wild west, and very many people—thousands, if not millions, of individuals—are being failed very badly. I do not see why he is resisting our proposal for a report within six months of the commencement of the Act, which would enable us to start to see at that stage, not two to five years down the road, how these systems—he is putting a lot of faith in them—were turning out. I think that is a very sound idea, and it would help us to move forward.

Chris Philp

The third line of defence—the super-complaint process—is available immediately, as I set out a moment ago. In relation to new clause 1, which the hon. Lady mentioned a moment ago, I think six months is very soon for a Bill of this magnitude. The two-to-five-year timetable under the existing review mechanism in clause 149 is appropriate.

Although we are not debating clause 149, I hope, Ms Rees, that you will forgive me for speaking about it for a moment. If Members turn to pages 125 and 126 and look at the matters covered by the review, they will see that they are extraordinarily comprehensive. In effect, the review covers the implementation of all aspects of the Bill, including the need to minimise the harms to individuals and the enforcement and information-gathering powers. It covers everything that Committee members would want to be reviewed. No doubt as we go through the Bill we will have, as we often do in Bill Committee proceedings, a number of occasions on which somebody tables an amendment to require a review of x, y or z. This is the second such occasion so far, I think, and there may be others. It is much better to have a comprehensive review, as the Bill does via the provisions in clause 149.

Question put and agreed to.

Clause 18 accordingly ordered to stand part of the Bill.

Clause 19

Duties about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

Chris Philp

Clause 19, on user-to-user services, and its associated clause 29, which relates to search services, specify a number of duties in relation to freedom of expression and privacy. In carrying out their safety duties, in-scope companies will be required by clause 19(2) to have regard to the importance of protecting users’ freedom of expression and privacy.

Let me pause for a moment on this issue. There has been some external commentary about the Bill’s impact on freedom of expression. We have already seen, via our discussion of a previous clause, that there is nothing in the Bill that compels the censorship of speech that is legal and not harmful to children. I put on the record again the fact that nothing in the Bill requires the censorship of legal speech that poses no harm to children.

We are going even further than that. As far as I am aware, for the first time ever there will be a duty on social media companies, via clause 19(2), to have regard to freedom of speech. There is currently no legal duty at all on platforms to have regard to freedom of speech. The clause establishes, for the first time, an obligation to have regard to freedom of speech. It is critical that not only Committee members but others more widely who consider the Bill should bear that carefully in mind. Besides that, the clause speaks to the right to privacy. Existing laws already speak to that, but the clause puts it in this Bill as well. Both duties are extremely important.

In addition, category 1 service providers—the really big ones—will need proactively to assess the impact of their policies on freedom of expression and privacy. I hope all Committee members will strongly welcome the important provisions I have outlined.

Barbara Keeley

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is the fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike with the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set expectations about whether risk assessments are of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient or that reach an inappropriate or unreasonable assessment of how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that end-to-end encryption is so important for free expression and privacy that it justifies any adverse impact on users' safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative connotations.

Kirsty Blackman

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

--- Later in debate ---
Chris Philp

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

Barbara Keeley

Is the Minister coming on to say that he is accepting what we are saying here?

Chris Philp

No, is the short answer. I was just mentioning in passing that there is that drafting issue.

On the principle, it is worth being very clear that, when it comes to content or matters that are illegal, that applies to all platforms, regardless of size, where children are at all at risk. In schedule 6, we set out a number of matters—child sexual exploitation and abuse, for example—as priority offences that all platforms have to protect children from proactively, regardless of scale.

--- Later in debate ---
Barbara Keeley

The Minister has not addressed the points I raised. I specifically raised—he has not touched on this—harmful pro-anorexia blogs, which we know are dangerous but are not in scope, and games that children access that increase gambling addiction. He says that there is separate legislation for gambling addiction, but families have lost thousands of pounds through children playing games linked to gambling addiction. There are a number of other services that do not affect an appreciable number of children, and the drafting causes them to be out of scope.

Chris Philp

rose—[Interruption.]

Online Safety Bill (Eighth sitting)

Debate between Chris Philp and Baroness Keeley
Committee stage
Thursday 9th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022
Chris Philp

I thank the hon. Member for Aberdeen North for her latter remarks. We made an important addition to the Bill after listening to parliamentarians across the House and to the Joint Committee, which many people served on with distinction. I am delighted that we have been able to make that significant move. We have heard a lot about how fraudulent advertising can affect people terribly, particularly more vulnerable people, so that is an important addition.

Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already the case, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets or are the only target market. Given the reference to "target markets" in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.

New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government are going to work through the online advertising programme to tackle these sorts of issues, which are important. The shadow Minister is right to raise them, but they will be tackled holistically by the online advertising programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where those are best addressed.

New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.

Barbara Keeley

We are going to press amendments 23 and 24 to a vote because they are very important. I cited the example of earlier legislation that considered it important, in relation to selling tickets, to include the wording “anywhere in the world”. We know that ticket abuses happen with organisations in different parts of the world.

Chris Philp

The hon. Lady is perfectly entitled to press to a vote whatever amendments she sees fit, but in relation to amendments 23 and 24, the words she asks for,

“where the UK is a target market”,

are already in the Bill, in clause 3(5)(b), on page 3, which sets out the definitions at the start. I will allow the hon. Lady a moment to look at where it states:

“United Kingdom users form one of the target markets for the service”.

That applies to user-to-user and to search, so it is covered already.

Barbara Keeley

The problem is that we are getting into the wording of the Bill. As with the child abuse clause that we discussed before lunch, there are limitations. Clause 3 states that a service has links with the United Kingdom if

“the service has a significant number of United Kingdom users”.

It does not matter if a person is one of 50, 100 or 1,000 people who get scammed by some organisation operating in another part of the world. Those behind the 2006 Bill dealing with the sale of Olympic tickets believed that was important, and we also believe it is important. We have to find a way of dealing with ticket touting and ticket abuse.

Turning to fraudulent advertising, I have given examples and been supported very well by the hon. Member for Aberdeen North. It is not right that vulnerable people are repeatedly taken in by search results, which is the case right now. The reason we have tabled all these amendments is that we are trying to protect vulnerable people, as with every other part of the Bill.

Chris Philp

That is of course our objective as well, but let me just return to the question of the definitions. The hon. Lady is right that clause 3(5)(a) says

“a significant number of United Kingdom users”,

but paragraph (b) just says,

“United Kingdom users form one of the target markets”.

There is no significant number qualification in paragraph (b), and to put it beyond doubt, clause 166(1) makes it clear that service providers based outside the United Kingdom are within the scope of the Bill. To reiterate the point, where the UK is a target market, there is no size qualification: the service provider is in scope, even if there is only one user.

Barbara Keeley

Does the Minister want to say anything about the other points I made about advertisements?

Chris Philp

Not beyond the points I made previously, no.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp

As we have heard already, these clauses are very important because they protect people from online fraudulent advertisements for the first time—something that the whole House quite rightly called for. As the shadow Minister said, the Government heard Parliament’s views on Second Reading, and the fact that the duties in clause 35 were not as strongly worded as those in clause 34 was recognised. The Government heard what Members said on Second Reading and tabled Government amendments 91 to 94, which make the duties on search firms in clause 35 as strong as those on user-to-user firms in clause 34. Opposition amendment 45 would essentially do the same thing, so I hope we can adopt Government amendments 91 to 94 without needing to move amendment 45. It would do exactly the same thing—we are in happy agreement on that point.

I listened carefully to what the shadow Minister said on amendment 44. The example she gave at the end of her speech—the poor lady who was induced into sending money, which she thought was being sent to pay off creditors but was, in fact, stolen—would, of course, be covered by the Bill as drafted, because it would count as an act of fraud.

The hon. Lady also talked about some other areas that were not fraud, such as unfair practices, misleading statements or statements that were confusing, which are clearly different from fraud. The purpose of clause 35 is to tackle fraud. Those other matters are, as she says, covered by the Consumer Protection from Unfair Trading Regulations 2008, which are overseen and administered by the Competition and Markets Authority. While matters to do with unfair, misleading or confusing content are serious—I do not seek to minimise their importance—they are overseen by a different regulator and, therefore, better handled by the CMA under its existing regulations.

If we introduced this extra offence to the list in clause 36, we would end up with a degree of regulatory overlap and confusion, because there would be two regulators involved. For that reason, and because those other matters—unfair, misleading and confusing advertisements—are different to fraud, I ask that the Opposition withdraw amendment 44 and, perhaps, take it up on another occasion when the CMA's activities are in the scope of the debate.

Barbara Keeley

No, we want to press this amendment to a vote. I have had further comment from the organisations that I quoted. They believe that we do need the amendment because it is important to stop harmful ads going up in the first place. They believe that strengthened provisions are needed for that. Guidance just puts the onus for protecting consumers on the other regulatory regimes that the Minister talked about. The view of organisations such as StepChange is that those regimes—the Advertising Standards Authority regime—are not particularly strong.

The regulatory framework for financial promotions is fragmented. FCA-regulated firms are clearly under much stronger obligations than those that fall outside FCA regulation. I believe that it would be better to accept the amendment, which would oblige search engines and social media giants to prevent harmful and deceptive ads from appearing in the first place. The Minister really needs to take on board the fact that in this patchwork, this fragmented world of different regulatory systems, some of the existing systems are clearly failing badly, and the strong view of expert organisations is that the amendment is necessary.

Question put and agreed to.

Clause 34 accordingly ordered to stand part of the Bill.

Clause 35

Duties about fraudulent advertising: Category 2A services

Amendments made: 91, in clause 35, page 34, line 3, leave out from “to” to end of line 5 and insert—

“(a) prevent individuals from encountering content consisting of fraudulent advertisements in or via search results of the service;

(b) if any such content may be encountered in or via search results of the service, minimise the length of time that that is the case;

(c) where the provider is alerted by a person to the fact that such content may be so encountered, or becomes aware of that fact in any other way, swiftly ensure that individuals are no longer able to encounter such content in or via search results of the service.”

This amendment alters the duty imposed on providers of Category 2A services relating to content consisting of fraudulent advertisements so that it is in line with the corresponding duty imposed on providers of Category 1 services by clause 34(1).

Amendment 92, in clause 35, page 34, line 16, leave out “reference” and insert “references”.

This amendment is consequential on Amendment 91.

Amendment 93, in clause 35, page 34, line 18, leave out “is a reference” and insert “are references”.

This amendment is consequential on Amendment 91.

Amendment 94, in clause 35, page 34, line 22, leave out

“does not include a reference”

and insert “do not include references”.—(Chris Philp.)

This amendment is consequential on Amendment 91.

Clause 35, as amended, ordered to stand part of the Bill.

Clause 36

Fraud etc offences

Amendment proposed: 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”—(Barbara Keeley.)

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Question put, That the amendment be made.

Online Safety Bill (Sixth sitting)

Debate between Chris Philp and Baroness Keeley
Barbara Keeley (Worsley and Eccles South) (Lab)

As this is the first time I have spoken in the Committee, may I say that it is a pleasure to serve with you in the Chair, Ms Rees? I agree with my hon. Friend the Member for Pontypridd that we are committed to improving the Bill, despite the fact that we have some reservations, which we share with many organisations, about some of the structure of the Bill and some of its provisions. As my hon. Friend has detailed, there are particular improvements to be made to strengthen the protection of children online, and I think the Committee’s debate on this section is proving fruitful.

Amendment 28 is a good example of where we must go further if we are to achieve the goal of the Bill and protect children from harm online. The amendment seeks to require regulated services to assess their level of risk based, in part, on the frequency with which they are blocking, detecting and removing child sexual exploitation and abuse content from their platforms. By doing so, we will be able to ascertain the reality of their overall risk and the effectiveness of their existing response.

The addition of livestreamed child sexual exploitation and abuse content not only acknowledges first-generation CSEA content, but recognises that livestreamed CSEA content happens on both public and private channels, and that they require different methods of detection.

Furthermore, amendment 28 details the practical information needed to assess whether the action being taken by a regulated service is adequate in countering the production and dissemination of CSEA content, in particular first-generation CSEA content. Separating the rates of terminated livestreams of CSEA in public and private channels is important, because those rates may vary widely depending on how CSEA content is generated. By specifying tools, strategies and interventions, the amendment would ensure that the systems in place to detect and report CSEA are adequate, and that is why we would like it to be part of the Bill.

Chris Philp

The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.

Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the

“risk of individuals who are users of the service encountering…each kind of priority illegal content”.

If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.

--- Later in debate ---
Barbara Keeley

I will speak to other amendments in this group as well as amendment 15. The success of the Bill’s regulatory framework relies on regulated companies carefully risk-assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations. However, up to now, boards and top executives have not taken the risk to children seriously. Services have either not considered producing risk assessments or, if they have done so, they have been of limited efficacy and failed to identify and respond to harms to children.

In evidence to the Joint Committee, Frances Haugen explained that many of the corporate structures involved are flat, and accountability for decision making can be obscure. At Meta, that means teams will focus only on delivering against key commercial metrics, not on safety. Children’s charities have also noted that corporate structures in the large technology platforms reward employees who move fast and break things. Those companies place incentives on increasing return on investment rather than child safety. An effective risk assessment and risk mitigation plan can impact on profit, which is why we have seen so little movement from companies to take the measures themselves without the duty being placed on them by legislation.

It is welcome that clause 10 introduces a duty to risk-assess user-to-user services that are likely to be accessed by children. But, as my hon. Friend the Member for Pontypridd said this morning, it will become an empty, tick-box exercise if the Bill does not also introduce the requirement for boards to review and approve the risk assessments.

The Joint Committee scrutinising the draft Bill recommended that the risk assessment be approved at board level. The Government rejected that recommendation on the grounds that Ofcom could include that in its guidance on producing risk assessments. As with much of the Bill, it is difficult to blindly accept promised safeguards when we have not seen the various codes of practice and guidance materials. The amendments would make sure that decisions about and awareness of child safety went right to the top of regulated companies. The requirement to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making and create accountability and responsibility at the most senior level of the organisation. That should trickle down the organisation and help embed a culture of compliance across it. Unless there is a commitment to child safety at the highest level of the organisation, we will not see the shift in attitude that is urgently needed to keep children safe, and which I believe every member of the Committee subscribes to.

On amendments 11 and 13, it is welcome that we have risk assessments for children included in the Bill, but the effectiveness of that duty will be undermined unless the risk assessments can be available for scrutiny by the public and charities. In the current version of the Bill, risk assessments will only be made available to the regulator, which we debated on an earlier clause. Companies will be incentivised to play down the likelihood of currently emerging risks because of the implications of having to mitigate against them, which may run counter to their business interests. Unless the risk assessments are published, there will be no way to hold regulated companies to account, nor will there be any way for companies to learn from one another’s best practice, which is a very desirable aim.

The current situation shows that companies are unwilling to share risk assessments even when requested. In October 2021, following the whistleblower disclosures made by Frances Haugen, the National Society for the Prevention of Cruelty to Children led a global coalition of 60 child protection organisations that urged Meta to publish its risk assessments, including its data privacy impact assessments, which are a legal requirement under data protection law. Meta refused to share any of its risk assessments, even in relation to child sexual abuse and grooming. The company argued that risk assessments were live documents and it would not be appropriate for it to share them with any organisation other than the Information Commissioner’s Office, to whom it has a legal duty to disclose. As a result, civil society organisations and the charities that I talked about continue to be in the dark about whether and how Meta has appropriately identified online risk to children.

Making risk assessments public would support the smooth running of the regime and ensure its broader effectiveness. Civil society and other interested groups would be able to assess and identify any areas where a company might not be meeting its safety duties and make full, effective use of the proposed super-complaints mechanism. It will also help civil society organisations to hold the regulated companies and the regulator, Ofcom, to account.

As we have seen from evidence sessions, civil society organisations are often at the forefront of understanding and monitoring the harms that are occurring to users. They have an in-depth understanding of what mitigations may be appropriate and they may be able to support the regulator to identify any obvious omissions. The success of the systemic risk assessment process will be significantly underpinned by and reliant upon the regulator's being able to rapidly and effectively identify new and emerging harms, and it is highly likely that the regulator will want to draw on civil society expertise to ensure that it has highly effective early warning functions in place.

However, civil society organisations will be hampered in that role if they remain unable to determine what, if anything, companies are doing to respond to online threats. If Ofcom is unable to rapidly identify new and emerging harms, the resulting delays could mean entire regulatory cycles where harms were not captured in risk profiles or company risk assessments, and an inevitable lag between harms being identified and companies being required to act upon them. It is therefore clear that there is a significant public value to publishing risk assessments.

Amendments 27 and 32 are almost identical to the suggested amendments to clause 8 that we discussed earlier. As my hon. Friend the Member for Pontypridd said in our discussion about amendments 25, 26 and 30, the duty to carry out a suitable and sufficient risk assessment could be significantly strengthened by preventing the creation of illegal content, not only preventing individuals from encountering it. I know the Minister responded to that point, but the Opposition did not think that response was fully satisfactory. This is just as important for children’s risk assessments as it is for illegal content risk assessments.

Online platforms are not just where abusive material is published. Sex offenders use mainstream web platforms and services as tools to commit child sexual abuse. This can be seen particularly in the livestreaming of child sexual exploitation. Sex offenders pay to direct and watch child sexual abuse in real time. The Philippines is a known hotspot for such abuse and the UK has been identified by police leads as the third-largest consumer of livestreamed abuse in the world. What a very sad statistic that our society is the third-largest consumer of livestreamed abuse in the world.

Ruby is a survivor of online sexual exploitation in the Philippines, although Ruby is not her real name; she recently addressed a group of MPs about her experiences. She told Members how she was trafficked into sexual exploitation aged 16 after being tricked and lied to about the employment opportunities she thought she would be getting. She was forced to perform for paying customers online. Her story is harrowing. She said:

“I blamed myself for being trapped. I felt disgusted by every action I was forced to do, just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape that I would shout whenever I heard a police siren go by, hoping somebody would hear me. One time after I did this, a woman in the house threatened me with a knife.”

Eventually, Ruby was found by the Philippine authorities and, after a four-year trial, the people who imprisoned her and five other girls were convicted. She said it took many years to heal from the experience, and at one point she nearly took her own life.

It should be obvious that if we are to truly improve child protection online we need to address the production of new child abuse material. In the Bill, we have a chance to address not only what illegal content is seen online, but how online platforms are used to perpetrate abuse. It should not be a case of waiting until the harm is done before taking action.

Chris Philp

As the hon. Lady said, we discussed in the groupings for clauses 8 and 9 quite a few of the broad principles relating to children, but I will none the less touch on some of those points again because they are important.

On amendment 27, under clause 8 there is already an obligation on platforms to put in place systems and processes to reduce the risk that their services will be used to facilitate the presence of illegal content. As that includes the risk of illegal content being present, including that produced via the service’s functionality, the terrible example that the hon. Lady gave is already covered by the Bill. She is quite right to raise that example, because it is terrible when such content involving children is produced, but such cases are expressly covered in the Bill as drafted, particularly in clause 8.

Amendment 31 covers a similar point in relation to search. As I said for the previous grouping, search does not facilitate the production of content; it helps people to find it. Clearly, there is already an obligation on search firms to stop people using search engines to find illegal content, so the relevant functionality in search is already covered by the Bill.

Amendments 15 and 16 would expressly require board member sign-off for risk assessments. I have two points to make on that. First, the duties set out in clause 10(6)(h) in relation to children’s risk assessments already require the governance structures to be properly considered, so governance is directly addressed. Secondly, subsection (2) states that the risk assessment has to be “suitable and sufficient”, so it cannot be done in a perfunctory or slipshod way. Again, Ofcom must be satisfied that those governance arrangements are appropriate. We could invent all the governance arrangements in the world, but the outcome needs to be delivered and, in this case, to protect children.

Beyond governance, the most important things are the sanctions and enforcement powers that Ofcom can use if those companies do not protect children. As the hon. Lady said in her speech, we know that those companies are not doing enough to protect children and are allowing all kinds of terrible things to happen. If those companies continue to allow those things to happen, the enforcement powers will be engaged, and they will be fined up to 10% of their global revenue. If they do not sort it out, they will find that their services are disconnected. Those are the real teeth that will ensure that those companies comply.

Barbara Keeley

I know that the Minister listened to Frances Haugen and to the members of charities. The charities and civil society organisations that are so concerned about this point do not accept that the Bill addresses it. I cannot see how his point addresses what I said about board-level acceptance of that role in children’s risk assessments. We need to change the culture of those organisations so that they become different from how they were described to us. He, like us, was sat there when we heard from the big platform providers, and they are not doing enough. He has had meetings with Frances Haugen; he knows what they are doing. It is good and welcome that the regulator will have the powers that he mentions, but that is just not enough.

Chris Philp

I agree with the hon. Lady that, as I said a second ago, those platforms are not doing enough to protect children. There is no question about that at all, and I think there is unanimity across the House that they are not doing enough to protect children.

I do not think the governance point is a panacea. Frankly, I think the boards of these companies are aware of what is going on. When these big questions arise, they go all the way up to Mark Zuckerberg. It is not as if Mark Zuckerberg and the directors of companies such as Meta are unaware of these risks; they are extremely aware of them, as Frances Haugen’s testimony made clear.

We do address the governance point. As I say, the risk assessments do need to explain how governance matters are deployed to consider these things—that is in clause 10(6)(h). But for me, it is the sanctions—the powers that Ofcom will have to fine these companies billions of pounds and ultimately to disconnect their service if they do not protect our children—that will deliver the result that we need.

Barbara Keeley

The Minister is talking about companies of such scale that even fines of billions will not hurt them. I refer him to the following wording in the amendments:

“a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties”.

That is the minimum we should be asking. We should be asking these platforms, which are doing so much damage and have had to be dragged to the table to do anything at all, to be prepared to appoint somebody who is responsible. The Minister tries to gloss over things by saying, “Oh well, they must be aware of it.” The named individual would have to be aware of it. I hope he understands the importance of his role and the Committee’s role in making this happen. We could make this happen.

Chris Philp

As I say, clause 10 already references the governance arrangements, but my strong view is that the only thing that will make these companies sit up and take notice—the only thing that will make them actually protect children in a way they are currently not doing—is the threat of billions of pounds of fines and, if they do not comply even after being fined at that level, the threat of their service being disconnected. Ultimately, that is the sanction that will make these companies protect our children.

--- Later in debate ---
The Chair

Barbara Keeley, do you have anything to add?

Barbara Keeley

All I have to add is the obvious point—I am sure that we are going to keep running into this—that people should not have to look to a transcript to see what the Minister’s and Parliament’s intention was. It is clear what the Opposition’s intention is—to protect children. I cannot see why the Minister will not specify who in an organisation should be responsible. It should not be a question of ploughing through transcripts of what we have talked about here in Committee; it should be obvious. We have the chance here to do something different and better. The regulator could specify a senior level.

Chris Philp

Clearly, we are legislating here to cover, as I think we said this morning, 25,000 different companies. They all have different organisational structures, different personnel and so on. To anticipate the appropriate level of decision making in each of those companies and put it in the Bill in black and white, in a very prescriptive manner, might not adequately reflect the range of people involved.

--- Later in debate ---
Chris Philp

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered in clause 9—which we have debated already—which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because as Members would also expect, those are already in the Bill as primary illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list were to be published, or at least a draft list, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

Barbara Keeley

I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

Online Safety Bill (Second sitting)

Debate between Chris Philp and Baroness Keeley
Barbara Keeley

Q I have a really simple question. You have touched on the balance between free speech rights and the rights of people who are experiencing harassment, but does the Bill do enough to protect human rights?

Ellen Judson: At the moment, no. The rights that are discussed in the Bill at the minute are quite limited: primarily, it is about freedom of expression and privacy, and the way that protections around privacy have been drafted is less strong than for those around freedom of expression. Picking up on the question about setting precedents, if we have a Bill that is likely to lead to more content moderation and things like age verification and user identity verification, and if we do not have strong protections for privacy and anonymity online, we are absolutely setting a bad precedent. We would want to see much more integration with existing human rights legislation in the Bill.

Kyle Taylor: All I would add is that if you look at the exception for content of democratic importance, and the idea of “active political issue”, right now, conversion therapy for trans people—that has been described by UN experts as torture—is an active political issue. Currently, the human rights of trans people are effectively set aside because we are actively debating their lives. That is another example of how minority and marginalised people can be negatively impacted by this Bill if it is not more human rights-centred.

Chris Philp

Q Let me start with this concept—this suggestion, this claim—that there is special protection for politicians and journalists. I will come to clause 50, which is the recognised news publisher exemption, in a moment, but I think you are referring to clauses 15 and 16. If we turn to those clauses and read them carefully, they do not specifically protect politicians and journalists, but “content of democratic importance” and “journalistic content”. It is about protecting the nature of the content, not the person who is speaking it. Would you accept that?

Ellen Judson: I accept that that is what the Bill currently says. Our point was thinking about how it will be implemented in practice. If platforms are expected to prove to a regulator that they are taking certain steps to protect content of democratic importance—in the explanatory notes, that is content related to Government policy and political parties—and they are expected to prove that they are taking a special consideration of journalistic content, the most straightforward way for them to do that will be in relation to journalists and politicians. Given that it is such a broad category and definition, that seems to be the most likely effect of the regime.

Kyle Taylor: It is potentially—