Kirsty Blackman

I absolutely agree. I am certainly not suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or any platform that starts up after this legislation comes into force, could limit the ability to report on the basis of these clauses.

Barbara Keeley (Worsley and Eccles South) (Lab)

Good morning, Ms Rees.

It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.

Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.

It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.

One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few if any routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The revenge porn helpline reported that its case load more than doubled between 2019 and 2020.

These clauses should mean that people can easily report content that they consider to be illegal; harmful to children, if it is hosted on a site likely to be accessed by children; or harmful to adults, if it is hosted on a category 1 platform. However, the Minister needs to clarify how these services’ complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?

In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action can take place. The hon. Member for Aberdeen North has just told us how often she reports on various platforms, but what action has taken place? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.

Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.

Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to be changed. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child-specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.

Jane Stevenson (Wolverhampton North East) (Con)

I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill, it would allow users “and” affected persons, rather than users “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.

Will the Minister also clarify the use of human judgment in these decisions? Many algorithms are not taking down some content at the moment, so I would be grateful if he clarified that there is a need for platforms to provide a genuine human judgment on whether content is harmful.

--- Later in debate ---
The Chair

With this it will be convenient to discuss the following:

Amendment 78, in clause 28, page 28, line 28, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider to be illegal.

Amendment 79, in clause 28, page 28, line 30, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider not to comply with sections 24, 27 or 29.

Clause 28 stand part.

New clause 1—Report on redress for individual complaints

“(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under—

(a) section 18; and

(b) section 28

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services and regulated search services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services and search services.

(3) The report must be laid before Parliament within six months of the commencement of this Act.”

Barbara Keeley

I will speak to new clause 1. Although duties about complaints procedures are welcome, it has been pointed out that service providers’ user complaints processes are often obscure and difficult to navigate—that is the world we are in at the moment. The lack of any external complaints option for individuals who seek redress is worrying.

The Minister has just talked about the super-complaints mechanism—which we will come to later in proceedings—to allow eligible entities to make complaints to Ofcom about a single regulated service if that complaint is of particular importance or affects a particularly large number of service users or members of the public. Those conditions are constraints on the super-complaints process, however.

An individual who felt that they had been failed by a service’s complaints system would have no source of redress. Without redress for individual complaints once internal mechanisms have been exhausted, victims of online abuse could be left with no further options, consumer protections could be compromised, and freedom of expression could be impinged upon for people who felt that their content had been unfairly removed.

Various solutions have been proposed. The Joint Committee recommended the introduction of an online safety ombudsman to consider complaints for which recourse to internal routes of redress had not resulted in resolution and the failure to address risk had led to significant and demonstrable harm. Such a mechanism would give people an additional body through which to appeal decisions after they had come to the end of a service provider’s internal process. Of course, we as hon. Members are all familiar with the ombudsman services that we already have.

Concerns have been raised about the level of complaints such an ombudsman could receive. However, as the Joint Committee noted, complaints would be received only once the service’s internal complaints procedure had been exhausted, as is the case for complaints to Ofcom about the BBC. The new clause seeks to ensure that we find the best possible solution to the problem. There needs to be a last resort for users who have suffered serious harm on services. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact on individuals.

Dame Maria Miller (Basingstoke) (Con)

I rise to contribute to the stand part debate on clauses 18 and 28. It was interesting, though, to hear the debate on clause 17, because it is right to ask how the complaints services will be judged. Will they work in practice? When we start to look at how to ensure that the legislation works in all eventualities, we need to ensure that we have some backstops for when the system does not work as it should.

It is welcome that there will be clear duties on providers to have operational complaints procedures—complaints procedures that work in practice. As we all know, many of them do not at the moment. As a result, we have a loss of faith in the system, and that is not going to be changed overnight by a piece of legislation. For years, people have been reporting things—in some cases, very serious criminal activity—that have not been acted on. Consumers—people who use these platforms—are not going to change their mind overnight and suddenly start trusting these organisations to take their complaints seriously. With that in mind, I hope that the Minister listened to the points I made on Second Reading about how to give extra support to victims of crimes or people who have experienced things that should not have happened online, and will look at putting in place the right level of support.

The hon. Member for Worsley and Eccles South talked about the idea of an ombudsman; it may well be that one should be in place to deal with situations where complaints are not dealt with through the normal processes. I am also quite taken by some of the evidence we received about third-party complaints processes by other organisations. We heard a bit about the revenge porn helpline, which was set up a few years ago when we first recognised in law that revenge pornography was a crime. The Bill will mean that many more people are recognised in law as victims of crime, but we are not yet hearing clearly how the support systems will adequately help that massively increased number of victims to get the help they need.

I will probably talk in more detail about this issue when we reach clause 70, which provides an opportunity to look at the—unfortunately—probably vast fines that Ofcom will be imposing on organisations and how we might earmark some of that money specifically for victim support, whether by funding an ombudsman or helping amazing organisations such as the revenge porn helpline to expand their services.

We must address this issue now, in this Bill. If we do not, all those fines will go immediately into the coffers of the Treasury without passing “Go”, and we will not be able to take some of that money to help those victims directly. I am sure the Government absolutely intend to use some of the money to help victims, but that decision would be at the mercy of the Treasury. Perhaps we do not want that; perhaps we want to make it cleaner and easier and have the money put straight into a fund that can be used directly for people who have been victims of crime or injustice or things that fall foul of the Bill.

I hope that the Minister will listen to that and use this opportunity, as we do in other areas, to directly passport fines for specific victim support. He will know that there are other examples of that that he can look at.

--- Later in debate ---
Chris Philp

Let me develop the point before I give way. Our first line of defence is Ofcom enforcing the clause, but we have a couple of layers of additional defence. One of those is the super-complaints mechanism, which I have mentioned before. If a particular group of people, represented by a body such as the NSPCC, feel that their legitimate complaints are being infringed systemically by the social media platform, and that Ofcom is failing to take the appropriate action, they can raise that as a super-complaint to ensure that the matter is dealt with.

Barbara Keeley

Will the Minister give way?

Chris Philp

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

Chris Philp

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

Barbara Keeley

This is the point about individual redress again: by talking about super-complaints, the Minister seems to be agreeing that it is not there. As I said earlier, for super-complaints to be made to Ofcom, the issue has to be of particular importance or to impact a particularly large number of users, but that does not help the individual. We know how much individuals are damaged; there must be a system of external redress. The point about internal complaints systems is that we know that they are not very good, and we require a big culture change to change them, but unless there is some mechanism thereafter, I cannot see how we are giving the individual any redress—it is certainly not through the super-complaints procedure.

Chris Philp

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

--- Later in debate ---
Chris Philp

Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.

When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.

Barbara Keeley

Will the Minister give way?

Chris Philp

I was about to sit down, but of course I will give way.

Barbara Keeley

The Minister said that the Opposition had not tabled an amendment to bring in an ombudsman.

Chris Philp

On this clause.

Barbara Keeley

On this clause. What we have done, however—we are debating it now—is to table a new clause to require a report on redress for individual complaints. The Minister talks about clause 149 and a process that will kick in two to five years from now, but we have a horrendous problem at the moment. I and various others have described the situation as the wild west, and very many people—thousands, if not millions, of individuals—are being failed very badly. I do not see why he is resisting our proposal for a report within six months of the commencement of the Act, which would enable us to start to see at that stage, not two to five years down the road, how these systems—he is putting a lot of faith in them—were turning out. I think that is a very sound idea, and it would help us to move forward.

Chris Philp

The third line of defence—the super-complaint process—is available immediately, as I set out a moment ago. In relation to new clause 1, which the hon. Lady mentioned a moment ago, I think six months is very soon for a Bill of this magnitude. The two-to-five-year timetable under the existing review mechanism in clause 149 is appropriate.

Although we are not debating clause 149, I hope, Ms Rees, that you will forgive me for speaking about it for a moment. If Members turn to pages 125 and 126 and look at the matters covered by the review, they will see that they are extraordinarily comprehensive. In effect, the review covers the implementation of all aspects of the Bill, including the need to minimise the harms to individuals and the enforcement and information-gathering powers. It covers everything that Committee members would want to be reviewed. No doubt as we go through the Bill we will have, as we often do in Bill Committee proceedings, a number of occasions on which somebody tables an amendment to require a review of x, y or z. This is the second such occasion so far, I think, and there may be others. It is much better to have a comprehensive review, as the Bill does via the provisions in clause 149.

Question put and agreed to.

Clause 18 accordingly ordered to stand part of the Bill.

Clause 19

Duties about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

Chris Philp

Clause 19, on user-to-user services, and its associated clause 29, which relates to search services, specify a number of duties in relation to freedom of expression and privacy. In carrying out their safety duties, in-scope companies will be required by clause 19(2) to have regard to the importance of protecting users’ freedom of expression and privacy.

Let me pause for a moment on this issue. There has been some external commentary about the Bill’s impact on freedom of expression. We have already seen, via our discussion of a previous clause, that there is nothing in the Bill that compels the censorship of speech that is legal and not harmful to children. I put on the record again the fact that nothing in the Bill requires the censorship of legal speech that poses no harm to children.

We are going even further than that. As far as I am aware, for the first time ever there will be a duty on social media companies, via clause 19(2), to have regard to freedom of speech. There is currently no legal duty at all on platforms to have regard to freedom of speech. The clause establishes, for the first time, an obligation to have regard to freedom of speech. It is critical that not only Committee members but others more widely who consider the Bill should bear that carefully in mind. Besides that, the clause speaks to the right to privacy. Existing laws already speak to that, but the clause puts it in this Bill as well. Both duties are extremely important.

In addition, category 1 service providers—the really big ones—will need proactively to assess the impact of their policies on freedom of expression and privacy. I hope all Committee members will strongly welcome the important provisions I have outlined.

Barbara Keeley

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is a fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set any expectation that the assessments must be of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient or that reach an inappropriate or unreasonable assessment of how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that end-to-end encryption is so important for free expression and privacy that it justifies any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative consequences.

Kirsty Blackman

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

--- Later in debate ---
The Chair

With this it will be convenient to discuss clause 30 stand part.

Barbara Keeley

Record-keeping and review duties on in-scope services make up an important function of the regulatory regime that we are discussing today. Platforms will need to report all harms identified and the action taken in response to this, in line with regulation. The requirements to keep records of the action taken in response to harm will be vital in supporting the regulator to make effective decisions about regulatory breaches and whether company responses are sufficient. That will be particularly important to monitor platforms’ responses through risk assessments—an area where some charities are concerned that we will see under-reporting of harms to evade regulation.

Evidence of under-reporting can be seen in the various transparency reports that are currently being published voluntarily by sites, where we are not presented with the full picture and scale of harm and the action taken to address that harm is thus obscured.

As with other risk assessments, the provisions in clauses 20 and 30 could be strengthened through a requirement on in-scope services to publish their risk assessments. We have made that point many times. Greater transparency would allow researchers and civil society to track harms and hold services to account.

--- Later in debate ---
The Chair

I call Kirsty Blackman to move amendment 22. [Interruption.] Sorry—my bad, as they say. I call Barbara Keeley to move amendment 22.

Barbara Keeley

I beg to move amendment 22, in clause 31, page 31, line 17, leave out subsection (3).

This amendment removes the condition that applies a child use test to a service or part of a service.

The Chair

With this it will be convenient to discuss the following:

Clause stand part.

Clause 32 stand part.

That schedule 3 be the Third schedule to the Bill.

Clause 33 stand part.

Barbara Keeley

The purpose of the amendment is to remove the child use test from the children’s access assessment and to make sure that any service likely to be accessed by children is within the scope of the child safety duty. The amendment is supported by the NSPCC and other children’s charities.

Children require protection wherever they are online. I am sure that every Committee member believes that. The age-appropriate design code from the Information Commissioner’s Office requires all services that are likely to be accessed by children to provide high levels of data protection and privacy. Currently, the Bill will regulate only user-to-user and search services that have a significant number of child users or services for which children form a significant part of their user base. It will therefore not apply to all services that fall within the scope of the ICO’s code, creating a patchwork of regulation that could risk uncertainty, legal battles and unnecessary complexity. It might also create a perverse incentive for online services to stall the introduction of their child safety measures until Ofcom has the capacity to investigate and reach a determination on the categorisation of their sites.

The inclusion of a children’s access assessment in the Bill may result in lower standards of protection, with highly problematic services such as Telegram and OnlyFans able to claim that they are excluded from the child safety duties because children do not account for a significant proportion of their user base. However, evidence has shown that children have been able to access those platforms.

Other services will remain out of the scope of the Bill as currently drafted. They include harmful blogs that promote life-threatening behaviours, such as pro-anorexia sites with provider-generated rather than user-generated content; some of the most popular games among children that do not feature user-generated content but are linked to increasing gambling addiction among children, and through which some families have lost thousands of pounds; and other services with user-generated content that is harmful but does not affect an appreciable number of children. That risks dozens, hundreds or even thousands of children falling unprotected.

Parents have the reasonable expectation that, under the new regime introduced by the Bill, children will be protected wherever they are online. They cannot be expected to be aware of exemptions or distinctions between categories of service. They simply want their children to be protected and their rights upheld wherever they are.

As I say, children have the right to be protected from harmful content and activity by any platform that gives them access. That is why the child user condition in clause 31 should be deleted from the Bill. As I have said, the current drafting could leave problematic platforms out of scope if they were to claim that they did not have a significant number of child users. It should be assumed that platforms are within the scope of the child safety duties unless they can provide evidence that children cannot access their sites, for example through age verification tools.

Although clause 33 provides Ofcom with the power to determine that a platform is likely to be accessed by children, this will necessitate Ofcom acting on a company-by-company basis to bring problematic sites back into scope of the child safety duties. That will take considerable time, and it will delay children receiving protection. It would be simpler to remove the child user condition from clause 31, as I have argued.

--- Later in debate ---
Chris Philp

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

Barbara Keeley

Is the Minister coming on to say that he is accepting what we are saying here?

Chris Philp

No, is the short answer. I was just mentioning in passing that there is that drafting issue.

On the principle, it is worth being very clear that, when it comes to content or matters that are illegal, that applies to all platforms, regardless of size, where children are at all at risk. In schedule 6, we set out a number of matters—child sexual exploitation and abuse, for example—as priority offences that all platforms have to protect children from proactively, regardless of scale.

--- Later in debate ---
Other areas include gambling, which the shadow Minister mentioned. There is separate legislation—very strong legislation—that prohibits children from being involved in gambling. That stands independently of this Bill, so I hope that the Committee is assured—

Barbara Keeley

The Minister has not addressed the points I raised. I specifically raised—he has not touched on this—harmful pro-anorexia blogs, which we know are dangerous but are not in scope, and games that children access that increase gambling addiction. He says that there is separate legislation for gambling addiction, but families have lost thousands of pounds through children playing games linked to gambling addiction. There are a number of other services that do not affect an appreciable number of children, and the drafting causes them to be out of scope.

Chris Philp

rose—[Interruption.]