ONLINE SAFETY BILL (First sitting)

Public Bill Committees

Committee stage (re-committed clauses and schedules): 1st sitting
Tuesday 13th December 2022
The Committee consisted of the following Members:
Chairs: Dame Angela Eagle, † Sir Roger Gale
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Bhatti, Saqib (Meriden) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Bonnar, Steven (Coatbridge, Chryston and Bellshill) (SNP)
Bristow, Paul (Peterborough) (Con)
† Collins, Damian (Folkestone and Hythe) (Con)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Fletcher, Nick (Don Valley) (Con)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Maclean, Rachel (Redditch) (Con)
† Nichols, Charlotte (Warrington North) (Lab)
† Owen, Sarah (Luton North) (Lab)
Peacock, Stephanie (Barnsley East) (Lab)
† Russell, Dean (Watford) (Con)
† Scully, Paul (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Wood, Mike (Dudley South) (Con)
Kevin Maddison, Bethan Harding, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 13 December 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
(Re-committed Clauses and Schedules: Clauses 11 to 14, 18 to 21, 30, 46, 55 and 65, Schedule 8, Clauses 79 and 82, Schedule 11, Clauses 87, 90, 115, 169 and 183, Schedule 17, Clauses 203, 206 and 207, new Clauses and new Schedules)
09:25
The Chair

Good morning, ladies and gentlemen. We are sitting in public and the proceedings are being broadcast. I have a few preliminary announcements. Hansard would be grateful if Members would provide speaking notes as and when they finish with them. Please also ensure that all mobile phones and other electronic devices are switched off.

I had better apologise at the start for the temperature in the room. For reasons that none of us can understand, two of the windows were left wide open all night. However, the Minister’s private office has managed to sort that out, because that is what it is there for. Hopefully the room will warm up. If any hon. Member has a problem with the temperature, they will have to tell the Chair. If necessary, I will suspend, but we are made of tough stuff, so we will try to bat on if we can.

We will first consider the programme motion, which can be debated for up to half an hour. I call the Minister to move the motion standing in his name, which was discussed yesterday by the Programming Sub-Committee.

Ordered,

That—

(1) the Committee shall (in addition to its first meeting at 9.25 am on Tuesday 13 December) meet—

(a) at 2.00 pm on Tuesday 13 December;

(b) at 11.30 am and 2.00 pm on Thursday 15 December;

(2) the proceedings shall be taken in the following order: Clauses 11 to 14, 18 to 21, 30, 46, 55, 56 and 65; Schedule 8; Clauses 79 and 82; Schedule 11; Clauses 87, 90, 115, 155, 169 and 183; Schedule 17; Clauses 203, 206 and 207; new Clauses; new Schedules; remaining proceedings on the Bill;

(3) the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Thursday 15 December. —(Paul Scully.)

Resolved,

That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Paul Scully.)

The Chair

We now begin line-by-line consideration of the Bill. Owing to the unusual nature of today’s proceedings on recommittal, which is exceptional, I need to make a few points.

Only the recommitted clauses and schedules, and amendments and new clauses relating to them, are in scope for consideration. The selection list, which has been circulated to Members and is available in the room, outlines which clauses and schedules those are. Any clause or schedule not on the list is not in scope for discussion. Basically, that means that we cannot have another Second Reading debate. Moreover, we cannot have a further debate on any issues that have been debated already on Report on the Floor of the House. As I say, this is unusual; in fact, I think it is probably a precedent—“Erskine May” will no doubt wish to take note.

The selection list also shows the selected amendments and how they have been grouped. Colleagues will by now be aware that we group amendments by subject for debate. They are not necessarily voted on at the time of the completion of the debate on that group, but as we reach their position in the Bill. Do not panic; we have expert advice to ensure that we do not miss anything—at least, I hope we have.

Finally, only the lead amendment is decided on at the end of the debate. If a Member wishes to move any other amendment in the group, please let the Chair know. Dame Angela or I will not necessarily select it for a Division, but we need to know if Members wish to press it to one. Otherwise, there will be no Division on the non-lead amendments.

Clause 11

Safety duties protecting children

Kirsty Blackman (Aberdeen North) (SNP)

I beg to move amendment 98, in clause 11, page 10, line 17, at end insert

“, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Thank you, Sir Roger, for chairing this recommitted Bill Committee. I will not say that it is nice to be back discussing the Bill again; we had all hoped to have made more progress by now. If you will indulge me for a second, I would like to thank the Clerks, who have been massively helpful in ensuring that this quick turnaround could happen and that we could table the amendments in a sensible form.

Amendment 98 arose from comments and evidence from the Royal College of Psychiatrists highlighting that a number of platforms, and particularly social media platforms such as TikTok and Facebook, generally encourage habit-forming behaviour or have algorithms that encourage it. Such companies are there to make money—that is what companies do—so they want people to linger on their sites and to spend as much time there as possible.

I do not know how many hon. Members have spent time on TikTok, but if they do, and they enjoy some of the cat videos, for instance, the algorithm will know and will show them more videos of cats. They will sit there and think, “Gosh, where did the last half-hour go? I have been watching any number of 20-second videos about cats, because they constantly come up.” Social media sites work by encouraging people to linger on the site and to spend the time dawdling and looking at the advertisements, which make the company additional revenue.

That is good for capitalism and for the company’s ability to make money but the issue, particularly in relation to clause 11, is how that affects children. Children may not have the necessary filters; they may not have the ability that we have to put our phones down—not that we always manage to do so. That ability and decision-making process may not be as refined in children as in adults. Children can be sucked into the platforms by watching videos of cats or of something far more harmful.

Sarah Owen (Luton North) (Lab)

The hon. Member makes an excellent point about TikTok, but it also applies to YouTube. The platforms’ addictive nature has to do with the content. A platform does not just show a person a video of a cat, because that will not keep them hooked for half an hour. It has to show them a cat doing something extraordinary, and then a cat doing something even more extraordinary. That is why vulnerable people, especially children, get sucked into a dark hole. They click to see not just the same video but something more exciting, and then something even more exciting. That is the addictive nature of this.

Kirsty Blackman

That is absolutely the case. We are talking about cats because I chose them to illustrate the situation, but people may look at content about healthy eating, and that moves on to content that encourages them to be sick. The way the algorithms step it up is insidious; they get more and more extreme, so that the linger time is increased and people do not get bored. It is important that platforms look specifically at their habit-forming features.

Charlotte Nichols (Warrington North) (Lab)

A specific case on the platform TikTok relates to a misogynist who goes by the name of Andrew Tate, who has been banned from a number of social media platforms. However, because TikTok works by making clips shorter, which makes it more difficult for the company to identify some of this behaviour among users, young boys looking for videos of things that might interest them were very quickly shown misogynist content from Andrew Tate. Because they watched one video of him, they were then shown more and more. It is easy to see how the habit-forming behaviours built into platforms’ algorithms, which the hon. Lady identifies, can also be a means of quickly radicalising children into extreme ideologies.

The Chair

Order. I think we have the message. I have to say to all hon. Members that interventions are interventions, not speeches. If Members wish to make speeches, there is plenty of time.

Kirsty Blackman

Thank you, Sir Roger. I absolutely agree with the hon. Member for Warrington North. The platform works by stitching things together, so a video could have a bit of somebody else’s video in it, and that content ends up being shared and disseminated more widely.

This is not an attack on every algorithm. I am delighted to see lots of videos of cats—it is wonderful, and it suits me down to the ground—but the amendment asks platforms to analyse how those processes contribute to the development of habit-forming behaviour and to mitigate the harm caused to children by habit-forming features in the service. It is not saying, “You can’t use algorithms” or “You can’t use anything that may encourage people to linger on your site.” The specific issue is addiction—the fact that people will get sucked in and stay on platforms for hours longer than is healthy.

There is a demographic divide here. There is a significant issue when we compare children whose parents are engaged in these issues and spend time—and have the time to spend—assisting them to use the internet. There is a divide between the experiences of those children online and the experiences of children who are generally not nearly as well off, whose parents may be working two or three jobs to try to keep their homes warm and keep food on the table, so the level of supervision those children have may be far lower. We have a parental education gap, where parents are not able to instruct or teach their children a sensible way to use these things. A lot of parents have not used things such as TikTok and do not know how it works, so they are unable to teach their children.

Kim Leadbeater (Batley and Spen) (Lab)

Does the hon. Lady agree that this feeds into the problem we have with the lack of a digital media literacy strategy in the Bill, which we have, sadly, had to accept? However, that makes it even more important that we protect children wherever we have the opportunity to do so, and this amendment is a good example of where we can do that.

Kirsty Blackman

The hon. Lady makes an excellent point. This is not about mandating that platforms stop doing these things; it is about ensuring that they take this issue into account and that they agree—or that we as legislators agree—with the Royal College of Psychiatrists that we have a responsibility to tackle it. We have a responsibility to ask Ofcom to tackle it with platforms.

This comes back to the fact that we do not have a user advocacy panel, and groups representing children are not able to bring emerging issues forward adequately and effectively. Because of the many other inadequacies in the Bill, that is even more important than it was. I assume the Minister will not accept my amendment—that generally does not happen in Bill Committees—but if he does not, it would be helpful if he could give Ofcom some sort of direction of travel so that it knows it should take this issue into consideration when it deals with platforms. Ofcom should be talking to platforms about habit-forming features and considering the addictive nature of these things; it should be doing what it can to protect children. This threat has emerged only in recent years, and things will not get any better unless we take action.

Alex Davies-Jones (Pontypridd) (Lab)

It is a privilege to see you back in the Chair for round 2 of the Bill Committee, Sir Roger. It feels slightly like déjà vu to return to line-by-line scrutiny of the Bill, which, as you said, Sir Roger, is quite unusual and unprecedented. Seeing this Bill through Committee is the Christmas gift that keeps on giving. Sequels are rarely better than the original, but we will give it a go. I have made no secret of my plans, and my thoughts on the Minister’s plans, to bring forward significant changes to the Bill, which has already been long delayed. I am grateful that, as we progress through Committee, I will have the opportunity to put on record once again some of Labour’s long-held concerns with the direction of the Bill.

I will touch briefly on clause 11 specifically before addressing the amendments to the clause. Clause 11 covers safety duties to protect children, and it is a key part of the Bill—indeed, it is the key reason many of us have taken a keen interest in online safety more widely. Many of us, on both sides of the House, have been united in our frustrations with the business models of platform providers and search engines, which have paid little regard to the safety of children over the years in which the internet has expanded rapidly.

That is why Labour has worked with the Government. We want to see the legislation get over the line, and we recognise—as I have said in Committee previously—that the world is watching, so we need to get this right. The previous Minister characterised the social media platforms and providers as entirely driven by finance, but safety must be the No. 1 priority. Labour believes that that must apply to both adults and children, but that is an issue for debate on a subsequent clause, so I will keep my comments on this clause brief.

The clause and Government amendments 1, 2 and 3 address the thorny issue of age assurance measures. Labour has been clear that we have concerns that the Government are relying heavily on the ability of social media companies to distinguish between adults and children, but age verification processes remain fairly complex, and that clearly needs addressing. Indeed, Ofcom’s own research found that a third of children have false social media accounts claiming that they are aged over 18. This is an area we certainly need to get right.

I am grateful to the many stakeholders, charities and groups working in this area. There are far too many to mention, but a special shout-out should go to Iain Corby from the Age Verification Providers Association, along with colleagues at the Centre to End All Sexual Exploitation and Barnardo’s, and the esteemed John Carr. They have all provided extremely useful briefings for my team and me as we have attempted to unpick this extremely complicated part of the Bill.

We accept that there are effective age checks out there, and many have substantial anti-evasion mechanisms, but it is the frustrating reality that this is the road the Government have decided to go down. As we have repeatedly placed on the record, the Government should have retained the “legal but harmful” provisions that were promised in the earlier iteration of the Bill. Despite that, we are where we are.

I will therefore put on the record some brief comments on the range of amendments on this clause. First, with your permission, Sir Roger, I will speak to amendments 98, 99—

The Chair

Order. No, you cannot. I am sorry. I am perfectly willing to allow—the hon. Lady has already done this—a stand part debate at the start of a group of selections, rather than at the end, but she cannot have it both ways. I equally understand the desire of an Opposition Front Bencher to make some opening remarks, which is perfectly in order. With respect, however, you may not then go through all the other amendments. We are dealing now with amendment 98. If the hon. Lady can confine her remarks to that at this stage, that would be helpful.

Alex Davies-Jones

Of course, Sir Roger. Without addressing the other amendments, I would like us to move away from the overly content-focused approach that the Government seem intent on taking in the Bill more widely. I will leave my comments there on the SNP amendment, but we support our SNP colleagues on it.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

It is a pleasure to serve under your chairmanship, Sir Roger.

Being online can be a hugely positive experience for children and young people, but we recognise the challenge of habit-forming behaviour or designed addiction to some digital services. The Bill as drafted, however, would already deliver the intent of the amendment from the hon. Member for Aberdeen North. If service providers identify in their risk assessment that habit-forming or addictive-behaviour risks cause significant harm to an appreciable number of children on a service, the Bill will require them to put in place measures to mitigate and manage that risk under clause 11(2)(a).

To meet the child safety risk assessment duties under clause 10, services must assess the risk of harm to children from the different ways in which the service is used; the impact of such use; the level of risk of harm to children; how the design and operation of the service may increase the risks identified; and the functionalities that facilitate the presence or dissemination of content of harm to children. The definition of “functionality” at clause 200 already includes an expression of a view on content, such as applying a “like” or “dislike” button, as at subsection (2)(f)(i).

Sarah Owen

I thank the Minister for giving way so early on. He mentioned an “appreciable number”. Will he clarify what that is? Is it one, 10, 100 or 1,000?

Paul Scully

I do not think that a single number can be put on that, because it depends on the platform and the type of viewing. It is not easy to put a single number on that. An “appreciable number” is basically as identified by Ofcom, which will be the arbiter of all this. It comes back to what the hon. Member for Aberdeen North said about the direction that we, as she rightly said, want to give Ofcom. Ofcom has a range of powers already to help it assess whether companies are fulfilling their duties, including the power to require information about the operation of their algorithms. I would set the direction that the hon. Lady is looking for, to ensure that Ofcom uses those powers to the fullest and can look at the algorithms. We should bear in mind that social media platforms face criminal liability if they do not supply the information required by Ofcom to look under the bonnet.

Kirsty Blackman

If platforms do not recognise that they have an issue with habit-forming features, even though we know they have, will Ofcom say to them, “Your risk assessment is insufficient. We know that the habit-forming features are really causing a problem for children”?

09:45
Paul Scully

We do not want to wait for the Bill’s implementation to start those conversations with the platforms. We expect companies to be transparent about their design practices that encourage extended engagement and to engage with researchers to understand the impact of those practices on their users.

The child safety duties in clause 11 apply across all areas of a service, including the way it is operated and used by children and the content present on the service. Subsection (4)(b) specifically requires services to consider the

“design of functionalities, algorithms and other features”

when complying with the child safety duties. Given the direction I have suggested that Ofcom has, and the range of powers that it will already have under the Bill, I am unable to accept the hon. Member’s amendment, and I hope she will therefore withdraw it.

Kirsty Blackman

I would have preferred it had the Minister been slightly more explicit that habit-forming features are harmful. That would have been slightly more helpful.

Paul Scully

I will say that habit-forming features can be harmful.

Kirsty Blackman

I thank the Minister. Absolutely—they are not always harmful. With that clarification, I am happy to beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Paul Scully

I beg to move amendment 1, in clause 11, page 10, line 22, leave out

“, or another means of age assurance”.

This amendment omits words which are no longer necessary in subsection (3)(a) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

The Chair

With this it will be convenient to discuss Government amendments 2 and 3.

Paul Scully

The Bill’s key objective, above everything else, is the safety of young people online. That is why the strongest protections in the Bill are for children. Providers of services that are likely to be accessed by children will need to provide safety measures to protect child users from harmful content, such as pornography, and from behaviour such as bullying. We expect companies to use age verification technologies to prevent children from accessing services that pose the highest risk of harm to them, and age assurance technologies and other measures to provide children with an age-appropriate experience.

The previous version of the Bill already focused on protecting children, but the Government are clear that the Bill must do more to achieve that and to ensure that the requirements on providers are as clear as possible. That is why we are strengthening the Bill and clarifying the responsibilities of providers to provide age-appropriate protections for children online. We are making it explicit that providers may need to use age assurance to identify the age of their users in order to meet the child safety duties for user-to-user services.

The Bill already set out that age assurance may be required to protect children from harmful content and activity, as part of meeting the duty in clause 11(3), but the Bill will now clarify that it may also be needed to meet the wider duty in subsection (2) to

“mitigate and manage the risks of harm to children”

and to manage

“the impact of harm to children”

on such services. That is important so that only children who are old enough are able to use functionalities on a service that poses a risk of harm to younger children. The changes will also ensure that children are signposted to support that is appropriate to their age if they have experienced harm. For those reasons, I recommend that the Committee accepts the amendments.

Alex Davies-Jones

I have a few questions regarding amendments 1 to 3, which as I mentioned relate to the thorny issue of age verification and age assurance, and I hope the Minister can clarify some of them.

We are unclear about why, in subsection (3)(a), the Government have retained the phrase

“for example, by using age verification, or another means of age assurance”.

Can that difference in wording be taken as confirmation that the Government want harder forms of age verification for primary priority content? The Minister will be aware that many in the sector are unclear about what that harder form of age verification may look like, so some clarity would be useful for all of us in the room and for those watching.

In addition, we would like to clarify the Minister’s understanding of the distinction between age verification and age assurance. They are very different concepts in reality, so we would appreciate it if he could be clear, particularly when we consider the different types of harm that the Bill will address and protect people from, how that will be achieved and what technology will be required for different types of platform and content. I look forward to clarity from the Minister on that point.

Paul Scully

That is a good point. In essence, age verification is the hard access to a service. Age assurance ensures that the person who uses the service is the same person whose age was verified. Someone could use their parent’s debit card or something like that, so it is not necessarily the same person using the service right the way through. If we are to protect children, in particular, we have to ensure that we know there is a child at the other end whom we can protect from the harm that they may see.

On the different technologies, we are clear that our approach to age assurance or verification is not technology-specific. Why? Because otherwise the Bill would be out of date within around six months. By the time the legislation was fully implemented it would clearly be out of date. That is why it is incumbent on the companies to be clear about the technology and processes they use. That information will be kept up to date, and Ofcom can then look at it.

The Chair

The Minister leapt to his feet before I had the opportunity to call any other Members. I call Kirsty Blackman.

Kirsty Blackman

Thank you, Sir Roger. It was helpful to hear the Minister’s clarification of age assurance and age verification, and it was useful for him to put on the record the difference between the two.

I have a couple of points. In respect of Ofcom keeping up to date with the types of age verification and the processes, new ones will come through and excellent new methods will appear in coming years. I welcome the Minister’s suggestion that Ofcom will keep up to date with that, because it is incredibly important that we do not rely on, say, the one provider that there is currently, when really good methods could come out. We need the legislation to ensure that we get the best possible service and the best possible verification to keep children away from content that is inappropriate for them.

This is one of the most important parts of the Bill for ensuring that we can continue to have adult sections of the internet—places where there is content that would be disturbing for children, as well as for some adults—and that an age-verification system is in place to ensure that that content can continue to be there. Websites that require a subscription, such as OnlyFans, need to continue to have in place the age-verification systems that they currently have. By writing into legislation the requirement for them to continue to have such systems in place, we can ensure that children cannot access such services but adults can continue to do so. This is not about what is banned online or about trying to make sure that this content does not exist anywhere; it is specifically about gatekeeping to ensure that no child, as far as we can possibly manage, can access content that is inappropriate for kids.

There was a briefing recently on children’s access to pornography, and we heard horrendous stories. It is horrendous that a significant number of children have seen inappropriate content online, and the damage that that has caused to so many young people cannot be overstated. Blocking access to adult parts of the internet is so important for the next generation, not just so that children are not disturbed by the content they see, but so that they learn that it is not okay and normal and understand that the depictions of relationships in pornography are not the way reality works, not the way reality should work and not how women should be treated. Having a situation in which Ofcom or anybody else is better able to take action to ensure that adult content is specifically accessed only by adults is really important for the protection of children and for protecting the next generation and their attitudes, particularly towards sex and relationships.

Rachel Maclean (Redditch) (Con)

I wish to add some brief words in support of the Government’s proposals and to build on the comments from Members of all parties.

We know that access to extreme and abusive pornography is a direct factor in violence against women and girls. We see that play out in the court system every day. People claim to have watched and become addicted to this type of pornography; they are put on trial because they seek to play that out in their relationships, which has resulted in the deaths of women. The platforms already have technology that allows them to figure out the age of people on their platforms. The Bill seeks to ensure that they use that for a good end, so I thoroughly support it. I thank the Minister.

Damian Collins (Folkestone and Hythe) (Con)

There are two very important and distinct issues here. One is age verification. The platforms ask adults who have identification to verify their age; if they cannot verify their age, they cannot access the service. Platforms have a choice within that. They can design their service so that it does not have adult content, in which case they may not need to build in verification systems—the platform polices itself. However, a platform such as Twitter, which allows adult content on an app that is open to children, has to build in those systems. As the hon. Member for Aberdeen North mentioned, people will also have to verify their identity to access a service such as OnlyFans, which is an adult-only service.

Kirsty Blackman

On that specific point, I searched on Twitter for the name—first name and surname—of a politician to see what people had been saying, because I knew that he was in the news. The pictures that I saw! That was purely by searching for the name of the politician; it is not as though people are necessarily seeking such stuff out.

Damian Collins

On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.

Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.

If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.

Paul Scully

I think we are all in agreement, and I hope that the Committee will accept the amendments.

Amendment 1 agreed to.

Amendments made: 2, in clause 11, page 10, line 25, leave out

“(for example, by using age assurance)”.

This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

Amendment 3, in clause 11, page 10, line 26, at end insert—

“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)

This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).

Kirsty Blackman

I beg to move amendment 99, in clause 11, page 10, line 34, leave out paragraph (d) and insert—

“(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,”.

This amendment is intended to make clear that if it is proportionate to do so, services should have policies that include blocking access to parts of a service, rather than just the entire service or particular content on the service.

The Chair

With this it will be convenient to discuss the following:

Amendment 96, in clause 11, page 10, line 41, at end insert—

“(i) reducing or removing a user’s access to private messaging features”.

This amendment is intended to explicitly include removing or reducing access to private messaging features in the list of areas where proportionate measures can be taken to protect children.

Amendment 97, in clause 11, page 10, line 41, at end insert—

“(i) reducing or removing a user’s access to livestreaming features”.

This amendment is intended to explicitly include removing or reducing access to livestreaming features in the list of areas where proportionate measures can be taken to protect children.

Kirsty Blackman

I am glad that the three amendments are grouped, because they link together nicely. I am concerned that clause 11(4)(d) does not do exactly what the Government intend it to. It refers to

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

There is a difference between content and parts of the service. It would be possible to block users from accessing some of the things that we have been talking about—for example, eating disorder content—on the basis of clause 11(4)(d). A platform would be able to take that action, provided that it had the architecture in place. However, on my reading, I do not think it would be possible to block a user from accessing, for example, private messaging or livestreaming features. Clause 11(4)(d) would allow a platform to block certain content, or access to the service, but it would not explicitly allow it to block users from using part of the service.

Let us think about platforms such as Discord and Roblox. I have an awful lot of issues with Roblox, but it can be a pretty fun place for people to spend time. However, if a child, or an adult, is inappropriately using its private messaging features, or somebody on Instagram is using the livestreaming features, there are massive potential risks of harm. Massive harm is happening on such platforms. That is not to say that Instagram is necessarily inherently harmful, but if it could block a child’s access to livestreaming features, that could have a massive impact in protecting them.

09:49
Damian Collins

Does the hon. Lady accept that the amendments would give people control over the bit of the service that they do not currently have control of? A user can choose what to search for and which members to engage with, and can block people. What they cannot do is stop the recommendation feeds recommending things to them. The shields intervene there, which gives user protection, enabling them to say, “I don’t want this sort of content recommended to me. On other things, I can either not search for them, or I can block and report offensive users.” Does she accept that that is what the amendment achieves?

Kirsty Blackman

I think that that is what the clause achieves, rather than the amendments that I have tabled. I recognise that the clause achieves that, and I have no concerns about it. It is good that the clause does that; my concern is that it does not take the second step of blocking access to certain features on the platform. For example, somebody could be having a great time on Instagram looking at various people’s pictures or whatever, but they may not want to be bombarded with private messages. They have no ability to turn off the private messaging section.

Damian Collins

They can disengage with the user who is sending the messages. On a Meta platform, often those messages will be from someone they are following or engaging with. They can block them, and the platforms have the ability, in most in-app messaging services, to see whether somebody is sending priority illegal content material to other users. They can scan for that and mitigate that as well.

Kirsty Blackman

That is exactly why users should be able to block private messaging in general. Someone on Twitter can say, “I’m not going to receive a direct message from anybody I don’t follow.” Twitter users have the opportunity to do that, but there is not necessarily that opportunity on all platforms. We are asking for those things to be included, so that the provider can say, “You’re using private messaging inappropriately. Therefore, we are blocking all your access to private messaging,” or, “You are being harmed as a result of accessing private messaging. Therefore, we are blocking your access to any private messaging. You can still see pictures on Instagram, but you can no longer receive any private messages, because we are blocking your access to that part of the site.” That is very different from blocking a user’s access to certain kinds of content, for example. I agree that that should happen, but it is about the functionalities and stopping access to some of them.

We are not asking Ofcom to mandate that platforms take this measure; they could still take the slightly more nuclear option of banning somebody entirely from their service. However, if this option is included, we could say, “Your service is doing pretty well, but we know there is an issue with private messaging. Could you please take action to ensure that those people who are using private messaging to harm children no longer have access to private messaging and are no longer able to use the part of the service that enables them to do these things?” Somebody might be doing a great job of making games in Roblox, but they may be saying inappropriate things. It may be proportionate to block that person entirely, but it may be more proportionate to block their access to voice chat, so that they can no longer say those things, or direct message or contact anybody. It is about proportionality and recognising that the service is not necessarily inherently harmful but that specific parts of it could be.

Sarah Owen

The hon. Member is making fantastic, salient points. The damage with private messaging is around phishing, as well as seeing a really harmful message and not being able to unsee it. Would she agree that it is about protecting the victim, not putting the onus on the victim to disengage from such conversations?

Kirsty Blackman

I completely agree. The hon. Member put that much better than I could. I was trying to formulate that point in my head, but had not quite got there, so I appreciate her intervention. She is right: we should not put the onus on a victim to deal with a situation. Once they have seen a message from someone, they can absolutely block that person, but that person could create another account and send them messages again. People could be able to choose, and to say, “No, I don’t want anyone to be able to send me private messages,” or “I don’t want any private messages from anyone I don’t know.” We could put in those safeguards.

I am talking about adding another layer to the clause, so that companies would not necessarily have to demonstrate that it was proportionate to ban a person from using their service, as that may be too high a bar—a concern I will come to later. They could, however, demonstrate that it was proportionate to ban a person from using private messaging services, or from accessing livestreaming features. There has been a massive increase in self-generated child sexual abuse images, and a huge amount has come from livestreaming. There are massive risks with livestreaming features on services.

Livestreaming is not always bad. Someone could livestream themselves showing how to make pancakes. There is no issue with that—that is grand—but livestreaming is being used by bad actors to manipulate children into sharing videos of themselves, and once they are on the internet, they are there forever. It cannot be undone. If we were able to ban vulnerable users—my preferred option would be all children—from accessing livestreaming services, they would be much safer.

Damian Collins

The hon. Lady is talking about extremely serious matters. My expectation is that Ofcom would look at all of a platform’s features when risk-assessing the platform and enforcing safety, and in-app messaging services would not be exempt. Platforms have to demonstrate what they would do to mitigate harmful and abusive behaviour, and that they would take action against the accounts responsible.

Kirsty Blackman

Absolutely, I agree, but the problem is with the way the Bill is written. It does not suggest that a platform could stop somebody accessing a certain part of a service. The Bill refers to content, and to the service as a whole, but it does not have that middle point that I am talking about.

Damian Collins

A platform is required to demonstrate to Ofcom what it would do to mitigate activity that would breach the safety duties. It could do that through a feature that it builds in, or it may take a more draconian stance and say, “Rather than turning off certain features, we will just suspend the account altogether.” That could be discussed in the risk assessments, and agreed in the codes of practice.

Kirsty Blackman

What I am saying is that the clause does not actually allow that middle step. It does not explicitly say that somebody could be stopped from accessing private messaging. The only options are being banned from certain content, or being banned from the entire platform.

I absolutely recognise the hard work that Ofcom has done, and I recognise that it will work very hard to ensure that risks are mitigated, but the amendment ensures what the Minister intended with this legislation. I am not convinced that he intended there to be just the two options that I outlined. I think he intended something more in line with what I am suggesting in the amendment. It would be very helpful if the Minister explicitly said something in this Committee that makes it clear that Ofcom has the power to say to platforms, “Your risk assessment says that there is a real risk from private messaging”—or from livestreaming—“so why don’t you turn that off for all users under 18?” Ofcom should be able to do that.

Could the Minister be clear that that is the direction of travel he is hoping and intending that Ofcom will take? If he could be clear on that, and will recognise that the clause could have been slightly better written to ensure Ofcom had that power, I would be quite happy to not push the amendment to a vote. Will the Minister be clear about the direction he hopes will be taken?

Alex Davies-Jones

I rise to support my SNP colleagues’ amendments 99, and 96 and 97, just as I supported amendment 98. The amendments are sensible and will ensure that service providers are empowered to take action to mitigate harms done through their services. In particular, we support amendment 99, which makes it clear that a service should be required to have the tools available to allow it to block access to parts of its service, if that is proportionate.

Amendments 96 and 97 would ensure that private messaging and livestreaming features were brought into scope, and that platforms and services could block access to them when that was proportionate, with the aim of protecting children, which is the ultimate aim of the Bill. Those are incredibly important points to raise.

In previous iterations of the Bill Committee, Labour too tabled a number of amendments to do with platforms’ responsibilities for livestreaming. I expressed concerns about how easy it is for platforms to host live content, and about how ready they were to screen that content for harm, illegal or not. I am therefore pleased to support our SNP colleagues. The amendments are sensible, will empower platforms and will keep children safe.

Charlotte Nichols

It is a pleasure to serve with you in the Chair, Sir Roger. I rise in support of amendments 99, and 96 and 97, as my hon. Friend the Member for Pontypridd did. I have an issue with the vagueness and ambiguity in the Bill. Ministerial direction is incredibly helpful, not only for Ofcom, but for the companies and providers that will use the Bill to make technologies available to do what we are asking them to do.

As the hon. Member for Aberdeen North said, if the Bill provided for that middle ground, that would be helpful for a number of purposes. Amendment 97 refers to livestreaming; in a number of cases around the world, people have livestreamed acts of terror, such as the shooting at the Christchurch mosque. Those offences were watched in real time, as they were perpetrated, by potentially hundreds of thousands of people. We have people on watch lists—people we are aware of. If we allowed them to use a social media platform but not the livestreaming parts, that could go some way to mitigating the risk of their livestreaming something like that. Their being on the site is perhaps less of a concern, as their general use of it could be monitored in real time. Under a risk analysis, we might be happy for people to be on a platform, but consider that the risk was too great to allow them to livestream. Having such a provision would be helpful.

My hon. Friend the Member for Luton North mentioned the onus always being on the victim. When we discuss online abuse, I really hate it when people say, “Well, just turn off your messages”, “Block them” or “Change your notification settings”, as though that were a panacea. Turning off the capacity to use direct messages is a much more effective way of addressing abuse by direct message than banning the person who sent it altogether—they might just make a new account—or than relying on the recipient of the message to take action when the platform has the capacity to take away the option of direct messaging. The adage is that sunlight is the best disinfectant. When people post in public and the post can be seen by anyone, they can be held accountable by anyone. That is less of a concern to me than what they send privately, which can be seen only by the recipient.

This group of amendments is reasonable and proportionate. They would not only give clear ministerial direction to Ofcom and the technology providers, and allow Ofcom to take the measures that we are discussing, but would pivot us away from placing the onus on the recipients of abusive behaviour, or people who might be exposed to it. Instead, the onus would be on platforms to make those risk assessments and take the middle ground, where that is a reasonable and proportionate step.

10:15
Kirsty Blackman

If someone on a PlayStation wants to play online games, they must sign up to PlayStation Plus—that is how the model works. Once they pay that subscription, they can access online games and play Fortnite or Rocket League or whatever they want online. They then also have access to a suite of communication features; they can private message people. It would be disproportionate to ban somebody from playing any PlayStation game online in order to stop them from being able to private message inappropriate things. That would be a disproportionate step. I do not want PlayStation to be left unable to act against somebody because banning them would be disproportionate, yet unable to switch off the direct messaging features because the clause does not allow it that flexibility. A person could continue to be in danger on the PlayStation platform as a result of private communications that they could receive. That is one example of how the provision would be key and important.

Paul Scully

Again, the Government recognise the intent behind amendment 99, which, as the hon. Member for Aberdeen North said, would require providers to be able to block children’s access to parts of a service, rather than the entire service. I very much get that. We recognise the nature and scale of the harm that can be caused to children through livestreaming and private messaging, as has been outlined, but the Bill already delivers what is intended by these amendments. Clause 11(4) sets out examples of areas in which providers will need to take measures, if proportionate, to meet the child safety duties. It is not an exhaustive list of every measure that a provider might be required to take. It would not be feasible or practical to list every type of measure that a provider could take to protect children from harm, because such a list could become out of date quickly as new technologies emerge, as the hon. Lady outlined with her PlayStation example.

Kirsty Blackman

I have a concern. The Minister’s phrasing was “to block children’s access”. Surely some of the issues would be around blocking adults’ access, because they are the ones causing risk to the children. From my reading of the clause, it does not suggest that the action could be taken only against child users; it could be taken against any user in order to protect children.

Paul Scully

I will come to that in a second. The hon. Member for Luton North talked about putting the onus on the victim. Any element of choice is there for adults; the children will be protected anyway, as I will outline in a second. We all agree that the primary purpose of the Bill is to be a children’s protection measure.

Ofcom will set out in codes of practice the specific steps that providers can take to protect children who are using their service, and the Government expect those to include steps relating to children’s access to high-risk features, such as livestreaming or private messaging. Clause 11(4)(d) sets out that providers may be required to take measures in the following area:

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

The other areas listed are intentionally broad categories that allow for providers to take specific measures. For example, a measure in the area of blocking user access to particular content could include specific measures that restrict children’s access to parts of a service, if that is a proportionate way to stop users accessing that type of content. It can also apply to any of the features of a service that enable children to access particular content, and could therefore include children’s access to livestreaming and private messaging features. In addition, the child safety duties make it clear that providers need to use proportionate systems and processes that prevent children from encountering primary priority content that is harmful to them, and protect children and age groups at risk of harm from other content that is harmful to them.

While Ofcom will set out in codes of practice the steps that providers can take to meet these duties, we expect those steps, as we have heard, to include the use of age verification to prevent children accessing content that poses the greatest risk of harm to them. To meet that duty, providers may use measures that restrict children from accessing parts of the service. The Bill therefore allows Ofcom to require providers to take that step where it is proportionate. I hope that that satisfies the hon. Member for Aberdeen North, and gives her the direction that she asked for—that is, a direction to be more specific that Ofcom does indeed have the powers that she seeks.

Charlotte Nichols

The Bill states that we can expect little impact on child protection before 2027-28 because of the enforcement road map and when Ofcom is planning to set that out. Does the Minister not think that in the meantime, that sort of ministerial direction would be helpful? It could make Ofcom’s job easier, and would mean that children could be protected online before 2027-28.

Paul Scully

The ministerial direction that the various platforms are receiving from the Dispatch Box, from our conversations with them and from the Bill’s progress as it goes through the House of Lords will be helpful to them. We do not expect providers to wait until the very last minute to implement the measures. They are starting to do so now, but we want them to go further, quicker.

Government amendment 4 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details of the measures that they use to restrict access in their terms of service and apply them consistently. Providers will also need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service, as well as reviewing children’s use of higher-risk features, as I have said.

To meet the child safety risk assessment duties in clause 10, providers must assess: the risk of harm to children from functionalities that facilitate the presence or dissemination of harmful content; the level of risk from different kinds of harmful content, giving separate consideration to children in different age groups; the different ways in which the service is used, and the impact of such use on the level of risk of harm; and how the design and operation of the service may increase the risks identified.

The child safety duties in clause 11 apply across all areas of the service, including the way it is operated and used by children, as well as the content present on the service. For the reasons I have set out, I am not able to accept the amendments, but I hope that the hon. Member for Aberdeen North will take on board my assurances.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That was quite helpful. I am slightly concerned about the Minister’s focus on reducing children’s access to the service or to parts of it. I appreciate that is part of what the clause is intended to do, but I would also expect platforms to be able to reduce the ability of adults to access parts of the service or content in order to protect children. Rather than just blocking children, blocking adults from accessing some features—whether that is certain adults or adults as a group—would indeed protect children. My reading of clause 11(4) was that users could be prevented from accessing some of this stuff, rather than just child users. Although the Minister has given me more questions, I do not intend to push the amendment to a vote.

May I ask a question of you, Sir Roger? I have not spoken about clause stand part. Are we still planning to have a clause stand part debate?

None Portrait The Chair
- Hansard -

No.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you, Sir Roger; I appreciate the clarification. When I talk about Government amendment 4, I will also talk about clause stand part. I withdraw the amendment.

None Portrait The Chair
- Hansard -

That is up to the Committee.

Amendment, by leave, withdrawn.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 4, in clause 11, page 11, line 9, at end insert—

“(6A) If a provider takes or uses a measure designed to prevent access to the whole of the service or a part of the service by children under a certain age, a duty to—

(a) include provisions in the terms of service specifying details about the operation of the measure, and

(b) apply those provisions consistently.”

This amendment requires providers to give details in their terms of service about any measures they use which prevent access to a service (or part of it) by children under a certain age, and to apply those terms consistently.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendment 5.

Amendment 100, in clause 11, page 11, line 15, after “accessible” insert “for child users.”

This amendment makes clear that the provisions of the terms of service have to be clear and accessible for child users.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Although the previous version of the Bill already focused on protecting children, as I have said, the Government are clear that it must do more to achieve that and to ensure that requirements for providers are as clear as possible. That is why we are making changes to strengthen the Bill. Amendments 4 and 5 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details in their terms of service of the measures that they use to ensure that children below the minimum age are prevented from gaining access. Those terms must be applied consistently and be clear and accessible to users. The change will mean that providers can be held to account for what they say in their terms of service, and will no longer be able to do nothing to prevent underage access.

The Government recognise the intent behind amendment 100, which is to ensure that terms of service are clear and accessible for child users, but the Bill as drafted sets an appropriate standard for terms of service. The duty in clause 11(8) sets an objective standard for terms of service to be clear and accessible, rather than requiring them to be clear for particular users. Ofcom will produce codes of practice setting out how providers can meet that duty, which may include provisions about how to tailor the terms of service to the user base where appropriate.

The amendment would have the unintended consequence of limiting to children the current accessibility requirement for terms of service. As a result, any complicated and detailed information that would not be accessible for children—for example, how the provider uses proactive technology—would probably need to be left out of the terms of service, which would clearly conflict with the duty in clause 11(7) and other duties relating to the terms of service. It is more appropriate to have an objective standard of “clear and accessible” so that the terms of service can be tailored to provide the necessary level of information and be useful to other users such as parents and guardians, who are most likely to be able to engage with the more detailed information included in the terms of service and are involved in monitoring children’s online activities.

Ofcom will set out steps that providers can take to meet the duty and will have a tough suite of enforcement powers to take action against companies that do not meet their child safety duties, including if their terms of service are not clear and accessible. For the reasons I have set out, I am not able to accept the amendment tabled by the hon. Member for Aberdeen North and I hope she will withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

As I said, I will also talk about clause 11. I can understand why the Government are moving their amendments. It makes sense, particularly with things like complying with the provisions. I have had concerns all the way along—particularly acute now as we are back in Committee with a slightly different Bill from the one that we were first presented with—about the reliance on terms of service. There is a major issue with choosing to go down that route, given that providers of services can choose what to put in their terms of service. They can choose to have very good terms of service that mean they will take action on anything that is potentially an issue, and that are strong enough to allow them to apply proportionate measures, including banning users who break those terms. Providers will have the ability to write terms of service like that, but not all providers will choose to do so. Not all providers will choose to write the gold-standard terms of service that the Minister expects everybody to write.

We have to remember that these companies’ and organisations’ No. 1 aim is not to protect children. If their No. 1 aim was to protect children, we would not be here. We would not need an Online Safety Bill because they would be putting protection front and centre of every decision they make. Their No. 1 aim is to increase the number of users so that they can get more money. That is the aim. They are companies that have a duty to their shareholders. They are trying to make money. That is the intention. They will not therefore necessarily draw up the best possible terms of service.

I heard an argument on Report that market forces will mean that companies that do not have strong enough terms of service, companies that have inherent risks in their platforms, will just not be used by people. If that were true, we would not be in the current situation. Instead, the platforms that are damaging people and causing harm—4chan, KiwiFarms or any of those places that cause horrendous difficulties—would not be used by people because market forces would have intervened. That approach does not work; it does not happen that the market will regulate itself and people will stay away from places that cause them or others harm. That is not how it works. I am concerned about the reliance on terms of service and requiring companies to stick to their own terms of service. They might stick to their own terms of service, but those terms of service might be utterly rubbish and might not protect people. Companies might not have in place what we need to ensure that children and adults are protected online.

10:30
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Does the hon. Lady agree that people out there in the real world have absolutely no idea what a platform’s terms of service are, so we are being expected to make a judgment on something about which we have absolutely no knowledge?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Does the hon. Lady agree that the Government are not relying solely on terms of service, but are rightly saying, “If you say in your terms of service that this is what you will do, Ofcom will make sure that you do it”? Ofcom will take on that responsibility for people, making sure that these complex terms of service are understood and enforced, but the companies still have to meet all the priority illegal harms objectives that are set out in the legislation. Offences that exist in law are still enforced on platforms, and risk-assessed by Ofcom as well, so if a company does not have a policy on race hate, we have a law on race hate, and that will apply.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It is absolutely the case that those companies still have to do a risk assessment, and a child risk assessment if they meet the relevant criteria. The largest platforms, for example, will still have to do a significant amount of work on risk assessments. However, every time a Minister stands up and talks about what they are requiring platforms and companies to do, they say, “Companies must stick to their terms of service. They must ensure that they enforce things in line with their terms of service.” If a company is finding it too difficult, it will just take the tough things out of its terms of service. It will take out transphobia, it will take out abuse. Twitter does not ban anyone for abuse anyway, it seems, but it will be easier for Twitter to say, “Ofcom is going to try to hold us to account for the fact that we are not getting rid of people for abusive but not illegal messages, even though we say in our terms of service, ‘You must act with respect’, or ‘You must not abuse other users’. We will just take that out of our terms of service so that we are not held to account for the fact that we are not following our terms of service.” Then, because the abuse is not illegal—because it does not meet that bar—those places will end up being even less safe than they are right now.

For example, occasionally Twitter does act in line with its terms of service, which is quite nice: it does ban people who are behaving inappropriately, but not necessarily illegally, on its platform. However, if it is required to implement that across the board for everybody, it will be far easier for Twitter to say, “We’ve sacked all our moderators—we do not have enough people to be able to do this job—so we will just take it all out of the terms of service. The terms of service will say, ‘We will ban people for sharing illegal content, full stop.’” We will end up in a worse situation than we are currently in, so the reliance on terms of service causes me a big, big problem.

Turning to amendment 100, dealing specifically with the accessibility of this feature for child users, I appreciate the ministerial clarification, and agree that my amendment could have been better worded and potentially causes some problems. However, can the Minister talk more about the level of accessibility? I would like children to be able to see a version of the terms of service that is age-appropriate, so that they understand what is expected of them and others on the platform, and understand when and how they can make a report and how that report will be acted on. The kids who are using Discord, TikTok or YouTube are over 13—well, some of them are—so they are able to read and understand, and they want to know how to make reports and for the reporting functions to be there. One of the biggest complaints we hear from kids is that they do not know how to report things they see that are disturbing.

A requirement for children to have an understanding of how reporting functions work, particularly on social media platforms where people are interacting with each other, and of the behaviour that is expected of them, does not mean that there cannot be a more in-depth and detailed version of the terms of service, laying out potential punishments using language that children may not be able to understand. The amendment would specifically ensure that children have an understanding of that.

We want children to have a great time on the internet. There are so many ace things out there and wonderful places they can access. Lego has been in touch, for example; its website is really pretty cool. We want kids to be able to access that stuff and communicate with their friends, but we also want them to have access to features that allow them to make reports that will keep them safe. If children are making reports, then platforms will say, “Actually, there is a real problem with this because we are getting loads of reports about it.” They will then be able to take action. They will be able to have proper risk assessments in place because they will be able to understand what is disturbing people and what is causing the problems.

I am glad to hear the Minister’s words. If he were even more clear about the fact that he would expect children to be able to understand and access information about keeping themselves safe on the platforms, then that would be even more helpful.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

On terms and conditions, it is clearly best practice to have a different level of explanation that ensures children can fully understand what they are getting into. The hon. Lady talked about the fact that children do not know how to report harm. Frankly, judging by a lot of conversations we have had in our debates, we do not know how to report harm because it is not transparent. On a number of platforms, how to do that is very opaque.

A wider aim of the Bill is to make sure that platforms have better reporting patterns. I encourage platforms to do exactly what the hon. Member for Aberdeen North says: to engage children, and to engage parents. Parents are well placed to engage with reporting, and it is important that we do not forget parenting in the equation of how Government and platforms are acting. I hope that is clear to the hon. Lady. We are mainly relying on terms and conditions for adults, but the Bill imposes a wider set of protections for children on the platforms.

Amendment 4 agreed to.

Amendment made: 5, in clause 11, page 11, line 15, after “(5)” insert “, (6A)”.—(Paul Scully.)

This amendment ensures that the duty in clause 11(8) to have clear and accessible terms of service applies to the terms of service mentioned in the new subsection inserted by Amendment 4.

Clause 11, as amended, ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 13 stand part.

Government amendments 18, 23 to 25, 32, 33 and 39.

Clause 55 stand part.

Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

To protect free speech and remove any possibility that the Bill could cause tech companies to censor legal content, I seek to remove the so-called “legal but harmful” duties from the Bill. These duties are currently set out in clauses 12 and 13 and apply to the largest in-scope services. They require services to undertake risk assessments for defined categories of harmful but legal content, before setting and enforcing clear terms of service for each category of content.

I share the concerns raised by Members of this House and more broadly that these provisions could have a detrimental effect on freedom of expression. It is not right that the Government define what legal content they consider harmful to adults and then require platforms to risk assess for that content. Doing so may encourage companies to remove legal speech, undermining this Government’s commitment to freedom of expression. That is why these provisions must be removed.

At the same time, I recognise the undue influence that the largest platforms have over our public discourse. These companies get to decide what we do and do not see online. They can arbitrarily remove a user’s content or ban them altogether without offering any real avenues of redress to users. On the flip side, even when companies have terms of service, these are often not enforced, as we have discussed. That was the case after the Euro 2020 final where footballers were subject to the most appalling abuse, despite most platforms clearly prohibiting that. That is why I am introducing duties to improve the transparency and accountability of platforms and to protect free speech through new clauses 3 and 4. Under these duties, category 1 platforms will only be allowed to remove or restrict access to content or ban or suspend users when this is in accordance with their terms of service or where they face another legal obligation. That protects against the arbitrary removal of content.

Companies must ensure that their terms of service are consistently enforced. If companies’ terms of service say that they will remove or restrict access to content, or will ban or suspend users in certain circumstances, they must put in place proper systems and processes to apply those terms. That will close the gap between what companies say they will do and what they do in practice. Services must ensure that their terms of service are easily understandable to users and that they operate effective reporting and redress mechanisms, enabling users to raise concerns about a company’s application of the terms of service. We will debate the substance of these changes later alongside clause 18.

Clause 55 currently defines

“content that is harmful to adults”,

including

“priority content that is harmful to adults”

for the purposes of this legislation. As this concept would be removed with the removal of the adult safety duties, this clause will also need to be removed.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My hon. Friend mentioned earlier that companies will not be able to remove content if it is not part of their safety duties or if it was not a breach of their terms of service. I want to be sure that I heard that correctly and to ask whether Ofcom will be able to risk assess that process to ensure that companies are not over-removing content.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. I will come on to Ofcom in a second and respond directly to his question.

The removal of clauses 12, 13 and 55 from the Bill, if agreed by the Committee, will require a series of further amendments to remove references to the adult safety duties elsewhere in the Bill. These amendments are required to ensure that the legislation is consistent and, importantly, that platforms, Ofcom and the Secretary of State are not held to requirements relating to the adult safety duties that we intend to remove from the Bill. The amendments remove requirements on platforms and Ofcom relating to the adult safety duties. That includes references to the adult safety duties in the duties to provide content reporting and redress mechanisms and to keep records. They also remove references to content that is harmful to adults from the process for designating category 1, 2A and 2B companies. The amendments in this group relate mainly to the process for the category 2B companies.

I also seek to amend the process for designating category 1 services to ensure that they are identified based on their influence over public discourse, rather than with regard to the risk of harm posed by content that is harmful to adults. These changes will be discussed when we debate the relevant amendments alongside clause 82 and schedule 11. The amendments will remove powers that will no longer be required, such as the Secretary of State’s ability to designate priority content that is harmful to adults. As I have already indicated, we intend to remove the adult safety duties and introduce new duties on category 1 services relating to transparency, accountability and freedom of expression. While they will mostly be discussed alongside clause 18, amendments 61 to 66, 68 to 70 and 74 will add references to the transparency, accountability and freedom of expression duties to schedule 8. That will ensure that Ofcom can require providers of category 1 services to give details in their annual transparency reports about how they comply with the new duties. Those amendments define relevant content and consumer content for the purposes of the schedule.

We will discuss the proposed transparency and accountability duties that will replace the adult safety duties in more detail later in the Committee’s deliberations. For the reasons I have set out, I do not believe that the current adult safety duties with their risks to freedom of expression should be retained. I therefore urge the Committee that clauses 12, 13 and 55 do not stand part and instead recommend that the Government amendments in this group are accepted.

None Portrait The Chair
- Hansard -

Before we proceed, I emphasise that we are debating clause 13 stand part as well as the litany of Government amendments that I read out.

10:45
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Clause 12 is extremely important because it outlines the platforms’ duties in relation to keeping adults safe online. The Government’s attempts to remove the clause through an amendment that thankfully has not been selected are absolutely shocking. In addressing Government amendments 18, 23, 24, 25, 32, 33 and 39, I must ask the Minister: exactly how will this Bill do anything to keep adults safe online?

In the original clause 12, companies had to assess the risk of harm to adults and the original clause 13 outlined the means by which providers had to report these assessments back to Ofcom. This block of Government amendments will make it impossible for any of us—whether that is users of a platform or service, researchers or civil society experts—to understand the problems that arise on these platforms. Labour has repeatedly warned the Government that this Bill does not go far enough to consider the business models and product design of platforms and service providers that contribute to harm online. By tabling this group of amendments, the Government are once again making it incredibly difficult to fully understand the role of product design in perpetuating harm online.

We are not alone in our concerns. Colleagues from Carnegie UK Trust, who are a source of expertise to hon. Members across the House when it comes to internet regulation, have raised their concerns over this grouping of amendments too. They have raised specific concerns about the removal of the transparency obligation, which Labour has heavily pushed for in previous Bill Committees.

Previously, service providers had been required to inform customers of the harms their risk assessment had detected, but the removal of this risk assessment means that users and consumers will not have the information to assess the nature of risk on the platform. The Minister may point to the Government’s approach in relation to the new content duties in platforms’ and providers’ terms of service, but we know that there are risks arising from the fact that there is no minimum content specified for the terms of service for adults, although of course all providers will have to comply with the illegal content duties.

This approach, like the entire Bill, is already overly complex—that is widely recognised by colleagues across the House and is the view of many stakeholders too. In tabling this group of amendments, the Minister is showing his ignorance. Does he really think that all vulnerabilities to harm online simply disappear at the age of 18? By pushing these amendments, which seek to remove these protections from harmful but legal content to adults, the Minister is, in effect, suggesting that adults are not susceptible to harm and therefore risk assessments are simply not required. That is an extremely narrow-minded view to take, so I must push the Minister further. Does he recognise that many young, and older, adults are still highly likely to be impacted by suicide and self-harm messaging, eating disorder content, disinformation and abuse, which will all be untouched by these amendments?

Labour has been clear throughout the passage of the Bill that we need to see more, not less, transparency and protection from online harm for all of us—whether adults or children. These risk assessments are absolutely critical to the success of the Online Safety Bill and I cannot think of a good reason why the Minister would not support users in being able to make an assessment about their own safety online.

We have supported the passage of the Bill, as we know that keeping people safe online is a priority for us all and we know that the perfect cannot be the enemy of the good. The Government have made some progress towards keeping children safe, but they clearly do not consider it their responsibility to do the same for adults. Ultimately, platforms should be required to protect everyone: it does not matter whether they are a 17-year-old who falls short of being legally deemed an adult in this country, an 18-year-old or even an 80-year-old. Ultimately, we should all have the same protections and these risk assessments are critical to the online safety regime as a whole. That is why we cannot support these amendments. The Government have got this very wrong and we have genuine concerns that this wholesale approach will undermine how far the Bill will go to truly tackling harm online.

I will also make comments on clause 55 and the other associated amendments. I will keep my comments brief, as the Minister is already aware of my significant concerns over his Department’s intention to remove adult safety duties more widely. In the previous Bill Committee, Labour made it clear that it supports, and thinks it most important, that the Bill should clarify specific content that is deemed to be harmful to adults. We have repeatedly raised concerns about missing harms, including health misinformation and disinformation, but really this group of amendments, once again, will touch on widespread concerns that the Government’s new approach will see adults online worse off. The Government’s removal of the “legal but harmful” sections of the Online Safety Bill is a major weakening—not a strengthening—of the Bill. Does the Minister recognise that the only people celebrating these decisions will be the executives of big tech firms, and online abusers? Does he agree that this delay shows that the Government have bowed to vested interests over keeping users and consumers safe?

Labour is not alone in having these concerns. We are all pleased to see that child safety duties are still present in the Bill, but the NSPCC, among others, is concerned about the knock-on implications that may introduce new risks to children. Without adult safety duties in place, children will be at greater risk of harm if platforms do not identify and protect them as children. In effect, these plans will now place a significant greater burden on platforms to protect children than adults. As the Bill currently stands, there is a significant risk of splintering user protections that can expose children to adult-only spaces and harmful content, while forming grooming pathways for offenders, too.

The reality is that these proposals to deal with harms online for adults rely on the regulator ensuring that social media companies enforce their own terms and conditions. We already know and have heard that that can have an extremely damaging impact for online safety more widely, and we have only to consider the very obvious and well-reported case study involving Elon Musk’s takeover of Twitter to really get a sense of how damaging that approach is likely to be.

In late November, Twitter stopped taking action against tweets in violation of coronavirus rules. The company had suspended at least 11,000 accounts under that policy, which was designed to remove accounts posting demonstrably false or misleading content relating to covid-19 that could lead to harm. The company operated a five-strike policy, and the impact on public health around the world of removing that policy will likely be tangible. The situation also raises questions about the platform’s other misinformation policies. As of December 2022, they remain active, but for how long remains unclear.

Does the Minister recognise that as soon as they are inconvenient, platforms will simply change their terms and conditions, and terms of service? We know that simply holding platforms to account for their terms and conditions will not constitute robust enough regulation to deal with the threat that these platforms present, and I must press the Minister further on this point.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

My hon. Friend is making an excellent speech. I share her deep concerns about the removal of these clauses. The Government have taken this tricky issue of the concept of “legal but harmful”—it is a tricky issue; we all acknowledge that—and have removed it from the Bill altogether. I do not think that is the answer. My hon. Friend makes an excellent point about children becoming 18; the day after they become 18, they are suddenly open to lots more harmful and dangerous content. Does she also share my concern about the risks of people being drawn towards extremism, as well as disinformation and misinformation?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. That situation does not change as soon as a 17-year-old turns 18 on their 18th birthday; they are not suddenly exempt when it comes to seeing this horrendous content. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, if faced with a clearer focus on their terms of service, platforms and providers may choose to make their terms of service shorter, in an attempt to cut out harmful material that, if left undealt with, they may be held liable for.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s approach and overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what the content is or includes. Throwing the baby out with the bathwater is not the way to tackle that situation. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand in the Bill go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Throughout the consideration of the Bill, I have been clear that I do not want it to end up simply being the keep MPs safe on Twitter Bill. That is not what it should be about. I did not mean that we should therefore take out everything that protects adults; what I meant was that we need to have a big focus on protecting children in the Bill, which thankfully we still do. For all our concerns about the issues and inadequacies of the Bill, it will go some way to providing better protections for children online. But saying that it should not be the keep MPs safe on Twitter Bill does not mean that it should not keep MPs safe on Twitter.

I understand how we have got to this situation. What I cannot understand is the Minister’s being willing to stand up there and say, “We can’t have these clauses because they are a risk to freedom of speech.” Why are they in the Bill in the first place if they are such a big risk to freedom of speech? If the Government’s No. 1 priority is making sure that we do not have these clauses, why did they put them in it? Why did it go through pre-legislative scrutiny? Why were they in the draft Bill? Why were they in the Bill? Why did they agree with them in Committee? Why did they agree with them on Report? Why have we ended up in a situation where, suddenly, there is a massive epiphany that they are a threat to freedom of speech and therefore we cannot possibly have them?

What is it that people want to say that they will be banned from saying as a result of this Bill? What is it that freedom of speech campaigners are so desperate to say online? Do they want to promote self-harm on platforms? Is that what people want to do? Is that what freedom of speech campaigners are out for? They are now allowed to do that as a result of the Bill.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

I believe that the triple shield being put in is in place of “legal but harmful”. That will enable users to put a layer of protection in so they can actually take control. But the illegal content still has to be taken down: anything that promotes self-harm is illegal content and would still have to be removed. The problem with the way it was before is that we had a Secretary of State telling us what could be said out there and what could not. What may offend the hon. Lady may not offend me, and vice versa. We have to be very careful of that. It is so important that we protect free speech. We are now giving control to each individual who uses the internet.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The promotion of self-harm is not illegal content; people are now able to do that online—congratulations, great! The promotion of incel culture is not illegal content, so this Bill will now allow people to do that online. It will allow terms of service that do not require people to be banned for promoting incel culture, self-harm, not wearing masks and not getting a covid vaccine. It will allow the platforms to allow people to say these things. That is what has been achieved by campaigners.

The Bill is making people less safe online. We will continue to have the same problems that we have with people being driven to suicide and radicalised online as a result of the changes being made in this Bill. I know the Government have been leaned on heavily by the free speech lobby. I still do not know what people want to say that they cannot say as a result of the Bill as it stands. I do not know. I cannot imagine that anybody is not offended by content online that drives people to hurt themselves. I cannot imagine anybody being okay and happy with that. Certainly, I imagine that nobody in this room is okay and happy with that.

These people have won this war on the attack on free speech. They have won a situation where they are able to promote misogynistic, incel culture and health disinformation, where they are able to say that the covid vaccine is entirely about putting microchips in people. People are allowed to say that now—great! That is what has been achieved, and it is a societal issue. We have a generational issue where people online are being exposed to harmful content. That will now continue.

It is not just a generational societal thing—it is not just an issue for society as a whole that these conspiracy theories are pervading. Some of the conspiracy theories around antisemitism are unbelievably horrific, but do not step over into illegality or David Icke would not be able to stand up and suggest that the world is run by lizard people—who happen to be Jewish. He would not be allowed to say that because it would be considered harmful content. But now he is. That is fine. He is allowed to say that because this Bill is refusing to take action on that.

11:00
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Can the hon. Lady tell me where in the Bill, as it is currently drafted—so, unamended—it requires platforms to remove legal speech?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It allows the platforms to do that. It allows them, and requires legal but harmful stuff to be taken into account. It requires the platforms to act—to consider, through risk assessments, the harm done to adults by content that is legal but massively harmful.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady is right: the Bill does not require the removal of legal speech. Platforms must take the issue into account—it can be risk assessed—but it is ultimately their decision. I think the point has been massively overstated that, somehow, previously, Ofcom had the power to strike down legal but harmful speech that was not a breach of either terms of service or the law. It never had that power.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, if it was never an issue, why are the Government bothered about that risk—it clearly was not a risk—to free speech? If that was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they will have to strip this measure out of the Bill because of the risk to free speech, because clearly it was not a risk in this situation. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The hon. Member is making an excellent and very passionate speech, and I commend her for that. Would she agree with one of my concerns, which is about the message that this sends to the public? It is almost that the Government were acknowledging that there was a problem with legal but harmful content—we can all, hopefully, acknowledge that that is a problem, even though we know it is a tricky one to tackle—but, by removing these clauses from the Bill, are now sending the message that, “We were trying to clean up the wild west of the internet, but, actually, we are not that bothered anymore.”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online and of being radicalised by being driven deeper and deeper into blacker and blacker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or being referred to the Prevent programme because they are at risk of committing terror attacks—are not so much on the far-right levels of extremism any more, or those with incredible levels of religious extremism, but are in a situation where they have got mixed up or unclear ideological drivers. It is not the same situation as it was before, because people are being radicalised by the stuff that they find online. They are being radicalised into situations where they “must do something”—they “must take some action”—because of the culture change in society.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Member is making a powerful point. Just a few weeks ago, I asked the Secretary of State for Digital, Culture, Media and Sport, at the Dispatch Box, whether the horrendous and horrific content that led a man to shoot and kill five people in Keyham—in the constituency of my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard)—would be allowed to remain and perpetuate online as a result of the removal of these clauses from the Bill. I did not get a substantial answer then, but we all know that the answer is yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is the thing: this Bill is supposed to be the Online Safety Bill. It is supposed to be about protecting people from the harm that can be done to them by others. It is also supposed to be about protecting people from that radicalisation and that harm that they can end up in. It is supposed to make a difference. It is supposed to be game changer and a world leader.

Although, absolutely, I recognise the importance of the child-safety duties in the clauses and the change that that will have, when people turn 18 they do not suddenly become different humans. They do not wake up on their 18th birthday as a different person from the one that they were before. They should not have to go from that level of protection, prior to 18, to being immediately exposed to comments and content encouraging them to self-harm, and to all of the negative things that we know are present online.

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

I understand some of the arguments the hon. Lady is making, but that is a poor argument given that the day people turn 17 they can learn to drive or the day they turn 16 they can do something else. There are lots of these things, but we have to draw a line in the sand somewhere. Eighteen is when people become adults. If we do not like that, we can change the age, but there has to be a line in the sand. I agree with much of what the hon. Lady is saying, but that is a poor argument. I am sorry, but it is.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I do not disagree that overnight changes are involved, but the problem is that we are going from a certain level of protection to nothing; there will be a drastic, dramatic shift. We will end up with any vulnerable person who is over 18 being potentially subject to all this content online.

I still do not understand what people think they will have won as a result of having the provisions removed from the Bill. I do not understand how people can say, “This is now a substantially better Bill, and we are much freer and better off as a result of the changes.” That is not the case; removing the provisions will mean the internet continuing to be unsafe—much more unsafe than it would have been under the previous iteration of the Bill. It will ensure that more people are harmed as a result of online content. It will absolutely—

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

Will the hon. Lady give way?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

No, I will not give way again. The change will ensure that people can absolutely say what they like online, but the damage and harm that it will cause are not balanced by the freedoms that have been won.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As a Back-Bench Member of Parliament, I recommended that the “legal but harmful” provisions be removed from the Bill. When I chaired the Joint Committee of both Houses of Parliament that scrutinised the draft Bill, it was the unanimous recommendation of the Committee that the “legal but harmful” provisions be removed. As a Minister at the Dispatch Box, I said that I thought “legal but harmful” was a problematic term and we should not use it. The term “legal but harmful” does not exist in the Bill, and has never existed in the Bill, but it has provoked a debate that has caused a huge confusion. There is a belief, which we have heard expressed in debate today, that somehow there are categories of content that Ofcom can deem categories for removal whether they are unlawful or not.

During the Bill’s journey from publication in draft to where we are today, it has become more specific. Rather than our relying on general duties of care, written into the Bill are areas of priority illegal activity that the companies must proactively look for, monitor and mitigate. In the original version of the Bill, that included only terrorist content and child sexual exploitation material, but on the recommendation of the Joint Committee, the Government moved in the direction of writing into the Bill at schedule 7 offences in law that will be the priority illegal offences.

The list of offences is quite wide, and it is more comprehensive than any other such list in the world in specifying exactly what offences are in scope. There is no ambiguity for the platforms as to what offences are in scope. Stalking, harassment and inciting violence, which are all serious offences, as well as the horrible abuse a person might receive as a consequence of their race or religious beliefs, are written into the Bill as priority illegal offences.

There has to be a risk assessment of whether such content exists on platforms and what action platforms should take. They are required to carry out such a risk assessment, although that was never part of the Bill before. The “legal but harmful” provisions in some ways predate that. Changes were made; the offences were written into the Bill, risk assessments were provided for, and Parliament was invited to create new offences and write them into the Bill, if there were categories of content that had not been captured. In some ways, that creates a democratic lock that says, “If we are going to start to regulate areas of speech, what is the legal reason for doing that? Where is the legal threshold? What are the grounds for us taking that decision, if it is something that is not already covered in platforms’ terms of service?”

We are moving in that direction. We have a schedule of offences that we are writing into the Bill, and those priority illegal offences cover most of the most serious behaviour and most of the concerns raised in today’s debate. On top of that, there is a risk assessment of platforms’ terms of service. When we look at the terms of service of the companies—the major platforms we have been discussing—we see that they set a higher bar again than the priority illegal harms. On the whole, platforms do not have policies that say, “We won’t do anything about this illegal activity, race hate, incitement to violence, or promotion or glorification of terrorism.” The problem is that although they have terms of service, they do not enforce them. Therefore, we are not relying on terms of service. What we are saying, and what the Bill says, is that the minimum safety standards are based on the offences written into the Bill. In addition, we have risk assessment, and we have enforcement based on the terms of service.

There may be a situation in which there is a category of content that is not in breach of a platform’s terms of service and not included in the priority areas of illegal harm. It is very difficult to think of what that could be—something that is not already covered, and over which Ofcom would not have power. There is the inclusion of the new offences of promoting self-harm and suicide. That captures not just an individual piece of content, but the systematic effect of a teenager like Molly Russell—or an adult of any age—being targeted with such content. There are also new offences for cyber-flashing, and there is Zach’s law, which was discussed in the Chamber on Report. We are creating and writing into the Bill these new priority areas of illegal harm.

Freedom of speech groups’ concern was that the Government could have a secret list of extra things that they also wanted risk-assessed, rather than enforcement being clearly based either on the law or on clear terms of service. It is difficult to think of categories of harm that are not already captured in terms of service or priority areas of illegal harm, and that would be on such a list. I think that is why the change was made. For freedom of speech campaigners, there was a concern about exactly what enforcement was based on: “Is it based on the law? Is it based on terms of service? Or is it based on something else?”

I personally believed that the “legal but harmful” provisions in the Bill, as far as they existed, were not an infringement on free speech, because there was never a requirement to remove legal speech. I do not think the removal of those clauses from the Bill suddenly creates a wild west in which no enforcement will take place at all. There will be very effective enforcement based on the terms of service, and on the schedule 7 offences, which deal with the worst kinds of illegal activity; there is a broad list. The changes make it much clearer to everybody—platforms and users alike, and Ofcom—exactly what the duties are, how they are enforced and what they are based on.

For future regulation, we have to use this framework, so that we can say that when we add new offences to the scope of the legislation, they are offences that have been approved by Parliament and have gone through a proper process, and are a necessary addition because terms of service do not cover them. That is a much clearer and better structure to follow, which is why I support the Government amendments.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I cannot help but see the Government’s planned removal of clauses 12 and 13 as essentially wrecking amendments to the Bill. Taking those provisions out of the Bill makes it a Bill not about online safety, but about child protection. We have not had five years or so of going backwards and forwards, and taken the Bill through Committee and then unprecedentedly recommitted it to Committee, in order to fundamentally change what the Bill set out to do. The fact that, at this late stage, the Government are trying to take out these aspects of the Bill melts my head, for want of a better way of putting it.

My hon. Friend the Member for Batley and Spen was absolutely right when she talked about what clauses 12 and 13 do. In effect, they are an acknowledgement that adults are also harmed online, and have different experiences online. I strongly agree with the hon. Member for Aberdeen North about this not being the protect MPs from being bullied on Twitter Bill, because obviously the provisions go much further than that, but it is worth noting, in the hope that it is illustrative to Committee members, the very different experience that the Minister and I have in using Twitter. I say that as a woman who is LGBT and Jewish—and although I would not suggest that it should be a protected characteristic, the fact that I am ginger probably plays a part as well. He and I could do the same things on Twitter on the same day and have two completely different experiences of that platform.

The risk-assessment duties set out in clause 12, particularly in subsection (5)(d) to (f), ask platforms to consider the different ways in which different adult users might experience them. Platforms have a duty to attempt to keep certain groups of people, and categories of user, safe. When we talk about free speech, the question is: freedom of speech for whom, and at what cost? Making it easier for people to perpetuate, for example, holocaust denial on the internet—a category of speech that is lawful but awful, as it is not against the law in this country to deny that the holocaust happened—makes it much less likely that I, or other Jewish people, will want to use the platform.

11:24
Our freedom of speech and ability to express ourselves on the platform are curtailed by the platform’s decision to prioritise the freedom of expression of people who would deny the holocaust over that of Jewish people who want to use the platform safely and not be bombarded by people making memes of their relatives getting thrown into gas chambers, of Jewish people with big noses, or of the Rothschild Zionist global control conspiracy nonsense that was alluded to earlier, which is encountered online constantly by Jewish users of social media platforms.

Organisations such as the Community Security Trust and the Antisemitism Policy Trust, which do excellent work in this area, have been very clear that someone’s right to be protected from that sort of content should not end the day they turn 18. Duties should remain on platforms to do risk assessments to protect certain groups of adults who may be at increased risk from such content, in order to protect their freedom of speech and expression.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Member makes a powerful point about the different ways in which people experience things. That tips over into real-life abusive interactions, and goes as far as terrorist incidents in some cases. Does she agree that protecting people’s freedom of expression and safety online also protects people in their real, day-to-day life?

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I could not agree more. I suppose that is why this aspect of the Bill is so important, not just to me but to all those categories of user. I mentioned paragraphs (d) to (f), which would require platforms to assess exactly that risk. This is not about being offended. Personally, I have the skin of a rhino. People can say most things to me and I am not particularly bothered by it. My concern is where things that are said online are transposed into real-life harms. I will use myself as an example. Online, we can see antisemitic and conspiratorial content, covid misinformation, and covid misinformation that meets with antisemitism and conspiracies. When people decide that I, as a Jewish Member of Parliament, am personally responsible for George Soros putting a 5G chip in their arm, or whatever other nonsense they have become persuaded by on the internet, that is exactly the kind of thing that has meant people coming to my office armed with a knife. The kind of content that they were radicalised by on the internet led to their perpetrating a real-life, in-person harm. Thank God—Baruch Hashem—neither I nor my staff were in the office that day, but that could have ended very differently, because of the sorts of content that the Bill is meant to protect online users from.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady is talking about an incredibly important issue, but the Bill covers such matters as credible threats to life, incitement to violence against an individual, and harassment and stalking—those patterns of behaviour. Those are public order offences, and they are in the Bill. I would absolutely expect companies to risk-assess for that sort of activity, and to be required by Ofcom to mitigate it. On her point about holocaust denial, first, the shield will mean that people can protect themselves from seeing stuff. The further question would be whether we create new offences in law, which can then be transposed across.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I accept the points that the hon. Member raised, but he is fundamentally missing the point. The categories of information and content that these people had seen and been radicalised by would not fall under the scope of public order offences or harassment. The person was not sending me harassing messages before they turned up at my office. Essentially, social media companies and other online platforms have to take measures to mitigate the risk of categories of offences that are illegal, whether or not they are in the Bill. I am talking about what clauses 12 and 13 covered, whether we call it the “legal but harmful” category or “lawful but awful”. Whatever we name those provisions, by taking out of the Bill clauses relating to the “legal but harmful” category, we are opening up an area of harm that already exists, that has a real-world impact, and that the Bill was meant to go some way towards addressing.

By removing these provisions, the Government have taken out the risk assessments that need to be done. The Bill says,

“(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults”.

Again, the idea that we are talking about offence, and that the clauses need to be taken out to protect free speech, is fundamentally nonsense.

I have already mentioned holocaust denial, but it is also worth mentioning health-related disinformation. We have already seen real-world harms from some of the covid misinformation online. It led to people, including Piers Corbyn, turning up outside Parliament with a gallows, threatening to hang hon. Members for treason. Obviously, that was rightly dealt with by the police, but the kind of information and misinformation that he had been getting online and that led him to do that, which is legal but harmful, will now not be covered by the Bill.

I will also raise an issue I have heard about from a number of people dealing with cancer and conditions such as multiple sclerosis. People online try to discourage them from accessing the proper medical interventions for their illnesses, and instead encourage them to take more vitamin B or adopt a vegan diet. There are people who have died because they had cancer but were encouraged online to not access cancer treatment because they were subject to lawful but awful categories of harm.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I wonder whether the hon. Member saw the story online about the couple in New Zealand who refused to let their child have a life-saving operation because it could not be guaranteed that the blood used would not be from vaccinated people. Is the hon. Member similarly concerned that this has caused real-life harm?

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I am aware of the case that the hon. Member mentioned. I appreciate that I am probably testing the patience of everybody in the Committee Room, but I want to be clear just how abhorrent I find it that these provisions are coming out of the Bill. I am trying to be restrained, measured and reasonably concise, but that is difficult when there are so many parts of the change that I find egregious.

My final point is on self-harm and suicide content. For men under the age of 45, suicide is the biggest killer. In the Bill, we are doing as much as we can to protect young people from that sort of content. My real concern is this: many young people are being protected by the Bill’s provisions relating to children. They are perhaps waiting for support from child and adolescent mental health services, which are massively oversubscribed. The minute they tick over into 18, fall off the CAMHS waiting list and go to the bottom of the adult mental health waiting list—they may have to wait years for treatment of various conditions—there is no requirement or duty on the social media companies and platforms to do risk assessments.

11:25
The Chair adjourned the Committee without Question put (Standing Order No. 88).
Adjourned till this day at Two o’clock.

ONLINE SAFETY BILL (Second sitting)

Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022 (13 Dec 2022)
The Committee consisted of the following Members:
Chairs: † Dame Angela Eagle, Sir Roger Gale
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
Bhatti, Saqib (Meriden) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Bonnar, Steven (Coatbridge, Chryston and Bellshill) (SNP)
† Bristow, Paul (Peterborough) (Con)
† Collins, Damian (Folkestone and Hythe) (Con)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Fletcher, Nick (Don Valley) (Con)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Maclean, Rachel (Redditch) (Con)
† Nichols, Charlotte (Warrington North) (Lab)
† Owen, Sarah (Luton North) (Lab)
Peacock, Stephanie (Barnsley East) (Lab)
† Russell, Dean (Watford) (Con)
† Scully, Paul (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Wood, Mike (Dudley South) (Con)
Kevin Maddison, Bethan Harding, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 13 December 2022
(Afternoon)
[Dame Angela Eagle in the Chair]
Online Safety Bill
(Re-committed Clauses and Schedules: Clauses 11 to 14, 18 to 21, 30, 46, 55 and 65, Schedule 8, Clauses 79 and 82, Schedule 11, Clauses 87, 90, 115, 169 and 183, Schedule 17, Clauses 203, 206 and 207, new Clauses and new Schedules)
14:00
None Portrait The Chair
- Hansard -

Before we begin, I have a few preliminary announcements. Hansard colleagues would be grateful if Members could email their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent. Traditionally, the Chair of a Committee gives Members permission to take off their jackets, but given the temperature in this room, please understand that you do not need my permission to keep on your blankets or coats.

Clause 12

Adults’ risk assessment duties

Question (this day) again proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

I remind the Committee that with this we are discussing the following:

Clause 13 stand part.

Government amendments 18, 23 to 25, 32, 33 and 39.

Clause 55 stand part.

Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.

Charlotte Nichols Portrait Charlotte Nichols (Warrington North) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Dame Angela. I did not make a note of the specific word I was on when we adjourned, so I hope Hansard colleagues will forgive me if the flow between what I said previously and what I say now is somewhat stilted.

I will keep this brief, because I was—purposefully—testing the patience of the Minister with some of my contributions. However, I did so to hammer home the fact that the removal of clauses 12 and 13 from the Bill is a fatal error. If the recommittal of the Bill is not to fundamentally undermine what the Bill set out to do five years or so ago, their removal should urgently be reconsidered. We have spent five years debating the Bill to get it to this point.

As I said, there are forms of harm that are not illegal, but they are none the less harmful, and they should be legislated for. They should be in the Bill, as should specific protections for adults, not just children. I therefore urge the Minister to keep clauses 12 and 13 in the Bill so that we do not undermine what it set out to do and all the work that has been done up to this point. Inexplicably, the Government are trying to undo that work at this late stage before the Bill becomes law.

Sarah Owen Portrait Sarah Owen (Luton North) (Lab)
- Hansard - - - Excerpts

It is a pleasure to see you in the Chair, Dame Angela—I wish it was a toastier room. Let me add to the points that the shadow Minister, my hon. Friend the Member for Pontypridd, made so powerfully about vulnerable people. There is no cliff edge when such a person becomes 18. What thought have the Minister and the Department given to vulnerable young adults with learning disabilities or spectrum disorders? Frankly, the idea that, as soon as a person turns 18, they are magically no longer vulnerable is for the birds—particularly when it comes to eating disorders, suicide and self-harm.

Adults do not live in isolation, and they do not just live online. We have a duty of care to people. The perfect example is disinformation, particularly when it comes to its harmful impact on public health. We saw that with the pandemic and vaccine misinformation. We saw it with the harm done to children by the anti-vaccine movement’s myths about vaccines, children and babies. It causes greater harm than just having a conversation online.

People do not stay in one lane. Once people start being sucked into conspiracy myths—much as we discussed earlier in relation to the algorithms that are used to keep people online—the content has to keep ramping up. Social media and tech companies do that very well; they know how to do it. That is why I might start looking for something to do with ramen recipes and all of a sudden I am on to a cat that has decided to make noodles. It always ramps up. That is the fun end of it, but at the serious end somebody will start to have doubts about certain public health messages that the Government are sending out. That then tips into other conspiracy theories that have really harmful, damaging consequences.

I saw that personally. My hon. Friend the Member for Warrington North eloquently put forward some really powerful examples of what she has been subjected to. With covid, some of the anti-vaccinators and anti-mask-wearers who targeted me quickly slipped into Sinophobia and racism. I was sent videos of people eating live animals, and being blamed for a global pandemic.

The people who have been targeted do not stay in one lane. The idea that adults are not vulnerable, and susceptible, to such targeting and do not need protection from it is frankly for the birds. We see that particularly with extremism, misogyny and the incel culture. I take the point from our earlier discussion about who determines what crosses the legal threshold, but why do we have to wait until somebody is physically hurt before the Government act?

That is really regrettable. So, too, is the fact that this is such a huge U-turn in policy, with 15% of the Bill coming back to Committee. As we have heard, that is unprecedented, and yet, on the most pivotal point, we were unable to hear expert advice, particularly from the National Society for the Prevention of Cruelty to Children, Barnardo’s and the Antisemitism Policy Trust. I was struggling to understand why we would not hear expert advice on such a drastic change to an important piece of legislation—until I heard the hon. Member for Don Valley talk about offence. This is not about offence; it is about harm.

The hon. Member’s comments highlighted perfectly the real reason we are all here in a freezing cold Bill Committee, rehashing work that has already been solved. The Bill was not perfect by any stretch of the imagination, but it was better than what we have today. The real reason we are here is the fight within the Conservative party.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

No such fight has taken place. These are my personal views, and I genuinely believe that people have a right to say what they would like to say. That is free speech. There have been no fights whatever.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

In that case, I must have been mistaken in thinking that the hon. Member—who has probably said quite a lot of things, which is why his voice is as hoarse as it is—was criticising the former Minister for measures that were agreed in previous Committee sittings.

For me, the current proposals are a really disappointing, retrograde step. They will not protect the most vulnerable people in our communities, including offline—this harm is not just online, but stretches out across all our communities. What happens online does not take place, and stay, in an isolated space; people are influenced by it and take their cues from it. They do not just take their cues from what is said in Parliament; they see misogynists online and think that they can treat people like that. They see horrific abuses of power and extreme pornography and, as we heard from the hon. Member for Aberdeen North, take their cues from that. What happens online does not stay online.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

My hon. Friend makes an important point about what happens online and its influence on the outside world. We saw that most recently with Kanye West being reinstated to Twitter and allowed to spew his bile and abhorrent views about Jews. That antisemitism had a real-world impact in terms of the rise in antisemitism on the streets, particularly in the US. The direct impact of his being allowed to talk about that online was Jews being harmed in the real world. That is exactly what is happening.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I thank the shadow Minister for that intervention. She is absolutely right. We have had a discussion about terms of reference and terms of service. Not only do most people not actually fully read them or understand them, but they are subject to change. The moment Elon Musk took over Twitter, everything changed. Not only have we got Donald Trump back, but Elon Musk also gave the keys to a mainstream social media platform to Kanye West. We have seen what happened then.

That is the situation the Government will now not shut the door on. That is regrettable. For all the reasons we have heard today, it is really damaging. It is really disappointing that we are not taking the opportunity to lead in this area.

Paul Scully Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Dame Angela.

A lot of the discussion has replayed the debate from day two on Report about the removal of the “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues such as self-harm, on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but I would just say that we have already been clear that if we agree that the promotion of things such as self-harm should be illegal, it should be made illegal. Let us be very straight about how we deal with the promotion of self-harm.

The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. On some of the other clauses, we will talk about the triple shield that was mentioned earlier. If the content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user enforcement issues in further clauses.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

The Minister mentions tools for adults to keep themselves safe. Does he not think that that puts the onus on the users—the victims—to keep themselves safe? The measures as they stand in the Bill put the onus on the companies to be more proactive about how they keep people safe.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The onus on adults is very much a safety net—very much a catch-all, after we have put the onus on the social media companies and the platforms to adhere to their own terms and conditions.

We have heard a lot about Twitter and the changes to Twitter. We can see the commercial imperative for mainstream platforms, certainly the category 1 platforms, to have a wide enough catch-all in their terms of service—anything that an advertiser, for example, would see as reasonably sensible—to be able to remain a viable platform in the first place. When Elon Musk first started making changes at Twitter, a comment did the rounds: “How do you build a multimillion-dollar company? You sell it to Elon Musk for $44 billion.” He made that change. He has seen the bottom falling out of his market and has lost a lot of the cash he put into Twitter. That is the commercial impetus that underpins a lot of the changes we are making.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Is the Minister really suggesting that it is reasonable for people to say, “Right, I am going to have to walk away from Facebook because I don’t agree with their terms of service,” to hold the platform to account? How does he expect people to keep in touch with each other if they have to walk away from social media platforms in order to try to hold them to account?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not think the hon. Lady is seriously suggesting that people can communicate only via Facebook—via one platform. The point is that there are a variety of methods of communication, of which Facebook has been a major one, although it is not one of the biggest now, with its share value having dropped 71% in the last year. That is, again, another commercial impetus for it to change its platform in other, usability-related ways.

14:15
People will have choice in terms of how they communicate, but we are saying that if something is illegal, it will need to be removed from the platform. The majority of the big communication platforms referred to by the hon. Member for Aberdeen North, which people are becoming increasingly committed and dedicated to as part of their lives and ways of communicating around the world, will need to keep their terms of service broad. We heard from my hon. Friend the Member for Folkestone and Hythe that their terms of service are largely a higher bar than what was in the original Bill, so it is about getting them to adhere to the terms of service. It is not about the measures we put in; it is about how things are enforced. If platforms cannot enforce their terms of service, there is a swingeing fine—£18 million or 10% of their qualifying global turnover. If they do not then put those things right or share with Ofcom their methods of risk assessment, age verification, age assurance, user enforcement and all those other areas, there is a criminal liability attached as well.
As we heard in this morning’s sitting, companies will clearly need to design their services to prevent the spread of illegal content and protect children. Ofcom will have that broad power to require information from companies to assess compliance with the rules. As I keep saying, platforms have that strong commercial incentive to tackle harmful content, and the major companies already prohibit most of the harmful and abusive content that we talked about this morning, but they are just not readily enforcing that. Their business model does not lend itself to enforcing that legislation, so we have to change that impetus so that they adhere to their moral, as well as their legal, duties.
For that reason, which has been well addressed in the main Chamber and which we will continue to talk about as the Bill continues its passage, this legislation finds the right balance between protecting free speech and freedom of expression, which are vital aims, and protecting vulnerable adults and particularly children. These user empowerment duties are about giving users greater control over their online experience, very much as a safety net, but understanding the risk assessments that each platform will have to provide. It is right that the Bill empowers vulnerable people who may find certain types of legal content unhelpful or harmful, depending on their personal circumstances. We heard from the hon. Member for Warrington North about people’s different experiences, so it is right that people can shape and enforce their own experience with that safety net.
Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

One of the examples I alluded to, which is particularly offensive for Jewish people, LGBT people and other people who were persecuted in the Nazi holocaust, is holocaust denial. Does the Minister seriously think that it is only Jewish people, LGBT people and other people who were persecuted in the holocaust who find holocaust denial offensive and objectionable and who do not want to see it as part of their online experience? Surely having these sorts of safety nets in place and saying that we do not think that certain kinds of content—although they may not be against the law—have a place online protects everyone’s experience, whether they are Jewish or not. Surely, no one wants to see holocaust denial online.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

No, but there is freedom of expression up to a point—until it starts to reach into illegality. We have to have the balance right: someone can say something in public—in any setting offline—but what the hon. Lady is suggesting is that, as soon as they hit a keyboard or a smartphone, there are two totally different regimes. That is not getting the balance right.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

The Minister says that we should have freedom of speech up to a point. Does that point include holocaust denial? He has just suggested that if something is acceptable to say in person, which I do not think holocaust denial should be, it should be acceptable online. Surely holocaust denial is objectionable whenever it happens, in whatever context—online or offline.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” provisions should not be in the Bill.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

A few moments ago, the Minister compared the online world to the real world. Does he agree that they are not the same? Sadly, the sort of thing that someone says in the pub on a Friday night to two or three of their friends is very different from someone saying something dangerously harmful online that can reach millions and billions of people in a very short space of time. The person who spoke in the pub might get up the following morning and regret what they said, but no harm was done. Once something is out there in the online world, very serious damage can be done very quickly.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The hon. Lady makes a good point. I talked about the offline world rather than the real world, but clearly that can happen. That is where the balance has to be struck, as we heard from my hon. Friend the Member for Don Valley. It is not black and white; it is a spectrum of greys. Any sensible person can soon see when they stray into areas that we have talked about such as holocaust denial and extremism, but we do not want to penalise people who invariably are testing their freedom of expression.

It is a fine balance, but I think that we have reached the right balance between protecting freedom of expression and protecting vulnerable adults by having three layers of checks. The first is illegality. The second is enforcing the terms of service, which provide a higher bar than we had in the original Bill for the vast majority of platforms, so that we can see right at the beginning how they will be enforced by the platforms. If platforms change their terms of service and do not adhere to them, Ofcom can step in; indeed, Ofcom can step in at any point to ensure that they are being enforced. The third is a safety net.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On illegal content, is the Minister proposing that the Government will introduce new legislation to make, for example, holocaust denial and eating disorder content illegal, whether it is online or offline? If he is saying that the bar in the online and offline worlds should be the same, will the Government introduce more hate crime legislation?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Hate crime legislation will always be considered by the Ministry of Justice, but I am not committing to any changes. That is beyond my reach, but the two shields that we talked about are underpinned by a safety net.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

Does my hon. Friend agree that the risk assessments that will be done on the priority illegal offences are very wide ranging, in addition to the risk assessments that will be done on meeting the terms of service? They will include racially and religiously motivated harassment, and putting people in fear of violence. A lot of the offences that have been discussed in the debate would already be covered by the adult safety risk assessments in the Bill.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I absolutely agree. As I said in my opening remarks about the racial abuse picked up in relation to the Euro 2020 football championship, that would have been against the terms and conditions of all those platforms, but it still happened because the platforms were not enforcing those terms and conditions. Whether we put such content on a list in the Bill or cover it in the terms of service, it needs to be enforced, and the terms of service are there.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On that point, does my hon. Friend also agree that the priority legal offences are important too? People were prosecuted for what they posted on Twitter and Instagram about the England footballers, so that shows that we understand what racially motivated offences are and that people are prosecuted for them. The Bill will require a minimum regulatory standard that meets that threshold and requires companies to act in cases such as that one, where we know what this content is, what people are posting and what is required. Not only will the companies have to act, but they will have to complete risk assessments to demonstrate how they will do that.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Indeed. I absolutely agree with my hon. Friend and that is a good example of enforcement being used. People can be prosecuted if such abuse appears on social media, but a black footballer, who would otherwise have seen that racial abuse, can choose, through user empowerment, to turn it off so that he does not see it. That does not mean that we cannot pursue a prosecution for racial abuse via a third-party complaint or via the platform.

None Portrait The Chair
- Hansard -

Order. Could the Minister address his remarks through the Chair so that I do not have to look at his back?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I apologise, Dame Angela. I will bring my remarks to a close by saying that with those triple shields, we have the protections and the fine balance that we need.

Question put, That the clause, as amended, stand part of the Bill.

Division 1

Ayes: 6

Noes: 9

Clause 13
Safety duties protecting adults
Question put, That the clause stand part of the Bill.

Division 2

Ayes: 6

Noes: 9

Clause 14
User empowerment duties
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 8, in clause 14, page 14, line 3, leave out “harmful content” and insert—

“content to which this subsection applies”.

This amendment, and Amendments 9 to 17, amend clause 14 (user empowerment) as the adult safety duties are removed (see Amendments 6, 7 and 41). New subsections (8B) to (8D) describe the kinds of content which are now relevant to the duty in clause 14(2) - see Amendment 15.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendments 9 to 14.

Government amendment 15, in clause 14, page 14, line 29, at end insert—

“(8A) Subsection (2) applies to content that—

(a) is regulated user-generated content in relation to the service in question, and

(b) is within subsection (8B), (8C) or (8D).

(8B) Content is within this subsection if it encourages, promotes or provides instructions for—

(a) suicide or an act of deliberate self-injury, or

(b) an eating disorder or behaviours associated with an eating disorder.

(8C) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—

(a) race,

(b) religion,

(c) sex,

(d) sexual orientation,

(e) disability, or

(f) gender reassignment.

(8D) Content is within this subsection if it incites hatred against people—

(a) of a particular race, religion, sex or sexual orientation,

(b) who have a disability, or

(c) who have the characteristic of gender reassignment.”

This amendment describes the content relevant to the duty in subsection (2) of clause 14. The effect is (broadly) that providers must offer users tools to reduce their exposure to these kinds of content.

Amendment (a), to Government amendment 15, at end insert—

“(8E) Content is within this subsection if it—

(a) incites hateful extremism,

(b) provides false information about climate change, or

(c) is harmful to health.”

Government amendment 16, in clause 14, page 14, line 30, leave out subsection (9) and insert—

“(9) In this section—

‘disability’ means any physical or mental impairment;

‘injury’ includes poisoning;

‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 58(1));

‘race’ includes colour, nationality, and ethnic or national origins.”

This amendment inserts definitions of terms now used in clause 14.

Amendment (a), to Government amendment 16, after “mental impairment;” insert—

“‘hateful extremism’ means activity or materials directed at an out-group who are perceived as a threat to an in-group motivated by or intended to advance a political, religious or racial supremacist ideology—

(a) to create a climate conducive to hate crime, terrorism or other violence, or

(b) to attempt to erode or destroy the rights and freedoms protected by article 17 (Prohibition of abuse of rights) of Schedule 1 of the Human Rights Act 1998.”

Government amendment 17.

14:30
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The Government recognise the importance of giving adult users greater choice about what they see online and who they interact with, while upholding users’ rights to free expression online. That is why we have removed the “legal but harmful” provisions from the Bill in relation to adults and replaced them with a fairer, simpler approach: the triple shield.

As I said earlier, the first shield will require all companies in scope to take preventive measures to tackle illegal content or activity. The second shield will place new duties on category 1 services to improve transparency and accountability, and protect free speech, by requiring them to adhere to their terms of service when restricting access to content or suspending or banning users. User empowerment is the key third shield, empowering adults with greater control over their exposure to legal forms of abuse or hatred, or to content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders. That has been done while upholding and protecting freedom of expression.

Amendments 9 and 12 will strengthen the user empowerment duty, so that the largest companies are required to ensure that those tools are effective in reducing the likelihood of encountering the listed content or alerting users to it, and are easy for users to access. That will provide adult users with greater control over their online experience.

We are also setting out in the Bill, through amendment 15, the categories of content to which those user empowerment tools apply. Adult users will be given the choice of whether they want to take advantage of those tools to have greater control over content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders, and content that is abusive, or that incites hatred, against people on the basis of race, religion, sex, sexual orientation, disability or gender reassignment. This is a targeted approach, focused on areas where we know that adult users—particularly those who are vulnerable or disproportionately targeted by online hate and abuse—would benefit from having greater choice.

As I said, the Government remain committed to free speech, which is why we have made changes to the adult safety duties. By establishing high thresholds for inclusion in those content categories, we have ensured that legitimate debate online will not be affected by the user empowerment duties.

I want to emphasise that the user empowerment duties do not require companies to remove legal content from their services; they are about giving individual adult users the option to increase their control over those kinds of content. Platforms will still be required to provide users with the ability to filter out unverified users, if they so wish. That duty remains unchanged. For the reasons that I have set out, I hope that Members can support Government amendments 8 to 17.

I turn to the amendments in the name of the hon. Member for Pontypridd to Government amendments 15 and 16. As I have set out in relation to Government amendments 8 to 17, the Government recognise the intent behind the amendments—to apply the user empowerment tools in clause 14(2) to a greater range of content categories. As I have already set out, it is crucial that a tailored approach is taken, so that the user empowerment tools stay in balance with users’ rights to free expression online. I am sympathetic to the amendments, but they propose categories of content that risk being either unworkable for companies or duplicative of the approach already set out in amendment 15.

The category of

“content that is harmful to health”

sets an extremely broad scope. That risks requiring companies to apply the tools in clause 14(2) to an unfeasibly large volume of content. It is not a proportionate approach and would place an unreasonable burden on companies. It might also have concerning implications for freedom of expression, as it may capture important health advice. That risks, ultimately, undermining the intention behind the user empowerment tools in clause 14(2) by preventing users from accessing helpful content, and disincentivising users from using the features.

In addition, the category

“provides false information about climate change”

places a requirement on private companies to be the arbiters of truth on subjective and evolving issues. Those companies would be responsible for determining what types of legal content were considered false information, which poses a risk to freedom of expression and risks silencing genuine debate.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Did the Minister just say that climate change is subjective?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

No, not about whether climate change is happening, but we are talking about a wide range. “Provides false information”—how do the companies determine what is false? I am not talking about the binary question of whether climate change is happening, but climate change is a wide-ranging debate. “Provides false information” means that someone has to determine what is false and what is not. Basically, the amendment outsources that to the social media platforms. That is not appropriate.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

Would that not also apply to vaccine efficacy? If we are talking about everything being up for debate and nothing being a hard fact, we are entering slightly strange worlds where we undo a huge amount of progress, in particular on health.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The amendment does not talk about vaccine efficacy; it talks about content that is harmful to health. That is a wide-ranging thing.

None Portrait The Chair
- Hansard -

Order. I am getting increasingly confused. The Minister appears to be answering a debate on an amendment that has not yet been moved. It might be helpful to the Committee, for good debate, if the Minister were to come back with his arguments against the amendment not yet moved by the Opposition spokesperson, the hon. Member for Pontypridd, once she has actually moved it. We can then hear her reasons for it and he can reply.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

In that case, having moved my amendment, I close my remarks.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Dame Angela. With your permission, I will take this opportunity to make some broad reflections on the Government’s approach to the new so-called triple-shield protection that we have heard so much about, before coming on to the amendment tabled in my name in the group.

Broadly, Labour is disappointed that the system-level approach to content that is harmful to adults is being stripped from the Bill and replaced with a duty that puts the onus on the user to keep themselves safe. As the Antisemitism Policy Trust among others has argued, the two should be able to work in tandem. The clause allows a user to manage what harmful material they see by requiring the largest or most risky service providers to provide tools to allow a person in effect to reduce their likelihood of encountering, or to alert them to, certain types of material. We have concerns about the overall approach of the Government, but Labour believes that important additions can be made to the list of content where user-empowerment tools must be in place, hence our amendment (a) to Government amendment 15.

In July, in a little-noticed written ministerial statement, the Government produced a prototype list of content that would be harmful to adults. The list included priority content that category 1 services need to address in their terms and conditions; online abuse and harassment—mere disagreement with another’s point of view would not reach the threshold for harmful content, and so would not be covered; circulation of real or manufactured intimate images without the subject’s consent; content promoting self-harm; content promoting eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.

We have concerns about whether listing those harms in the Bill is the most effective mechanism, mostly because we feel that the list should be more flexible and able to change according to the issues of the day, but it is clear that the Government will continue to pursue this avenue despite some very worrying gaps. With that in mind, will the Minister clarify what exactly underpins that list if there have been no risk assessments? What was the basis for drawing up that specific list? Surely the Government should be implored to publish the research that determined the list, at the very least.

I recognise that the false communications offence has remained in the Bill, but the list in Government amendment 15 is not exhaustive. Without the additions outlined in our amendment (a) to amendment 15, the list will do little to tackle some of the most pressing harms of our time, some of which we have already heard about today.

I am pleased that the list from the written ministerial statement has more or less been reproduced in amendment 15, under subsection (2), but there is a key and unexplained omission that our amendment (a) to it seeks to correct: the absence of the last point, on harmful health content. Amendment (a) seeks to reinsert such important content into the Bill directly. It seems implausible that the Government failed to consider the dangerous harm that health misinformation can have online, especially given that back in July they seemed to have a grasp of its importance by including it in the original list.

We all know that health-related misinformation and disinformation can significantly undermine public health, as we have heard. We only have to cast our minds back to the height of the coronavirus pandemic to remind ourselves of how dangerous the online space was, with anti-vax scepticism being rife. Many groups were impacted, including pregnant women, who received mixed messages about the safety of covid vaccination, causing widespread confusion, fear and inaction. By tabling amendment (a) to amendment 15, we wanted to understand why the Government have dropped that from the list and on what exact grounds.

In addition to harmful health content, our amendment (a) to amendment 15 would also add to the list content that incites hateful extremism and content that provides false information about climate change, as we have heard. In its early written evidence, Carnegie outlined how serious a threat climate change disinformation is to the UK. Malicious actors spreading false information on social media could undermine collective action to combat the threats. At present, the Online Safety Bill is not designed to tackle those threats head on.

We all recognise that social media is an important source of news and information for many people, and evidence is emerging of its role in climate change disinformation. The Centre for Countering Digital Hate published a report in 2021 called “The Toxic Ten: How ten fringe publishers fuel 69% of digital climate change denial”, which explores the issue further. Further analysis of activity on Facebook around COP26 undertaken by the Institute for Strategic Dialogue demonstrates the scale of the challenge in dealing with climate change misinformation and disinformation. The research compared the levels of engagement generated by reliable, scientific organisations and climate-sceptic actors, and found that posts from the latter frequently received more traction and reach than the former, which is shocking. For example, in the fortnight in which COP26 took place, sceptic content garnered 12 times the level of engagement that authoritative sources did on the platform, and 60% of the sceptic posts analysed could be classified as actively and explicitly attacking efforts to curb climate change, which just goes to show the importance of ensuring that climate change disinformation is also included in the list in Government amendment 15.

Our two amendments—amendment (a) to amendment 15, and amendment (a) to amendment 16—seek to ensure that the long-standing omission from the Bill of hateful extremism is put right here as a priority. There is increasing concern about extremism leading to violence and death that does not meet the definition of terrorism. The internet and user-to-user services play a central role in the radicalisation process, yet the Online Safety Bill does not cover extremism.

Colleagues may be aware that Sara Khan, the former lead commissioner for countering extremism, provided a definition of extremism for the Government in February 2021, but there has been no response. The issue has been raised repeatedly by Members across the House, including by my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), following the tragic murders carried out by a radicalised incel in his constituency.

Amendment (a) to amendment 16 seeks to bring a formal definition of hateful extremism into the Bill and supports amendment (a) to amendment 15. The definition, as proposed by Sara Khan, who was appointed as Britain’s first countering extremism commissioner in 2018, is an important first step in addressing the gaps that social media platforms and providers have left open for harm and radicalisation.

Social media platforms have often been ineffective in removing other hateful extremist content. In November 2020, The Guardian reported that research from the Centre for Countering Digital Hate had uncovered how extremist merchandise had been sold on Facebook and Instagram to help fund neo-Nazi groups. That is just one of a huge number of instances, and it goes some way to suggest that a repeatedly inconsistent and ineffective approach to regulating extremist content is the one favoured by some social media platforms.

I hope that the Minister will seriously consider the amendments and will see the merits in expanding the list in Government amendment 15 to include these additional important harms.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you for chairing the meeting this afternoon, Dame Angela. I agree wholeheartedly with the amendments tabled by the Labour Front-Bench team. It is important that we talk about climate change denial and what we can do to ensure people are not exposed to that harmful conspiracy theory through content. It is also important that we do what we can to ensure that pregnant women, for example, are not told not to take the covid vaccine or that parents are not told not to vaccinate their children against measles, mumps and rubella. We need to do what we can to ensure measures are in place.

I appreciate the list in Government amendment 15, but I have real issues with this idea of a toggle system—of being able to switch off this stuff. Why do the Government think people should have to switch off the promotion of suicide content or content that promotes eating disorders? Why is it acceptable that people should have to make an active choice to switch that content off in order to not see it? People have to make an active choice to tick a box that says, “No, I don’t want to see content that is abusing me because of my religion,” or “No, I don’t want to see content that is abusing me because of my membership of the LGBT community.” We do not want people to have to look through the abuse they are receiving in order to press the right buttons to switch it off. As the hon. Member for Don Valley said, people should be allowed to say what they want online, but the reality is that the extremist content that we have seen published online is radicalising people and bringing them to the point that they are taking physical action against people in the real, offline world as well as taking action online.

14:45
There were many issues with the Bill before, but it was significantly better than it will be at the end of this Committee. I wholeheartedly support the Opposition amendments. They are particularly clever, in that they bring in that additional content and include that definition of extremism, and they would make a significant and positive difference to the Bill.
On clause 14 more generally, the user empowerment tools are important. It is important that we have user empowerment and that that is mandated. I agree that people should be able to fix their online lives in order to stay away from some of the stuff that they may not want to see. I am disappointed that gambling is not included in the list. It could have been included so that people have the opportunity to avoid it. In real life, if someone has an issue with gambling, they can go to their local betting shop and say, “I have a problem with gambling. Do not allow me to bet anymore.” The betting shop has to say, “Okay, we will not allow you to bet anymore.” That is how it works in real life, and not having that in the Bill, as I said at the previous Committee stage, is a concern, because we do not have parity between the online and offline worlds.
As a result of the Bill, people will be able to stop seeing content on YouTube, for example, promoting eating disorders, but they might not be able to stop seeing content promoting online poker sites, when that might be causing a significant issue for their health, so not including that is bit of an oversight. As I say, user empowerment is important, but the Government have not implemented it in nearly as good a way as they should have done, and the Opposition amendments would make the Government amendments better.
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I rise briefly to say that the introduction of the shields is a significant additional safety measure in the Bill and shows that the Government have thought about how to improve certain safety features as the Bill has progressed.

In the previous version of the Bill, as we have discussed at length, there were the priority legal offences that companies had to proactively identify and mitigate, and there were the measures on transparency and accountability on the terms of service. However, if pieces of content fell below the threshold for the priority legal offences or were not covered, or if they were not addressed in the terms of service, the previous version of the Bill never required the companies to act in any particular way. Reports might be done by Ofcom raising concerns, but there was no requirement for further action to be taken if the content was not a breach of platform policies or the priority safety duties.

The additional measure before us says that there may be content where there is no legal basis for removal, but users nevertheless have the right to have that content blocked. Many platforms offer ad tools already—they are not perfect, but people can opt in to say that they do not want to see ads for particular types of content—but there was nothing for the types of content covered by the Online Safety Bill, where someone could say, “I want to make sure I protect myself from seeing this at all,” and then, for the more serious content, “I expect the platforms to take action to mitigate it.” So this measure is an important additional level of protection for adult users, which allows them to give themselves the certainty that they will not see certain types of content and puts an important, additional duty on the companies themselves.

Briefly, on the point about gambling, the hon. Member for Aberdeen North is quite right to say that someone can self-exclude from gambling at the betting shop, but the advertising code already requires that companies do not target people who have self-excluded with advertising messages. As the Government complete their online advertising review, which is a separate piece of work, it is important that that is effectively enforced on big platforms, such as Facebook and Google, to ensure that they do not allow companies to advertise to vulnerable users in breach of the code. However, that can be done outside the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

My concern is not just about advertising content or stuff that is specifically considered as an advert. If someone put up a TikTok video about how to cheat an online poker system, that would not be classed as an advert and therefore would not be caught. People would still be able to see it, and could not opt out.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I totally appreciate the point that the hon. Lady makes, which is a different one. For gambling, the inducement to act straightaway often comes in the form of advertising. It usually comes in the form of free bets and immediate inducements to act. People who have self-excluded should not be targeted in that way. We need to ensure that that is rigorously enforced on online platforms too.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Dame Angela. It is lovely to be back in a Public Bill Committee with many familiar faces—and a few new ones, including the Minister. However, after devoting many weeks earlier this year to the previous Committee, I must admit that it is with some frustration that we are back here with the Government intent on further weakening their Bill.

Throughout the passage of the Bill, I have raised a number of specific concerns, from democratic and journalistic exemptions, to age verification, recognised news publishers, advocacy bodies and media literacy. On clause 14, while I support the principles of Government amendments 15 and 16, I draw the Minister’s attention to the importance of amendment (a) to amendment 15 and amendment (a) to amendment 16. He has already said that he is sympathetic to those amendments. Let me try to convince him to turn that sympathy into action.

I will focus primarily on an issue that is extremely important to me and to many others: extremism and radicalisation. However, while I will focus on the dangers of extremism and radicalisation, be it right-wing, Islamist, incel or other, the dangers that I am about to set out—the chain of events that leads to considerable harm online—are the same for self-harm content, eating disorder content, health disinformation, climate change disinformation or any dangerous, hateful material directed at people based on their sex, sexual orientation, ethnicity, religion or other characteristics.

Such content is not just deeply offensive and often wholly inaccurate; it is dangerous and vile and serves only to spread harm, misinformation and conspiracy. To be clear, such content is not about a social media user stating how upset and angry they are about the football result, or somebody disagreeing legitimately and passionately about a political issue. It is not the normal, everyday social media content that most people see on their feeds.

This is content that is specifically, carefully and callously designed to sit just below the criminal threshold, yet that can still encourage violence, self-harm or worse. It is content used by extremists of all types that lures vulnerable people in, uses social media likes and comments to create the illusion of legitimacy and popularity, and then directly targets those most likely to be susceptible, encouraging them either to commit harm or to move on to smaller but high-harm platforms that may fall out of the scope of the Bill. This is not free speech; it is content that can act as a dangerous gateway to radicalisation and extremism. The Government know how dangerous it is because their own report from His Majesty’s Prison and Probation Service last year found:

“The Internet appears to be playing an increasingly prominent role in radicalisation processes of those convicted of extremist offences in England and Wales.”

Hon. Members will understand my deep and personal interest in this matter. Since the murder of my sister, a Member of this House, six and a half years ago by a far-right extremist, I have worked hard to bring communities and people together in the face of hatred. Some of that work has included meeting former extremists and discussing how they were radicalised. Those conversations were never easy, but what became very clear to me was that such people are not born extremists. Their radicalisation starts somewhere, and it is often somewhere that appears to be completely innocent, such as a Facebook group about issues or problems in their community, a Twitter discussion about current affairs or the state of the country, or even a page for supporters of their football team.

One day, a comment is posted that is not illegal and is not hate speech, but that references a conspiracy or a common trope. It is an ideological remark placed there to test the water. The conversation moves on and escalates. More disturbing or even violent comments start to be made. They might be accompanied by images or videos, leading those involved down a more sinister path. Nothing yet is illegal, but clearly—I hope we would all agree—it is unacceptable.

The number of contributors reduces, but a few remain. No warnings are presented, no flags are raised and it appears like normal social media content. However, the person reading it might be lonely or vulnerable, and now feels that they have found people to listen to them. They might be depressed or unhappy and looking to blame their situation on something or someone. They might feel that nobody understands them, but these people seem to.

The discussion is then taken to a more private place, to the smaller but more harmful platforms that may fall outside the scope of the Bill, but that will now become the go-to place for spreading extremism, misinformation and other harmful content. The radicalisation continues there—harder to track, harder to monitor and harder to stop. Let us remember, however, that all of that started with those legal but harmful comments being witnessed. They were clearly unacceptable, but mainstream social media give them legitimacy. The Online Safety Bill will do nothing to stop that.

Unfortunately, that chain of events occurs far too often. It is a story told many times, about how somebody vulnerable is lured in by those wishing to spread their hatred. It is hosted by major social media platforms. Hon. Members may remember the case of John, a teenager radicalised online and subsequently sentenced. His story was covered by The Guardian last year. John was feeling a sense of hopelessness, which left him susceptible to the messaging of the far right. Aged 15, he felt “written off”: he was in the bottom set at school, with zero exam expectations, and feeling that his life opportunities would be dismal. The far right, however, promised him a future. John became increasingly radicalised by an online barrage of far-right disinformation. He said:

“I was relying on the far right for a job. They were saying that when they got power they would be giving jobs to people like me”.

John now says:

“Now I know the posts were all fake, but the 15-year-old me didn’t bother to fact-check.”

For some people in the room, that might seem like a totally different world. Thankfully, for most of us, it is. However, if Members take the time to see some of that stuff online, it is extremely disturbing and alarming. It is a world that we do not understand, but we have to be aware that it exists. The truth, as we can see, is that such groups use popular online platforms to lure in young people and give them a sense of community. One white nationalist group actively targets younger recruits and recently started Call of Duty warcraft gaming tournaments for its supporters. Let us be clear: John was 15, but he could easily have been 18, 19 or indeed significantly older.

John was radicalised by the far right, but we know that similar methods are used by Islamist extremists. A 2020 report from New York University’s Centre for Global Affairs stated:

“The age of social media has allowed ISIS to connect with a large-scale global audience that it would not be able to reach without it...Through strategic targeting, ISIS selects those who are most vulnerable and susceptible to radicalization”.

That includes those who are

“searching for meaning or purpose in their life, feeling anger and…alienated from society”.

The ages that are most vulnerable are 15 to 25.

Social media platforms allow ISIS to present its propaganda as mainstream news at little to no cost. Preventing that harm and breaking those chains of radicalisation is, however, possible, and the Bill could go much further to put the responsibility not on the user, but on the platforms. I believe that those platforms need unique regulation, because social media interaction is fundamentally different from real-life social interaction.

Social media presents content to us as if it is the only voice and viewpoint. On social media, people are far more likely to say things that they never would in person. On social media, those views spread like wildfire in a way that they would not in real life. On social media, algorithms find such content and pump it towards us, in a way that can become overwhelming and that can provide validity and reassurance where doubt might otherwise set in.

Allowing that content to remain online without warnings, or allowing it to be visible to all users unless they go searching through their settings to turn it off—which is wholly unrealistic—is a dereliction of duty and a missed opportunity to clean up the platforms and break the chains of radicalisation. As I set out, the chain of events is not unique to one form of radicalisation or hateful content. The same online algorithms that present extremist content to users also promote negative body image, eating disorders, and self-harm and suicide content.

I hope the Committee realises why I am so impassioned about “legal but harmful” clauses, and why I am particularly upset that a few Conservative Members appear to believe that such content should remain unchecked online because of free speech, with full knowledge that it is exactly that content that serves as the gateway for people to self-harm and to be radicalised. That is not free speech.

15:00
There is broad consensus across the Committee that the Bill as a whole must do greater good than harm and lead the world in effectively regulating the internet for the benefit and safety of its users. However, there remain a number of considerable gaps that will allow harm to continue online. One small step that the Government could commit to today—I urge the Minister to do so—is to accept at least the Opposition amendment (a) to amendment 15 and amendment (a) to amendment 16, which would define and explicitly categorise content that incites hateful extremism as harmful content in the Bill, ensuring that platforms have a responsibility to find, label and hopefully hide that content from users by default. The Government can be assured of the Opposition’s support to strengthen the Bill further, including in the “legal but harmful” area, in the face of a very small number of Conservative Members who are resisting on the basis of ideological purity rather than of preventing real-life harm.
Allowing such content freely on platforms and doing nothing to ensure that smaller but high-harm platforms are brought into the remit of this Bill is a backward step. We should be strengthening, not weakening, the Bill in this Committee. That is why I oppose the Government’s position and wholeheartedly support the Opposition’s amendments to clause 14.
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I have talked a little already about these amendments, so let me sum up where I think we are. I talked about harmful health content and why it is not included. The Online Safety Bill will force social media companies to tackle health misinformation and disinformation online, where it constitutes a criminal offence. It includes the communications offence, which would capture posts encouraging dangerous hoax cures, where the sender knows the information to be false and intends to cause harm, such as encouraging drinking bleach to cure cancer, which we heard about a little earlier.

The legislation is only one part of the wider Government approach to this issue. It includes the work of the counter-disinformation unit, which brings together cross-Government monitoring and analysis capabilities and engages with platforms directly to ensure that appropriate action is taken, in addition to the Government’s work to build users’ resilience to misinformation through media literacy.

Including harmful health content as a category risks requiring companies to apply the adult user empowerment tools to an unfeasibly large volume of content—way beyond just the vaccine efficacy that was mentioned. That has implications both for regulatory burden and for freedom of expression, as it may capture important health advice. Similarly, on climate change, the Online Safety Bill itself will introduce new transparency, accountability and free speech duties on category 1 services. If a platform says that certain types of content are not allowed, it will be held to account for their removal.

We recognised that there was a heightened risk of disinformation surrounding the COP26 summit. The counter-disinformation unit led by the Department for Digital, Culture, Media and Sport brought together monitoring and analysis capabilities across Government to understand disinformation that posed a risk to public safety or to delegates or that represented attempts at interference from malign actors. We are clear that free debate is essential to a democracy and that the counter-disinformation unit should not infringe upon political debate. Government already work closely with the major social media platforms to encourage them to collaborate at speed to remove disinformation as per their terms of service.

Amendment (a) to amendment 15 and amendment (a) to amendment 16 would create that new category of content that incites hateful extremism. That is closely aligned with the approach that the Government are already taking with amendment 15, specifically subsections (8C) and (8D), which create a category of content that is abusive or incites hate on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. Those conditions would likely capture the majority of the kinds of content that the hon. Members are seeking to capture through their hateful extremism category. For example, it would capture antisemitic abuse and conspiracy theories, racist abuse and promotion of racist ideologies.

Furthermore, where companies’ terms of service say they prohibit or apply restrictions to the kind of content listed in the Opposition amendments, companies must ensure that those terms are consistently enforced. It comes back so much to the enforcement. They must also ensure that the terms of service are easily understandable.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

If this is about companies enforcing what is in their terms of service for the use of their platforms, could it not create a perverse incentive for them to have very little in their terms of service? If they will be punished for not enforcing their terms of service, surely they will want them to be as lax as possible in order to limit their legal liability for enforcing them. Does the Minister follow?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I follow, but I do not agree. The categories of content in proposed new subsections (8C) and (8D), introduced by amendment 15, underpin a lot of this. I answered the question in an earlier debate when talking about the commercial impetus. I cannot imagine many mainstream advertisers wanting to advertise with a company that removed from its terms of service the exclusion of racial abuse, misogyny and general abuse. We have seen that commercial impetus really kicking in with certain platforms. For those reasons, I am unable to accept the amendments to the amendments, and I hope that the Opposition will not press them to a vote.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the opportunity to push the Minister further. I asked him whether he could outline where the list in amendment 15 came from. Will he publish the research that led him to compile that specific list of priority harms?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The definitions that we have taken are ones that strike the right balance and have a comparatively high threshold, so that they do not capture challenging and robust discussions on controversial topics.

Amendment 8 agreed to.

Amendments made: 9, in clause 14, page 14, line 5, after “to” insert “effectively”.

This amendment strengthens the duty in this clause by requiring that the systems or processes used to deal with the kinds of content described in subsections (8B) to (8D) (see Amendment 15) should be designed to effectively increase users’ control over such content.

Amendment 10, in clause 14, page 14, line 6, leave out from “encountering” to “the” in line 7 and insert

“content to which subsection (2) applies present on”.

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Amendment 11, in clause 14, page 14, line 9, leave out from “to” to end of line 10 and insert

“content present on the service that is a particular kind of content to which subsection (2) applies”.—(Paul Scully.)

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 102, in clause 14, page 14, line 12, leave out “made available to” and insert “in operation for”.

This amendment, and Amendment 103, relate to the tools proposed in Clause 14 which will be available for individuals to use on platforms to protect themselves from harm. This amendment specifically forces platforms to have these safety tools “on” by default.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 103, in clause 14, page 14, line 15, leave out “take advantage of” and insert “disapply”.

This amendment relates to Amendment 102.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The amendments relate to the tools proposed in clause 14, which as we know will be available for individuals to use on platforms to protect themselves from harm. As the Minister knows, Labour fundamentally disagrees with that approach, which will place the onus on the user, rather than the platform, to protect themselves from harmful content. It is widely recognised that the purpose of this week’s Committee proceedings is to allow the Government to remove the so-called “legal but harmful” clauses and replace them with the user empowerment tool option. Let us be clear that that goes against the very essence of the Bill, which was created to address the particular way in which social media allows content to be shared, spread and broadcast around the world at speed.

This approach could very well see a two-tier internet system develop, which leaves those of us who choose to utilise the user empowerment tools ignorant of harmful content perpetuated elsewhere for others to see. The tools proposed in clause 14, however, reflect something that we all know to be true: that there is some very harmful content out there for us all to see online. We can all agree that individuals should therefore have access to the appropriate tools to protect themselves. It is also right that providers will be required to ensure that adults have greater choice and control over the content that they see and engage with, but let us be clear that instead of focusing on defining exactly what content is or is not harmful, the Bill should focus on the processes by which harmful content is amplified on social media.

However, we are where we are, and Labour believes that it is better to have the Bill over the line, with a regulator in place with some powers, than simply to do nothing at all. With that in mind, we have tabled the amendment specifically to force platforms to have safety tools on by default. We believe that the user empowerment tools should be on by default and that they must be appropriately visible and easy to use. We must recognise that for people at a point of crisis—if a person is suffering with depressive or suicidal thoughts, or with significant personal isolation, for example—the tools may not be at the forefront of their minds if their mental state is severely impacted.

On a similar point, we must not patronise the public. Labour sees no rational argument why the Government would not support the amendment. We should all assume that if a rational adult is able to easily find and use these user empowerment tools, then they will be easily able to turn them off if they choose to do so.

The Minister knows that I am not in the habit of guessing but, judging from our private conversations, his rebuttal to my points may be because he believes it is not the Government’s role to impose rules directly on platforms, particularly when they impact their functionality. However, for Labour, the existence of harm and the importance of protecting people online tips the balance in favour of turning these user empowerment tools on by default. We see no negative reason why that should not be the case, and we now have a simple amendment that could have a significantly positive impact.

I hope the Minister and colleagues will reflect strongly on these amendments, as we believe they are a reasonable and simple ask of platforms to do the right thing and have the user empowerment tools on by default.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Once again, this is a very smart amendment that I wish I had thought of myself and I am happy to support. The case made by those campaigning for freedom of speech at any cost is about people being able to say what they want to say, no matter how harmful that may be. It is not about requiring me, or anyone else, to read those things—the harmful bile, the holocaust denial or the promotion of suicide that is spouted. It is not freedom of speech to require someone else to see and read such content, so I cannot see any potential argument that the Government could come up with against these amendments.

The amendments have nothing to do with freedom of speech or with limiting people’s ability to say whatever they want to say or to promote whatever untruths they want to promote. However, they are about making sure that people are protected and that they are starting from a position of having to opt in if they want to see harmful content. If I want to see content about holocaust denial—I do not want to see that, but if I did—I should have to clearly tick a button that says, “Yes, I am pretty extreme in my views and I want to see things that are abusing people. I want to see that sort of content.” I should have to opt in to be able to see that.

There are a significant number of newspapers out there. I will not even pick up a lot of them because there is so much stuff in them with which I disagree, but I can choose not to pick them up. I do not have that newspaper served to me against my will because I have the opportunity to choose to opt out from buying it. I do not have to go into the supermarket and say, “No, please do not give me that newspaper!” I just do not pick it up. If we put the Government’s proposal on its head and do what has been suggested in the Opposition amendments, everyone would be in a much better position.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I note that many providers of 4G internet, including the one I have on my own phone, already block adult content. Essentially, if people want to look at pornography or other forms of content, they have to proactively opt in to be allowed to see it. Would it not make sense to make something as straightforward as that, which already exists, into the model that we want on the internet more widely, as opposed to leaving it to EE and others to do?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. Another point that has been made is that this is not creating undue burden; the Government are already creating the burden for companies—I am not saying that it is a bad burden, but the Government are already creating it. We just want people to have the opportunity to opt into it, or out of it. That is the position that we are in.

15:19
My hon. Friend the Member for Coatbridge, Chryston and Bellshill and I were having a conversation earlier about how the terms of service might say that holocaust denial was banned. Potentially, however, the terms of service could say, “You may see content that is about holocaust denial on our platform, because we don’t ban it.” They could explicitly have to warn people about the presence of that content.
The Opposition are suggesting flipping the issue on its head. As the hon. Member for Batley and Spen said, there is no way that people go on to Facebook and imagine that they will see extremist content. Most people do not imagine that they will be led down this rabbit hole of increasing extremism on Facebook, because Facebook is where we go to speak to our friends, to see our auntie’s holiday photos or to communicate with people.
The Minister was making slightly light of the fact that there are other ways to communicate—yes, absolutely, but two of the communities that I spend a lot of time in and where I get an awful lot of support and friendship exist only on Facebook. That is the only place where I can have those relationships with friends who live all around the world, because that is where the conversation is taking place. I will not choose to opt out of that, because I would be cut off from two of my support networks. I do not think it is right that we should be told, “If you don’t want to see extremist content, just don’t be a member of Facebook”—or whatever platform it happens to be.
That is not the way to go; we should be writing in the protections. We should be starting from the point of view that no one wants to see content on the promotion of suicide; if they do, they can tick a box to see it. We should start from that point of view: allowing people to opt in if they want to see free speech in an untrammelled way on whatever platform it is.
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I will speak briefly in favour of amendments 102 and 103. As I mentioned a few moments ago, legal but harmful content can act as the gateway to dangerous radicalisation and extremism. Such content, hosted by mainstream social media platforms, should not be permitted unchecked online. I appreciate that for children the content will be banned, but I strongly believe that such content should be hidden by default from all adult users, as the amendments would ensure.

The chain of events that leads to radicalisation, as I spelt out, relies on groups and individuals reaching people unaware that they are being radicalised. The content is posted in otherwise innocent Facebook groups, forums or Twitter threads. Adding a toggle, hidden somewhere in users’ settings, which few people know about or use, will do nothing to stop that. It will do nothing to stop the harmful content from reaching vulnerable and susceptible users.

We, as legislators, have an obligation to prevent at root that harmful content reaching and drawing in those vulnerable and susceptible to the misinformation and conspiracy spouted by vile groups and individuals wishing to spread their harm. The only way that we can make meaningful progress is by putting the responsibility squarely on platforms, to ensure that by default users do not come across the content in the first place.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

In the previous debate, I talked about amendment 15, which brought in a lot of protections against content that encourages and promotes, or provides instruction for, self-harm, suicide or eating disorders, and against content that is abusive or incites hate on the basis of race, religion, disability, sex, gender reassignment or sexual orientation. We have also placed a duty on the largest platforms to offer adults the option to filter out unverified users if they so wish. That is a targeted approach that reflects areas where vulnerable users in particular could benefit from having greater choice and control. I come back to the fact that that is the third shield and an extra safety net. A lot of the extremes we have heard about, which have been used as debating points, as important as they are, should very much be wrapped up by the first two shields.

We have a targeted approach, but it is based on choice. It is right that adult users have a choice about what they see online and who they interact with. It is right that this choice lies in the hands of those adults. The Government mandating that these tools be on by default goes against the central aim of users being empowered to choose for themselves whether they want to reduce their engagement with some kinds of legal content.

We have been clear right from the beginning that it is not the Government’s role to say what legal content adults should or should not view online or to incentivise the removal of legal content. That is why we removed the adult legal but harmful duties in the first place. I believe we are striking the right balance between empowering adult users online and protecting freedom of expression. For that reason, I am not able to accept the amendments from the hon. Member for Pontypridd.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is disappointing that the Government are refusing to back these amendments to place the toggle as “on” by default. It is something that we see as a safety net, as the Minister described. Why would someone have to choose to have the safety net there? If someone does not want it, they can easily take it away. The choice should be that way around, because it is there to protect all of us.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I am sure that, like me, the shadow Minister will be baffled that the Government are against our proposals to have to opt out. Surely this is something that is of key concern to the Government, given that the former MP for Tiverton and Honiton might still be an MP if users had to opt in to watching pornography, rather than being accidentally shown it when innocently searching for tractors?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My hon. Friend makes a very good point. It goes to show the nature of this as a protection for all of us, even MPs, from accessing content that could be harmful to our health or, indeed, profession. Given the nature of the amendment, we feel that this is a safety net that should be available to all. It should be on by default.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I should say that in the spirit of choice, companies can also choose to default it to be switched off in the first place as well.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister makes the point that companies can choose to have it off by default, but we would not need this Bill in the first place if companies did the right thing. Let us be clear: we would not have had to be here debating this for the past five years —for me it has been 12 months—if companies were going to do the right thing and protect people from harmful content online. On that basis, I will push the amendments to a vote.

Question put, That the amendment be made.

Division 3

Ayes: 6

Noes: 8

Amendment made: 12, in clause 14, page 14, line 12, at end insert
“and are easy to access”.—(Paul Scully.)
This amendment requires providers to ensure that features for users to increase their control over content described in subsections (8B) to (8D) (see Amendment 15) are easy to access.
Amendment proposed: 103, in clause 14, page 14, line 15, leave out “take advantage of” and insert “disapply”.—(Alex Davies-Jones.)
This amendment relates to Amendment 102.
Question put, That the amendment be made.

Division 4

Ayes: 6

Noes: 8

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move amendment 101, in clause 14, page 14, line 17, at end insert—

“(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.”

This amendment creates a duty that user empowerment functions must be accessible and understandable to adult users with learning disabilities.

This issue was originally brought to my attention by Mencap. It is incredibly important, and it has potentially not been covered adequately by either our previous discussions of the Bill or the Bill itself. The amendment is specifically about ensuring that available features are accessible to adult users with learning disabilities. An awful lot of people use the internet, and people should not be excluded from using it and having access to safety features because they have a learning disability. That should not be the case, for example, when someone is trying to find how to report something on a social media platform. I had an absolute nightmare trying to report a racist gif that was offered in the list of gifs that came up. There is no potential way to report that racist gif to Facebook because it does not take responsibility for it, and GIPHY does not take responsibility for it because it might not be a GIPHY gif.

It is difficult to find the ways to report some of this stuff and to find some of the privacy settings. Even when someone does find the privacy settings, on a significant number of these platforms they do not make much sense—they are not understandable. I am able to read fairly well, I would think, and I am able to speak in the House of Commons, but I still do not understand some of the stuff in the privacy features found on some social media sites. I cannot find how to toggle off things that I want to toggle off on the level of accessibility or privacy that I have, particularly on social media platforms; I will focus on those for the moment. The Bill will not achieve even its intended purpose if all people using these services cannot access or understand the safety features and user empowerment tools.

I am quite happy to talk about the difference between the real world and the online world. My online friends have no problem with me talking about the real world as if it is something different, because it is. In the real world, we have a situation where things such as cuckooing take place and people take advantage of vulnerable adults. Social services, the police and various organisations are on the lookout for that and try to do what they can to put protections in place. I am asking for more parity with the real world here. Let us ensure that we have the protections in place, and that people who are vulnerable and taken advantage of far too often have access to those tools in order to protect themselves. It is particularly reasonable.

Let us say that somebody with a learning disability particularly likes cats; the Committee may have worked out that I also particularly like cats. Let us say that they want to go on TikTok or YouTube and look at videos of cats. They have to sign up to watch videos of cats. They may not have the capacity or understanding to know that there might be extreme content on those sites. They may not be able to grasp that. It may never cross their minds that there could be extreme content on that site. When they are signing up to TikTok, they should not have to go and find the specific toggle to switch off eating disorder content. All they had thought about was that this is a cool place to look at videos of cats.

15:30
I am asking the Minister to make it really clear that these tools should be available and accessible to everybody, and that Ofcom will look at that availability and accessibility and listen to experts who say that there is a real issue with a certain website because the tools are not as accessible as they should be. Would the Minister be kind enough to make that incredibly clear, so that platforms are aware of the direction and the intention? Ofcom also needs to be aware that this is a priority and that these tools should be available to everyone in order to provide that level of accessibly, and in order that everybody can enjoy cat videos.
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am happy to do that. In the same way that we spoke this morning about children’s protection, I am very aware of the terms of service and what people are getting into by looking for cats or whatever they want to do.

The Bill requires providers to make all the usual enforcement and protection tools available to all adults, including those with learning disabilities. Clause 14(4) makes it explicitly clear that features offered by providers, in compliance with the duty for users to be given greater control over the content that they see, must be made available to all adult users. Clause 14(5) further outlines that providers must have clear and accessible terms of service about what tools are offered in their service and how users may take advantage of them. We have strengthened the accessibility of the user empowerment duties through Government amendment 12 as well, to make sure that user empowerment tools and features are easy for users to access.

In addition, clause 58(1) says that providers must offer all adult users the option to verify themselves so that vulnerable users, including those with learning disabilities, are not at a disadvantage as a result of the user empowerment duties. Clause 59(2) and (3) further stipulate that in producing the guidance for providers about the user verification duty, Ofcom must have particular regard to the desirability of making identity verification available to vulnerable adult users, and must consult with persons who represent the interests of vulnerable adult users. That is about getting the thoughts of experts and advocates into their processes to make sure that they can enforce what is going on.

In addition, Ofcom is subject to the public sector equality duty, so it will have to take into account the ways in which people with disabilities may be impacted when performing its duties, such as writing its codes of practice for the user empowerment duty. I hope the hon. Member will appreciate the fact that, in a holistic way, that covers the essence of exactly what she is trying to do in her amendment, so I do not believe her amendment is necessary.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

In view of the Minister’s statement, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendments made: 13, in clause 14, page 14, line 26, leave out paragraph (a) and insert—

“(a) the likelihood of adult users encountering content to which subsection (2) applies by means of the service, and”

This amendment is about factors relevant to the proportionality of measures to comply with the duty in subsection (2). The new wording replaces a reference to an adults’ risk assessment, as adults’ risk assessments are no longer required (see Amendment 6 which removes clause 12).

Amendment 14, in clause 14, page 14, line 29, leave out “a” and insert “the”.—(Paul Scully.)

This is a technical amendment consequential on Amendment 13.

Amendment (a) proposed to amendment 15: (a), at end insert—

“(8E) Content is within this subsection if it—

(a) incites hateful extremism,

(b) provides false information about climate change, or

(c) is harmful to health.”—(Alex Davies-Jones.)

Question put, That the amendment be made.

Division 5

Ayes: 6

Noes: 8

Amendments made: 15, in clause 14, page 14, line 29, at end insert—
“(8A) Subsection (2) applies to content that—
(a) is regulated user-generated content in relation to the service in question, and
(b) is within subsection (8B), (8C) or (8D).
(8B) Content is within this subsection if it encourages, promotes or provides instructions for—
(a) suicide or an act of deliberate self-injury, or
(b) an eating disorder or behaviours associated with an eating disorder.
(8C) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—
(a) race,
(b) religion,
(c) sex,
(d) sexual orientation,
(e) disability, or
(f) gender reassignment.
(8D) Content is within this subsection if it incites hatred against people—
(a) of a particular race, religion, sex or sexual orientation,
(b) who have a disability, or
(c) who have the characteristic of gender reassignment.”
This amendment describes the content relevant to the duty in subsection (2) of clause 14. The effect is (broadly) that providers must offer users tools to reduce their exposure to these kinds of content.
Amendment 16, in clause 14, page 14, line 30, leave out subsection (9) and insert—
“(9) In this section—
‘disability’ means any physical or mental impairment;
‘injury’ includes poisoning;
‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 58(1));
‘race’ includes colour, nationality, and ethnic or national origins.”
This amendment inserts definitions of terms now used in clause 14.
Amendment 17, in clause 14, page 14, line 33, at end insert
“, and
(b) references to religion include references to a lack of religion.
(11) For the purposes of this section, a person has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex, and the reference to gender reassignment in subsection (8C) is to be construed accordingly.” —(Paul Scully.)
This amendment clarifies the meaning of terms now used in clause 14.
Clause 14, as amended, ordered to stand part of the Bill.
Clause 18
Duty about content reporting
Amendment made: 18, in clause 18, page 19, line 15, leave out subsection (5).—(Paul Scully.)
This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41.)
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 19, in clause 18, page 19, line 32, leave out from “also” to second “section”.

This is a technical amendment relating to Amendment 20.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendments 20 and 21, 26 and 27, 30, 34 and 35, 67, 71, 46 and 47, 50, 53, 55 to 57, and 95.

Government new clause 3—Duty not to act against users except in accordance with terms of service.

Government new clause 4—Further duties about terms of service.

Government new clause 5—OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service).

Government new clause 6—Interpretation of this Chapter.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am seeking to impose new duties on category 1 services to ensure that they are held accountable to their terms of service and to protect free speech. Under the status quo, companies get to decide what we do and do not see online. They can arbitrarily ban users or remove their content without offering any form of due process and with very few avenues for users to achieve effective redress. On the other hand, companies’ terms of service are often poorly enforced, if at all.

I have mentioned before the horrendous abuse suffered by footballers around the 2020 Euro final, despite most platforms’ terms and conditions clearly not allowing that sort of content. There are countless similar instances, for example, relating to antisemitic abuse—as we have heard—and other forms of hate speech, that fall below the criminal threshold.

This group of amendments relates to a series of new duties that will fundamentally reset the relationship between platforms and their users. The duties will prevent services from arbitrarily removing content or suspending users without offering users proper avenues to appeal. At the same time, they will stop companies making empty promises to their users about their terms of service. The duties will ensure that where companies say they will remove content or ban a user, they actually do.

Government new clause 3 is focused on protecting free speech. It would require providers of category 1 services to remove or restrict access to content, or ban or suspend users, only where this is consistent with their terms of service. Ofcom will oversee companies’ systems and processes for discharging those duties, rather than supervising individual decisions.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am grateful for what the Minister has said, and glad that Ofcom will have a role in seeing that companies do not remove content that is not in breach of terms of service where there is no legal requirement to do so. In other areas of the Bill where these duties exist, risk assessments are to be conducted and codes of practice are in place. Will there similarly be risk assessments and codes of practice to ensure that companies comply with their freedom of speech obligations?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. As I say, it is really important that people understand right at the beginning, through risk assessments, what they are signing up for and what they can expect. To come back to the point of whether someone is an adult or a child, it is really important that parents lean in when it comes to children’s protections; that is a very important tool in the armoury.

New clause 4 will require providers of category 1 services to ensure that what their terms of service say about their content moderation policies is clear and accessible. Those terms have to be easy for users to understand, and should have sufficient detail, so that users know what to expect, in relation to moderation actions. Providers of category 1 services must apply their terms of service consistently, and they must have in place systems and processes that enable them to enforce their terms of service consistently.

These duties will give users the ability to report any content or account that they suspect does not meet a platform’s terms of service. They will also give users the ability to make complaints about platforms’ moderation actions, and raise concerns if their content is removed in error. Providers will be required to take appropriate action in response to complaints. That could include removing content that they prohibit, or reinstating content removed in error. These duties ensure that providers are made aware of issues to do with their services and require them to take action to resolve them, to keep users safe, and to uphold users’ rights to free speech.

The duties set out in new clauses 3 and 4 will not apply to illegal content, content that is harmful to children or consumer content. That is because illegal content and content that is harmful to children are covered by existing duties in the Bill, and consumer content is already regulated under consumer protection legislation. Companies will also be able to remove any content where they have a legal obligation to do so, or where the user is committing a criminal offence, even if that is not covered in their terms of service.

New clause 5 will require Ofcom to publish guidance to help providers of category 1 services to understand what they need to do to comply with their new duties. That could include guidance on how to make their terms of service clear and easy for users to understand, and how to operate an effective reporting and redress mechanism. The guidance will not prescribe what types of content companies should include in their terms of service, or how they should treat such content. That will be for companies to decide, based on their knowledge of their users, and their brand and commercial incentives, and subject to their other legal obligations.

New clause 6 clarifies terms used in new clauses 3 and 4. It also includes a definition of “Consumer content”, which is excluded from the main duties in new clauses 3 and 4. This covers content that is already regulated by the Competition and Markets Authority and other consumer protection bodies, such as content that breaches the Consumer Protection from Unfair Trading Regulations 2008. These definitions are needed to provide clarity to companies seeking to comply with the duties set out in new clauses 3 and 4.

The remaining amendments to other provisions in the Bill are consequential on the insertion of these new transparency, accountability and free speech duties. They insert references to the new duties in, for example, the provisions about content reporting, enforcement, transparency and reviewing compliance. That will ensure that the duties apply properly to the new measure.

Amendment 30 removes the duty on platforms to include clear and accessible provisions in their terms of service informing users that they have a right of action in court for breach of contract if a platform removes or restricts access to their content in violation of its terms of service. This is so that the duty can be moved to new clause 4, which focuses on ensuring that platforms comply with their terms of service. The replacement duty in new clause 4 will go further than the original duty, in that it will cover suspensions and bans of users as well as restrictions on content.

Amendments 46 and 47 impose a new duty on Ofcom to have regard to the need for it to be clear to providers of category 1 services what they must do to comply with their new duties. These amendments will also require Ofcom to have regard to the extent to which providers of category 1 services are demonstrating, in a transparent and accountable way, how they are complying with their new duties.

Lastly, amendment 95 temporarily exempts video-sharing platforms that are category 1 services from the new terms of service duties, as set out in new clauses 3 and 4, until the Secretary of State agrees that the Online Safety Bill is sufficiently implemented. This approach simultaneously maximises user protections by the temporary continuation of the VSP regime and minimises burdens for services and Ofcom. The changes are central to the Government’s intention to hold companies accountable for their promises. They will protect users in a way that is in line with companies’ terms of service. They are a critical part of the triple shield, which aims to protect adults online. It ensures that users are safe by requiring companies to remove illegal content, enforce their terms of service and provide users with tools to control their online experiences. Equally, these changes prevent arbitrary or random content removal, which helps to protect pluralistic and robust debate online. For those reasons, I hope that Members can support the amendments.

15:45
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

This is an extremely large grouping so, for the sake of the Committee, I will do my best to keep my comments focused and brief where possible. I begin by addressing Government new clauses 3 and 4 and the consequential amendments.

Government new clause 3 introduces new duties that aim to ensure that the largest or most risky online service providers design systems and processes that ensure they cannot take down or restrict content in a way that prevents a person from seeing it without further action by the user, or ban users, except in accordance with their own terms of service, or if the content breaks the law or contravenes the Online Safety Bill regime. This duty is referred to as the duty not to act against users except in accordance with terms of service. In reality, that will mean that the focus remains far too much on the banning, taking down and restriction of content, rather than our considering the systems and processes behind the platforms that perpetuate harm.

Labour has long held the view that the Government have gone down an unhelpful cul-de-sac on free speech. Instead of focusing on defining exactly which content is or is not harmful, the Bill should be focused on the processes by which harmful content is amplified on social media. We must recognise that a person posting a racist slur online that nobody notices, shares or reads is significantly less harmful than a post that can quickly go viral, and can within hours gain millions of views or shares. We have talked a lot in this place about Kanye West and the comments he has made on Twitter in the past few weeks. It is safe to say that a comment by Joe Bloggs in Hackney that glorifies Hitler does not have the same reach or produce the same harm as Kanye West saying exactly the same thing to his 30 million Twitter followers.

Our approach has the benefit of addressing the things that social media companies can control—for example, how content spreads—rather than the things they cannot control, such as what people say online. It reduces the risk to freedom of speech because it tackles how content is shared, rather than relying entirely on taking down harmful content. Government new clause 4 aims to improve the effectiveness of platforms’ terms of service in conjunction with the Government’s new triple shield, which the Committee has heard a lot about, but the reality is that they are ultimately seeking to place too much of the burden of protection on extremely flexible and changeable terms of service.

If a provider’s terms of service say that certain types of content are to be taken down or restricted, then providers must run systems and processes to ensure that that can happen. Moreover, people must be able to report breaches easily, through a complaints service that delivers appropriate action, including when the service receives complaints about the provider. This “effectiveness” duty is important but somewhat misguided.

The Government, having dropped some of the “harmful but legal” provisions, seem to expect that if large and risky services—the category 1 platforms—claim to be tackling such material, they must deliver on that promise to the customer and user. This reflects a widespread view that companies may pick and choose how to apply their terms of service, or implement them loosely and interchangeably, as we have heard. Those failings will lead to harm when people encounter things that they would not have thought would be there when they signed up. All the while, service providers that do not fall within category 1 need not enforce their terms of service, or may do so erratically or discriminatorily. That includes search engines, no matter how big.

This large bundle of amendments seems to do little to actually keep people safe online. I have already made my concerns about the Government’s so-called triple shield approach to internet safety clear, so I will not repeat myself. We fundamentally believe that the Government’s approach, which places too much of the onus on the user rather than the platform, is wrong. We therefore cannot support the approach that is taken in the amendments. That being said, the Minister can take some solace from knowing that we see the merits of Government new clause 5, which

“requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4”.

If this is the avenue that the Government insist on going down, it is absolutely vital that providers are advised by Ofcom on the processes they will be required to take to comply with these new duties.

Amendment 19 agreed to.

Amendment made: 20, in clause 18, page 19, line 33, at end insert

“, and

(b) section (Further duties about terms of service)(5)(a) (reporting of content that terms of service allow to be taken down or restricted).”—(Paul Scully.)

This amendment inserts a signpost to the new provision about content reporting inserted by NC4.

Clause 18, as amended, ordered to stand part of the Bill.

Clause 19

Duties about complaints procedures

Amendment made: 21, in clause 19, page 20, line 15, leave out “, (3) or (4)” and insert “or (3)”.—(Paul Scully.)

This amendment removes a reference to clause 20(4), as that provision is moved to NC4.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 22, in clause 19, page 20, line 27, leave out from “down” to “and” in line 28 and insert

“or access to it being restricted, or given a lower priority or otherwise becoming less likely to be encountered by other users,”.

NC2 states what is meant by restricting users’ access to content, and this amendment makes a change in line with that, to avoid any implication that downranking is a form of restriction on access to content.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendment 59.

Government new clause 2—Restricting users’ access to content.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

These amendments clarify the meaning of “restricting access to content” and “access to content” for the purposes of the Bill. Restricting access to content is an expression that is used in various provisions across the Bill, such as in new clause 2, under which providers of category 1 services will have a duty to ensure that they remove or restrict access to users’ content only where that is in accordance with their terms of service or another legal obligation. There are other such references in clauses 15, 16 and 17.

The amendments make it clear that the expression

“restricting users’ access to content”

covers cases where a provider prevents a user from accessing content without that user taking a prior step, or where content is temporarily hidden from a user. They also make it clear that this expression does not cover any restrictions that the provider puts in place to enable users to apply user empowerment tools to limit the content that they encounter, or cases where access to content is controlled by another user, rather than by the provider.

The amendments are largely technical, but they do cover things such as down-ranking. Amendment 22 is necessary because the previous wording of this provision wrongly suggested that down-ranking was covered by the expression “restricting access to content”. Down-ranking is the practice of giving content a lower priority on a user’s feed. The Government intend that users should be able to complain if they feel that their content has been inappropriately down-ranked as a result of the use of proactive technology. This amendment ensures consistency.

I hope that the amendments provide clarity as to the meaning of restricting access to content for those affected by the Bill, and assist providers with complying with their duties.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Again, I will keep my comments on clause 19 brief, as we broadly support the intentions behind the clause and the associated measures in the grouping. My hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) spoke at length about this important clause, which relates to the all-important complaints procedures available around social media platforms and companies, in the previous Bill Committee.

During the previous Committee, Labour tabled amendments that would have empowered more individuals to make a complaint about search content in the event of non-compliance. In addition, we wanted an external complaints option for individuals seeking redress. Sadly, all those amendments were voted down by the last Committee, but I must once again press the Minister on those points, particularly in the context of the new amendments that have been tabled.

Without redress for individual complaints, once internal mechanisms have been exhausted, victims of online abuse could be left with no further options. Consumer protections could be compromised and freedom of expression, with which the Government seem to be borderline obsessed, could be infringed for people who feel that their content has been unfairly removed.

Government new clause 2 deals with the meaning of references to

“restricting users’ access to content”,

in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14. We see amendments 22 and 59 as important components of new clause 2, and are therefore more than happy to support them. However, I reiterate to the Minister and place on the record once again the importance of introducing an online safety ombudsman, which we feel is crucial to new clause 2. The Joint Committee recommended the introduction of such an ombudsman, who would consider complaints when internal routes of redress had not resulted in resolution, had failed to address risk and had led to significant and demonstrable harm. As new clause 2 relates to restricting users’ access to content, we must also ensure that there is an appropriate channel for complaints if there is an issue that users wish to take up around restrictions in accessing content.

By now, the Minister will be well versed in my thoughts on the Government’s approach, and on the reliance on the user empowerment tool approach more broadly. It is fundamentally an error to pursue a regime that is so content-focused. Despite those points, we see the merits in Government amendments 22 and 59, and in new clause 2, so have not sought to table any further amendments at this stage.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am slightly confused, and would appreciate a little clarification from the Minister. I understand what new clause 2 means; if the hon. Member for Pontypridd says that she does not want to see content of a certain nature, and I put something of that nature online, I am not being unfairly discriminated against in any way because she has chosen to opt out of receiving that content. I am slightly confused about the downgrading bit.

I know that an awful lot of platforms use downgrading when there is content that they find problematic, or something that they feel is an issue. Rather than taking that content off the platform completely, they may just no longer put it in users’ feeds, for example; they may move it down the priority list, and that may be part of what they already do to keep people safe. I am not trying to criticise what the Government are doing, but I genuinely do not understand whether that downgrading would still be allowed, whether it would be an issue, and whether people could complain about their content being downgraded because the platform was a bit concerned about it, and needed to check it out and work out what was going on, or if it was taken off users’ feeds.

Some companies, if they think that videos have been uploaded by people who are too young to use the platform, or by a registered child user of the platform, will not serve that content to everybody’s feeds. I will not be able to see something in my TikTok feed that was published by a user who is 13, for example, because there are restrictions on how TikTok deals with and serves that content, in order to provide increased protection and the safety that they want on their services.

Will it still be acceptable for companies to have their own internal downgrading system, in order to keep people safe, when content does not necessarily meet an illegality bar or child safety duty bar? The Minister has not used the phrase “market forces”; I think he said “commercial imperative”, and he has talked a lot about that. Some companies and organisations use downgrading to improve the systems on their site and to improve the user experience on the platform. I would very much appreciate it if the Minister explained whether that will still be the case. If not, will we all have a worse online experience as a result?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will have a go at that, but I am happy to write to the hon. Lady if I do not respond as fully as she wants. Down-ranking content is a moderation action, as she says, but it is not always done just to restrict access to content; there are many reasons why people might want to do it. Through these changes, we are saying that the content is not actually being restricted; it can still be seen if it is searched for or otherwise encountered. That is consistent with the clarification.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

This is quite an important point. The hon. Member for Aberdeen North was talking about recommendation systems. If a platform chooses not to amplify content, that is presumably not covered. As long as the content is accessible, someone could search and find it. That does not inhibit a platform’s decision, for policy reasons or whatever, not to actively promote it.

15:09
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. There are plenty of reasons why platforms will rank users’ content, including down-ranking it. Providing personalised content recommendations will involve that process as well. It is not practical to specify that restricting access includes down-ranking; that is why we made that change.

Amendment 22 agreed to.

Amendments made: 23, in clause 19, page 21, line 7, leave out from “The” to “complaints” in line 10 and insert

“relevant kind of complaint for Category 1 services is”.

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 24, in clause 19, page 21, line 12, leave out sub-paragraph (i).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 25, in clause 19, page 21, line 18, leave out paragraphs (c) and (d).

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 26, in clause 19, page 21, line 33, leave out from “also” to second “section”.

This is a technical amendment relating to Amendment 27.

Amendment 27, in clause 19, page 21, line 34, at end insert

“, and

(b) section (Further duties about terms of service)(6) (complaints procedure relating to content that terms of service allow to be taken down or restricted).”—(Paul Scully.)

This amendment inserts a signpost to the new provision about complaints procedures inserted by NC4.

Clause 19, as amended, ordered to stand part of the Bill.

Clause 20

Duties about freedom of expression and privacy

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 28, in clause 20, page 21, line 42, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government amendments 29, 31, 36 to 38 and 40.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will be brief. The rights to freedom of expression and privacy are essential to our democracy. We have long been clear that the Bill must not interfere with those rights. The amendments will further strengthen protections for freedom of expression and privacy and ensure consistency in the Bill. They require regulated user-to-user and search services to have particular regard to freedom of expression and privacy when deciding on and implementing their safety measures and policies.

Amendments 28, 29 and 31 mean that service providers will need to thoroughly consider the impact that their safety and user empowerment measures have on users’ freedom of expression and privacy. That could mean, for example, providing detailed guidance and training for human reviewers about content that is particularly difficult to assess. Amendments 36 and 37 apply that to search services in relation to their safety duties. Ofcom can take enforcement action against services that fail to comply with those duties and will set out steps that platforms can take to safeguard freedom of expression and privacy in their codes of practice.

Those changes will not detract from platforms’ illegal content and child protection duties. Companies must tackle illegal content and ensure that children are protected on their services, but the amendments will protect against platforms taking an over-zealous approach to removing content or undermining users’ privacy when complying with their duties. Amendments 38 and 40 ensure that the rest of the Bill is consistent with those changes. The new duties will therefore ensure that companies give proper consideration to users’ rights when complying with them, and that that is reflected in Ofcom’s codes, providing greater clarity to companies.

Amendment 28 agreed to.

Amendments made: 29, in clause 20, page 22, line 2, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Amendment 30, in clause 20, page 22, line 6, leave out subsection (4).

This amendment removes clause 20(4), as that provision is moved to NC4.

Amendment 31, in clause 20, page 22, line 37, leave out paragraph (c) and insert—

“(c) section 14 (user empowerment),”.—(Paul Scully.)

The main effect of this amendment is that providers must consider freedom of expression and privacy issues when deciding on measures and policies to comply with clause 14 (user empowerment). The reference to clause 14 replaces the previous reference to clause 13 (adults’ safety duties), which is now removed (see Amendment 7).

Question proposed, That the clause, as amended, stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 30 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will speak broadly to clause 20, as it is an extremely important clause, before making remarks about the group of Government amendments we have just voted on.

Clause 20 is designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, as Labour has repeatedly argued, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulations in less substantive ways. That is a genuine fear here.

We all want to see a Bill in place that protects free speech, but that cannot come at the expense of safety online. The situation with regard to content that is harmful to adults has become even murkier with the Government’s attempts to water down the Bill and remove adult risk assessments entirely.

The Minister must acknowledge that there is a balance to be achieved. We all recognise that. The truth is—and this is something that his predecessor, or should I say his predecessor’s predecessor, touched on when we considered this clause in the previous Bill Committee—that at the moment platforms are extremely inconsistent in their approach to getting the balance right. Although Labour is broadly supportive of this clause and the group of amendments, we feel that now is an appropriate time to put on record our concerns over the important balance between safety, transparency and freedom of expression.

Labour has genuine concerns over the future of platforms’ commitment to retaining that balance, particularly if the behaviours following the recent takeover of Twitter by Elon Musk are anything to go by. Since Elon Musk took over ownership of the platform, he has repeatedly used Twitter polls, posted from his personal account, as metrics to determine public opinion on platform policy. The general amnesty policy and the reinstatement of Donald Trump both emerged from such polls.

According to former employees, those polls are not only inaccurate representations of the platform’s user base, but are actually

“designed to be spammed and gamed”.

The polls are magnets for bots and other inauthentic accounts. This approach and the reliance on polls have allowed Elon Musk to enact and dictate his platform’s policy on moderation and freedom of expression. Even if he genuinely trusts the results of these polls and is not gaming them, they neither accurately represent the user base nor reflect best practice for confronting disinformation and harm online.

Elon Musk uses the results to claim that “the people have spoken”, but they have not. Research from leading anti-hate organisation the Anti-Defamation League shows that far-right extremists and neo-Nazis encouraged supporters to actively re-join Twitter to vote in these polls. The impacts of platforming neo-Nazis on Twitter do not need to be stated. Such users are explicitly trying to promote violent and hateful agendas, and they were banned initially for that exact reason. The bottom line is that those people were banned in line with Twitter’s terms of service at the time, and they should not be re-platformed just because of the findings of one Twitter poll.

These issues are at the very heart of Labour’s concerns in relation to the Bill—that the duties around freedom of expression and privacy will be different for those at the top of the platforms. We support the clause and the group of amendments, but I hope the Minister will be able to address those concerns in his remarks.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I endorse the general approach set out by the hon. Member for Pontypridd. We do not want to define freedom of speech based on a personal poll carried out on one platform. That is exactly why we are enshrining it in this ground-breaking Bill.

We want to get the balance right. I have talked about the protections for children. We also want to protect adults and give them the power to understand the platforms they are on and the risks involved, while having regard for freedom of expression and privacy. That is a wider approach than one man’s Twitter feed. These clauses are important to ensure that the service providers interpret and implement their safety duties in a proportionate way that limits negative impact on users’ rights to freedom of expression. However, they also have to have regard to the wider definition of freedom of expression, while protecting users, which the rest of the Bill covers in a proportionate way.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

This goes to the heart of more than just one person’s Twitter feed, although we could say that that person is an incredibly powerful and influential figure on the platform. In the past 24 hours, Twitter has disbanded its trust and safety council. Members of that council included expert groups working to tackle harassment and child sexual exploitation, and to promote human rights. Does the Minister not feel that the council being disbanded goes to the heart of what we have been debating? It shows how a platform can remove or change at whim the very terms of service that are there to prevent harm from being perpetrated on that platform.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that will drive users to alternative platforms. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Does my hon. Friend agree that this is why the Bill is structured in the way it is? We have a wide range of priority illegal offences that companies have to address, so it is not down to Elon Musk to determine whether he has a policy on race hate. They have to meet the legal standards set, and that is why it is so important to have that wide range of priority illegal offences. If companies go beyond that and have higher safety standards in their terms of service, that is checked as well. However, a company cannot avoid its obligations simply by changing its terms of service.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My hon. Friend is absolutely right. We are putting in those protections, but we want companies to have due regard to freedom of speech.

I want to clarify a point that my hon. Friend made earlier about guidance on the new accountability, transparency and free speech duties. Companies will be free to set any terms of service that they want to, subject to their other legal obligations. That is related to the conversations that we have just been having. Those duties require companies to enforce their terms of service properly, and not to remove content or ban users except in accordance with those terms. There will be no platform risk assessments or codes of practice associated with those new duties. Instead, Ofcom will issue guidance on how companies can comply with their duties, rather than codes of practice. That will focus on how companies set their terms of service, but companies will not be required to set terms directly for specific types of content or to cover risks. I hope that is clear.

To answer the point made by the hon. Member for Pontypridd, I agree with the overall sentiment about how we need to protect freedom of expression.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I want to be clear on my point. My question was not related to how platforms set their terms of service, which is a matter for them and they are held to account for that. If we are now bringing in requirements to say that companies cannot go beyond terms of service or their duties in the Bill if they are going to moderate content, who will oversee that? Will Ofcom have a role in checking whether platforms are over-moderating, as the Minister referred to earlier? In that case, where those duties exist elsewhere in the Bill, we have codes of practice in place to make sure it is clear what companies should and should not do. We do not seem to be doing that with this issue.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.

Question put and agreed to.

Clause 20, as amended, accordingly ordered to stand part of the Bill.

Clause 21

Record-keeping and review duties

Amendments made: 32, in clause 21, page 23, line 5, leave out “, 10 or 12” and insert “or 10”.

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 33, in clause 21, page 23, line 45, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 34, in clause 21, page 24, line 6, leave out “section” and insert “sections”.

This amendment is consequential on Amendment 35.

Amendment 35, in clause 21, page 24, line 6, at end insert—

“, (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (duties about terms of service).”—(Paul Scully.)

This amendment ensures that providers have a duty to review compliance with the duties set out in NC3 and NC4 regularly, and after making any significant change to the design or operation of the service.

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Given that there are few changes to this clause from when the Bill was amended in the previous Public Bill Committee, I will be brief. We in the Opposition are clear that record-keeping and review duties on in-scope services make up an important function of the regulatory regime and sit at the very heart of the Online Safety Bill. We must push platforms to transparently report all harms identified and the action taken in response, in line with regulation.

16:15
The requirements to keep records of the action taken in response to harm will be vital in supporting the regulator in making effective decisions about regulatory breaches and on whether the company responses are sufficient. They will also be vital to understanding the success of the regime once it is in place. We see the clause as central to preventing concerns over the under-reporting of harms to evade regulation.
We already know that under-reporting exists. We only have to turn to the testimony of many whistleblowers—colleagues will be aware of those who have bravely shared their concerns over the lack of transparency in this space—to know that we are often not presented with the full picture on the scale of the harm.
Labour has not sought to amend the clause, but once again I must reiterate a point that we have pushed on numerous occasions—namely, the importance of requiring in-scope services to publish their risk assessments. The Government have on a number of occasions refused to recognise the significance of that level of transparency, yet it could bring great benefits, as it would allow researchers and civil society to track harms and hold services to account. Again, I press the Minister and urge him to ensure that the risk assessments are published.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Specifically on the issue that was just raised, there were two written ministerial statements on the Online Safety Bill. The first specifically said that an amendment would

“require the largest platforms to publish summaries of their risk assessments for illegal content and material that is harmful to children, to allow users and empower parents to clearly understand the risks presented by these services and the approach platforms are taking to children’s safety”.—[Official Report, 29 November 2022; Vol. 723, c. 31WS.]

Unless I have completely missed an amendment that has been tabled for this Committee, my impression is that that amendment will be tabled in the Lords and that details will be made available about how exactly the publishing will work and which platforms will be required to publish.

I would appreciate it if the Minister could provide more clarity about what that might look like, and about which platforms might have to publish their assessments. I appreciate that that will be scrutinised in the Lords but, to be fair, this is the second time that the Bill has been in Committee in the Commons. It would be helpful if we could be a bit more sighted on what exactly the Government intend to do—meaning more than the handful of lines in a written ministerial statement—because then we would know whether the proposal is adequate, or whether we would have to ask further questions in order to draw it out and ensure that it is published in a certain form. The more information the Minister can provide, the better.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I think we all agree that written records are hugely important. They are important as evidence in cases where Ofcom is considering enforcement action, and a company’s compliance review should be done regularly, especially before it makes changes to its service.

The Bill does not intend to place excessive burdens on small and low-risk businesses. As such, clause 21 provides Ofcom with the power to exempt certain types of service from the record-keeping and review duties. However, the details of any exemptions must be published.

To half-answer the point made by the hon. Member for Aberdeen North, the measures will be brought to the Lords, but I will endeavour to keep her up to date as best we can so that we can continue the conversation. We have served together on several Bill Committees, including on technical Bills that required us to spend several days in Committee—although they did not come back for re-committal—so I will endeavour to keep her and, indeed, the hon. Member for Pontypridd, up to date with developments.

Question put and agreed to. 

Clause 21, as amended, accordingly ordered to stand part of the Bill.

Clause 30

Duties about freedom of expression and privacy

Amendments made: 36, in clause 30, page 31, line 31, after “have” insert “particular”.

This amendment has the result that providers of regulated search services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

Amendment 37, in clause 30, page 31, line 34, after “have” insert “particular”.—(Paul Scully.)

This amendment has the result that providers of regulated search services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Clause 30, as amended, ordered to stand part of the Bill.

Clause 46

Relationship between duties and codes of practice

Amendments made: 38, in clause 46, page 44, line 27, after “have” insert “particular”.

This amendment has the result that providers of services who take measures other than those recommended in codes of practice in order to comply with safety duties must have particular regard to freedom of expression and users’ privacy.

Amendment 39, in clause 46, page 45, line 12, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 40, in clause 46, page 45, line 31, at end insert “, or

(ii) a duty set out in section 14 (user empowerment);”.—(Paul Scully.)

This amendment has the effect that measures recommended in codes of practice to comply with the duty in clause 14 are relevant to the question of whether a provider is complying with the duties in clause 20(2) and (3) (having regard to freedom of expression and users’ privacy).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I do not wish to repeat myself and test the Committee’s patience, so I will keep my comments brief. As it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice provided for in the Bill. However, providers could take alternative measures to comply, but as I said in previous Committee sittings, Labour remains concerned that the definition of “alternative measures” is far too broad. I would be grateful if the Minister elaborated on his assessment of the instances in which a service provider may seek to comply via alternative measures.

The codes of practice should be, for want of a better phrase, best practice. Labour is concerned that, to avoid the duties, providers may choose to take the “alternative measures” route as an easy way out. We agree that it is important to ensure that providers have a duty with regard to protecting users’ freedom of expression and personal privacy. As we have repeatedly said, the entire Online Safety Bill regime relies on that careful balance being at the forefront. We want to see safety at the forefront, but recognise the importance of freedom of expression and personal privacy, and it is right that those duties are central to the clause. For those reasons, Labour has not sought to amend this part of the Bill, but I want to press the Minister on exactly how he sees this route being used.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

It is important that service providers have flexibility, so that the Bill does not disincentivise innovation or force service providers to use measures that might not work for all business models or technological contexts. The tech sector is diverse and dynamic, and it is appropriate that companies can take innovative approaches to fulfilling their duties. In most circumstances, we expect companies to take the measures outlined in Ofcom’s code of practice as the easiest route to compliance. However, where a service provider takes alternative measures, Ofcom must consider whether those measures safeguard users’ privacy and freedom of expression appropriately. Ofcom must also consider whether they extend across all relevant areas of a service mentioned in the illegal content and children’s online safety duties, such as content moderation, staff policies and practices, design of functionalities, algorithms and other features. Ultimately, it will be for Ofcom to determine a company’s compliance with the duties, which are there to ensure users’ safety.

Question put and agreed to.

Clause 46, as amended, accordingly ordered to stand part of the Bill.

Clause 55 disagreed to.

Clause 56

Regulations under sections 54 and 55

Amendments made: 42, in clause 56, page 54, line 40, leave out subsection (3).

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 43, in clause 56, page 54, line 46, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 44, in clause 56, page 55, line 8, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 45, in clause 56, page 55, line 9, leave out

“or adults are to children or adults”

and insert “are to children”.—(Paul Scully.)

This amendment is consequential on Amendment 41 (removal of clause 55).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

As we know, the clause makes provision in relation to the making of regulations designating primary priority and priority content that is harmful to children, and priority content that is harmful to adults. The Secretary of State may specify a description of content in regulations only if they consider that there is a material risk of significant harm to an appreciable number of children or adults in the United Kingdom presented by user-generated or search content of that description, and must consult Ofcom before making such regulations.

In the last Bill Committee, Labour raised concerns that there were no duties that required the Secretary of State to consult others, including expert stakeholders, ahead of making these regulations. That decision cannot be for one person alone. When it comes to managing harmful content, unlike illegal content, we can all agree that it is about implementing systems that prevent people from encountering it, rather than removing it entirely.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

The fact that we are here again to discuss what one Secretary of State wanted to put into law, and which another is now seeking to remove before the law has even been introduced, suggests that my hon. Friend’s point about protection and making sure that there are adequate measures within which the Secretary of State must operate is absolutely valid.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree: we are now on our third Secretary of State, our third Minister and our third Prime Minister since we began considering this iteration of the Bill. It is vital that this does not come down to one person’s ideological beliefs. We have spoken at length about this issue; the hon. Member for Don Valley has spoken about his concerns that Parliament should be sovereign, and should make these decisions. It should not be for one individual or one stakeholder to make these determinations.

We also have issues with the Government’s chosen toggle approach—we see that as problematic. We have debated it at length, but our concerns regarding clause 56 are about the lack of consultation that the Secretary of State of the day, whoever that may be and whatever political party they belong to, will be required to undertake before making widespread changes to the regime. I am afraid that those concerns still exist, and are held not just by us, but by stakeholders and by Members of all political persuasions across the House. However, since our proposed amendment was voted down in the previous Bill Committee, nothing has changed. I will spare colleagues from once again hearing my pleas about the importance of consultation when it comes to determining all things related to online safety, but while Labour Members do not formally oppose the clause, we hope that the Minister will address our widespread concerns about the powers of the Secretary of State in his remarks.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I appreciate the hon. Lady’s remarks. We have tried to ensure that the Bill is proportionate, inasmuch as the Secretary of State can designate content if there is a material risk of significant harm to an appreciable number of children in the United Kingdom. The Bill also requires the Secretary of State to consult Ofcom before making regulations on the priority categories of harm.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I appreciate that this point has been made about the same wording earlier today, but I really feel that the ambiguity of “appreciable number” is something that could do with being ironed out. The ambiguity and vagueness of that wording make it very difficult to enforce the provision. Does the Minister agree that “appreciable number” is too vague to be of real use in legislation such as this?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

On that point, which we discussed earlier—my hon. Friend the Member for Warrington North discussed it—I am struggling to understand what is an acceptable level of harm, and what is the acceptable number of people to be harmed, before a platform has to act.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

It totally depends on the scenario. It is very difficult for me to stand here now and give a wide range of examples, but the Secretary of State will be reacting to given situations, rather than trying to predict them.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister has just outlined exactly what our concerns are. He is unable to give an exact number, figure or issue, but that is what the Secretary of State will have to do, without having to consult any stakeholders regarding that issue. There are many eyes on us around the world, with other legislatures looking at us and following suit, so we want the Bill to be world-leading. Many Governments across the world may deem homosexuality, for example, to be harmful to children. Because this piece of legislation sets a precedent, a Secretary of State in such a Government could determine that any platform in that country should take down all that content. Does the Minister not see our concerns in that scenario?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I was about to come on to the fact that the Secretary of State would be required to consult Ofcom before making regulations on the priority categories of harm. Indeed Ofcom, just like the Secretary of State, speaks to and engages with a number of stakeholders on this issue to gain a deeper understanding. Regulations designating priority harms would be made under the draft affirmative resolution procedure, but there is also provision for the Secretary of State to use the made affirmative resolution procedure in urgent scenarios, and this would be an urgent scenario. It is about getting the balance right.

16:30
Following amendments 42 to 45, the definition of priority harmful content to adults and the power for the Secretary of State to designate categories of priority harmful content to adults have been removed. These amendments update clause 56 to reflect the removal of the adult safety duties and the concept of legal but harmful content from the Bill.
Question put and agreed to.
Clause 56, as amended, accordingly ordered to stand part of the Bill.
Clause 65
Transparency reports about certain Part 3 services
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

As we know, this clause requires providers of relevant services to publish annual transparency reports and sets out Ofcom’s powers in relation to those reports. The information set out in transparency reports is intended to help users to understand the steps that providers are taking to help keep them safe and to provide Ofcom with the information required to hold them to account.

These duties on regulated services are very welcome indeed. Labour has long held the view that mandatory transparency reporting and reporting mechanisms are vital to hold platforms to account, and to understand the true nature of how online harm is driven and perpetuated on the internet.

I will reiterate the points that were made in previous Committee sittings about our concerns over the regularity of these transparency reports. I note that, sadly, those provisions remain unchanged, and therefore the reports will have to be submitted to Ofcom only annually. It is important that the Minister truly considers the rapid rate at which the online world can change and develop, so I urge him to reconsider this point and to make these reports a biannual occurrence. Labour firmly believes that increasing the frequency of the transparency reports will ensure that platforms and services keep their finger on the pulse, and are forced to be aware of and act on emergent risks. In turn, that would compel Ofcom to do the same in its role as an industry regulator.

I must also put on the record some of our concerns about subsections (12) and (13), which state that the Secretary of State of the day could amend by regulation the frequency of the transparency reporting, having consulted Ofcom first. I hope that the Minister can reassure us that this approach will not result in our ending up in a position where, perhaps because of Ofcom’s incredible workload, transparency reporting becomes even less frequent than an annual occurrence. We need to see more transparency, not less, so I really hope that he can reassure me on this particular point.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Does my hon. Friend agree that transparency should be at the heart of this Bill and that the Government have missed an opportunity to accelerate the inclusion of a provision in the Bill, namely the requirement to give researchers and academics access to platform data? Data access must be prioritised in the Bill and without such prioritisation the UK will fall behind the rest of Europe in safety, research and innovation. The accessibility and transparency of that data from a research perspective are really important.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree with my hon. Friend. We both made the point at length in previous sittings of the Committee about the need to ensure transparency, access to the data, and access to reporting for academics, civil society and researchers.

That also goes to the point that it is not for this Committee or this Minister—it is not in his gift—to determine something that we have all discussed in this place at length, which is the potential requirement for a standalone Committee specifically to consider online harm. Such a Committee would look at whether this legislation is actively doing what we need it to do, whether it needs to be reviewed, whether it could look at the annual reports from Ofcom to determine the length and breadth of harm on the internet, and whether or not this legislation is actually having an impact. That all goes to the heart of transparency, openness and the review that we have been talking about.

I want to go further and raise concerns about how public the reports will be, as we have touched on. The Government claim that their so-called triple shield approach will give users of platforms and services more power and knowledge to understand the harms that they may discover online. That is in direct contradiction to the Bill’s current approach, which does not provide any clarity about exactly how the transparency reports will be made available to the public. In short, we feel that the Government are missing a significant opportunity. We have heard many warnings about what can happen when platforms are able to hide behind a veil of secrecy. I need only point to the revelations of whistleblowers, including Frances Haugen, to highlight the importance of that point.

As the Bill stands, once Ofcom has issued a notice, companies will have to produce a transparency report that

“must…be published in the manner and by the date specified in the notice”.

I want to press the Minister on that and ask him to clarify the wording. We are keen for the reports to be published publicly and in an accessible way, so that users, civil society, researchers and anyone else who wants to see them can make sense of them. The information contained in the transparency reports is critical to analysing trends and harms, so I hope that the Minister will clarify those points in his response.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Does my hon. Friend agree that if the Government are to achieve their objective—which we all share—for the Bill to be world-leading legislation, we cannot rely on whistleblowers to tell us what is really going on in the online space? That is why transparency is vital. This is the perfect opportunity to provide that transparency, so that we can do some proper research into what is going on out there. We cannot rely on whistleblowers to give us such information.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetrated there. If there is no transparency, how will we know whether the platforms are covertly breaking the rules, hiding content and getting round the rules? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to highlight exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I want to add to the brilliant points made by my hon. Friend the shadow Minister, in particular on the continually changing nature of market forces, which the Minister himself referenced. We want innovation. We want the tech companies to innovate—preferably ones in the UK—but we do not want to be playing catch-up as we are now, making legislation retrospectively to right wrongs that have taken place because our legislative process has been too slow to deal with the technological changes and the changes in social media, in apps, and with how we access data and communicate with one another online. The bare minimum is a biannual report.

Within six months, if a new piece of technology comes up, it does not simply stay with one app or platform; that technology will be leapfrogged by others. Such technological advances can take place at a very rapid pace. The transparency aspect is important, because people should have a right to know what they are using and whether it is safe. We as policy makers should have a right to know clearly whether the legislation that we have introduced, or the legislation that we want to amend or update, is effective.

If we look at any other approach that we take to protect the health and safety of the people in our country—the people we all represent in our constituencies—we always say that prevention is better than cure. At the moment, without transparency, and without researchers being able to keep up to date the information that we need to see, we will constantly be playing catch-up with digital tech.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making mandatory the requirement for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes, allowing people to exploit them in a way that we do not want them to do.

Specifically on the regularity of reporting and some level of transparency, given that the Minister is keen on the commercial imperative and ensuring that people are safe, we need a higher level of transparency than we currently see among the platforms. There is a very good case to be made for some of the transparency reporting to be made public, particularly for the very largest platforms to be required to make it public, or to make sections of it public.

I want to talk about the speed of change to the terms of service and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, and Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that the terms of service can change quickly. Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to the services.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that, as a small platform, Mastodon was previously regulated, and Ofcom had made a request for transparency information shortly before Elon Musk took over Twitter and people had migrated to Mastodon. Mastodon would now be suffering from very different issues than those it had when it had a small number of users, compared with the significant number that it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. We know that platforms in the online world have sudden stellar increases in popularity overnight. Some have been bubbling along for ages with nobody using them. Not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers. They are not able to look at the issues in the same way, and are not necessarily the most prominent experts in the field of child protection, for example. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The hon. Lady makes an important point. In terms of transparency, the question for me is, what are the Government worried about? Surely part of the Bill is about finding out what is really going on, and the only way that we will do that is by having access to the information. The more transparency, the better. The hon. Lady is right that having experts who can research what is going on is fundamental. If there is a concern around the workload for Ofcom, that is a separate issue that the Minister needs to address, but surely the more work that is done in terms of research and transparency, the better.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

We have seen that just from the people from external organisations who have contacted us about the Bill. The expertise that they have brought to the table, which we do not have ourselves, has significantly improved the debate and, hopefully, the Bill. Even prior to this Committee, the consultations that happened encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill—in order to improve it and to get expert advice. I still think that having specific access to expertise in order to analyse the transparency reports has not been covered adequately.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Annual transparency reporting is an important part of how the system will work. Transparency is one of the most important aspects of how the Online Safety Bill works, because without it companies can hide behind the transparency reports they produce at the moment, which give no transparency at all. For example, Facebook and YouTube report annually that their AI finds 95% of the hate speech they remove, but Frances Haugen said that they removed only 5% of the hate speech. So the transparency report really means that their AI finds 95% of the 5% of hate speech that is actually removed, and that is one of the fundamental problems. The Bill gives the regulator the power to know, and the regulator then has to make informed decisions based on the information it has access to.

16:45
Ofcom is also acting with statutory powers, which is different from how other researchers or organisations that might be appointed would work. The nature of the relationship between Ofcom and the regulated platforms is very different from that with a company that is open to independent scrutiny from independent researchers. Of course, the Bill does not limit Ofcom to just doing annual transparency reports. Ofcom can appoint what I think the Bill calls a “skilled person”, although I think an Ofcom special agent is a better description. At any time, Ofcom can appoint a skilled person—an expert—to go into the company and analyse particular problems. If it was a case of a change of ownership and new risks on the platform that were not previously foreseen, or a big concern about the platform’s performance, Ofcom can appoint that person.
Of course, Ofcom would be free to appoint outside experts, not just people from within the organisation. It could bring in a specialist with particular knowledge of an area where it had concerns. It could do that at any time and appoint as many people as it liked.
Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

As much as I am keen on the idea of Ofcom special agents conceptually, my concern on the transparency front is that, to appoint a special agent and send them in to look at the data, Ofcom would have to have cause to believe that there was an issue of concern with the data, whereas if that data is more transparently available to the research community, they can then proactively identify things that they can flag to Ofcom as a concern. Without that, we are relying on an annual cycle of Ofcom being able to intervene only when they have a concern, rather than the research community, which is much better placed to make that determination, being able to keep a watching brief on the company.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

That concern would be triggered by Ofcom discovering things as a consequence of user complaint. Although Ofcom is not a complaint resolution company, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.

As the hon. Member for Aberdeen North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.

Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I note what the Minister said about the commercial implications of some of these things; some of those commercial implications might act as levers to push companies to do better. By that same token, should this information not be more transparent and publicly available, to give the user the choice he referred to earlier? That would mean that if a user’s data were not being properly protected, and these companies were not taking the safety measures that the public would expect, users could vote with their feet and go to a different platform. Surely that underpins a lot of what we have been talking about.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Yes, and that is why Ofcom will be the one that decides which information should be published, and from whom, to ensure that it is proportionate. At the end of the day, I have talked about the fact that transparency is at the heart of the Bill and that the transparency reports are important. To go to the original point raised by the hon. Member for Pontypridd about when these reports will be published, they will indeed be published in accordance with subsection (3)(d) of the clause.

Question put and agreed to.

Clause 65 accordingly ordered to stand part of the Bill.

Schedule 8

Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services

Amendments made: 61, in schedule 8, page 203, line 13, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 62, in schedule 8, page 203, line 15, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 63, in schedule 8, page 203, line 17, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 64, in schedule 8, page 203, line 21, leave out from “or” to end of line 23 and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about user reporting of content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 65, in schedule 8, page 203, line 25, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 66, in schedule 8, page 203, line 29, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 67, in schedule 8, page 203, line 41, at end insert—

“11A Measures taken or in use by a provider to comply with any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service).”

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by NC3 and NC4.

Amendment 68, in schedule 8, page 204, line 2, leave out from “illegal content” to end of line 3 and insert

“or content that is harmful to children—”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 69, in schedule 8, page 204, line 10, leave out from “illegal content” to “, and” in line 12 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 70, in schedule 8, page 204, line 14, leave out from “illegal content” to “present” in line 15 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 71, in schedule 8, page 205, line 38, after “Part 3” insert

“or Chapters 1 to 2A of Part 4”.—(Paul Scully.)

This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6 (and Chapter 1 of Part 4).

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 72, in schedule 8, page 206, line 5, at end insert—

“35A (1) For the purposes of this Schedule, content of a particular kind is ‘relevant content’ if—

(a) a term of service, other than a term of service within sub-paragraph (2), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and

(b) it is regulated user-generated content.

(2) The terms of service within this sub-paragraph are as follows—

(a) terms of service which make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children);

(b) terms of service which deal with the treatment of consumer content.

(3) References in this Schedule to relevant content are to content that is relevant content in relation to the service in question.”

This amendment defines “relevant content” for the purposes of Schedule 8.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government amendments 73 and 75.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The amendments to schedule 8 confirm that references to relevant content, consumer content and regulated user-generated content have the same meaning as established by other provisions of the Bill. Again, that ensures consistency, which will, in turn, support Ofcom in requiring providers of category 1 services to give details in their annual transparency reports of their compliance with the new transparency, accountability and freedom of expression duties.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will keep my comments on this grouping brief, because I have already raised our concerns and our overarching priority in terms of transparency reports in the previous debate, which was a good one, with all Members highlighting the need for transparency and reporting in the Bill. With the Chair’s permission, I will make some brief comments on Government amendment 72 before addressing Government amendments 73 and 75.

It will come as no surprise to the Minister that amendment 72, which defines relevant content for the purposes of schedule 8, has a key omission: it does not cover priority content that is harmful to adults. For reasons we have covered at length, we think that it is a gross mistake on the Government’s side to attempt to water down the Bill in this way. If the Minister is serious about keeping adults safe online, he must reconsider this approach. However, we are happy to see amendments 73 and 75, which define consumer content and regulated user-generated content. It is important for all of us—whether we are politicians, researchers, academics, civil society, stakeholders, platforms, users or anyone else—that these definitions are in the Bill so that, when it is passed, it can be applied properly and at pace. That is why we have not sought to amend this grouping.

I must press the Minister to respond on the issues around relevant content as outlined in amendment 72. We feel strongly that more needs to be done to address this type of content and its harm to adults, so I would be grateful to hear the Minister’s assessment of how exactly these transparency reports will report back on this type of harm, given its absence in this group of amendments and the lack of a definition.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am pleased to see the list included and the number of things on which Ofcom can ask for more information. I have a specific question about amendment 75, which talks about regulated user-generated content and says that it has the same meaning as in the interpretation provisions of part 3, under clause 50. The Minister may or may not know that there are concerns about clause 50(5), which relates to

“One-to-one live aural communications”.

One-to-one live aural communications are exempted. I understand that that is because the Government do not believe that telephony services, for example, should be part of the Online Safety Bill—that is a pretty reasonable position for them to take. However, allowing one-to-one live aural communications not to be regulated means that if someone is using voice chat in Fortnite, for example, and there are only two people on the team that they are on, or if someone is using voice chat in Discord and there are only two people online on the channel at that time, that is completely unregulated and not taken into account by the Bill.

I know that that is not the intention of the Bill, which is intended to cover user-generated content online. The exemption is purely in place for telephony services, but it is far wider than the Government intend it to be. With the advent of more and more people using virtual reality technology, for example, we will have more and more aural communication between just two people, and that needs to be regulated by the Bill. We cannot just allow a free-for-all.

If we have child protection duties, for example, they need to apply to all user-generated content and not exempt it specifically because it is a live, one-to-one aural communication. Children are still at significant risk from this type of communication. The Government have put this exemption in because they consider such communication to be analogous to telephony services, but it is not. It is analogous to telephony services if we are talking about a voice call on Skype, WhatsApp or Signal—those are voice calls, just like telephone services—but we are talking about a voice chat that people can have with people who they do not know, whose phone number they do not know and who they have no sort of relationship with.

Some of the Discord servers are pretty horrendous, and some of the channels are created by social media influencers or people who have pretty extreme views in some cases. We could end up with a case where the Discord server and its chat functions are regulated, but if aural communication or a voice chat is happening on that server, and there are only two people online because it is 3 o’clock in the morning where most of the people live and lots of them are asleep, that would be exempted. That is not the intention of the Bill, but the Government have not yet fixed this. So I will make one more plea to the Government: will they please fix this unintended loophole, so that it does not exist? It is difficult to do, but it needs to be done, and I would appreciate it if the Minister could take that into consideration.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not believe that Ofcom’s transparency powers have been watered down. It is really important that the Bill’s protection for adults strikes the right balance with its protections for free speech, which is why we have replaced the “legal but harmful” clause. I know we will not agree on that, but there are more new duties that will make platforms more accountable. Ofcom’s transparency powers will enable it to assess compliance with the new safety duties and hold platforms accountable for enforcing their terms of service to keep users safe. Companies will also have to report on the measures that they have in place to tackle illegal content or activity and content that is harmful to children, which includes proactive steps to address offences such as child sexual exploitation and abuse.

The legislation will set out high-level categories of information that companies may be required to include in their transparency reports, and Ofcom will then specify the information that service providers will need to include in those reports, in the form of a notice. Ofcom will consider companies’ resources and capacity, service type and audience in determining what information they will need to include. It is likely that the information that is most useful to the regulator and to users will vary between different services. To ensure that the transparency framework is proportionate and reflects the diversity of services in scope, the transparency reporting requirements set out in the Ofcom notice are likely to differ between those services, and the Secretary of State will have powers to update the list of information that Ofcom may require to reflect any changes of approach.

17:03
Let me address the interesting point about aural exemptions made by the hon. Member for Aberdeen North. As she says, the exemption is there to ensure that we do not capture traditional phone calls. Phones have moved from POTS to PANs—from the plain old telephone system to public access networks—and beyond over the last 20 years. Although one-to-one live aural communications are exempt, other types of interactions between adults and children, including in-game private messaging and chat functions and video calls, are in scope. If there are unintended consequences—the hon. Lady will know that I was described as the Minister for unintended consequences when I was at the Department for Business, Energy and Industrial Strategy—I would be happy to continue chatting with her and others to ensure that we get that difficult position right.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The in-game chat that children use is overwhelmingly voice chat. Children do not type if they can possibly avoid it. I am sure that that is not the case for all children, but it is for most children. Aural communication is used if someone is playing Fortnite duos, for example, with somebody they do not know. That is why that needs to be included.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I very much get that point. It is not something that I do, but I have certainly seen it myself. I am happy to chat to the hon. Lady to ensure that we get it right.

Amendment 72 agreed to.

Amendments made: 73, in schedule 8, page 206, line 6, at end insert—

“‘consumer content’ has the same meaning as in Chapter 2A of Part 4 (see section (Interpretation of this Chapter)(3));”.

This amendment defines “consumer content” for the purposes of Schedule 8.

Amendment 74, in schedule 8, page 206, leave out lines 7 and 8.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 75, in schedule 8, page 206, line 12, at end insert—

“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question;”.—(Paul Scully.)

This amendment defines “regulated user-generated content” for the purposes of Schedule 8.

Schedule 8, as amended, agreed to.

Ordered, That further consideration be now adjourned. —(Mike Wood.)

17:02
Adjourned till Thursday 15 December at half-past Eleven o’clock.
Written evidence reported to the House
OSB101 Mencap
OSB102 News Media Association (NMA)
OSB103 Dr Edina Harbinja, Reader in law, Aston Law School, Aston Business School, and Deputy Editor of the Computer Law and Security Review
OSB104 Carnegie UK
OSB105 Full Fact
OSB106 Antisemitism Policy Trust
OSB107 Big Brother Watch
OSB108 Microsoft
OSB109 Internet Society
OSB110 Parent Zone
OSB111 Robin Wilton
OSB112 Wikimedia Foundation