All 25 Alex Davies-Jones contributions to the Online Safety Act 2023

Tue 19th Apr 2022 - Online Safety Bill, Commons Chamber (2nd reading)
Tue 24th May 2022
Tue 24th May 2022
Thu 26th May 2022 - Online Safety Bill (Third sitting), Public Bill Committees (Committee stage: 3rd sitting)
Thu 26th May 2022 - Online Safety Bill (Fourth sitting), Public Bill Committees (Committee stage: 4th sitting)
Tue 7th Jun 2022
Tue 7th Jun 2022
Thu 9th Jun 2022
Thu 9th Jun 2022
Tue 14th Jun 2022
Tue 14th Jun 2022
Thu 16th Jun 2022
Thu 16th Jun 2022
Tue 21st Jun 2022 - Online Safety Bill (Thirteenth sitting), Public Bill Committees (Committee stage: 13th sitting)
Tue 21st Jun 2022
Thu 23rd Jun 2022
Tue 28th Jun 2022
Tue 28th Jun 2022
Tue 12th Jul 2022 - Online Safety Bill, Commons Chamber (Report stage: day 1)
Mon 5th Dec 2022
Tue 13th Dec 2022 - Online Safety Bill (First sitting), Public Bill Committees (Committee stage, re-committed clauses and schedules: 1st sitting)
Tue 13th Dec 2022 - Online Safety Bill (Second sitting), Public Bill Committees (Committee stage, re-committed clauses and schedules: 2nd sitting)
Thu 15th Dec 2022 - Online Safety Bill (Third sitting), Public Bill Committees (Committee stage, re-committed clauses and schedules: 3rd sitting)
Tue 17th Jan 2023
Tue 12th Sep 2023 - Online Safety Bill, Commons Chamber (Consideration of Lords amendments)

Online Safety Bill

Alex Davies-Jones Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Alex Davies-Jones (Pontypridd) (Lab)

It is an honour to close this debate on behalf of the Opposition. Sadly, there is so little time for the debate that there is much that we will not even get to probe, including any mention of the Government’s underfunded and ill-thought-through online media strategy.

However, we all know that change and regulation of the online space are much needed, so Labour welcomes this legislation even in its delayed form. The current model, which sees social media platforms and tech giants making decisions about what content is hosted and shared online, is simply failing. It is about time that that model of self-regulation, which gives too much control to Silicon Valley, was challenged.

Therefore, as my hon. Friend the Member for Manchester Central (Lucy Powell) said, Labour broadly supports the principles of the Bill and welcomes some aspects of the Government’s approach, including the duty of care frameworks and the introduction of an independent regulator, Ofcom. It cannot and should not be a matter for the Government of the time to control what people across the UK are able to access online. Labour will continue to work hard to ensure that Ofcom remains truly independent of political influence.

We must also acknowledge, however, that after significant delays this Bill is no longer world leading. The Government first announced their intention to regulate online spaces all the way back in 2018. Since then, the online space has remained unregulated and, in many cases, has perpetuated dangerous and harmful misinformation with real-world consequences. Colleagues will be aware of the sheer amount of coronavirus vaccine disinformation so easily accessed by millions online at the height of the pandemic. Indeed, in many respects, it was hard to avoid.

More recently, the devastating impact of state disinformation at the hands of Putin’s regime has been clearer than ever, almost two years after Parliament’s own Intelligence and Security Committee called Russian influence in the UK “the new normal”.

Deidre Brock

Does the hon. Lady share my disappointment and concern that the Bill does nothing to address misinformation and disinformation in political advertising? A rash of very aggressive campaign groups emerged before the last Scottish Parliament elections, for example; they spent heavily on online political advertising, but were not required to reveal their political ties or funding sources. That is surely not right.

Alex Davies-Jones

I share the hon. Lady’s concern. There is so much more that is simply missing from this Bill, which is why it is just not good enough. We have heard in this debate about a range of omissions from the Bill and the loopholes that, despite the years of delay, have still not been addressed by the Government. I thank hon. Members on both sides of the House for pointing those out. It is a shame that we are not able to address them individually here, but we will probe those valued contributions further in the Bill Committee.

Despite huge public interest and a lengthy prelegislative scrutiny process, the Government continue to ignore many key recommendations, particularly around defining and regulating both illegal and legal but harmful content online. The very nature of the Bill and its heavy reliance on secondary legislation to truly flesh out the detail leaves much to be desired. We need to see action now if we are truly to keep people safe online.

Most importantly, this Bill is an opportunity, and an important one at that, to decide the kind of online world our children grow up in. I know from many across the House that growing up online as children do now is completely unimaginable. When I was young, we played Snake on a Nokia 3310, and had to wait for the dial-up and for people to get off the phone in order to go online and access MSN, but for people today access to the internet, social media and everything that brings is a fundamental part of their lives.

Once again, however, far too much detail, and the specifics of how this legislation will fundamentally change the user experience, is simply missing from the Bill. When it comes to harmful content that is not illegal, the Government have provided no detail. Despite the Bill’s being years in the making, we are no closer to understanding the impact it will have on users.

The Bill in its current draft has a huge focus on the tools for removing and moderating harmful content, rather than ensuring that design features are in place to make services systematically safer for all of us. The Government are thus at real risk of excluding children from being able to participate in the digital world freely and safely. The Bill must not lock children out of services they are entitled to use; instead, it must focus on making those services safe by design.

I will push the Minister on this particular point. We are all eager to hear what exact harms platforms will have to take steps to address and mitigate. Will it be self-harm? Will it perhaps be content promoting eating disorders, racism, homophobia, antisemitism and misogyny? One of the key problems with the Bill is the failure to make sure that the definitions of “legal but harmful” content are laid out within it. Will the Minister therefore commit to amending the Bill to address this and to allow for proper scrutiny? As we have heard, the Government have also completely failed to address what stakeholders term the problem of breadcrumbing. I would be grateful if the Minister outlined what steps the Government will be taking to address this issue, as there is clearly a loophole in the Bill that would allow this harmful practice to continue.

As we have heard, the gaps in the Bill, sadly, do not end there. Women and girls are disproportionately likely to be affected by online abuse and harassment. Online violence against women and girls is defined as including but not limited to

“intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive ‘sexting’, and the creation and sharing of ‘deepfake’ pornography.”

This Bill is an important step forward but it will need significant strengthening to make online spaces safe for women and girls. While we welcome the steps by the Government to include cyber-flashing in the Bill, it must go further in other areas. Misogyny should be included as a harm to adults that online platforms have a duty to prevent from appearing on them. As colleagues will be aware, Instagram has been completely failing to tackle misogynistic abuse sent via direct message. The Centre for Countering Digital Hate has exposed what it terms an “epidemic of misogynistic abuse”, 90% of which has been completely and utterly ignored by Instagram, even when it has been reported to moderators. The Government must see sense and put violence against women and girls into the Bill, and it must also form a central pillar of regulation around legal but harmful content. Will the Minister therefore commit to at least outlining the definitions of “legal but harmful” content, both for adults and children, in the Bill?

Another major omission from the Bill as currently drafted is its rather arbitrary categorisation of platforms based on size versus harm. As mentioned by many hon. Members, the categorisation system as it currently stands will completely fail to address some of the most extreme harms on the internet. Thanks to the fantastic work of organisations such as Hope not Hate and the Antisemitism Policy Trust, we know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote extremely dangerous content. The Minister must accept that his Department has been completely tone-deaf on this particular point, and—he must listen to what hon. Members have said today—its decision making is utterly inexplicable. Rather than an arbitrary size cut-off, the regulator must instead use risk levels to determine which category a platform should fall into so that harmful and dangerous content does not slip through the net. Exactly when will the Minister’s Department publish more information on the detail around this categorisation system? Exactly what does he have to say to those people, including many Members here today, who have found themselves the victim of abusive content that has originated on these hate-driven smaller platforms? How will this Bill change their experience of being online? I will save him the energy, because we all know the real answer: it will do little to change the situation.

This Bill was once considered a once-in-a-generation opportunity to improve internet safety for good, and Labour wants to work with the Government to get this right. Part of our frustration is due to the way in which the Government have failed to factor technological change and advancement—which, as we all know, and as we have heard today, can be extremely rapid—into the workings of this Bill. While the Minister and I disagree on many things, I am sure that we are united in saying that no one can predict the future, and that is not where my frustrations lie. Instead, I feel that the Bill has failed to address issues that are developing right now—from developments in online gaming to the expansion of the metaverse. These are complicated concepts but they are also a reality that we as legislators must not shy away from.

The Government have repeatedly said that the Bill’s main objective is to protect children online, and of course it goes without saying that Labour supports that. Yet with the Bill being so restricted to user-to-user services, there are simply too many missed opportunities to deal with areas where children, and often adults, are likely to be at risk of harm. Online gaming is a space that is rightly innovative and fast-changing, but the rigid nature of how services have been categorised will soon mean that the Bill is outdated long before it has had a chance to have a positive impact. The same goes for the metaverse.

While of course Labour welcomes the Government’s commitment to prevent under-18s from accessing pornography online, the Minister must be realistic. A regime that seeks to ban rather than prevent is unlikely to ever be able to keep up with the creative, advanced nature of the tech industry. For that reason, I must press the Minister on exactly how this Bill will be sufficiently flexible and future-proofed to avoid a situation whereby it is outdated by the time it finally receives Royal Assent. We must make sure that we get this right, and the Government know that they could and can do more. I therefore look forward to the challenge and to working with colleagues across the House to strengthen this Bill throughout its passage.

Online Safety Bill (First sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
The Chair

We are now sitting in public again, and the proceedings are being broadcast. Before we start hearing from the witnesses, do any Members wish to make declarations of interest in connection with the Bill?

Alex Davies-Jones (Pontypridd) (Lab)

The witness on Thursday’s sitting, Danny Stone from the Antisemitism Policy Trust, is an informal secretariat in a personal capacity to the all-party parliamentary group on wrestling, which I co-chair.

The Chair

That is noted.

--- Later in debate ---
The Chair

I will open up to the floor for questions now. I call Alex Davies-Jones.

Alex Davies-Jones

Q Good morning, both, and welcome to the Committee. The Bill as it stands places responsibility on Ofcom to regulate the 25,000 tech companies and the tens—if not hundreds—of thousands of websites within the UK. How does that look in practice? What technical and administrative capacity do you have to carry that function out, realistically?

Kevin Bakhurst: We should say that we feel the Bill has given us a very good framework to regulate online safety. We have been working closely with the Department for Digital, Culture, Media and Sport to make sure that the Bill gives us a practical, deliverable framework. There is no doubt that it is a challenge. As you rightly say, there will be potentially 25,000 platforms in scope, but we feel that the Bill sets out a series of priorities really clearly in terms of categories.

It is also for us to set out—we will be saying more about this in the next couple of months—how we will approach this, and how we will prioritise certain platforms and types of risk. It is important to say that the only way of achieving online safety is through what the Bill sets out, which is to look at the systems in place at the platforms, and not the individual pieces of content on them, which would be unmanageable.

Alex Davies-Jones

Q Thank you, Kevin. You mentioned the categorisation of platforms. A number of stakeholders, including the platforms themselves and charities, have quite rightly raised some serious concerns around the categorisation of platforms. Would you, the regulator, prefer a risk-based approach, or the categorisation as it stands within the Bill?

Richard Wronka: We completely recognise the concerns that have been raised by stakeholders, and we have been speaking to many of them ourselves, so we have first-hand experience. I think my starting point is that the Bill captures those high-risk services, which is a really important feature of it. In particular, responsibilities around illegal content apply across all services in scope. That means that, in practice, when we are regulating, we will take a risk-based approach to whom we choose to engage with, and to where we focus our effort and attention.

We recognise that some of the debate has been about the categorisation process, which is intended to pick up high-risk and high-reach services. We understand the logic behind that. Indeed, I think we would have some concerns about the workability of an approach that was purely risk-based in its categorisation. We need an approach that we can put into operation. Currently, the Bill focuses on the reach of services and their functionality. We would have some concerns about a purely risk-based approach in terms of whether it was something that we could put into practice, given the number of services in scope.

Alex Davies-Jones

Q May I bring you back to putting this into practice, and to the recategorisation of platforms in practice? If a category 2B platform as it stands in the Bill grows exponentially in size, and is spreading disinformation and incredibly harmful content quite quickly, how quickly would you be able to react as a regulator to recategorise that platform and bring it into scope as a category 1 platform? How long would that process take, and what would happen in the interim?

Richard Wronka: At the moment, the category 2B service would have transparency reporting requirements. That would be helpful, because it would be one way that the nature of harmful content on that platform could be brought to our attention, and to the public’s attention. We would also be looking at approaches that we could use to monitor the whole scope of the services, to ensure that we had a good grip of who was growing quickest and where the areas of risk were. Some of that is through engaging with the platforms themselves and a whole range of stakeholders, and some of it is through more advanced data and analytical techniques—“supervision technology”, as it is known in the regulatory jargon.

On the specifics of your question, if a company was growing very quickly, the Bill gives us the ability to look at that company again, to ask it for information to support a categorisation decision, and to recategorise it if that is the right approach and if it has met the thresholds set out by the Secretary of State. One of the thresholds regards the number of users, so if a company has moved over that threshold, we look to act as quickly as possible while running a robust regulatory process.

Alex Davies-Jones

Q So while that process is under way, there is no mechanism for you to take action against the platform.

Kevin Bakhurst: May I answer this? We have some experience of this already in the video-sharing platform regime, which is much more limited in scope, and we are already regulating a number of platforms, ranging from some very big ones such as Twitch, TikTok and Snap, down to some much smaller platforms that have caused us some concerns. We think we have the tools, but part of our approach will also be to focus on high-risk and high-impact content, even if it comes through small platforms. That is what we have already done with the video-sharing platform regime. We have to be agile enough to capture that and to move resources to it. We are doing that already with the video-sharing platform regime, even though we have only been regulating it for less than a year.

The Chair

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Mrs Maria Miller (Basingstoke) (Con)

Not immediately—go on, please.

Alex Davies-Jones

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Platforms categorised as category 1 will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime, and the protection-of-children part of the regime. The categorisation process may go on. It may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that it is entirely appropriate that the Secretary of State should be able to direct us on matters of national security and terrorist content. However, we have some concerns about the wider direction powers of the Secretary of State, and particularly the grounds on which the Secretary of State can direct public policy, and we have expressed those concerns previously.

We feel it is important that the independence of a regulator can be seen to be there and is there in practice. Legally, we feel it important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we have created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Mrs Miller

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was clarity as far as possible, and a balance between clarity and flexibility for this regime, which is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

--- Later in debate ---
Barbara Keeley

Just quickly, do coroners have sufficient powers? Should they have more powers to access digital data after the death of a child?

Andy Burrows: We can see what a protracted process it has been. There have been improvements to the process. It is currently a very lengthy process because of the mutual legal assistance treaty arrangements—MLAT, as they are known—by which injunctions have to be sought to get data from US companies. It has taken determination from some coroners to pursue cases, very often going up against challenges. It is an area where we think the arrangements could certainly be streamlined and simplified. The balance here should shift toward giving parents and families access to the data, so that the process can be gone through quickly and everything can be done to ease the heartbreak for families having to go through those incredibly traumatic situations.

Alex Davies-Jones

Q Very briefly, Dame Rachel, I will build on what you were just saying, based on your experience as a headteacher. When I make my school visits, the teachers overwhelmingly tell me how, on a daily basis, they have to deal with the fallout from an issue that has happened online or on social media. On that matter, the digital media literacy strategy is being removed from the Bill. What is your thinking on that? How important do you see a digital media literacy strategy being at the heart of whatever policy the Government try to make regarding online safety for children?

Dame Rachel de Souza: There is no silver bullet. This is now a huge societal issue and I think that some of the things that I would want to say would be about ensuring that we have in our educational arsenal, if you like, a curriculum that has a really strong digital media literacy element. To that end, the Secretary of State for Education has just asked me to review how online harms and digital literacy are taught in schools—reviewing not the curriculum, but how good the teaching is and what children think about how the subject has been taught, and obviously what parents think, too.

I would absolutely like to see the tech companies putting some significant funding into supporting education of this kind; it is exactly the kind of thing that they should be working together to provide. So we need to look at this issue from many aspects, not least education.

Obviously, in a dream world I would like really good and strong digital media literacy in the Bill, but actually it is all our responsibility. I know from my conversations with Nadhim Zahawi that he is very keen that this subject is taught through the national curriculum, and very strongly.

Kirsty Blackman

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

--- Later in debate ---
The Chair

We will now hear from Ben Bradley, government relations and public policy manager at TikTok, and Katy Minshall, head of UK public policy at Twitter. We have until 11.25 for this panel of witnesses. Could the witnesses please introduce themselves for the record?

Ben Bradley: I am Ben Bradley. I am a public policy manager at TikTok, leading on the Bill from TikTok.

Katy Minshall: I am Katy Minshall. I am head of UK public policy for Twitter.

Alex Davies-Jones

Q Good morning, both. Thank you for joining us today. We have recently had it confirmed by the Minister in a written parliamentary question that NFTs—non-fungible tokens—will be included in the scope of the Bill. Concerns have been raised about how that will work in practice, and also in relation to GIFs, memes and other image-based content that is used on your platforms, Twitter specifically. Katy, how do you see that working in practice? Is the Bill workable in its current form to encapsulate all of that?

Katy Minshall: Thank you for inviting me here to talk about the Online Safety Bill. On whether the Bill is workable in its current form, on the one hand, we have long been supportive of an approach that looks at overall systems and processes, which I think would capture some of the emerging technologies that you are talking about. However, we certainly have questions about how aspects of the Bill would work in practice. To give you an example, one of the late additions to the Bill was about user verification requirements, which as I understand it means that all category 1 platforms will need to offer users the opportunity to verify themselves and, in turn, those verified users have the ability to turn off interaction from unverified users. Now, while we share the Government’s policy objective of giving users more control, we certainly have some workability questions.

Just to give you one example, let’s say this existed today, and Boris Johnson turned on the feature. In practice, that would mean one of two things. Either the feature is only applicable to users in the UK, meaning that people around the world—in France, Australia, Germany or wherever it may be—are unable to interact with Boris Johnson, and only people who are verified in the UK can reply to him, tweet at him and so on, or it means the opposite and anyone anywhere can interact with Boris Johnson except those people who have chosen not to verify their identity, perhaps even in his own constituency, who are therefore at a disadvantage in being able to engage with the Prime Minister. That is just one illustration of the sorts of workability questions we have about the Bill at present.

Alex Davies-Jones

Q You brought up the Prime Minister, so we’ll carry on down that route. One of the concerns about the Bill is the issue of protecting democratic importance. If there is an exemption for content of democratic importance, would your platforms be able to take that down?

Katy Minshall: I am sorry, do you mean—

Alex Davies-Jones

Q Would you be able to remove the content?

Katy Minshall: At present, what would be expected of companies in that scenario is not entirely clear in the Bill. There are certainly examples of content we have removed over the years for abuse and hateful conduct where the account owner that we suspended would have grounds to say, “Actually, this is content of democratic importance.” At the very least, it is worth pointing out that, in practice, it is likely to slow down our systems because we would have to build in extra steps to understand if a tweet or an account could be considered content of democratic importance, and we would therefore treat it differently.

Alex Davies-Jones

Q That brings me to my next question. Because what would be classed as content of democratic importance is so ambiguous, would your platforms even be able to detect it?

Katy Minshall: That is a really important question. At present, the Bill envisages that we would treat journalistic content differently from other types of content. I think the definition in the Bill—correct me if I get this wrong—is content for the purposes of journalism that is UK linked. That could cover huge swathes of the conversation on Twitter—links to blog posts, citizen journalists posting, front pages of news articles. The Bill envisages our having a system to separate that content from other content, and then treating that content differently. I struggle to understand how that would work in practice, especially when you layer on top the fact that so much of our enforcement is assisted by technology and algorithms. Most of the abusive content we take down is detected using algorithms; we suspend millions of spam accounts every day using automated systems. When you propose to layer something so ambiguous and complicated on top of that, it is worth considering how that might impact on the speed of enforcement across all of our platform.

Alex Davies-Jones

Q Thank you. Given the media carve-out and the journalism exemption in the Bill, how could you detect state actors that are quoting disinformation, or even misinformation?

Katy Minshall: At present, we label a number of accounts as Government actors or state-affiliated media and we take action on those accounts. We take down their tweets and in some cases we do not amplify their content because we have seen in current situations that some Governments are sharing harmful content. Again, I question the ambiguity in the Bill and how it would interact with our existing systems that are designed to ensure safety on Twitter.

Alex Davies-Jones

Q Thank you. Just one final question for Twitter. A query we raised with the Children’s Commissioner and the NSPCC is about pornography and children accessing it. A person needs to be 13 years old to join Twitter—to host a profile on the site—but you do host pornographic content; it is used mainly by sex workers to promote their trade. How does the proposed provision affect your model of business in allowing 13-year-olds and above to access your platform?

Katy Minshall: Until we see the full extent of the definitions and requirements, it is difficult to say exactly what approach we would take under the Bill. Regarding adult content, Twitter is not a service targeting a youth audience, and as you illustrate, we endeavour to give people the ability to express themselves as they see fit. That has to be balanced with the objective of preventing young people from inadvertently stumbling on such content.

Alex Davies-Jones

Q So you are not predominantly aimed at children? If you are an adult service, why is it that people aged 13 or above can access your platform?

Katy Minshall: We find that, in practice, the overwhelming majority of our user base are over the age of 18; both internal and external data show that. Of course young people can access Twitter. I think we have to be very careful that the Bill does not inadvertently lock children out of services they are entitled to use. I am sure we can all think of examples of people under the age of 18 who have used Twitter to campaign, for activism and to organise; there are examples of under-18s using Twitter to that effect. But as I say, predominantly we are not a service targeting a youth audience.

Alex Davies-Jones

Okay. Thank you, Chair.

Online Safety Bill (Second sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
The Chair

May I just ask you, for the benefit of Hansard, to try to speak up a little? The sound system is not all that it might be in this room, and the acoustics certainly are not.

Alex Davies-Jones (Pontypridd) (Lab)

Q Thank you to our witnesses for joining us this afternoon. Quite bluntly, I will get into it, because what is frustrating for us, as Parliamentarians, and for our constituents, is the fact that we need this legislation in the first place. Why are you, as platforms, allowing harmful and illegal content to perpetuate on your platforms? Why do we need this legislation for you to take action? It is within your gift to give, and despite all the things I am sure you are about to tell me that you are doing to prevent this issue from happening, it is happening and we are needing to legislate, so why?

The Chair

Mr Earley, I will go left to right to start with, if that is all right with you, so you have drawn the short straw.

Richard Earley: No worries, and thank you very much for giving us the opportunity to speak to you all today; I know that we do not have very much time. In short, we think this legislation is necessary because we believe that it is really important that democratically elected Members of Parliament and Government can provide input into the sorts of decisions that companies such as ours are making, every day, about how people use the internet. We do not believe that it is right for companies such as ours to be taking so many important decisions every single day.

Now, unfortunately, it is the case that social media reflects the society that we live in, so all of the problems that we see in our society also have a reflection on our services. Our priority, speaking for Meta and the services we provide—Facebook, Instagram and WhatsApp—is to do everything we can to make sure our users have as positive an experience as possible on our platform. That is why we have invested more than $13 billion over the past five years in safety and security, and have more than 40,000 people working at our company on safety and security every day.

That said, I fully recognise that we have a lot more areas to work on, and we are not waiting for this Bill to come into effect to do that. We recently launched a whole range of updated tools and technologies on Instagram, for example, to protect young people, including preventing anyone under the age of 18 from being messaged by a person they are not directly connected to. We are also using new technology to identify potentially suspicious accounts to prevent young people from appearing in any search results that those people carry out. We are trying to take steps to address these problems, but I accept there is a lot more to do.

Alex Davies-Jones

Q Before I bring in Becky and Katie to answer that, I just want to bring you back to something you said about social media and your platforms reflecting society like a mirror. That analogy is used time and again, but actually they are not a mirror. The platforms and the algorithms they use amplify, encourage and magnify certain types of content, so they are not a mirror of what we see in society. You do not see a balanced view of two points of an issue, for example.

You say that work is already being done to remove this content, but on Instagram, for example, which is a platform predominantly used by women, the Centre for Countering Digital Hate has exposed what they term an “epidemic of misogynistic abuse”, with 90% of misogynistic abuse being sent via direct messaging. It is being ignored by the platform even when it is being reported to the moderators. Why is that happening?

Richard Earley: First, your point about algorithms is really important, but I do not agree that they are being used to promote harmful content. In fact, in our company, we use algorithms to do the reverse of that. We try to identify content that might break our policies—the ones we write with our global network of safety experts—and then remove those posts, or if we find images or posts that we think might be close to breaking those rules, we show them lower in people’s feeds so that they have a lower likelihood of being seen. That is why, over the past two years, we have reduced the prevalence of harmful posts such as hate speech on Facebook so that now only 0.03% of views of posts on Facebook contain that kind of hate speech—we have almost halved the number. That is one type of action that we take in the public parts of social media.

When it comes to direct messages, including on Instagram, there are a range of steps that we take, including giving users additional tools to turn off any words they do not want to see in direct messages from anyone. We have recently rolled out a new feature called “restrict” which enables you to turn off any messages or comments from people who have just recently started to follow you, for example, and have just created their accounts. Those are some of the tools that we are trying to use to address that.

Alex Davies-Jones

Q So the responsibility is on the user, rather than the platform, to take action against abuse?

Richard Earley: No, the responsibility is absolutely shared by those of us who offer platforms, by those who are engaged in abuse in society, and by civil society and users more widely. We want to ensure we are doing everything we can to use the latest technology to stop abuse happening where we can and give people who use our services the power to control their experience and prevent themselves from encountering it.

The Chair

We must allow the other witnesses to participate.

Becky Foreman: Thank you for inviting me to give evidence to you today. Online safety is extremely important to Microsoft and sits right at the heart of everything we do. We have a “safety by design” policy, and responsibility for safety within our organisation sits right across the board, from engineers to operations and policy people. However, it is a complicated, difficult issue. We welcome and support the regulation that is being brought forward.

We have made a lot of investments in this area. For example, we introduced PhotoDNA more than 10 years ago, which is a tool that is used right across the sector and by non-governmental organisations to scan for child sexual abuse material and remove it from their platforms. More recently, we have introduced a grooming tool that automates the process of trying to establish whether there is a conversation for grooming taking place between an adult and a child. That can then be flagged for human review. We have made that available at no charge to the industry, and it has been licensed by a US NGO called Thorn. We take this really seriously, but it is a complicated issue and we really welcome the regulation and the opportunity to work with the Government and Ofcom on this.

Katie O’Donovan: Thank you so much for having me here today and asking us to give evidence. Thank you for your question. I have worked at Google and YouTube for about seven years and I am really proud of our progress on safety in those years. We think about it in three different ways. First, what products can we design and build to keep our users safer? Similar to Microsoft, we have developed technology that identifies new child sex abuse material and we have made that available across the industry. We have developed new policies and new ways of detecting content on YouTube, which means we have really strict community guidelines, we identify that content and we take it down. Those policies that underlie our products are really important. Finally, we work across education, both in secondary and primary schools, to help inform and educate children through our “Be Internet Legends” programme, which has reached about 4 million people.

There is definitely much more that we can do and I think the context of a regulatory environment is really important. We also welcome the Bill and I think it is really going to be meaningful when Ofcom audits how we are meeting the requirements in the legislation—not just how platforms like ours are meeting the requirements in the Bill, but a wide spectrum of platforms that young people and adults use. That could have a really positive additive effect to the impact.

It is worth pausing and reflecting on legislation that has passed recently, as well. The age-appropriate design code or the children’s code that the Information Commissioner’s Office now manages has also helped us determine new ways to keep our users safe. For example, where we have long had a product called SafeSearch, which you can use on search and parents can keep a lock on, we now also put that on by default where we use signals to identify people who we think are under 18.

We think that is getting the right balance between providing a safer environment but also enabling people to access information. We have not waited for this regulation. This regulation can help us do more, and it can also level the playing field and really make sure that everyone in the industry steps up and meets the best practice that can exist.

Alex Davies-Jones

Q Thank you, both, for adding context to that. If I can bring you back to what is not being done and why we need to legislate, Richard, I come back to you. You mentioned some of the tools and systems that you have put in place so users can stop abuse from happening. Why is it that that 90% of abuse on Instagram in direct messages is being ignored by your moderators?

Richard Earley: I do not accept that figure. I believe that if you look at our quarterly transparency report, which we just released last week, you can see that we find more than 90% of all the content that we remove for breaking our policies ourselves. Whenever somebody reports something on any of our platforms, they get a response from us. I think it is really important, as we are focusing on the Bill, to understand or make the point that, for private messaging, yes, there are different harms and different risks of harm that can apply, which is why the steps that we take differ from the steps that we take in public social media.

One of the things that we have noticed in the final draft of the Bill is that the original distinction between public social media and private messaging, which was contained in the online harms White Paper and in earlier drafts of the Bill, has been lost here. Acknowledging that distinction, and helping companies recognise that there is different risk and then different steps that can be taken in private messaging to what is taken on public social media, would be a really important thing for the Committee to consider.

Alex Davies-Jones

Q Quite briefly, because I know we are short on time, exactly how many human moderators do you have working to take down disinformation and harmful illegal content on your platforms?

Richard Earley: We have around 40,000 people in total working on safety and security globally and, of those, around half directly review posts and content.

Alex Davies-Jones

Q How many of those are directly employed by you and how many are third party?

Richard Earley: I do not have that figure myself but I know it is predominantly the case that, in terms of the safety functions that we perform, it is not just looking at the pieces of content; it is also designing the technology that finds and surfaces content itself. As I said, more than 90% of the time—more than 95% in most cases—it is our technology that finds and removes content before anyone has to look at it or report it.

Alex Davies-Jones

Q On that technology, we have been told that you are not doing enough to remove harmful and illegal content in minority languages. This is a massive gap. In London alone, more than 250 languages are spoken on a regular basis. How do you explain your inaction on this? Can you really claim that your platform is safe if you are not building and investing in AI systems in a range of languages? What proactive steps are you taking to address this extreme content that is not in English?

Richard Earley: That group of 40,000 people that I mentioned, they operate 24 hours, 7 days a week. They cover more than 70 languages between them, which includes the vast majority of the world’s major spoken languages. I should say that people working at Meta, working on these classifiers and reviewing content, include people with native proficiency in these languages and people who can build the technology to find and remove things too. It is not just what happens within Meta that makes a difference here, but the work we do with our external partners. We have over 850 safety partners that we work with globally, who help us understand how different terms can be used and how different issues can affect the spread of harm on our platforms. All of that goes into informing both the policies we use to protect people on our platform and the technology we build to ensure those policies are followed.

Alex Davies-Jones

Q Finally, which UK organisations that you use have quality assured any of their moderator training materials?

Richard Earley: I am sorry, could you repeat the question?

Alex Davies-Jones

The vast majority of people are third party. They are not employed directly by Meta to moderate content, so how many of the UK organisations you use have been quality assured to ensure that the training they provide in order to spot this illegal and harmful content is taken on board?

Richard Earley: I do not believe it is correct that for our company, the majority of moderators are employed by—

Alex Davies-Jones

You do not have the figures, so you cannot tell me.

Richard Earley: I haven’t, no, but I will be happy to let you know afterwards in our written submission. Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement. If it is hate speech, of course, there is a very important language component to that training, but in other areas—nudity or graphic violence—the language component is less important. We have published quite a lot about the work we do to make sure our moderators are as effective as possible and to continue auditing and training them. I would be really happy to share some of that information, if you want.

Alex Davies-Jones

Q But that is only for those employed directly by Meta.

Richard Earley: I will have to get back to you to confirm that, but I think it applies to everyone who reviews content for Meta, whether they are directly employed by Meta or through one of our outsourced partners.

The Chair

Thank you very much. Don’t worry, ladies; I am sure other colleagues will have questions that they wish to pursue. Dean Russell, please.

--- Later in debate ---
The Chair

Good afternoon. We now hear oral evidence from Professor Clare McGlynn, professor of law at Durham University, Jessica Eagleton, policy and public affairs manager at Refuge, and Janaya Walker, public affairs manager at End Violence Against Women. Ladies, thank you very much for taking the trouble to join us this afternoon. We look forward to hearing from you.

Alex Davies-Jones

Q Thank you, Sir Roger, and thank you to the witnesses for joining us. We hear a lot about the negative experiences online of women, particularly women of colour. If violence against women and girls is not mentioned directly in the Bill, if misogyny is not made a priority harm, and if the violence against women and girls code of practice is not adopted in the Bill, what will that mean for the experience of women and girls?

Janaya Walker: Thank you for the opportunity to speak today. As you have addressed there, the real consensus among violence against women and girls organisations is for VAWG to be named in the Bill. The concern is that without that, the requirements that are placed on providers of regulated services will be very narrowly tied to the priority illegal content in schedule 7, as well as other illegal content.

We are very clear that violence against women and girls is part of a continuum in which there is a really broad manifestation of behaviour; some reaches a criminal threshold, but there is other behaviour that is important to be understood as part of the wider context. Much of the abuse that women and girls face cannot be understood by only looking through a criminal lens. We have to think about the relationship between the sender and the recipient—if it is an ex-partner, for example—the severity of the abuse they have experienced, the previous history and also the reach of the content. The worry is that the outcome of the Bill will be a missed opportunity in terms of addressing something that the Government have repeatedly committed to as a priority.

As you mentioned, we have worked with Refuge, Clare McGlynn, the NSPCC and 5Rights, bringing together our expertise to produce this full code of practice, which we think the Bill should be amended to include. The code of practice would introduce a cross-cutting duty that tries to mitigate this kind of pocketing of violence against women and girls into those three categories, to ensure that it is addressed really comprehensively.

Alex Davies-Jones

Q To what extent do you think that the provisions on anonymity will assist in reducing online violence against women and girls? Will the provisions currently in the Bill make a difference?

Janaya Walker: I think it will be limited. For the End Violence Against Women Coalition, our priority above all else is having a systems-based approach. Prevention really needs to be at the heart of the Bill. We need to think about the choices that platforms make in the design and operation of their services in order to prevent violence against women and girls in the first instance.

Anonymity has a place in the sense of providing users with agency, particularly in a context where a person is in danger and they could take that step in order to mitigate harm. There is a worry, though, when we look at things through an intersectional lens—thinking about how violence against women and girls intersects with other forms of harm, such as racism and homophobia. Lots of marginalised and minoritised people rely very heavy on being able to participate online anonymously, so we do not want to create a two-tier system whereby some people’s safety is contingent on them being a verified user, which is one option available. We would like the focus to be much more on prevention in the first instance.

The Chair

Professor McGlynn and Ms Eagleton, you must feel free to come in if you wish to.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q My final question is probably directed at you, Professor McGlynn. Although we welcome the new communications offence of cyber-flashing, one of the criticisms is that it will not actually make a difference because of the onus on proving the sender’s intent to cause harm, rather than on whether the recipient consented to receive the material. How do you respond to that?

Professor Clare McGlynn: I think it is great that the Government have recognised the harms of cyber-flashing and put that into the Bill. In the last couple of weeks we have had the case of Gaia Pope, a teenager who went missing and died—an inquest is currently taking place in Dorset. The case has raised the issue of the harms of cyber-flashing, because in the days before she went missing she was sent indecent images that triggered post-traumatic stress disorder from a previous rape. On the day she went missing, her aunt was trying to report that to the police, and one of the police officers was reported as saying that she was “taking the piss”.

What I think that case highlights, interestingly, is that this girl was triggered by receiving these images, and it triggered a lot of adverse consequences. We do not know why that man sent her those images, and I guess my question would be: does it actually matter why he sent them? Unfortunately, the Bill says that why he sent them does matter, despite the harm it caused, because it would only be a criminal offence if it could be proved that he sent them with the intention of causing distress or for sexual gratification and being reckless about causing distress.

That has two main consequences. First, it is not comprehensive, so it does not cover all cases of cyber-flashing. The real risk is that a woman, having seen the headlines and heard the rhetoric about cyber-flashing being criminalised, might go to report it to the police but will then be told, “Actually, your case of cyber-flashing isn’t criminal. Sorry.” That might just undermine women’s confidence in the criminal justice system even further.

Secondly, this threshold of having to prove the intention to cause distress is an evidential threshold, so even if you think, as might well be the case, that he sent the image to cause distress, you need the evidence to prove it. We know from the offence of non-consensual sending of sexual images that it is that threshold that limits prosecutions, but we are repeating that mistake here with this offence. So I think a consent-based, comprehensive, straightforward offence would send a stronger message and be a better message from which education could then take place.

None Portrait The Chair
- Hansard -

You are nodding, Ms Eagleton.

Jessica Eagleton: I agree with Professor McGlynn. Thinking about the broader landscape and intimate image abuse as well, I think there are some significant gaps. There is quite a piecemeal approach at the moment, and we are also seeing issues with the enforcement of existing measures in the context of domestic abuse.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We will now hear oral evidence from Lulu Freemont, head of digital regulation at techUK; Ian Stevenson, the chairman of OSTIA; and Adam Hildreth, chief executive officer of Crisp, who is appearing by Zoom—and it works. Thank you all for joining us. I will not waste further time by asking you to identify yourselves, because I have effectively done that for you. Without further ado, I call Alex Davies-Jones.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you, Sir Roger; thank you, witnesses. We want the UK to become a world leader in tech start-ups. We want those employment opportunities for the future. Does this legislation, as it currently stands, threaten that ability?

Lulu Freemont: Hi everybody. Thank you so much for inviting techUK to give evidence today. Just to give a small intro to techUK, so that you know the perspective I am coming from, we are the trade body for the tech sector. We have roughly 850 tech companies in our membership, the majority of which are small and medium-sized enterprises. We are really focused on how this regime will work for the 25,000 tech companies that are set to be in scope, and our focus is really on implementation and how the Bill can deliver on its objectives.

Thank you so much for the question. There are some definite risks when we think about smaller businesses and the Online Safety Bill. Today, we have heard a lot of the names that come up with regard to tech companies; they are the larger companies. However, this will be a regime that impacts thousands of different tech companies, with different functionalities and different roles within the ecosystem, all of which contribute to the economy in their own way.

There are specific areas to be addressed in the Bill, where there are some threats to innovation and investment by smaller businesses. First, greater clarity is needed. In order for this regime to be workable for smaller businesses, they need clarity on guidelines and on definitions, and they also need to be confident that the systems and processes that they put in place will be sustainable—in other words, the right ones.

Certain parts of the regime risk not having enough clarity. The first thing that I will point to is around the definitions of harm. We would very much welcome having some definitions of harmful content, or even categories of harmful content, in primary legislation. It might then be for Ofcom to determine how those definitions are interpreted within the codes, but having things to work off and types of harmful content for smaller businesses to start thinking about would be useful; obviously, for smaller businesses that will mostly mean content that is harmful to children, given that they are likely to be category 2 services.

The second risk for smaller businesses is really around the powers of the Secretary of State. I think there is a real concern. The Secretary of State will have some technical powers, which are pretty much normal; they are what you would expect in any form of regulation. However, the Online Safety Bill goes a bit further than that, introducing some amendment powers. So, the Secretary of State can modify codes of practice to align with public policy. In addition to that, there are provisions to allow the Secretary of State to set thresholds between the categories of companies.

Smaller businesses want to start forming a strong relationship with Ofcom and putting systems and processes in place that they can feel confident in. If they do not have that level of confidence and if the regime could be changed at any point, they might not be able to progress with those systems and processes, and that risks pushing them out of the market, because they might not be able to keep up with some of the larger companies that have been referenced in every conversation.

So, we need to think about proportionality, and we need to think about Ofcom’s independence and the kind of relationship that it can form with smaller businesses. We also need to think about balance. This regime is looking to strike a balance between safety, free speech and innovation in the UK’s digital economy. Let us just ensure that we provide enough clarity for businesses so that they can get going and have confidence in what they are doing.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you, Lulu. Adam and Ian, if either of you want to come in at any point, please just indicate that and I will bring you in.

None Portrait The Chair
- Hansard -

May I just apologise before we go any further, because I got you both the wrong way round? I am sorry. It is Mr Stevenson who is online and it is Adam Hildreth who is here in body and person.

Adam Hildreth: I think we have evolved as a world, actually, when it comes to online safety. If you went back five or 10 years, safety would have come after people had developed their app, their platform or whatever they were creating from a tech perspective. We are now in a world where safety, in various forms, has to be there by default. Moving on to your point, we have to understand what that means for different sizes of businesses. The phrase “risk assessment” is the critical part for me: the question is whether we are putting blocks in front of people who are innovating and creating entrepreneurial businesses that make the online world a better place. Putting those blocks in without them understanding whether or not they can compete in an open and fair market is where we do not want to be.

So, getting to the point where it is very easy to understand is important—a bit like where we got to in other areas, such as data protection and where we went with the GDPR. In the end, it became simplified; I will not use the word “simplified” ever again in relation to GDPR, but it did become simplified from where it started. It is really important for anyone developing any type of tech platform that the Online Safety Bill will affect that they understand exactly what they do and do not have to put in place; otherwise, they will be taken out just by not having a legal understanding of what is required.

The other point to add, though, is that there is a whole other side to online safety, which is the online safety tech industry. There are tons of companies in the UK and worldwide that are developing innovative technologies that solve these problems. So, there is a positive as well as an understanding of how the Bill needs to be created and publicised, so that people understand what the boundaries are, if you are a UK business.

None Portrait The Chair
- Hansard -

Mr Stevenson, you are nodding. Do you want to come in?

Ian Stevenson: I agree with the contributions from both Adam and Lulu. For me, one of the strengths of the Bill in terms of the opportunity for innovators is that so much is left to Ofcom to provide codes of practice and so on in the future, but simultaneously that is its weakness in the short term. In the absence of those codes of practice and definitions of exactly where the boundaries between merely undesirable and actually harmful and actionable might lie, the situation is very difficult. It is very difficult for companies like my own and the other members of the Online Safety Tech Industry Association, who are trying to produce technology to support safer experiences online, to know exactly what that technology should do until we know which harms are in scope and exactly what the thresholds are and what the definitions of those harms are. Similarly, it is very hard for anybody building a service to know what technologies, processes and procedures they will need until they have considerably more detailed information than they have at the moment.

I agree that there are certain benefits to having more of that in the Bill, especially when it comes to the harms, but in terms of the aspiration and of what I hear is the objective of the Bill—creating safer online experiences—we really need to understand when we are going to have much more clarity and detail from Ofcom and any other relevant party about exactly what is going to be seen as best practice and acceptable practice, so that people can put in place those measures on their sites and companies in the Online Safety Tech Industry Association can build the tools to help support putting those measures in place.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you all. Lulu, you mentioned concerns about the Secretary of State’s powers and Ofcom’s independence. Other concerns expressed about Ofcom include its ability to carry out this regulation. It is being hailed as the saviour of the internet by some people. Twenty-five thousand tech companies in the UK will be under these Ofcom regulations, but questions have been asked about its technical and administrative capacity to do this. Just today, there is an online safety regulator funding policy adviser role being advertised by the Department for Digital, Culture, Media and Sport. The key roles and responsibilities include:

“The successful post holder will play a key role in online safety as the policy advisor on Funding for the Online Safety Regulator.”

Basically, their job is to raise money for Ofcom. Does that suggest concerns about the role of Ofcom going forward, its funding, and its resource and capacity to support those 25,000 platforms?

Lulu Freemont: It is a very interesting question. We really support Ofcom in this role. We think that it has a very good track record with other industries that are also in techUK’s membership, such as broadcasters. It has done a very good job at implementing proportionate regulation. We know that it has been increasing its capacity for some time now, and we feel confident that it is working with us as the trade and with a range of other experts to try to understand some of the detail that it will have to understand to regulate.

One of the biggest challenges—we have had this conversation with Ofcom as well—is to understand the functionalities of tech services. The same functionality might be used in a different context, and that functionality could be branded as very high risk in one context but very low risk in another. We are having those conversations now. It is very important that they are being had now, and we would very much welcome Ofcom publishing drafts. We know that is its intention, but it should bring everything forward in terms of all the gaps in this regulation that are left to Ofcom’s codes, guidance and various other documentation.

Adam Hildreth: One of the challenges that I hear a lot, and that we hear a lot at Crisp in our work, is that people think that the Bill will almost eradicate all harmful content everywhere. The challenge that we have with content is that every time we create a new technology or mechanism that defeats harmful or illegal content, the people who are creating it—they are referred to in lots of ways, but bad actors, ultimately—create another mechanism to do it. It is very unlikely that we will ever get to a situation in which it is eradicated from every platform forever—though I hope we do.

What is even harder for a regulator is to be investigating why a piece of content is on a platform. If we get to a position where people are saying, “I saw this bit of content; it was on a platform,” that will be a really dangerous place to be, because the funding requirement for any regulator will go off the charts—think about how much content we consume. I would much prefer to be in a situation where we think about the processes and procedures that a platform puts in place and making them appropriate, ensuring that if features are aimed at children, they do a risk assessment so that they understand how those features are being used and how they could affect children in particular—or they might have a much more diverse user group, whereby harm is much less likely.

So, risk assessments and, as Ian mentioned, technologies, processes and procedures—that is the bit that a regulator can do well. If your risk assessment is good and your technology, process and procedures are as good as they can be based on a risk assessment, that almost should mean that you are doing the best job you possibly can to stop that content appearing, but you are not eradicating it. It really worries me that we are in a position whereby people are going to expect that they will never see content on a platform again, even though billions of pieces of potentially harmful content could have been removed from those platforms.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q On that point, you mentioned that it is hard to predict the future and to regulate on the basis of what is already there. We have waited a long time for the Bill, and in that time we have had new platforms and new emerging technology appear. How confident are you that the Bill allows for future-proofing, in order that we can react to anything new that might crop up on the internet?

Adam Hildreth: I helped personally in 2000 and 2001, when online grooming was not even an offence in law, so I have been involved in this for an awful long time, waiting for laws to exist. I do not think we will ever be in a situation in which they are future-proofed if we keep putting every possibility into law. There needs to be some principles there. There are new features launched every day, and assessments need to be made about who they pose a risk to and the level of risk. In the same way as you would do in all kinds of industries, someone should do an assessment from a health and safety perspective. From that, you then say, “Can we even launch it at all? Is it feasible? Actually, we can, because we can take this amount of risk.” Once they understand those risk assessments, technology providers can go further and develop technology that can combat this.

If we can get to the point where it is more about process and the expectations around people who are creating any types of online environments, apps or technologies, it will be future-proofed. If we start trying to determine exact pieces of content, what will happen is that someone will work out a way around it tomorrow, and that content will not be included in the Bill, or it will take too long to get through and suddenly, the whole principle of why we are here and why we are having this discussion will go out the window. That is what we have faced every day since 1998: every time the technology works out how to combat a new risk—whether that is to children, adults, the economy or society—someone comes along and works out a way around the technology or around the rules and regulations. It needs to move quickly; that will future-proof it.

None Portrait The Chair
- Hansard -

I have four Members plus the Minister to get in, so please be brief. I call Dean Russell.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We will now hear from Rhiannon-Faye McDonald, victim and survivor advocate at the Marie Collins Foundation, and Susie Hargreaves, chief executive at the Internet Watch Foundation. Thank you for joining us this afternoon; first question, please.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you both for joining us this afternoon. One of the key objectives of the legislation is to ensure that a high level of protection for children and adults is in place. In your view, does the Bill in its current form achieve that?

Susie Hargreaves: Thank you very much for inviting me today. I think the Bill is moving in the right direction. Obviously, the area that we at the IWF are concerned with is child sexual abuse online, and from our point of view, a few changes do need to be made to the Bill in order to put those full protections in place for children.

In particular, we have drafted an amendment to put co-designation on the face of the Bill. When it comes to child sexual abuse, we do not think that contracting out is an acceptable approach, because we are talking about the most egregious form of illegal material—we are talking about children—and we need to ensure that Ofcom is not just working in a collaborative way, but is working with experts in the field. What is really important for us at the moment is that there is nothing in the Bill to ensure that the good work that has been happening over 25 years in this country, where the IWF is held up as a world leader, is recognised, and that that expertise is assured on the face of the Bill. We would like to see that amendment in particular adopted, because the Bill needs to ensure that there are systems and processes in place for dealing with illegal material. The IWF already works with internet companies to ensure that they take up technical services.

There needs to be strong integration with law enforcement—again, that is already in place with the memorandum of understanding between the CPS, the National Police Chiefs’ Council and the IWF. We also need clarity about the relationship with Ofcom so that child sexual abuse, which is such a terrible situation and such a terrible crime, is not just pushed into the big pot with other harms. We would like to see those specific changes.

Rhiannon-Faye McDonald: Generally, we think the Bill is providing a higher standard of care for children, but there is one thing in particular that I would like to raise. Like the IWF, the Marie Collins Foundation specialises in child sexual abuse online, specifically the recovery of people who have been affected by child sexual abuse.

The concern I would like to raise is around the contextual CSA issue. I know this has been raised before, and I am aware that the Obscene Publications Act 1959 has been brought into the list of priority offences. I am concerned that that might not cover all contextual elements of child sexual abuse: for example, where images are carefully edited and uploaded to evade content moderation, or where there are networks of offenders who are able to gain new members, share information with each other, and lead other people to third-party sites where illegal content is held. Those things might not necessarily be caught by the illegal content provisions; I understand that they will be dealt with through the “legal but harmful” measures.

My concern is that the “legal but harmful” measures do not need to be implemented by every company, only those that are likely to be accessed by children. There are companies that can legitimately say that the majority of their user base is not children, and therefore would not have to deal with that, but that provides a space for this contextual CSA to happen. While those platforms may not be accessed by children as much as other platforms, it still provides a place for this to happen—the harm can still occur, even if children do not come across it as much as they would elsewhere.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q On that point, one of the concerns that has been raised by other stakeholders is about the categorisation of platforms—for example, category 1 and category 2B platforms have different duties placed on them by Ofcom as the regulator. Would you rather see a risk-based approach to platforms than categorisation? What are your thoughts on that?

Susie Hargreaves: We certainly support the concept of a risk-based approach. Very little child sexual abuse content is hosted in the UK; the majority of the content we see is hosted on smaller platforms in the Netherlands and other countries. It is really important that we take a risk-based approach, which might be in relation to where the content is—obviously, we are dealing with illegal content—or in relation to where children are. Having a balance there is really important.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q A final question from me. We heard concerns from children’s charities and the Children’s Commissioner that the Bill does not account for breadcrumbing—the cross-platform grooming that happens on platforms. What more could the Bill do to address that, and do you see it as an omission and a risk?

Susie Hargreaves: I think we probably have a slightly different line from that of some of the other charities you heard from this morning, because we think it is very tricky and nuanced. What we are trying to do at the moment is define what it actually means and how we would have to deal with it, and we are working very closely with the Home Office to go through some of those quite intense discussions. At the moment, “harmful” versus “illegal” is not clearly defined in law, and it could potentially overwhelm certain organisations if we focus on the higher-level harms and the illegal material. We think anything that protects children is essential and needs to be in the Bill, but we need to have those conversations and to do some more work on what that means in reality. We are more interested in the discussions at the moment about the nuance of the issue, which needs to be mapped out properly.

One of the things that we are very keen on in the Bill as a whole is that there should be a principles-based approach, because we are dealing with new harms all the time. For example, until 2012 we had not seen self-generated content, which now accounts for 75% of the content we remove. So we need constantly to change and adapt to new threats as they come online, and we should not make the Bill too prescriptive.

None Portrait The Chair
- Hansard -

Ms McDonald?

Rhiannon-Faye McDonald: I was just thinking of what I could add to what Susie has said. My understanding is that it is difficult to deal with cross-platform abuse because of the limited ability to share information between different platforms—for example, where a platform has identified an issue or offender and not shared that information with other platforms on which someone may continue the abuse. I am not an expert in tech and cannot present you with a solution to that, but I feel that sharing intelligence would be an important part of the solution.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Finally this afternoon, we will hear from Ellen Judson, who is the lead researcher at the Centre for the Analysis of Social Media at Demos, and Kyle Taylor, who is the founder and director of Fair Vote. Thank you for joining us this afternoon.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you both for joining us, and for waiting until the end of a very long day. We appreciate it.

There is a wide exemption in the Bill for the media and for journalistic content. Are you concerned that that is open to abuse?

Kyle Taylor: Oh, absolutely. There are aspects of the Bill that are extremely worrying from an online safety perspective: the media exemption, the speech of democratic importance exemption, and the fact that a majority of paid ads are out of scope. We know that a majority of harmful content originates from or is amplified by entities that meet one of those exceptions. What that means is that the objective of the Bill, which is to make the online world safer, might not actually be possible, because platforms, at least at present, are able to take some actions around these through their current terms and conditions, but this will say explicitly that they cannot act.

One real-world example is the white supremacist terror attack just last week in Buffalo, in the United States. The “great replacement” theory that inspired the terrorist was pushed by Tucker Carlson of Fox News, who would meet the media exemption; by right-wing blogs, which were set up by people who claim to be journalists and so would meet the journalistic standards exemption; by the third-ranking House Republican, who would meet the democratic importance exemption; and it was even run as paid ads by those candidates. In that one example, you would not be able to capture a majority of the way that harm spreads online.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Is there a way in which the exemptions could be limited to ensure that the extremists you have mentioned cannot take advantage of them?

Ellen Judson: I think there are several options. The primary option, as we would see it, is that the exemptions are removed altogether, on the basis that if the Bill is really promoting a systems-based approach rather than focusing on individual small categories of content, then platforms should be required to address their systems and processes whenever those lead to an increased risk of harm. If that leads to demotion of media content that meets those harmful thresholds, that would seem appropriate within that response.

If the exemptions are not to be removed, they could be improved. Certainly, with regard to the media exemption specifically, I think the thresholds for who qualifies as a recognised news publisher could be raised to make it more difficult for bad actors and extremists, as Kyle mentioned, simply to set up a website, add a complaints policy, have an editorial code of conduct and then say that they are a news publisher. That could involve linking to existing publishers that are already registered with existing regulators, but I think there are various ways that could be strengthened.

On the democratic importance and journalism exemptions, I think the issue is that the definitions are very broad and vague; they could easily be interpreted in any way. Either they could be interpreted very narrowly, in which case they might not have much of an impact on how platforms treat freedom of expression, as I think they were intended to do; or they could be interpreted very broadly, in which case anyone who thinks, or who can claim to think, that their content is democratically important or journalistic—even if it is clearly abusive and breaches the platform’s terms and conditions—would be able to rely on the exemption.

One option put forward by the Joint Committee is to introduce a public interest exemption, so that platforms would have to think about how they are treating content that is in the public interest. That would at least remove some of the concerns. The easiest way for platforms to interpret what is democratically important speech and what is journalistic speech is based on who the user is: are they a politician or political candidate, or are they a journalist? That risks them privileging certain people’s forms of speech over that of everyday users, even if that speech is in fact politically relevant. I think that having something that moves the threshold further away from focusing on who a user is as a proxy for whether their speech is likely to deserve extra protection would be a good start.

Kyle Taylor: It is basically just saying that content can somehow become less harmful depending on who says it. A systems-based approach is user-neutral, so its only metric is: does this potentially cause harm at scale? It does not matter who is saying it; it is simply a harm-based approach and a system solution. If you have exemptions, exceptions and exclusions, a system will not function. It suggests that a normal punter with six followers saying that the election was stolen is somehow more harmful than the President of the United States saying that an election is stolen. That is just the reality of how online systems work and how privileged and powerful users are more likely to cause harm.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q You are creating a two-tier internet, effectively, between the normal user and those who are exempt, which large swathes of people will be because it is so ambiguous. One of the other concerns that have been raised is the fact that the comments sections on newspaper websites are exempt from the Bill. Do you see an issue with that?

Ellen Judson: There is certainly an issue as that is often where we see a lot of abuse and harm, such that if that same content were replicated on a social media platform, it would almost certainly be within the scope of the Bill. There is a question, which is for Ofcom to consider in its risk profiles and risk registers, about where content at scale has the potential to cause the most harm. The reach of a small news outlet’s comments section would be much less than the reach of Donald Trump’s Twitter account, for instance. Certainly, if the risk assessments are done and comments sections of news websites have similar reach and scale and could cause significant harm, I think it would be reasonable for the regulator to consider that.

Kyle Taylor: It is also that they are publicly available. I can speak from personal experience. Just last week, there was a piece about me. The comments section simultaneously said that I should be at Nuremberg 2.0 because I was a Nazi, but also that I should be in a gas chamber. Hate perpetuates in a comments section just as it does on a social media platform. The idea that it is somehow less harmful because it is here and not there is inconsistent and incoherent with the regime where the clue is in the name: the Online Safety Bill. We are trying to make the online world safer.

On media I would add that we have to think about how easy it is, based on the criteria in the Bill, to become exempt as a media entity. We can think about that domestically, but what happens when a company is only meant to enforce their terms and conditions in that country, but can broadcast to the world? The UK could become the world’s disinformation laundromat because you can come here, meet the media exemption and then blast content to other places in the world. I do not think that is something that we are hoping to achieve through this Bill. We want to be the safest place in the world to go online and to set a global benchmark for what good regulation looks like.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q I suppose, yes. Under the current media carve-out, how do you see platforms being able to detect state actors that are quoting misinformation or perpetuating disinformation on their platforms?

Ellen Judson: I think it is a real challenge with the media exemptions, because it is a recognised tactic of state-based actors, state-aligned actors and non-state actors to use media platforms as ways to disseminate disinformation. If you can make a big enough story out of something, it gets into the media and that perpetuates the campaign of abuse, harassment and disinformation. If there are protections in place, it will not take disinformation actors very long to work out that if there are ways that they can get stories into the press, they are effectively covered.

In terms of platform enforceability, if platforms are asked to, for instance, look at their systems of amplification and what metrics they use to recommend or promote content to users, and to do that from a risk-based perspective and based on harm except when they are talking about media, it all becomes a bit fuzzy what a platform would actually be expected to do in terms of curating those sorts of content.

Kyle Taylor: As an example, Russia Today, until its broadcast licence was revoked about three months ago, would have qualified for the media exemption. Disinformation from Russia Today is not new; it has been spreading disinformation for years and years, and would have qualified for the media exemption until very recently.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q So as a result of these exemptions, the Bill as it stands could make the internet less safe than it currently is.

Kyle Taylor: The Bill as it stands could absolutely make the internet less safe than it currently is.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q You have done a really good job of explaining the concerns about journalistic content. Thinking about the rest of the Bill for a moment, do you think the balance between requiring the removal of content and the prioritisation of content is right? Do you think it will be different from how things are now? Do you think there is a better way it could be done in the Bill?

Ellen Judson: The focus at the moment is too heavily on content. There is a sort of tacit equation of content removal—sometimes content deprioritisation, but primarily content removal—as the way to protect users from harm, and as the threat to freedom of expression. That is where the tension comes in with how to manage both those things at once. What we would want from a Bill that was taking more of a systems approach is thinking: where are platforms making decisions about how they are designing their services, and how they are operating their services at all levels? Content moderation policy is certainly included, but it goes back to questions of how a recommendation algorithm is designed and trained, who is involved in that process, and how human moderators are trained and supported. It is also about what functionality users are given and what behaviour is incentivised and encouraged. There is a lot of mitigation that platforms can put in place that does not talk about directly affecting user content.

I think we should have risk assessments that focus on the risks of harms to users, as opposed to the risk of users encountering harmful content. Obviously there is a relationship, but one piece of content may have very different effects when it is encountered by different users. It may cause a lot of harm to one user, whereas it may not cause a lot of harm to another. We know that when certain kinds of content are scaled and amplified, and certain kinds of behaviour are encouraged or incentivised, we see harms at a scale that the Bill is trying to tackle. That is a concern for us. We want more of a focus on some things that are mentioned in the Bill—business models, platform algorithms, platform designs and systems and processes. They often take a backseat to the issues of content identification and removal.

Kyle Taylor: I will use the algorithm as an example, because this word flies around a lot when we talk about social media. An algorithm is a calculation that is learning from people’s behaviour. If society is racist, an algorithm will be racist. If society is white, an algorithm will be white. You can train an algorithm to do different things, but you have to remember that these companies are for-profit businesses that sell ad space. The only thing they are optimising for in an algorithm is engagement.

What we can do, as Ellen said, through a system is force optimisation around certain things, or drive algorithms away from certain types of content, but again, an algorithm is user-neutral. An algorithm does not care what user is saying what; it is just “What are people clicking on?”, regardless of what it is or who said it. An approach to safety has to follow the same methodology and say, “We are user-neutral. We are focused entirely on propensity to cause harm.”

The second piece is all the mitigation measures you can take once a post is up. There has been a real binary of “Leave it up” and “Take it down”, but there is a whole range of stuff—the most common word used is “friction”—to talk about what you can do with content once it is in the system. You have to say to yourself, “Okay, we absolutely must have free speech protections that exceed the platform’s current policies, because they are not implemented equally.” At the same time, you can preserve someone’s free expression by demonetising content to reduce the incentive of the company to push that content or user through its system. That is a way of achieving both a reduction in harm and the preservation of free expression.

Online Safety Bill (Third sitting)

Alex Davies-Jones Excerpts
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 26 May 2022 - (26 May 2022)
None Portrait The Chair
- Hansard -

Before we hear oral evidence, I invite Members to declare any interests in connection with the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

I need to declare an interest, Ms Rees. Danny Stone from the Antisemitism Policy Trust provides informal secretariat in a personal capacity to the all-party parliamentary group on wrestling, which I co-chair.

None Portrait The Chair
- Hansard -

That is noted. Thank you.

Examination of Witnesses

Mat Ilic, William Moy, Professor Lorna Woods MBE and William Perrin OBE gave evidence.

None Portrait The Chair
- Hansard -

We will now hear oral evidence from Mat Ilic, chief development officer at Catch22; William Moy, chief executive at Full Fact; and Professor Lorna Woods and William Perrin of the Carnegie UK Trust. Before calling the first Member, I remind all Members that questions should be limited to matters within the scope of the Bill and that we must stick to the timings in the programme order that the Committee agreed. For this session, we have until 12.15 pm. I call Alex Davies-Jones to begin the questioning.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q187 Good morning to our witnesses. Thank you for joining us today. One of the main criticisms of the Bill is that the vast majority of the detail will not be available until after the legislation is enacted, under secondary legislation and so on. Part of the problem is that we are having difficulty in differentiating the “legal but harmful” content. What impact does that have?

William Perrin: At Carnegie, we saw this problem coming some time ago, and we worked in the other place with Lord McNally on a private Member’s Bill—the Online Harms Reduction Regulator (Report) Bill—that, had it carried, would have required Ofcom to make a report on a wide range of risks and harms, to inform and fill in the gaps that you have described.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

On a point of order, Ms Rees. There is a gentleman taking photographs in the Gallery.

None Portrait The Chair
- Hansard -

There is no photography allowed here.

William Perrin: Unfortunately, that Bill did not pass and the Government did not quite take the hint that it might be good to do some prep work with Ofcom to provide some early analysis to fill in holes in a framework Bill. The Government have also chosen in the framework not to bring forward draft statutory instruments or to give indications of their thinking in a number of key areas of the Bill, particularly priority harms to adults and the two different types of harms to children. That creates uncertainty for companies and for victims, and it makes the Bill rather hard to scrutinise.

I thought it was promising that the Government brought forward a list of priority offences in schedule 7—I think that is where it is; I get these things mixed up, despite spending hours reading the thing. That was helpful to some extent, but the burden is on the Government to reduce complexity by filling in some of the blanks. It may well be better to table an amendment to bring some of these things into new schedules, as we at Carnegie have suggested—a schedule 7A for priority harms to adults, perhaps, and a 7B and 7C for children and so on—and then start to fill in some of the blanks in the regime, particularly to reassure victims.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Thank you. Does anybody else want to comment?

William Moy: There is also a point of principle about whether these decisions should be made by Government later or through open, democratic, transparent decision making in Parliament.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q That brings me on to my next point, William, relating to concerns about the powers that the Bill gives to the Secretary of State and about the independence of the regulator and the impact that could have. Do you have any comments on that?

William Moy: Sure. I should point out—we will need to get to this later—the fact that the Bill is not seriously trying to address misinformation and disinformation at this point, but in that context, we all know that there will be another information incident that will have a major effect on the public. We have lived through the pandemic, when information quality has been a matter of life and death; we are living through information warfare in the context of Ukraine, and more will come. The only response to that in the Bill is in clause 146, which gives the Secretary of State power to direct Ofcom to use relatively weak media literacy duties to respond.

We think that in an open society there should be an open mechanism for responding to information incidents—outbreaks of misinformation and disinformation that affect people’s lives. That should be set out in the roles of the regulator, the Government and internet companies, so that there is a framework that the public understand and that is open, democratic and transparent in declaring a misinformation and disinformation incident, creating proportionate responses to it, and monitoring the effects of those responses and how the incident is managed. At the moment, it largely happens behind closed doors and it involves a huge amount of restricting what people can see and share online. That is not a healthy approach in an open society.

William Perrin: I should add that as recently as April this year, the Government signed up to a recommendation of the Committee of Ministers of the Council of Europe on principles for media and communication governance, which said that

“media and communication governance should be independent and impartial to avoid undue influence…discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”

That is great. That is what the UK has done for 50 to 60 years in media regulation, where there are very few powers for the Secretary of State or even Parliament to get involved in the day-to-day working of communications regulators. Similarly, we have had independent regulation of cinema by the industry since 1913 and regulation of advertising independent of Government, and those systems have worked extremely well. However, this regime—which, I stress, Carnegie supports—goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.

Clause 40 is particularly egregious, in that it gives the Secretary of State powers of direction over Ofcom’s codes of practice and, very strangely, introduces an almost infinite ability for the Government to keep rejecting Ofcom’s advice—presumably, until they are happy with the advice they get. That is a little odd, because Ofcom has a long track record as an independent, evidence-based regulator, and as Ofcom hinted in a terribly polite way when it gave evidence to this Committee, some of these powers may go a little too far. Similarly, in clause 147, the Secretary of State can give tactical guidance to Ofcom on its exercise of its powers. Ofcom may ignore that advice, but it is against convention that the Secretary of State can give that advice at all. The Secretary of State should be able to give strategic guidance to Ofcom roughly one or one and a half times per Parliament to indicate its priorities. That is absolutely fine, and is in accordance with convention in western Europe and most democracies, but the ability to give detailed guidance is rather odd.

Then, as Mr Moy has mentioned, clause 146, “Directions in special circumstances”, is a very unusual power. The Secretary of State can direct Ofcom to direct companies to make notices about things and can direct particular companies to do things without a particularly high threshold. There just have to be “reasonable grounds to believe”. There is no urgency threshold, nor is there a strong national security threshold in there, or anyone from whom the Secretary of State has to take advice in forming that judgment. That is something that we think can easily be amended down.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you. Mr Moy, you brought up the issue of misinformation and disinformation being removed from the scope of the Bill. Can you expand on your thoughts on that point?

William Moy: Absolutely. It is an extraordinary decision in a context where we are just coming through the pandemic, where information quality was such a universal concern, and we are in an information war, with the heightened risk of attempts to interfere in future elections and other misinformation and disinformation risks. It is also extraordinary because of the Minister’s excellent and thoughtful Times article, in which he pointed out that at the moment, tech companies censor legal social media posts at vast scale, and this Bill does nothing to stop that. In fact, the Government have actively asked internet companies to do that censorship—it has told them to do so. I see the Minister looking surprised, so let me quote from BBC News on 5 April 2020:

“The culture secretary is to order social media companies to be more aggressive in their response to conspiracy theories linking 5G networks to the coronavirus pandemic.”

In that meeting, essentially, the internet companies were asked to make sure they were taking down that kind of content from their services. Now, in the context of a Bill where, I think, the Minister and I completely agree about our goal—tackling misinformation in an open society—there is an opportunity for this Bill to be an example to the free world of how open societies respond to misinformation, and a beacon for the authoritarian world as well.

This is the way to do that. First, set out that the Bill must cover misinformation and disinformation. We cannot leave it to internet companies, with their political incentives, their commercial convenience and their censoring instincts, to do what they like. The Bill must cover misinformation and set out an open society response to it. Secondly, we must recognise that the open society response is about empowering people. The draft Bill had a recognition that we need to modernise the media literacy framework, but we do not have that in this Bill, which is really regrettable. It would be a relatively easy improvement to create a modern, harms and safety-based media literacy framework in this Bill, empowering users to make their own decisions with good information.

Then, the Bill would need to deal with three main threats to freedom of expression that threaten the good information in our landscape. Full Fact as a charity exists to promote informed and improved public debate, and in the long run we do that by protecting freedom of expression. Those three main threats are artificial intelligence, the internet companies and our own Government, and there are three responses to them. First, we must recognise that the artificial intelligence that internet companies use is highly error-prone, and it is a safety-critical technology. Content moderation affects what we can all see and share; it affects our democracy, it affects our health, and it is safety-critical. In every other safety-critical industry, that kind of technology would be subject to independent third-party open testing. Cars are crashed against walls, water samples are taken and tested, even sofas are sat on thousands of times to check they are safe, but internet companies are subject to no third-party independent open scrutiny. The Bill must change that, and the crash test dummy test is the one I would urge Members to apply.

The second big threat, as I said, is the internet companies themselves, which too often reach for content restrictions rather than free speech-based and information-based interventions. There are lots of things you can do to tackle misinformation in a content-neutral way—creating friction in sharing, asking people to read a post before they share it—or you can tackle misinformation by giving people information, rather than restricting what they can do; fact-checking is an example of that. The Bill should say that we prefer content-neutral and free speech-based interventions to tackle misinformation to content-restricting ones. At the moment the Bill does not touch that, and thus leaves the existing system of censorship, which the Minister has warned about, in place. That is a real risk to our open society.

The final risk to freedom of expression, and therefore to tackling misinformation, is the Government themselves. I have just read you an example of a Government bringing in internet companies to order them around by designating their terms and conditions and saying certain content is unacceptable. That content then starts to get automatically filtered out, and people are stopped from seeing it and sharing it online. That is a real risk. Apart from the fact that they press-released it, that is happening behind closed doors. Is that acceptable in an open democratic society, or do we think there should be a legal framework governing when Governments can seek to put pressure on internet companies to affect what we can all see and share? I think that should be governed by a clear legislative framework that sets out whether those functions need to exist, what they are and what their parameters are. That is just what we would expect for any similarly sensitive function that Government carry out.

None Portrait The Chair
- Hansard -

Thank you. I am going to bring Maria Miller in now.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Good morning, witnesses. Thank you for joining us today. Does the Bill give Ofcom discretion to regulate the smaller but high-risk platforms?

Danny Stone: First, thank you for having me today. We have made various representations about the problems that we think there are with small, high-harm platforms. The Bill creates various categories, and the toughest risk mitigation is on the larger services. They are defined by their size and functionality. Of course, if I am determined to create a platform that will spread harm, I may look at the size threshold that is set and make a platform that falls just below it, in order to spread harm.

It is probably important to set out what this looks like. The Community Security Trust, which is an excellent organisation that researches antisemitism and produces incident figures, released a report called “Hate Fuel” in June 2020. It looked at the various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads, I think, with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement. A week or so ago, he targeted and killed 10 people in Buffalo. One of the things that he posted was:

“Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/”—

which is a thread on the small 4chan platform—

“then my motivation returns”.

That is the kind of material that we are seeing: legal but harmful material that is inspiring people to go out and create real-world harm. At the moment, the small platforms do not have that additional regulatory burden. These are public-facing message boards, and this is freely available content that is promoted to users. The risks of engaging with such content are highest. There is no real obligation, and there are no consequences. It is the most available extremism, and it is the least regulated in respect of the Bill. I know that Members have raised this issue and the Minister has indicated that the Government are looking at it, but I would urge that something is done to ensure that it is properly captured in the Bill, because the consequences are too high if it is not.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thanks, Danny. So in your opinion, you would rather see a risk-based approach, as opposed to size and functionality.

Danny Stone: I think there are various options. Either you go for a risk-based approach—categorisation—or you could potentially amend it so that it is not just size and functionality. You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option for doing it.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Does anybody else want to come in on small platforms? Liron?

Liron Velleman: From the perspective of HOPE not hate, most of our time targeting and looking at far-right groups is spent on some of those smaller platforms. I think that the original intention of the Bill, when it was first written, may have been a more sensible way of looking at the social media ecosystem: larger platforms could host some of this content, while other platforms were just functionally not ready to host large, international far-right groups. That has changed radically, especially during the pandemic.

Now, there are so many smaller platforms—whether small means hundreds of thousands, tens of thousands or even smaller than that—that are almost as easy to use as some of the larger platforms we all know so well. Some of the content on those smaller platforms is definitely the most extreme. There are mechanisms utilised by the far-right—not just in the UK, but around the world—to move that content and move people from some of the larger platforms, where they can recruit, on to the smaller platforms. To have a situation in which that harmful content is not looked at as stringently as content on the larger platforms is a miscategorisation of the internet.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q One of our concerns with the Bill, which we raised with the regulator, Ofcom, in Tuesday’s evidence session, is what would happen in the interim if one of those smaller categorised platforms was to grow substantially and then need to be recategorised. Our concern is about what would happen in the interim, during the recategorisation process, while that platform was allowed to disseminate harmful content. What would you like to see happen as an interim measure during recategorisation, if that provision remained in the Bill?

Liron Velleman: We have seen this similarly with the proscription of far-right terrorist groups in other legislation. It was originally quite easy to say that, eventually, the Government would proscribe National Action as a far-right terror group. What has happened since is that aliases and very similar organisations are set up, and it then takes months or sometimes years for the Government to be able to proscribe those organisations. We have to spend our time making the case as to why those groups should be banned.

We can foresee a similar circumstance here. We turn around and say, “Here is BitChute” or hundreds of other platforms that should be banned. We spend six months saying to the Government that it needs to be banned. Eventually, it is, but then almost immediately an offshoot starts. We think that Ofcom should have delegated power to make sure that it is able to bring those platforms into category 1 almost immediately, if the categorisations stay as they are.

Danny Stone: It could serve a notice and ensure that platforms prepare for that. There will, understandably, be a number of small platforms that are wary and do not want to be brought into that category, but some of them will need to be brought in because of the risk of harm. Let us be clear: a lot of this content may well—probably will—stay on the platform, but, at the very least, they will be forced to risk assess for it. They will be forced to apply their terms and conditions consistently. It is a step better than what they will be doing without it. Serving a notice to try to bring them into that regime as quickly as possible and ensure that they are preparing measures to comply with category 1 obligations would be helpful.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you. The Antisemitism Policy Trust has made the case that search services should be eligible for inclusion as a high-risk category. Is that still your position? What is the danger, currently, of excluding them from that provision?

Danny Stone: Very much so. You heard earlier about the problems with advertising. I recognise that search services are not the same as user-to-user services, so there does need to be some different thinking. However, at present, they are not required to address legal harms, and the harms are there.

I appeared before the Joint Committee on the draft Bill and talked about Microsoft Bing, which, in its search bar, was prompting people with “Jews are” and then a rude word. You look at “Gays are”, today, and it is prompting people with “Gays are using windmills to waft homosexual mists into your home”. That is from the search bar. The first return is a harmful article. Do the same in Google, for what it’s worth, and you get “10 anti-gay myths debunked.” They have seen this stuff. I have talked to them about it. They are not doing the work to try to address it.

Last night, using Amazon Alexa, I searched “Is George Soros evil?” and the response was, “Yes, he is. According to an Alexa Answers contributor, every corrupt political event.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The problem with that is that the search prompts—the things that you are being directed to; the systems here—are problematic, because one person could give an answer to Amazon and that prompts the response. The second one, about the White Helmets, was a comment on a website that led Alexa to give that answer.

Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that. Something that forces those search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently, would be very wise.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you to the witnesses for joining us today. The Bill contains duties to protect content of “democratic importance” and “journalistic content”. What is your view of these measures and their likely effectiveness?

Liron Velleman: These are both pretty dangerous clauses. We are very concerned about what I would probably be kind and call their unintended consequences. They are loopholes that could allow some of the most harmful and hateful actors to spread harm on social media. I will take “journalistic” first and then move on to “democratic”.

A number of companies mentioned in the previous evidence session are outlets that could be media publications just by adding a complaints system to their website. There is a far-right outlet called Urban Scoop that is run by Tommy Robinson. They just need to add a complaints system to their website and then they would be included as a journalist. There are a number of citizen journalists who specifically go to our borders to harass people who are seeking refuge in this country. They call themselves journalists; Tommy Robinson himself calls himself a journalist. These people have been specifically taken off platforms because they have repeatedly broken the terms of service of those platforms, and we see this as a potential avenue for them to make the case that they should return.

We also see mainstream publications falling foul of the terms of service of social media companies. If I take the example of the Christchurch massacre, social media companies spent a lot of time trying to take down both the livestream of the attack in New Zealand and the manifesto of the terrorist, but the manifesto was then put on the Daily Mail website—you could download the manifesto straight from the Daily Mail website—and the livestream was on the Daily Mirror and The Sun’s websites. We would be in a situation where social media companies could take that down from anyone else, but they would not be able to take it down from those news media organisations. I do not see why we should allow harmful content to exist on the platform just because it comes from a journalist.

On “democratic”, it is still pretty unclear what the definition of democratic speech is within the Bill. If we take it to be pretty narrow and just talk about elected officials and candidates, we know that far-right organisations that have been de-platformed from social media companies for repeatedly breaking the terms of service—groups such as Britain First and, again, Tommy Robinson—are registered with the Electoral Commission. Britain First ran candidates in the local elections in 2022 and they are running in the Wakefield by-election, so, by any measure, they are potentially of “democratic importance”, but I do not see why they should be allowed to break terms of service just because they happen to have candidates in elections.

If we take it on a wider scale and say that it is anything of “democratic importance”, anyone who is looking to cause harm could say, “A live political issue is hatred of the Muslim community.” Someone could argue that that or the political debate around the trans community in the UK is a live political debate, and that would allow anyone to go on the platform and say, “I’ve got 60 users and I’ve got something to say on this live political issue, and therefore I should be on the platform,” in order to cause that harm. To us, that is unacceptable and should be removed from the Bill. We do not want a two-tier internet where some people have the right to be racist online, so we think those two clauses should be removed.

Stephen Kinsella: At Clean up the Internet this is not our focus, although the proposals we have made, which we have been very pleased to see taken up in the Bill, will certainly introduce friction. We keep coming back to friction being one of the solutions. I am not wearing this hat today, but I am on the board of Hacked Off, and if Hacked Off were here, I think they would say that the solution—although not a perfect solution—might be to say that a journalist, or a journalistic outlet, will be one that has subjected itself to proper press regulation by a recognised press regulator. We could then possibly take quite a lot of this out of the scope of social media regulation and leave it where I think it might belong, with proper, responsible press regulation. That would, though, lead on to a different conversation about whether we have independent press regulation at the moment.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you again to the witnesses for joining us this morning. I will start with Stephen Kinsella. You have spoken already about some of the issues to do with anonymity. Can you share with the Committee your view on the amendments made to the Bill, when it was introduced a couple of months ago, to give users choices over self-verification and the content they see? Do you think they are useful and helpful updates to the Bill?

Stephen Kinsella: Yes. We think they are extremely helpful. We welcome what we see in clause 14 and clause 57. There is thus a very clear right to be verified, and an ability to screen out interactions with unverified accounts, which is precisely what we asked for. The Committee will be aware that we have put forward some further proposals. I would really hesitate to describe them as amendments; I see them as shading-in areas—we are not trying to add anything. We think it would be helpful, for instance, that when someone exercises their right to be verified, that verification status should also be visible to other users. We think that should be implicit, because it is meant to act as a signal to others as to whether someone is verified. We hope that would be visible, and we have suggested the addition of just a few words into clause 14 on that.

We think that the Bill would benefit from a further definition of what it means by “user identity verification”. We have put forward a proposal on that. It is such an important term that I think it would be helpful to have it as a defined term in clause 189. Finally, we have suggested a little bit more precision on the things that Ofcom should take into account when dealing with platforms. I have been a regulatory lawyer for nearly 40 years, and I know that regulators often benefit from having that sort of clarity. There is going to be negotiation between Ofcom and the platforms. If Ofcom can refer to a more detailed list of the factors it is supposed to take into account, I think that will speed the process up.

One of the reasons we particularly welcomed the structure of the Bill is that there is no wait for detailed codes of conduct because these are duties that we will be executing immediately. I hope Ofcom is working on the guidance already, but the guidance could come out pretty quickly. Then there would be the process of—maybe negotiating is the wrong word—to-and-fro with the platforms. I would be very reluctant to take too much on trust. I do not mean on trust from the Government; I mean on trust from the platforms—I saw the Minister look up quickly then. We have confidence in Government; it is the platforms we are a little bit wary of. I heard the frustration expressed on Tuesday.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

indicated assent.

Stephen Kinsella: I think you said, “If platforms care about the users, why aren’t they already implementing this?” Another Member, who is not here today, said, “Why do they have to be brought kicking and screaming?” Yet, every time platforms were asked, we heard them say, “We will have to wait until we see the detail of—”, and then they would fill in whatever thing is likely to come last in the process. So we welcome the approach. Our suggestions are very modest and we are very happy to discuss them with you.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Good. Thank you. I hope the Committee is reassured by those comments on the freedom of speech question.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q I will use the small amount of time we have left to ask one question. A number of other stakeholders and witnesses have expressed concerns regarding the removal of a digital media literacy strategy from the Bill. What role do you see a digital media literacy strategy playing in preventing the kind of abuse that you have been describing?

Danny Stone: I think that a media literacy strategy is really important. There is, for example, UCL data on the lack of knowledge of the word “antisemitism”: 68% of nearly 8,000 students were unfamiliar with the term’s meaning. Dr Tom Harrison has discussed cultivating cyber-phronesis—this was also in an article by Nicky Morgan in the “Red Box” column some time ago—which is a method of building practical knowledge over time to make the right decisions when presented with a moral challenge. We are not well geared up as a society—I am looking at my own kids—to educate young people about their interactions, about what it means when they are online in front of that box and about to type something, and about what might be received back. I have talked about some of the harms people might be directed to, even through Alexa, but some kind of wider strategy, which goes beyond what is already there from Ofcom—during the Joint Committee process, the Government said that Ofcom already has its media literacy requirements—and which, as you heard earlier, updates it to make it more fit for purpose for the modern age, would be very appropriate.

Stephen Kinsella: I echo that. We also think that that would be welcome. When we talk about media literacy, we often find ourselves with the platforms throwing all the obligation back on to the users. Frankly, that is one of the reasons why we put forward our proposal, because we think that verification is quite a strong signal. It can tell you quite a lot about how likely it is that what you are seeing or reading is going to be true if someone is willing to put their name to it. Seeing verification is just one contribution. We are really talking about trying to build or rebuild trust online, because that is what is seriously lacking. That is a system and design failure in the way that these platforms have been built and allowed to operate.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q The shadow Minister’s question is related to the removal of what was clause 103 in the old draft of the Bill. As she said, that related to media literacy. Does the panel draw any comfort from three facts? First, there is already a media literacy duty on Ofcom under section 11 of the Communications Act 2003—the now deleted clause 103 simply provided clarification on an existing duty. Secondly, last December, after the Joint Committee’s deliberations, but before the updated Bill was published, Ofcom published its own updated approach to online media literacy, which laid out the fact that it was going to expand its media literacy programme beyond what used to be in the former clause 103. Finally, the Government also have their own media literacy strategy, which is being funded and rolled out. Do those three things—including, critically, Ofcom’s own updated guidance last December—give the panel comfort and confidence that media literacy is being well addressed?

Liron Velleman: If the Bill is seeking to make the UK the safest place to be on the internet, it seems to be the obvious place to put in something about media literacy. I completely agree with what Danny said earlier: we would also want to specifically ensure—although I am sure this already exists in some other parts of Ofcom and Government business—that there is much greater media literacy for adults as well as children. There are lots of conversations about how children understand use of the internet, but what we have seen, especially during the pandemic, is the proliferation of things like community Facebook groups, which used to be about bins and a fair that is going on this weekend, becoming about the worst excesses of harmful content. People have seen conspiracy theories, and that is where we have seen some of the big changes to how the far-right and other hateful groups operate, in terms of being able to use some of those platforms. That is because of a lack of media literacy not just among children, but among the adult population. I definitely would encourage that being in the Bill, as well as anywhere else, so that we can remove some of those harms.

Danny Stone: I think it will need further funding, beyond what has already been announced. That might put a smile on the faces of some Department for Education officials, who looked so sad during some of the consultation process—trying to ensure that there is proper funding. If you are going to roll this out across the country and make it fit for purpose, it is going to cost a lot of money.

Online Safety Bill (Fourth sitting)

Alex Davies-Jones Excerpts
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 26 May 2022 - (26 May 2022)
None Portrait The Chair
- Hansard -

Good afternoon, ladies and gentlemen. We are now sitting in public and the proceedings are being broadcast. Thank you all for joining us.

We will now hear oral evidence from Stephen Almond, the director of technology and innovation in the Information Commissioner’s Office. Mr Almond, thank you for coming. As I have introduced you, I am not going to ask you to introduce yourself, so we can go straight into the questions. I call the shadow Front-Bench spokesman.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Q 224 Thank you for coming to give evidence to us this afternoon, Mr Almond. There has been a lot of debate about the risk end-to-end encrypted platforms pose to online safety. What need is there to mitigate that risk in the Bill?

Stephen Almond: Let me start by saying that the ICO warmly welcomes the Bill and its mission to make the UK the safest place in the world to be online. End-to-end encryption supports the security and privacy of online communication and keeps people safe online, but the same characteristics that create a private space for the public to communicate can also provide a safe harbour for more malicious actors, and there are valid concerns that encrypted channels may be creating spaces where children are at risk.

Our view is that the Bill has the balance right. All services in scope, whether encrypted or not, must assess the level of risk that they present and take proportionate action to address it. Moreover, where Ofcom considers it necessary and proportionate, it will have the power to issue technology notices to regulated services to require them to deal with child sexual abuse and exploitation material. We think this presents a proportionate way of addressing the risk that is present on encrypted channels.

It is worth saying that I would not favour provisions that sought to introduce some form of outright ban on encryption in a generalised way. It is vital that the online safety regime does not seek to trade off one sort of online safety risk for another. Instead, I urge those advancing more fundamentalist positions around privacy or safety to move towards the question of how we can incentivise companies to develop technological innovation that will enable the detection of harmful content without compromising privacy. It is one reason why the ICO has been very pleased to support the Government’s safety tech challenge, which has really sought to incentivise the development of technological innovation in this area. Really what we would like to see is progress in that space.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q On that point around technological advances and enabling people to access the internet, people have raised concerns that tech-savvy children will be able to use VPNs, Tor Browser and other tricks to easily circumvent the measures that will be in the Bill, especially around age verification and user identity. How do you respond to that, and how do you suggest we close those loopholes, if we can?

Stephen Almond: First and foremost, it is incredibly important that the Bill has the appropriate flexibility to enable Ofcom as the online safety regulator to be agile in responding to technological advances and novel threats in this area. I think the question of VPNs is ultimately going to be one that Ofcom and the regulated services themselves are going to have to work around. VPNs play an important role in supporting a variety of different functions, such as the security of communications, but ultimately it is going to be critical to make sure that services are able to carry out their duties. That is going to require some questions to be asked in this area.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

May I ask a supplementary to that before I come on to my main question?

--- Later in debate ---
None Portrait The Chair
- Hansard -

Moving, I hope, seamlessly on, we are now going to hear oral evidence from Sanjay Bhandari, who is the chairman of Kick It Out, and—as the Committee agreed this morning—after Tuesday’s technical problems, if we do not have further technical problems, we are going to hear from Lynn Perry from Barnardo’s, again by Zoom. Is Lynn Perry on the line? [Interruption.] Lynn Perry is not on the line. We’ve got pictures; now all we need is Lynn Perry in the pictures.

I am afraid we must start, but if Lynn Perry is able to join, we will be delighted to hear from her. We have Mr Bhandari, so we will press on, because we are very short of time as it is. We hope that Lynn Perry will join us.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Good afternoon, Mr Bhandari; thank you for joining us. What response have you as a football charity seen from the social media companies to the abuse that has been suffered by our sports players online? We all saw the horrendous abuse that our football heroes suffered during the Euros last year. What has been the reaction of the social media companies when this has been raised? Why has it not been tackled?

Sanjay Bhandari: I think you would have to ask them why it has not been tackled. My perception of their reaction is that it has been a bit like the curate’s egg: it has been good in parts and bad in parts, and maybe like the original meaning of that allegory, it is a polite way of saying something is really terrible.

Before the abuse from the Euros, actually, we convened a football online hate working group with the social media companies. They have made some helpful interventions: when I gave evidence to the Joint Committee, I talked about wanting to have greater friction in the system, and they are certainly starting to do that with things like asking people, “Do you really want to send this?” before they post something. We understand that that is having some impact, but of course, it is against the backdrop of a growing number of trolls online. Also, we have had some experiences where we make a suggestion, around verification for instance, introducing third-party companies to social media companies, and very often the response we get is different between London and California. London will say “maybe”, and California then says “no”. I have no reason to distrust the people we meet locally here, but I do not think they always have the power to actually help and respond. The short answer is that there are certainly good intentions from the people we meet locally and there is some action. However, the reality is that we still see quite a lot of content coming through.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you for that. The Centre for Countering Digital Hate, which we will hear from later this afternoon, has identified that, as well as the vast majority of abuse being directed at public profiles, it is also done via direct messaging, in private and sometimes on those smaller high-harm platforms. There are concerns raised by others that this would not be covered by the Bill. Do you have any thoughts on that and what would you like to see?

Sanjay Bhandari: I think we need to work that through. I am sorry that my colleagues from the Premier League and the Football Association could not be here today; I did speak to them earlier this week but unfortunately they have got some clashes. One thing we are talking about is how we tag this new framework to existing content. We have a few hundred complaints that the Premier League investigates, and we have got a few thousand items that are proactively identified by Signify, working with us and the Professional Footballers’ Association. Our intention is to take that data and map it to the new framework and say, “Is this caught? What is caught by the new definition of harm? What is caught by priority illegal content? What is caught by the new communication offences, and what residue in that content might be harmful to adults?” We can then peg that dialogue to real-world content rather than theoretical debate. We know that a lot of complaints we receive are in relation to direct messaging, so we are going to do that exercise. It may take us a little bit of time, but we are going to do that.

None Portrait The Chair
- Hansard -

Lynn Perry is on the line, but we have lost her for the moment. I am afraid we are going to have to press on.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We will hear oral evidence first from Eva Hartshorn-Sanders, who is the head of policy at the Centre for Countering Digital Hate. We shall be joined in due course by Poppy Wood. Without further ado, I call the shadow Minister.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you for joining us this afternoon. I have quoted a lot of the stats that the Centre for Countering Digital Hate has produced on online abuse directed at individuals with protected characteristics. In the previous panel, I mentioned that the vast majority is done via direct messaging, sometimes through end-to-end encryption on platforms. What are your concerns about this issue in the Bill? Does the Bill adequately account for tackling that form of abuse?

Eva Hartshorn-Sanders: That is obviously an important area. The main mechanisms to look at are the complaints pathways: ensuring that when reports are made, action is taken, and that that is included in risk assessments as well. In our “Hidden Hate” report, we found that 90% of misogynist abuse, which included quite serious sexual harassment and abuse, videos and death threats, was not acted on by Instagram, even when we used the current pathways for the complainant. This is an important area.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Part of the issue is that the regulated service providers have to rely heavily on the use of AI to facilitate monitoring and take down problematic content in order to comply with the Bill, but, as several stakeholders have said, algorithmic moderation is inadequate at recognising the nuance and subtlety needed to take down that content actively and effectively. What more would you like to see in the Bill to counteract that issue?

Eva Hartshorn-Sanders: There has to be human intervention as part of that process as well. Whatever system is in place—the relationship between Ofcom and the provider is going to vary by platform and by search provider too, possibly—if you are making those sorts of decisions, you want to have it adequately resourced. That is what we are saying is not happening at the moment, partly because there is not yet the motivation or the incentive for them to do anything differently. They are doing the minimum; what they say they are going to do often comes out through press releases or policies, and then it is not followed through.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q You mentioned that there is not adequate transparency and openness on how these things work. What systems would you like to see the Bill put in place to ensure the transparency, independence and accountability of Ofcom, but also the transparency and openness of the tech companies and the platforms that we are seeking to regulate?

Eva Hartshorn-Sanders: I think there is a role for independent civil society, working with the regulator, to hold those companies to account and to be accessing that data in a way that can be used to show how they are performing against their responsibilities under the Bill. I know Poppy from Reset.tech will talk to this area a bit more. We have just had a global summit on online harms and misinformation. Part of the outcome of that was looking at a framework for how we evaluate global efforts at legislation and the transparency of algorithms and rules enforcement, and the economics that are driving online harms and misinformation. That is an essential part of ensuring that we are dealing with the problems.

None Portrait The Chair
- Hansard -

May I say, for the sake of the record, that we have now been joined by Poppy Wood, the UK director of Reset.tech? Ms Wood, you are not late; we were early. We are trying to make as much use as we can of the limited time. I started with the Opposition Front Bencher. If you have any questions for Poppy Wood, go ahead.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q I do—thank you, Sir Roger. I am not sure if you managed to hear any of that interaction, Poppy. Do you have any comments to make on those points before I move on?

Poppy Wood: I did not hear your first set of questions—I apologise.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

That is fine. I will just ask you what you think the impact is of the decision to remove misinformation and disinformation from the scope of the Bill, particularly in relation to state actors?

Poppy Wood: Thank you very much, and thank you for having me here today. There is a big question about how this Bill tackles co-ordinated state actors—co-ordinated campaigns of disinformation and misinformation. It is a real gap in the Bill. I know you have heard from Full Fact and other groups about how the Bill can be beefed up for mis- and disinformation. There is the advisory committee, but I think that is pretty weak, really. The Bill is sort of saying that disinformation is a question that we need to explore down the line, but we all know that it is a really live issue that needs to be tackled now.

First of all, I would make sure that civil society organisations are on that committee and that its report is brought forward in months, not years, but then I would say there is just a real gap about co-ordinated inauthentic behaviour, which is not referenced. We are seeing a lot of it live with everything that is going on with Russia and Ukraine, but it has been going on for years. I would certainly encourage the Government to think about how we account for some of the risks that the platforms promote around co-ordinated inauthentic behaviour, particularly with regard to disinformation and misinformation.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses here think those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Good afternoon, both, and thank you for coming this afternoon. We have heard a lot about the journalistic content exemption. What is your view of the current measures in the Bill and their likely consequences?

Owen Meredith: You may be aware that we submitted evidence to the Joint Committee that did prelegislative scrutiny of the draft Bill, because, although the Government’s stated intention is to have content from recognised news media publishers, whom I represent, outside the scope of the Bill, we do not believe that the drafting, as it was and still is, achieves that. Ministers and the Secretary of State have confirmed, both in public appearances and on Second Reading, that they wish to table further amendments to achieve the aim that the Government have set out, which is to ensure that content from recognised news publishers is fully out of scope of the Bill. It needs to go further, but I understand that there will be amendments coming before you at some point to achieve that.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q What further would you like to see?

Owen Meredith: I would like to see a full exemption for recognised news publisher content.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q You would like to see a full exemption. Matt, do you have any thoughts on that?

Matt Rogerson: Yes. I would step back a bit and point to the evidence that a few of your witnesses gave today and on Tuesday. I think Fair Vote gave evidence on this point. At the moment, our concern is that we do not know what the legal but harmful category of content that will be included in the Bill will look like. That is clearly going to be done after the event, through codes of practice. There is definitely a danger that news publisher content gets caught by the platforms imposing that. The reason for having a news publisher exemption is to enable users of platforms such as Facebook, Twitter and others to access the same news as they would via search. I agree with Owen’s point. I think the Government are going in the right direction with the exemption for news publishers such as the BBC, The Times and The Guardian, but we would like to see it strengthened a bit to ensure a cast-iron protection.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Currently, is the definition of journalistic content used in the Bill clear, or do you find it ambiguous?

Matt Rogerson: I think it is quite difficult for platforms to interpret that. It is a relatively narrow version of what journalism is—it is narrower than the article 10 description of what journalism is. The legal definitions of journalism in the Official Secrets Act and the Information Commissioner’s Office journalism code are slightly more expansive and cover not just media organisations but acts of journalism. Gavin Millar has put together a paper for Index on Censorship, in which he talks about that potentially being a way to expand the definition slightly.

The challenge for the platforms is, first, that they have to take account of journalistic content, and there is not a firm view of what they should do with it. Secondly, defining what a piece of journalism or an act of journalism is takes a judge, generally with a lot of experience. Legal cases involving the media are heard through a specific bench of judges—the media and communications division—and they opine on what is and is not an act of journalism. There is a real challenge, which is that you are asking the platforms to—one assumes—use machine learning tools to start with to identify what is a potential act of journalism. Then an individual, whether they are based in California or, more likely, outsourced via an Accenture call centre, determines within that whether it is an act of journalism and what to do with it. That does place quite a lot of responsibility on the platforms. Again, I would come back to the fact that I think if the Bill was stripped back to focus on illegal content, rather than legal but harmful content, you would have fewer of these situations where there was concern that that sort of content was going to be caught.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q We have heard a lot of concern about disinformation by state actors purporting to be journalists and using that exemption, which could cause harm. Do you have any thoughts on that?

Matt Rogerson: Yes, a few. The first thing that is missing from the Bill is a focus on advertising. The reason we should focus on advertising is that that is why a lot of people get involved in misinformation. Ad networks at the moment are able to channel money to “unknown” sites in ways that mean that disinformation or misinformation is highly profitable. For example, a million dollars was spent via Google’s ad exchanges in the US; the second biggest recipient of that million dollars was “Unknown sites”—sites that do not categorise themselves as doing anything of any purpose. You can see how the online advertising market is channelling cash to the sort of sites that you are talking about.

In terms of state actors, and how they relate to the definition, the definition is set out quite broadly in the Bill, and it is more lengthy than the definition in the Crime and Courts Act 2013. On top of that definition, Ofcom would produce guidance, which is subject to a full and open public consultation, which would then work out how you are going to apply the definition in practice. Even once you have that guidance in place, there will be a period of case law developing where people will appeal to be inside of that exemption and people will be thrown out of that exemption. Between the platforms and Ofcom, you will get that iteration of case law developing. So I suppose I am slightly more confident that the exemption would work in practice and that Ofcom could find a workable way of making sure that bad actors do not make use of it.

None Portrait The Chair
- Hansard -

Mr Meredith, do you wish to add to that?

Owen Meredith: No, I would echo almost entirely what Matt has said on that. I know you are conscious of time.

--- Later in debate ---
None Portrait The Chair
- Hansard -

One final quick question from the Opposition Front Bench.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Mr Rogerson, you mentioned that platforms and tech companies currently have a list of approved broadcasters that they are enabled to use, to ensure they have that content. Isn’t it true that one of those broadcasters was Russia Today, and it was only because Ofcom intervened to remove it from social media that it was taken down, but under the current provisions in this Bill, Ofcom would not be able to do that and Russia Today would be allowed to spread disinformation on social media platforms?

Matt Rogerson: On the Russia Today problem, I think Russia Today had a licence from Ofcom, so the platforms probably took their cue from the fact that Russia Today was beamed into British homes via Freeview. Once that changed, the position of having their content available on social media changed as well. Ultimately, if it was allowed to go via broadcast, if it had a broadcast licence, I would imagine that social media companies took that as meaning that it was a—

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q But under the new Bill, as journalistic content, it would be allowed to remain on those social media platforms.

Matt Rogerson: I think that would be subject to the guidance that Ofcom creates and the consultation on that guidance. I do not believe that Russia Today would be allowed under the definitions. If it is helpful, I could write to you to set out why.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We will now hear from Tim Fassam, the director of government relations and policy at PIMFA, the Personal Investment Management & Financial Advice Association, and from Rocio Concha, director of policy and advocacy at Which? We will be joined by Martin Lewis, of MoneySavingExpert, in due course. Thank you to the witnesses for joining us. I call the Opposition Front Bench.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you for joining us this afternoon. As a constituency MP, I am sure I am not alone in saying that a vast amount of my casework comes from members of my community writing to me to say that they have been scammed online, that they have been subject to fraud and that they feel horrendous about it. They feel shame and they do not know what to do about it. It is the single biggest crime in the UK, with victims losing an estimated £2.3 billion. In your opinion, does the Bill go far enough to tackle that?

Rocio Concha: This Bill is very important in tackling fraud. It is very important for Which? We were very pleased when fraud was included to tackle the issue that you mentioned and also when paid-for advertising was included. It was a very important step, and it is a very good Bill, so we commend DCMS for producing it.

However, we have found some weaknesses in the Bill, and those can be solved with very simple amendments, which will have a big impact on the Bill in terms of achieving its objective. For example, at the moment in the Bill, search engines such as Google and Yahoo! are not subject to the same duties in terms of protecting consumers from fraudulent advertising as social media platforms are. There is no reason for Google and Yahoo! to have weaker duties in the Bill, so we need to solve that.

The second area is booster content. Booster content is user-generated content, but it is also advertising. In the current definition of fraudulent advertising in the Bill, booster content is not covered. For example, if a criminal makes a Facebook page and starts publishing things about fake investments, and then he pays Facebook to boost that content in order to reach more people, the Bill, at the moment, does not cover that fraudulent advertising.

The last part is that, at the moment, the risk checks that platforms need to do for priority illegal content, the transparency reporting that they need to do to basically say, “We are finding this illegal content and this is what we are doing about it,” and the requirement to have a way for users to tell them about illegal content or complain about something that they are not doing to tackle this, only apply to priority illegal content. They do not apply to fraudulent advertising, but we think they need to.

Paid-for advertising is the most expensive way that criminals have to reach out to a lot of people. The good news, as I said before, is that this can be solved with very simple amendments to the Bill. We will send you suggestions for those amendments and, if we fix the problem, we think the Bill will really achieve its objective.

None Portrait The Chair
- Hansard -

One moment—I think we have been joined by Martin Lewis on audio. I hope you can hear us, Mr Lewis. You are not late; we started early. I will bring you in as soon as we have you on video, preferably, but otherwise on audio.

Tim Fassam: I would echo everything my colleague from Which? has said. The industry, consumer groups and the financial services regulators are largely in agreement. We were delighted to see fraudulent advertising and wider issues of economic crime included in the Bill when they were not in the initial draft. We would also support all the amendments that Which? are putting forward, especially the equality between search and social media.

Our members compiled a dossier of examples of fraudulent activity, and the overwhelming majority of fraudulent adverts were on search, rather than social media. We would also argue that search is potentially higher risk, because the act of searching is an indication that you may be ready to take action. If you are searching “invest my pension”, hopefully you will come across Martin’s site or one of our members’ sites, but if you come across a fraudulent advert in that moment, you are more likely to fall foul of it.

We would also highlight two other areas where we think the Bill needs further work. These are predominantly linked to the interaction between Ofcom, the police and the Financial Conduct Authority, because the definitions of fraudulent adverts and fraudulent behaviour are technical and complex. It is not reasonable to expect Ofcom to be able to ascertain whether an advert or piece of content is in breach of the Financial Services and Markets Act 2000; that is the FCA’s day job. Is it fraud? That is Action Fraud’s and the police’s day job. We would therefore suggest that the Bill go as far as allowing the police and the FCA to direct Ofcom to have content removed, and creating an MOU that enables Ofcom to refer things to the FCA and the police for their expert analysis of whether it breaches those definitions of fraudulent adverts or fraudulent activity.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you, both. You mentioned that search is a concern, especially because it is currently out of scope of the Bill in terms of this issue. Another issue is that when people do use search to look for a financial service or something that they wish to purchase, the cookies are remembered. The algorithms on social media platforms are then triggered to promote specific adverts to them as a result of that search history or things they have mentioned via voice control to their home help devices. That is a concern. Digital advertising that you see on third-party websites is also not within scope. That has been raised as well. Do you have any thoughts on those points?

Rocio Concha: Yes. Open-display advertising is not part of the Bill. That also needs to be tackled. I think the online advertising programme should be considered, to tackle this issue. I agree with you: this is a very important step in the right direction, and it will make a huge difference if we fix this small weakness in terms of the current scope. However, there are still areas out there that need to be tackled.

None Portrait The Chair
- Hansard -

Mr Lewis, I am living in hope that we may be able to see you soon—although that may be a forlorn hope. However, I am hoping that you can hear us. Do you want to come in and comment at all at this point? [Interruption.] Oh, we have got you on the screen. Thank you very much for joining us.

Martin Lewis: Hurrah. I am so sorry, everybody—for obvious reasons, it has been quite a busy day on other issues for me, so you’ll forgive me.

None Portrait The Chair
- Hansard -

I can’t think why it has been.

Martin Lewis: I certainly agree with the other two witnesses. Those three issues are all very important to be brought in. From a wider perspective, I was vociferously campaigning to have scam adverts brought within the scope of the Online Safety Bill. I am delighted that that has happened, but let us be honest among ourselves: it is far from a panacea.

Adverts and scams come in so many places—on social media, in search engines and in display advertising, which is very common and is not covered. While I accept that the online advertising programme will address that, if I had my way I would be bringing it all into the Online Safety Bill. However, the realpolitik is that that is not going to happen, so we have to have the support in the OAP coming later.

It is also worth mentioning just for context that, although I think there is little that we can do about this—or it would take brighter people than me—one of the biggest routes for scams is email. Everybody is being emailed—often with my face, which is deeply frustrating. We have flaccid policing of what is going on on social media, and I hope the Bill will improve it, but at least there is some policing, even though it is flaccid, and it is the same on search engines. There is nothing on email, so whatever we do in this Bill, it will not stop scams reaching people. There are many things that would improve that, certainly including far better resourcing for policing so that people who scam individuals get at least arrested and possibly even punished and sentenced. Of course, that does not happen at the moment, because scamming is a crime that you can undertake with near impunity.

There is a lot that needs to be done to make the situation work, but in general the moves in the Online Safety Bill to include scam advertising are positive. I would like to see search engines and display advertising brought into that. I absolutely support the call for the FCA to be involved, because what is and is not a scam can certainly be complicated. There are more obvious ones and less obvious ones. We saw that with the sale of bonds at 5% or 6%, which pretend to be deposit bonds but are nothing of the sort. That might get a bit more difficult for Ofcom, and it would be great to see the regulator involved. I support all the calls of the other witnesses, but we need to be honest with ourselves: even if we do all that, we are still a long way from seeing the back of all scam adverts and all scams.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you, Mr Lewis. My final question is not necessarily about financial services advertising. With the rise of influencer culture, specifically on social media platforms such as TikTok and Instagram, we are seeing a failure to disclose adverts correctly and the potential for harmful advertising. Slimming products, for example, that are not particularly safe, especially for children, are being targeted at children. What more would you like to see this Bill do to tackle some of that? I know the ASA has taken action against some prolific offenders, but what more would you like to see in this Bill to tackle that and keep children safe from adverts that are not marked as such?

Rocio Concha: To be honest, in this area we do not have any specific proposals. I completely agree with you that this is an area that needs to be tackled, but I do not have a specific proposal for this Bill.

Tim Fassam: This is an area that we have raised with the Financial Conduct Authority—particularly the trend for financial advice TikTok and adverts for non-traditional investments, such as whisky barrels or wine, which do not meet the standards required by the FCA for other investment products. That is also true of a number of cryptocurrency adverts and formats. We have been working with the FCA to try to identify ways to introduce more consistency in the application of the rule. There has been a welcome expansion by the Treasury on the promotion of high-risk investments, which is now a regulated activity in and of itself.

I go back to my initial point. We do not believe that there is any circumstance in which the FCA would want content in any place taken down where that content should not be removed, because they are the experts in identifying consumer harm in this space.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Mr Lewis, do you have anything to add?

Martin Lewis: I still believe that most of this comes down to an issue of policing. The rules are there and are not being enforced strongly enough. The people who have to enforce the rules are not resourced well enough to do that. Therefore, you get people who are able to work around the rules with impunity.

Advertising in the UK, especially online, has been the wild west for a very long time, and it will continue to be so for quite a while. The Advertising Standards Authority is actually better at dealing with the influencer issue, because of course it is primarily strong at dealing with people who listen to the Advertising Standards Authority. It is not very good at dealing with criminal scammers based outside the European Union, who frankly cannot be bothered and will not reply—they are not going to stop—but it is better at dealing with influencers who have a reputation.

We all know it is still extremely fast and loose out there. We need to adequately resource it; putting rules and laws in place is only one step. Resourcing the policing and the execution of those rules and laws is a secondary step, and I have doubts that we will ever quite get there, because resources are always squeezed and put on the back burner.

None Portrait The Chair
- Hansard -

Thank you. Do I have any questions from Government Back Benchers? No. Does anyone have any further questions?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Yes, I do. If nobody else has questions, I will have another bite of the cherry.

None Portrait The Chair
- Hansard -

The Minister is going to come in in a minute.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q I would just like to query your thoughts on a right to redress for victims. Do you think that having an ombudsman in the Bill would be appropriate, and what would you like to see to support victims of fraud?

Martin Lewis: As you will know, I had to sue Facebook for defamation, which is a ridiculous thing to do in order to stop scam adverts. I was unable to report the scam adverts to the police, because I had not been scammed—even though it was my face that was in them—and many victims were not willing to come forward. That is a rather bizarre situation, and we got Facebook to put forward £3 million to set up Citizens Advice Scam Action—that is what I settled for, as well as a scam ad reporting tool.

There are two levels here. The problem is who is at fault. Of course, those mainly at fault for scams are the scammers. They are criminals and should be prosecuted, but not enough of them are. There are times when it is the bank’s fault. If a company has not put proper precautions in place, and people have been scammed because it has put up adverts or posts that it should have prevented, it absolutely needs to take some responsibility for that. I think you will struggle to have a direct redress system put in place. I would like to see it, but it would be difficult.

It worries me that the £3 million for Citizens Advice Scam Action, which was at least meant to provide help and support for victims of scams, is going to run out. I have not seen any more money coming from Facebook, Google or any of the other big players out there. If we are not going to fund direct redress, we could at least make sure that they fund a collective form of redress and help for the victims of scams, as a bare minimum. It is very strange that these firms go so quiet on this when what they say is, “We are doing everything we can.”

From my meetings with these firms—these are meetings with lawyers in the room, so I have to be slightly careful—one of the things that I would warn the Committee about is that they tend to get you in and give you a presentation on all the technological reasons why they cannot stop scam adverts. My answer to them after about 30 seconds, having stopped what was meant to be an hour-long presentation, is, “I have not framed the fact that you need a technological solution. I have said you need a solution. If the answer to stopping scam adverts, and to stopping scams, is that you have to pre-vet every single advert, as old-fashioned media did, and that every advert that you put up has to have been vetted by a human being, so be it. You’re making it a function of technology, but let’s be honest: this is a function of profitability.” We have to look at the profitability of these companies when it comes to redress. Your job—if you will forgive me for saying so—is to make sure that it costs them more money to let people be scammed than it does to stop people being scammed. If we solve that, we will have a lot fewer scams on social media and in search advertising.

Rocio Concha: I completely agree with everything that Martin says. At the moment, the provisions in the Bill for “priority illegal content” require the platforms to publish reports that say, “This is how much illegal content we are seeing on the platform, and these are the measures that we are going to take.” They are also required to have a way for users to report it and to complain when they think that the platforms are not doing the right thing. At the moment, that does not apply to fraudulent advertising, so you have an opportunity to fix that in the Bill very easily, to at least get the transparency out there. The platform has to say, “We are finding this”—that puts pressure on the platform, because the information is public and also goes to the regulator—“and these are the measures that we are taking.” That gives us transparency to say, “Are these measures enough?” There should also be an easy way for the user to complain when they think that platforms are not doing the right thing. It is a complex question, but there are many things in the Bill that you can change in order to improve the situation.

Tim Fassam: I wonder if it would be useful to give the Committee a case study. Members may be familiar with London Capital & Finance. Now, London Capital & Finance is one of the most significant recent scams. It sold mini-bonds fraudulently, at a very high advertised return, which then collapsed, with individuals losing all their money.

Those individuals were compensated through two vehicles. One was a Government Bill; so, they were compensated by the taxpayer. The others, because they were found to have been given financial advice despite LCF not having advice permissions or operating through a regulated product, went on to the Financial Services Compensation Scheme, which, among others, our members pay for; legitimate financial services companies pay for it. The most recent estimate is over £650 million. The expectation is that that will reach £1 billion at some point over the next few years, in terms of cost to the economy.

LCF was heavily driven by online advertising, and we would argue that the online platforms were in fact probably the only people who could have stopped it happening. They have profited from those adverts and they have not contributed anything to either of those two schemes. We would argue—possibly not for this Bill—that serious consideration should be given to the tech platforms being part of the financial services compensation scheme architecture and contributing to the costs of scams that individuals have fallen foul of, as an additional incentive for them to get on top of this problem.

Martin Lewis: That is a very important point, but I will just pick up on what Rocio was saying. One of the things that I would like to see is much more rigid requirements for how the reporting of scams is put in place, because I cannot see proper pre-vetting happening with these technology companies, but we can at least rely on social policing and the reporting of scams. There are many people who recognise a scam, just as there are many people who do not.

However, I also think this is a wonderful opportunity to make sure that the method, the language and the symbols used for reporting scams are universal in the UK, so that whatever site you are on, if you see a scam advert you click the same symbol and the process works in the same, unified way. If you can report a scam in the same way on every site, it becomes simpler, we can train people in how to do it, and we can make the processes work.

Then, of course, we have to make sure that the platforms act on the back of those reports. At the moment, the variety of ways in which scams are reported, the complexity, and the number of clicks needed mean that it is generally a lot easier to click on an advert than it is to report an advert as a scam. With so many scams out there, I think there should be parity of ease between those two things.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Q May I ask, directly related to that, about the complaints procedure? What would you like to see in terms of changes there, to make it more unified, more universal and simpler? It has been suggested that it is not robust enough, not dynamic enough and not fast enough.

Rocio Concha: That concerns complaints from users. At the moment, the Bill does not provide for that in relation to fraudulent advertising. So we need to make sure that it is a requirement for the platforms to provide an easy tool for people to complain and to report when they see something that is fraudulent. At the moment, the Bill does not do that. It is an easy fix; you can do it. And then the user will have that tool. It would also give transparency to the regulator and to organisations such as ours, so that we can see what is happening and what measures the platforms are taking.

Tim Fassam: I would agree with that. I would also highlight a particular problem that our members have flagged, and we have flagged directly with Meta and Instagram. Within the definition in the Bill of individuals who can raise concern about social media platforms, our members find they fall between two stools, because quite often what is happening is that people are claiming an association with a legitimate firm. So they will have a firm’s logo, or a firm’s web address, in their profile for their social media and then they will not directly claim to be a financial adviser but imply an association with a legitimate financial advice firm. This happens surprisingly frequently.

Our members find it incredibly difficult to get those accounts taken down, because it is not a fraudulent account in the usual sense: the individual is not pretending to be someone else, and they are not directly claiming to be an employee—they could just say they are a fan of the company. Nor is the firm a direct victim of that individual. What happens is that when our members report the account, the report goes into a volume algorithm, and only if a very large number of complaints are made does that particular account get taken down. I think the definition could be expanded to include complaints from individuals and firms affected by the account, rather than only from those the account is pretending to be.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We now have Frances Haugen, a former Facebook employee. Thank you for joining us.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Good afternoon, Frances. Thank you for joining us.

Frances Haugen: Thank you so much for inviting me.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

No problem. Could you give us a brief overview of how, in your opinion, platforms such as Meta will be able to respond to the Bill if it is enacted in its current form?

Frances Haugen: There are going to be some pretty strong challenges in implementing the Bill as it is currently written. I want to be really honest with you about the limitations of artificial intelligence. We call it artificial intelligence, but people who actually build these systems call it machine learning, because it is not actually intelligent. One of the major limitations in the Bill is that there are carve-outs, such as “content of democratic importance”, that computers will not be able to distinguish. That might have very serious implications. If the computers cannot differentiate between whether something is or is not hate speech, imagine a concept even more ambiguous that requires even more context, such as defining what is of democratic importance. If we have carve-outs like that, it may actually prevent the platforms from doing any content moderation, because they will never know whether a piece of content is safe or not safe.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q You have just answered my question on AI and algorithmic intention. When I questioned Meta in Tuesday’s oral evidence session, they were unable to tell me how many human moderators they had directly working for them, or whether those moderators abided by a UK standard and code of conduct. Do you see the lack of human moderators as being a problem as the Bill is implemented by platforms such as Meta?

Frances Haugen: I think it is unacceptable that large corporations such as this do not answer very basic questions. I guarantee you that they know exactly how many moderators they have hired—they have dashboards to track these numbers. The fact that they do not disclose those numbers shows why we need to pass laws to have mandatory accountability. The role of moderators is vital, especially for things like people questioning judgment decisions. Remember, no AI system is going to be perfect, and one of the major ways people can have accountability is to be able to complain and say, “This was inaccurately judged by a computer.” We need to ensure that there is always enough staffing and that moderators can play an active role in this process.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Turning to the issue of child safety and online abuse involving images of children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.

Online Safety Bill (Fifth sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 7th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 7 June 2022 - (7 Jun 2022)
None Portrait The Chair
- Hansard -

The Minister was completely out of order in congratulating the right hon. Lady, but I concur with him. I call the shadow Minister.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Thank you, Sir Roger; it is a genuine privilege and an honour to serve under your chairship today and for the duration of the Committee. I concur with the congratulations offered to the right hon. Member for Basingstoke and I, too, congratulate her.

If you would indulge me, Sir Roger, this is the first time I have led on behalf of the Opposition in a Bill Committee of this magnitude. I am very much looking forward to getting my teeth stuck into the hours of important debate that we have ahead of us. I would also like to take this opportunity to place on record an early apology for any slight procedural errors I may inadvertently make as we proceed. However, I am very grateful to be joined by my hon. Friend the Member for Worsley and Eccles South, who is much more experienced in these matters. I place on record my gratitude for her support. Along with your guidance, Sir Roger, I expect that I will quickly pick up the correct parliamentary procedure as we make our way through this colossal legislation. After all, we can agree that it is a very important piece of legislation that we all need to get right.

I want to say clearly that the Opposition welcome the Bill in principle; the Minister knows that, as we voted in favour of it at Second Reading. However, it will come as no surprise that we have a number of concerns about areas where we feel the Bill is lacking, which we will explore further. We have many reservations about how the Bill has been drafted. The structure and drafting push services into addressing harmful content—often in a reactive, rather than proactive, way—instead of addressing harmful systems, business models and algorithms, which would be a more lasting and systemic approach.

Despite that, we all want the Bill to work and we know that it has the potential to go far. We also recognise that the world is watching, so the Opposition look forward to working together to do the right thing, making the internet a truly safe space for all users across the UK. We will therefore not oppose clause 1.

Dan Carden Portrait Dan Carden (Liverpool, Walton) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve on the Committee. I want to apologise for missing the evidence sessions. Unfortunately, I came down with covid, but I have been following the progress of the Committee.

This is important legislation. We spend so much of our lives online these days, yet there has never been an attempt to regulate the space, or for democratically elected Members to contribute towards its regulation. Clause 1 gives a general outline of what to expect in the Bill. I have no doubt that this legislation is required, but also that it will not get everything right, and that it will have to change over the years. We may see many more Bills of this nature in this place.

I have concerns that some clauses have been dropped, and I hope that there will be future opportunities to amend the Bill, not least with regard to how we educate and ensure that social media companies promote media literacy, so that information that is spread widely online is understood in its context—that it is not always correct or truthful. The Bill, I hope, will go some way towards ensuring that we can rely more on the internet, which should provide a safer space for all its users.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 3 stand part.

That schedules 1 and 2 be the First and Second schedules to the Bill.

Clause 4 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We do not oppose clauses 2, 3 or 4, or the intentions of schedules 1 and 2, and have not sought to amend them at this stage, but this is an important opportunity to place on record some of the Opposition’s concerns as the Bill proceeds.

The first important thing to note is the broadness in the drafting of all the definitions. A service has links to the UK if it has a significant number of users in the UK, if the UK users are a target market, or if

“there are reasonable grounds to believe there is a material risk of significant harm to individuals”

in the UK using the service. Thus, territorially, a very wide range of online services could be caught. The Government have estimated in their impact assessment that 25,100 platforms will be in scope of the new regime, which is perhaps a conservative estimate. The impact assessment also notes that approximately 180,000 platforms could potentially be considered in scope of the Bill.

The provisions on extraterritorial jurisdiction are, again, extremely broad and could lead to some international platforms seeking to block UK users in a way similar to that seen following the introduction of GDPR. Furthermore, as has been the case under GDPR, those potentially in scope through the extraterritorial provisions may vigorously resist attempts to assert jurisdiction.

Notably absent from schedule 1 is an attempt to include or define how the Bill and its definitions of services that are exempt may adapt to emerging future technologies. The Minister may consider that a matter for secondary legislation, but as he knows, the Opposition feel that the Bill already leaves too many important matters to be determined at a later stage via statutory instruments. Although it is good to see that the Bill has incorporated everyday internet behaviour such as a like or dislike button, as well as factoring in the use of emojis and symbols, it fails to consider how technologies such as artificial intelligence will sit within the framework as it stands.

It is quite right that there are exemptions for everyday user-to-user services such as email, SMS, and MMS services, and an all-important balance to strike between our fundamental right to privacy and keeping people safe online. That is where some difficult questions arise on platforms such as WhatsApp, which have end-to-end encryption embedded as a standard feature. Concerns have been raised about Meta’s plans to extend that feature to Instagram and Facebook Messenger.

The Opposition also have concerns about private messaging features more widely. Research from the Centre for Missing and Exploited Children highlighted the fact that a significant majority of online child abuse takes place in private messages. For example, 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Furthermore, recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people they have not met offline before. Nearly three quarters—74%—of cases in which children are contacted by someone they do not know take place initially by private message. We will address this issue further in new clause 20, but I wanted to highlight those exemptions early on, as they are relevant to schedule 1.

On a similar point, we remain concerned about how emerging online systems such as the metaverse have had no consideration in the Bill as it stands. Only last week, colleagues will have read about a researcher from SumOfUs, a non-profit organisation that seeks to limit the power of large corporations, who claimed that she experienced sexual assault by a stranger in Meta’s virtual reality space, Horizon Worlds. The organisation’s report said:

“About an hour into using the platform, a SumOfUs researcher was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see—all while another user in the room watched and passed around a vodka bottle.”

There is currently no clear indication of how these very real technologies will sit within the Bill more widely. Even more worryingly, there has been no consideration of how virtual reality systems such as Horizon Worlds, with clear user-to-user functions, fit within the exemptions in schedule 1. If we are to see exemptions for internal business services or services provided by public bodies, along with many others, as outlined in the schedule, we need to make sure that the exemptions are fit for purpose and in line with the rapidly evolving technology that is widely available overseas. Before long, I am sure that virtual reality spaces such as Horizon Worlds will become more and more commonplace in the UK too.

I hope that the Minister can reassure us all of his plans to ensure that the Bill is adequately future-proofed to cope with the rapid expansion of the online space. Although we do not formally oppose the provisions outlined in schedule 1, I hope that the Minister will see that there is much work to be done to ensure that the current exemptions remain applicable to future technologies too.

Turning to schedule 2, the draft Bill was hugely lacking in provisions to tackle pornographic content, so it is a welcome step that we now see some attempts to tackle the rate at which pornographic content is easily accessed by children across the country. As we all know, the draft Bill only covered pornography websites that allow user-generated content such as OnlyFans. I am pleased to see that commercial pornography sites have now been brought within scope. This positive step forward has been made possible thanks to the incredible efforts of campaigning groups, of which there are far too many to mention, and from some of which we took evidence. I pay tribute to them today. Over the years, it is thanks to their persistence that the Government have been forced to take notice and take action.

Once again—I hate to repeat myself—I urge the Minister to consider how far the current definitions in schedule 2 relating to regulated provider pornographic content will go in covering virtual technologies such as those I referred to earlier. We are seeing an increase in all types of pornographic and semi-pornographic content that draws on AI or virtual technology. An obvious example is DeepNude, the now thankfully defunct app that was making the rounds online in 2016. While available, the app used neural networks to remove clothing from images of women, making them look realistically nude. The ramifications of technology like this, and its potential to take over the pornographic content space, are essentially limitless.

I urge the Minister carefully to keep in mind the future of the online space as we proceed. More specifically, the regulation of pornographic content in the context of keeping children safe is an area where we can all surely get on board. The Opposition have no formal objection at this stage to the provisions outlined in schedule 2.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Thank you, Sir Roger, for chairing our sittings. It is a pleasure to be part of this Bill Committee. I have a couple of comments on clause 2 and more generally.

The Opposition spokesperson, the hon. Member for Pontypridd, made some points about making sure that we are future-proofing the Bill. There are some key issues where we need to make sure that we are not going backwards. That particularly includes private messaging. We need to make sure that the ability to use AI to find illegal content—child sexual abuse material, for example—in private messages is retained as it is currently, and that the Bill does not accidentally bar those very important safeguards from continuing. That is one area in which we need to be clear on the best way forward with the Bill.

Future-proofing is important—I absolutely agree that we need to ensure that the Bill either takes into account the metaverse and virtual reality or ensures that provisions can be amended in future to take into account the metaverse, virtual reality and any other emerging technologies that we do not know about and cannot even foresee today. I saw a meme online the other day that was somebody taking a selfie of themselves wearing a mask and it said, “Can you imagine if we had shown somebody this in 1995 and asked them what this was? They wouldn’t have had the faintest idea.” The internet changes so quickly that we need to ensure that the Bill is future-proofed, but we also need to make sure that it is today-proofed.

I still have concerns, which I raised on Second Reading, about whether the Bill adequately encompasses the online gaming world, where a huge number of children use the internet—and where they should use it—to interact with their friends in a safe way. A lot of online gaming is free from the bullying that can be seen in places such as WhatsApp, Snapchat and Instagram. We need to ensure that those safeguards are included for online gaming. Private messaging is a thing in a significant number of online games, but many people use oral communication—I am thinking of things such as Fortnite and Roblox, which is apparently a safe space, according to Roblox Corporation, but according to many researchers is a place where an awful lot of grooming takes place.

My other question for the Minister—I am not bothered if I do not get an answer today, as I would rather have a proper answer than the Minister try to come up with an answer right at this moment—is about what category the app store and the Google Play store fall into.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

On a point of order, Sir Roger. The livestream is not working. In the interest of transparency we should pause the Committee while it is fixed so that people can observe.

None Portrait The Chair
- Hansard -

I am reluctant to do that. It is a technical fault and it is clearly undesirable, but I do not think we can suspend the Committee for the sake of a technical problem. Every member of the public who wishes to express an interest in these proceedings is able to be present if they choose to do so. Although I understand the hon. Lady’s concern, we have to continue. We will get it fixed as soon as we can.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am sure we will discuss this topic a bit more as the Bill progresses.

I will make a few points on disinformation. The first is that, non-legislatively, the Government have a counter-disinformation unit, which sits within the Department for Digital, Culture, Media and Sport. It basically scans for disinformation incidents. For the past two years it has been primarily covid-focused, but in the last three or four months it has been primarily Russia/Ukraine-focused. When it identifies disinformation being spread on social media platforms, the unit works actively with the platforms to get it taken down. In the course of the Russia-Ukraine conflict, and as a result of the work of that unit, I have personally called in some of the platforms to complain about the stuff they have left up. I did not have a chance to make this point in the evidence session, but when the person from Twitter came to see us, I said that there was some content on Russian embassy Twitter accounts that, in my view, was blatant disinformation—denial of the atrocities that have been committed in Bucha. Twitter had allowed it to stay up, which I thought was wrong. Twitter often takes down such content, but in that example, wrongly and sadly, it did not. We are doing that work operationally.

Secondly, to the extent that disinformation can cause harm to an individual, which I suspect includes a lot of covid disinformation—drinking bleach is clearly not very good for people—that would fall under the terms of the legal but harmful provisions in the Bill.

Thirdly, when it comes to state-sponsored disinformation of the kind that we know Russia engages in on an industrial scale via the St Petersburg Internet Research Agency and elsewhere, the Home Office has introduced the National Security Bill—in fact, it had its Second Reading yesterday afternoon, when some of us were slightly distracted. One of the provisions in that Bill is a foreign interference offence. It is worth reading, because it is very widely drawn and it criminalises foreign interference, which includes disinformation. I suggest the Committee has a look at the foreign interference offence in the National Security Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful for the Minister’s intervention in bringing in the platforms to discuss disinformation put out by hostile nation states. Does he accept that if Russia Today had put out some of that disinformation, the platforms would be unable to take such content down as a result of the journalistic exemption in the Bill?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We will no doubt discuss in due course clauses 15 and 50, which are the two that I think the shadow Minister alludes to. If a platform is exempt from the duties of the Bill owing to its qualification as a recognised news publisher under clause 50, it removes the obligation to act under the Bill, but it does not prevent action. Social media platforms can still choose to act. Also, it is not a totally straightforward matter to qualify as a regulated news publisher under clause 50. We saw the effect of sanctions: when Russia Today was sanctioned, it was removed from many platforms as a result of the sanctioning process. There are measures outside the Bill, such as sanctions, that can help to address the shocking disinformation that Russia Today was pumping out.

The last point I want to pick up on was rightly raised by my right hon. Friend the Member for Basingstoke and the hon. Member for Aberdeen North. It concerns child sexual exploitation and abuse images, and particularly the ability of platforms to scan for those. Many images are detected as a result of scanning messages, and many paedophiles or potential paedophiles are arrested as a result of that scanning. We saw a terrible situation a little while ago, when—for a limited period, owing to a misconception of privacy laws—Meta, or Facebook, temporarily suspended scanning in the European Union; as a result, loads of images that would otherwise have been intercepted were not.

I agree with the hon. Member for Aberdeen North that privacy concerns, including end-to-end encryption, should not trump the ability of organisations to scan for child sexual exploitation and abuse images. Speaking as a parent—I know she is, too—there is, frankly, nothing more important than protecting children from sexual exploitation and abuse. Some provisions in clause 103 speak to this point, and I am sure we will debate those in more detail when we come to that clause. I mention clause 103 to put down a marker as the place to go for the issue being raised. I trust that I have responded to the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 2 accordingly ordered to stand part of the Bill.

Clause 3 ordered to stand part of the Bill.

Schedules 1 and 2 agreed to.

Clause 4 ordered to stand part of the Bill.

None Portrait The Chair
- Hansard -

Before we move on, we have raised the issue of the live feed. The audio will be online later today. There is a problem with the feed—it is reaching the broadcasters, but it is not being broadcast at the moment.

As we are not certain we can sort out the technicalities between now and this afternoon, the Committee will move to Committee Room 9 for this afternoon’s sitting to ensure that the live stream is available. Mr Double, if Mr Russell intends to be present—he may not; that is up to you—it would be helpful if you would let him know. Ms Blackman, if John Nicolson intends to be present this afternoon, would you please tell him that Committee Room 9 will be used?

It would normally be possible to leave papers and other bits and pieces in the room, because it is usually locked between the morning and afternoon sittings. Clearly, because we are moving rooms, you will all need to take your papers and laptops with you.

Clause 5

Overview of Part 3

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I want to just put it on the record that the irony is not lost on me that we are having tech issues relating to the discussion of the Online Safety Bill. The Opposition have huge concerns regarding clause 5. We share the frustrations of stakeholders who have been working on these important issues for many years and who feel the Bill has been drafted in an overly complex way. In its evidence, the Carnegie UK Trust outlined its concerns over the complexity of the Bill, which is likely to lead to ineffective regulation for both service users and companies. While the Minister is fortunate to have a team of civil servants behind him, he will know that the Opposition sadly do not share the same level of resources—although I would like to place on the record my sincere thanks to my researcher, Freddie Cook, who is an army of one all by herself. Without her support, I would genuinely not know where I was today.

Complexity is an issue that crops up time and again when speaking with charities, stakeholders and civil society. We all recognise that the Bill will have a huge impact however it passes, but the complexity of its drafting is a huge barrier to implementation. The same can be said for the regulation. A Bill as complex as this is likely to lead to ineffective regulation for both service users and companies, who, for the first time, will be subject to specific requirements placed on them by the regulator. That being said, we absolutely support steps to ensure that providers of regulated user-to-user services and regulated search services have to abide by a duty of care regime, which will also see the regulator able to issue codes of practice.

I would also like to place on record my gratitude—lots of gratitude today—to Professor Lorna Woods and Will Perrin, who we heard from in evidence sessions last week. Alongside many others, they have been and continue to be an incredible source of knowledge and guidance for my team and for me as we seek to unpick the detail of this overly complex Bill. Colleagues will also be aware that Professor Woods and Mr Perrin originally developed the idea of a duty of care a few years ago now; their model was based on the idea that social media providers should be,

“seen as responsible for public space they have created, much as property owners or operators are in a physical world.”

It will come as no surprise to the Minister that Opposition Members fully stand behind that definition and firmly believe that forcing platforms to identify and act on harms that present a reasonable risk to users is a positive step forward.

More broadly, we welcome moves by the Government to include specific duties on providers of services likely to be accessed by children, although I have some concerns about just how far they will stretch. Similarly, although I am sure we will come to address those matters in the debates that follow, we welcome steps to require Ofcom to issue codes of practice, but have fundamental concerns about how effective they will be if Ofcom is not allowed to remain fully independent and free from Government influence.

Lastly, on subsection (7), I imagine our debate on chapter 7 will be a key focus for Members. I know attempts to define key terms such as “priority content” will be a challenge for the Minister and his officials, but we remain concerned that there are important omissions, which we will come to later. It is vital that those key terms are broad enough to encapsulate all the harms that we face online. Ultimately, what is illegal offline must be approached in the same way online if the Bill is to have any meaningful positive impact, which is ultimately what we all want.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to make a couple of brief comments. Unfortunately, my hon. Friend the Member for Ochil and South Perthshire is not here as, ironically, he is at the DCMS committee taking evidence on the Online Safety Bill. That is a pretty unfortunate clash of timing, but that is why I am here solo for the morning.

I wanted to make a quick comment on subsection (7). The Minister will have heard the evidence given on schedule 7 and the fact that the other schedules—particularly schedule 6—have Scotland-specific sections detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which talks about the Scottish and Northern Irish legislation that may be different from England and Wales legislation, but will he give me some comfort that he does intend Scottish-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment on how to add them and a commitment that they will be added if necessary and if he feels that that will add something to the Bill. If he could commit that that will happen, I would appreciate that—obviously, in discussion with Scottish Ministers if amendment 126 is agreed. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 5 simply provides an overview of part 3 of the Bill. Several good points have been raised in the course of this discussion. I will defer replying to the substance of a number of them until we come to the relevant clause, but I will address two or three of them now.

The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.

The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.

My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.

There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Hear, hear.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We are in agreement on that. I can confirm that the Government have tabled amendments 116 to 126—the Committee will consider them in due course—to place equivalent Scottish offences, which the hon. Member for Aberdeen North asked about, in the Bill. We have done that in close consultation with the Scottish Government to ensure that the relevant Scottish offences equivalent to the England and Wales offences are inserted into the Bill. If the Scottish Parliament creates any new Scottish offences that should be inserted into the legislation, that can be done under schedule 7 by way of statutory instrument. I hope that answers the question.

The other question to which I will briefly reply was about parliamentary scrutiny. The Bill already contains a standard mechanism that provides for the Bill to be reviewed after a two to five-year period. That provision appears at the end of the Bill, as we would expect. Of course, there are the usual parliamentary mechanisms—Backbench Business debates, Westminster Hall debates and so on—as well as the DCMS Committee.

I heard the points about a standing Joint Committee. Obviously, I am mindful of the excellent prelegislative scrutiny work done by the previous Joint Committee of the Commons and the Lords. Equally, I am mindful that standing Joint Committees, outside the regular Select Committee structure, are unusual. The only two that spring immediately to mind are the Intelligence and Security Committee, which is established by statute, and the Joint Committee on Human Rights, chaired by the right hon. and learned Member for Camberwell and Peckham (Ms Harman), which is established by Standing Orders of the House. I am afraid I am not in a position to make a definitive statement about the Government’s position on this. It is of course always open to the House to regulate its own business. There is nothing I can say today from a Government point of view, but I know that hon. Members’ points have been heard by my colleagues in Government.

We have gone somewhat beyond the scope of clause 5. You have been extremely generous, Sir Roger, in allowing me to respond to such a wide range of points. I commend clause 5 to the Committee.

Question put and agreed to.

Clause 5 accordingly ordered to stand part of the Bill.

Clause 6

Providers of user-to-user services: duties of care

None Portrait The Chair
- Hansard -

Before we proceed, perhaps this is the moment to explain what should happen and what is probably going to happen. Ordinarily, a clause is taken with amendments. This Chairman takes a fairly relaxed view of stand part debates. Sometimes it is convenient to have a very broad-ranging debate on the first group of amendments because it covers matters relating to the whole clause. The Chairman would then normally say, “Well, you’ve already had your stand part debate, so I’m not going to allow a further stand part debate.” It is up to hon. Members to decide whether to confine themselves to the amendment under discussion and then have a further stand part debate, or whether to go free range, in which case the Chairman would almost certainly say, “You can’t have a stand part debate as well. You can’t have two bites of the cherry.”

This is slightly more complex. It is a very complex Bill, and I think I am right in saying that it is the first time in my experience that we are taking other clause stand parts as part of the groups of amendments, because there is an enormous amount of crossover between the clauses. That will make it, for all of us, slightly harder to regulate. It is for that reason—the Minister was kind enough to say that I was reasonably generous in allowing a broad-ranging debate—that I think we are going to have to do that with this group.

I, and I am sure Ms Rees, will not wish to be draconian in seeking to call Members to order if you stray slightly outside the boundaries of a particular amendment. However, we have to get on with this, so please try not to be repetitive if you can possibly avoid it, although I accept that there may well be some cases where it is necessary.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 69, in clause 6, page 5, line 39, at end insert—

‘(6A) All providers of regulated user-to-user services must name an individual whom the provider considers to be a senior manager of the provider, who is designated as the provider’s illegal content safety controller, and who is responsible for the provider’s compliance with the following duties—

(a) the duties about illegal content risk assessments set out in section 8,

(b) the duties about illegal content set out in section 9.

(6B) An individual is a “senior manager” of a provider if the individual plays a significant role in—

(a) the making of decisions about how the provider’s relevant activities are to be managed or organised, or

(b) the actual managing or organising of the provider’s relevant activities.

(6C) A provider’s “relevant activities” are activities relating to the provider’s compliance with the duties of care imposed by this Act.

(6D) The Safety Controller commits an offence if the provider fails to comply with the duties set out in sections 8 and 9 which must be complied with by the provider.”

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 70, in clause 96, page 83, line 7, after “section” insert “6(6D),”.

This is one of those cases where the amendment relates to a later clause. While that clause may be debated now, it will not be voted on now. If amendment 69 is negatived, amendment 70 will automatically fall later. I hope that is clear, but it will be clearer when we get to amendment 70. Having confused the issue totally, without further ado, I call Ms Davies-Jones.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

With your permission, Sir Roger, I would like to discuss clause 6 and our amendments 69 and 70, and then I will come back to discuss clauses 7, 21 and 22.

Chapter 2 includes a number of welcome improvements on the draft Bill that the Opposition support. It is only right that, when it comes to addressing illegal content, all platforms, regardless of size or reach, will now be required to develop suitable and sufficient risk assessments that must be renewed before any design change is applied. Those risk assessments must be linked to safety duties, which Labour has long called for.

It was a huge oversight that, until this point, platforms have not had to perform risk assessments of that nature. During our oral evidence sessions only a few weeks ago, we heard extensive evidence about the range of harms that people face online. Yet the success of the regulatory framework relies on regulated companies carefully assessing the risk posed by their platforms and subsequently developing and implementing appropriate mitigations. Crucial to that, as we will come to later, is transparency. Platforms must be compelled to publish the risk assessments, but in the current version of the Bill, only the regulator will have access to them. Although we welcome the fact that the regulator will have the power to ensure that the risk assessments are of sufficient quality, there remain huge gaps, which I will come on to.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the hon. Lady and for SNP support for amendment 69.

The Bill introduces criminal liability for senior managers who fail to comply with information notice provisions, but not for actual failure to fulfil their statutory duties with regard to safety, including child safety, and yet such failures lead to the most seriously harmful outcomes. Legislation should focus the minds of those in leadership positions in services that operate online platforms.

A robust corporate and senior management liability scheme is needed to impose personal liability on directors whose actions consistently and significantly put children at risk. The Bill must learn lessons from other regulated sectors, principally financial services, where regulation imposes specific duties on directors and senior management of financial institutions. Those responsible individuals face regulatory enforcement if they act in breach of such duties. Are we really saying that the financial services sector is more important than child safety online?

The Government rejected the Joint Committee’s recommendation that each company appoint a safety controller at, or reporting to, board level. As a result, there is no direct relationship in the Bill between senior management liability and the discharge by a platform of its safety duties. Under the Bill as drafted, a platform could be wholly negligent in its approach to child safety and put children at significant risk of exposure to illegal activity, but as long as the senior manager co-operated with the regulator’s investigation, senior managers would not be held personally liable. That is a disgrace.

The Joint Committee on the draft Bill recommended that

“a senior manager at board level or reporting to the board should be designated the ‘Safety Controller’ and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users. We believe that this would be a proportionate last resort for the Regulator. Like any offence, it should only be initiated and provable at the end of an exhaustive legal process.”

Amendment 69 would make provision for regulated companies to appoint an illegal content safety controller, who has responsibility and accountability for protecting children from illegal content and activity. We believe this measure would drive a more effective culture of online safety awareness within regulated firms by making senior management accountable for harms caused through their platforms and embedding safety within governance structures. The amendment would require consequential amendments setting out the nature of the offences for which the safety controller may be liable and the penalties associated with them.

In financial services regulation, the Financial Conduct Authority uses a range of personal accountability regimes to deter individuals who may exhibit unwanted and harmful behaviour and as mechanisms for bringing about cultural change. The senior managers and certification regime is an overarching framework for all staff in the financial services industry. It aims to

“encourage a culture of staff at all levels taking personal responsibility for their actions”,

and to

“make sure firms and staff clearly understand and can demonstrate where responsibility lies.”

Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

One of the challenges for this legislation will be the way it is enforced. Have my hon. Friend and her Front-Bench colleagues given consideration to the costs of the funding that Ofcom and the regulatory services may need?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

That is a huge concern for us. As was brought up in our evidence sessions with Ofcom, it is recruiting, effectively, a fundraising officer for the regulator. That throws into question the potential longevity of the regulator’s funding and whether it is resourced effectively to properly scrutinise and regulate the online platforms. If that long-term resource is not available, how can the regulator effectively scrutinise and bring enforcement to bear against companies for enabling illegal activity?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Just to reassure the shadow Minister and her hon. Friend the Member for Liverpool, Walton, the Bill confers powers on Ofcom to levy fees and charges on the sector that it is regulating—so, on social media firms—to recoup its costs. We will debate that in due course—I think it is in clause 71, but that power is in the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the Minister for that clarification and I look forward to debating that further as the Bill progresses.

Returning to the senior managers and certification regime in the financial services industry, under that regime senior managers must be pre-approved by the regulator, must have their responsibilities set out in a statement of responsibilities and are subject to enhanced conduct standards. Those in banks are also subject to regulatory requirements on their remuneration. Again, it baffles me that we are not asking the same of online platforms and companies when it comes to child safety.

The money laundering regulations also use the threat of criminal offences to drive culture change. Individuals can be culpable for failure of processes, as well as for intent. I therefore hope that the Minister will carefully consider the need for the same to apply to our online space to make children safe.

Amendment 70 is a technical amendment that we will be discussing later on in the Bill. However, I am happy to move it in the name of the official Opposition.

None Portrait The Chair
- Hansard -

The Committee will note that, at the moment, the hon. Lady is not moving amendment 70; she is only moving amendment 69. So the Question is, That that amendment be made.

--- Later in debate ---
None Portrait The Chair
- Hansard -

No, there are two groups. Let me clarify this for everyone, because it is not as straightforward as it normally is. At the moment we are dealing with amendments 69 and 70. The next grouping, underneath this one on your selection paper, is the clause stand part debates—which is peculiar, as effectively we are having the stand part debate on clause 6 now. For the convenience of the Committee, and if the shadow Minister is happy, I am relaxed about taking all this together.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am happy to come back in and discuss clauses 7, 21 and 22 stand part afterwards.

None Portrait The Chair
- Hansard -

The hon. Lady can be called again. The Minister is not winding up at this point.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 7 stand part.

Clauses 21 and 22 stand part.

My view is that the stand part debate on clause 6 has effectively already been had, but I will not be too heavy-handed about that at the moment.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

On clause 7, as I have previously mentioned, we were all pleased to see the Government bring in more provisions to tackle pornographic content online, much of which is easily accessible and can cause harm to those viewing it and potentially to those involved in it.

As we have previously outlined, a statutory duty of care for social platforms online has been missing for far too long, but we made it clear on Second Reading that such a duty will only be effective if we consider the systems, business models and design choices behind how platforms operate. For too long, platforms have been abuse-enabling environments, but it does not have to be this way. The amendments that we will shortly consider are largely focused on transparency, as we all know that the duties of care will only be effective if platforms are compelled to proactively supply their assessments to Ofcom.

On clause 21, the duty of care approach is one that the Opposition support and it is fundamentally right that search services are subject to duties including illegal content risk assessments, illegal content assessments more widely, content reporting, complaints procedures, duties about freedom of expression and privacy, and duties around record keeping. Labour has long held the view that search services, while not direct hosts of potentially damaging content, should have responsibilities that see them put a duty of care towards users first, as we heard in our evidence sessions from HOPE not hate and the Antisemitism Policy Trust.

It is also welcome that the Government have committed to introducing specific measures for regulated search services that are likely to be accessed by children. However, those measures can and must go further, so we will be putting forward some important amendments as we proceed.

Labour does not oppose clause 22, either, but I would like to raise some important points with the Minister. We do not want to be in a position whereby those designing, operating and using a search engine in the United Kingdom are subject to a second-rate internet experience. We also do not want to be in a position where we are forcing search services to choose what is an appropriate design for people in the UK. It would be worrying indeed if our online experience vastly differed from that of, let us say, our friends in the European Union. How exactly will clause 22 ensure parity? I would be grateful if the Minister could confirm that before we proceed.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister has already touched on the effect of these clauses: clause 6 sets out duties applying to user-to-user services in a proportionate and risk-based way; clause 7 sets out the scope of the various duties of care; and clauses 21 and 22 do the same in relation to search services.

In response to the point about whether the duties on search will end up providing a second-rate service in the United Kingdom, I do not think that they will. The duties have been designed to be proportionate and reasonable. Throughout the Bill, Members will see that there are separate duties for search and for user-to-user services. That is reflected in the symmetry—which appears elsewhere, too—of clauses 6 and 7, and clauses 21 and 22. We have done that because we recognise that search is different. It indexes the internet; it does not provide a user-to-user service. We have tried to structure these duties in a way that is reasonable and proportionate, and that will not adversely impair the experience of people in the UK.

I believe that we are ahead of the European Union in bringing forward this legislation and debating it in detail, but the European Union is working on its Digital Services Act. I am confident that there will be no disadvantage to people conducting searches in United Kingdom territory.

Question put and agreed to.

Clause 6 accordingly ordered to stand part of the Bill.

Clause 7 ordered to stand part of the Bill.

Clause 8

Illegal content risk assessment duties

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 10, in clause 8, page 6, line 33, at end insert—

“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 14, in clause 8, page 6, line 33, at end insert—

“(4A) A duty for the illegal content risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the illegal content risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.

Amendment 25, in clause 8, page 7, line 3, after the third “the” insert “production,”.

This amendment requires the risk assessment to take into account the risk of the production of illegal content, as well as the risk of its presence and dissemination.

Amendment 19, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Clause stand part.

Amendment 20, in clause 9, page 7, line 30, at end insert

“, including by being directed while on the service towards priority illegal content hosted by a different service;”.

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 26, in clause 9, page 7, line 30, at end insert—

“(aa) prevent the production of illegal content by means of the service;”.

This amendment incorporates a requirement to prevent the production of illegal content within the safety duties.

Amendment 18, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 21, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content,”.

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Clause 9 stand part.

Amendment 30, in clause 23, page 23, line 24, after “facilitating” insert

“the production of illegal content and”.

This amendment requires the illegal content risk assessment to consider the production of illegal content.

Clause 23 stand part.

Amendment 31, in clause 24, page 24, line 2, after “individuals” insert “producing or”.

This amendment expands the safety duty to include the need to minimise the risk of individuals producing certain types of search content.

Clause 24 stand part.

Members will note that amendments 17 and 28 form part of a separate group. I hope that is clear.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

At this stage, I will speak to clause 8 and our amendments 10, 14, 25, 19 and 17.

None Portrait The Chair
- Hansard -

Order. This is confusing. The hon. Lady said “and 17”. Amendment 17 is part of the next group of amendments.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Apologies, Sir Roger; I will speak to amendments 10, 14, 25 and 19.

None Portrait The Chair
- Hansard -

It’s all right, we’ll get there.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Opposition welcome the moves to ensure that all user-to-user services are compelled to provide risk assessments in relation to illegal content, but there are gaps, ranging from breadcrumbing to provisions covering the production and livestreaming of otherwise illegal content.

Labour is extremely concerned by the lack of transparency around the all-important illegal content risk assessments, which is why we have tabled amendment 10. The effectiveness of the entire Bill is undermined unless the Government commit to a more transparent approach more widely. As we all know, under the Bill as drafted, the vital risk assessments will only be made available to the regulator, rather than for public scrutiny. There is a real risk—for want of a better word—in that approach, as companies could easily play down or undermine the risks. They could see the provision of the risk assessments to Ofcom as a simple, tick-box exercise to satisfy the requirements placed on them, rather than using those important assessments as an opportunity truly to assess the likelihood of current and emerging risks.

As my hon. Friend the Member for Worsley and Eccles South will touch on in her later remarks, the current approach runs the risk of allowing businesses to shield themselves from true transparency. The Minister knows that this is a major issue, and that until service providers and platforms are legally compelled to provide data, we will be shielded from the truth, because there is no statutory requirement for them to be transparent. That is fundamentally wrong and should not be allowed to continue. If the Government are serious about their commitment to transparency, and to the protection of adults and children online, they should make this small concession and see it as a positive step forward.

Amendment 14 would ensure that regulated companies' boards or senior staff have appropriate oversight of illegal content risk assessments. An obligation on boards or senior managers to approve risk assessments would hardwire the safety duties and create a culture of compliance in the regulated firms. The success of the regulatory framework relies on regulated companies carefully risk assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations.

To date, boards and top executives of the regulated companies have not taken the risks to children seriously enough. Platforms either have not considered producing risk assessments or, where they have done so, the assessments have been of limited efficacy and have demonstrably failed to adequately identify and respond to harms to children. Need I remind the Minister that the Joint Committee on the draft Bill recommended that risk assessments should be approved at board level?

Introducing a requirement on regulated companies to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making, and create accountability and responsibility at the most senior level of the organisation. That will trickle down the organisation and help embed a culture of compliance across the company. We need to see safety online as a key focus for these platforms, and putting the onus on senior managers to take responsibility is a positive step forward in that battle.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 8 sets out the risk assessment duties for illegal content, as already discussed, that apply to user-to-user services. Ofcom will issue guidance on how companies can undertake those assessments. To comply with those duties, companies will need to take proportionate measures to mitigate the risks identified in those assessments. The clause lists a number of potential risk factors that providers must assess, including how likely it is that users will encounter illegal content, as defined later in the Bill,

“by means of the service”.

That phrase is quite important, and I will come to it later when discussing some of the amendments, because it does not necessarily mean just on the service itself; as a cross-platform point, it also covers other sites where users might find themselves via the service. That phrase is important in the context of some of the reasonable queries about cross-platform risks.

Moving on, companies will also need to consider how the design and operation of their service may reduce or increase the risks identified. Under schedule 3, which we will vote on, or at least consider, later on, companies will have three months to carry out risk assessments, which must be kept up to date so that fresh risks that may arise from time to time can be accommodated. Therefore, if changes are made to the service, the risks can be considered on an ongoing basis.

Amendment 10 relates to the broader question that the hon. Member for Liverpool, Walton posed about transparency. The Bill already contains obligations to publish summary risk assessments on legal but harmful content. That refers to some of the potentially contentious or ambiguous types of content for which public risk assessments would be helpful. The companies are also required to make those risk assessments available to Ofcom on request. That raises a couple of questions, as the hon. Member for Liverpool, Walton mentioned and as some of the amendments highlight. Should companies be required to proactively serve up their risk assessments to Ofcom, rather than wait to be asked? Also, should those risk assessments all be published—probably online?

In considering those two questions, there are a couple of things to think about. The first is Ofcom’s capacity. As we have discussed, 25,000 services are in scope. If all those services proactively delivered a copy of their risk assessment, even if they are very low risk and of no concern to Ofcom or, indeed, any of us, they would be in danger of overwhelming Ofcom. The approach contemplated in the Bill is that, where Ofcom has a concern or the platform is risk assessed as being significant—to be clear, that would apply to all the big platforms—it will proactively make a request, which the platform will be duty bound to meet. If the platform does not do that, the senior manager liability and the two years in prison that we discussed earlier will apply.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister mentioned earlier that Ofcom would be adequately resourced and funded to cope with the regulatory duty set out in the Bill. If Ofcom is not able to receive risk assessments for all the platforms potentially within scope, even if those platforms are not deemed to be high risk, does that not call into question whether Ofcom has the resource needed to actively carry out its duties in relation to the Bill?

Online Safety Bill (Sixth sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 7th June 2022

(1 year, 10 months ago)

Public Bill Committees
Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a great pleasure to serve under your chairmanship, Ms Rees, and I am glad that this afternoon’s Committee proceedings are being broadcast to the world.

Before we adjourned this morning, I was in the process of saying that one of the challenges with publishing the full risk assessment, even for larger companies, is that the vulnerabilities in their systems, or the potential opportunities to exploit those systems for criminal purposes, would then be publicly exposed in a way that may not serve the public interest. That is a reason for not requiring complete disclosure of everything.

However, I draw the Committee’s attention to the existing transparency provisions in clause 64. We will come on to them later, but I want to mention them now, given that they are relevant to amendment 10. The transparency duties state that, once a year, Ofcom must serve notice on the larger companies—those in categories 1, 2A and 2B—requiring them to produce a transparency report. That is not a power for Ofcom—it is a requirement. Clause 64(1) states that Ofcom

“must give every provider…a notice which requires the provider to produce…(a ‘transparency report’).”

The content of the transparency report is specified by Ofcom, as set out in subsection (3). As Members will see, Ofcom has wide powers to specify what must be included in the report. On page 186, schedule 8—I know that we will debate it later, but it is relevant to the amendment—sets out the scope of what Ofcom can require. It is an extremely long list that covers everything we would wish to see. Paragraph 1, for instance, states:

“The incidence of illegal content, content that is harmful to children and priority content that is harmful to adults on a service.”

Therefore, the transparency reporting requirement—it is not an option but a requirement—in clause 64 addresses the transparency point that was raised earlier.

Amendment 14 would require a provider’s board members or senior manager to take responsibility for the illegal content risk assessment. We agree with the Opposition’s point. Indeed, we agree with what the Opposition are trying to achieve in a lot of their amendments.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There is a “but” coming. We think that, in all cases apart from one, the Bill as drafted already addresses the matter. In the case of amendment 14, the risk assessment duties as drafted already explicitly require companies to consider how their governance structures may affect the risk of harm to users arising from illegal content. Ofcom will provide guidance to companies about how they can comply with those duties, which is very likely to include measures relating to senior-level engagement. In addition, Ofcom can issue confirmation decisions requiring companies to take specific steps to come into compliance. To put that simply, if Ofcom thinks that there is inadequate engagement by senior managers in relation to the risk assessment duties, it can require—it has the power to compel—a change of behaviour by the company.

I come now to clause 9—I think this group includes clause 9 stand part as well. The shadow Minister has touched on this. Clause 9 contains safety duties in relation to—

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady raises an interesting point about time. However, clause 8(5)(d) uses the wording,

“the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content”

and so on. That presence can happen at any time, even fleetingly, as with Snapchat. Even when the image self-deletes after a certain period—so I am told, I have not actually used Snapchat—the presence has occurred. Therefore, that would be covered by clause 8(5)(d).

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Will the Minister explain how we would be able to prove, once the image is deleted, that it was present on the platform?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The question of proof is a separate one, and it would apply however we drafted the clause. The point is that any presence of a prohibited image would fall foul of the clause. There are also duties on the platforms to take reasonable steps. In the case of matters such as child sexual exploitation and abuse images, there are extra-onerous duties, which we have discussed before, for obvious and quite correct reasons.

None Portrait The Chair
- Hansard -

Order. Minister, before you continue: before the Committee rose earlier today, there was a conversation about clause 9 being in, and then I was told it was out. This is like the hokey cokey; it is back in again, just to confuse matters further. I was confused enough, so that point needs to be clarified.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is grouped, Chair. We were discussing clause 8 and the relevant amendments, then we were going to come back to clause 9 and the relevant amendments.

None Portrait The Chair
- Hansard -

Is that as clear as mud?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Obviously, I encourage the Committee to support those clauses standing part of the Bill. They impose duties on search services—we touched on search a moment ago—to assess the nature and risk to individuals of accessing illegal content via their services, and to minimise the risk of users encountering that illegal content. They are very similar duties to those we discussed for user-to-user services, but applied in the search context. I hope that that addresses all the relevant provisions in the group that we are debating.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful for the opportunity to speak to amendments to clause 9 and to clauses 23 and 24, which I did not speak on earlier. I am also very grateful that we are being broadcast live to the world and welcome that transparency for all who might be listening.

On clause 9, it is right that the user-to-user services will be required to have specific duties and to take appropriate measures to mitigate and manage the risk of harm to individuals and their likelihood of encountering priority illegal content. Again, however, the Bill does not go far enough, which is why we are seeking to make these important amendments. On amendment 18, it is important to stress that the current scope of the Bill does not capture the range of ways in which child abusers use social networks to organise abuse, including to form offender networks. They post digital breadcrumbs that signpost to illegal content on third-party messaging apps and the dark web, and they share child abuse videos that are carefully edited to fall within content moderation guidelines. This range of techniques, known as child abuse breadcrumbing, is a significant enabler of online child abuse.

Our amendment would give the regulator powers to tackle breadcrumbing and ensure a proactive upstream response. It would ensure that tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material are brought into regulatory scope, leaving no ambiguity. The amendment would also ensure that companies must tackle child abuse at the earliest possible stage. As it stands, the Bill would reinforce companies' current focus only on material that explicitly reaches the criminal threshold. Because companies do not focus their approach on other child abuse material, abusers can exploit this knowledge to post carefully edited child abuse images and content that enables them to connect and form networks with other abusers. Offenders understand and can anticipate that breadcrumbing material will not be proactively identified or removed by the host site, so they are able to organise and link to child abuse in plain sight.

We all know that child abuse breadcrumbing takes many forms, but techniques include tribute sites where users create social media profiles using misappropriated identities of known child abuse survivors. These are used by offenders to connect with likeminded perpetrators to exchange contact information, form offender networks and signpost child abuse material elsewhere online. In the first quarter of 2021, there were 6 million interactions with such accounts.

Abusers may also use Facebook groups to build offender groups and signpost to child abuse hosted on third-party sites. Those groups are thinly veiled in their intentions; for example, as we heard in evidence sessions, groups are formed for those with an interest in children celebrating their 8th, 9th or 10th birthdays. Several groups with over 50,000 members remained alive despite being reported to Meta, and algorithmic recommendations quickly suggested additional groups for those members to join.

Lastly, abusers can signpost to content on third-party sites. Abusers are increasingly using novel forms of technology to signpost to online child abuse, including QR codes, immersive technologies such as the metaverse, and links to child abuse hosted on the blockchain. Given the highly agile nature of the child abuse threat and the demonstrable ability of sophisticated offenders to exploit new forms of technology, this amendment will ensure that the legislation is effectively futureproofed. Technological change makes it increasingly important that the ability of child abusers to connect and form offender networks can be disrupted at the earliest possible stage.

Turning to amendment 21, we know that child abuse is rarely siloed on a single platform or app. Well-established grooming pathways see abusers exploit the design features of social networks to contact children before they move communication across to other platforms, including livestreaming sites, as we have already heard, and encrypted messaging services. Offenders manipulate features such as Facebook's algorithmic friend suggestions to make initial contact with a large number of children. They can then use direct messages to groom children and coerce them into sending sexual images via WhatsApp. Similarly, as we heard earlier, abusers can groom children by playing video games with them and then moving them on to another ancillary platform, such as Discord.

The National Society for the Prevention of Cruelty to Children has shared details of an individual whose name has been changed, and whose case particularly highlights the problems that children are facing in the online space. Ben was 14 when he was tricked on Facebook into thinking he was speaking to a female friend of a friend, who turned out to be a man. Using threats and blackmail, he coerced Ben into sending abuse images and performing sex acts live on Skype. Those images and videos were shared with five other men, who then bombarded Ben with further demands. His mum, Rachel, said:

“The abuse Ben suffered had a devastating impact on our family. It lasted two long years, leaving him suicidal.

It should not be so easy for an adult to meet and groom a child on one site then trick them into livestreaming their own abuse on another app, before sharing the images with like-minded criminals at the click of a button.

Social media sites should have to work together to stop this abuse happening in the first place, so other children do not have to go through what Ben did.”

The current drafting of the Bill does not place sufficiently clear obligations on platforms to co-operate on the cross-platform nature of child abuse. Amendment 21 would require companies to take reasonable and proportionate steps to share threat assessments, develop proportionate mechanisms to share offender intelligence, and create a rapid response arrangement to ensure that platforms develop a coherent, systemic approach to new and emerging threats. Although the industry has developed a systemic response to the removal of known child abuse images, arrangements for sharing information on highly agile risk profiles remain largely ad hoc. The cross-platform nature of grooming and the interplay of harms across multiple services need to be taken into account. If this is not addressed explicitly in the Bill, we are concerned that companies may be able to cite competition concerns to avoid taking action.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On the topic of child abuse images, the hon. Member spoke earlier about livestreaming and those images not being captured. I assume that she would make the same point in relation to this issue: live images may not be picked up by the AI tools that scan for such content, so it is really important that they are included in the Bill in some way as well.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree with the hon. Member, and appreciate her intervention. It is fundamental for this point to be captured in the Bill because, as we are seeing, this is happening more and more. More and more victims are coming forward who have been subject to livestreaming that is not picked up by the technology available, and is then recorded and posted elsewhere on smaller platforms.

Legal advice suggests that cross-platform co-operation is likely to be significantly impeded by the negative interplay with competition law unless there is a clear statutory basis for enabling or requiring collaboration. Companies may legitimately have different risk and compliance appetites, or may simply choose to hide behind competition law to avoid taking a more robust form of action.

New and emerging technologies are likely to produce an intensification of cross-platform risks in the years ahead, and we are particularly concerned about the child abuse impacts in immersive virtual reality and alternative-reality environments, including the metaverse. A number of high-risk immersive products are already designed to be platform-agnostic, meaning that in-product communication takes place between users across multiple products and environments. There is a growing expectation that these environments will be built along such lines, with an incentive for companies to design products in this way in the hope of blunting the ability of Governments to pursue user safety objectives.

Separately, regulatory measures that are being developed in the EU, but are highly likely to impact service users in the UK, could result in significant unintended safety consequences. Although the interoperability provisions in the Digital Markets Act are strongly beneficial when viewed through a competition lens—they will allow competition between, and communication across, multiple platforms—they could, without appropriate safety mitigations, provide new means for abusers to contact children across multiple platforms, significantly increase the overall profile of cross-platform risk, and actively frustrate a broad number of current online safety responses. Amendment 21 would provide corresponding safety requirements that can mitigate the otherwise significant potential for unintended consequences.

The Minister referred to clauses 23 and 24 in relation to amendments 30 and 31. We think a similar consideration should apply for search services as well as for user-to-user services. We urge the Minister to accept the amendments, in order to prevent those harms from occurring.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have already commented on most of those amendments, but one point that the shadow Minister made that I have not addressed was about acts that are essentially preparatory to acts of child abuse or the exchange of child sexual exploitation and abuse images. She was quite right to raise that issue as a matter of serious concern that we would expect the Bill to prevent, and I offer the Committee the reassurance that the Bill, as drafted, does so.

Schedule 6 sets out the various forms of child sexual exploitation and abuse that are designated as priority offences and that platforms have to take proactive steps to prevent. On the cross-platform point, that includes, as we have discussed, things that happen through a service as well as on a service. Critically, paragraph 9 of schedule 6 includes “inchoate offences”, which means someone not just committing the offence but engaging in acts that are preparatory to committing the offence, conspiring to commit the offence, or procuring, aiding or abetting the commission of the offence. The preparatory activities that the shadow Minister referred to are covered under schedule 6, particularly paragraph 9.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank the Minister for giving way. I notice that schedule 6 includes provision on the possession of indecent photographs of children. Can he confirm that that provision encapsulates the livestreaming of sexual exploitation?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, I can.

Question put, That the amendment be made.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Amendments 20, 26, 18 and 21 to clause 9 have already been debated. Does the shadow Minister wish to press any of them to a vote?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Amendments 20, 18 and 21.

Amendment proposed: 20, in clause 9, page 7, line 30, at end insert

“, including by being directed while on the service towards priority illegal content hosted by a different service;”—(Alex Davies-Jones.)

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, and that is why governance is addressed in the clause as drafted. But the one thing that will really change the way the leadership of these companies thinks about this issue is the one thing they ultimately care about—money. The reason they allow unsafe content to circulate and do not rein in or temper their algorithms, and the reason we are in this situation, which has arisen over the last 10 years or so, is that these companies have consistently prioritised profit over protection. Ultimately, that is the only language they understand—it is that and legal compulsion.

While the Bill rightly addresses governance in clause 10 and in other clauses, as I have said a few times, what has to happen to make this change occur is the compulsion that is inherent in the powers to fine and to deny service—to pull the plug—that the Bill also contains. The thing that will give reassurance to our constituents, and to me as a parent, is knowing that for the first time ever these companies can properly be held to account. They can be fined. They can have their connection pulled out of the wall. Those are the measures that will protect our children.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister is being very generous with his time, but I do not think he appreciates the nature of the issue. Mark Zuckerberg’s net worth is $71.5 billion. Elon Musk, who is reported to be purchasing Twitter, is worth $218 billion. Bill Gates is worth $125 billion. Money does not matter to these people.

The Minister discusses huge fines for the companies and the potential sanction of bringing down their platforms. They will just set up another one. That is what we are seeing with the smaller platforms: they are closing down and setting up new platforms. These measures do not matter. What matters and will actually make a difference to the safety of children and adults online is personal liability—holding people personally responsible for the direct harm they are causing to people here in the United Kingdom. That is what these amendments seek to do, and that is why we are pushing them so heavily. I urge the Minister to respond to that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.

The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.

I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.

Question put and agreed to.

Clause 11 accordingly ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 12, in clause 12, page 12, line 10, at end insert—

“(4A) A duty to publish the adults’ risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom. As my hon. Friend the Member for Worsley and Eccles South remarked when addressing clause 10, transparency and scrutiny of those all-important risk assessments must be at the heart of the Online Safety Bill. We all know that the Government have had a hazy record on transparency lately but, for the sake of all in the online space, I sincerely hope that the Minister will see the value in ensuring that the risk assessments are accurate, proactively supplied and published for us all to consider.

It is only fair that all the information about risks to personal safety be made available to users of category 1 services, which we know are the most popular and, often, the most troublesome services. We want people to be empowered to make their own decisions about their behaviour both online and offline. That is why we are pushing for a thorough approach to risk assessments more widely. Also, without a formal duty to publicise those risk assessments, I fear there will be little change in our safety online. The Minister has suggested that the platforms will be looking back at Hansard in years to come to determine whether or not they should be doing the right thing. Unless we make that a statutory obligation in the Bill, I fear that suggestion will fall on deaf ears.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Once again, I agree with the point about transparency and the need to have those matters brought into the light of day. We heard from Frances Haugen how Facebook—now Meta—actively resisted doing so. However, I point to two provisions already in the Bill that deliver precisely that objective. I know we are debating clause 12, but there is a duty in clause 13(2) for platforms to publish in their terms of service—a public document—the findings of the most recent adult risk assessment. That duty is in clause 13—the next clause we are going to debate—in addition to the obligations I have referred to twice already in clause 64, where Ofcom compels those firms to publish their transparency reports. I agree with the points that the shadow Minister made, but suggest that through clause 13(2) and clause 64, those objectives are met in the Bill as drafted.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank the Minister for his comments, but sadly we do not feel that is appropriate or robust enough, which is why we will be pressing the amendment to a Division.

Question put, That the amendment be made.

The Committee divided.

--- Later in debate ---
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

While I am at risk of parroting my hon. Friend the Member for Worsley and Eccles South on clause 11, it is important that adults and the specific risks they face online are considered in the clause. The Minister knows we have wider concerns about the specific challenges of the current categorisation system. I will come on to that at great length later, but I thought it would be helpful to remind him at this relatively early stage that the commitments to safety and risk assessments for category 1 services will only work if category 1 encapsulates the most harmful platforms out there. That being said, Labour broadly supports this clause and has not sought to amend it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am eagerly awaiting the lengthy representations that the shadow Minister just referred to, as are, I am sure, the whole Committee and indeed the millions watching our proceedings on the live broadcast. As the shadow Minister said, clause 13 sets out the safety duties in relation to adults. This is content that is legal but potentially harmful to adults, and for those topics specified in secondary legislation, it will require category 1 services to set out clearly what actions they might be taking—from the actions specified in subsection (4)—in relation to that content.

It is important to specify that the action they may choose to take is a choice for the platform. I know some people have raised issues concerning free speech and these duties, but I want to reiterate and be clear that this is a choice for the platform. They have to be publicly clear about what choices they are making, and they must apply those choices consistently. That is a significant improvement on where we are now, where some of these policies get applied in a manner that is arbitrary.

Question put and agreed to.

Clause 13 accordingly ordered to stand part of the Bill.

Clause 14

User empowerment duties

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 46, in clause 14, page 14, line 12, after “non-verified users” insert

“and to enable them to see whether another user is verified or non-verified.”

This amendment would make it clear that, as part of the User Empowerment Duty, users should be able to see which other users are verified and which are non-verified.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 47, in clause 189, page 155, line 1, at end insert

“‘Identity Verification’ means a system or process designed to enable a user to prove their identity, for purposes of establishing that they are a genuine, unique, human user of the service and that the name associated with their profile is their real name.”

This amendment adds a definition of Identity Verification to the terms defined in the Bill.

New clause 8—OFCOM’s guidance about user identity verification

“(1) OFCOM must produce guidance for providers of Category 1 services on how to comply with the duty set out in section 57(1).

(2) In producing the guidance (including revised or replacement guidance), OFCOM must have regard to—

(a) ensuring providers offer forms of identity verification which are likely to be accessible to vulnerable adult users and users with protected Characteristics under the Equality Act 2010,

(b) promoting competition, user choice, and interoperability in the provision of identity verification,

(c) protection of rights, including rights to privacy, freedom of expression, safety, access to information, and the rights of children,

(d) alignment with other relevant guidance and regulation, including with regards to Age Assurance and Age Verification.

(3) In producing the guidance (including revised or replacement guidance), OFCOM must set minimum standards for the forms of identity verification which Category 1 services must offer, addressing—

(a) effectiveness,

(b) privacy and security,

(c) accessibility,

(d) time-frames for disclosure to Law Enforcement in case of criminal investigations,

(e) transparency for the purposes of research and independent auditing,

(f) user appeal and redress mechanisms.

(4) Before producing the guidance (including revised or replacement guidance), OFCOM must consult—

(a) the Information Commissioner,

(b) the Digital Markets Unit,

(c) persons whom OFCOM consider to have technological expertise relevant to the duty set out in section 57(1),

(d) persons who appear to OFCOM to represent the interests of users including vulnerable adult users of Category 1 services, and

(e) such other persons as OFCOM considers appropriate.

(5) OFCOM must publish the guidance (and any revised or replacement guidance).”

This new clause would require Ofcom to set a framework of principles and minimum standards for the User Verification Duty.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The revised Bill seeks to address the problems associated with anonymity through requiring platforms to empower users, with new options to verify their identity and filter out non-verified accounts. This is in line with the approach recommended by Clean Up The Internet and also reflects the approach proposed in the Social Media Platforms (Identity Verification) Bill, which was tabled by the hon. Member for Stroud (Siobhan Baillie) and attracted cross-party support. It has the potential to strike a better balance between tackling the clear role that anonymity can play in fuelling abuse and disinformation, while safeguarding legitimate uses of anonymity, including by vulnerable users, for whom anonymity can act as a protection. However, Labour does share the concerns of stakeholders around the revised Bill, which we have sought to amend.

Amendment 46 aims to empower people to use this information about verification when making judgments about the reliability of other accounts and the content they share. This would ensure that the user verification duty helps disrupt the use of networks of inauthentic accounts to spread disinformation. Labour welcomes the inclusion in the revised Bill of measures designed to address harm associated with misuse of anonymous social media accounts. There is considerable evidence from Clean Up The Internet and others that anonymity fuels online abuse, bullying and trolling and that it is one of the main tools used by organised disinformation networks to spread and amplify false, extremist and hateful content.

Clause 14 falls short of truly empowering people to make the most well-informed decisions about the type of content they engage with. We believe that this would be a simple change from a design perspective. Category 1 platforms are already able to verify different types of accounts, whether they be personal or business accounts, so ensuring that people are equipped with this information more broadly would be an easy step for the big platforms to take. Indeed, the Joint Committee's prelegislative scrutiny recommended that the Government consider, as part of Ofcom's code of practice, a requirement for the largest and highest-risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category.

I know that there are concerns about verification, and there is a delicate balance between anonymity, free speech and protecting us all online. I somewhat sympathise with the Minister in being tasked with bringing forward this complex legislation, but the options for choosing what content and users we do and do not engage with are already there on most platforms. On Twitter, we are able to mute accounts—I do so regularly—or keywords that we want to avoid. Similarly, we can restrict individuals on Instagram.

In evidence to the Joint Committee, the Secretary of State said that the first priority of the draft Bill was to end all online abuse, not just that from anonymous accounts. Hopes were raised about the idea of giving people the option to limit their interaction with anonymous or non-verified accounts. Clearly, the will is there, and the amendment ensures that there is a way, too. I urge the Minister to accept the amendment, if he is serious about empowering users across the United Kingdom.

Now I move on to amendment 47. As it stands, the Bill does not adequately define “verification” or set minimum standards for how it will be carried out. There is a risk that platforms will treat this as a loophole in order to claim that their current, wholly inadequate processes count as verification. We also see entirely avoidable risks of platforms developing new verification processes that fail to protect users’ privacy and security or which serve merely to extend their market dominance to the detriment of independent providers. That is why it is vital that a statutory definition of identity verification is placed in the Bill.

I have already spoken at length today, and I appreciate that we are going somewhat slowly on the Bill, but it is complex legislation and this is an incredibly important detail that we need to get right if the Bill is to be truly world leading. Without a definition of identity verification, I fear that we risk allowing technology that can easily replicate the behaviours of a human being to run rife, which would essentially invalidate the process of verification entirely.

I have also spoken at length about my concerns relating to AI technologies, the lack of future proofing in the Bill and the concerns that could arise in the future. I am sure that the Minister is aware that that could have devastating impacts on our democracy and our online safety more widely.

New clause 8 would ensure that the user empowerment duty and user verification work as intended, by simply requiring Ofcom to set out principles and minimum standards for compliance. We note that the new clause is entirely compatible with the Government's stated aims for the Bill and would provide a clearer framework for both regulated companies and the regulator. It is vital that, in preparing the guidance, Ofcom ensures that the delicate balance I touched on earlier between freedom of expression, the right to privacy and safety online is kept in mind throughout.

We also felt it important that a collaborative approach be taken in drawing up the guidance. Regulating the online space is a mammoth task, and while we have concerns about Ofcom's independence, which I will gladly touch on later, we also know that it will be best for us all if Ofcom is required to draw on the expertise of other organisations in doing so.

None Portrait The Chair
- Hansard -

There is a Division in the House, so I will suspend the sitting for 15 minutes.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

When it comes to police investigations, if something is illegal and merits a report to the police, users should report it, regardless of whether someone is verified or not—whatever the circumstances. I would encourage any internet user to do that. That effectively applies on Twitter already; some people have blue ticks and some people do not, and people should report others to the police if they do something illegal, whether or not they happen to have a blue tick.

Amendment 47 seeks to create a definition of identity verification in clause 189. In addition, it would compel the person’s real name to be displayed. I understand the spirit of the amendment, but there are two reasons why I would not want to accept it and would ask hon. Members not to press it. First, the words “identity verification” are ordinary English words with a clear meaning and we do not normally define in legislation ordinary English words with a clear meaning. Secondly, the amendment would add the new requirement that, if somebody is verified, their real name has to be displayed, but I do not think that that is the effect of the drafting as it stands. Somebody may be verified, and the company knows who they are—if the police go to the company, they will have the verified information—but there is no obligation, as the amendment is drafted, for that information to be displayed publicly. The effect of that part of the amendment would be to force users to choose between disclosing their identity to everyone or having no control over who they interact with. That may not have been the intention, but I am not sure that this would necessarily make sense.

New clause 8 would place requirements on Ofcom about how to produce guidance on user identity verification and what that guidance must contain. We already have provisions on that in clause 58, which we will no doubt come to, although probably not later on today—maybe on Thursday. Clause 58 allows Ofcom to include in its regulatory guidance the principles and standards referenced in the new clause, which can then assist service providers in complying with their duties. Of course, if they choose to ignore the guidelines and do not comply with their duties, they will be subject to enforcement action, but we want to ensure that there is flexibility for Ofcom, in writing those guidelines, and for companies, in following those guidelines or taking alternative steps to meet their duty.

This morning, a couple of Members talked about the importance of remaining flexible and being open to future changes in technology and a wide range of user needs. We want to make sure that flexibility is retained. As drafted, new clause 8 potentially undermines that flexibility. We think that the powers set out in clause 58 give Ofcom the ability to set the relevant regulatory guidance.

Clause 14 implements the proposals made by my hon. Friend the Member for Stroud in her ten-minute rule Bill and the proposals made, as the shadow Minister has said, by a number of third-party stakeholders. We should all welcome the fact that these new user empowerment duties have now been included in the Bill in response to such widespread parliamentary lobbying.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the Minister for giving way. I want to recount my own experience on this issue. He mentioned that anybody in receipt of anonymous abuse on social media should report it to the police, especially if it is illegal. On Thursday, I dared to tweet my opinions on the controversial Depp-Heard case in America. As a result of putting my head above the parapet, my Twitter mentions were an absolute sewer of rape threats and death threats, mainly from anonymous accounts. My Twitter profile was mocked up—I had devil horns and a Star of David on my forehead. It was vile. I blocked, deleted and moved on, but I also reported those accounts to Twitter, especially those that sent me rape threats and death threats.

That was on Thursday, and to date no action has been taken and I have not received any response from Twitter about any of the accounts I reported. The Minister said they should be reported to the police. If I reported all those accounts to the police, I would still be there now reporting them. How does he anticipate that this will be resourced so that social media companies can tackle the issue? That was the interaction resulting from just one tweet that I sent on Thursday, and anonymous accounts sent me a barrage of hate and illegal activity.

Chris Philp

The shadow Minister raises a very good point. Of course, what she experienced on Twitter was despicable, and I am sure that all members of the Committee would unreservedly condemn the perpetrators who put that content on there. Once the Bill is passed, there will be legal duties on Twitter to remove illegal content. At the moment, they do not exist, and there is no legal obligation for Twitter to remove that content, even though much of it, from the sound of it, would cross one of various legal thresholds. Perhaps some messages qualify as malicious communication, and others might cross other criminal thresholds. That legal duty does not exist at the moment, but when this Bill passes, for the first time there will be that duty to protect not just the shadow Minister but users across the whole country.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones

I will speak to clauses 15 and 16 and to new clause 7. The duties outlined in the clause, alongside clause 16, require platforms to have special terms and processes for handling journalistic and democratically important content. In respect of journalistic content, platforms are also required to provide an expedited appeals process for removed posts, and terms specifying how they will define journalistic content. There are, however, widespread concerns about both those duties.

As the Bill stands, we feel that there is too much discretion for platforms. They are required to define “journalistic” content, a role that they are completely unsuited to and, from what I can gather, do not want. In addition, the current drafting leaves the online space open to abuse. Individuals intent on causing harm are likely to try to take advantage of either of those duties, masquerading as journalists or claiming democratic importance for whatever harm they are causing, and that could apply to almost anything. In the evidence sessions, we also heard the concerns, expressed brilliantly by Kyle Taylor from Fair Vote and Ellen Judson from Demos, that the definitions as they stand in the Bill are broad and vague. However, we will come on to those matters later.

Ultimately, treating “journalistic” and “democratically important” content differently is unworkable, leaving platforms to make impossible judgments over, for example, when and for how long an issue becomes a matter of reasonable public debate, or in what settings a person is acting as a journalist. As the Minister knows, the duties outlined in the clause could enable a far-right activist who was standing in an election, or potentially even just supporting candidates in elections, to use all social media platforms. That might allow far-right figures to be re-platformed on to social media sites where they would be free to continue spreading hate.

The Bill indicates that content will be protected if created by a political party ahead of a vote in Parliament, an election or a referendum, or when campaigning on a live political issue—basically, anything. Can the Minister confirm whether the clause means that far-right figures who have been de-platformed for hate speech already must be reinstated if they stand in an election? Does that include far-right or even neo-Nazi political parties? Content and accounts that have been de-platformed from mainstream platforms for breaking terms of service should not be allowed to return to those platforms via this potential—dangerous—loophole.

As I have said, however, I know that these matters are complex and, quite rightly, exemptions must be in place to allow for free discussion around matters of the day. What cannot be allowed to perpetuate is hate sparked by bad actors using simple loopholes to avoid any consequences.

On clause 16, the Minister knows about the important work that Hope not Hate is doing in monitoring key far-right figures, and I pay tribute to it for its excellent work. Many of those figures self-define as journalists and could seek to exploit this loophole in the Bill to propagate hate online. Some of the most high-profile and dangerous far-right figures in the UK, including Stephen Yaxley-Lennon, also known as Tommy Robinson, now class themselves as journalists. There are also far-right and conspiracy-theory so-called “news companies” such as Rebel Media and Urban Scoop. Both replicate mainstream news publishers, but are used to spread misinformation and discriminatory content. Many of those individuals and organisations have already been de-platformed for consistently breaking the terms of service of major social media platforms, and the exemption could see them demand to return and have that demand granted.

New clause 7 would require the Secretary of State to publish a report reviewing the effectiveness of clauses 15 and 16. It is a simple new clause to require parliamentary scrutiny of how the Government’s chosen means of protecting content of democratic importance and journalistic content are working.

Hacked Off provided me with a list of people it found who have claimed to be journalists and who would seek to exploit the journalistic content duty, despite being banned from social media because they are racists or bad actors. First is Charles C. Johnson, a far-right activist who describes himself as an “investigative journalist”. Already banned from Twitter for saying he would “take out” a civil rights activist, he is also alleged to be a holocaust denier.

Secondly, we have Robert Stacy McCain. Robert has been banned from Twitter for participating in targeted abuse. He was a journalist for The Washington Post, but is alleged to have also been a member of the League of the South, a far-right group known to include racists. Then, there is Richard B. Spencer, a far-right journalist and former editor, only temporarily banned for using overlapping accounts. He was pictured making the Nazi salute and has repeated Nazi propaganda. When Trump became President, he encouraged people to “party like it’s 1933”. Sadly, the list goes on and on.

Transparency is at the very heart of the Bill. The Minister knows we have concerns about clauses 15 and 16, as do many of his own Back Benchers. We have heard from my hon. Friend the Member for Batley and Spen how extremist groups and individuals and foreign state actors are having a very real impact on the online space. If the Minister is unwilling to move on tightening up those concepts, the very least he could commit to is a review that Parliament will be able to formally consider.

Chris Philp

I thank the shadow Minister for her comments and questions. I would like to pick up on a few points on the clauses. First, there was a question about what content of democratic importance and content of journalistic importance mean in practice. As with many concepts in the Bill, we will look to Ofcom to issue codes of practice specifying precisely how we might expect platforms to implement the various provisions in the Bill. That is set out in clause 37(10)(e) and (f), which appear at the top of page 37, for ease. Clauses 15 and 16 on content of democratic and journalistic importance are expressly referenced as areas where codes of practice will have to be published by Ofcom, which will do further work on and consult on that. It will not just publish it, but will go through a proper process.

The shadow Minister expressed some understandable concerns a moment ago about various extremely unpleasant people, such as members of the far right who might somehow seek to use the provisions in clauses 15 and 16 as a shield behind which to hide, to enable them to continue propagating hateful, vile content. I want to make it clear that the protections in the Bill are not absolute—it is not that if someone can demonstrate that what they are saying is of democratic importance, they can say whatever they like. That is not how the clauses are drafted.

I draw attention to subsection (2) of both clauses 15 and 16. At the end of the first block of text, just above paragraph (a), it says “taken into account”: the duty is to ensure that matters concerning the importance of freedom of expression relating to content of democratic importance are taken into account when making decisions. It is not an absolute prohibition on takedown or an absolute protection, but simply something that has to be taken into account.

If someone from the far right, as the shadow Minister described, was spewing out vile hatred, racism or antisemitism, and tried to use those clauses, the fact that they might be standing in an election might well be taken into account. However, in performing that balancing exercise, the social media platforms and Ofcom acting as enforcers—and the court if it ever got judicially reviewed—would weigh those things up and find that taking into account content of democratic importance would not be sufficient to outweigh considerations around vile racism, antisemitism or misogyny.

Alex Davies-Jones

The Minister mentions that it would be taken into account. How long does he anticipate it would be taken into account for, especially given the nature of an election? A short campaign could be a number of weeks, or something could be posted a day before an election, be deemed democratically important and have very serious and dangerous ramifications.

Chris Philp

As I say, if content was racist, antisemitic or flagrantly misogynistic, the balancing exercise is performed and the democratic context may be taken into account. I do not think the scales would tip in favour of leaving the content up. Even during an election period, I think common sense dictates that.

To be clear on the timing point that the hon. Lady asked about, the definition of democratic importance is not set out in hard-edged terms. It does not say, “Well, if you are in a short election period, any candidate’s content counts as of democratic importance.” It is not set out in a manner that is as black and white as that. If, for example, somebody was a candidate but it was just racist abuse, I am not sure how even that would count as democratic importance, even during an election period, because it would just be abuse; it would not be contributing to any democratic debate. Equally, somebody might not be a candidate, or might have been a candidate historically, but might be contributing to a legitimate debate after an election. That might be seen as being of democratic importance, even though they were not actually a candidate. As I said, the concept is not quite as black and white as that. The main point is that it is only to be taken into account; it is not determinative.

Alex Davies-Jones

I appreciate the Minister’s allowing me to come back on this. During the Committee’s evidence sessions, we heard of examples where bad-faith state actors were interfering in the Scottish referendum, hosting Facebook groups and perpetuating disinformation around the royal family to persuade voters to vote “Yes” to leave the United Kingdom. That disinformation by illegal bad-faith actors could currently come under both the democratic importance and journalistic exemptions, so would be allowed to remain for the duration of that campaign. Given the exemptions in the Bill, it could not be taken down but could have huge, serious ramifications for democracy and the security of the United Kingdom.

Chris Philp

I understand the points that the hon. Lady is raising. However, I do not think that it would happen in that way.

Alex Davies-Jones

You don’t think?

Chris Philp

No, I don’t. First of all, as I say, it is taken into account; it is not determinative. Secondly, on the point about state-sponsored disinformation, as I think I mentioned yesterday in response to the hon. Member for Liverpool, Walton, there is, as we speak, a new criminal offence of foreign interference being created in the National Security Bill. That will criminalise the kind of foreign interference in elections that she referred to. Because that would then create a new category of illegal content, that would flow through into this Bill. That would not be overridden by the duty to protect content of democratic importance set out here. I think that the combination of the fact that this is a balancing exercise, and not determinative, and the new foreign interference offence being created in the National Security Bill, will address the issue that the hon. Lady is raising—reasonably, because it has happened in this country, as she has said.

I will briefly turn to new clause 7, which calls for a review. I understand why the shadow Minister is proposing a review, but there is already a review mechanism in the Bill; it is to be found in clause 149, and will, of course, include a review of the way that clauses 15 and 16 operate. They are important clauses; we all accept that journalistic content and content of democratic importance are critical to the functioning of our society. Case law relating to article 10 of the European convention on human rights, for example, recognises content of journalistic importance as being especially critical. These two clauses seek to ensure that social media firms, in making their decisions, and Ofcom, in enforcing against the firms, take account of that. However, it is no more than that: it is “take account”; it is not determinative.

Question put and agreed to.

Clause 15 accordingly ordered to stand part of the Bill.

Clause 16 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Seventh sitting)

Alex Davies-Jones Excerpts
Committee stage
Thursday 9th June 2022

Public Bill Committees
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Good morning, Ms Rees. It is a pleasure to serve once again under your chairmanship. I wondered whether the shadow Minister, the hon. Member for Pontypridd, wanted to speak first—I am always happy to follow her, if she would prefer that.

Chris Philp

I do my best.

Clauses 17 and 27 have similar effects, the former applying to user-to-user services and the latter to search services. They set out an obligation on the companies to put in place effective and accessible content reporting mechanisms, so that users can report issues. The clauses will ensure that service providers are made aware of illegal and harmful content on their sites. In relation to priority illegal content, the companies must proactively prevent it in the first place, but in the other areas, they may respond reactively as well.

The clause will ensure that anyone who wants to report illegal or harmful content can do so in a quick and reasonable way. We are ensuring that everyone who needs to do that will be able to do so, so the facility will be open to those who are affected by the content but who are not themselves users of the site. For example, that might be non-users who are the subject of the content, such as a victim of revenge pornography, or non-users who are members of a specific group with certain characteristics targeted by the content, such as a member of the Jewish community reporting antisemitic content. There is also a facility for parents and other adults with caring responsibilities for children, and for adults caring for another adult, to report content. Clause 27 sets out similar duties in relation to search. I commend the clauses to the Committee.

--- Later in debate ---
Kirsty Blackman (Aberdeen North) (SNP)

I will talk about this later, when we come to a subsequent clause to which I have tabled some amendments—I should have tabled some to this clause, but unfortunately missed the chance to do so.

I appreciate the Minister laying out why he has designated the people covered by this clause; my concern is that “affected” is not wide enough. My logic is that, on the strength of these provisions, I might not be able to report racist content that I come across on Twitter if I am not the subject of that content—if I am not a member of a group that is the subject of the content or if I am not caring for someone who is the subject of it.

I appreciate what the Minister is trying to do, and I get the logic behind it, but I think the clause unintentionally excludes some people who would have a reasonable right to expect to be able to make reports in this instance. That is why I tabled amendments 78 and 79 to clause 28, about search functions, but those proposals would have worked reasonably for this clause as well. I do not expect a positive answer from the Minister today, but perhaps he could give consideration to my concern. My later amendments would change “affected person” to “any other person”. That would allow anyone to make a report, because if something is illegal content, it is illegal content. It does not matter who makes the report, and it should not matter that I am not a member of the group of people targeted by the content.

I report things all the time, particularly on Twitter, and a significant amount of it is nothing to do with me. It is not stuff aimed at me; it is aimed at others. I expect that a number of the platforms will continue to allow reporting for people who are outwith the affected group, but I do not want to be less able to report than I am currently, and that would be the case for many people who see concerning content on the internet.

Alex Davies-Jones

The hon. Lady is making a really important point. One stark example that comes to my mind is when English footballers suffered horrific racist abuse following the penalty shootout at the Euros last summer. Hundreds of thousands of people reported the abuse that the players were suffering to the social media platforms on their behalf, in an outcry of solidarity and support, and it would be a shame if people were prevented from doing that.

Kirsty Blackman

I absolutely agree. I certainly do not think I am suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or anything that starts after this legislation comes, could limit the ability to report on the basis of these clauses.

--- Later in debate ---
Chris Philp

I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.

--- Later in debate ---
Chris Philp

I think the shadow Minister wanted to intervene, unless I have answered her point already.

Alex Davies-Jones

I wanted to reiterate the point that the hon. Member for Aberdeen North made, which the Minister has not answered. If he has such faith that the systems and processes will be changed and controlled by Ofcom as a result of the Bill, why is he so reluctant to put in an ombudsman? It will not be overwhelmed with complaints if the systems and processes work, and therefore protect victims. We have already waited far too long for the Bill, and now he says that we need to wait two to four years for a review, and even longer to implement an ombudsman to protect victims. Why will he not just put this in the Bill now to keep them safe?

Chris Philp

Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.

--- Later in debate ---
Chris Philp

As I have said, at the moment there is nothing at all. Platforms such as Facebook can and do arbitrarily censor content with little if any regard for freedom of speech. Some platforms have effectively cancelled Donald Trump while allowing the Russian state to propagate shocking disinformation about the Russian invasion of Ukraine, so there is real inconsistency and a lack of respect for freedom of speech. This at least establishes something where currently there is nothing. We can debate whether “have regard to” is strong enough. We have heard the other point of view from the other side of the House, which expressed concern that it might be used to allow otherwise harmful content, so there are clearly arguments on both sides of the debate. The obligation to have regard does have some weight, because the issue cannot be completely ignored. I do not think it would be adequate to simply pay lip service to it and not give it any real regard, so I would not dismiss the legislation as drafted.

I would point to the clauses that we have recently discussed, such as clause 15, under which content of democratic importance—which includes debating current issues and not just stuff said by an MP or candidate—gets additional protection. Some of the content that my hon. Friend the Member for Don Valley referred to a second ago would probably also get protection under clause 14, under which content of democratic importance has to be taken into account when making decisions about taking down or removing particular accounts. I hope that provides some reassurance that this is a significant step forwards compared with where the internet is today.

Alex Davies-Jones

I share the Minister’s sentiments about the Bill protecting free speech; we all want to protect that. He mentions some of the clauses we debated on Tuesday regarding democratic importance. Some would say that debating this Bill is of democratic importance. Since we started debating the Bill on Tuesday, and since I have mentioned some of the concerns raised by stakeholders and others about the journalistic exemption and, for example, Tommy Robinson, my Twitter mentions have been a complete sewer—as everyone can imagine. One tweet I received in the last two minutes states:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. Does the Minister agree that that is content of democratic importance, given we are debating this Bill, and that it should remain on Twitter?

Chris Philp

That sounds like a very offensive tweet. Could the hon. Lady read it again? I didn’t quite catch it.

Alex Davies-Jones

Yes:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. It goes on:

“this is a toxic combination of bloc vote grubbing and woke”

culture, and there is a lovely GIF to go with it.

Chris Philp

I do not want to give an off-the-cuff assessment of an individual piece of content—not least because I am not a lawyer. It does not sound like it meets the threshold of illegality. It most certainly is offensive, and that sort of matter is one that Ofcom will set out in its codes of practice, but there is obviously a balance between freedom of speech and content that is harmful, which the codes of practice will delve into. I would be interested if the hon. Lady could report that to Twitter and then report back to the Committee on what action it takes.

Alex Davies-Jones

Yes, I will do that right now and see what happens.

Chris Philp

At the moment, there is no legal obligation to do anything about it, which is precisely why this Bill is needed, but let us put it to the test.

Question put and agreed to.

Clause 19 accordingly ordered to stand part of the Bill.

Clause 20

Record-keeping and review duties

Question proposed, That the clause stand part of the Bill.

Online Safety Bill (Eighth sitting)

Alex Davies-Jones Excerpts
Committee stage
Thursday 9th June 2022

Public Bill Committees
Barbara Keeley

I begin by thanking my hon. Friend the Member for Washington and Sunderland West (Mrs Hodgson) for her work on drafting these amendments and others relating to this chapter, which I will speak to shortly. She has campaigned excellently over many years in her role as chair of the all-party parliamentary group on ticket abuse. I attended the most recent meeting of that group back in April to discuss what we need to see changed in the Bill to protect people from scams online. I am grateful to those who have supported the group and the anti-ticket touting campaign for their insights.

It is welcome that, after much flip-flopping, the Government have finally conceded to Labour’s calls and those of many campaign groups to include a broad duty to tackle fraudulent advertising on search engines through chapter 5 of part 3 of the Bill. We know that existing laws to protect consumers in the online world have failed to keep pace with the actors attempting to exploit them, and that is particularly true of scams and fraudulent advertisements.

Statistics show a steep increase in this type of crime in the online world, although those figures are likely to be a significant underestimate and do not capture the devastating emotional impact that scams have on their victims. The scale of the problem is large and it is growing.

The Financial Conduct Authority estimates that fraud costs the UK up to £190 billion a year, with 86% of that fraud committed online. We know those figures are increasing. The FCA more than doubled the number of scam warnings it issued between 2019 and 2020, while UK Finance data shows that there has been a significant rise in cases across all scam types as criminals adapt to targeting victims online. The pandemic, which led to a boom in internet shopping, created an environment ripe for exploitation. Reported incidents of scams and fraud have increased by 41% since before the pandemic, with one in 10 of us now victims of fraud.

Being scammed can cause serious psychological harm. Research by the Money and Mental Health Policy Institute suggests that three in 10 online scam victims felt depressed as a result of being scammed, while four in 10 said they felt stressed. Clearly, action to tackle the profound harms that result from fraudulent advertising is long overdue.

This Bill is an important opportunity but, as with other issues the Government are seeking to address, we need to see changes if it is to be successful. Amendments 23 and 24 are small and very simple, but would have a profound impact on the ability of the Bill to prevent online fraud from taking place and to protect UK users.

As currently drafted, the duties set out in clauses 34 and 35 for category 1 and 2A services extend only to the design, operation and use of a category 1 or 2A service in the United Kingdom. Our amendments would mean that the duties extended to the design, operation and use of a category 1 or 2A service that targets users in the United Kingdom. That change would make the Bill far more effective, because it would reduce the risk of a company based overseas being able to target UK consumers without any action being taken against them—being allowed to target the public fraudulently without fear of disruption.

That would be an important change, because paid-for advertisements function by the advertiser stating where in the world, by geographical location, they wish to target consumers. For instance, a company would be able to operate from Hong Kong and take out paid-for advertisements to target consumers just in one particular part of north London. The current wording of the Bill does not acknowledge the fact that internet services can operate from anywhere in the world and use international boundaries to circumvent UK legislation.

Other legislation has been successful in tackling scams across borders. I draw the Committee’s attention to the London Olympic Games and Paralympic Games Act 2006, which made it a crime to sell a ticket to the Olympics on the black market anywhere in the world, rather than simply in the UK where the games took place. I suggest that we should learn from the action taken to regulate the Olympics back in 2012 and implement the same approach through amendments 23 and 24.

New clause 5 was also tabled by my hon. Friend the Member for Washington and Sunderland West, who will be getting a lot of mentions this afternoon.

Barbara Keeley

New clause 5 would tackle one of the reasons that people fall victim to fraud online by introducing a duty for search engines to ensure that all paid-for search advertisements look distinct from non-paid-for search results. When bad actors are looking to scam consumers, they often take out paid-for advertising on search results, so that they can give consumers the false impression that their websites are official and trustworthy.

Paid search results occur when companies pay a charge to have their site appear at the top of search results. This is valuable to them because it is likely to direct consumers towards their site. The new clause would stop scam websites buying their way to the top of a search result.

Let me outline some of the consequences of not distinguishing between paid-for advertisements and organic results, because they can be awful. Earlier this year, anti-abortion groups targeted women who were searching online for a suitable abortion clinic. The groups paid for misleading adverts to appear at the top of those women’s search results, directing them towards an anti-abortion centre rather than a clinic. One woman who knew that she wanted to have an abortion was researching where she could have the procedure. Her search for a clinic on Google led her to an anti-abortion centre that she went on to contact and visit. That was because she trusted the top search results on Google, which were paid for. The fact that it was an advertisement was indicated only by the two letters “AD” appearing in very small font underneath the search headline and description.

Another example was reported by The Times last year. Google had been taking advertising money from scam websites selling premier league football tickets, even though the matches were taking place behind closed doors during lockdown. Because these advertisements appeared at the top of search results, it is entirely understandable that people looking for football tickets were deceived into believing that they would be able to attend the games, which led to them being scammed.

There have been similar problems with passport renewals. As colleagues will be very aware, people have been desperately trying to renew their passports amid long delays because of the backlog of cases. This is a target for fraudsters, who take out paid advertisements to offer people assistance with accessing passport renewal services and then scam them.

New clause 5 would end this practice by ensuring that search engines provide clear messaging to show that the user is looking at a paid-for advertisement, by stating that clearly and through other measures, such as a separate colour scheme. A duty to distinguish paid-for advertising is present in many other areas of advertising. For example, when we watch TV, there is no confusion between what is a programme and what is an advert; the same is true of radio advertising; and when someone is reading a newspaper or magazine, the line between journalism and the advertisements that fund the paper is unmistakable.

We cannot continue to have these discrepancies and be content with the internet being a wild west. Therefore, it is clear that advertising on search engines needs to be brought into line with advertising in other areas, with a requirement on search engines to distinguish clearly between paid-for and organic results.

New clause 6 is another new clause tabled by my hon. Friend the Member for Washington and Sunderland West. It would protect consumers from bad actors trying to exploit them online by placing a duty on search engines to verify adverts before they accept them. That would mean that, before their adverts were allowed to appear in a paid-for search result, companies would have to demonstrate that they were authorised by a UK regulatory body designated by the Secretary of State.

This methodology for preventing fraud is already in place for financial crime. Google only accepts financial services advertisements from companies that are authorised by the Financial Conduct Authority. This gives companies a further incentive to co-operate with regulators, and it protects consumers by preventing companies that are well known for their nefarious activities from dominating search results and then misleading consumers. By extending this best practice to all advertisements, search engines would no longer be able to promote content that is fake or fraudulent after being paid to do so.

Without amending the Bill in this way, we risk missing an opportunity to tackle the many forms of scamming that people experience online, one of which is the world of online ticketing. In my role as shadow Minister for the arts and civil society, I have worked on this issue and been informed by the expertise of my hon. Friend the Member for Washington and Sunderland West.

In the meeting of the all-party parliamentary group on ticket abuse in April, we heard about the awful consequences of secondary ticket reselling practices. Ticket reselling websites, such as Viagogo, are rife with fraud. Large-scale ticket touts dominate these resale sites, and Viagogo has a well-documented history of breaching consumer protection laws. Those breaches include a number of counts of fraud for selling non-existent tickets. Nevertheless, Viagogo continues to take out paid-for advertisements with Google and is continually able to take advantage of consumers by dominating search results and commanding false trust.

If new clause 6 is passed, then secondary ticketing websites such as Viagogo would have to be members of a regulatory body responsible for secondary ticketing, such as the Society of Ticket Agents and Retailers, or STAR. Viagogo would then have to comply with STAR standards for its business model to be successful.

I have used ticket touting as an example, but the repercussions of this change would be wider than that. Websites that sell holidays and flights, such as Skyscanner, would have to be a member of the relevant regulatory group, for example the Association of British Travel Agents. People would be able to go to football matches, art galleries and music festivals without fearing that they are getting ripped off or have been issued with fake tickets.

I will describe just a few examples of the poor situation we are in at the moment, to illustrate the need for change. The most heartbreaking one is of an elderly couple who bought two tickets from a secondary ticketing website to see their favourite artist, the late Leonard Cohen, to celebrate their 70th wedding anniversary. When the day came around and they arrived at the venue, they were turned away and told they had been sold fake tickets. The disappointment they must have felt would have been very hard to bear. In another instance, a British soldier serving overseas decided to buy his daughter concert tickets because he could not be with her on her birthday. When his daughter went along to the show, she was turned away at the door and told she could not enter because the tickets had been bought through a scam site and were invalid.

--- Later in debate ---
Alex Davies-Jones

I beg to move amendment 65, in clause 37, page 36, line 27, at end insert—

“(ia) organisations that campaign for the removal of animal abuse content, and”.

This amendment would add organisations campaigning for the removal of animal abuse content to the list of bodies Ofcom must consult.

The Chair

With this it will be convenient to discuss the following:

Amendment 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Amendment 64, in schedule 4, page 177, line 4, at end insert “and

(vii) the systems and process are appropriate to detect cruelty towards humans and animals;”.

This amendment would ensure that ensuring systems and processes are appropriate to detect cruelty towards humans and animals is one of the online safety objectives for search services.

Amendment 60, in clause 52, page 49, line 5, at end insert—

“(e) an offence, not within paragraph (a), (b) or (c), of which the subject is an animal.”

This amendment brings offences to which animals are subject within the definition of illegal content.

Amendment 59, in schedule 7, page 185, line 39, at end insert—

“Animal Welfare

22A An offence under any of the following provisions of the Animal Welfare Act 2006—

(a) section 4 (unnecessary suffering);

(b) section 5 (mutilation);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (duty of person responsible for animal to ensure welfare).

22B An offence under any of the following provisions of the Animal Health and Welfare (Scotland) Act 2006—

(a) section 19 (unnecessary suffering);

(b) section 20 (mutilation);

(c) section 21 (cruel operations);

(d) section 22 (administration of poisons);

(e) section 23 (fighting);

(f) section 24 (ensuring welfare of animals).

22C An offence under any of the following provisions of the Welfare of Animals Act (Northern Ireland) 2011—

(a) section 4 (unnecessary suffering);

(b) section 5 (prohibited procedures);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (ensuring welfare of animals).

22D For the purpose of paragraphs 22A, 22B or 22C of this Schedule, the above offences are deemed to have taken place regardless of whether the offending conduct took place within the United Kingdom, if the offending conduct would have constituted an offence under the provisions contained within those paragraphs.”

This amendment adds certain animal welfare offences to the list of priority offences in Schedule 7.

Amendment 66, in clause 140, page 121, line 8, at end insert—

“(d) causing harm to any human or animal.”

This amendment ensures groups are able to make complaints regarding animal abuse videos.

Amendment 67, in clause 140, page 121, line 20, at end insert

“, or a particular group that campaigns for the removal of harmful online content towards humans and animals”.

This amendment makes groups campaigning against harmful content eligible to make supercomplaints.

Alex Davies-Jones

It is, as ever, a pleasure to serve under your chairship, Ms Rees. Amendment 65 would add organisations campaigning for the removal of animal abuse content to the list of bodies that Ofcom must consult. As we all know, Ofcom must produce codes of practice that offer guidance on how regulated services can comply with their duties. Later in the Bill, clause 45 makes clear that if a company complies with the code of practice, it will be deemed to have complied with the Bill in general. In addition, the duties for regulated services come into force at the same time as the codes of practice. That all makes what the codes say extremely important.

The absence of protections relating to animal abuse content is a real omission from the Bill. Colleagues will have seen the written evidence from Action for Primates, which neatly summarised the key issues on which Labour is hoping to see agreement from the Government. Given this omission, it is clear that the current draft of the Bill is not fit for tackling animal abuse, cruelty and violence, which is all too common online.

There are no explicit references to content that can be disturbing and distressing to those who view it—both children and adults. We now know that most animal cruelty content is produced specifically for sharing on social media, often for profit through the monetisation schemes offered by platforms such as YouTube. Examples include animals being beaten, set on fire, crushed or partially drowned; the mutilation and live burial of infant monkeys; a kitten intentionally being set on by a dog and another being stepped on and crushed to death; live and conscious octopuses being eaten; and animals being pitted against each other in staged fights.

Animals being deliberately placed into frightening or dangerous situations from which they cannot escape or are harmed before being “rescued” on camera is becoming increasingly popular on social media, too. For example, kittens and puppies are “rescued” from the clutches of a python. Such fake rescues not only cause immense suffering to animals, but are fraudulent because viewers are asked to donate towards the rescue and care of the animals. This cannot be allowed to continue.

Indeed, as part of its Cancel Out Cruelty campaign, the Royal Society for the Prevention of Cruelty to Animals conducted research, which found that in 2020 there were nearly 500 reports of animal cruelty on social media. That was more than twice the figure reported for 2019. The majority of these incidents appeared on Facebook. David Allen, head of prevention and education at the RSPCA, has spoken publicly about the issue, saying:

“Sadly, we have seen an increase in recent years in the number of incidents of animal cruelty being posted and shared on social media such as Facebook, Instagram, TikTok and Snapchat.”

John Nicolson

I totally agree with the points that the hon. Lady is making. Does she agree that the way in which the Bill is structured means that illegal acts that are not designated as “priority illegal” will likely be put at the very end of companies’ to-do list and that they will focus considerably more effort on what they will call “priority illegal” content?

Alex Davies-Jones

I completely agree with and welcome the hon. Gentleman’s contribution. It is a very valid point and one that we will explore further. It shows the necessity of this harm being classed as a priority harm in order that we protect animals, as well as people.

David Allen continued:

“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”

Although the Bill has a clear focus on protecting children, we must remember that the prevalence of cruelty to animals online has the potential to have a hugely negative impact on children who may be inadvertently seeing that content through everyday social media channels.

Jane Stevenson (Wolverhampton North East) (Con)

The hon. Lady knows that I am a great animal lover, and I obviously have concerns about children being exposed to these images. I am just wondering how she would differentiate between abusive images and the images that are there to raise awareness of certain situations that animals are in. I have seen many distressing posts about the Yulin dogmeat festival and about beagles being used in laboratory experiments. How would she differentiate between images that are there to raise awareness of the plight of animals and the abusive ones?

Alex Davies-Jones

I thank the hon. Lady for her contribution. Like me, she is a passionate campaigner for animal welfare. It was a pleasure to serve on the Committee that considered her Glue Traps (Offences) Act 2022, which I know the whole House was pleased to pass. She raises a very important point and one that the Bill later explores with regard to other types of content, such as antisemitic content and racist content in terms of education and history and fact. The Bill deals specifically with that later, and this content would be dealt with in the same way. We are talking about where content is used as an educational tool and a raising-awareness tool, compared with just images and videos of direct abuse.

To give hon. Members a real sense of the extent of the issue, I would like to share some findings from a recent survey of the RSPCA’s frontline officers. These are pretty shocking statistics, as I am sure Members will all agree. Eighty-one per cent. of RSPCA frontline officers think that more abuse is being caught on camera. Nearly half think that more cases are appearing on social media. One in five officers said that one of the main causes of cruelty to animals is people hurting animals just to make themselves more popular on social media. Some of the recent cruelty videos posted on social media include a video of a magpie being thrown across the road on Instagram in June 2021; a woman captured kicking her dog on TikTok in March 2021; a teenager being filmed kicking a dog, which was shared on WhatsApp in May 2021; and videos posted on Instagram of cockerels being forced to fight in March 2021.

I am sure that colleagues will be aware of the most recent high-profile case, which was when disturbing footage was posted online of footballer Kurt Zouma attacking his cat. There was, quite rightly, an outpouring of public anger and demands for justice. Footage uploaded to Snapchat on 6 February showed Zouma kicking his Bengal cat across a kitchen floor in front of his seven-year-old son. Zouma also threw a pair of shoes at his pet cat and slapped its head. In another video, he was heard saying:

“I swear I’ll kill it.”

In sentencing him following his guilty plea to two offences under the Animal Welfare Act 2006, district judge Susan Holdham described the incident as “disgraceful and reprehensible”. She added:

“You must be aware that others look up to you and many young people aspire to emulate you.”

What makes that case even more sad is the way in which the video was filmed and shared, making light of such cruelty. I am pleased that the case has now resulted in tougher penalties for filming animal abuse and posting it on social media, thanks to new guidelines from the Sentencing Council. The prosecutor in the Zouma case, Hazel Stevens, told the court:

“Since this footage was put in the public domain there has been a spate of people hitting cats and posting it on various social media sites.”

There have been many other such instances. Just a few months ago, the most abhorrent trend was occurring on TikTok: people were abusing cats, dogs and other animals to music and encouraging others to do the same. Police officers discovered a shocking 182 videos with graphic animal cruelty on mobile phones seized during an investigation. This sickening phenomenon is on the rise on social media platforms, provoking a glamorisation of the behaviour. The videos uncovered during the investigation showed dogs prompted to attack other animals such as cats, or used to hunt badgers, deer, rabbits and birds. Lancashire police began the investigation after someone witnessed two teenagers encouraging a dog to attack a cat on an estate in Burnley in March of last year. The cat, a pet named Gatsby, was rushed to the vet by its owners once they discovered what was going on, but unfortunately it was too late and Gatsby’s injuries were fatal. The photos and videos found on the boys’ phones led the police to discover more teenagers in the area who were involved in such cruel activities. The views and interactions that the graphic footage was attracting made it even more visible, as the platform was increasing traffic and boosting content when it received attention.

It should not have taken such a high-profile case of a professional footballer with a viral video to get this action taken. There are countless similar instances occurring day in, day out, and yet the platforms and authorities are not taking the necessary action to protect animals and people from harm, or to protect the young people who seek to emulate this behaviour.

I pay tribute to the hard work of campaigning groups such as the RSPCA, Action for Primates, Asia for Animals Coalition and many more, because they are the ones who have fought to keep animal rights at the forefront. The amendment seeks to ensure that such groups are given a voice at the table when Ofcom consults on its all-important codes of practice. That would be a small step towards reducing animal abuse content online, and I hope the Minister can see the merits in joining the cause.

I turn to amendment 60, which would bring offences to which animals are subject within the definition of illegal content, a point raised by the hon. Member for Ochil and South Perthshire. The Minister will recall the Animal Welfare (Sentencing) Act 2021, which received Royal Assent last year. Labour was pleased to see the Government finally taking action against those who commit animal cruelty offences offline. The maximum prison sentence for animal cruelty was increased from six months to five years, and the Government billed that move as taking a firmer approach to cases such as dog fighting, abuse of puppies and kittens, illegally cropping a dog's ears and gross neglect of farm animals. Why, then, have the Government failed to include offences against animals within the scope of illegal content online? We want parity between the online and offline space, and that seems like a stark omission from the Bill.

Placing obligations on service providers to remove animal cruelty content should fall within both the spirit and the scope of the Bill. We all know that the scope of the Bill is to place duties on service providers to remove illegal and harmful content, placing particular emphasis on the exposure of children. Animal cruelty content is a depiction of illegality and also causes significant harm to children and adults.

If my inbox is anything to go by, all of us here today know what so many of our constituents up and down the country feel about animal abuse. It is one of the most popular topics that constituents contact me about. Today, the Minister has a choice to make about his Government's commitment to preventing animal cruelty and keeping us all safe online. I hope he will see the merit in acknowledging the seriousness of animal abuse online.

Amendment 66 would ensure that groups were able to make complaints about animal abuse videos. Labour welcomes clause 140, as the ability to make super-complaints is a vital part of our democracy. However, as my hon. Friend the Member for Worsley and Eccles South and other Members have mentioned, the current definition of an “eligible entity” is far too loose. I have set out the reasons as to why the Government must go further to limit and prevent animal abuse content online. Amendment 66 would ensure that dangerous animal abuse content is a reasonable cause for a super-complaint to be pursued.

Chris Philp

The shadow Minister raises important issues to do with animal cruelty. The whole House and our constituents feel extremely strongly about this issue, as we know. She set out some very powerful examples of how this terrible form of abuse takes place.

To some extent, the offences are in the Bill’s scope already. It covers, for example, extreme pornography. Given that the content described by the hon. Lady would inflict psychological harm on children, it is, to that extent, in scope.

The hon. Lady mentioned the Government’s wider activities to prevent animal cruelty. That work goes back a long time and includes the last Labour Government’s Animal Welfare Act 2006. She mentioned the more recent update to the criminal sentencing laws that increased by a factor of 10 the maximum sentence for cruelty to animals. It used to be six months and has now been increased to up to five years in prison.

In addition, just last year the Department for Environment, Food and Rural Affairs announced an action plan for animal welfare, which outlines a whole suite of activities that the Government are taking to protect animals in a number of different areas—sentience, international trade, farming, pets and wild animals. That action plan will be delivered through a broad programme of legislative and non-legislative work.

--- Later in debate ---
On the basis of the Government’s existing work on animal welfare, the effect that the Bill as drafted will have in this area, and the fact that we will give this issue some further thought, I hope that the shadow Minister will let the matter rest for now.
Alex Davies-Jones

I thank the Minister for agreeing to look at this issue further. However, we do see it as being within the scope of the Bill, and we have the opportunity to do something about it now, so we will be pressing these amendments to a vote. If you will allow me, Ms Rees, I would also like to pay tribute to the former Member of Parliament for Redcar, Anna Turley, who campaigned tirelessly on these issues when she was a Member of the House. We would like these amendments to be part of the Bill.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones

On clause 37, it is welcome that Ofcom will have to prepare and issue a code of practice for service providers with duties relating to illegal content in the form of terrorism or child sexual exploitation and abuse content. The introduction of compliance measures relating to fraudulent advertising is also very welcome. We do, however, have some important areas to amend, including the role of different expert groups in assisting Ofcom during its consultation process, which I have already outlined in relation to animal cruelty.

On clause 38, Labour supports the notion that Ofcom must have specific principles to adhere to when preparing the codes of practice, and of course, the Secretary of State must have oversight of those. However, as I will touch on as we proceed, Labour feels that far too much power is given to the Secretary of State of the day in establishing those codes.

Labour believes that schedule 4 is overwhelmingly loose in its language, and we have concerns about the ability of Ofcom—try as it might—to ensure that its codes of practice are both meaningful to service providers and in compliance with the Bill’s legislative requirements. Let me highlight the schedule’s broadness by quoting from it. Paragraph 4 states:

“The online safety objectives for regulated user-to-user services are as follows”.

I will move straight to paragraph 4(a)(iv), which says

“there are adequate systems and processes to support United Kingdom users”.

Forgive me if I am missing something here, but surely an assessment of adequacy is too subjective for these important codes of practice. Moreover, the Bill seems to have failed to consider the wide-ranging differences that exist among so-called United Kingdom users. Once again, there is no reference to future-proofing against emerging technologies. I hope that the Minister will therefore elaborate on how he sees the codes of practice and their principles, objectives and content as fit for purpose. More broadly, it is remarkable that schedule 4 is both too broad in its definitions and too limiting in some areas—we might call it a Goldilocks schedule.

I turn to new clause 20. As we have discussed, a significant majority of online child abuse takes place in private messages. Research from the NSPCC shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people whom they have not met offline before. When children are contacted by someone they do not know, in nearly three quarters of cases that takes place by private message.

Schedule 4 introduces new restrictions on Ofcom’s ability to require a company to use proactive technology to identify or disrupt abuse in private messaging. That will likely restrict Ofcom’s ability to include in codes of practice widely used industry-standard tools such as PhotoDNA and CSAI Match, which detect known child abuse images, and artificial intelligence classifiers to detect self-generated images and grooming behaviour. That raises significant questions about whether the regulator can realistically produce codes of practice that respond to the nature and extent of the child abuse threat.

As it stands, the Bill will leave Ofcom unable to require companies to proactively use technology that can detect child abuse. Instead, Ofcom will be wholly reliant on the use of CSEA warning notices under clause 103, which will enable it to require the use of proactive technologies only where there is evidence that child abuse is already prevalent—in other words, where significant online harm has already occurred. That will necessitate the use of a laborious and resource-intensive process, with Ofcom having to build the evidence to issue CSEA warning notices company by company.

Those restrictions will mean that the Bill will be far less demanding than comparable international legislation in respect of the requirement on companies to proactively detect and remove online child abuse. So much for the Bill being world leading. For example, the EU child abuse legislative proposal published in May sets out clear and unambiguous requirements on companies to proactively scan for child abuse images and grooming behaviour on private messages.

If the regulator is unable to tackle online grooming sufficiently proactively, the impact will be disproportionately felt by girls. NSPCC data shows that an overwhelming majority of criminal offences target girls, with those aged 12 to 15 the most likely to be victims of online grooming. Girls were victims in 83% of offences where data was recorded. Labour recognises that once again there are difficulties in balancing our fundamental right to privacy with the Bill’s intention of keeping children safe. This probing new clause is designed to give the Government an opportunity to report on the effectiveness of their proposed approach.

Ultimately, the levels of grooming taking place on private messaging platforms are incredibly serious. I have two important testimonies that are worth placing on the record, both of which have been made anonymous to protect the victims but share the same sentiment. The first is from a girl aged 15. She said:

“I’m in a serious situation that I want to get out of. I’ve been chatting with this guy online who’s like twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to prove my trust to him, like doing video chats with my chest exposed.”

The second is from a boy aged 17. He said:

“I’ve got a fitness page on Instagram to document my progress but I get a lot of direct messages from weird people. One guy said he’d pay me a lot of money to do a private show for him. He now messages me almost every day asking for more explicit videos and I’m scared that if I don’t do what he says, then he will leak the footage and my life would be ruined”.

Those testimonies go to show how fundamentally important it is for an early assessment to be made of the effectiveness of the Government’s approach following the Bill gaining Royal Assent.

We all have concerns about the use of proactive technology in private messaging and its potential impact on personal privacy. End-to-end encryption offers both risks and benefits to the online environment, but the main concern is based on risk profiles. End-to-end encryption is particularly problematic on social networks because it is embedded in the broader functionality of the service, so all text, DMs, images and live chats could be encrypted. Consequently, its impact on detecting child abuse becomes even greater. There is an even greater risk with Meta threatening to bring in end-to-end encryption for all its services. If platforms cannot demonstrate that they can mitigate those risks to ensure a satisfactory risk profile, they should not be able to proceed with end-to-end encryption until satisfactory measures and mitigations are in place.

Tech companies have made significant efforts to frame this issue in the false binary that any legislation that impacts private messaging will damage end-to-end encryption and will mean that encryption will not work or is broken. That argument is completely false. A variety of novel technologies are emerging that could allow for continued CSAM scanning in encrypted environments while retaining the privacy benefits afforded by end-to-end encryption.

Apple, for example, has developed its NeuralHash technology, which allows for on-device scans for CSAM before a message is sent and encrypted. That client-side implementation—rather than server-side scanning—means that Apple does not learn anything about images that do not match the known CSAM database. Apple’s servers flag accounts that exceed a threshold number of images matching a known database of CSAM image hashes, so that Apple can provide relevant information to the National Center for Missing and Exploited Children. That process is secure and expressly designed to preserve user privacy.
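
To make that client-side pattern concrete, here is a minimal sketch of on-device matching against a database of known image hashes, with a threshold of matches required before an account is flagged. It is emphatically not Apple's implementation: NeuralHash is a perceptual hash designed to survive re-encoding and resizing, whereas this toy substitutes exact SHA-256 digests, and every name, threshold and piece of data in it is hypothetical.

```python
# A minimal sketch (not Apple's actual NeuralHash system) of client-side hash matching with
# a reporting threshold. NeuralHash is a perceptual hash designed to survive re-encoding and
# resizing; this toy substitutes exact SHA-256 digests, and all names, thresholds and data
# below are hypothetical.
import hashlib

MATCH_THRESHOLD = 3  # hypothetical number of database matches before an account is flagged


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: digest the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_outgoing_images(images: list[bytes], known_hashes: set[str]) -> bool:
    """Runs on the device before messages are encrypted and sent.

    Only the count of matches against the known database matters; nothing is learned about,
    or reported for, images that do not match.
    """
    matches = sum(1 for img in images if image_hash(img) in known_hashes)
    return matches >= MATCH_THRESHOLD


# Example usage with made-up data.
known_database = {image_hash(b"known-illegal-image-%d" % i) for i in range(5)}
outgoing = [b"holiday-photo", b"known-illegal-image-0", b"known-illegal-image-1"]
print(scan_outgoing_images(outgoing, known_database))  # False: two matches, below the threshold
```

The design point the sketch illustrates is the one described above: nothing is learned about images that do not match, and no account is reported until the threshold is crossed.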

Homomorphic encryption technology can perform image hashing on encrypted data without the need to decrypt the data. No identifying information can be extracted and no details about the encrypted image are revealed, but calculations can be performed on the encrypted data. Experts in hash scanning—including Professor Hany Farid of the University of California, Berkeley, who developed PhotoDNA—insist that scanning in end-to-end encrypted environments without damaging privacy will be possible if companies commit to providing the engineering resources to work on it.
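
The sketch below illustrates, in heavily simplified form, the idea of performing a hash comparison on encrypted data without decrypting it. It uses the open-source python-paillier (`phe`) library, which implements the additively homomorphic Paillier scheme, rather than any production scheme a vendor actually deploys; the tiny 16-bit "hashes" and the whole workflow are invented for illustration only.

```python
# A toy illustration (not the scheme any vendor deploys) of computing on encrypted data: an
# encrypted Hamming distance between a user's image hash and a known database hash, using the
# additively homomorphic Paillier scheme from the open-source `phe` (python-paillier) library.
# The 16-bit "hashes" and the workflow are invented, and in a real protocol the result would
# not simply be decrypted on the device as it is here.
from phe import paillier

HASH_BITS = 16  # real perceptual hashes are far longer; kept tiny so the demo runs quickly


def to_bits(value: int) -> list[int]:
    """Expand an integer hash into a fixed-length list of bits."""
    return [(value >> i) & 1 for i in range(HASH_BITS)]


# The device holds the key pair; the scanning service only ever sees ciphertexts.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

user_image_hash = 0b1010_1100_0011_0101  # hypothetical hash of a user's image
database_hash = 0b1010_1100_0011_0111    # hypothetical hash of a known image

# Device side: encrypt each bit of the user's hash before it leaves the device.
encrypted_bits = [public_key.encrypt(b) for b in to_bits(user_image_hash)]

# Service side: accumulate an *encrypted* Hamming distance against its plaintext hash.
# The XOR with a known plaintext bit is either the encrypted bit itself (database bit 0)
# or one minus the encrypted bit (database bit 1); both are computable under Paillier
# without ever decrypting.
encrypted_distance = public_key.encrypt(0)
for enc_bit, db_bit in zip(encrypted_bits, to_bits(database_hash)):
    encrypted_distance += enc_bit if db_bit == 0 else (enc_bit * -1 + 1)

# Only the key holder can read the result; the service never saw the hash or the image.
print(private_key.decrypt(encrypted_distance))  # 1: the two hashes differ in a single bit
```

The point, as Professor Farid and others argue, is that a match score can be computed while the content itself stays encrypted; the engineering work lies in turning sketches like this into schemes that are robust and safe at scale.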

To move beyond the argument that requiring proactive scanning for CSAM means breaking or damaging end-to-end encryption, amendments to the Bill could provide a powerful incentive for companies to invest in the technology and engineering resources that would allow them to continue scanning while pressing ahead with end-to-end encryption, so that privacy is preserved but appropriate resources for, and responses to, online child sexual abuse can continue. Some companies are highly unlikely to do that unless they have an explicit incentive to do so. Regulation can provide that incentive, and I urge the Minister to make it possible.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

It is a pleasure to follow the shadow Minister, who made some important points. I will focus on clause 37 stand part. I pay tribute to the Minister for his incredible work on the Bill, with which he clearly wants to stop harm occurring in the first place. We had a great debate on the matter of victim support. The Bill requires Ofcom to produce a number of codes of practice to help to achieve that important aim.

Clause 37 is clear: it requires codes of practice on illegal content and fraudulent advertising, as well as compliance with “the relevant duties”, and it is on that point that I hope the Minister can help me. Those codes will help Ofcom to take action when platforms do things that they should not, and will, I hope, provide a way for platforms to comply in the first place rather than falling foul of the rules.

How will the codes help platforms that are harbouring material or configuring their services in a way that might be explicitly or inadvertently promoting violence against women and girls? The Minister knows that women are disproportionately the targets of online abuse on social media or other platforms. The impact, which worries me as much as I am sure it worries him, is that women and girls are told to remove themselves from social media as a way to protect themselves against extremely abusive or harassing material. My concern is that the lack of a specific code to tackle those important issues might inadvertently mean that Ofcom and the platforms overlook them.

Would a violence against women and girls code of practice help to ensure that social media platforms were monitored by Ofcom for their work to prevent tech-facilitated violence against women and girls? A number of organisations think that it would, as does the Domestic Abuse Commissioner herself. Those organisations have drafted a violence against women and girls code of practice, which has been developed by an eminent group of specialists—the End Violence Against Women Coalition, Glitch, Carnegie UK Trust, the NSPCC, 5Rights, and Professors Clare McGlynn and Lorna Woods, both of whom gave evidence to us. They believe it should be mandatory for Ofcom to adopt a violence against women and girls code to ensure that this issue is taken seriously and that action is taken to prevent the risks in the first place. Clause 37 talks about codes, but it is not specific on that point, so can the Minister help us? Like the rest of the Committee, he wants to prevent women from experiencing these appalling acts online, and a code of practice could help us deal with that better.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I do not know whether everybody draws the same distinction as me. For me the distinction is that, because it will be happening with proactive technology—technological means will be scanning those messages rather than humans—nobody will see the messages. Software will scan messages, and should there be anything that is illegal—should there be child sexual abuse material—that is what will be flagged and further action taken.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am not sure whether the hon. Member for Wolverhampton North East heard my earlier contribution, but this technology does exist, so it is possible. The contrary is a false argument made by those who believe that any impact on end-to-end encryption will limit people’s privacy. The technology does exist, and I named some that is able to scan without preventing the encryption of the data. It simply scans for those images and checks them against existing databases. It would have no impact on anybody’s right to privacy.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the shadow Minister for her assistance with that intervention, which was incredibly helpful. I do not have concerns that anybody will be able to access that data. The only data that will be accessible is when the proactive technology identifies something that is illegal, so nobody can see any of the messages except for the artificial intelligence. When the AI recognises that something is abuse material, at that point the Bill specifies that it will go to the National Crime Agency if it is in relation to child abuse images.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Before we begin the next debate, does anyone wish to speak to Carla Lockhart’s amendment 97? If so, it will be debated as part of this group; otherwise, it will not be selected. The amendment is not selected.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 48, in clause 39, page 37, line 17, at beginning insert—

“(A1) OFCOM must prepare the draft codes of practice required under section 37 within the period of six months beginning with the day on which this Act is passed.”

This amendment requires Ofcom to prepare draft codes of practice within six months of the passing of the Act.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

Clauses 42 to 47 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

This is a mammoth part of the Bill, and I rise to speak to clause 39. Under the clause, Ofcom will submit a draft code of practice to the Secretary of State and, provided that the Secretary of State does not intend to issue a direction to Ofcom under clause 40, the Secretary of State will lay the draft code before Parliament. Labour’s main concern about the procedure for issuing codes of practice is that, without a deadline, they may not come into force for quite some time, and the online space needs addressing now. We have already waited far too long for the Government to bring forward the Bill. Parliamentary oversight is also fundamentally important, and the codes will have huge implications for the steps that service providers take, so it is vital that they are given due consideration at the earliest opportunity.

Amendment 48 would require Ofcom to prepare draft codes of practice within six months of the passing of the Act. This simple amendment would require Ofcom to bring forward these important codes of practice within an established time period—six months—after the Bill receives Royal Assent. Labour recognises the challenges ahead for Ofcom in both capacity and funding.

On this note, I must raise with the Minister something that I have raised previously. I find it most curious that his Department recently sought to hire an online safety regulator funding policy adviser. The job advert listed some of the key responsibilities:

“The post holder will support ministers during passage of the Online Safety Bill; secure the necessary funding for Ofcom and DCMS in order to set up the Online Safety regulator; and help implement and deliver a funding regime which is first of its kind in the UK.”

That raises worrying questions about how prepared Ofcom is for the huge task ahead. That being said, the Government have drafted the Bill in a way that brings codes of practice to its heart, so they cannot and should not be susceptible to delay.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady is very kind in giving way—I was twitching to stand up. On the preparedness of Ofcom and its resources, Ofcom was given about £88 million in last year’s spending review to cover this and the next financial year—2022-23 and 2023-24—so that it could get ready. Thereafter, Ofcom will fund itself by raising fees, and I believe that the policy adviser will most likely advise on supporting the work on future fees. That does not imply that there will be any delay, because the funding for this year and next year has already been provided by the Government.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I appreciate that intervention, but the Minister must be aware that if Ofcom has to fundraise itself, that raises questions about its future capability as a regulator and its funding and resource requirements. What will happen if it does not raise those funds?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady’s use of the word “fundraise” implies that Ofcom will be going around with a collection tin on a voluntary basis.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is your word.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will find the relevant clause in a moment. The Bill gives Ofcom the legal power to make the regulated companies pay fees to finance Ofcom’s regulatory work. It is not voluntary; it is compulsory.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the Minister for that clarification. Perhaps he should make that more obvious in the job requirements and responsibilities.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The fees requirements are in clauses 70 to 76, in particular clause 71, “Duty to pay fees”. The regulated companies have to pay the fees to Ofcom. It is not optional.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the Minister for that clarification.

The Government have drafted the Bill in a way that puts codes of practice at its heart, so they cannot and should not be susceptible to delay. We have heard from platforms and services that stress that the ambiguity of the requirements is causing concern. At least with a deadline for draft codes of practice, those that want to do the right thing will be able to get on with it in a timely manner.

The Age Verification Providers Association provided us with evidence in support of amendment 48 in advance of today’s sitting. The association agrees that early publication of the codes will set the pace for implementation, encouraging both the Secretary of State and Parliament to approve the codes swiftly. A case study it shared highlights delays in the system, which we fear will be replicated within the online space, too. Let me indulge Members with details of exactly how slow Ofcom’s recent record has been on delivering similar guidance required under the audio-visual media services directive.

The directive became UK law on 30 September 2020 and came into force on 1 November 2020. By 24 June 2021, Ofcom had issued a note as to which video sharing platforms were in scope. It took almost a year until, on 6 October 2021, Ofcom issued formal guidance on the measures.

In December 2021, Ofcom wrote to the verification service providers and

“signalled the beginning of a new phase of supervisory engagement”.

However, in March 2022 it announced that

“the information we collect will inform our Autumn 2022 VSP report, which intends to increase the public’s awareness of the measures platforms have in place to protect users from harm.”

There is still no indication that Ofcom intends to take enforcement action against the many VSPs that remain non-compliant with the directive. It is simply not good enough. I urge the Minister to carefully consider the aims of amendment 48 and to support it.

Labour supports the principles of clause 42. Ofcom must not drag out the process of publishing or amending the codes of practice. Labour also supports a level of transparency around the withdrawal of codes of practice, should that arise.

Labour also supports clause 43 and the principles of ensuring that Ofcom has a requirement to review its codes of practice. We do, however, have concerns over the Secretary of State’s powers in subsection (6). It is absolutely right that the Secretary of State of the day has the ability to make representations to Ofcom in order to prevent the disclosure of certain matters in the interests of national security, public safety or relations with the Government of a country outside the UK. However, I am keen to hear the Minister’s assurances about how well the Bill is drafted to prevent those powers from being used, shall we say, inappropriately. I hope he can address those concerns.

On clause 44, Ofcom should of course be able to propose minor amendments to its codes of practice. Labour does, however, have concerns about the assessment that Ofcom will have to make to ensure that the minor nature of changes will not require amendments to be laid before Parliament, as in subsection (1). As I have said previously, scrutiny must be at the heart of the Bill, so I am interested to hear from the Minister how exactly he will ensure that Ofcom is making appropriate decisions about what sorts of changes are allowed to circumvent parliamentary scrutiny. We cannot and must not get to a place where the Secretary of State, in agreeing to proposed amendments, actively prevents scrutiny from taking place. I am keen to hear assurances on that point from the Minister.

On clause 45, as I mentioned previously on amendment 65 to clause 37, as it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in subsection (1). However, providers could take alternative measures to comply, as outlined in subsection (5). Labour supports the clause in principle, but we are concerned that the definition of alternative measures is too broad. I would be grateful if the Minister could elaborate on his assessment of the instances in which a service provider may seek to comply via alternative measures. Surely the codes of practice should be, for want of a better phrase, best practice. None of us want to get into a position where service providers are circumventing their duties by taking the alternative measures route.

Again, Labour supports clause 46 in principle, but we feel that the provisions in subsection (1) could go further. We know that, historically, service providers have not always been transparent and forthcoming when compelled to be so by the courts. While we understand the reasoning behind subsection (3), we have broader concerns that service providers could, in theory, lean on their codes of practice as highlighting their best practice. I would be grateful if the Minister could address our concerns.

We support clause 47, which establishes that the duties in respect of which Ofcom must issue a code of practice under clause 37 will apply only once the first code of practice for that duty has come into force. However, we are concerned that this could mean that different duties will apply at different times, depending on when the relevant code for a particular duty comes into force. Will the Minister explain his assessment of how that will work in practice? We have concerns that drip feeding this information to service providers will cause further delay and confusion. In addition, will the Minister confirm how Ofcom will prioritise its codes of practice?

Lastly, we know that violence against women and girls has not a single mention in the Bill, which is an alarming and stark omission. Women and girls are disproportionately likely to be affected by online abuse and harassment. The Minister knows this—we all know this—and a number of us have spoken up on the issue on quite a few occasions. He also knows that online violence against women and girls is defined as including, but not limited to, intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive sexting and the creation and sharing of deepfake pornography.

The Minister will also know that Carnegie UK is working with the End Violence Against Women coalition to draw up what a code of practice to tackle violence against women and girls could look like. Why has that been left out of the redraft of the Bill? What consideration has the Minister given to including a code of this nature in the Bill? If the Minister is truly committed to tackling violence against women and girls, why will he not put that on the face of the Bill?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a quick question about timelines because I am slightly confused about the order in which everything will happen. It is unlikely that the Bill will have been through the full parliamentary process before the summer, yet Ofcom intends to publish information and guidance by the summer, even though some things, such as the codes of practice, will not come in until after the Bill has received Royal Assent. Will the Minister give a commitment that, whether or not the Bill has gone through the whole parliamentary process, Ofcom will be able to publish before the summer?

Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will do my best to make sure that we come to it very quickly indeed, by being concise in my replies on this group of amendments.

On amendment 48, which seeks to get Ofcom to produce its codes of practice within six months, obviously we are unanimous in wanting that to be done as quickly as possible. However, Ofcom has to go through a number of steps in order to produce those codes of practice. For example, first we have to designate in secondary legislation the priority categories of content that is harmful to children and content that is harmful to adults, and then Ofcom has to go through a consultation exercise before it publishes the codes. It has in the past indicated that it expects that to be a 12-month, rather than a six-month, process. I am concerned that a hard, six-month deadline may be either impossible to meet or make Ofcom rush and do it in a bad way. I accept the need to get this done quickly, for all the obvious reasons, but we also want to make sure that it is done right. For those reasons, a hard, six-month deadline would not help us very much.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Why does the Minister believe that six months is not achievable? Does he think that Ofcom is not adequately resourced to meet that deadline and make this happen as soon as possible?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There are a number of steps to go through. Regardless of how well resourced Ofcom is and how fast it works, first, we have to designate the priority categories by secondary legislation, and there is a lead time for that. Secondly, Ofcom has to consult. Best practice suggests that consultations need to last for a certain period, because the consultation needs to be written, then it needs to open, and then the responses need to be analysed. Then, Ofcom obviously has to write the codes of practice. It might be counterproductive to set a deadline that tight.

There are quite a few different codes of practice to publish, and the hon. Lady asked about that. The ones listed in clause 47 will not all come out at the same time; they will be staggered and prioritised. Obviously, the ones that are most germane to safety, such as those on illegal content and children’s safety, will be done first. We would expect them to be done as a matter of extreme urgency.

I hope I have partly answered some of the questions that the hon. Member for Aberdeen North asked. The document to be published before the summer, which she asked about, is a road map. I understand it to be a sort of timetable that will set out the plan for doing everything we have just been debating—when the consultations will happen and when the codes of practice will be published. I guess we will get the road map in the next few weeks, if “before the summer” means before the summer recess. We will have all that set out for us, and then the formal process follows Royal Assent. I hope that answers the hon. Lady’s question.

There were one or two other questions from the hon. Member for Pontypridd. She asked whether a Secretary of State might misuse the power in clause 43(2)—a shocking suggestion, obviously. The power is only to request a review; it is nothing more sinister or onerous than that.

On clause 44, the hon. Lady asked what would happen if Ofcom and the Secretary of State between them—it would require both—conspired to allow through a change claiming it is minor when in fact it is not minor. First, it would require both of them to do that. It requires Ofcom to propose it and the Secretary of State to agree it, so I hope the fact that it is not the Secretary of State acting alone gives her some assurance. She asked what the redress is if both the Secretary of State and Ofcom misbehave, as it were. Well, the redress is the same as with any mis-exercise of a public power—namely, judicial review, which, as a former Home Office Minister, I have experienced extremely frequently—so there is legal redress.

The hon. Lady then asked about the alternative measures. What if a service provider, rather than meeting its duties via the codes of practice, does one of the alternative measures instead? Is it somehow wriggling out of what it is supposed to do? The thing that is legally binding, which it must do and about which there is no choice because there is a legal duty, is the duties that we have been debating over the past few days. Those are the binding requirements that cannot be circumvented. The codes of practice propose a way of meeting those. If the service provider can meet the duties in a different way and can satisfy Ofcom that it has met those duties as effectively as it would under the codes of practice, it is open to doing that. We do not want to be unduly prescriptive. The test is: have the duties been delivered? That is non-negotiable and legally binding.

I hope I have answered all the questions, while gently resisting amendment 48 and encouraging the Committee to agree that the various other clauses stand part of the Bill.

Question put, That the amendment be made.

The Committee divided.

Online Safety Bill (Ninth sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 14th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 - (14 Jun 2022)
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Good morning, Ms Rees; it is, as always, a pleasure to serve under your chairship.

Amendment 84 would remove the Secretary of State’s ability to modify Ofcom codes of practice

“for reasons of public policy”.

Labour agrees with the Carnegie UK Trust assessment of this: the codes are the fulcrum of the regulatory regime, and such a power of direction is a significant interference in Ofcom’s independence. Ofcom itself has noted that the “reasons of public policy” power to direct might weaken the regime. If Ofcom has undertaken a logical, evidence-based process to arrive at a draft code, it is hard to see how a direction based on “reasons of public policy” could be anything but irrational, and that creates a vulnerability to legal challenge.

On clause 40 more widely, the Secretary of State should not be able to give Ofcom specific direction on non-strategic matters. Ofcom’s independence in day-to-day decision making is paramount to preserving freedom of expression. Independence of media regulators is the norm in developed democracies. The UK has signed up to many international statements in that vein, including as recently as April 2022 at the Council of Europe. That statement says that

“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”

The Bill introduces powers for the Secretary of State to direct Ofcom on internet safety codes. These provisions should immediately be removed. After all, in broadcasting regulation, Ofcom is trusted to make powerful programme codes with no interference from the Secretary of State. Labour further notes that although the draft Bill permitted this

“to ensure that the code of practice reflects government policy”,

clause 40 now specifies that any code may be required to be modified

“for reasons of public policy”.

Although that is more normal language, it is not clear what in practice the difference in meaning is between the two sets of wording. I would be grateful if the Minister could confirm what that is.

The same clause gives the Secretary of State powers to direct Ofcom, on national security or public safety grounds, in the case of terrorism or CSEA—child sexual exploitation and abuse—codes of practice. The Secretary of State might have some special knowledge of those, but the Government have not demonstrated why they need a power to direct. In the broadcasting regime, there are no equivalent powers, and the Secretary of State was able to resolve the case of Russia Today, on national security grounds, with public correspondence between the Secretary of State and Ofcom.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Good morning, Ms Rees; it is a pleasure to serve under your chairmanship again. The SNP spokesman and the shadow Minister have already explained what these provisions do, which is to provide a power for the Secretary of State to make directions to Ofcom in relation to modifying a code of practice. I think it is important to make it clear that the measures being raised by the two Opposition parties are, as they said, envisaged to be used only in exceptional circumstances. Of course the Government accept that Ofcom, in common with other regulators, is rightly independent and there should be no interference in its day-to-day regulatory decisions. This clause does not seek to violate that principle.

However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations, it may be appropriate for a direction to be given to modify that code. A recent and very real example would be the need to reflect the latest medical advice during a public health emergency. Obviously, we saw in the last couple of years, during covid, some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy for covid. There was also the purported and entirely false connection between 5G phone masts and covid. There were issues on public policy grounds—in this case, medical grounds—and it might have been appropriate to make sure that a code of practice was appropriately modified.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you, Ms Rees, for your hard work in chairing the Committee this morning; we really appreciate it. Amendment 89 relates to below-the-line comments on newspaper articles. For the avoidance of doubt, if we do not get amendment 89, I am more than happy to support the Labour party’s amendment 43, which has a similar effect but covers slightly fewer—or many fewer—organisations and places.

Below-the-line comments in newspaper articles are infamous. They are places that everybody fears to go. They are worse than Twitter. In a significant number of ways, below-the-line comments are an absolute sewer. I cannot see any reasonable excuse for them to be excluded from the Bill. We are including Twitter in the Bill; why are we not including below-the-line comments for newspapers? It does not make any sense to me; I do not see any logic.

We heard a lot of evidence relating to freedom of speech and a free press, and I absolutely, wholeheartedly agree with that. However, the amendment would not stop anyone writing a letter to the editor. It would not stop anyone engaging with newspapers in the way that they would have in the print medium. It would still allow that to happen; it would just ensure that below-the-line comments were subject to the same constraints as posts on Twitter. That is the entire point of amendment 89.

I do not think that I need to say much more, other than to add one point about the way comments can direct people to other, more radical and extreme pieces or sources of information. It is sometimes the case that the comments on a newspaper article will direct people to even more extreme views. The article itself may be only slightly derogatory, while some of the comments may contain links or references to other pieces, and other places on the internet, where people can find a more radical point of view. That is exactly what happens on Twitter, and it is exactly the kind of thing that we are trying to avoid—sending people down an extremist rabbit hole. I do not understand how the Minister thinks that the clause, which excludes below-the-line newspaper comments, is justifiable or acceptable.

Having been contacted by a number of newspapers, I understand and accept that some newspapers have moderation policies for their comments sections, but that is not strong enough. Twitter has a moderation policy, but that does not mean that there is actually any moderation, so I do not think that subjecting below-the-line comments to the provisions of the Bill is asking too much. It is completely reasonable for us to ask for this to happen, and I am honestly baffled as to why the Minister and the Government have chosen to make this exemption.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Before I address the amendments, I will speak to clause 49 more broadly.

Labour has concerns about a number of subsections of the clause, including subsections (2) and (8) to (10)—commonly known as the news publisher content exemption, which I have spoken about previously. We understand that the intention of the exemption is to shield broadcasters and traditional newspaper publishers from the Bill’s regulatory effects. Clause 50(2) defines a “recognised news publisher” as a regulated broadcaster or any other publisher that publishes news, has an office, and has a standards code and complaints process. There is no detail about the latter two requirements, thus enabling almost any news publishing enterprise to design its own code and complaints process, however irrational, and so benefit from the exemption. “News” is also defined broadly, and may include gossip. There remains a glaring omission, which amendment 43 addresses and which I will come to.

During an earlier sitting of the Committee, in response to comments made by my hon. Friend the Member for Liverpool, Walton as we discussed clause 2, the Minister claimed that

“The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic.”––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 204.]

Clause 49 exempts one-to-one live aural communications from the scope of regulation. Given that much interaction in virtual reality is live aural communication, including between two users, it is hard to understand how that would be covered by the Bill.

There is also an issue about what counts as content. Most standard understandings would define “content” as text, video, images and audio, but one of the worries about interactions in VR is that behaviour such as physical violence will be able to be replicated virtually, with psychologically harmful effects. It is very unclear how that would be within the scope of the current Bill, as it does not clearly involve content, so could the Minister please address that point? As he knows, Labour advocates for a systems-based approach, and for risk assessments and systems to take place in a more upstream and tech-agnostic way than under the current approach. At present, the Bill would struggle to be expanded effectively enough to cover those risks.

Amendment 43 removes comments sections operated by news websites whose publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content. If the Bill is to be effective in protecting the public from harm, the least it must accomplish is a system of accountability that covers all the largest platforms used by British citizens. Yet as drafted, the Bill would exempt some of the most popular social media platforms online: those hosted on news publisher websites, otherwise known as comments sections. The amendment would close that loophole and ensure that the comments sections of the largest newspaper websites are subject to the regime of regulation set out in the Bill.

Newspaper comments sections are no different from the likes of Facebook and Twitter, in that they are social media platforms that allow users to interact with one another. This is done through comments under stories, comments in response to other comments, and other interactions—for example, likes and dislikes on posts. In some ways, their capacity to cause harm to the public is even greater: for example, their reach is in many cases larger than even the biggest of social media platforms. Whereas there are estimated to be around 18 million users of Twitter in the UK, more than twice that number of British citizens access newspaper websites every month, and the harm perpetuated on those platforms is severe.

In July 2020, the rapper Wiley posted a series of antisemitic tweets, which Twitter eventually removed after an unacceptable delay of 48 hours, but under coverage of the incident in The Sun newspaper, several explicitly antisemitic comments were posted. Those comments contained holocaust denial and alleged a global Jewish conspiracy to control the world. They remained up and accessible to The Sun’s 7 million daily readers for the best part of a week. If we exempt comments sections from the Bill’s proposed regime and the duties that the Bill sets for platforms, we will send the message that that kind of vicious, damaging and harmful racism is acceptable.

Similarly, after an antisemitic attack in the German city of Halle, racist comments followed in the comments section under the coverage in The Sun. There are more examples: Chinese people being described as locusts and attacked with other racial slurs; 5G and Bill Gates conspiracy theories under articles on the Telegraph website; and, of course, the most popular targets for online abuse, women in public life. Comments describing the Vice-President of the United States as a “rat” and a “ho” appeared on MailOnline. A female union leader has faced dozens of aggressive and abusive comments about her appearance, and many such comments remain accessible on newspaper comments sections to this day. Some of them have been up for months, others for years.

Last week, the Committee was sent a letter from a woman who was the victim of comments section abuse, Dr Corinne Fowler. Dr Fowler said of the comments that she received:

“These comments contained scores of suggestions about how to kill or injure me. Some were general ideas, such as hanging, but many were gender specific, saying that I should be burnt at the stake like a witch. Comments focused on physical violence, one man advising that I should be slapped hard enough to make my teeth chatter”.

She added:

“I am a mother: without me knowing, my son (then 12 years old) read these reader comments. He became afraid for my safety.”

Without the amendment, the Bill cannot do anything to protect women such as Dr Fowler and their families from this vile online abuse, because comments sections will be entirely out of scope of the Bill’s new regime and the duties designed to protect users.

As I understand it, two arguments have been made to support the exemption. First, it is argued that the complaints handlers for the press already deal with such content, but the handler for most national newspapers, the Independent Press Standards Organisation, will not act until a complaint is made. It then takes an average of six months for a complaint to be processed, and it cannot do anything if the comments have not been moderated. The Opposition do not feel that that is a satisfactory response to the seriousness of the harms that we know occur, and which I have described. IPSO does not even have a code to deal with the cases of antisemitic abuse that appeared in the comments section of The Sun. IPSO’s record speaks for itself in the examples that I have given, and the many more like them, and it has proved to be no solution to the severity of the harms that appear in newspaper comments sections.

The second argument for an exemption is that publishers are legally responsible for what appears on comments sections, but that is only relevant for illegal harms. For everything else, from disinformation to racial prejudice and abuse, regulation is needed. That is why it is so important that the Bill does the job that we were promised. To keep the public safe from harm online, comments sections must be covered under the Bill.

The amendment is a proportionate solution to the problem of comments section abuse. It would protect users’ freedom of expression and, given that it is subject to a turnover threshold, ensure that duties and other requirements do not place a disproportionate burden on smaller publishers such as locals, independents and blogs.

I have reams and reams and reams of examples from comments sections that all fall under incredibly harmful abuse and should be covered by the Bill. I could be here for hours reading them all out, and while I do not think that anybody in Committee would like me to, I urge Committee members to take a look for themselves at the types of comments under newspaper articles and ask themselves whether those comments should be covered by the terms of the Bill. I think they know the answer.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On a point of order, Ms Rees. Are we considering clause 49 now? I know that it is supposed to be considered under the next set of amendments, but I just wondered, because I have separate comments to make on that clause that I did not make earlier, as I spoke purely to the amendment.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by addressing the substance of the two amendments and then I will answer one or two of the questions that arose in the course of the debate.

As Opposition Members have suggested, the amendments would bring the comments that appear below the line on news websites such as The Guardian, MailOnline or the BBC into the scope of the Bill’s safety duties. They are right to point out that there are occasions when the comments posted on those sites are extremely offensive.

There are two reasons why comments below BBC, Guardian or Mail articles are excluded from the scope of the Bill. First, the news media publishers—newspapers, broadcasters and their representative industry bodies—have made the case to the Government, which we are persuaded by, that the comments section below news articles is an integral part of the process of publishing news and of what it means to have a free press. The news publishers—both newspapers and broadcasters that have websites—have made that case and have suggested, and the Government have accepted, that intruding into that space through legislation and regulation would represent an intrusion into the operation of the free press.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am sorry, but I am having real trouble buying that argument. If the Minister is saying that newspaper comments sections are exempt in order to protect the free press because they are an integral part of it, why do we need the Bill in the first place? Social media platforms could argue in the same way that they are protecting free speech. They could ask, “Why should we regulate any comments on our social media platform if we are protecting free speech?” I am sorry; that argument does not wash.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There is a difference between random individuals posting stuff on Facebook and content generated by what we have defined as a “recognised news publisher”. We will debate that in a moment. We recognise that difference in the Bill. Although the Opposition are looking to make amendments to clause 50, they appear to accept that the press deserve special protection. Article 10 case law deriving from the European convention on human rights also recognises that the press have a special status. In our political discourse we often refer generally to the importance of the freedom of the press. We recognise that the press are different, and the press have made the case—both newspapers and broadcasters, all of which now have websites—that their reader engagement is an integral part of that free speech. There is a difference between that and individuals chucking stuff on to Facebook outside the context of a news article.

There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.

Virality is an inherent design feature of social media sites. It is not an inherent design feature of the comments under the news websites of the BBC, The Guardian or the Daily Mail. There is no way of generating virality there in the same way as there is on Facebook and Twitter. Facebook and Twitter are designed to generate massive virality in a way that comments below a news website are not. The reach of such comments, and their ability to grow exponentially, is orders of magnitude lower in a news website comment section than on Facebook. That is an important difference from a risk point of view.

--- Later in debate ---
John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

No, I will let that particular weed die in the bed. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Briefly, as with earlier clauses, the Labour party recognises the challenge in finding the balance between freedom of expression and keeping people safe online. Our debate on the amendment has illustrated powerfully that the exemptions as they stand in the Bill are hugely flawed.

First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many in my earlier contributions, and I have paid for that online.

Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions. I recall that Kyle from FairVote made that point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not itself be sufficient reason to treat such content differently from that which is posted by private citizens.

Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, and yet there is no parallel stipulation for non-broadcast media to be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media, the series of requirements is not rational, excludes many independent publishers and leaves room for ambiguity.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of questions that were probably too long for interventions. The Minister said that if comments on a site are the only user-generated content, they are not in scope. It would be really helpful if he explained what exactly he meant by that. We were talking about services that do not fall within the definition of “recognised news publishers”, because we were trying to add them to that definition. I am not suggesting that the Minister is wrong in any way, but I do not understand where the Bill states that those comments are excluded, and how this all fits together.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I made general comments about clause 50 during the debate on amendment 107; I will not try the Committee’s patience by repeating them, but I believe that in them, I addressed some of the issues that the shadow Minister, the hon. Member for Pontypridd, has raised.

On the hon. Member for Aberdeen North’s question about where the Bill states that sites with limited functionality—for example, functionality limited to comments alone—are out of scope, paragraph 4(1) of schedule 1 states that

“A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—

(a) posting comments or reviews relating to provider content;

(b) sharing such comments or reviews on a different internet service”.

Clearly, services where a user can share freely are in scope, but if they cannot share directly—if they can only share via another service, such as Facebook—that service is out of scope. This speaks to the point that I made to the hon. Member for Batley and Spen in a previous debate about the level of virality, because the ability of content to spread, proliferate, and be forced down people’s throats is one of the main risks that we are seeking to address through the Bill. I hope that paragraph 4(1) of schedule 1 is of assistance, but I am happy to discuss the matter further if that would be helpful.

Question put and agreed to.

Clause 50 accordingly ordered to stand part of the Bill.

Clause 51

“Search content”, “search results” etc

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour does not oppose the intention of the clause. It is important to define “search content” in order to understand the responsibilities that fall within search services’ remits.

However, we have issues with the way that the Bill treats user-to-user services and search services differently when it comes to risk-assessing and addressing legal harm—an issue that we will come on to when we debate schedule 10. Although search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars are fundamentally their responsibility. We do, however, accept that over the past 20 years, Google, for example, has developed mechanisms to provide a safer search experience for users while not curtailing access to lawful information. We also agree that search engines are critical to the proper functioning of the world wide web; they play a uniquely important role in facilitating access to the internet, and enable people to access, impart, and disseminate information.

Question put and agreed to.

Clause 51 accordingly ordered to stand part of the Bill.

Clause 52

“Illegal content” etc

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 61, in clause 52, page 49, line 5, at end insert—

“(4A) An offence referred to in subsection (4) is deemed to have occurred if it would be an offence under the law of the United Kingdom regardless of whether or not it did take place in the United Kingdom.”

This amendment brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

That schedules 5 and 6 be the Fifth and Sixth schedules to the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

With your permission, Ms Rees, I will speak to clause 52 before coming to amendment 61. Illegal content is defined in clause 52(2) as

“content that amounts to a relevant offence.”

However, as the Minister will know from representations from Carnegie UK to his Department—we share its concerns—the illegal and priority illegal regimes may not be able to operate as intended. The Bill requires companies to decide whether content “amounts to” an offence, with limited room for movement. We share concerns that that points towards decisions on an item-by-item basis; it means detecting intent for each piece of content. However, such an approach does not work at the scale on which platforms operate; it is bad regulation and poor risk management.

There seem to be two different problems relating to the definition of “illegal content” in clause 52. The first is that it is unclear whether we are talking about individual items of content or categories of content—the word “content” is ambiguous because it can be singular or plural—which is a problem for an obligation to design and run a system. Secondly, determining when an offence has taken place will be complex, especially bearing in mind mens rea and defences, so the providers are not in a position to get it right.

The use of the phrase “amounts to” in clause 52(2) seems to suggest that platforms will be required to identify accurately, in individual cases, where an offence has been committed, without any wriggle room drafted in, unlike in the draft Bill. As the definition now contains no space for error either side of the line, it could be argued that there are more incentives to avoid false negatives than false positives—providers can set higher standards than the criminal law—and that leads to a greater risk of content removal. That becomes problematic, because it seems that the obligation under clause 9(3) is then to have a system that is accurate in all cases, whereas it would be more natural to deal with categories of content. This approach seems not to be intended; support for that perspective can be drawn from clause 9(6), which recognises that there is a distinction between categories of content and individual items, and that the application of terms of service might specifically have to deal with individual instances of content. Critically, the “amounts to” approach cannot work in conjunction with a systems-based approach to harm reduction. That leaves victims highly vulnerable.

This problem is easily fixed by a combination of reverting to the draft Bill’s language, which required reasonableness, and using concepts found elsewhere in the Bill that enable a harm mitigation system to operate for illegal content. We also remind the Minister that Ofcom raised this issue in the evidence sessions. I would be grateful if the Minister confirmed whether we can expect a Government amendment to rectify this issue shortly.

More broadly, as we know, priority illegal content, which falls within illegal content, includes,

“(a) terrorism content,

(b) CSEA content, and

(c) content that amounts to an offence specified in Schedule 7”,

as set out in clause 52(7). Such content attracts a greater level of scrutiny and regulation. Situations in which user-generated content will amount to “a relevant offence” are set out in clause 52(3). Labour supports the inclusion of a definition of illegal content as outlined in the grouping; it is vital that service providers and platforms have a clear indication of the types of content that they will have a statutory duty to consider when building, or making changes to the back end of, their business models.

We have also spoken about the importance of parity between the online and offline spaces—what is illegal offline must be illegal online—so the Minister knows we have more work to do here. He also knows that we have broad concerns around the omissions in the Bill. While we welcome the inclusion of terrorism and child sexual exploitation content as priority illegal content, there remain gaps in addressing violence against women and girls content, which we all know is hugely detrimental to many online.

The UK Government stated that their intention for the Online Safety Bill was to make the UK the safest place to be online in the world, yet the Bill does not mention online gender-based violence once. More than 60,000 people have signed the Glitch and End Violence Against Women Coalition’s petition calling for women and girls to be included in the Bill, so the time to act is now. We all have a right to not just survive but thrive, engage and play online, and not have our freedom of expression curtailed or our voices silenced by perpetrators of abuse. The online space is just as real as the offline space. The Online Safety Bill is our opportunity to create safe digital spaces.

The Bill must name the problem. Violence against women and girls, particularly those who have one or multiple protected characteristics, is creating harm and inequality online. We must actively and meaningfully name this issue and take an intersectional approach to ending online abuse to ensure that the Bill brings meaningful change for all women. We also must ensure that the Bill truly covers all illegal content, whether it originated in the UK or not.

Amendment 61 brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content. The aim of the amendment is to clarify whether the Bill covers content created overseas that would be illegal if what was shown in the content took place in the UK. For example, animal abuse and cruelty content is often filmed abroad. The same can be said for dreadful human trafficking content and child sexual exploitation. The optimal protection would be if the Bill’s definition of illegal content covered matter that would be illegal in either the UK or the country it took place in, regardless of whether it originated in the UK.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I do not intend to make a speech, but I want to let the hon. Lady know that we wholeheartedly support everything that she has said on the clause and amendment 61.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.

The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about the overwhelming practical challenges of the Bill. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:

“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]

That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I rise to speak on clause 52 stand part, particularly—the Minister will not be surprised—the element in subsection (4)(c) around the offences specified in schedule 7. The debate has been very wide-ranging throughout our sittings. It is extraordinary that we need a clause defining what is illegal. Presumably, most people who provide goods and services in this country would soon go out of business if they were not knowledgeable about what is illegal. The Minister is helping the debate very much by setting out clearly what is illegal, so that people who participate in the social media world are under no illusion as to what the Government are trying to achieve through this legislation.

The truth is that the online world has unfolded without a regulatory framework. New offences have emerged, and some of them are tackled in the Bill, particularly cyber-flashing. Existing offences have taken on a new level of harm for their victims, particularly when it comes to taking, making and sharing intimate images without consent. As the Government have already widely acknowledged, because the laws on that are such a patchwork, it is difficult for the enforcement agencies in this country to adequately protect the victims of that heinous crime, who are, as the Minister knows, predominately women.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As always, the right hon. Lady makes an incredibly powerful point. She asked specifically about whether the Bill is a suitable legislative vehicle in which to implement any Law Commission recommendations—we do not yet have the final version of that report—and I believe that that would be in scope. A decision about legislative vehicles depends on the final form of the Law Commission report and the Ministry of Justice response to it, and on cross-Government agreement about which vehicle to use.

I hope that addresses all the questions that have been raised by the Committee. Although the shadow Minister is right to raise the question, I respectfully ask her to withdraw amendment 61 on the basis that those matters are clearly covered in clause 52(9). I commend the clause to the Committee.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the Minister for his comments. The Labour party has concerns that clause 52(9) does not adequately get rid of the ambiguity around potential illegal online content. We feel that amendment 61 sets that out very clearly, which is why we will press it to a vote.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Just to help the Committee, what is it in clause 52(9) that is unclear or ambiguous?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We just feel that amendment 61 outlines matters much more explicitly and leaves no ambiguity by clearly defining any

“offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I think they say the same thing, but we obviously disagree.

Question put, That the amendment be made.

Online Safety Bill (Tenth sitting) Debate

Full Debate: Read Full Debate

Alex Davies-Jones

Main Page: Alex Davies-Jones (Labour - Pontypridd)

Online Safety Bill (Tenth sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 14th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 - (14 Jun 2022)
John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

I do, of course, agree. As anyone who has suffered with someone in their family committing suicide knows, it has a lifelong family effect. It is yet another amendment where I feel we should depart from the pantomime of so much parliamentary procedure, where both sides fundamentally agree on things but Ministers go through the torturous process of trying to tell us that every single amendment that any outside body or any Opposition Member, whether from the SNP or the Labour party, comes up with has been considered by the ministerial team and is already incorporated or covered by the Bill. They would not be human if that were the case. Would it not be refreshing if there were a slight change in tactic, and just occasionally the Minister said, “Do you know what? That is a very good point. I think I will incorporate it into the Bill”?

None of us on the Opposition Benches seeks to make political capital out of any of the things we propose. All of us, on both sides of the House, are here with the best of intentions, to try to ensure that we get the best possible Bill. We all want to be able to vote for the Bill at the end of the day. Indeed, as I said, I have worked with two friends on the Conservative Benches—with the hon. Member for Watford on the Joint Committee on the draft Bill and with the hon. Member for Wolverhampton North East on the Select Committee on Digital, Culture, Media and Sport—and, as we know, they have both voted for various proposals. It is perhaps part of the frustration of the party system here that people are forced to go through the hoops and pretend that they do not really agree with things that they actually do agree with.

Let us try to move on with this, in a way that we have not done hitherto, and see if we can agree on amendments. We will withdraw amendments if we are genuinely convinced that they have already been considered by the Government. On the Government side, let them try to accept some of our amendments—just begin to accept some—if, as with this one, they think they have some merit.

I was talking about Samaritans, and exactly what it wants to do with the Bill. It is concerned about harmful content after the Bill is passed. This feeds into potentially the most important aspect of the Bill: it does not mandate risk assessments based exclusively on risk. By adding in the qualifications of size and scope, the Bill wilfully lets some of the most harmful content slip through its fingers—wilfully, but I am sure not deliberately. Categorisation will be covered by a later amendment, tabled by my hon. Friend the Member for Aberdeen North, so I shall not dwell on it now.

In July 2021, the Law Commission for England and Wales recommended the creation of a new narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. The commission identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm.

In conclusion, I urge the Minister to listen not just to us but to the expert charities, including Samaritans, to help people who have lived experience of self-harm and suicide who are calling for regulation of these dangerous sites.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Good afternoon, Sir Roger; it is a pleasure, as ever, to serve under your chairship. I rise to speak to new clause 36, which has been grouped with amendment 142 and is tabled in the names of the hon. Members for Ochil and South Perthshire and for Aberdeen North.

I, too, pay tribute to Samaritans for all the work it has done in supporting the Bill and these amendments to it. As colleagues will be aware, new clause 36 follows a recommendation from the Law Commission dating back to July 2021. The commission recommended the creation of a new, narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. It identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm, despite the fact that other recommendations from the Law Commission report have been brought into the Bill, such as creating a new offence of cyber-flashing and prioritising tackling illegal suicide content.

We all know that harmful suicide and self-harm content is material that has the potential to cause or exacerbate self-harm and suicidal behaviours. Content relating to suicide and self-harm falls into both categories in the Bill—illegal content and legal but harmful content. Encouraging or assisting suicide is also currently a criminal offence in England and Wales under the Suicide Act 1961, as amended by the Coroners and Justice Act 2009.

Content encouraging or assisting someone to take their own life is illegal and has been included as priority illegal content in the Bill, meaning that platforms will be required to proactively and reactively prevent individuals from encountering it, and search engines will need to structure their services to minimise the risk to individuals encountering the content. Other content, including content that positions suicide as a suitable way of overcoming adversity or describes suicidal methods, is legal but harmful.

The Labour party’s Front-Bench team recognises that not all content falls neatly into the legal but harmful category. What can be helpful for one user can be extremely distressing to others. Someone may find it extremely helpful to share their personal experience of suicide, for example, and that may also be helpful to other users. However, the same material could heighten suicidal feelings and levels of distress in someone else. We recognise the complexities of the Bill and the difficulties in developing a way around this, but we should delineate harmful and helpful content relating to suicide and self-harm, and that should not detract from tackling legal but clearly harmful content.

In its current form, the Bill will continue to allow legal but clearly harmful suicide and self-harm content to be accessed by over-18s. Category 1 platforms, which have the highest reach and functionality, will be required to carry out risk assessments of, and set out in their terms and conditions their approach to, legal but harmful content in relation to over-18s. As the hon. Member for Ochil and South Perthshire outlined, however, the Bill’s impact assessment states that “less than 0.001%” of in-scope platforms

“are estimated to meet the Category 1 and 2A thresholds”,

and estimates that only 20 platforms will be required to fulfil category 1 obligations. There is no requirement on the smaller platforms, including those that actively encourage suicide, to do anything at all to protect over-18s. That simply is not good enough. That is why the Labour party supports new clause 36, and we urge the Minister to do the right thing by joining us.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is, as always, a great pleasure to serve under your chairmanship, Sir Roger. The hon. Member for Ochil and South Perthshire made an observation in passing about the Government’s willingness to listen and respond to parliamentarians about the Bill. We listened carefully to the extensive prelegislative scrutiny that the Bill received, including from the Joint Committee on which he served. As a result, we have adopted 66 of the changes that that Committee recommended, including on significant things such as commercial pornography and fraudulent advertising.

If Members have been listening to me carefully, they will know that the Government are doing further work or are carefully listening in a few areas. We may have more to say on those topics as the Bill progresses; it is always important to get the drafting of the provisions exactly right. I hope that that has indicated to the hon. Gentleman our willingness to listen, which I think we have already demonstrated well.

On new clause 36, it is important to mention that there is already a criminal offence of inciting suicide. It is a schedule 7 priority offence, so the Bill already requires companies to tackle content that amounts to the existing offence of inciting suicide. That is important. We would expect the promotion of material that encourages children to self-harm to be listed as a primary priority harm relating to children, where, again, there is a proactive duty to protect them. We have not yet published that primary priority harm list, but it would be reasonable to expect that material encouraging children to self-harm would be on it. Again, although we have not yet published the list of content that will be on the adult priority harm list—obviously, I cannot pre-empt the publication of that list—one might certainly wish for content that encourages adults to self-harm to appear on it too.

The hon. Gentleman made the point that duties relating to adults would apply only to category 1 companies. Of course, the ones that apply to children would apply to all companies where there was significant risk, but he is right that were that priority harm added to the adult legal but harmful list, it would apply only to category 1 companies.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

These amendments pick up a question asked by the hon. Member for Aberdeen North much earlier in our proceedings. In schedule 7 we set out the priority offences that exist in English and Welsh law. We have consulted the devolved Administrations in Scotland and Northern Ireland extensively, and I believe we have agreed with them a number of offences in Scottish and Northern Irish law that are broadly equivalent to the English and Welsh offences already in schedule 7. Basically, Government amendments 116 to 126 add those devolved offences to the schedule.

In future, if new Scottish or Northern Irish offences are created, the Secretary of State will be able to consult Scottish or Northern Irish Ministers and, by regulations, amend schedule 7 to add the new offences that may be appropriate if conceived by the devolved Parliament or Assembly in due course. That, I think, answers the question asked by the hon. Lady earlier in our proceedings. As I say, we consulted the devolved Administrations extensively and I hope that the Committee will assent readily to the amendments.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The amendments aim to capture all the criminal offences in other parts of the UK to be covered by the provisions of the Bill, as the Minister outlined. An offence in one part of the UK will be considered an offence elsewhere, for the purposes of the Bill.

With reference to some of the later paragraphs, I am keen for the Minister to explain briefly how this will work in the case of Scotland. We believe that the revenge porn offence in Scotland is more broadly drawn than the English version, so the level of protection for women in England and Wales will be increased. Can the Minister confirm that?

The Bill will not apply the Scottish offence to English offenders, but it means that content that falls foul of the law in Scotland, but not in England or Wales, will still be relevant regulated content for service providers, irrespective of the part of the UK in which the service users are located. That makes sense from the perspective of service providers, but I will be grateful for clarity from the Minister on this point.

--- Later in debate ---
John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

It is an interesting question. Alas, I long ago stopped trying to put myself into the minds of Conservative Ministers—a scary place for any of us to be.

We understand that it is difficult to try to regulate in respect of human trafficking on platforms. It requires work across borders and platforms, with moderators speaking different languages. We established that Facebook does not have moderators who speak different languages. On the Joint Committee on the draft Bill, we discovered that Facebook does not moderate content in English to any adequate degree. Just look at the other languages around the world—do we think Facebook has moderators who work in Turkish, Finnish, Swedish, Icelandic or a plethora of other languages? It certainly does not. The only language that Facebook tries to moderate—deeply inadequately, as we know—is English. We know how bad the moderation is in English, so can the Committee imagine what it is like in some of the world’s other languages? The most terrifying things are allowed to happen without moderation.

Regulating in respect of human trafficking on platforms is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and the costs that will be entailed. If human trafficking is not designated a priority harm, I fear it will fall by the wayside, so I must ask the Minister: is human trafficking covered by another provision on priority illegal content? Like my hon. Friend the Member for Aberdeen North, I cannot see where in the Bill that lies. If the answer is yes, why are the human rights groups not satisfied with the explanation? What reassurance can the Minister give to the experts in the field? Why not add a direct reference to the Modern Slavery Act, as in the amendment?

If the answer to my question is no, I imagine the Minister will inform us that the Bill requires platforms to consider all illegal content. In what world is human trafficking that is facilitated online not a priority? Platforms must be forced to be proactive on this issue; if not, I fear that human trafficking, like so much that is non-priority illegal content, will not receive the attention it deserves.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Schedule 7 sets out the list of criminal content that in-scope firms will be required to remove as a priority. Labour was pleased to see new additions to the most recent iteration, including criminal content relating to online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain. The Government’s consultation response suggests that the systems and processes that services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.

More widely, although we appreciate that the establishment of priority offences online is the route the Government have chosen to go down with the Bill, we believe the Bill remains weak in relation to addressing harms to adults and wider societal harms. Sadly, it has seemingly missed a number of known harms to both adults and children, which we feel is a serious omission. Three years on from the White Paper, the Government know where the gaps are, yet they have failed to address them. That is why we are pleased to support the amendment tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North.

Human trafficking offences are a serious omission from schedule 7 that must urgently be rectified. As we all know from whistleblower Frances Haugen’s revelations, Facebook stands accused, among a vast array of other social problems, of profiting from the trade and sale of human beings—often for domestic servitude—by human traffickers. We also know that, according to internal documents, the company has been aware of the problems since at least 2018. As the hon. Member for Ochil and South Perthshire said, we know that a year later, on the heels of a BBC report that documented the practice, the problem was said to be so severe that Apple itself threatened to pull Facebook and Instagram from its app store. It was only then that Facebook rushed to remove content related to human trafficking and made emergency internal policy changes to avoid commercial consequences described as “potentially severe” by the company. However, an internal company report detailed that the company did not take action prior to public disclosure and threats from Apple—profit over people.

In a complaint to the US Securities and Exchange Commission first reported by The Wall Street Journal, whistleblower Haugen wrote:

“Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.”

I cannot believe that the Government have failed to commit to doing more to tackle such abhorrent practices, which are happening every day. I therefore urge the Minister to do the right thing and support amendment 90.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The first thing to make clear to the Committee and anyone listening is that, of course, offences under the Modern Slavery Act 2015 are brought into the scope of the illegal content duties of this Bill through clause 52(4)(d), because such offences involve an individual victim.

Turning to the priority offences set out in schedule 7—I saw this when I was a Home Office Minister—modern slavery is generally associated with various other offences that are more directly visible and identifiable. Modern slavery itself can be quite hard to identify. That is why our approach is, first, to incorporate modern slavery as a regular offence via clause 52(4)(d) and, secondly, to specify as priority offences those things that are often identifiable symptoms of it and that are feasibly identified. Those include many of the offences listed in schedule 7, such as causing, inciting or controlling prostitution for gain, as in paragraph 16 on sexual exploitation, which is often the manifestation of modern slavery; money laundering, which is often involved where modern slavery takes place; and assisting illegal immigration, because modern slavery often involves moving somebody across a border, which is covered in paragraph 15 on assisting illegal immigration, as per section 25 of the Immigration Act 1971.

Modern slavery comes into scope directly via clause 52(4)(d) and because the practicably identifiable consequences of modern slavery are listed as priority offences, I think we do have this important area covered.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I have had no indication that anybody wishes to move Carla Lockhart’s amendment 98—she is not a member of the Committee.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is absolutely right that the Government have included a commitment to children in the form of defining primary priority content that is harmful. We all know of the dangerous harms that exist online for children, and while the Opposition support the overarching aims of the Bill, we feel the current definitions do not go far enough—that is a running theme with this Bill.

The Bill does not adequately address the risks caused by the design—the functionalities and features of services themselves—or those created by malign contact with other users, which we know to be an immense problem. Research has found that online grooming of young girls has soared by 60% in the last three years—and four in five victims are girls. We also know that games increasingly have addictive gambling-style features. Those without user-to-user functionalities, such as Subway Surfers, which aggressively promotes in-app purchases, are currently out of scope of the Bill.

Lastly, research by Parent Zone found that 91% of children say that loot boxes are available in the games they play and 40% have paid to open one. That is not good enough. I urge the Minister to consider his approach to tackling harmful content and the impact that it can have in all its forms. When considering how children will be kept safe under the new regime, we should consider concerns flagged by some of the civil society organisations that work with them. Organisations such as the Royal College of Psychiatrists, The Mix, YoungMinds and the Mental Health Foundation have all been instrumental in their calls for the Government to do more. While welcoming the intention to protect children, they note that it is not clear at present how some categories of harm, including material that damages people’s body image, will be regulated—or whether it will be regulated at all.

While the Bill does take steps to tackle some of the most egregious, universally damaging material that children currently see, it does not recognise the harm that can be done through the algorithmic serving of material that, through accretion, will cause harm to children with particular mental health vulnerabilities. For example, beauty or fitness-related content could be psychologically dangerous to a child recovering from an eating disorder. Research from the Mental Health Foundation shows how damaging regular exposure to material showing conventionally perfect images of bodies, often digitally edited and unattainable, is to children and young people.

This is something that matters to children, with 84% of those questioned in a recent survey by charity The Mix saying the algorithmic serving of content was a key issue that the Bill should address. Yet in its current form it does not give children full control over the content they see. Charities also tell us about the need to ensure that children are exposed to useful content. We suggest that the Government consider a requirement for providers to push material on social media literacy to users and to provide the option to receive content that can help with recovery where it is available, curated by social media companies with the assistance of trusted non-governmental organisations and public health bodies. We also hope that the Government can clarify that material damaging to people’s body image will be considered a form of harm.

Additionally, beyond the issue of the content itself that is served to children, organisations including YoungMinds and the Royal College of Psychiatrists have raised the potential dangers to mental health inherent in the way services can be designed to be addictive.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

My hon Friend raises an important point about media literacy, which we have touched on a few times during this debate. We have another opportunity here to talk about that and to say how important it is to think about media literacy within the scope of the Bill. It has been removed, and I think we need to put it back into the Bill at every opportunity—I am talking about media literacy obligations for platforms to help to responsibly educate children and adults about the risks online. We need to not lose sight of that.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree with my hon. Friend. She is right to talk about the lack of a social and digital media strategy within the Bill, and the need to educate children and adults about the harmful content that we see online. How to stay safe online in all its capacities is absolutely fundamental to the Bill. We cannot have an Online Safety Bill without teaching people how to be safe online. That is important for how children and young people interact online. We know that they chase likes and the self-esteem buzz they get from notifications popping up on their phone or device. That can be addictive, as has been highlighted by mental health and young persons’ charities.

I urge the Minister to address those issues and to consider how the Government can go further, whether through this legislation or further initiatives, to help to combat some of those issues.

--- Later in debate ---
With a third of internet users unaware of the potential for inaccurate or biased information online, it is vital that this amendment on health-related misinformation and disinformation is inserted into the Bill during Committee stage. It would give Parliament the time to scrutinise what content is in scope and ensure that regulation is in place to promote proportionate and effective responses. We must make it incumbent on platforms to be proactive in reducing that pernicious form of disinformation, designed only to hurt and to harm. As we have seen from the pandemic, the consequences can be grave if the false information is believed, as, sadly, it so often is.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Again, Labour supports moves to ensure that there is some clarity about specific content that is deemed to be harmful to adults, but of course the Opposition have concerns about the overall aim of defining harm.

The Government’s chosen approach to regulating the online space has left too much up to secondary legislation. We are also concerned that health misinformation and disinformation—a key harm, as we have all learned from the coronavirus pandemic—is missing from the Bill. That is why we too support amendment 83. The impact of health misinformation and disinformation is very real. Estimates suggest that the number of social media accounts posting misinformation about vaccines, and the number of users following those accounts, increased during the pandemic. Research by the Centre for Countering Digital Hate, published in November 2020, suggested that the number of followers of the largest anti-vaccination social media accounts had increased by 25% since 2019. At the height of the pandemic, it was also estimated that there were 5.4 million UK-based followers of anti-vaccine Twitter accounts.

Interestingly, an Ofcom survey of around 200 respondents carried out between 12 and 14 March 2021 found that 28% of respondents had come across information about covid-19 that could be considered false or misleading. Of those who had encountered such information, respondents from minority ethnic backgrounds were twice as likely as white respondents to say that the claims made them think twice about the issue. The survey found that, of those people who were getting news and information about the coronavirus within the preceding week, 15% had come across claims that the coronavirus vaccines would alter human DNA; 18% had encountered claims that the coronavirus vaccines were a cover for the implant of trackable microchips; and 10% had encountered claims that the vaccines contained animal products.

Public health authorities, the UK Government, social media companies and other organisations all attempted to address the spread of vaccine misinformation through various strategies, including moderation of vaccine misinformation on social media platforms, ensuring the public had access to accurate and reliable information and providing education and guidance to people on how to address misinformation when they came across it.

Although studies do not show strong links between susceptibility to misinformation and ethnicity in the UK, some practitioners and other groups have raised concerns about the spread and impact of covid-19 vaccine misinformation among certain minority ethnic groups. Those concerns stem from research that shows historically lower levels of vaccine confidence and uptake among those groups. Some recent evidence from the UK’s vaccine roll-out suggests that that trend has continued for the covid-19 vaccine.

Data from the OpenSAFELY platform, which includes data from 40% of GP practices in England, covering more than 24 million patients, found that up to 7 April 2021, 96% of white people aged over 60 had received a vaccination compared with only 77% of people from a Pakistani background, 76% from a Chinese background and 69% of black people within the same age group. A 2021 survey of more than 172,000 adults in England on attitudes to the vaccine also found that confidence in covid-19 vaccines was highest in those of white ethnicity, with some 92.6% saying that they had accepted or would accept the vaccine. The lowest confidence was found in those of black ethnicity, at 72.5%. Some of the initiatives to tackle vaccine misinformation and encourage vaccine take-up were aimed at specific minority ethnic groups, and experts have emphasised the importance of ensuring that factual information about covid-19 vaccines is available in multiple different languages.

Social media companies have taken various steps to tackle misinformation on their platforms during the covid-19 pandemic, including removing or demoting misinformation, directing users to information from official sources and banning certain adverts. So, they can do it when they want to—they just need to be compelled to do it by a Bill. However, we need to go further. Some of the broad approaches to content moderation that digital platforms have taken to address misinformation during the pandemic are discussed in the Parliamentary Office of Science and Technology’s previous rapid response on covid-19 and misinformation.

More recently, some social media companies have taken specific action to counter vaccine misinformation. In February 2021, as part of its wider policies on coronavirus misinformation, Facebook announced that it would expand its efforts to remove false information about covid-19 vaccines, and other vaccines more broadly. The company said it would label posts that discuss covid-19 vaccines with additional information from the World Health Organisation. It also said it would signpost its users to information on where and when they could get vaccinated. Facebook is now applying similar measures to Instagram.

In March 2021, Twitter began applying labels to tweets that could contain misinformation about covid-19 vaccines. It also introduced a strike policy, under which users that violate its covid-19 misinformation policy five or more times would have their account permanently suspended.

YouTube announced a specific ban on covid-19 anti-vaccination videos in October 2020. It committed to removing any videos that contradict official information about the vaccine from the World Health Organisation. In March, the company said it had removed more than 30,000 misleading videos about the covid-19 vaccine since the ban was introduced. However, as with most issues, until the legislation changes, service providers will not feel truly compelled to do the right thing, which is why we must legislate and push forward with amendment 83.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

I would like to speak to the clause rather than the amendment, Sir Roger. Is now the right time to do so, or are we only allowed to speak to the amendment?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes. There is an obligation on the Secretary of State to consult—[Interruption.] Did I hear someone laugh?—before proposing a statutory instrument to add things. There is a consultation first and then, if extra things are going to be added—in my hon. Friend’s language, if the scope is increased—that would be votable by Parliament because it is an affirmative SI. So the answer is yes to both questions. Yes there will be consultation in advance, and yes, if this Government or a future Government wanted to add anything, Parliament could vote on it if it wanted to because it will be an affirmative SI. That is a really important point.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Will the Minister give way?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

In a moment; I want to answer the other point made by my hon. Friend the Member for Don Valley first. He said that two wrongs don’t make a right. I am not defending the fact that social media firms act in a manner that is arbitrary and censorious at the moment. I am not saying that it is okay for them to carry on. The point that I was making was a different one. I was saying that they act censoriously and arbitrarily at times at the moment. The Bill will diminish their ability to do that in a couple of ways. First, for the legal but harmful stuff, which he is worried about, they will have a duty to act consistently. If they do not, Ofcom will be able to enforce against them. So their liberty to behave arbitrarily, for this category of content at least, will be circumscribed. They will now have to be consistent. For other content that is outside the scope of this clause—which I guess therefore does not worry my hon. Friend—they can still be arbitrary, but for this they have got to be consistent.

There is also the duty to have regard to freedom of expression, and there is a protection of democratic and journalistic importance in clauses 15 and 16. Although those clauses are not perfect and some people say they should be stronger, they are at least better than what we have now. When I say that this is good for freedom of speech, I mean that nothing here infringes on freedom of speech, and to the extent that it moves one way or the other, it moves us somewhat in the direction of protecting free speech more than is the case at the moment, for the reasons I have set out. I will be happy to debate the issue in more detail either in this Committee or outside, if that is helpful and to avoid trying the patience of colleagues.

None Portrait The Chair
- Hansard -

Order. Before we go any further, I know it is tempting to turn around and talk to Back Benchers, but that makes life difficult for Hansard because you tend to miss the microphone. It is also rather discourteous to the Chair, so in future I ask the Minister to please address the Chair. I call the shadow Minister.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank the Minister for giving way; I think that is what he was doing as he sat down.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

indicated assent.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Just for clarity, the hon. Member for Don Valley and the Minister have said that Labour Members are seeking to curtail or limit freedom of expression and freedom of speech, but that is not the case. We fundamentally support free speech, as we always have. The Bill addresses systems and processes, and that is what it should do—the Minister, the Labour party and I are in full alignment on that. We do not think that the Bill should restrict freedom of speech. I would just like to put that on the record.

We also share the concerns expressed by the hon. Member for Don Valley about the Secretary of State’s potential powers, the limited scope and the extra scrutiny that Parliament might have to undertake on priority harms, so I hope he will support some of our later amendments.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful to the shadow Minister for confirming her support for free speech. Perhaps I could take this opportunity to apologise to you, Sir Roger, and to Hansard for turning round. I will try to behave better in future.

--- Later in debate ---
None Portrait The Chair
- Hansard -

As I have indicated already, I do not propose that we have a clause stand part debate. It has been exhaustively debated, if I may say so.

Clause 54 ordered to stand part of the Bill.

Clause 55

Regulations under sections 53 and 54

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 62, in clause 55, page 52, line 4, after “OFCOM” insert

“and other stakeholders, including organisations that campaign for the removal of harmful content online”.

This amendment requires the Secretary of State to consult other stakeholders before making regulations under clause 53 or 54.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

Clause 56 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We all know that managing harmful content, unlike illegal content, is more about implementing systems that prevent people from encountering it rather than removing it entirely. At the moment, there are no duties on the Secretary of State to consult anyone other than Ofcom ahead of making regulations under clauses 53 and 54. We have discussed at length the importance of transparency, and surely the Minister can agree that the process should be widened, as we have heard from those on the Government Back Benches.

Labour has said time and again that it should not be for the Secretary of State of the day to determine what constitutes harmful content for children or adults. Without the important consultation process outlined in amendment 62, there are genuine concerns that that could lead to a damaging precedent whereby a Secretary of State, not Parliament, has the ability to determine what information is harmful. We all know that the world is watching as we seek to work together on this important Bill, and Labour has genuine concerns that without a responsible consultation process, as outlined in amendment 62, we could inadvertently be suggesting to the world that this fairly dogmatic approach is the best way forward.

Amendment 62 would require the Secretary of State to consult other stakeholders before making regulations under clauses 53 and 54. As has been mentioned, we risk a potentially dangerous course of events if there is no statutory duty on the Secretary of State to consult others when determining the definition of harmful content. Let me draw the Minister’s attention to the overarching concerns of stakeholders across the board. Many are concerned that harmful content for adults requires the least oversight, although there are potential gaps that mean that certain content—such as animal abuse content—could completely slip through the net. The amendment is designed to ensure that sufficient consultation takes place before the Secretary of State makes important decisions in directing Ofcom.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

On that point, I agree wholeheartedly with my hon. Friend. It is important that the Secretary of State consults campaign organisations that have expertise in the relevant areas. Much as we might want the Secretary of State to be informed on every single policy issue, that is unrealistic. It is also important to acknowledge the process that we have been through with the Bill: the expertise of organisations has been vital in some of the decisions that we have had to make. My hon. Friend gave a very good example, and I am grateful to animal welfare groups for their expertise in highlighting the issue of online abuse of animals.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree with my hon. Friend. As parliamentarians we are seen as experts in an array of fields. I do not purport to be an expert in all things, as it is more a jack of all trades role, and it would be impossible for one Secretary of State to be an expert in everything from animal abuse to online scam ads, from fraud to CSAM and terrorism. That is why it is fundamental that the Secretary of State consults with experts and stakeholders in those fields, for whom these things are their bread and butter—their day job every day. I hope the Minister can see that regulation of the online space is a huge task to take on for us all. It is Labour’s view that any Secretary of State would benefit from the input of experts in specific fields. I urge him to support the amendment, especially given the wider concerns we have about transparency and power sharing in the Bill.

It is welcome that clause 56 will force Ofcom, as the regulator, to carry out important reviews that will assess the extent to which content is harmful to children and adults when broadly appearing on user-to-user services. As we have repeatedly said, transparency must be at the heart of our approach. While Labour does not formally oppose the clause, we have concerns about subsection (5), which states:

“The reports must be published not more than three years apart.”

The Minister knows that the Bill has been long awaited, and we need to see real, meaningful change and updates now. Will he tell us why it contains a three-year provision?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the Minister for his clarification earlier and his explanation of how the categories of primary priority content and priority content can be updated. That was helpful.

Amendment 62 is excellent, and I am more than happy to support it.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have heard my right hon. Friend’s points about a standing Joint Committee for post-legislative implementation scrutiny. On the comments about the time, I agree that the Ofcom review needs to be far enough into the future that it can be meaningful, hence the three-year time period.

On the substance of amendment 62, tabled by the shadow Minister, I can confirm that the Government are already undertaking research and working with stakeholders on identifying what the priority harms will be. That consideration includes evidence from various civil society organisations, victims’ organisations and many others who represent the interests of users online. The wider consultation beyond Ofcom that the amendment would require is happening already as a matter of practicality.

We are concerned, however, that making this a formal consultation in the legal sense, as the amendment would, would introduce some delay, because a whole sequence of things has to happen after Royal Assent. First, we have to designate the priority harms by statutory instrument, and then Ofcom has to publish its risk assessments and codes of practice. If we insert into that a formal legal consultation step, it would add at least four or even six months to the process of implementing the Act. I know that that was not the hon. Lady’s intention and that she is concerned about getting the Act implemented quickly. For that reason, the Government do not want to insert a formal legal consultation step into the process, but I am happy to confirm that we are engaging in the consultation already on an informal basis and will continue to do so. I ask respectfully that amendment 62 be withdrawn.

The purpose of clauses 55 and 56 has been touched on already, and I have nothing in particular to add.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful for the Minister’s comments on the time that these things would take. I cannot see how they could not happen concurrently with the current consultation, or why it would take an additional four to six months. Could he clarify that?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

A formal statutory consultation could happen only after the passage of the Bill, whereas the informal non-statutory consultation we can do, and are doing, now.

Question put, That the amendment be made.

--- Later in debate ---
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I have some brief comments on the clause. The Labour party very much welcomes the addition to user verification duties in the revised Bill. A range of groups, including Clean Up the Internet, have long campaigned for a verification requirement process, so this is a positive step forward.

We do, however, have some concerns about the exact principles and minimum standards for the user verification duty, which I will address when we consider new clause 8. We also have concerns about subsection (2), which states:

“The verification process may be of any kind (and in particular, it need not require documentation to be provided).”

I would be grateful if the Minister could clarify exactly what that process will look like in practice.

Lastly, as Clean Up the Internet has said, we need further clarification on whether users will be given a choice of how they verify and of the verification provider itself. We can all recognise that there are potential downsides to the companies that own the largest platforms—such as Meta, Google, Twitter and ByteDance—developing their own in-house verification processes and making them the only option for users wishing to verify on their platform. Indeed, some users may have reservations about sharing even more personal data with those companies. Users of multiple social media platforms can find it inconvenient and confusing, and could be required to go through multiple different verification processes on different platforms to achieve the same outcome of confirming their real name.

There is a risk of the largest platforms seeking to leverage their dominance of social media to capture the market for ID verification services, raising competition concerns. I would be grateful if the Minister could confirm his assessment of the potential issues around clause 57 as it stands.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I rise to welcome clause 57. It is an important part of the Bill and shows the Government acknowledging that anonymity can have a significant impact on the harms that affect victims. There is a catalogue of evidence of the harm done by those posting anonymously. Anonymity appears to encourage abusive behaviour, and there is evidence dating back to 2015 showing that anonymous accounts are more likely to share sexist comments and that online harassment victims are often not able to identify their perpetrators because of the way anonymity works online. The Government are doing an important thing here and I applaud them.

I underline that again by saying that recent research from Compassion in Politics showed that more than one in four people were put off posting on social media because of the fear of abuse, particularly from anonymous posters. Far from the status quo promoting freedom of speech, it actually deters freedom of speech, as we have said in other debates, and it particularly affects women. The Government are to be applauded for this measure.

In the work I was doing with the FA and the Premier League around this very issue, I particularly supported their call for a twin-track approach to verified accounts that said that they should be the default and that people should automatically be able to opt out of receiving posts from unverified accounts. The Bill does not go as far as that, and I can understand the Government’s reasons, but I gently point out that 81% of the people who took part in the Compassion in Politics research would willingly provide identification to get a verified account if it reduced unverified posts. They felt that was important. Some 72% supported the idea if it reduced the amount of anonymous posting.

I am touching on clause 58, but I will not repeat myself when we debate that clause. I hope that it will be possible in the code of practice for Ofcom to point out the clear benefits of having verified accounts by default and perhaps urge responsible providers to do the responsible thing and allow their users to automatically filter out unverified accounts. That is what users want, and it is extraordinary that large consumer organisations do not seem to want to give consumers what they want. Perhaps Ofcom can help those organisations understand what their consumers want, certainly in Britain.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we have said previously, it is absolutely right that Ofcom produces guidance for providers of category 1 services to assist with their compliance with the duty. We very much welcome the inclusion and awareness of identity verification forms for vulnerable adult users in subsection (2); once again, however, we feel that that should go further, as outlined in new clause 8.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.

Question put and agreed to.

Clause 58 accordingly ordered to stand part of the Bill.

Clause 59

Requirement to report CSEA content to the NCA

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to consider the following:

Clause 67 stand part.

That schedule 9 be the Ninth schedule to the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour welcomes the important changes that have been made to the Bill since its original draft, which applied only to user-generated pornographic content. The Bill now includes all pornography, and that is a positive step forward. It is also welcome that the provisions do not apply only to commercial pornography. We all know that some of the biggest commercial pornography sites could have switched their business models had these important changes not been made. As we have reiterated, our priority in regulating pornographic content is to keep children safe. The question that we should continue to ask each other is simple: “Is this content likely to harm children?”

We have a few concerns—which were also outlined in evidence by Professor Clare McGlynn—about the definition of “provider pornographic content” in clause 66(3). It is defined as

“pornographic content that is published or displayed on the service by the provider of the service or by a person acting on behalf of the provider (including pornographic content published or displayed…by means of software or an automated tool or algorithm”.

That definition separates provider porn from content that is uploaded or shared by users, which is outlined in clause 49(2). That separation is emphasised in clause 66(6), which states:

“Pornographic content that is user-generated content in relation to an internet service is not to be regarded as provider pornographic content in relation to that service.”

However, as Professor McGlynn emphasised, it is unclear exactly what will be covered by the words

“acting on behalf of the provider”.

I would appreciate some clarity from the Minister on that point. Could he give some clear examples?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my hon. Friend for his intervention and for his work on the Joint Committee, which has had a huge impact, as we have seen. I hope that colleagues will join me in thanking the members of the Joint Committee for their work.

My final point on this important clause is in response to a question that the shadow Minister raised about clause 66(3), which makes reference to

“a person acting on behalf of the provider”.

That is just to ensure that the clause is comprehensively drafted without any loopholes. If the provider used an agent or engaged some third party to disseminate content on their behalf, rather than doing so directly, that would be covered too. We just wanted to ensure that there was absolutely no loophole—no chink of light—in the way that the clause was drafted. That is why that reference is there.

I am delighted that these clauses seem to command such widespread support. It therefore gives me great pleasure to commend them to the Committee.

Question put and agreed to.

Clause 66 accordingly ordered to stand part of the Bill.

Clause 67 ordered to stand part of the Bill.

Schedule 9 agreed to.

Clause 68

Duties about regulated provider pornographic content

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 114, in clause 68, page 60, line 13, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 115, in clause 68, page 60, line 17, after “(2)” insert “to (2D)”.

Clause stand part.

New clause 2—Duties regarding user-generated pornographic content: regulated services

“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.

(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.

(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.

(4) For the meaning of ‘pornographic content’, see section 66(2).

(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.

(6) For the meaning of ‘regulated service’, see section 2(4).”

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Clause 68 outlines the duties covering regulated provider pornographic content, and Ofcom’s guidance on those duties. Put simply, the amendments are about age verification and consent, to protect women and children who are victims of commercial sexual exploitation.

I am moving a series of targeted amendments, tabled by my right hon. Friend the Member for Kingston upon Hull North (Dame Diana Johnson), which I hope that all hon. Members will be able to support because this is an issue that goes beyond party lines. This is about children who have been sexually abused, women who have been raped, and trafficking victims who have been exploited, who have all suffered the horror of filmed footage of their abuse being published on some of the world’s biggest pornography websites. This is about basic humanity.

Currently, leading pornography websites allow members of the public to upload pornographic videos without verifying that everyone in the film is an adult, that they gave their permission for it to be uploaded to a pornography website, or even that they know the film exists. It is sadly not surprising that, because of the absence of even the most basic safety measures, hugely popular and profitable pornography websites have been found hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse. This atrocious practice is ongoing and well documented.

In 2019, PayPal stopped processing payments for Pornhub—one of the most popular pornography websites in the world—after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. That included an account on the site dedicated to posting so-called creepshots of UK schoolgirls. In 2020, The New York Times documented the presence of child abuse videos on Pornhub, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site.

New York Times reporter Nicholas Kristof wrote of Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

That particular pornography website is now subject to multiple lawsuits launched against its parent company, MindGeek, by victims whose abuse was published on the site. Plaintiffs include victims of image-based sexual abuse in the UK, such as Crystal Palace footballer Leigh Nicol. Her phone was hacked, and private content was uploaded to Pornhub without her knowledge. She bravely and generously shared her experience in an interview for Sky Sports News, saying:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do… The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

I agree. It is grotesque that pornography website operators do not even bother to verify that everyone featured in films on their sites is an adult or even gave permission for the film to be uploaded. That cannot be allowed to continue.

These amendments, which I hope will receive the cross-party backing that they strongly deserve, would stop pornography websites publishing and profiting from videos of rape and child sexual abuse by requiring them to implement the most basic of prevention measures.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I support the hon. Member’s amendments. The cases that she mentions hammer home the need for women and girls to be mentioned in the Bill. I do not understand how the Government can justify not doing so when she is absolutely laying out the case for doing so.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I agree with the hon. Member and welcome her intervention. We will be discussing these issues time and again during our proceedings. What is becoming even more apparent is the need to include women and girls in the Bill, call out violence against women and girls online for what it is, and demand that the Government go further to protect women and girls. This is yet another example of where action needs to happen. I hope the Minister is hearing our pleas and that this will happen at some point as we make progress through the Bill.

More needs to be done to tackle this problem. Pornography websites need to verify that every individual in pornographic videos published on their site is an adult and gave their permission for the video to be published, and enable individuals to withdraw their consent for pornography of them to remain on the site. These are rock-bottom safety measures for preventing the most appalling abuses on pornography websites.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I add my voice to the arguments made by my hon. Friend and the hon. Member for Aberdeen North. Violence against women and girls is a fundamental issue that the Bill needs to address. We keep coming back to that, and I too hope that the Minister hears that point. My hon. Friend has described some of the most horrific harms. Surely, this is one area where we have to be really clear. If we are to achieve anything with the Bill, this is an area that we should be working on.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I wholeheartedly agree with my hon. Friend. As I have said, the amendments would put in place rock-bottom safety measures that could prevent the most appalling abuses on pornography websites, and it is a scandal that, hitherto, they have not been implemented. We have the opportunity to change that today by voting for the amendments and ensuring that these measures are in place. I urge the Minister and Conservative Members to do the right thing.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I thank the hon. Lady for giving way. I can understand the intent behind what she is saying and I have a huge amount of sympathy for it, but we know as a matter of fact that many of the images that are lodged on these sorts of websites were never intended to be pornographic in the first place. They may be intimate images taken by individuals of themselves—or, indeed, of somebody else—that are then posted as pornographic images. I am slightly concerned that an image such as that may not be caught by the hon. Lady’s amendments. Would she join me in urging the Government to bring forward the Law Commission’s recommendations on the taking, making and sharing of intimate images online without consent, which are far broader? They would probably do what she wants to do but not run into the problem of whether an image was meant to be pornographic in the first place.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the right hon. Member for her intervention. She knows that I have the utmost respect for all that she has tried to achieve in this area in the House along with my right hon. Friend the Member for Kingston upon Hull North.

We feel that these amendments would capture the specific issue of imagery or video content for which consent has not been obtained. Many of these people do not even know that the content has been taken in the first place, and it is then uploaded to these websites. It would be the website’s duty to verify that consent had been obtained and that the people in the video were of the age of consent. That is why we urge hon. Members to back the amendments.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister has laid out compellingly how awful the displaying of images of children on pornography websites and the displaying of images where the consent of the person has not been obtained are. Let me take each of those in turn, because my answers will be a bit different in the two cases.

First, all material that contains the sexual abuse of children or features children at all—any pornographic content featuring children is, by definition, sexual abuse—is already criminalised through the criminal law. Measures such as the Protection of Children Act 1978, the Criminal Justice Act 1988 and the Coroners and Justice Act 2009 provide a range of criminal offences that include the taking, making, circulating, possessing with a view to distributing, or otherwise possessing indecent photos or prohibited images of children. As we would expect, everything that the hon. Lady described is already criminalised under existing law.

This part of the Bill—part 5—covers publishers and not the user-to-user stuff we talked about previously. Because they are producing and publishing the material themselves, publishers of such material are covered by the existing criminal law. What they are doing is already illegal. If they are engaged in that activity, they should—and, I hope, will—be prosecuted for doing it.

The new clause and the amendments essentially seek to duplicate what is already set out very clearly in criminal law. While their intentions are completely correct, I do not think it is helpful to have duplicative provisions that essentially try to do the same thing in a different piece of legislation. We have well-established and effective criminal laws in these areas.

In relation to the separate question of people whose images are displayed without their consent, which is a topic that my right hon. Friend the Member for Basingstoke has raised a few times, there are existing criminal offences that are designed to tackle that, including the recent revenge pornography offences in particular, as well as the criminalisation of voyeurism, harassment, blackmail and coercive or controlling behaviour. There is then the additional question of intimate image abuse, where intimate images are produced or obtained without the consent of the subject, and are then disseminated.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the Minister’s comments and commitment to look at this further, and the Law Commission’s review being taken forward. With that in mind, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 68 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned—(Steve Double.)

Online Safety Bill (Eleventh sitting)

Alex Davies-Jones Excerpts
Committee stage
Thursday 16th June 2022

(1 year, 10 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 16 June 2022
None Portrait The Chair
- Hansard -

We start with amendment 127 to clause 69. It is up to the Committee, but I am minded to allow this debate to go slightly broader and take the stand part debate with it.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

I beg to move amendment 127, in clause 69, page 60, line 26, after “must” insert—

“within six months of this Act being passed”.

As ever, it is a pleasure to serve under your chairship, Sir Roger. The thoughts and prayers of us all are with my hon. Friend the Member for Batley and Spen and all her friends and family.

Labour welcomes the clause, which sets out Ofcom’s duties to provide guidance to providers of internet services. It is apparent, however, that we cannot afford to kick the can down the road and delay implementation of the Bill any further than necessary. With that in mind, I urge the Minister to support the amendment, which would give Ofcom an appropriate amount of time to produce this important guidance.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a pleasure, once again, to serve under your august chairmanship, Sir Roger. I associate the Government with the remarks that you and the shadow Minister made, marking the anniversary of Jo Cox’s appalling murder, which shook the entire House when it happened. She will never be forgotten.

The Government are sympathetic to the intent of the amendment, which seeks to ensure that guidance for providers on protecting children from online pornography is put in place as quickly as possible. We of course sympathise with that objective, but we feel that the Secretary of State must retain the power to determine when to bring in the provisions of part 5, including the requirement under the clause for Ofcom to produce guidance, to ensure that implementation of the framework comprehensively and effectively regulates all forms of pornography online. That is the intention of the whole House and of this Committee.

Ofcom needs appropriate time and flexibility to get the guidance exactly right. We do not want to rush it and consequently see loopholes, which pornography providers or others might seek to exploit. As discussed, we will be taking a phased approach to bringing duties under the Bill into effect. We expect the most serious harms to be prioritised as quickly as possible, and we expect the duties on illegal content to be focused on most urgently. We have already accelerated the timescales for the most serious harms by putting priority illegal content in the various schedules to the Bill.

Ofcom is working hard to prepare for implementation. We are all looking forward to the implementation road map, which it has committed to produce before the summer. For those reasons, I respectfully resist the amendment.

Question put, That the amendment be made.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 78 and 79 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We welcome clause 77, which is an important clause that seeks to amend Ofcom’s existing general duties in the Communications Act 2003. Given the prevalence of illegal harms online, as we discussed earlier in proceedings, it is essential that the Communications Act is amended to reflect the important role that Ofcom will have as a new regulator.

As the Minister knows, and as we will discuss shortly when we reach amendments to clause 80, we have significant concerns about the Government’s approach to size versus harm when categorising service providers. Clause 77(4) amends section 3 of the Communications Act by inserting new subsection (4A). New paragraph (4A)(d) outlines measures that are proportionate to

“the size or capacity of the provider”,

and to

“the level of risk of harm presented by the service in question, and the severity of the potential harm”.

We know that harm, and the potential to access harmful content, is what is most important in the Bill—it says it in the name—so I am keen for my thoughts on the entire categorisation process to be known early on, although I will continue to press this issue with the Minister when we debate the appropriate clause.

Labour also supports clause 78. It is vital that Ofcom has a duty to publish its proposals on strategic priorities within a set time period, and ensuring that that statement is published is a positive step towards transparency, which has been sorely missing for far too long.

Similarly, Labour supports clause 79, which contains a duty to carry out impact assessments. That is vital, and it must be conveyed in the all-important Communications Act.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As the shadow Minister has set out, these clauses ensure that Ofcom’s duties under the Communications Act 2003 are updated to reflect the new duties that we are asking it to undertake—I think that is fairly clear from the clauses. On the shadow Minister’s comment about size and risk, I note her views and look forward to debating that more fully in a moment.

Question put and agreed to.

Clause 77 accordingly ordered to stand part of the Bill.

Clauses 78 and 79 ordered to stand part of the Bill.

Clause 80

Meaning of threshold conditions etc

Question proposed, That the clause stand part of the Bill.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Thank you for your efforts in chairing our meeting today, Sir Roger. My thoughts are with the hon. Member for Batley and Spen and her entire family on the anniversary of Jo Cox’s murder; the SNP would like to echo that sentiment.

I want to talk about my amendment, and I start with a quote from the Minister on Second Reading:

“A number of Members…have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered.”—[Official Report, 19 April 2022; Vol. 712, c. 133.]

I appreciate that the Minister may still be thinking about that. He might accept all of our amendments; that is entirely possible, although I am not sure there is any precedent. The possibility is there that that might happen.

Given how strong I felt that the Minister was on the issue on Second Reading, I am deeply disappointed that there are no Government amendments to this section of the Bill. I am disappointed because of the massive risk of harm caused by some very small platforms—it is not a massive number—where extreme behaviour and radicalisation is allowed to thrive. It is not just about the harm to those individuals who spend time on those platforms and who are radicalised, presented with misinformation and encouraged to go down rabbit holes and become more and more extreme in their views. It is also about the risk of harm to other people as a result of the behaviour inspired in those individuals. We are talking about Jo Cox today; she is in our memories and thoughts. Those small platforms are the ones that are most likely to encourage individuals towards extremely violent acts.

If the Bill is to fulfil its stated aims and take the action we all want to see to prevent the creation of those most heinous, awful crimes, it needs to be much stronger on small, very high-risk platforms. I will make no apologies for that. I do not care if those platforms have small amounts of profits. They are encouraging and allowing the worst behaviours to thrive on their platforms. They should be held to a higher level of accountability. It is not too much to ask to class them as category 1 platforms. It is not too much to ask them to comply with a higher level of risk assessment requirements and a higher level of oversight from Ofcom. It is not too much to ask because of the massive risk of harm they pose and the massive actual harm that they create.

Those platforms should be punished for that. It is one thing to punish and criminalise the behaviour of users on those platforms—individual users who create and propagate illegal content or radicalise other users—but the Bill does not go far enough in holding those platforms to account for allowing that to take place. They know that it is happening. Those platforms are set up as an alternative place—a place where people are allowed to be far more radical than they are on Twitter, YouTube, Twitch or Discord. None of those larger platforms have much moderation, but the smaller platforms encourage such behaviour. Links are put on other sites pointing to those platforms. For example, when people read vaccine misinformation, there are links posted to more radical, smaller platforms. I exclude Discord because, given its number of users, I think it would be included in one of the larger-platform categories anyway. It is not that there is not radical behaviour on Discord—there is—but I think the size of its membership excludes it, in my head certainly, from the category of the very smallest platforms that pose the highest risk.

We all know from our inboxes the number of people who contact us saying that 5G is the Government trying to take over their brains, or that the entire world is run by Jewish lizard people. We get those emails on a regular basis, and those theories are propagated on the smallest platforms. Fair enough—some people may not take any action as a result of the radicalisation they have experienced and the very extreme views they have developed. But some people will take action, and that action may be enough to harm their friends or family; it may be enough to exclude them and drag them away from the society or community of which they were previously members; or it might, in really tragic cases, be far more extreme. It might lead people intentionally to cause physical or mental harm to others as a result of the beliefs that have been created and fostered in them on those platforms.

That is why we have tabled the amendments. This is the one area that the Government have most significantly failed in writing this Bill, by not ensuring that the small, very high-risk platforms are held to the highest level of accountability and are punished for allowing these behaviours to thrive on their platforms. I give the Minister fair warning that unless he chooses to accept the amendments, I intend to push them to a vote. I would appreciate it if he gave assurances, but I do not believe that any reassurance that he could give would compare to having such a measure in the Bill. As I say, for me the lack of this provision is the biggest failing of the entire Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I echo the comments of the hon. Member for Aberdeen North. I completely agree with everything she has just said and I support the amendments that she has tabled.

The Minister knows my feelings on the Government’s approach to the categorisation of services; he has heard my concerns time and time again. However, it is not just me who believes that the Government have got their approach really wrong; it is also stakeholders far and wide. In our evidence sessions, we heard from HOPE not hate and the Antisemitism Policy Trust specifically on this issue. In its current form, the categorisation process is based on size versus harm, which is a fundamentally flawed approach.

The Government’s response to the Joint Committee that scrutinised the draft Bill makes it clear that they consider that reach is a key and proportional consideration when assigning categories and that they believe that the Secretary of State’s powers to amend those categories are sufficient to protect people. Unfortunately, that leaves many alternative platforms out of category 1, even if they host large volumes of harmful material.

The duty of care approach that essentially governs the Bill is predicated on risk assessment. If size allows platforms to dodge the entry criteria for managing high risk, there is a massive hole in the regime. Some platforms have already been mentioned, including BitChute, Gab and 4chan, which host extreme racist, misogynistic, homophobic and other extreme content that radicalises people and incites harm. And the Minister knows that.

I take this opportunity to pay tribute to my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), who has campaigned heavily on the issue since the horrendous and tragic shooting in Keyham in his constituency. One of my big concerns about the lack of focus on violence against women and girls in the Bill, which we have mentioned time and time again, is the potential for the rise of incel culture online, which is very heavily reported on these alternative platforms—these high-harm, high-risk platforms.

I will just give one example. A teacher contacted me about the Bill. She talked about the rise of misogyny and trying to educate her class on what was happening. At the end of the class, a 15-year-old boy—I appreciate that he is under 18 and is a child, so would come under a different category within the Bill, but I will still give the example—came up to her and said: “Miss, I need to chat to you. This is something I’m really concerned about. All I did was google, ‘Why can’t I get a girlfriend?’” He had been led down a rabbit hole into a warren of alternative platforms that tried to radicalise him with the most extreme content of incel culture: women are evil; women are the ones who are wrong; it is women he should hate; it is his birthright to have a girlfriend, and he should have one; and he should hate women. That is the type of content on those platforms that young, impressionable minds are being pointed towards. They are being radicalised, and it is sadly leading to incredibly tragic circumstances, so I really want to push the Minister on the subject.

We share the overarching view of many others that this crucial risk needs to be factored into the classification process that determines which companies are placed in category 1. Otherwise, the Bill risks failing to protect adults from substantial amounts of material that causes physical and psychological harm. Schedule 10 needs to be amended to reflect that.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate the shadow Minister’s bringing that issue up. Would she agree that, given we have constraints on broadcast and newspaper reporting on suicide for these very reasons, there can be no argument against including such a measure in the Bill?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree. Those safeguards are in place for that very reason. It seems a major omission that they are not also included in the Online Safety Bill if we are truly to save lives.

The Bill’s own pre-legislative scrutiny Committee recommended that the legislation should

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model.”

The Government replied that they

“want the Bill to be targeted and proportionate for businesses and Ofcom and do not wish to impose disproportionate burdens on small companies.”

It is, though, entirely appropriate to place a major regulatory burden on small companies that facilitate the glorification of suicide and the sharing of dangerous methods through their forums. It is behaviour that is extraordinarily damaging to public health and makes no meaningful economic or social contribution.

Amendment 82 is vital to our overarching aim of having an assessed risk of harm at the heart of the Bill. The categorisation system is not fit for purpose and will fail to capture so many of the extremely harmful services that many of us have already spoken about.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I want to remind Committee members of what my hon. Friend is talking about. I refer to the oral evidence we heard from Danny Stone, from the Antisemitism Policy Trust, on these small, high-harm platforms. He laid out examples drawn from the work of the Community Security Trust, which released a report called “Hate Fuel”. The report looked at

“various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads…with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement.”

A week or so before the evidence sitting,

“he targeted and killed 10 people in Buffalo. One of the things that he posted was:

‘Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/’—

which is a thread on the small 4chan platform—

‘then my motivation returns’.”

Danny Stone told us that the kind of material we are seeing, which is legal but harmful, is inspiring people to go out and create real-world harm. When my hon. Friend the Member for Pontypridd asked him how to amend this approach, he said:

“You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 128, Q203-204.]

I do hope that, as my hon. Friend urges, the Minister will look at all these options, because this is a very serious matter.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree with my hon. Friend. The evidence we heard from Danny Stone from the Antisemitism Policy Trust clearly outlined the real-world harm that legal but harmful content causes. Such content may be legal, but it causes mass casualties and harm in the real world.

There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.

Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.

We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?

We all know that the online space is susceptible to speed, with new technologies and ways of functioning popping up all over, and very often. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into so that harmful and dangerous content does not slip through the net.

Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established

“as soon as reasonably practicable”,

could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?

Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the same concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can, to urge Ofcom to do all they can and to make these vital changes.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of their user size, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.

It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.

It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.

I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.

I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.

We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.

--- Later in debate ---
None Portrait The Chair
- Hansard -

The hon. Lady is correct. I am advised that, actually, the ruling has changed, so it can be. We will see—well, I won’t, but the hon. Lady will see what the Minister does on report.

Schedule 10 agreed to.  

Clauses 81 and 82 ordered to stand part of the Bill.  

Clause 83

OFCOM’s register of risks, and risk profiles, of Part 3 services

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 34, in clause 83, page 72, line 12, at end insert—

“(d) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires the Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.

Labour welcomes clause 83, which places a duty on Ofcom to carry out risk assessments to identify and assess a range of potential risks of harm presented by part 3 services. However, we are concerned about subsection (9), which says:

“OFCOM must from time to time review and revise the risk assessments and risk profiles so as to keep them up to date”

That seems a fairly woolly concept even for the Minister to try to defend, so I would be grateful if he clarified exactly what demands will be placed on Ofcom to review those risk assessments and risk profiles. He will know that those are absolutely central to the Bill, so some clarification is required here. Despite that, Labour agrees that it will be a significant advantage for Ofcom to oversee the risk of harm presented by the regulated services.

However, harm should not be limited to those in the UK. Amendment 34 would therefore require Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content. I have already spoken on this issue, in the debate on amendment 25 to clause 8, so I will keep my comments brief. As the Minister knows, online harms are global in nature, and amendment 34 seeks to ensure that the risk of harm presented by regulated services is not just limited to those in the UK. As we have mentioned previously, research shows us that there is some very damaging, often sexually violent, content being streamed abroad. Labour fears that the current provisions in the legislation will not be far-reaching enough to capture the true extent of the risk of harm that people may face online.

Labour supports the intentions of clause 84, which outlines that Ofcom must produce guidance to assist providers in complying with their duties to carry out illegal content risk assessments

“As soon as reasonably practicable”.

Of course, the Minister will not be surprised that Labour has slight reservations about the timing around those important duties, so I would appreciate an update from the Minister on the conversations he has had with Ofcom about the practicalities of its duties.

None Portrait The Chair
- Hansard -

I did not indicate at the start of the debate that I would take the clause stand part and clause 84 stand part together, but I am perfectly relaxed about it and very happy to do so, as the hon. Lady has spoken to them. If any other colleague wishes to speak to them, that is fine by me.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Perhaps I might start with amendment 34, which the shadow Minister just spoke to. We agree that it is very important to consider the risks posed to victims who are outside of the territory of the United Kingdom. However, for the reasons I will elaborate on, we believe that the Bill as drafted achieves that objective already.

First, just to remind the Committee, the Bill already requires companies to put in place proportionate systems and processes to prevent UK users from encountering illegal content. Critically, that includes where a UK user creates illegal content via an in-scope platform, but where the victim is overseas. Let me go further and remind the Committee that clause 9 requires platforms to prevent UK users from encountering illegal content no matter where that content is produced or published. The word “encounter” is very broadly defined in clause 189 as meaning

“read, view, hear or otherwise experience content”.

As such, it will cover a user’s contact with any content that they themselves generate or upload to a service.

Critically, there is another clause, which we have discussed previously, that is very important in the context of overseas victims, which the shadow Minister quite rightly raises. The Committee will recall that subsection (9) of clause 52, which is the important clause that defines illegal content, makes it clear that that content does not have to be generated, uploaded or accessed in the UK, or indeed to have anything to do with the UK, in order to count as illegal content towards which the company has duties, including risk assessment duties. Even if the illegal act—for example, sexually abusing a child—happens in some other country, not the UK, it still counts as illegal content under the definitions in the Bill because of clause 52(9). It is very important that those duties will apply to that circumstance. To be completely clear, if an offender in the UK uses an in-scope platform to produce content where the victim is overseas, or to share abuse produced overseas with other UK users, the platform must tackle that, both through its risk assessment duties and its other duties.

As such, the entirely proper intent behind amendment 34 is already covered by the Bill as drafted. The shadow Minister, the hon. Member for Pontypridd, has already referred to the underlying purpose of clauses 83 and 84. As we discussed before, the risk assessments are central to the duties in the Bill. It is essential that Ofcom has a proper picture of the risks that will inform its various regulatory activities, which is why these clauses are so important. Clause 84 requires Ofcom to produce guidance to services to make sure they are carrying out those risk assessments properly, because it is no good having a token risk assessment or one that does not properly deal with the risks. The guidance published under clause 84 will ensure that happens. As such, I will respectfully resist amendment 34, on the grounds that its contents are already covered by the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful for the Minister’s clarification. Given his assurances that its contents are already covered by the Bill, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 83 ordered to stand part of the Bill.

Clause 84 ordered to stand part of the Bill.

Clause 85

Power to require information

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clauses 86 to 91 stand part.

Schedule 11 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour supports clause 85, which gives Ofcom the power to require the provision of any information it requires in order to discharge its online safety functions. We strongly believe that, in the interests of transparency, Ofcom as the regulator must have sufficient power to require a service provider to share its risk assessment in order to understand how that service provider is identifying risks. As the Minister knows, we feel that that transparency should go further, and that the risk assessments should be made public. However, we have already had that argument during a previous debate, so I will not repeat those arguments—on this occasion, at least.

Labour also supports clause 86, and we particularly welcome the clarification that Ofcom may require the provision of information in any form. If we are to truly give Ofcom the power to regulate and, where necessary, investigate service providers, we must ensure that it has sufficient legislative tools to rely on.

The Bill gives some strong powers to Ofcom. We support the requirement in clause 87 to name a senior manager, but again, we feel those provisions should go further. Both users and Ofcom must have access to the full range of tools they need to hold the tech giants to account. As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator, and even then, those measures might not come in until two years after the Bill is in place. Surely the top bosses at social media companies should be held criminally liable for systemic and repeated failures to ensure online safety as soon as the Bill comes into force, so can the Minister explain the reasons for the delay?

The Minister will be happy to hear that Labour supports clause 88. It is important to have an outline on the face of the Bill of the circumstances in which Ofcom can require a report from a skilled person. It is also important that Ofcom has the power to appoint, or give notice to a provider requiring them to appoint, a skilled person, as Labour fears that without those provisions in subsections (3) and (4), the ambiguity around defining a so-called skilled person could be detrimental. We therefore support the clause, and have not sought to amend it at this stage.

Again, Labour supports all the intentions of clause 89 in the interests of online safety more widely. Of course, Ofcom must have the power to force a company to co-operate with an investigation.

Again, we support the need for clause 90, which gives Ofcom the power to require an individual to attend an interview. That is particularly important in the instances outlined in subsection (1), whereby Ofcom is carrying out an investigation into the failure or possible failure of a provider of a regulated service to comply with a relevant requirement. Labour has repeatedly called for such personal responsibility, so we are pleased that the Government are ensuring that the Bill includes sufficient powers for Ofcom to allow proper scrutiny.

Labour supports clause 91 and schedule 11, which outlines in detail Ofcom’s powers of entry, inspection and audit. I did not think we would support this much, but clearly we do. We want to work with the Government to get this right, and we see ensuring Ofcom has those important authorisation powers as central to it establishing itself as a viable regulator of the online space, both now and for generations to come. We will support and have not sought to amend the clauses or schedule 11 for the reasons set out.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 93 to 96 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister will be pleased to hear that we, again, support these clauses. We absolutely support the Bill’s aims to ensure that information offences and penalties are strong enough to dissuade non-compliance. However, as we have said repeatedly, we feel that the current provisions are lacking.

As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator. I am grateful that the Minister has confirmed that the measures will come into force with immediate effect following Royal Assent, rather than waiting two years. That is welcome news. The Government should require that top bosses at social media companies be criminally liable for systemic and repeated failures on online safety, and I am grateful for the Minister’s confirmation on that point.

As these harms are allowed to perpetuate, tech companies cannot continue to get away without penalty. Will the Minister confirm why the Bill does not include further penalties, in the form of criminal offences, should a case of systemic and repeated failures arise? Labour has concerns that, without stronger powers, Ofcom may not feel compelled or equipped to sanction those companies who are treading the fine line of doing just enough to satisfy the requirements outlined in the Bill as it stands.

Labour also welcomes clause 93, which sets out the criminal offences that can be committed by named senior managers in relation to their entity’s information obligations. It establishes that senior managers who are named in a response to an information notice can be held criminally liable for failing to prevent the relevant service provider from committing an information offence. Senior managers can only be prosecuted under the clause where the regulated provider has already been found liable for failing to comply with Ofcom’s information request. As I have already stated, we feel that this power needs to go further if we are truly to tackle online harm. For far too long, those at the very top have known about the harm that exists on their platforms, but they have failed to take action.

Labour supports clause 94 and we have not sought to amend it at this stage. It is vital that provisions such as those in subsection (3), which specify actions that a person may take to commit an offence of this nature, are set out in the Bill. We all want to see the Bill keep people safe online, and at the heart of doing so is demanding a more transparent approach from those in Silicon Valley. My hon. Friend the Member for Worsley and Eccles South made an excellent case for the importance of transparency earlier in the debate but, as the Minister knows, and as I have said time and again, the offences must go further than just applying to simple failures to provide information. We must consider a systemic approach to harm more widely, and that goes far beyond simple information offences.

There is no need to repeat myself. Labour supports the need for clause 95 as it stands and we support clause 96, which is in line with penalties for other information offences that already exist.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am delighted to discover that agreement with the Government’s clauses continues to provoke a tsunami of unanimity across the Committee. I sense a gathering momentum behind these clauses.

As the shadow Minister mentioned, the criminal offences here are limited to information provision and disclosure. We have debated the point before. The Government’s feeling is that extending criminal liability beyond the information provisions to the other duties would potentially go a little too far and have a chilling effect on the companies concerned.

Also, the fines that can be levied—10% of global revenue—run into billions of pounds, and there are the denial of service provisions, under which a company can essentially be disconnected from the internet in extreme cases; those provide more than adequate enforcement powers for the other duties in the Bill. The information duties are so fundamental that personal criminal liability is needed for them: without the information, we cannot really make any further assessment of whether the other duties are being met.

The shadow Minister has set out what the other clauses do: clause 92 creates offences; clause 93 introduces senior managers’ liability; clause 94 sets out the offences that can be committed in relation to audit notices issued by Ofcom; clause 95 creates offences for intentionally obstructing or delaying a person exercising Ofcom’s power; and clause 96 sets out the penalties for the information offences set out in the Bill, which of course include a term of imprisonment of up to two years. Those are significant criminal offences, which I hope will make sure that executives working for social media firms properly discharge those important duties.

Question put and agreed to.

Clause 92 accordingly ordered to stand part of the Bill.

Clauses 93 to 95 ordered to stand part of the Bill.

Clause 96

Penalties for information offences

Amendment made: 2, in clause 96, page 83, line 15, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”—(Chris Philp.)

Clause 96, as amended, ordered to stand part of the Bill.

Clause 97

Co-operation and disclosure of information: overseas regulators

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clauses 98 to 102 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Again, Labour supports the intentions of clause 97—the collegiality continues. The Bill aims to protect people across the UK, but we know that online harms often originate elsewhere. That is why it is vital that Ofcom has powers to co-operate with an overseas regulator, as outlined in subsection (1).

However, we do have concerns about subsection (2), which states:

“The power conferred by subsection (1) applies only in relation to an overseas regulator for the time being specified in regulations made by the Secretary of State.”

Can the Minister confirm exactly how that will work in practice? He knows that Labour Members have tabled important amendments to clause 123. Amendments 50 and 51, which we will consider later, aim to ensure that Ofcom has the power to co-operate and take action through the courts where necessary. The same issue applies here: Ofcom must be compelled and have the tools available at its disposal to work internationally where required.

Labour supports clause 98, which amends section 393 of the Communications Act 2003 to include new provisions. That is obviously a vital step, and we particularly welcome subsection (2), which outlines that, subject to the specific exceptions in section 393 of the 2003 Act, Ofcom cannot disclose information with respect to a business that it has obtained by exercising its powers under this Bill without the consent of the business in question. This is once again an important step in encouraging transparency across the board.

We support clause 99, which places a duty on Ofcom to consult the relevant intelligence service before Ofcom discloses or publishes any information that it has received from that intelligence service. For reasons of national security, it is vital that the relevant intelligence service is included in Ofcom’s reasoning and approach to the Bill more widely.

We broadly support the intentions of clause 100. It is vital that Ofcom is encouraged to provide information to the Secretary of State of the day, but I would be grateful if the Minister could confirm exactly how the power will function in reality. Provision of information to assist in the formulation of policy covers, as we know, a very broad spectrum under the Communications Act. We want to make sure the powers are not abused—I know that is a concern shared on his own Back Benches—so I would be grateful for the Minister’s honest assessment of the situation.

We welcome clause 101, which amends section 26 of the Communications Act and provides for publication of information and advice for various persons, such as consumers. Labour supports the clause as it stands. We also welcome clause 102, which, importantly, sets out the circumstances in which a statement given to Ofcom can be used in evidence against that person. Again, this is an important clause in ensuring that Ofcom has the powers it needs to truly act as a world-leading regulator, which we all want it to be. Labour supports it and has chosen not to table any amendments.

Online Safety Bill (Twelfth sitting)

Alex Davies-Jones Excerpts
Committee stage
Thursday 16th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 16 June 2022 - (16 Jun 2022)
None Portrait The Chair
- Hansard -

With this it will be convenient to consider clauses 105 and 106 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Under this chapter, Ofcom will have the power to direct companies to use accredited technology to identify child sexual exploitation and abuse content, whether communicated publicly or privately by means of a service, and to remove that content quickly. Colleagues will be aware that the Internet Watch Foundation is one group that assists companies in doing that by providing them with “hashes” of previously identified child sexual abuse material in order to prevent the upload of such material to their platforms. That helps stop the images of victims being recirculated again and again. Tech companies can then notify law enforcement of the details of who has uploaded the content, and an investigation can be conducted and offenders sharing the content held to account.

Those technologies are extremely accurate and, thanks to the quality of the Internet Watch Foundation’s datasets, ensure that companies are detecting only imagery that is illegal. There are a number of types of technology that Ofcom could consider accrediting, including image hashing. A hash is a unique string of letters and numbers that can be applied to an image and matched every time a user attempts to upload a known illegal image to a platform.

PhotoDNA is another such technology, created in 2009 in a collaboration between Microsoft and Professor Hany Farid of the University of California, Berkeley. PhotoDNA is a vital tool in the detection of CSEA online. It enables law enforcement, charities, non-governmental organisations and the internet industry to find copies of an image even when it has been digitally altered. It is one of the most important technical developments in online child protection. It is extremely accurate, with a failure rate of one in 50 billion to 100 billion. That gives companies a high degree of certainty that what they are removing is illegal, and a firm basis for law enforcement to pursue offenders.
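To make the hash-matching step concrete, the sketch below shows in outline how an upload check against a list of known hashes might work. It is illustrative only: the hash list, the digest value and the function names are hypothetical, and a production system such as PhotoDNA compares proprietary perceptual hashes within a similarity threshold rather than performing the exact cryptographic match shown here.

```python
# Illustrative sketch only: checking an uploaded image against a list of
# hashes of previously identified illegal material. The hash list and the
# digest shown are placeholders, not a real IWF or PhotoDNA interface.
import hashlib

KNOWN_ILLEGAL_HASHES = {
    # SHA-256 digests supplied by a hash-list provider (placeholder value).
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def sha256_of(image_bytes: bytes) -> str:
    """Return the SHA-256 digest of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known hash and should be blocked.

    A perceptual system such as PhotoDNA would instead compute a robust
    hash and compare it against the list within a similarity threshold,
    so that resized or re-encoded copies are still caught.
    """
    return sha256_of(image_bytes) in KNOWN_ILLEGAL_HASHES


if __name__ == "__main__":
    sample = b"example image bytes"
    if check_upload(sample):
        print("Upload blocked; match logged for reporting to law enforcement.")
    else:
        print("No match against the hash list; upload proceeds.")
```

The design point is that a platform never needs to hold the illegal imagery itself, only the hashes supplied by a body such as the Internet Watch Foundation, and a match can be blocked and reported automatically at the moment of upload.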

Lastly, there is webpage blocking. Most of the imagery that the Internet Watch Foundation removes from the internet is hosted outside the UK. While that material is awaiting removal, the foundation can disable public access to an image or webpage by adding it to its webpage blocking list, which search providers can use to de-index known webpages containing CSAM. I therefore ask the Minister, as we continue to explore this chapter, to confirm exactly how such technologies can be utilised once the Bill receives Royal Assent.
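The blocking list works on the same principle as hash matching but at the level of URLs: a search or access provider checks each result against the list and withholds anything that matches. The following is a rough sketch under the assumption of a simple set of blocked host-and-path pairs; the real list, its distribution and its matching rules are not public and are only stood in for here.

```python
# Rough illustration of de-indexing search results against a webpage
# blocking list. The blocklist entries and normalisation rules here are
# assumptions for the purpose of the example.
from urllib.parse import urlsplit

BLOCKED_PAGES = {
    ("example-hosting.invalid", "/abuse/page1"),
}


def normalise(url: str) -> tuple[str, str]:
    """Reduce a URL to a (host, path) pair so trivial variations still match."""
    parts = urlsplit(url.lower())
    return parts.hostname or "", parts.path.rstrip("/")


def filter_search_results(urls: list[str]) -> list[str]:
    """Drop any result whose (host, path) appears on the blocking list."""
    return [u for u in urls if normalise(u) not in BLOCKED_PAGES]


if __name__ == "__main__":
    results = [
        "https://example-hosting.invalid/abuse/page1/",
        "https://example.org/perfectly-ordinary-page",
    ]
    # Only the ordinary page survives the filter.
    print(filter_search_results(results))
```

In practice such a list changes frequently as pages are taken down, so a provider would refresh it on a short cycle rather than hard-coding entries as in this sketch.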

Labour welcomes clause 105, which confirms, in subsection (2), that where a service provider is already using technology on a voluntary basis but it is ineffective, Ofcom can still intervene and require a service provider to use a more effective technology, or the same technology in a more effective way. It is vital that Ofcom is given the power and opportunity to intervene in the strongest possible sense to ensure that safety online is kept at the forefront.

However, we do require some clarification, particularly on subsections (9) and (10), which explain that Ofcom will only be able to require the use of tools that meet the minimum standards for accuracy for detecting terrorism and/or CSEA content, as set out by the Secretary of State. Although minimum standards are of course a good thing, can the Minister clarify the exact role that the Secretary of State will have in imposing these minimum standards? How will this work in practice?

Once again, Labour does not oppose clause 106 and we have not sought to amend it at this stage. It is vital that Ofcom has the power to revoke a notice under clause 103(1) if there are reasonable grounds to believe that the provider is not complying with it. Only with these powers can we be assured that service providers will be compelled to take seriously their responsibilities and statutory duties as outlined in the Bill.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.

The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.

Queries have been raised on this matter by external organisations—I am particularly thinking about the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning and ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into action.

During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.

The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.

If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.

I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?

I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.

--- Later in debate ---
Matters relevant to a decision to give a notice under section 103(1)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 35, in clause 104, page 88, line 39, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 36, in clause 104, page 88, line 43, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 37, in clause 104, page 89, line 13, at end insert—

“(k) risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.

Amendment 39, in clause 116, page 98, line 37, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 40, in clause 116, page 98, line 39, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 38, in clause 116, page 99, line 12, at end insert—

“(j) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom to consider risks to adults and children through the production, publication and dissemination of illegal content before imposing a proactive technology requirement.

Government amendment 6.

Clause stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We welcome clause 104, but have tabled some important amendments that the Minister should closely consider. More broadly, the move away from requiring child sexual exploitation and abuse content to be prevalent and persistent before enforcement action can be taken is a positive one. It is welcome that Ofcom will have the opportunity to consider a range of factors.

Despite this, Labour—alongside the International Justice Mission—is still concerned about the inclusion of prevalence as a factor, owing to the difficulty in detecting newly produced CSEA content, especially livestreamed abuse. Amendments 35, 36, 39 and 40 seek to address that gap. Broadly, the amendments aim to capture the concern about the Bill’s current approach, which we feel limits its focus to the risk of harm faced by individuals in the UK. Rather, as we have discussed previously, the Bill should recognise the harm that UK nationals cause to people around the world, including children in the Philippines. The amendments specifically require Ofcom to consider the presence of relevant content, rather than its prevalence.

Amendment 37 would require Ofcom’s risk assessments to consider risks to adults and children through the production, publication and dissemination of illegal content—an issue that Labour has repeatedly raised. I believe we last mentioned it when we spoke to amendments to clause 8, so I will do my best to not repeat myself. That being said, we firmly believe it is important that video content, including livestreaming, is captured by the Bill. I remain unconvinced that the Bill as it stands goes far enough, so I urge the Minister to closely consider and support these amendments. The arguments that we and so many stakeholders have already made still stand.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing the duty on companies to remove every single piece of illegal content that has ever appeared online, because that is requesting the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in that way that makes them realistically achievable.

As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.

We mutually understand the intention behind these amendments, but we think the significant powers in section 103 to compel companies to adopt certain technology should be engaged only where there is a reasonable level of risk. For example, if a single piece of content was present on a platform, it may not be reasonable or proportionate to force the company to adopt new technologies that it does not currently use. The use of “prevalence” ensures that the powers are used where necessary.

It is clear—there is no debate—that in the circumstances where scanning technology is currently used, which includes on Facebook Messenger, there is enormous prevalence of material. To elaborate on a point I made in a previous discussion, anything that stops that detection happening would be unacceptable and, in the Government’s view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning against child sexual exploitation images.

However, we think adopting the amendment and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or where there was no significant risk, because a single example would be enough to trigger the amendment, as drafted. Although I understand the spirit of the amendment, it moves away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.

Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding to use section 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider

“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.

Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.

Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already proposes to require Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is clearly already in the Bill.

I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the Minister’s comments, but if we truly want the Bill to be world-leading, as the Government and the Minister insist it will be, and if it is truly to keep children safe, surely one image of child sexual exploitation and abuse on a platform is one too many. We do not need to consider prevalence over presence. I do not buy that argument. I believe we need to do all we can to make this Bill as strong as possible. I believe the amendments would do that.

Question put, That the amendment be made.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 108 and 109 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour welcomes clause 107, which requires Ofcom to issue guidance setting out the circumstances in which it could require a service provider in scope of the power to use technology to identify CSEA and/or terrorism content. It is undeniably important that Ofcom will have the discretion to decide on the exact content of the guidance, which it must keep under review and publish.

We also welcome the fact that Ofcom must have regard to its guidance when exercising these powers. Of course, it is also important that the Information Commissioner is included and consulted in the process. Ofcom has a duty to continually review its guidance, which is fundamental to the Bill’s success.

We also welcome clause 108. Indeed, the reporting of Ofcom is an area that my hon. Friend the Member for Batley and Spen will touch on when we come to new clause 25. It is right that Ofcom will have a statutory duty to lay an annual report in this place, but we feel it should ultimately go further. That is a conversation for another day, however, so we broadly welcome clause 108 and have not sought to amend it directly at this stage.

Clause 109 ensures that the definitions of “terrorism content” and “child sexual exploitation and abuse content” used in chapter 5 are the same as those used in part 3. Labour supports the clause and we have not sought to amend it.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 points out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers the interpretations.

The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which Ofcom committed in evidence to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. To avoid giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.

It will be fair to say that the Committee’s feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.

Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.

Question put and agreed to.

Clause 107 accordingly ordered to stand part of the Bill.

Clauses 108 and 109 ordered to stand part of the Bill.

Clause 110

Provisional notice of contravention

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.

The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline have been provided to those involved. This process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations. The process must be fair for all involved, and that is why we welcome the provisions outlined in the clause.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.

Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, where an organisation that contravenes can then be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes and will be fined, and fined heavily.

I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.

That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.

At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?

I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.

There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. It also finds revenue and funding a little hand to mouth.

Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.

This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.

--- Later in debate ---
Requirements enforceable by OFCOM against providers of regulated services
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 53, in clause 111, page 94, line 24, at end insert—

“Section 136(7C)

Code of practice on access to data”



This amendment is linked to Amendment 52.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 52, in clause 136, page 118, line 6, at end insert—

“(7A) Following the publication of the report, OFCOM must produce a code of practice on access to data setting out measures with which regulated services are required to comply.

“(7B) The code of practice must set out steps regulated services are required to take to facilitate access to data by persons carrying out independent research.

(7C) Regulated services must comply with any measures in the code of practice.”

This amendment would require Ofcom to produce a code of practice on access to data.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour welcomes this important clause, which lists the enforceable requirements. Failure to comply with those requirements can trigger enforcement action. However, the provisions could go further, so we urge the Minister to consider our important amendments.

Amendments 52 and 53 make it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment. We cannot rely solely on Ofcom to act as problems arise, when new issues could be spotted early by experts elsewhere. The entire regime depends on how bright a light we can shine into the black box of the tech companies, but only minimal data can be accessed.

The amendments would require Ofcom simply to produce a code of practice on access to data. We have already heard that without independent researchers accessing data on relevant harm, the platforms have no real accountability for how they tackle online harms. Civil society and researchers work hard to identify online harms from limited data sources, which can be taken away by the platforms if they choose. Labour feels that the Bill must require platforms, in a timely manner, to share data with pre-vetted independent researchers and academics. The EU’s Digital Services Act does that, so will the Minister confirm why such a provision is missing from this supposed world-leading Bill?

Clause 136 gives Ofcom two years to assess whether access to data is required, and it “may”, but not “must”, publish guidance on how its approach to data access might work. The process is far too slow and, ultimately, puts the UK behind the EU, whose legislation makes data access requests possible immediately. Amendment 52 would change the “may” to “must”, and would ultimately require Ofcom to explore how access to data works, not if it should happen in the first place.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Frances Haugen’s evidence highlighted quite how shadowy a significant number of the platforms are. Does the hon. Member agree that that hammers home the need for independent researchers to access as much detail as possible so that we can ensure that the Bill is working?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I agree 100%. The testimony of Frances Haugen, the Facebook whistleblower, highlighted the fact that expert researchers and academics will need to examine the data and look at what is happening behind social media platforms if we are to ensure that the Bill is truly fit for purpose and world leading. That process should be carried out as quickly as possible, and Ofcom must also be encouraged to publish guidance on how access to data will work.

Ultimately, the amendments make a simple point: civil society and researchers should be able to access data, so why will the Minister not let them? The Bill should empower independently verified researchers and civil society to request tech companies’ data. Ofcom should be required to publish guidance as soon as possible —within months, not years—on how data may be accessed. That safety check would hold companies to account and make the internet a safer and less divisive space for everyone.

The process would not be hard or commercially ruinous, as the platforms claim. The EU has already implemented it through its Digital Services Act, which opens up the secrets of tech companies’ data to Governments, academia and civil society in order to protect internet users. If we do not have that data, researchers based in the EU will be ahead of those in the UK. Without more insight to enable policymaking, quality research and harm analysis, regulatory intervention in the UK will stagnate. What is more, without such data, we will not know Instagram’s true impact on teen mental health, nor the reality of violence against women and girls online or the risks to our national security.

We propose amending the Bill to accelerate data sharing provisions while mandating Ofcom to produce guidance on how civil society and researchers can access data, not just on whether they should. As I said, that should happen within months, not years. The provisions should be followed by a code of practice, as outlined in the amendment, to ensure that platforms do not duck and dive in their adherence to transparency requirements. A code of practice would help to standardise data sharing in a way that serves platforms and researchers.

The changes would mean that tech companies can no longer hide in the shadows. As Frances Haugen said of the platforms in her evidence a few weeks ago:

“The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 188, Q320.]

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.

Researchers’ access to information is covered in clause 136, which the amendments seek to amend. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. When just talking about it casually, it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and what confidentiality obligations may apply to them.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, I would agree that bona fide academic independent researchers do have something to offer and to add in this area. The more we have highly intelligent, experienced and creative people looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. It is an area that does not exist already—at the moment, there is no concept of independent researchers getting access to the innards of social media companies’ data vaults—so we need to make sure that it is done in the right way, which is why it is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. I hope we end up in the same place—well, the same place, but a better place. The process may be slightly slower, but we may also end up in a better place for the consideration and thought that will have to be given.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I appreciate where the Minister is coming from. It seems that he wants to back the amendment, so I am struggling to see why he will not, especially given that the DSA—the EU’s new legislation—is already doing this. We know that the current wording in the Bill is far too woolly. If providers can get away with it, they will, which is why we need to compel them, so that we are able to access this data. We need to put that on the face of the Bill. I wish that we did not have to do so, but we all wish that we did not have to have this legislation in the first place. Unless we put it in the Bill, however, the social media platforms will carry on regardless, and the internet will not be a safe place for children and adults in the UK. That is why I will push amendment 53 to a vote.

Question put, That the amendment be made.

Division 35

Ayes: 3


Labour: 2
Scottish National Party: 1

Noes: 5


Conservative: 5

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 56, in clause 111, page 94, line 24, at end insert—

“Section [Supply chain risk assessment duties]

Supply chain risk assessments”



This amendment is linked to NC11.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss new clause 11—Supply chain risk assessment duties—

“(1) This section sets out duties to assess risks arising in a provider’s supply chain, which apply to all Part 3 services.

(2) A duty to carry out a suitable and sufficient assessment of the risk of harm arising to persons employed by contractors of the provider, where the role of such persons is to moderate content on the service.

(3) A duty to keep the risk assessment up to date.

(4) Where any change is proposed to any contract for the moderation of content on the service, a duty to carry out a further suitable and sufficient risk assessment.

(5) In this section, the ‘risk of harm’ includes any risks arising from—

(a) exposure to harmful content; and

(b) a lack of training, counselling or support.”

This new clause introduces a duty to assess the risk of harm in the supply chain.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We know that human content moderation is the foundation of all content moderation for major platforms. It is the most important resource for making platforms safe. Relying on AI alone is an ineffective and risky way to moderate content, so platforms have to rely on humans to make judgment calls about context and nuance. I pay tribute to all human moderators for keeping us all safe by having to look at some of the most horrendous and graphic content.

The content moderation reviews carried out by humans, often at impossible speeds, are used to classify content to train algorithms that are then used to automatically moderate exponentially more content. Human moderators can be, and often are, exploited by human resource processes that do not disclose the trauma inherent in the work or properly support them in their dangerous tasks. There is little oversight of this work, as it is done largely through a network of contracted companies that do not disclose their expectations for staff or the support and training provided to them. The contractors are “off book” from the platforms and operate at arm’s length from the services they are supporting, and they are hidden by a chain of unaccountable companies. This creates a hazardous supply chain for the safety processes that platforms claim will protect users in the UK and around the world.

Not all online abuse in the UK happens in English, and women of many cultures and backgrounds in the UK are subject to horrific abuse that is not in the English language. The amendment would make all victim groups in the UK much safer.

To make the internet safer it is imperative to better support human content moderators and regulate the supply chain for their work. It is an obvious but overlooked point that content moderators are users of a platform, but they are also the most vulnerable group of users, as they are the frontline of defence in sifting out harmful content. Their sole job is to watch gruesome, traumatising and harmful content so that we do not have to. The Bill has a duty to protect the most vulnerable users, but it cannot do so if their existence is not even acknowledged.

Many reports in the media have described the lack of clarity about, and the exploitative nature of, the hiring process. Just yesterday, I had the immense privilege of meeting Daniel Motaung, the Facebook whistleblower from Kenya who has described the graphic and horrendous content that he was required to watch to keep us all safe, including live beheadings and children being sexually exploited. Members of the Committee cannot even imagine what that man has had to endure, and I commend him for his bravery in speaking out and standing up for his rights. He has also been extremely exploited by Facebook and the third party company by which he was employed. He was paid the equivalent of $2 an hour for doing that work, whereas human moderators in the US were paid roughly $18 an hour—again, nowhere near enough for what they had to endure.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Thank you. Clause 111 sets out and defines the “enforceable requirements” in this chapter—the duties that Ofcom is able to enforce against. Those are set out clearly in the table at subsection (2) and the requirements listed in subsection (3).

The amendment speaks to a different topic. It seeks to impose or police standards for people employed as subcontractors of the various companies that are in scope of the Bill, for example people that Facebook contracts; the shadow Minister, the hon. Member for Pontypridd, gave the example of the gentleman from Kenya she met yesterday. I understand the point she makes and I accept that there are people in those supply chains who are not well treated, who suffer PTSD and who have to do extraordinarily difficult tasks. I do not dispute at all the problems she has referenced. However, the Government do not feel that the Bill is the right place to address those issues, for a couple of reasons.

First, in relation to people who are employed in the UK, we have existing UK employment and health and safety laws. We do not want to duplicate or cut across those. I realise that they relate only to people employed in the UK, but if we passed the amendment as drafted, it would apply to people in the UK as much as it would apply to people in Kenya.

Secondly, the amendment would effectively require Ofcom to start paying regard to employment conditions in Kenya, among other places—indeed, potentially any country in the world—and it is fair to say that that sits substantially outside Ofcom’s area of expertise as a telecoms and communications regulator. That is the second reason why the amendment is problematic.

The third reason is more one of principle. The purpose of the Bill is to keep users safe online. While I understand the reasonable premise for the amendment, it seeks essentially to regulate working conditions in potentially any country in the world. I am just not sure that it is appropriate for an online safety Bill to seek to regulate global working conditions. Facebook, a US company, was referenced, but only 10% of its activity—very roughly speaking—is in the UK. The shadow Minister gave the example of Kenyan subcontractors. Compelling though her case was, I am not sure it is appropriate that UK legislation on online safety should seek to regulate the Kenyan subcontractor of a United States company.

The Government of Kenya can set their own employment regulations and President Biden’s Government can impose obligations on American companies. For us, via a UK online safety Bill, to seek to regulate working conditions in Kenya goes a long way beyond the bounds of what we are trying to do, particularly when we take into account that Ofcom is a telecommunications and communications regulator. To expect it to regulate working conditions anywhere in the world is asking quite a lot.

I accept that a real issue is being raised. There is definitely a problem, and the shadow Minister and the hon. Member for Aberdeen North are right to raise it, but for the three principal reasons that I set out, I suggest that the Bill is not the place to address these important issues.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister mentions workers in the UK. I am a proud member of the Labour party and a proud trade unionist; we have strong protections for workers in the UK. There is a reason why Facebook and some of these other platforms, which are incredibly exploitative, will not have human moderators in the UK looking at this content: because they know they would be compelled to treat them a hell of a lot better than they do the workers around the world that they are exploiting, as they do in Kenya, Dublin and the US.

To me, the amendment speaks to the heart of the Bill. This is an online safety Bill that aims to keep the most vulnerable users safe online. People around the world are looking at content that is created here in the UK and having to moderate it; we are effectively shipping our trash to other countries and other people to deal with it. That is not acceptable. We have the opportunity here to keep everybody safe from looking at this incredibly harmful content. We have a duty to protect those who are looking at content created in the UK in order to keep us safe. We cannot let those people down. The amendment and new clause 11 give us the opportunity to do that. We want to make the Bill world leading. We want the UK to stand up for those people. I urge the Minister to do the right thing and back the amendment.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 113 to 117 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We support clause 112, which gives Ofcom the power to issue a confirmation decision if, having followed the required process—for example, in clause 110—its final decision is that a regulated service has breached an enforceable requirement. As we know, this will set out Ofcom’s final decision and explain whether Ofcom requires the recipient of the notice to take any specific steps and/or pay a financial penalty. Labour believes that this level of scrutiny and accountability is vital to an Online Safety Bill that is truly fit for purpose, and we support clause 112 in its entirety.

We also support the principles of clause 113, which outlines the steps that a person may be required to take either to come into compliance or to remedy the breach that has been committed. Subsection (5) in particular is vital, as it outlines how Ofcom can require immediate action when the breach has involved an information duty. We hope this will be a positive step forward in ensuring true accountability of big tech companies, so we are happy to support the clause unamended.

It is right and proper that Ofcom has powers when a regulated provider has failed to carry out an illegal content or children’s risk assessment properly or at all, and when it has identified a risk of serious harm that the regulated provider is not effectively mitigating or managing. As we have repeatedly heard, risk assessments are the very backbone of the Bill, so it is right and proper that Ofcom is able to force a company to take measures to comply in the event of previously failing to act.

Children’s access assessments, which are covered by clause 115, are a crucial component of the Bill. Where Ofcom finds that a regulated provider has failed to properly carry out an assessment, it is vital that it has the power and legislative standing to force the company to do more. We also appreciate the inclusion of a three-month timeframe, which would ensure that, in the event of a provider re-doing the assessment, it would at least be completed within a specific—and small—timeframe.

While we recognise that the use of proactive technologies may come with small issues, Labour ultimately feels that clause 116 is balanced and fair, as it establishes that Ofcom may require the use of proactive technology only on content that is communicated publicly. It is fair that content in the public domain is subject to those important safety checks. It is also right that under subsection (7), Ofcom may set a requirement forcing services to review the kind of technology being used. That is a welcome step that will ensure that platforms face a level of scrutiny that has certainly been missing so far.

Labour welcomes and is pleased to support clause 117, which allows Ofcom to impose financial penalties in its confirmation decision. That is something that Labour has long called for, as we believe that financial penalties of this nature will go some way towards improving best practice in the online space and deterring bad actors more widely.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister has set out the provisions in the clauses, and I am grateful for her support. In essence, clauses 112 to 117 set out the processes around confirmation decisions and make provisions to ensure that those are effective and can be operated in a reasonable and fair way. The clauses speak largely for themselves, so I am not sure that I have anything substantive to add.

Question put and agreed to.

Clause 112 accordingly ordered to stand part of the Bill.

Clauses 113 to 117 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Dean Russell.)

Online Safety Bill (Thirteenth sitting)

Alex Davies-Jones Excerpts
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 119 stand part.

Government amendments 154 to 157.

Clauses 120 and 121 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Bore da, Ms Rees. It is, as ever, a pleasure to serve under your chairship. I rise to speak to clauses 118 to 121 and Government amendments 154 to 157.

As we all know, clause 118 is important and allows Ofcom to impose a financial penalty on a person who fails to complete steps that have been required by Ofcom in a confirmation decision. This is absolutely vital if we are to guarantee that regulated platforms take seriously their responsibilities in keeping us all safe online. We support the use of fines. They are key to overall behavioural change, particularly in the context of personal liability. We welcome clause 118, which outlines the steps Ofcom can take in what we hope will become a powerful deterrent.

Labour also welcomes clause 119. It is vital that Ofcom has these important powers to impose a financial penalty on a person who fails to comply with a notice that requires technology to be implemented to identify and deal with content relating to terrorism and child sexual exploitation and abuse on their service. These are priority harms and the more that can be done to protect us on these two points the better.

Government amendments 155 and 157 ensure that Ofcom has the power to impose a monetary penalty on a provider of a service who fails to pay a fee that it is required to pay under new schedule 2. We see these amendments as crucial in giving Ofcom the important powers it needs to be an effective regulator, which is something we all require. We have some specific observations around new schedule 2, but I will save those until we consider that schedule. For now, we support these amendments and I look forward to outlining our thoughts shortly.

We support clause 120, which allows Ofcom to give a penalty notice to a provider of a regulated service who does not pay the fee due to Ofcom in full. This is a vital provision that also ensures that Ofcom’s process to impose a penalty can progress only when it has given due notice to the provider and once the provider has had a fair opportunity to make representations to Ofcom. This is a fair approach and is central to the Bill, which is why we have not sought to amend the clause.

Finally, we support clause 121, which ensures that Ofcom must state the reasons why it is imposing a penalty, the amount of the penalty and any aggravating or mitigating factors. Ofcom must also state when the penalty must be paid. It is imperative that when issuing a notice Ofcom is incentivised to publish information about the amount, aggravating or mitigating factors and when the penalty must be paid. We support this important clause and have not sought to amend.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship once again, Ms Rees, and I congratulate Committee members on evading this morning’s strike action.

I am delighted that the shadow Minister supports the intent behind these clauses, and I will not speak at great length given the unanimity on this topic. As she said, clause 118 allows Ofcom to impose a financial penalty for failure to take specified steps by a deadline set by Ofcom. The maximum penalty that can be imposed is the greater of £18 million or 10% of qualifying worldwide revenue. In the case of large companies, it is likely to be a much larger amount than £18 million.

Clause 119 enables Ofcom to impose financial penalties if the recipient of a section 103 notice does not comply by the deadline. It is very important to ensure that section 103 has proper teeth. Government amendments 154 to 157 make changes that allow Ofcom to recover not only the cost of running the service once the Bill comes into force and into the future but also the preparatory cost of setting up for the Bill to come into force.

As previously discussed, £88 million of funding is being provided to Ofcom in this financial year and next. We believe that something like £20 million of costs that predate these financial years have been funded as well. That adds up to around £108 million. However, the amount that Ofcom recovers will be the actual cost incurred. The figure I provided is simply an indicative estimate. The actual figure would be based on the real costs, which Ofcom would be able to recoup under these measures. That means that the taxpayer—our constituents—will not bear any of the costs, including the set-up and preparatory cost. This is an equitable and fair change to the Bill.

Clause 120 sets out that some regulated providers will be required to pay a regulatory fee to Ofcom, as set out in clause 71. Clause 120 allows Ofcom to impose a financial penalty if a regulated provider does not pay its fee by the deadline it sets. Finally, clause 121 sets out the information that needs to be included in these penalty notices issued by Ofcom.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss:

Government amendment 158.

That schedule 12 be the Twelfth schedule to the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour supports clause 122 and schedule 12, which set out in detail the financial penalties that Ofcom may impose, including the maximum penalty that can be imposed. Labour has long supported financial penalties for those failing to comply with the duties in the Bill. We firmly believe that tough action is needed on online safety, but we feel the sanctions should go further and that there should be criminal liability for offences beyond just information-related failures. We welcome clause 122 and schedule 12. It is vital that Ofcom is also required to produce guidelines around how it will determine penalty amounts. Consistency across the board is vital, so we feel this is a positive step forward and have not sought to amend the clause.

Paragraph 8 of schedule 12 requires monetary penalties to be paid into the consolidated fund. There is no change to that requirement, but it now appears in new clause 43, together with the requirement to pay fees charged under new schedule 2 into the consolidated fund. We therefore support the amendments.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have nothing further to add on these amendments. The shadow Minister has covered them, so I will not detain the Committee further.

Question put and agreed to.

Clause 122 accordingly ordered to stand part of the Bill.

Schedule 12

Penalties imposed by OFCOM under Chapter 6 of Part 7

Amendment made: 158, in schedule 12, page 206, line 43, leave out paragraph 8.—(Chris Philp.)

Paragraph 8 of Schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in NC43 together with the requirement to pay fees charged under NS2 into the Consolidated Fund.

Schedule 12, as amended, agreed to.

Clause 123

Service restriction orders

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 50, in clause 123, page 106, line 36, at end insert—

“(9A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (5).”

This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 51, in clause 125, page 110, line 20, at end insert—

“(7A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (6).”

This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

With your permission, Ms Rees, I will speak to clause stand part and clauses 124 to 127 at the same time. Labour supports clause 123, which outlines the powers that Ofcom will have when applying to the court for business disruption measures. Business disruption measures are court orders that require third parties to withdraw services or block access to non-compliant regulated services. It is right that Ofcom has these tools at its disposal, particularly if it is going to be able to regulate effectively against the most serious instances of user harm. However, the Bill will be an ineffective regime if Ofcom is forced to apply for separate court orders when trying to protect people across the board from the same harms. We have already waited too long for change. Labour is committed to giving Ofcom the powers to take action, where necessary, as quickly as possible. That is why we have tabled amendments 50 and 51, which we feel will go some way in tackling these issues.

Amendment 50 would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for—and/or appeal through the courts against any—orders to block access or support services. The Bill currently requires Ofcom to seek a separate court order for each service against which it wishes to take enforcement action in the form of blocking access or services. That is the only effective mechanism for overseas websites. UK-based services will be subject to enforcement notices and financial penalties that can be enforced without having to go to court. That creates a disadvantage for UK sites, which can be more easily enforced against.

Given that there are 4 million to 5 million pornographic websites, for example, the requirement for separate court orders will prevent Ofcom from taking action at scale and creating a level playing field for all adult sites. Under the Bill, Ofcom must take action against each offending website or social media company individually. While we acknowledge that the Government have stated that enforcement action can be taken against multiple offending content providers, in our opinion that is not made clear in the Bill.

Moreover, we are concerned that some pornography websites would seek to avoid the Bill’s requirements by changing their domain name—domain hopping. That was threatened last year when Germany moved to issue a blocking order against major providers of internet pornography. That is why Ofcom must be granted clear enforcement powers to take swift action against multiple websites and content providers in one court action or order.

This group of amendments would also provide clarity and ease of enforcement for internet service providers, which will be expected to enforce court orders. Labour wants the Bill to be genuinely effective, and amendments 50 and 51 could ensure that Ofcom has the tools available to it to take action at pace. We urge the Minister to accept these small concessions, which could have a hugely positive impact.

Amendment 51 would give Ofcom the ability to take action against a schedule of non-compliant sites, while preserving the right of those sites to oppose an application for an order to block access or support services, or to appeal through the courts against any such order.

It will come as no surprise that Labour supports clause 124, which sets out the circumstances in which Ofcom may apply to the courts for an interim service restriction order. We particularly support the need for Ofcom to be able to take action when time is not on its side, or where, put plainly, the level of harm being caused means that it would be inappropriate to wait for a definite failure before taking action.

However, we hope that caution is exercised if Ofcom ever needs to consider such an interim order; we must, of course, get the balance right in our approach to internet regulation more widely. I would therefore be grateful if the Minister could outline his understanding of the specifics of when these orders may be applied. More broadly, Labour agrees that Ofcom should be given the power to act when time demands it, so we have not sought to amend clause 124 at this stage.

Labour also supports the need for Ofcom to have the power to apply to the courts for an access restriction order, as outlined in clause 125. It is vital that Ofcom is given the power to prevent, restrict or deter individuals in the UK from accessing a service from a non-compliant provider. We welcome the specific provisions on access via internet service providers and app stores. We all know from Frances Haugen’s testimony that harmful material can often be easily buried, so it is right and proper that those are considered as “access facilities” under the clause. Ultimately, we support the intentions of clause 125 and, again, have not sought to amend it at this stage.

We also support clause 126, which sets out the circumstances in which Ofcom may apply to the courts for an interim access restriction order. I will not repeat myself: for the reasons I have already outlined, it is key that Ofcom has sufficient powers to act, particularly on occasions when it is inappropriate to wait for a failure to be established.

We welcome clause 127, which clarifies how Ofcom’s enforcement powers can interact. We particularly welcome clarification that, where Ofcom exercises its power to apply to the courts for a business disruption order under clauses 123 to 126, it is not precluded from taking action under its other enforcement powers. As we have repeatedly reiterated, we welcome Ofcom’s having sufficient power to reasonably bring about positive change and increase safety measures online. That is why we have not sought to amend clause 127.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you for chairing this morning’s sitting, Ms Rees.

I agree with the hon. Member for Pontypridd that these clauses are necessary and important, but I also agree that the amendments are important. It seems like this is a kind of tidying-up exercise, to give Ofcom the ability to act in a way that will make its operation smoother. We all want this legislation to work. This is not an attempt to break this legislation—to be fair, none of our amendments have been—but an attempt to make things work better.

Amendments 50 and 51 are fairly similar to the one that the National Society for the Prevention of Cruelty to Children proposed to clause 103. They would ensure that Ofcom could take action against a group of sites, particularly if they were facing the same kind of issues, had the same kind of functionality, or had the same kind of concerns being raised about them.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start with amendments 50 and 51, which were introduced by the shadow Minister and supported by the SNP spokesperson. The Government recognise the valid intent behind the amendments, namely to make sure that applications can be streamlined and done quickly, and that Ofcom can make bulk applications if large numbers of service providers violate the new duties to the extent that interim service restriction orders or access restriction orders become necessary.

We want a streamlined process, and we want Ofcom to deal efficiently with it, including, if necessary, by making bulk applications to the court. Thankfully, however, procedures under the existing civil procedure rules already allow so-called multi-party claims to be made. Those claims permit any number of claimants, any number of defendants or respondents and any number of claims to be covered in a single form. The overriding objective of the CPR is that cases are dealt with justly and proportionately. Under the existing civil procedure rules, Ofcom can already make bulk applications to deal with very large numbers of non-compliant websites and service providers in one go. We completely agree with the intent behind the amendments, but their content is already covered by the CPR.

It is worth saying that the business disruption measures—the access restriction orders and the service restriction orders—are intended to be a last resort. They effectively amount to unplugging the websites from the internet so that people in the United Kingdom cannot access them and so that supporting services, such as payment services, do not support them. The measures are quite drastic, although necessary and important, because we do not want companies and social media firms ignoring our legislation. It is important that we have strong measures, but they are last resorts. We would expect Ofcom to use them only when it has taken reasonable steps to enforce compliance using other means.

If a provider outside the UK ignores letters and fines, these measures are the only option available. As the shadow Minister, the hon. Member for Pontypridd, mentioned, some pornography providers probably have no intention of even attempting to comply with our regulations; they are probably not based in the UK, they are never going to pay the fine and they are probably incorporated in some obscure, offshore jurisdiction. Ofcom will need to use these powers in such circumstances, possibly on a bulk scale—I am interested in her comment that that is what the German authorities had to do—but the powers already exist in the CPR.

It is also worth saying that in its application to the courts, Ofcom must set out the information required in clauses 123(5) and 125(3), so evidence that backs up the claim can be submitted, but that does not stop Ofcom doing this on a bulk basis and hitting multiple different companies in one go. Because the matter is already covered in the CPR, I ask the shadow Minister to withdraw the amendment.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am interested to know whether the Minister has anything to add about the other clauses. I am happy to give way to him.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the shadow Minister for giving way. I do not have too much to say on the other clauses, because she has introduced them, but in my enthusiasm for explaining the civil procedure rules I neglected to respond to her question about the interim orders in clauses 124 and 126.

The hon. Lady asked what criteria have to be met for these interim orders to be made. The conditions for clause 124 are set out in subsections (3) and (4) of that clause, which states, first, that it has to be

“likely that the…service is failing to comply with an enforceable requirement”—

so it is likely that there has been a breach—and, secondly, that

“the level of risk of harm to individuals in the United Kingdom…and the nature and severity of that harm, are such that it would not be appropriate to wait to establish the failure before applying for the order.”

Similar language in clause 124(4) applies to breaches of section 103.

Essentially, if it is likely that there has been a breach, and if the resulting harm is urgent and severe—for example, if children are at risk—we would expect these interim orders to be used as emergency measures to prevent very severe harm. I hope that answers the shadow Minister’s question. She is very kind, as is the Chair, to allow such a long intervention.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the Minister’s comments about clauses 124 and 126 in answer to my questions, and also his comments about amendments 50 and 51, clarifying the CPR. If the legislation is truly to have any impact, it must fundamentally give clarity to service users, providers and regulators. That is why we seek to remove any ambiguity and to put these important measures in the Bill, and it is why I will press amendment 50 to a Division.

Question put, That the amendment be made.

--- Later in debate ---
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister and his Back Benchers will, I am sure, be tired of our calls for more transparency, but I will be kind to him and confirm that Labour welcomes the provisions in clause 128.

We believe it is vital that, once Ofcom has followed the process outlined in clause 110 and issued a confirmation decision setting out its final decision, that decision is made public. We particularly welcome provisions to ensure that when a confirmation decision is issued, Ofcom will be obliged to publish the identity of the person to whom the decision was sent, details of the failure to which the decision relates, and details relating to Ofcom’s response.

Indeed, the transparency goes further, as Ofcom will be obliged to publish details of when a penalty notice has been issued in many more areas: when a person fails to comply with a confirmation decision; when a person fails to comply with a notice to deal with terrorism content or child sexual exploitation and abuse content, or both; and when there has been a failure to pay a fee in full. That is welcome indeed. Labour just wishes that the Minister had committed to the same level of transparency on the duties in the Bill to keep us safe in the first place. That said, transparency on enforcement is a positive step forward, so we have not sought to amend the clause at this stage.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful for the shadow Minister’s support. I have nothing substantive to add, other than to point to the transparency reporting obligation in clause 64, which we have debated.

Question put and agreed to.

Clause 128 accordingly ordered to stand part of the Bill.

Clause 129

OFCOM’s guidance about enforcement action

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Member asks for my assistance in interpreting legislative language. Generally speaking, “consult” means what it suggests. Ofcom will consult the Secretary of State, as it will consult the ICO, to ascertain the Secretary of State’s opinion, but Ofcom is not bound by that opinion. Unlike the power in a previous clause—I believe it was clause 40—where the Secretary of State could issue a direct instruction to Ofcom on certain matters, here we are talking simply about consulting. When the Secretary of State expresses an opinion in response to the consultation, it is just that—an opinion. I would not expect it to be binding on Ofcom, but I would expect Ofcom to pay proper attention to the views of important stakeholders, which in this case include both the Secretary of State and the ICO. I hope that gives the hon. Member the clarification he was seeking.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we know, clause 129 requires Ofcom to publish guidance about how it will use its enforcement powers. It is right that regulated providers and other stakeholders have a full understanding of how, and in what circumstances, Ofcom will have the legislative power to exercise this suite of enforcement powers. We also welcome Government amendment 7, which will ensure that the Information Commissioner—a key and, importantly, independent authority—is included in the consultation before guidance is produced.

As we have just heard, however, the clause sets out that Secretary of State must be consulted before Ofcom produces guidance, including revised or replacement guidance, about how it will use its enforcement powers. We feel that that involves the Secretary of State far too closely in the enforcement of the regime. The Government should be several steps away from being involved, and the clause seriously undermines Ofcom’s independence—the importance of which we have been keen to stress as the Bill progresses, and on which Conservative Back Benchers have shared our view—so we cannot support the clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I repeat the point I made to the hon. Member for Liverpool, Walton a moment ago. This is simply an obligation to consult. The clause gives the Secretary of State an opportunity to offer an opinion, but it is just that—an opinion. It is not binding on Ofcom, which may take that opinion into account or not at its discretion. This provision sits alongside the requirement to consult the Information Commissioner’s Office. I respectfully disagree with the suggestion that it represents unwarranted and inappropriate interference in the operation of a regulator. Consultation between organs of state is appropriate and sensible, but in this case it does not fetter Ofcom’s ability to act at its own discretion. I respectfully do not agree with the shadow Minister’s analysis.

--- Later in debate ---
Advisory committee on disinformation and misinformation
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 57, in clause 130, page 115, line 4, leave out “18” and insert “6”

This amendment changes the period by which the advisory committee must report from 18 months to 6.

None Portrait The Chair
- Hansard -

With this, it will be convenient to discuss the following: amendment 58, in clause 130, page 115, line 5, at end insert—

“(6) Following the publication of the report, OFCOM must produce a code of practice setting out the steps services should take to reduce disinformation across their systems.”

This amendment requires Ofcom to produce a code of practice on system-level disinformation.

Clause stand part.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Clause 130 sets up a committee to advise Ofcom on misinformation and disinformation, which is the only direct reference to misinformation and disinformation in the entire Online Safety Bill. However, the Bill gives the committee no identifiable powers or active role in tackling harmful misinformation and disinformation, meaning that it has limited practical purpose. It is also unclear how the advisory committee will fit with Ofcom’s wider regulatory functions.

The remaining provisions in the Bill are limited and do not properly address harmful misinformation and disinformation. If tackling harmful misinformation and disinformation is left to this clause, the Bill will fail both to tackle harm properly, and to keep children and adults safe.

The clause risks giving a misleading impression that action is being taken. If the Government and Ofcom proceed with creating the committee, we need to see that its remit is strengthened and clarified, so that it more effectively tackles harmful disinformation and misinformation. That should include advising on Ofcom’s research, reporting on the drivers of harmful misinformation and disinformation, and advising on proportionate responses to them. There should also be a duty on Ofcom to consult the committee when drafting relevant codes of practice.

That is why we have tabled amendment 57. It would change the period by which the advisory committee must report from 18 months to six. This is a simple amendment that encourages scrutiny. Once again, the Minister surely has little reason not to accept it, especially as we have discussed at length the importance of the advisory committee having the tools that it needs to succeed.

Increasing the regularity of these reports from the advisory committee is vital, particularly given the ever-changing nature of the internet. Labour has already raised concerns about the lack of futureproofing in the Bill more widely, and we feel that the advisory committee has an important role and function to play in areas where the Bill itself is lacking. We are not alone in this view; the Minister has heard from his Back Benchers about just how important this committee is.

Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. Again, this amendment will come as no surprise to the Minister, given the concerns that Labour has repeatedly raised about the lack of provisions relating to disinformation in the Bill. It seems like an obvious omission that the Bill has failed to consider a specific code of practice around reducing disinformation, and the amendment would be a simple way to ensure that Ofcom actively encourages services to reduce disinformation across their platforms. The Minister knows that this would be a welcome step, and I urge him to consider supporting the amendment.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to briefly agree with the sentiments of the Opposition Front Bench, especially about the strength of the committee and the lack of teeth that it currently has. Given that the Government have been clear that they are very concerned about misinformation and disinformation, it seems odd that they are covered in the Bill in such a wishy-washy way.

The reduction of the time from 18 months to six months would also make sense. We would expect the initial report the committee publishes at six months not to be as full as the ones it would publish after that. I do not see any issue with it being required to produce a report as soon as possible to assess how the Act is bedding in and beginning to work, rather than having to wait until the Act is already properly working before making an assessment. We want to be able to pick up any teething problems that the Act might have.

We want the committee to be able to say, “Actually, this is not working quite as we expected. We suggest that Ofcom operates in a slightly different way or that the interaction with providers happens in a slightly different way.” I would rather that problems with the Act were tackled as early as possible. We will not know about problems with the Act, because there is no proper review mechanism. There is no agreement on the committee, for example, to look at how the Act is operating. This is one of the few parts of the Bill where we have got an agreement to a review, and it would make sense that it happen as early as possible.

We agree that misinformation and disinformation are very important matters that really need to be tackled, but there is just not enough clout in the Bill to allow Ofcom to properly tackle these issues that are causing untold harm.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The clause allows Ofcom to confer functions on the content board in relation to content-related functions under the Bill, but does not require it to do so. We take the view that how Ofcom manages its responsibilities internally is a matter for Ofcom. That may change over time. The clause simply provides that Ofcom may, if Ofcom wishes, ask its content board to consider online safety matters alongside its existing responsibilities. I trust that the Committee considers that a reasonable measure.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour welcomes the clause, which, as the Minister has said, sets out some important clarifications with respect to the Communications Act 2003. We welcome the clarification that the content board will have delegated and advisory responsibilities, and look forward to the Minister’s confirmation of exactly what those are and how this will work in practice. It is important that the content board and the advisory committee on disinformation and misinformation are compelled to communicate, too, so we look forward to an update from the Minister on what provisions in the Bill will ensure that that happens.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister has asked how this will work in practice, but as I said, the internal operation of Ofcom obviously is a matter for Ofcom. As Members have said in the recent past—indeed, in the last hour—they do not welcome undue Government interference in the operation of Ofcom, so it is right that we leave this as a matter for Ofcom. We are providing Ofcom with the power, but we are not compelling it to use that power. We are respecting Ofcom’s operational independence—a point that shadow Ministers and Opposition Members have made very recently.

Question put and agreed to.

Clause 131 accordingly ordered to stand part of the Bill.

Clause 132

Research about users’ experiences of regulated services

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 133 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We support clause 132, which ensures that Ofcom is required to understand and measure public opinion concerning providers of regulated services, as well as the experiences and interests of those using the regulated services in question. The Bill in its entirety is very much a learning curve for us all, and I am sure we all agree that, as previously maintained, the world really is watching as we seek to develop and implement the legislation. That is why it is vital that Ofcom is compelled to conduct and arrange its own research to ensure that we are getting an accurate picture of how our regulatory framework is affecting people. I stress to the Minister that it is imperative that Ofcom consults all service providers—big and small—a point that the CBI stressed to me in recent meetings.

We also welcome the provisions outlined in subsection (2) that confirm that Ofcom must include a statement of its research in its annual report to the Secretary of State and the devolved Administrations. It is important that Ofcom, as a regulator, takes a research-led approach, and Labour is pleased to see these provisions included in the Bill.

We welcome the inclusion of clause 133, which extends the communication panel’s remit to include online safety. This will mean that the panel is able to give advice on matters relating to different types of online content under the Bill, and on the impacts of online content on UK users of regulated services. It is a welcome step forward, so we have not sought to amend the clause.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I want to make one short comment about clauses 132 and 133, which are really important. There is no intention to interfere with or fetter the way that Ofcom operates, but there is an obligation on this Committee, and on Parliament, to indicate what we would expect to see from Ofcom by way of the clauses, because they are an essential part of the transparency that we are trying to inject into the sector.

Research about users’ experiences is hugely important, and such reports contain important insights into how platforms are used, and the levels of misinformation and disinformation that people are exposed to. Ofcom already produces highly authoritative reports on various aspects of the online world, including the fact that three in four adults do not think about whether the online information that they see is truthful. Indeed, one in three adults believes that all or most information that they find online is truthful. We know that there is a significant gap between consumers’ perception and reality, so it is important to ensure that research has good exposure among those using the internet.

We do not often hear about the problems of how the online world works, and the level of disinformation and inaccuracy is not well known, so will the Minister elaborate on how he expects Ofcom to ensure that people are aware of the reality of the online world? Platforms will presumably be required to have regard to the content of Ofcom reports, but will Ofcom be required to publicise its reports? It is not clear that such a duty is in the Bill at the moment, so does the Minister expect Ofcom to have a role in educating people, especially children, about the problem of inaccurate data or other aspects of the online world?

We know that a number of platforms spend a great deal of money on going into schools and talking about their products, which may or may not entail accurate information. Does Ofcom not have an important role to play in this area? Educating users about the changes in the Bill would be another potential role for Ofcom in order to recalibrate users’ expectations as to what they might reasonably expect platforms to offer as a result of the legislation. It is important that we have robust regulatory frameworks in place, and this Bill clearly does that. However, it also requires users to be aware of the changes that have been made so that they can report the problems they experience in a timely manner.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I welcome the support of the hon. Member for Pontypridd for these clauses. I will turn to the questions raised by my right hon. Friend the Member for Basingstoke. First, she asked whether Ofcom has to publish these reports so that the public, media and Parliament can see what they say. I am pleased to confirm that Ofcom does have to publish the reports; section 15 of the Communications Act 2003 imposes a duty on Ofcom to publish reports of this kind.

Secondly, my right hon. Friend asked about educating the public on issues pertinent to these reports, which is what we would call a media literacy duty. Again, I confirm that, under the Communications Act, Ofcom has a statutory duty to promote media literacy, which would include matters that flow from these reports. In fact, Ofcom published an expanded and updated set of policies in that area at the end of last year, which is why the old clause 103 in the original version of this Bill was removed—Ofcom had already gone further than that clause required.

Thirdly, my right hon. Friend asked about the changes that might happen in response to the findings of these reports. Of course, it is open to Ofcom—indeed, I think this Committee would expect it—to update its codes of practice, which it can do from time to time, in response to the findings of these reports. That is a good example of why it is important for those codes of practice to be written by Ofcom, rather than being set out in primary legislation. It means that when some new fact or circumstance arises or some new bit of research, such as the information required in this clause, comes out, those codes of practice can be changed. I hope that addresses the questions my right hon. Friend asked.

The hon. Member for Liverpool, Walton asked about transparency, referring to Frances Haugen’s testimony to the US Senate and her disclosures to The Wall Street Journal, as well as the evidence she gave this House, both to the Joint Committee and to this Committee just before the Whitsun recess. I have also met her bilaterally to discuss these issues. The hon. Gentleman is quite right to point out that these social media firms—Facebook is one example, although there are others—are extremely secretive about what they say in public, to the media and even to representative bodies such as the United States Congress. That is why, as he says, it is extremely important that they are compelled to be a lot more transparent.

The Bill contains a large number of provisions compelling or requiring social media firms to make disclosures to Ofcom as the regulator. However, it is important to have public disclosure as well. It is possible that the hon. Member for Liverpool, Walton was not in his place when we came to the clause in question, but if he turns to clause 64 on page 56, he will see that it includes a requirement for Ofcom to give every provider of a relevant service a notice compelling them to publish a transparency report. I hope he will see that the transparency obligation that he quite rightly refers to—it is necessary—is set out in clause 64(1). I hope that answers the points that Committee members have raised.

Question put and agreed to.

Clause 132 accordingly ordered to stand part of the Bill.

Clause 133 ordered to stand part of the Bill.

Clause 134

OFCOM’s statement about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we all know, the clause requires Ofcom to publish annual reports on the steps it has taken, when carrying out online safety functions, to uphold users’ rights under articles 8 and 10 of the convention, as required by section 6 of the Human Rights Act 1998. It will come as no surprise to the Minister that Labour entirely supports this clause.

Upholding users’ rights is a central part of this Bill, and it is a topic we have debated repeatedly in our proceedings. I know that the Minister faces challenges of his own, as the Opposition do, regarding the complicated balance between freedom of speech and safety online. It is only right and proper, therefore, for Ofcom to have a specific duty to publish reports about what steps it is taking to ensure that the online space is fair and equal for all.

That being said, we know that we can and should go further. My hon. Friend the Member for Batley and Spen will shortly address an important new clause tabled in her name—I believe it is new clause 25—so I will do my best not to repeat her comments, but it is important to say that Ofcom must be compelled to publish reports on how its overall regulatory operating function is working. Although Labour welcomes clause 134 and especially its commitment to upholding users’ rights, we believe that when many feel excluded in the existing online space, Ofcom can do more in its annual reporting. For now, however, we support clause 134.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I welcome the shadow Minister’s continuing support for these clauses. Clause 134 sets out the requirement on Ofcom to publish reports setting out how it has complied with articles 8 and 10 of the European convention on human rights.

I will pause for a second, because my hon. Friend the Member for Don Valley and others have raised concerns about the implications of the Bill for freedom of speech. In response to a question he asked last week, I set out in some detail the reasons why I think the Bill improves the position for free speech online compared with the very unsatisfactory status quo. This clause further strengthens that case, because it requires this report and reminds us that Ofcom must discharge its duties in a manner compatible with articles 8 and 10 of the ECHR.

From memory, article 8 enshrines the right to a family life, and article 10 enshrines the right to free speech, backed up by quite an extensive body of case law. The clause reminds us that the powers that the Bill confers on Ofcom must be exercised—indeed, can only be exercised—in conformity with the article 10 duties on free speech. I hope that that gives my hon. Friend additional assurance about the strength of free speech protection inherent in the Bill. I apologise for speaking at a little length on a short clause, but I think that was an important point to make.

Question put and agreed to.

Clause 134 accordingly ordered to stand part of the Bill.

Clause 135

OFCOM’s transparency reports

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Again, Labour welcomes clause 135, which places a duty on Ofcom to produce its own reports based on information from the transparency reports that providers are required to publish. However, the Minister will know that Labour feels the Bill has much more work to do on transparency more widely, as we have repeatedly outlined through our debates. The Minister rejected our calls for increased transparency when we were addressing, I believe, clause 61. We are not alone in feeling that transparency reports should go further. The sector and his own Back Benchers are calling for it, yet so far his Department has failed to act.

It is a welcome step that Ofcom must produce its own reports based on information from the providers’ transparency reports, but if the reports are to provide a truly accurate depiction of the situation online, they must ultimately be made public. I know the Minister has concerns around security, but of course no one wants to see users put at risk of harm unnecessarily. That is not what we are asking for here. I will refrain from repeating debates we have already had at length, but I wish to again put on the record our concerns around the transparency reporting process as it stands.

That being said, we support clause 135. It is right that Ofcom is compelled to produce its own reports; we just wish they were made public. With the transparency reports coming from the providers, we only wish they would go further.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have spoken to these points previously, so I do not want to tax the Committee’s patience by repeating what I have said.

Question put and agreed to.

Clause 135 accordingly ordered to stand part of the Bill.

Clause 136

OFCOM’s report about researchers’ access to information

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Again, Labour welcomes clause 136, which is a positive step towards a transparent approach to online safety, given that it requires Ofcom to publish a report about the access that independent researchers have, or could have, to matters relating to the online safety of regulated services. As my hon. Friend the Member for Worsley and Eccles South rightly outlined in an earlier sitting, Labour strongly believes that the transparency measures in the Bill do not go far enough.

Independent researchers already play a vital role in regulating online safety. Indeed, there are far too many to list, but many have supported me, and I am sure the Minister, in our research on the Bill. That is why we have tabled a number of amendments on this point, as we sincerely feel there is more work to be done. I know the Minister says he understands and is taking on board our comments, but thus far we have seen little movement on transparency.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

In this clause we are specifically talking about access to information for researchers. Obviously, the transparency matters were covered in clauses 64 and 135. There is consensus across both parties that access to information for bona fide academic researchers is important. The clause lays out a path to take us in the direction of providing that access by requiring Ofcom to produce a report. We debated the matter earlier. The hon. Member for Worsley and Eccles South—I hope I got the pronunciation right this time—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady made some points about the matter in an earlier sitting, as the shadow Minister just said. It is an area we are giving some careful thought to, because it is important that it is properly academically researched. Although Ofcom is being well resourced, as we have discussed, with lots of money and the ability to levy fees, we understand that it does not have a monopoly on wisdom—as good a regulator as it is. It may well be that a number of academics could add a great deal to the debate by looking at some of the material held inside social media firms. The Government recognise the importance of the matter, and some thought is being given to these questions, but at least we can agree that clause 136 as drafted sets out a path that leads us in this important direction.

Question put and agreed to.

Clause 136 accordingly ordered to stand part of the Bill.

Clause 137

OFCOM’s reports

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Briefly, before I hand over to my hon. Friend the Member for Worsley and Eccles South, I should say that Labour welcomes clause 137, which gives Ofcom a discretionary power to publish reports about certain online safety measures and matters. Clearly, it is important to give Ofcom the power to redact or exclude confidential matters where need be, and I hope that there will be a certain level of common sense and public awareness, should information of this nature be excluded. As I have previously mentioned—I sound a bit like a broken record—Labour echoes the calls for more transparency, which my hon. Friend the Member for Batley and Spen will come on to in her new clause. However, broadly, we support this important clause.

I would like to press the Minister briefly on how exactly the exclusion of material from Ofcom reports will work in practice. Can he outline any specific contexts or examples, beyond commercial sensitivity and perhaps matters of national security, where he can envision this power being used?

Online Safety Bill (Fourteenth sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 21st June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 144 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

As we know, clause 143 introduces a power for the Secretary of State to set out a statement of the Government’s strategic priorities in relation to online safety matters. Given that the power is similar to those that already exist in the Communications Act 2003, we do not formally oppose the clause. We welcome the fact that the Secretary of State must follow a consultation and parliamentary procedure before proceeding. It is vital that transparency surrounds any targets or priorities that the Secretary of State may outline. However, we want to put on record our slight concerns around the frequency limitations on amendments that are outlined in subsections (7) and (8). This amounts to direct interference in the regime, and we would appreciate the Minister’s reassurance about how it will work in practice.

We also welcome clause 144, which sets out the consultation and parliamentary procedure requirements that must be satisfied before the Secretary of State can designate a statement of strategic priorities under clause 143. We firmly believe that parliamentary oversight must be at the heart of the Bill, and the Minister’s Back Benchers agree. We have heard compelling statements from the right hon. Member for Basingstoke and other colleagues about just how important parliamentary oversight of the Bill will be, even when it has received Royal Assent. That is why clause 144 is so important: it ensures that the Secretary of State must consult Ofcom when considering the statement of strategic priorities.

Following that, the draft statement must be laid before Parliament for proper scrutiny. As we have said before, this is central to the Bill’s chances of success, but Labour firmly believes that it would be unreasonable for us to expect the Secretary of State to always be an expert across every policy area out there, because it is not possible. That is why parliamentary scrutiny and transparency are so important. It is not about the politics; it is about all of us working together to get this right. Labour will support clause 144 because, fundamentally, it is for the Secretary of State to set out strategic priorities, but we must ensure that Parliament is not blocked from its all-important role in providing scrutiny.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the shadow Minister for her broad support for these two clauses. Clause 143 provides the power, but not an obligation, for the Secretary of State to set out a strategic statement on her priorities for online safety matters. As the shadow Minister said, it is similar to powers that already exist in other areas. The clause links back to clause 78, whereby Ofcom must have regard to the strategic priorities and set out how it responds to them when they are updated. On clause 144, I am glad that the shadow Minister accepts the consultation has to happen and that the 40-day period for Parliament to consider changes to the draft statement and, if it wishes to, to object to them is also a welcome opportunity for parliamentary scrutiny.

The Government have heard the wider points about parliamentary scrutiny and the functioning of the Joint Committee, which my right hon. Friend the Member for Basingstoke mentioned previously. I have conveyed them to higher authorities than me, so that transmission has occurred. I recognise the valuable work that the Joint Committee of the Commons and Lords did in scrutinising the Bill prior to its introduction, so I am glad that these clauses are broadly welcome.

Question put and agreed to.

Clause 143 accordingly ordered to stand part of the Bill.

Clause 144 ordered to stand part of the Bill.

Clause 145

Directions about advisory committees

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour supports the clause, which enables the Secretary of State to give Ofcom a direction to establish an expert committee to advise it on a specific online safety matter. As we have said repeatedly, it is vital that expert stakeholders are included as we begin the challenging process of regulating the internet. With that in mind, we need to ensure that the committee truly is expert and that it remains independent.

The Minister knows that I have concerns about Ofcom’s ability to remain truly independent, particularly given the recent decision to appoint a Tory peer to chair the organisation. I do not want to use our time today to make pointed criticisms about that decision—much as I would like to—but it is important that the Minister addresses these concerns. Ofcom must be independent—it really is quite important for the future success of the Bill. The expert committee’s chair, and its other members, must be empowered to report freely and without influence. How can the Minister ensure that that will genuinely be the case?

Subsection (4) places a duty on an advisory committee established under such a direction to publish a report within 18 months of its being established. I want to push the Minister on the decision to choose 18 months. I have mentioned my concerns about that timeframe; it seems an awfully long time for the industry, stakeholders, civil society and, indeed, Parliament to wait. I cannot be clearer about how important a role I think that this committee will have, so I would be grateful if the Minister could clarify why he thinks it will take 18 months for such a committee to be established.

That said, we broadly support the principles of what the clause aims to do, so we have not sought to amend it at this stage.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the shadow Minister for her comments and questions. She raised two substantive points on the clause; I will address those, rather than any wider issues that may be contentious.

The first question was about whether the advisory committee would be independent, and how we can be certain that it will not be unduly interfered with by the Government. The answer lies clearly in subsection (3). Paragraphs (a) and (b) make it very clear that although the Secretary of State may direct Ofcom to establish the committee, the identity of the people on the committee is for Ofcom to determine. Subsection (3)(a) states very clearly that the chairman is “appointed by OFCOM”, and subsection (3)(b) states that members of the committee are

“appointed by OFCOM as OFCOM consider appropriate.”

It is Ofcom, not the Secretary of State, that appoints the chair and the members. I trust that that deals with the question about the independence of the members.

On the second question, about time, the 18 months is not 18 months for the committee to be established—I am looking at clause 145(4)—but 18 months for the report to be published. Subsection (4) says “within” a period of 18 months, so it does not have to be 18 months for delivery of the report; it could be less, and I am sure that in many cases it will be. I hope that answers the shadow Minister’s questions on the clause, and I agree that it should stand part of the Bill.

Question put and agreed to.

Clause 145 accordingly ordered to stand part of the Bill.

Clause 146

Directions in special circumstances

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss new clause 10—Special circumstances—

“(1) This section applies where OFCOM has reasonable grounds for believing that circumstances exist that present a threat—

(a) to the health or safety of the public, or

(b) to national security.

(2) OFCOM may, in exercising their media literacy functions, give priority for a specified period to specified objectives designed to address the threat presented by the circumstances mentioned in subsection (1).

(3) OFCOM may give a public statement notice to—

(a) a specified provider of a regulated service, or

(b) providers of regulated services generally.

(4) A ‘public statement notice’ is a notice requiring a provider of a regulated service to make a publicly available statement, by a date specified in the notice, about steps the provider is taking in response to the threat presented in the circumstances mentioned in subsection (1).

(5) OFCOM may, by a public statement notice or a subsequent notice, require a provider of a regulated service to provide OFCOM with such information as they may require for the purpose of responding to that threat.

(6) If OFCOM takes any of the steps set out in this Chapter, they must publish their reasons for doing so.

(7) In subsection (2) ‘media literacy functions’ means OFCOM’s functions under section 11 of the Communications Act (duty to promote media literacy), so far as functions under that section relate to regulated services.”

This new clause gives Ofcom the power to take particular steps where it considers that there is a threat to the health and safety of the public or to national security, without the need for a direction from the Secretary of State.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we all know, the clause as it stands enables the Secretary of State to give Ofcom directions in circumstances where it considers that there is a threat to the health or safety of the public or to national security. That includes directing Ofcom to prioritise action to respond to a specific threat when exercising its media literacy functions, and to require specified service providers, or providers of regulated services more generally, to publicly report on what steps they are taking to respond to that threat.

However, Labour shares the concerns of the Carnegie UK Trust, among others, that there is no meaningful constraint on the Secretary of State’s powers to intervene as outlined in the clause. Currently, the Secretary of State has the power to direct Ofcom where they have “reasonable grounds for believing” that there is a threat to the public’s health or safety or to national security. The UK did not need these powers before—during the cold war, for example—so we have to ask: why now?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

So far as I am aware, the phenomenon of social media companies, to which media literacy relates, did not exist during the cold war.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It did not, but there were examples of disinformation, misinformation and the spreading of falsehoods, and none of these powers existed at the time. It seems weird—if I can use that term—that these powers exist now. Surely, the more appropriate method would be for the Secretary of State to write a letter to Ofcom to which it had to have regard. As it stands, this dangerous clause ensures that the Secretary of State has the power to interfere with day-to-day enforcement. Ultimately, it significantly undermines Ofcom’s overall independence, which we truly believe should be at the heart of the Bill.

With that in mind, I will now speak to our crucial new clause 10, which instead would give Ofcom the power to take particular steps, where it considers that there is a threat to the health and safety of the public or national security, without the need for direction from the Secretary of State. Currently, there is no parliamentary scrutiny of the powers outlined in clause 146; it says only that the Secretary of State must publish their reasoning unless national security is involved. There is no urgency threshold or requirement in the clause. The Secretary of State is not required to take advice from an expert body, such as Public Health England or the National Crime Agency, in assessing reasonable grounds for action. The power is also not bounded by the Bill’s definition of harm.

These directions do two things. First, they direct Ofcom to use its quite weak media literacy duties to respond to the circumstances. Secondly, a direction turns on a power for Ofcom to ask a platform to produce a public statement about what the platform is doing to counter the circumstances or threats in the direction order—that is similar in some ways to the treatment of harm to adults. This is trying to shame a company into doing something without actually making it do it. The power allows the Secretary of State directly to target a given company. There is potential for the misuse of such an ability.

The explanatory notes say:

“the Secretary of State could issue a direction during a pandemic to require OFCOM to: give priority to ensuring that health misinformation and disinformation is effectively tackled when exercising its media literacy function; and to require service providers to report on the action they are taking to address this issue.”

Recent experience of the covid pandemic and the Russian invasion of Ukraine suggests that the Government can easily legislate when required in an emergency and can recall Parliament. The power in the Bill is a strong power, cutting through regulatory independence and targeting individual companies to produce quite a weak effect. It is not being justified as an emergency power where the need to move swiftly is paramount. Surely, if a heavier-duty action is required in a crisis, the Government can legislate for that and explain to Parliament why the power is required in the context of a crisis.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

It is really important to make sure that the Bill does not end up being a cover for the Secretary of State of the day to significantly interfere with the online space, both now and in the future. At the moment, I am not satisfied that the Secretary of State’s powers littered through the Bill are necessary. I share other hon. Members’ concerns about what this could mean for both the user experience and online safety more broadly. I hope my hon. Friend agrees that the Minister needs to provide us—not just us here today, but civil society and others who might be listening—with more reassurance that the Secretary of State’s powers really are necessary.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree with my hon. Friend. We talk time and again about this Bill being world leading, but with that comes a responsibility to show global leadership. Other countries around the world will be looking to us, and this Parliament, when they adopt their own, similar legislation, and we need to be mindful of that when looking at what powers we give to a Secretary of State—particularly powers that override the independence of Ofcom or, for that matter, the sovereignty of Parliament.

New clause 10 provides a viable alternative. The Minister knows that this is an area where even his Back Benchers are divided. He must closely consider new clause 10 and recognise that placing power in Ofcom’s hands is an important step forward. None of us wants to see a situation where the Secretary of State is able to influence the regulator. We feel that, without this important clause and concession, the Government could be supporting a rather dangerous precedent in terms of independence in regulatory systems more widely.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to talk about a specific example. Perhaps the Minister will be able to explain why the legislation is written this way around when I would have written it the opposite way around, much more in line with proposed new clause 10.

Snapchat brought in the Snap Map feature, which involved having geolocation on every individual’s phone; whenever anyone took a photo to put it on Snapchat, that geolocation was included. The feature was automatically turned on for all Snapchat users when it first came in, I think in 2017. No matter what age they were, when they posted their story on Snapchat, which is available to anyone on their friends list and sometimes wider, anyone could see where they were. If a child had taken a photo at their school and put it on Snapchat, anyone could see what school they went to. It was a major security concern for parents.

That very concerning situation genuinely could have resulted in children and other vulnerable people, who may not have even known that the feature had been turned on by default and would not know how to turn on ghost mode in Snapchat so as not to post their location, being put at risk. The situation could have been helped if media literacy duties had kicked in that meant that the regulator had to say, “This is a thing on Snapchat: geolocation is switched on. Please be aware of this if your children or people you are responsible for are using Snapchat.”

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by addressing the point that was raised by the hon. Member for Aberdeen North on Ofcom’s power to issue media literacy advice of its own volition, which is the subject of new clause 10. Under section 11 of the Communications Act 2003, Ofcom already has the power to issue media literacy guidance on issues such as Snapchat geolocation, the Strava map location functionality that I mentioned, and the other example that came up. Ofcom does not need the Secretary of State’s permission to do that, as it already has the power to do so. The power that new clause 10 would confer on Ofcom already exists.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister says that Ofcom can already use that existing power, so why does it not do so?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is obviously an operational matter for Ofcom. We would encourage it to do as much as possible. We encouraged it through our media literacy strategy, and it published an updated policy on media literacy in December last year. If Members feel that there are areas of media literacy in which Ofcom could do more, they will have a good opportunity to raise those questions when senior Ofcom officials next appear before the Digital, Culture, Media and Sport Committee or any other parliamentary Committee.

The key point is that the measures in new clause 10 are already in legislation, so the new clause is not necessary. The Secretary of State’s powers under clause 146 do not introduce a requirement for permission—they are two separate things. In addition to Ofcom’s existing powers to act of its own volition, the clause gives the Secretary of State powers to issue directions in certain very limited circumstances. A direction may be issued where there is a present threat—I stress the word “present”—to the health or safety of the public or to national security, and only in relation to media literacy. We are talking about extremely narrowly defined powers.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady is quite right to correct me. I do mean “present a threat”, as it is written in the Bill—I apologise for inadvertently transposing the words.

Is it reasonable that the Secretary of State has those very limited and specific powers? Why should they exist at all? Does this represent an unwarranted infringement of Ofcom’s freedom? I suppose those are the questions that the Opposition and others might ask. The Government say that, yes, it is reasonable and important, because in those particular areas—health and safety, and national security—there is information to which only the Government have access. In relation to national security, for example, information gathered by the UK intelligence community—GCHQ, the Secret Intelligence Service and MI5—is made available to the Government but not more widely. It is certainly not information that Ofcom would have access to. That is why the Secretary of State has the power to direct in those very limited circumstances.

I hope that, following that explanation, the Committee will see that new clause 10 is not necessary because it replicates an existing power, and that clause 146 is a reasonable provision.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the Minister’s comments, but I am not convinced by his arguments on the powers given to the Secretary of State on issues of national security or public health and safety. Parliament can be recalled and consulted, and Members of Parliament can have their say in the Chamber on such issues. It should not be up to the Secretary of State alone to direct Ofcom and challenge its independence.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the shadow Minister’s point, but recalling Parliament during a recess is extremely unusual. I am trying to remember how many times it has happened in the seven years that I have been here, and I can immediately recall only one occasion. Does she think that it would be reasonable and proportionate to recall 650 MPs in recess for the purpose of issuing a media literacy directive to Ofcom?

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I think the Minister has just made my point for me. If he does not see this happening only in extreme circumstances where a threat is presented or there is an immediate risk to public health and safety, how many times does he envisage the power being used? How many times will the Secretary of State have the power to overrule Ofcom if the power is not to be used only in those unique situations where it would be deemed appropriate for Parliament to be recalled?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

It is not overruling Ofcom; it is offering a direction to Ofcom.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Yes—having direct influence on a regulator, overruling its independence and taking the stance directly themselves. The Minister has made my point for me: if he does not envisage the power being used only in unique circumstances where Parliament would need to be recalled to have a say, it will be used a lot more often than he suggests.

With that in mind, the Opposition will withhold our support for clause 146, in order to progress with new clause 10. I place on record the Labour party’s distinct concerns with the clause, which we will seek to amend on Report.

Dan Carden Portrait Dan Carden (Liverpool, Walton) (Lab)
- Hansard - - - Excerpts

I add my voice to the concerns that have been raised about the clause, and about the powers for the Secretary of State that are littered throughout the Bill. This comes on top of the scandals around the public appointments process that we have seen under this Government—even around the role of chair of Ofcom, which they tried to hand to a former editor of the Daily Mail, Paul Dacre. Earlier this year, Lord Grade was appointed for a four-year term. He is on £140,000-odd a year. The Secretary of State is responsible for appointing the whole board of Ofcom. I really do wonder why, on top of the power that the Government hold in the appointments process, they need the Secretary of State to have the powers of intervention that the Bill affords her.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have nothing further to add.

Question put and agreed to.

Clause 146 accordingly ordered to stand part of the Bill.

Clause 147

Secretary of State’s guidance

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It seems that our support for the clauses has run out. Clause 147 enables the Secretary of State to give guidance to Ofcom relating to its exercise of its statutory powers and functions under the Bill. It also allows the Secretary of State to give guidance to Ofcom around its functions and general powers under certain provisions of the Communications Act 2003. While we appreciate that the Secretary of State must consult Ofcom before issuing, revising or replacing guidance, we feel that this level of interference is unnecessary.

The Minister must recognise that the clause allows for an incredibly granular level of interference by the Secretary of State in the day-to-day functioning of a supposedly independent regulator. It profoundly interferes with enforcement and once again broadly undermines Ofcom’s independence. Civil society and stakeholders alike share our concerns. I must press the Minister on why this level of interference is included in the Bill—what is the precedent? We have genuine concerns that the fundamental aims of the Bill—to keep us all safe online—could easily be shifted according to the priorities of the Secretary of State of the day. We also need to ensure there is consistency in our overall approach to the Bill. Labour feels that this level of interference will cause the Bill to lack focus.

Ultimately, Ofcom, as the independent regulator, should be trusted to do what is right. The Minister must recognise how unpopular the Bill’s current approach of giving overarching powers to the Secretary of State is. I hope he will go some way to addressing our concerns, which, as I have already said, we are not alone in raising. For those reasons, we cannot support clause 147 as it stands.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We are introducing a new, groundbreaking regime, and we are trying to strike a balance between the need for regulatory independence of Ofcom and appropriate roles for Parliament and Government. There is a balance to strike there, particularly in an area such as this, which has not been regulated previously. It is a brand-new area, so we do not have decades of cumulated custom and practice that has built up. We are creating this from the ground up—from a blank sheet of paper.

That is why, in establishing this regime, we want to provide a facility for high-level strategic guidance to be given to Ofcom. Of course, that does not infringe on Ofcom’s day-to-day operations; it will continue to do those things itself, in taking decisions on individual enforcement matters and on the details around codes of practice. All those things, of course, remain for Ofcom.

We are very clear that guidance issued under clause 147 is strategic in nature and will not stray into the operational or organisational matters that should properly fall into the exclusive ambit of the independent regulator. There are a number of safeguards in the clause to ensure that the power is exercised in the way that I have just described and does not go too far.

First, I point to the fact that clause 147(8) simply says that

“OFCOM must have regard to the guidance”.

That is obviously different from a hard-edged statutory obligation for it to follow the guidance in full. Of course, it does mean that Ofcom cannot ignore it completely—I should be clear about that—but it is different from a hard-edged statutory obligation.

There is also the requirement for Ofcom to be consulted, so that its opinions can be known. Of course, being consulted does not mean that the opinions will be followed, but it means that they will be sought and listened to. There are also some constraints on how frequently this strategic guidance can be revised, to ensure that it does not create regulatory uncertainty by being chopped and changed on an unduly frequent basis, which would cause confusion.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I will be brief. The clause is incredibly important. It requires the Secretary of State to prepare and lay before Parliament annual reports about their performance in relation to online safety. We fully support such transparency, and indeed we want it to go further. That is what we have been trying to say in Committee all day. We agree in principle and therefore have not sought to amend the clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I could not possibly add to that exceptionally eloquent description.

Question put and agreed to.

Clause 148 accordingly ordered to stand part of the Bill.

Clause 149

Review

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we know, the clause compels the Secretary of State to undertake a review to assess the effectiveness of the regulatory framework. The review will have to be published and laid before Parliament, which we welcome. However, we note the broad time limits on this duty. We have heard repeatedly about the challenges that delays to the Bill’s full implementation will cause, so I urge the Minister to consider that point closely. By and large, though, we absolutely support the clause, especially as the Secretary of State will be compelled to consult Ofcom and other appropriate persons when carrying out the review—something that we have called for throughout scrutiny of the Bill. We only wish that that level of collaboration had been accepted by the Minister on the other clauses. I will not waste time repeating points that I have already made. We support the clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I welcome the shadow Minister’s support for this review clause, which is important. I will not add to her comments.

Question put and agreed to.

Clause 149 accordingly ordered to stand part of the Bill.

Clause 150

Harmful communications offence

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

That was about as clear as mud, actually, but let us leave it there.

Question put and agreed to.

Clause 150 accordingly ordered to stand part of the Bill.

Clauses 151 to 155 ordered to stand part of the Bill.

Clause 156

Sending etc photograph or film of genitals

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 41, in clause 156, page 131, line 15, at end insert—

“(za) B has not consented for A to share the photograph or film with B, or”.

This amendment makes it an offence to send an image of genitals to another person if the recipient has not given consent to receive the image.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 42, in clause 156, page 131, line 20, at end insert—

“(1A) A person consents if the person agrees by choice, and has the freedom and capacity to make that choice.”

This amendment is linked to Amendment 41.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

With your permission, Ms Rees, I will also speak to clause stand part.

Labour welcomes the clause. We see it as a positive step forward that the Government have committed to creating a new offence in certain circumstances where sending a photograph or film of a person’s genitals to another person will cause distress or humiliation. However, the Government have missed a huge opportunity to accurately capture the problems caused by sharing intimate images online. I will come to that shortly in addressing amendments 41 and 42.

We know that the act of sending unsolicited genital images—cyber-flashing, or sending dick pics—is a huge problem here in the UK. Research from Bumble has shown how disproportionately the issue affects young women. The statistics are shocking and speak for themselves. A whopping 48% of millennial women said that they had been sent an unsolicited sexual image in the last year alone. I must pay tribute to the right hon. Member for Basingstoke, who we all know shared her own experiences of cyber-flashing relatively recently. She is not alone—not in this House or in the country.

I have my own experiences, as do friends, colleagues and even my staff members, and we all share the same concerns about the prevalence of cyber-flashing. The Minister does not need to be reminded of it; he knows of the extent of the issues. We heard compelling evidence only a few weeks ago from Professor Clare McGlynn and Nima Elmi from Bumble, among others.

Labour firmly believes, as Professor McGlynn has outlined, that cyber-flashing is problematic because it is non-consensual conduct of a sexual nature. Distributing these images is not in and of itself wrong, but doing so without the consent of the recipient is. The non-consensual act breaches women’s rights to sexual autonomy, to be treated with dignity and to be free from sexual violence, regardless of the motive of the perpetrator.

We know that men’s motivations for cyber-flashing are varied and overlapping. They include misogyny, causing distress, sexual gratification, humour, boosting status among peers, sexual intimidation, and transactional motivations. Yet there is no evidence that the harms experienced by women are worse when offenders have the specific motivations identified in motive-based proposals, such as causing distress.

For example, a woman may be sent unsolicited penis images while on public transport, making her feel threatened and fearful for her safety, regardless of whether the sender intended to cause her alarm or was simply trying to impress his friends as a bit of banter. That is why the consent approach really is crucial, as I will now discuss in relation to amendments 41 and 42.

Amendment 41 would make it an offence to send an image of genitals to another person if the recipient has not given consent to receive that image. Labour recognises that there are two main options when drafting a new cyber-flashing criminal offence. The first is what we are trying to achieve with these amendments—a comprehensive consent-based offence requiring proof of non-consent. The alternative, as currently proposed by the Law Commission, is far too limited. It offers a motive-based offence, which applies only on proof of specific motives on the part of the offender, such as to cause distress, alarm or humiliation, to get sexual gratification, or to cause distress by being reckless. This is hugely problematic for women and girls across the country, and the Minister must recognise the message this sends to them.

Proving a motive behind an offence as simple as merely sending a photograph is nigh on impossible. If we really want to see systemic change in attitudes to women and girls, we fundamentally should not be creating laws that place the burden on the victim. A consent-based offence, as in our amendments, covers all forms of cyber-flashing, regardless of the motives of the sender. Motive requirements create an unjustified hierarchy of abuses and victims, and they do not reflect victims’ experiences. Requiring proof of specific motives will make investigations and prosecutions more difficult.

We know from police and victims that investigations and prosecutions for sharing sexual images without consent, such as revenge porn, are not taken forward due to similar motive requirements. How, therefore, can the Minister think that the provisions in the Bill related to cyber-flashing go far enough? Will they actually create change? I mentioned on Second Reading our genuine concerns about the levels of misogyny that have become far too normalised across our communities and within our society as a whole.

The consent-based offence provides a much better foundation for education and prevention projects. It sends the message that all sexual activity should be grounded in consent. It better supports education about online activities, with a focus on consent-based practices, and makes clear that any taking or sharing of sexual images without consent is wrong, harmful and criminal. Those are all positives.

The stakeholders are calling for a consent-based approach. The Opposition want the same. Even the Minister’s own Back Benchers can see that the Bill fails to capture and address the real harms women and girls face online. The Minister can likely sense my exasperation. It comes from a place of genuine frustration. I cannot understand how there has not been any movement on this from the Government side.

My final point—and indeed plea—is to urge the Minister to consider what is going on internationally on this issue. He will know that a consent-based cyber-flashing offence has been adopted in Texas and is being debated in other US states. Consent is easily obtained and criminal charges easily avoided. It is important to remember that avoiding being charged with a criminal offence is straightforward. All the sender needs to do is ask, “Would you like to see a picture of my genitals?” It is as simple as that. I am sure even the Minister can agree on that point. I urge him to genuinely consider amendments 41 and 42. There has been no movement from the Minister and no concessions thus far as we have scrutinised the Bill, but he must know that the Bill is far from perfect in its current form.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I would like to make a couple of comments. The shadow Minister mentioned education and prevention projects, which are key. In Scotland, our kids’ sex, health and relationship education in schools teaches consent from the earliest possible age. That is vital. We have a generation of men who think it is okay to send these images and not seek consent. As the shadow Minister said, the problem is everywhere. So many women have received images that they had no desire to see. They did not ask for them, and they did not consent to receive them, but they get them.

Requiring someone to prove the intent behind the offence is just impossible. It is so unworkable, and that makes it really difficult. This is yet another issue that makes it clear that we need to have reference to violence against women and girls on the face of the Bill. If that were included, we would not be making such a passionate case here. We would already have a code of conduct and assessments that have to take place on the basis of the specific harm to women and girls from such offences. We would not be making the case so forcefully because it would already be covered.

I wish the Minister would take on board how difficult it is for women and girls online, how much of a problem this specific behaviour is and how much pain and suffering it causes. It would be great if the Minister could consider moving somewhat on this issue in order to protect women and girls.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the Members who have contributed to the debate. Rather like with the provisions in clause 150, which we discussed a few minutes ago, a difficult and delicate balance needs to be struck. We want to criminalise that which should be criminal, but not inadvertently criminalise that which should not be. The legal experts at the Law Commission have been studying the matter and consulting other legal experts for quite some time. As my right hon. Friend the Member for Basingstoke said in her excellent speech, their recommendations have been our starting point.

It is probably worth making one or two points about how the clause works. There are two elements of intention, set out in subsection (1). First, the act of sending has to be intentional; it cannot be done accidentally. I think that is reasonable. Secondly, as set out in subsection (1)(a), there must be an intention to cause the person who sees the image alarm, distress or humiliation.

I understand the point that establishing intent could, in some circumstances, present a higher hurdle. As we discussed in relation to clause 150, we are, separately from this, working on the intimate image abuse offence, which does not require intention to be established; it simply requires lack of consent. I was not aware, until my right hon. Friend mentioned it a few moments ago—she was ahead of me there—that the Law Commission has given a timeframe for coming back. I am not sure whether that implies it will be concomitant with Ministry of Justice agreement or whether that will have to follow, but I am very pleased to hear that there is a timeframe. Clearly, it is an adjacent area to this and it will represent substantial progress.

I understand that it can sometimes be hard to establish intention, but there will be circumstances in which the context of such an incident will often make it clear that there was an intention to cause alarm, distress or humiliation.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Has the Minister ever received a dick pic?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Is that a rhetorical question?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

No, it is a genuine question.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

So he cannot possibly know how it feels to receive one. I appreciate the comments that he is trying to make, and that this is a fine balance, but I do see this specific issue of sending a photograph or film of genitals as black and white: they are sent either with or without consent. It is as simple as that. What other circumstances could there be? Can he give me an example of when one could be sent without the intention to cause distress, harm or intimidation?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

It is a fair question. There might be circumstances in which somebody simply misjudges a situation—has not interpreted it correctly—and ends up committing a criminal offence; stumbling into it almost by accident. Most criminal offences require some kind of mens rea—some kind of intention to commit a criminal offence. If a person does something by accident, without intention, that does not normally constitute a criminal offence. Most criminal offences on the statute book require the person committing the offence to intend to do something bad. If we replace the word “intent” with “without consent”, the risk is that someone who does something essentially by accident will have committed a criminal offence.

I understand that the circumstances in which that might happen are probably quite limited, and the context of the incidents that the hon. Member for Pontypridd and my right hon. Friend the Member for Basingstoke have described would generally support the fact that there is a bad intention, but we have to be a little careful not accidentally to draw the line too widely. If a couple are exchanging images, do they have to consent prior to the exchange of every single image? We have to think carefully about such circumstances before amending the clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will commit to consider the clause further, as my right hon. Friend has requested. It is important to do so in the context of the Law Commission’s recommendations, but she has pointed to wording in the Law Commission’s original report that could be used to improve the drafting here. I do not want to make a firm commitment to change, but I will commit to considering whether the clause can be improved upon. My right hon. Friend referred to the “likely to cause harm” test, and asked whether recklessness as to whether someone suffers alarm, distress or humiliation could be looked at as a separate element. We need to be careful; if we sever that from sexual gratification, we need to have some other qualification on sexual gratification. We might have sexual gratification with consent, which would be fine. If we severed them, we would have to add another qualification.

It is clear that there is scope for further examination of clause 156. That does not necessarily mean it will be possible to change it, but it is worth examining it further in the light of the comments made by my right hon. Friend. The testimony we heard from witnesses, the testimony of my right hon. Friend and what we heard from the hon. Member for Pontypridd earlier do demonstrate that this is a widespread problem that is hugely distressing and intrusive and that it represents a severe violation. It does need to be dealt with properly.

We need to be cognisant of the fact that in some communities there is a culture of these kinds of pictures being freely exchanged between people who have not met or communicated before—on some dating websites, for example. We need to draft the clause in such a way that it does not inadvertently criminalise those communities—I have been approached by members of those communities who are concerned.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

They have consent to do that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Member for Pontypridd says from a sedentary position that they have given consent. The consent is not built into the website’s terms and conditions; it is an assumed social norm for people on those websites. We need to tread carefully and be thoughtful, to ensure that by doing more to protect one group we do not inadvertently criminalise another.

There is a case for looking at the issue again. My right hon. Friend has made the point thoughtfully and powerfully, and in a way that suggests we can stay within the confines of the Law Commission’s advice, while being more thoughtful. I will certainly undertake to go away and do that, in consultation with my right hon. Friend and others.

--- Later in debate ---
For the time being, I will resist amendments 41 and 42, but in so doing I commit myself to look further at these measures. It is worth saying—this was mentioned a short time ago—that there is nothing in law dealing with this issue, so we have been debating points of detail from around the world. Those are important points of detail, and I am in no way minimising or dismissing them, but we should recognise that, today, Parliament is introducing this offence, which does not exist at the moment. We are taking a gigantic stride forward. While it is important to ensure that we get the details right, let us not forget that a gigantic stride forward is being taken here.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I wholeheartedly agree with the Minister’s comments. This is a gigantic step forward that is long overdue, and we wholeheartedly welcome the new offence being created, but, as he rightly pointed out, it is important that we get this right and that we make the measure as strong as possible so that the legislation causes direct and meaningful change.

To us, the issue is simple: “Do you want to see my genitals, yes or no?” We will push amendment 41 to the vote.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour supports clause 159, because it is vital that the Bill includes provisions for Ofcom to issue a penalty notice or confirmation decision when the provider may not be a legal person in the traditional sense. We have repeatedly maintained that it is central to the success of the Bill that, once implemented, it properly and sufficiently gives Ofcom the relevant powers, autonomy and independence to properly pursue providers of regulated services and their wrongdoings.

We recognise the complexity of the service providers’ business models and therefore agree that the Bill must be broad enough to ensure that penalty notices and confirmation decisions can be given, even when the provider may constitute an association, or an organisation between a group of people. Ultimately, as we have made clear, Labour will continue to support giving the regulator the tools required to keep us all safe online.

We have already raised concerns over Ofcom’s independence and the interference of and over-reliance on the Secretary of State’s powers within the Bill as it stands. However, we are in agreement on clause 159 and feel that it provides a vital tool for Ofcom to have at its disposal should the need for a penalty notice or confirmation decision arise. That is why we support the clause and have not sought to amend it.

Government amendment 159, as we know, ensures that if the provider of a service consists of two or more individuals, those individuals are jointly liable to pay a fee demanded under new schedule 2. As I will come on to in my comments on clauses 160 and 161, we welcome the provisions and clarifications around liability for fees when the provider of a service consists of two or more individuals.

As with clause 159, we welcome the clarity of provisions in the Bill that confirm actions to be taken where a group of two or more individuals act together. It is absolutely right that where two or more individuals together are the providers of a regulated service, they should be jointly and severally liable for any duty, requirement or liability to pay a fee.

We also welcome the clarification that that liability and joint responsibility will also apply in the event of a penalty notice or confirmation decision. We believe that these provisions are vital to capturing the true extent of where responsibility should lie, and we hope they will go some way to remedying the hands-off approach that service providers have managed to get away with for too long when it comes to regulation of the internet. We do, however, feel that the Government could have gone further, as we outlined in amendment 50, which we spoke to when we addressed clause 123.

Labour firmly believes that Ofcom’s ability to take action against non-compliance en masse is critical. That is why we welcome clause 160 and will not be seeking to amend it at this stage. We also fundamentally support clause 161, which contains provisions on how joint liability will operate.

We will speak to our concerns about supply chains when we debate a later clause—I believe it is new clause 13—because it is vital that this Bill captures the challenges around supply chain failures and where responsibility lies. With that in mind, we will support clause 161, with a view to the Minister understanding our broader concerns, which we will address when we debate new clause 13.

Finally, schedule 14 establishes that decisions or notices can be given jointly to both a regulated provider and its parent company. We particularly support the confirmation that all relevant entities must be given the opportunity to make representations when Ofcom seeks to establish joint liability, including on the matters contained in the decision or notice and whether joint liability would be appropriate.

As we have made clear, we see the provisions outlined in this schedule as fundamental to Ofcom’s ability to issue truly meaningful decisions, penalties and notices to multiple parties. The fact that, in this instance, service providers will be jointly liable to comply is key to capturing the extent to which it has been possible to perpetuate harm online for so long. That is why we support the intention behind schedule 14 and have not sought to amend it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister has set out clearly the purpose of and intent behind these clauses, and how they work, so I do not think I will add anything. I look forward to our future debate on the new clause.

There is one point of correction that I wish to make, and it relates to a question that the hon. Member for Aberdeen North asked this morning and that is germane to amendment 159. That amendment touches on the arrangements for recouping the set-up costs that Ofcom incurs prior to the Bill receiving Royal Assent. The hon. Member for Aberdeen North asked me over what time period those costs would be collected, and I answered slightly off the cuff. Now that I have had a chance to dig through the papers, I will take this opportunity to confirm exactly how that works.

To answer the question a little bit better than I did this morning, the place to go is today’s amendment paper. The relevant provisions are on page 43 of the amendment paper, in paragraph 7(5) of Government new schedule 2, which we will debate later. If we follow the drafting through—this is quite a convoluted trail to follow—it states that the cost can be recouped over a period that is not less than three years and not more than five years. I hope that gives the hon. Member for Aberdeen North a proper answer to her question from this morning, and I hope it provides clarity and points to where in the new schedule the information can be found. I wanted to take the first opportunity to clarify that point.

Beyond that, the hon. Member for Pontypridd has summarised the provisions in this group very well, and I have nothing to add to her comments.

Question put and agreed to.

Clause 159 accordingly ordered to stand part of the Bill.

Clause 160

Individuals providing regulated services: liability

Amendment made: 159, in clause 160, page 133, line 6, after “71” insert

“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)

This amendment ensures that, if the provider of a service consists of two or more individuals, those individuals are jointly liable to pay a fee demanded under NS2.

Clause 160, as amended, ordered to stand part of the Bill.

Clause 161 ordered to stand part of the Bill.

Schedule 14 agreed to.

Clause 162

Information offences: supplementary

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 163 to 165 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour supports the intention behind clause 162, because we believe that only by creating specific offences will the messaging around liability and the overall message about public safety really hit home for those at the top in Silicon Valley. We welcome the clarification on exactly how Ofcom will be able to exercise these important powers, and we support the process of giving notice, confirmation decisions and subsequent penalties. We see the clause as fundamental to the Bill’s overall success, although, as the Minister will recall, we feel that the Bill could go further in addressing broader offences beyond those around information practices. However, that is a debate for another day.

We believe that the clause makes the importance and, indeed, the power of information notices crystal clear for service providers to see, and Labour fully supports and welcomes that move. That is why we will support clause 162 and have not sought to amend it at this stage. We welcome the clarity in clause 163 around the process that applies when a person relies on a defence in an information offence. We see this clause as sitting alongside current legal precedents and are therefore happy to support it.

We fully support and welcome clause 164. We believe it is central to the entire argument around liability that the Minister knows Labour has been making for some time now. We have heard in Committee evidence sessions some truly compelling insights from people such as Frances Haugen, and we know for certain that companies are prone to covering up information that they know will be received unfavourably.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 167 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour welcomes clause 166, which specifies that references to regulated services and Ofcom’s information-gathering powers apply to services provided from outside the United Kingdom as well as to services provided from within the United Kingdom. While we recognise the challenges around internet regulation in the UK, we live in a global world, and we are pleased that the legislation has been drawn up in a way that will capture services based overseas.

We feel the Bill is lacking in its ability to regulate against content that may have originated from outside the UK. While it is welcome that regulated services based abroad will be within scope, we have concerns that that will do little to capture specific content that may not originate within the UK. We have raised these points at length in previous debates, so I will not dwell on them now, but the Minister knows that the Bill will continue to fall short when it does not capture, for example, child sexual exploitation and abuse content that was filmed and originated abroad. That is a huge loophole, which will allow harmful content to be present and to be perpetuated online well into the future. Although we support clause 166 for now, I urge the Minister to reconsider his view on how all-encompassing the current approach to content can be as he considers his Department’s strategy before Report.

Clause 167 outlines that the information offences in the Bill apply to acts done in the United Kingdom and outside the United Kingdom. We welcome its provisions, but we feel that the Government could go further. We welcome the clarification that it will be possible to prosecute information offences in any part of the UK as if they occurred there. Given the devastating pressures that our legal system already faces thanks to this Government’s cuts and shambolic approach to justice, such flexibility is crucial and a welcome step forward.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Last week or the week before, we debated extensively the points about the extraterritorial application to protecting children, and I made it clear that the Bill protects people as we would wish it to.

Clause 166 relates to extraterritorial enforceability. It is important to make sure that the duties, enforceable elements and sanctions apply worldwide, reflecting the realities of the internet, and clause 166 specifies that references to regulated services in the Bill include services provided from outside the United Kingdom. That means that services based overseas must also comply, as well as those in the UK, if they reach UK users.

The clause ensures that Ofcom has effective information-gathering powers and can seek information from in-scope companies overseas for the purposes of regulating and enforcing the regime. Obviously, companies such as Facebook are firmly in scope, as hon. Members would expect. The clause makes it clear that Ofcom can request information held outside the UK and interview individuals outside the UK, if that is necessary for its investigations.

Clause 167 explains that the information-related personal criminal offences in the Bill—for example, failing to comply with Ofcom’s information notices—apply to acts done inside and outside the UK. That means that those offences can be criminally prosecuted whether the perpetrator is based in the UK or outside the UK. That will send a clear message to the large global social media firms that no matter where they may be based in the world or where their services may be provided from, we expect them to comply and the enforcement provisions in the Bill will apply to them.

Question put and agreed to.

Clause 166 accordingly ordered to stand part of the Bill.

Clause 167 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Fifteenth sitting)

Alex Davies-Jones Excerpts
Committee stage
Thursday 23rd June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 23 June 2022 - (23 Jun 2022)
Clause 176 provides powers to amend schedules 5, 6 and 7. Those schedules, as colleagues will recall, cover priority criminal offences, which is schedule 7, child sexual exploitation and abuse offences, which is schedule 6, and terrorism offences, which is schedule 5. Clearly, if new offences are created or if there are existing offences that Parliament believes need to be added to these priority lists of offences, we need the flexibility to do that. An example might be a new offence created by a devolved Administration, a new offence that Parliament here at Westminster legislates for that we think needs to be a priority offence, or an existing offence that is not on the list now but in the future we think needs to be added to ensure that platforms proactively protect the public. We need this flexibility. Again, this speaks to the future-proofing of the Bill that Members have spoken about. It is an extremely important aspect of the Bill’s ability to respond to threats that may emerge in the future and to new legislation.
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Good morning, Sir Roger. As the Minister has outlined, clause 173 gives the Secretary of State the power to amend the list of fraud offences in what will be section 36 in relation to the duties about fraudulent advertising. Although we recognise that this power is subject to some constraints, Labour has concerns about what we consider to be an unnecessary power given to the Secretary of State to amend duties about fraudulent advertising on category 1 services.

We welcome the provisions outlined in clause 173(2), which lists the criteria that any new offences must meet before the Secretary of State may include them in the list of fraud offences in section 36. The Minister outlined some of those. Along the same lines, the provision in clause 173(3) to further limit the Secretary of State’s power to include new fraud offences—it lists types of offences that may not be added to section 36—is a positive step.

However, we firmly believe that delegated law making of this nature, even when there are these minor constraints in place, is a worrying course for the Government to pursue when we have already strongly verbalised our concerns about Ofcom’s independence. Can the Minister alleviate our concerns by clarifying exactly how this process will work in practice? He must agree with the points that colleagues from across the House have made about the importance of Ofcom being truly independent and free from any political persuasion, influence or control. We all want to see the Bill change things for the better so I am keen to hear from the Minister the specific reasoning behind giving the Secretary of State the power to amend this important legislation through what will seemingly be a simple process.

As we all know, clause 174 allows the Secretary of State to make regulations to amend or repeal provisions relating to exempt content or services; in other words, regulations made under this clause can be used to exempt certain content or services from the scope of the regulatory regime, or to bring them into scope. It will come as no surprise to the Minister that we have genuine concerns about the clause, given that it gives the Secretary of State of the day the power to amend the substantive scope of the regulatory regime. Although we agree with the Minister that a degree of flexibility is crucial to the Bill’s success, and we have indeed raised concerns throughout the Bill’s proceedings about the need to future-proof the Bill, it is a fine balance, and we feel that the powers in this clause are in excess of what is required. I would therefore be grateful if the Minister confirmed exactly why the legislation has been drafted in a way that essentially gives the Secretary of State free rein over these important regulations.

Clauses 175 and 176 seek to give the Secretary of State additional powers, and again Labour has concerns. Clause 175 gives the Secretary of State the power to amend the list in part 2 of schedule 1, specifically paragraph 10. That list sets out descriptions of education and childcare relating to England; it is for the relevant devolved Ministers to amend the list in their respective areas. Although we welcome the fact that certain criteria must be met before the amendments can be made, this measure once again gives the Secretary of State of the day the ability substantively to amend the scope of the regime more broadly.

Those concerns are felt even more strongly when we consider clause 176, which gives the Secretary of State the power to amend three key schedules to the Bill: schedule 5, on terrorism offences; schedule 6, on child sexual exploitation and abuse offences, except those extending to Scotland; and schedule 7, on priority offences, in some circumstances. Alongside stakeholders, including Carnegie, we strongly feel that the Secretary of State should not be able to amend the substantive scope of the regime at this level unless moves have been initiated by Ofcom and followed by effective parliamentary oversight and scrutiny. Parliament should have a say in this. There should be no room for this level of interference in a regulatory regime, and the Minister knows that these powers are at risk of being abused by a bad actor, whoever the Secretary of State of the day may be. I must, once again, press the Minister to specifically address the concerns that Labour colleagues and I have repeatedly raised, both during these debates and on Second Reading.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.

My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?

I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Again, Labour has concerns about clause 177, which gives the Secretary of State a power to make consequential provisions relating to the Bill or to regulations under the Bill. As we know, the power is exercised by regulation and includes the ability to amend the Communications Act 2003. I will spare the Committee a repetition of my sentiments, but the clause is part of an extremely worrying package of clauses relating to the Secretary of State’s powers, which we consider broadly unnecessary.

We have the same concerns about clause 178, which sets out how the powers to make regulations conferred on the Secretary of State may be used. Although we recognise that it is important in terms of flexibility and future-proofing that regulations made under the Bill can make different provisions for different purposes, in particular relating to different types of service, we are concerned about the precedent that this sets for future legislation that relies on an independent regulatory system.

Labour supports amendment 160, which will ensure that the regulations made under new schedule 2, which we will debate shortly, are subject to the affirmative procedure. That is vital if the Bill is to succeed. We have already expressed our concerns about the lack of scrutiny of other provisions in the Bill, so we see no issue with amendment 160.

The Minister has outlined clause 179, and he knows that we welcome parliamentary oversight and scrutiny of the Bill more widely. We regard this as a procedural clause and have therefore not sought to amend it.

Question put and agreed to.

Clause 177 accordingly ordered to stand part of the Bill.

Clause 178 ordered to stand part of the Bill.

Clause 179

Parliamentary procedure for regulations

Amendment made: 160, in clause 179, page 146, line 13, at end insert “, or

(k) regulations under paragraph 7 of Schedule (Recovery of OFCOM’s initial costs),”—(Chris Philp.)

This amendment provides that regulations under NS2 are subject to the affirmative procedure.

Clause 179, as amended, ordered to stand part of the Bill.

Clause 180

“Provider” of internet service

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider the following:

Clauses 181 to 188 stand part.

Amendment 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.

This amendment clarifies the definition of “content” in the Bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I will address clauses 180 to 182 together, before moving on to discuss our concerns about the remaining clauses in this group.

As we know, clause 180 determines who is the provider of an internet service and therefore who is subject to the duties imposed on providers. Labour has already raised concerns about the Bill’s lack of future-proofing and its inability to incorporate internet services that may include user-to-user models. The most obvious of those are user-to-user chat functions in gaming, which the hon. Member for Aberdeen North has raised on a number of occasions; we share her concerns.

Broadly, we think the Bill as it stands fails to capture the rapidity of technological advances, and the gaming industry is a key example of this. The Bill targets the providers that have control over who may use the user-to-user functions of a game, but in our view the clarity just is not there for emerging tech in the AI space in particular, so we would welcome the Minister’s comments on where he believes this is defined or specified in the Bill.

Clause 181 defines “user”, “United Kingdom user” and “interested person” in relation to regulated services. We welcome the clarification outlined in subsections (3) and (4) of the role of an employee at a service provider and their position when uploading content. We support the clarity on the term “internet service” in clause 182, and we welcome the provisions to capture services that are accessed via an app specifically, rather than just via an internet browser.

We welcome clause 183, which sets out the meaning of “search engine”. It is important to highlight the difference between search engines and user-to-user services, which has been attempted throughout the Bill. We heard from Google about its definition of “search”, and Labour agrees that, at their root, search services exist as an index of the web, and are therefore different from user-to-user services. We also fully appreciate the rapid nature of the internet—hundreds of web pages are created every single second—meaning that search services have a fundamental role to play in assisting users to find authoritative information that is most relevant to what they are seeking. Although search engines do not directly host content, they have an important role to play in ensuring that a delicate balance is maintained between online safety and access to lawful information. We are therefore pleased to support clause 183, which we feel broadly outlines the responsibilities placed on search services more widely.

On clause 184, Labour supports the need for proactive technology to be used by regulated service providers to comply with their duties on illegal content, content that is harmful to children, and fraudulent advertising. In our consideration of proactive technology elsewhere in the Bill, Labour has made it clear that we support measures to keep us all safe. When speaking to new clause 20, which we debated with clause 37, I set out our disagreement with the Bill’s stance on proactive technology. As it stands, the Bill will leave Ofcom unable to proactively require companies to use technology that can detect child abuse. Sadly, I was not particularly reassured by the Minister’s response, but it is important to place on the record again our view that proactive technology has an important role to play in improving online safety more widely.

Clause 185 provides information to assist Ofcom in its decision making on whether, in exercising its powers under the Bill, content is communicated publicly or privately. We see no issues with the process that the clause outlines. It is fundamentally right that, in the event of making an assessment of public or private content, Ofcom has a list of factors to consider and a subsequent process to follow. We will therefore support clause 185, which we have not sought to amend.

Clause 186 sets out the meaning of the term “functionality”. Labour supports the clause, particularly the provisions in subsection (2), which include the detailed ways in which platforms’ functionality can affect subsequent online behaviours. Despite our support, I put on the record our concern that the definitions in the clause do little to imagine or capture the broad nature of platforms or, indeed, the potential for them to expand into the AI space in future.

The Minister knows that Labour has advocated a systems-based approach to tackling online safety that would put functionality at the heart of the regulatory system. It is a frustrating reality that those matters are not outlined until clause 186. That said, we welcome the content of the clause, which we have not sought to amend.

Clause 187 aims to define “harm” as “physical or psychological harm”. Again, we feel that that definition could go further. My hon. Friend the Member for Batley and Spen spoke movingly about her constituent Zach in an earlier debate, and made a compelling case for clarity on the interplay between the physical and psychological harm that can occur online. The Minister said that the Government consider the Bill to cover a range of physical and psychological harms, but many charities disagree. What does he say to them?

We will shortly be considering new clause 23, and I will outline exactly how Labour feels that the Bill fails to capture the specific harms that women and girls face online. It is another frustrating reality that the Government have not taken the advice of so many stakeholders, and of so many women and girls, to ensure that those harms are on the face of the Bill.

Labour agrees with the provisions in clause 188, which sets out the meaning of “online safety functions” and “online safety matters”, so we have not sought to amend the clause.

Would it be appropriate for me to speak to the SNP amendment as well, Sir Roger?

None Portrait The Chair
- Hansard -

Not really. If the hon. Lady has finished with her own amendments, we should, as a courtesy, allow the SNP spokesperson to speak to her amendment first.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you, Sir Roger. I thank the shadow Minister for running through some of our shared concerns about the clauses. Similarly, I will talk first about some of the issues and questions that I have about the clauses, and then I will speak to amendment 76. Confusingly, amendment 76 was tabled to clause 189, which we are not discussing right now. I should have raised that when I saw the provisional selection of amendments. I will do my best not to stray too far into clause 189 while discussing the amendment.

I have raised before with the Minister some of the questions and issues that I have. Looking specifically at clause 181, I very much appreciate the clarification that he has given us about users, what the clause actually means, and how the definition of “user” works. To be fair, I agree with the way the definition of “user” is written. My slight concern is that, in measuring the number of users, platforms might find it difficult to measure the number of unregistered users and the number of users who are accessing the content through another means.

Let us say, for example, that someone is sent a WhatsApp message with a TikTok link and they click on that. I do not know whether TikTok has the ability to work out who is watching the content, or how many people are watching it. Therefore, I think that TikTok might have a difficulty when it comes to the child safety duties and working out the percentage or number of children who are accessing the service, because it will not know who is accessing it through a secondary means.

I am not trying to give anyone a get-out clause. I am trying to ensure that Ofcom can properly ensure that platforms that have a significant number of children accessing them through secondary means are still subject to the child safety duties even though there may not be a high number of children accessing the platform or the provider directly. My major concern is assessing whether they are subject to the child safety duties laid out in the Bill.

I will move straight on to our amendment 76, which would amend the definition of “content” in clause 189. I have raised this issue with the Minister already. The clause, as amended, would state that

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including but not limited to”—

and then a list. The reason I suggest that we should add those words “but not limited to” is that if we are to have a list, we should either make an exhaustive list or have clarity that there are other things that may not be on the list.

I understand that it could be argued that the word “including” suggests that the provision actually goes much wider than what is in the list. I understand that that is the argument that the Minister may make, but can we have some more clarity from him? If he is not willing to accept the amendment but he is willing to be very clear that, actually, the provision does include things that we have not thought of and that do not currently exist and that it genuinely includes anything communicated by means of an internet service, that will be very helpful.

I think that the amendment would add something positive to the Bill. It is potentially the most important amendment that I have tabled in relation to future-proofing the Bill, because it does feel as though the definition of “content”, even though it says “including”, is unnecessarily restrictive and could be open to challenge should someone invent something that is not on the list and say, “Well, it’s not mentioned, so I am not going to have to regulate this in the way we have to regulate other types of content.”

I have other questions about the same provision in clause 189, but I will hold on to those until we come to the next grouping.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I rise briefly to support amendment 76, in the name of the hon. Member for Aberdeen North. Labour supports broadening the definition of “content” in this way. I refer the Minister to our earlier contributions about the importance of bringing newspaper comments, for example, within the scope of the Bill; they are a clear example of a key loophole in the Bill. We believe that a broadened definition of “content” would be a positive step towards future-proofing the Bill and preventing unnecessary harm from any future form of content.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister, in her first contribution to the debate, introduced the broad purpose of the various clauses in this group, so I do not propose to repeat those points.

I would like to touch on one or two issues that came up. One is that clause 187 defines the meaning of “harm” throughout the Bill, although clause 150, as we have discussed, has its own internal definition of harm that is different. The more general definition of harm is made very clear in clause 187(2), which states:

“‘Harm’ means physical or psychological harm.”

That means that harm has a very broad construction in the Bill, as it should, to make sure that people are being protected as they ought to be.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Amendment 111 is not claimed; it has been tabled by the hon. Member for Stroud (Siobhan Baillie), who is not a member of the Committee. I am assuming that nobody wishes to take ownership of it and we will not debate it.

If the hon. Member for Aberdeen North wishes to move amendment 76, she will be able to do so at the end of the stand part debate.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we know, the clause sets out the meanings of various terms used in the Bill. Throughout our Committee debates, Labour has raised fundamental concerns on a number of points where we feel the interpretation of the Bill requires clarification. We raised concerns as early as clause 8, when we considered the Bill’s ability to capture harm in relation to newly produced CSEA content and livestreaming. The Minister may feel he has sufficiently reassured us, but I am afraid that simply is not the case. Labour has no specific issues with the interpretations listed in clause 189, but we will likely seek to table further amendments on Report in the areas that we feel require clarification.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to

“one-to-one live aural communications”

in defining things that are excluded.

I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.

I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.

More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.

--- Later in debate ---
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour has not tabled any amendments to clause 190, which lists the provisions that define or explain terms used in the Bill. However, it will come as no surprise that we dispute the Bill’s definition of harm, and I am grateful to my hon. Friend the Member for Batley and Spen for raising those important points in our lively debate about amendment 112 to clause 150. We maintain that the Minister has missed the point, in that the Bill’s definition of harm fails to truly capture physical harm caused as a consequence of being online. I know that the Minister has promised to closely consider that as we head to Report stage, but I urge him to bear in mind the points raised by Labour, as well as his own Back Benchers.

The Minister knows, because we have repeatedly raised them, that we have concerns about the scope of the Bill’s provisions relating to priority content. I will not repeat myself, but he will be unsurprised to learn that this is an area in which we will continue to prod as the Bill progresses through Parliament.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have made points on those issues previously. I do not propose to repeat now what I have said before.

Question put and agreed to.

Clause 190 accordingly ordered to stand part of the Bill.

Clause 191 ordered to stand part of the Bill.

Clause 192

Extent

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The clause provides that the Bill extends to England, Wales, Scotland and Northern Ireland, subject to the exceptions set out in subsections (2) to (7). We welcome clarification of how the devolved nations may be affected by the provisions of the Bill—that is of particular importance to me as a Welsh MP. It is important to clarify how amendments or appeals, as outlined in subsection (7), may work in the context of devolution more widely.

Labour also supports new clause 35 and Government amendment 141. Clearly, those working for Ofcom should have a defence to the offence of publishing obscene articles; sadly, we see that as a core part of establishing the online safety regime in full. We know that having such a defence available is likely to be an important part of the regulator’s role and that of its employees. Labour is therefore happy to support this sensible new clause and amendment.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Amendment 139 was tabled by a Member who is not a member of the Committee, and nobody has claimed it, so we come to amendment 49.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 49, in clause 193, page 161, line 1, leave out subsection (2) and insert—

“(2) Subject to subsection (2A) below, the other provisions of this Act come into force on such day as the Secretary of State may by regulations appoint.

(2A) The provisions of Part 5 shall come into force at the end of the period of three months beginning with the day on which this Act is passed.”

This amendment would bring Part 5 into force three months after the Act is passed.

We all understand the need for the Bill, which is why we have been generally supportive in Committee. I hope we can also agree that the measures that the Bill introduces must come into force as soon as is reasonably possible. That is particularly important for the clauses introducing protections for children, who have been subject to the harms of the online world for far too long already. I was glad to hear the Minister say in our discussions of clauses 31 to 33 that the Government share the desire to get such protections in place quickly.

My hon. Friend the Member for Worsley and Eccles South also spoke about our concerns about the commencement and transitional provisions when speaking to clauses 170 to 172. We fundamentally believe that the provisions on pornography in part 5 cannot, and should not, be susceptible to further delay, because they require no secondary legislation. I will come to that point in my comments on the amendment. More broadly, I will touch briefly on the reasons why we cannot wait for the legislation and make reference to a specific case that I know colleagues across the House are aware of.

My hon. Friend the Member for Reading East (Matt Rodda) has been a powerful voice on behalf of his constituents Amanda and Stuart Stephens, whose beloved son Olly was tragically murdered in a field outside his home. A BBC “Panorama” programme, shown only a few days ago, investigated the role that social media played in Olly’s death. It specifically highlighted disturbing evidence that some social media algorithms may still promote violent content to vulnerable young people. That is another example highlighting the urgent need for the Bill, along with a regulatory process to keep people safe online.

We also recognise, however, the important balance between the need for effective development of guidance by Ofcom, informed by consultation, and the need to get the duties up and running. In some cases, that will mean stipulating deadlines in the Bill; their absence at present is, we feel, a serious omission and oversight.

The amendment would bring part 5 of the Bill into force three months after it is enacted. The Minister knows how important part 5 is, so I do not need to repeat myself. The provisions of the amendment, like the subsequent amendments that Labour and others will likely table down the line, are central to keeping people safe online. We have heard compelling evidence from experts and speeches from colleagues across the House that have highlighted how vital it is that the Bill goes further on pornographic content. The amendment is simple. It seeks to make real, meaningful change as soon as is practically possible. The Bill is long delayed, and providers and users are desperate for clarity and positive change, which is what led us to table the amendment.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

In the interests of not having to make a speech in this debate, I want to let the hon. Member know that I absolutely support the amendment. It is well balanced, brings the most important provisions into force as soon as possible, and allows the Secretary of State to appoint dates for the others.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the hon. Member’s intervention, and I am grateful for her and her party’s support for this important amendment.

It is also worth drawing colleagues’ attention to the history of issues, which have been brought forward in this place before. We know there was reluctance on the part of Ministers when the Digital Economy Act 2017 was on the parliamentary agenda to commence the all-important part 3, which covered many of the provisions now in part 5. Ultimately, the empty promises made by the Minister’s former colleagues have led to huge, record failures, even though the industry is ready, having had years to prepare to implement the policy. I want to place on record my thanks to campaigning groups such as the Age Verification Providers Association and others, which have shown fierce commitment in getting us this far.

It might help if I cast colleagues’ minds back to the Digital Economy Act 2017, which received Royal Assent in April of that year. Following that, in November 2018, the then Minister of State for Digital and Creative Industries told the Science and Technology Committee that part 3 of the DEA would be in force “by Easter next year”. Then, in December 2018, both Houses of Parliament approved the necessary secondary legislation, the Online Pornography (Commercial Basis) Regulations 2018, and the required statutory guidance.

Shortly afterwards, in April 2019, the first delay arose when the Government published an online press release stating that part 3 of the DEA would not come into force until 15 July 2019. However, June 2019 came around and still there was nothing. On 20 June, shortly before part 3 was due to come into force, the then Under-Secretary of State told the House of Lords that the Government had failed to notify the European Commission of the statutory guidance, which would need to be done, and that that would result in a delay to the commencement of part 3

“in the region of six months”.—[Official Report, House of Lords, 20 June 2019; Vol. 798, c. 883.]

However, on 16 October 2019, the then Secretary of State announced via a written statement to Parliament that the Government

“will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography.”—[Official Report, 16 October 2019; Vol. 666, c. 17WS.]

A mere 13 days later, the Government called a snap general election. I am sure those are pretty staggering realities for the Minister to hear—and defend—but I am willing to listen to his defence. It really is not good enough. The industry is ready, the technology has been there for quite some time, and, given this Government’s fondness for a U-turn, there are concerns that part 5 of the Bill, which we have spent weeks deliberating, could be abandoned in a similar way as part 3 of the DEA was.

The Minister has failed to concede on any of the issues we have raised in Committee. It seems we are dealing with a Government who are ignoring the wide-ranging gaps and issues in the Bill. The Minister now has a last-ditch opportunity to bring about at least some positive change, and to signify that he is willing to admit that the legislation as it stands is far from perfect. The provisions in part 5 are critical; they are probably the most important in the entire Bill. I therefore urge him to work with Labour to make sure they are put to good use within a more than reasonable timeframe.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

On the implementation of part 3 of the Digital Economy Act 2017, all the events that the shadow Minister outlined predated my time in the Department. In fact, apart from the last few weeks of the period she talked about, the events predated my time as a Minister in different Departments, and I cannot speak for the actions and words of Ministers prior to my arrival in DCMS. What I can say, and I have said in Committee, is that we are determined to get the Bill through Parliament and implemented as quickly as we can, particularly the bits to do with child safety and the priority illegal content duties.

The shadow Minister commented at the end of her speech that she thought the Government had been ignoring parliamentary opinion. I take slight issue with that, given that we published a draft Bill in May 2021 and went through a huge process of scrutiny, including by the Joint Committee of the Commons and the Lords. We accepted 66 of the Joint Committee’s recommendations, and made other very important changes to the Bill. We have made changes such as addressing fraudulent advertising, which was previously omitted, and including commercial pornography—meaning protecting children—which is critical in this area.

The Government have made a huge number of changes to the Bill since it was first drafted. Indeed, we have made further changes while the Bill has been before the Committee, including amending clause 35 to strengthen the fraudulent advertising duties on large search companies. Members of Parliament, such as the right hon. Member for East Ham (Sir Stephen Timms), raised that issue on Second Reading. We listened to what was said at that stage and we made the changes.

There have also been quite a few occasions during these Committee proceedings when I have signalled—sometimes subtly, sometimes less so—that there are areas where further changes might be forthcoming as the Bill proceeds through both Houses of Parliament. I do not think the hon. Member for Pontypridd, or any member of the Committee, should be in any doubt that the Government are very open to making changes to the Bill where we are able to and where they are right. We have done so already and we might do so again in the future.

On the specifics of the amendment, we share the intention to protect children from accessing pornography online as quickly as possible. The amendment seeks to set a three-month timeframe within which part 5 must come into force. However, an important consideration for the commencement of part 5 will be the need to ensure that all kinds of providers of online pornography are treated the same, including those hosting user-generated content, which are subject to the duties of part 3. If we take a piecemeal approach, bringing into force part 5, on commercial pornography, before part 3, on user-to-user pornography, that may enable some of the services, which are quite devious, to simply reconfigure their services to circumvent regulation or cease to be categorised as part 5 services and try to be categorised as part 3 services. We want to do this in a comprehensive way to ensure that no one will be able to wriggle out of the provisions in the Bill.

Parliament has also placed a requirement on Ofcom to produce, consult on and publish guidance for in-scope providers on meeting the duties in part 5. The three-month timescale set out in the amendment would be too quick to enable Ofcom to properly consult on that guidance. It is important that the guidance is right; if it is not, it may be legally challenged or turn out to be ineffective.

I understand the need to get this legislation implemented quickly. I understand the scepticism that flows from the long delays and eventual cancellation of part 3 of the Digital Economy Act 2017. I acknowledge that, and I understand where the sentiment comes from. However, I think we are in a different place today. The provisions in the Bill have been crafted to address some of the concerns that Members had about the previous DEA measures—not least the fact that they are more comprehensive, as they cover user-to-user, which the DEA did not. There is therefore a clear commitment to getting this done, and getting it done fast. However, we also have to get it done right, and I think the process we have set out does that.

The Ofcom road map is expected before the summer. I hope that will give further reassurance to the Committee and to Parliament about the speed with which these things can get implemented. I share Members’ sentiments about needing to get this done quickly, but I do not think it is practical or right to do it in the way set out in amendment 49.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful for the Minister’s comments. However, I respectfully disagree, given the delays already since 2017. The industry is ready for this. The providers of the age verification services are ready for this. We believe that three months is an adequate timeframe, and it is vital that we get this done as quickly as possible. With that in mind, I will be pushing amendment 49 to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

New clause 42 introduces new schedule 2. New clause 43 provides that the additional fees charged to providers under new schedule 2 must be paid into the Consolidated Fund. We discussed that a few days ago: that is where the fees are currently destined, and I owe my right hon. Friend the Member for Basingstoke some commentary on this topic in due course. The Bill already provided that monetary penalties must be paid into the Consolidated Fund; those provisions are now placed in this new clause.

New schedule 2, which is quite detailed, makes provision in connection with Ofcom’s ability to recover its initial costs, which we have previously debated. As discussed, it is important not only that the taxpayer is protected from the ongoing costs but that the set-up costs are recovered. The taxpayer should not have to pay for the regulatory framework; the people who are being regulated should pay, whether the costs are incurred before or after commencement, in line with the “polluter pays” principle. Deep in new schedule 2 is the answer to the question that the hon. Member for Aberdeen North asked a day or two ago about the period over which set-up costs can be recovered: that period is specified as between three and five years. I hope that provides an introduction to the new clauses and the new schedule.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We welcome this grouping, which includes two new clauses and a new schedule. Labour has raised concerns about the future funding of Ofcom more widely, specifically when we discussed groupings on clause 42. The Minister’s response did little to alleviate our concerns about the future of Ofcom’s ability to raise funds to maintain its position as the regulator. Despite that, we welcome the grouping, particularly the provisions in the new schedule, which will require Ofcom to seek to recover the costs it has incurred when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services. This is an important step, which we see as being broadly in line with the kind of mechanisms already in place for other, similar regulatory regimes.

Ultimately, it is right that fees charged to providers under new schedule 2 must be paid into the Consolidated Fund, and it is important that Ofcom can recover its costs before a full fee structure and governance process is established. However, I have some questions for the Minister. How many people has Ofcom hired into roles, and can any of those costs count towards the calculation of fees? We want to ensure that other areas of regulation do not lose out as a consequence. Broadly speaking, though, we are happy to support the grouping and have not sought to table amendments at this stage.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.

Question put and agreed to.

New clause 42 accordingly read a Second time, and added to the Bill.

New Clause 43

Payment of sums into the Consolidated Fund

“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.

(2) In subsection (1), after paragraph (i) insert—

‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;

(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’

(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.

(4) After subsection (3) insert—

‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’

(5) In the heading, omit ‘licence’.”—(Chris Philp.)

This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Establishment of Advocacy Body

“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.

(2) A ‘child user’—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) ‘enforceable requirements’ relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)

This new clause creates a new advocacy body for child users of regulated internet services.

Brought up, and read the First time.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the Minister for his support for Labour legislation. Does he acknowledge that we have different Children’s Commissioners across the nations of the UK? Each would have the same rights to advocate for children, so we would have four, rather than one focusing on one specific issue, which is what the Children’s Commissioners across the UK are advocating for.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I do not have in front of me the relevant devolved legislation—I have only the Children Act 2004 directly in front of me—but I assume it is broadly similar. The hon. Member for Aberdeen North can correct me if I am wrong, but I assume it is probably broadly similar in the way—[Interruption.] She is not sure, so I do not feel too bad about not being sure either. I imagine it is similar. I am not sure that having similar statutory bodies with the same function—we would create another with the new clause—is necessarily helpful.

The Bill sets out formal processes that allow other organisations, such as the NSPCC, to raise complaints that have to be dealt with. That ensures that the voices of groups—including children, but not just children—will be heard. I suspect that if we have a children’s advocacy body, other groups will want their own and might feel that they have been overlooked by omission.

The good thing about the way the super-complaint structure in clause 140 works is that it does not prescribe what the groups are. Although I am sure that children will be top of the list, there will be other groups that want to advocate and to be able to bring super-complaints. I imagine that women’s groups will be on that list, along with groups advocating for minorities and people with various sexual orientations. Clause 140 is not exclusive; it allows all these groups to have a voice that must be heard. That is why it is so effective.

My right hon. Friend the Member for Basingstoke and the hon. Member for Batley and Spen asked whether the groups have enough resources to advocate on issues under the super-complaint process. That is a fair question. The allocation of funding to different groups tends to be done via the spending review process. Colleagues in other Departments—the Department for Education or, in the case of victims, the Ministry of Justice—allocate quite a lot of money to third-sector groups. The victims budget was approximately £200 million a year or two ago, and I am told it has risen to £300 million for the current financial year. That is the sort of funding that can find its way into the hands of the organisations that advocate for particular groups of victims. My right hon. Friend asked whether the proceeds of fines could be applied to fund such work, and I have undertaken to raise that with the Treasury.

We already have a statutory advocate for children: the four Children’s Commissioners for the four parts of the United Kingdom. We have the super-complaints process, which covers more than children’s groups, crucial though they are. We have given Ofcom statutory duties to consult when developing its codes of practice, and we have money flowing, via the Ministry of Justice, the DFE and others, into advocacy groups. Although we agree with the intention behind new clause 3, we believe its objectives are very well covered via the mechanisms that I have just set out at some length.

Online Safety Bill (Sixteenth sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 28th June 2022

Public Bill Committees
Brought up, and read the First time.
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. As my hon. Friend the Member for Worsley and Eccles South mentioned when speaking to new clause 11, Labour has genuine concerns about supply chain risk assessment duties. That is why we have tabled new clause 13, which, drawing on existing legislation, seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

As we know, platforms, particularly those supporting user-generated content, often employ services from third parties. At our evidence sessions, we heard from Danny Stone of the Antisemitism Policy Trust about examples of this, including Twitter explaining that racist GIFs were not its own content but were provided by another service. The hands-off approach that platforms have managed to get away with for far too long is exactly what the Bill is trying to fix, yet without this important new clause we fear there will be very little change.

We have already raised issues with the reliance on third party providers more widely, particularly content moderators, but the same problems also apply to some types of content. Labour fears a scenario in which a company captured by the regulatory regime established by the Bill will argue that an element of its service is not within the ambit of the regulator simply because it is part of a supply chain, represented by, but not necessarily the responsibility of, the regulated services.

The contracted element, supported by an entirely separate company, would argue that it is providing business-to-business services: not user-generated content per se, but content designed and delivered at arm’s length and provided to the user-to-user service to deploy to its users. The result would likely be a lengthy, costly and unhelpful legal process during which systems could not be effectively regulated. The same may apply in relation to moderators, where complex contract law would need to be invoked.

We recognise that UK legislation has addressed supply chain responsibility before. The Bribery Act 2010, for example, says that a company is liable if anyone performing services for or on the company’s behalf is found culpable of specific actions. We therefore strongly urge the Minister to consider the new clause. We hope he will see the extremely compelling reasons why liability should be introduced for platforms that fail to ensure that associated parties, considered to be part of a regulated service, help to fulfil and abide by the relevant duties.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The new clause seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill. The provider would be liable regardless of whether it has any control over the service in question. We take the view that this would impose an unreasonable burden on businesses and cause confusion over which companies are required to comply with the duties in the Bill.

As drafted, the Bill ensures legal certainty and clarity over which companies are subject to duties. Clause 180 makes it clear that the Bill’s duties fall on companies with control over the regulated service. The point about who is in control is very important, because the liability should follow the control. These companies are responsible for ensuring that any third parties, such as contractors or individuals involved in running the service, are complying with the Bill’s safety duties, so that they cannot evade their duties in that way.

Companies with control over the regulated service are best placed to keep users safe online, assess risk, and put in place systems and processes to minimise harm, and therefore bear the liability if there is a transgression under the Bill as drafted. Further, the Bill already contains robust provisions in clause 161 and schedule 14 that allow Ofcom to hold parent and subsidiary companies jointly liable for the actions of other companies in a group structure. These existing mechanisms promote strong compliance within groups of companies and ensure that the entities responsible for breaches are the ones held responsible. That is why we feel the Bill as drafted achieves the relevant objectives.

Question put, That the clause be read a Second time.

--- Later in debate ---
Brought up, and read the First time.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move, That the clause be read a Second time.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

New clause 15—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause requires Ofcom to publish a strategy related to their duty to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 16—Media literacy strategy: progress report

“(1) OFCOM must report annually on the delivery of the strategy required under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The report must include—

(a) a description of the steps taken in accordance with the strategy during the year to which the report relates; and

(b) an assessment of the extent to which those steps have had an effect on the media literacy of the public in that year.

(3) The assessment referred to in subsection (2)(b) must be made in accordance with the approach set out by OFCOM in the strategy (see section (Duty to promote media literacy: regulated user-to-user services and search services) (2)(d)).

(4) OFCOM must—

(a) publish the progress report in such manner as they consider appropriate; and

(b) send a copy of the report to the Secretary of State who must lay the copy before Parliament.”

This new clause is contingent on NC15.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The UK has a vast media literacy skills and knowledge gap, which leaves the population at risk of harm. Indeed, research from Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information. Similarly, about 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so.

Good media literacy is our first line of defence against bad information online. It can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and wellbeing, social cohesion and democracy. Clause 103 of the draft Bill proposed a new media literacy duty for Ofcom to replace the one in section 11 of the Communications Act 2003, but sadly the Government scrapped it from the final Bill.

Media literacy initiatives in the Online Safety Bill are now mentioned only in the context of risk assessments, but there is no active requirement for internet companies to promote media literacy. The draft Bill’s media literacy provision needed to be strengthened, not cut. New clauses 14, 15 and 16 would introduce a new, stronger media literacy duty on Ofcom, with specific objectives. They would require the regulator to produce a statutory strategy for delivering on it and then to report on progress made towards increasing media literacy under the strategy. There is no logical reason for the Minister not to accept these important new clauses or work with Labour on them.

Over the past few weeks, we have debated a huge range of harms that are being perpetuated online as we speak, from vile, misogynistic content about women and girls to state-sponsored disinformation. It is clear that the lessons of the past few years, when misinformation was able to significantly undermine public health, most notably throughout the pandemic, have not been learned. Harmful and, more importantly, false statistics were circulated online, which significantly hampered efforts to encourage take-up of the vaccine. We have concerns that, without a robust media literacy strategy, the consequences of misinformation and disinformation could go even further.

The issues that Labour has raised about the responsibility of those at the top—the Government—have been well documented. Only a few weeks ago, we spoke about the Secretary of State actually contributing to the misinformation discourse by sharing a picture of the Labour leader that was completely out of context. How can we be in a position where those at the top are contributing to this harmful discourse? The Minister must be living in a parallel universe if he cannot see the importance of curbing these harmful behaviours online as soon as possible. He must know that media literacy is at the very heart of the Bill’s success more widely. We genuinely feel that a strengthened media literacy policy would be a huge step forward, and I sincerely hope that the Minister will therefore accept the justification behind these important new clauses.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I agree entirely on these new clauses. Although the Bill will make things safer, it will do that properly only if supported by proper media literacy and the upskilling of everybody who spends any portion of their lives online. They all need better media literacy, and I am not excluding myself from that. Everybody, no matter how much time they have spent online, can learn more about better ways to fact-check and assess risk, and about how services use our data.

I pay tribute to all those involved in media literacy—all the educators at all levels, including school teachers delivering it as part of the curriculum, school teachers delivering it not as part of the curriculum, and organisations such as CyberSafe Scotland in my constituency, which is working incredibly hard to upskill parents and children about the internet. They also include organisations such as the Silver City Surfers in Aberdeen, where a group of young people teaches groups of elderly people how to use the internet. All those things are incredibly helpful and useful, but we need to ensure that Ofcom is at the top of that, producing materials and taking its duties seriously. It must produce the best possible information and assistance for people so that up-to-date media literacy training can be provided.

As we have discussed before, Ofcom’s key role is to ensure that when threats emerge, it is clear and tells people, “This is a new threat that you need to be aware of,” because the internet will grow and change all the time, and Ofcom is absolutely the best placed organisation to be recognising the new threats. Obviously, it would do that much better with a user advocacy panel on it, but given its oversight and the way it will be regulating all the providers, Ofcom really needs to take this issue as seriously as it can. It is impossible to overstate the importance of media literacy, so I give my wholehearted backing to the three new clauses.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The Government obviously recognise and support the intent behind the new clause, which is to make sure that work is undertaken by Ofcom specifically, and the Government more widely, on media literacy. That is important for the reasons laid out by the hon. Members for Aberdeen North and for Batley and Spen.

Ofcom already has a statutory duty to promote media literacy in relation to electronic media, which includes everything in scope of the Bill and more beyond. That is set out in the Communications Act 2003, so the statutory duty exists already. The duty proposed in new clause 14 is actually narrower in scope than the existing statutory duty on Ofcom, and I do not think it would be a very good idea to give Ofcom an online literacy duty with a narrower scope than the one it has already. For that reason, I will resist the amendment, because it narrows the duties rather than widens them.

I would also point out that a number of pieces of work are being done non-legislatively. The campaigns that the hon. Member for Batley and Spen mentioned—dating often, I think, back to the 1980s—were of course done on a non-legislative basis and were just as effective for it. In that spirit, Ofcom published “Ofcom’s approach to online media literacy” at the end of last year, which sets out how Ofcom plans to expand, and is expanding, its media literacy programmes, which cover many of the objectives specified in the new clause. Therefore, Ofcom itself has acted already—just recently—via that document.

Finally, I have two points about what the Government are doing. First, about a year ago the Government published their own online media literacy strategy, which has been backed with funding and is being rolled out as we speak. When it comes to disinformation more widely, which we have debated previously, we also have the counter-disinformation unit working actively on that area.

Therefore, through the Communications Act 2003, the statutory basis exists already, and on a wider basis than in these new clauses; and, through the online media literacy strategy and Ofcom’s own approach, as recently set out, this important area is well covered already.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We feel that we cannot have an online safety Bill without a core digital media literacy strategy. We are disappointed that clause 103 was removed from the draft Bill. We do not feel that the current regime, under the Communications Act 2003, is robust enough. Clearly, the Government do not think it is robust enough, which is why they tried to replace it in the first place. We are sad to see that replacement now scrapped altogether. We fully support these new clauses.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

My hon. Friend the Member for Ochil and South Perthshire is not present and he had intended to move this new clause. If the Committee does not mind, I will do more reading and look at my notes more than I would normally when giving a speech.

Misinformation and disinformation arise during periods of uncertainty, either acutely, such as during a terror attack, or over a long period, as with the pandemic. That often includes information gaps and a proliferation of inaccurate claims that spread quickly. Where there is a vacuum of information, we can have bad actors or the ill-informed filling it with false information.

Information incidents are not dealt with effectively enough in the Bill, which is focused on regulating the day-to-day online environment. I accept that clause 146 gives the Secretary of State powers of direction in certain special circumstances, but their effectiveness in real time would be questionable. The Secretary of State would have to ask Ofcom to prioritise its media literacy function or to make internet companies report on what they are doing in response to a crisis. That is just too slow, given the speed at which such incidents can spread.

The new clause might involve Ofcom introducing a system whereby emerging incidents could be reported publicly and different actors could request the regulator to convene a response group. The provision would allow Ofcom to be more proactive in its approach and, in what I hope will be rare moments, to provide clear guidance. That is why the new clause is a necessary addition to the Bill.

Many times, we have seen horrendous incidents unfold on the internet, in a very different way from how they ever unfolded in newspapers, on news websites or among people talking. We have seen the untold and extreme harm that such information incidents can cause, as significant, horrific events can be spread very quickly. We could end up in a situation where an incident happens and, for example, a report spreads that a Muslim group was responsible when there is absolutely no basis of truth to that. A vacuum can be created and bad actors step into it in order to spread discrimination and lies, often about minority groups who are already struggling. That is why we move the new clause.

For the avoidance of doubt, new clause 45, which was tabled by Labour, is also to be debated in this group. I am more than happy to support it.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we know, the new clause would give Ofcom a proactive role in identifying and responding to misinformation incidents that can occur in a moment of crisis. As we have discussed, there are huge gaps in the Bill’s ability to sufficiently arm Ofcom with the tools it will likely need to tackle information incidents in real time. It is all very well that the Bill will ensure that things such as risk assessments are completed, but, ultimately, if Ofcom is not able to proactively identify and respond to incidents in a crisis, I have genuine concerns about how effective this regulatory regime will be in the wider sense. Labour is therefore pleased to support the new clause, which is fundamental to ensuring that Ofcom can be the proactive regulator that the online space clearly needs.

The Government’s methods of tackling disinformation are opaque, unaccountable and may not even work. New clause 45, which would require reporting to Parliament, may begin to address this issue. When Ministers are asked how they tackle misinformation or disinformation harms, they refer to some unaccountable civil service team involved in state-based interference in online media.

I thank those at Carnegie UK Trust for their support when researching the following list, and for supporting my team and me to make sense of the Bill. First, we have the counter-disinformation unit, which is based in the Department for Digital, Culture, Media and Sport and intends to address mainly covid issues that breach companies’ terms of service and, recently, the Russia-Ukraine conflict. In addition, the Government information cell, which is based in the Foreign, Commonwealth and Development Office, focuses on war and national security issues, including mainly Russia and Ukraine. Thirdly, there is the so-called rapid response unit, which is based in the Cabinet Office, and mainly tackles proactive counter-messaging.

Those teams appear to nudge service providers in different ways where there are threats to national security or the democratic process, or risks to public health, yet we have zero record of their effectiveness. The groups do not publish logs of action to any external authority for oversight of what they raise with companies using the privilege authority of Her Majesty’s Government, nor do they publish the effectiveness of their actions. As far as we know, they are not rooted in expert independent external advisers. That direct state interference in the media is very worrying.

In our recent debate on amendment 83, which calls on the Government to include health misinformation and disinformation in the Bill, the Minister clearly set out why he thinks the situation is problematic. He said,

“We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.”––[Official Report, Online Safety Public Bill Committee, 14 June 2022; c. 408.]

Until we know more about those units, the boundary between their actions and that of a press office remains unclear. In the new regulatory regime, Ofcom needs to be kept up to date on the issues they are raising. The Government should reform the system and bring those units out into the open. We support Carnegie’s longer term strategic goal to set up a new external oversight body and move the current Government functions under Ofcom’s independent supervision. The forthcoming National Security Bill may tackle that, but I will leave that for the Minister to consider.

There must be a reporting system that requires the Government to set out their operational involvement with social media companies to address misinformation and disinformation, which is why we have tabled new clause 45. I hope the Minister will see that the current efforts in these units are hugely lacking in the transparency that we all want and that, as we have learned, is fundamental to keeping us all safe online.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We agree that it is important that the Bill contains measures to tackle disinformation and misinformation that may emerge during serious information incidents, but the Bill already contains measures to address those, including the powers vested in the Secretary of State under clause 146, which, when debated, provoked some controversy. Under that clause, the Secretary of State will have the power to direct Ofcom when exercising its media literacy functions in the context of an issue of public health or safety or national security.

Moreover, Ofcom will be able to require platforms to issue a public statement about the steps they are taking to respond to a threat to public health or safety or to national security. As we discussed, it is appropriate that the Secretary of State will make those directions, given that the Government have the access to intelligence around national security and the relevant health information. Ofcom, as a telecoms regulator, obviously does not have access to that information, hence the need for the Secretary of State’s involvement.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Given that clarification, I will not press the new clause. The Minister has made the case strongly enough and has clarified clause 85(1) to my satisfaction. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 23

Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women or girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).” —(Alex Davies-Jones.)

This new clause applies provisions to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

Brought up, and read the First time.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move, That the clause be read a Second time.

This new clause would apply provisions applied to priority illegal content also to content that constitutes, encourages or promotes violence against women and girls. As it stands, the Bill is failing women and girls. In an attempt to tackle that alarming gap, the new clause uses the Istanbul convention definition of VAWG, given that the Home Secretary has so recently agreed to ratify the convention—just a decade after it was signed.

The Minister might also be aware that GREVIO—the Group of Experts on Action against Violence against Women and Domestic Violence—which monitors the implementation of the Istanbul convention, published a report in October 2021 on the digital dimension of violence against women and girls. It stated that domestic laws are failing to place the abuse of women and girls online

“in the context of a continuum of violence against women that women and girls are exposed to in all spheres of life, including in the digital sphere.”

The purpose of naming VAWG in the Bill is to require tech companies to be responsible for preventing and addressing VAWG as a whole, rather than limiting their obligations only to specific criminal offences listed in schedule 7 and other illegal content. It is also important to note that the schedule 7 priority list was decided on without any consultation with the VAWG sector. Naming violence against women and girls will also ensure that tech companies are held to account for addressing emerging forms of online hate, which legislation is often unable to keep up with.

We only need to consider accounts from survivors of online violence against women and girls, as outlined in “VAWG Principles for the Online Safety Bill”, published in September last year, to really see the profound impact that the issue is having on people’s lives. Ellesha, a survivor of image-based sexual abuse, was a victim of voyeurism at the hands of her ex-partner. She was filmed without her consent and was later notified by someone else that he had uploaded videos of her to Pornhub. She recently spoke at an event that I contributed to—I believe the right hon. Member for Basingstoke and others also did—on the launch of the “Violence Against Women and Girls Code of Practice”. I am sure we will come to that code of practice more specifically on Report. Her account was genuinely difficult to listen to.

This is an issue that Ellesha, with the support of EVAW, Glitch, and a huge range of other organisations, has campaigned on for some time. She says:

“Going through all of this has had a profound impact on my life. I will never have the ability to trust people in the same way and will always second guess their intentions towards me. My self confidence is at an all time low and although I have put a brave face on throughout this, it has had a detrimental effect on my mental health.”

Ellesha was informed by the police that they could not access the websites where her ex-partner had uploaded the videos, so she was forced to spend an immense amount of time trawling through all of the videos uploaded to simply identify herself. I can only imagine how distressing that must have been for her.

Pornhub’s response to the police inquiries was very vague in the first instance, and it later ignored every piece of following correspondence. Eventually the videos were taken down, likely by the ex-partner himself when he was released from the police station. Ellesha was told that Pornhub had only six moderators at the time—just six for the entire website—and it and her ex-partner ultimately got away with allowing the damaging content to remain, even though the account was under his name and easily traced back to his IP address. That just is not good enough, and the Minister must surely recognise that the Bill fails women in its current form.

If the Minister needs any further impetus to genuinely consider the amendment, I point him to a BBC report from last week that highlighted how much obscene material of women and girls is shared online without their consent. The BBC’s Angus Crawford investigated Facebook accounts and groups that were seen to be posting pictures and videos of upskirting. Naturally, Meta—Facebook’s owner—said that it had a grip on the problem and that those accounts and groups had all been removed, yet the BBC was able to find thousands of users sharing material. Indeed, one man who posted videos of himself stalking schoolgirls in New York is now being investigated by the police. This is the reality of the internet; it can be a powerful, creative tool for good, but far too often it seeks to do the complete opposite.

I hate to make this a gendered argument, but there is a genuine difference between the experiences of men and women online. Last week the Minister came close to admitting that when I queried whether he had ever received an unsolicited indecent picture. I am struggling to understand why he has failed to consider these issues in a Bill proposed by his Department.

The steps that the Government are taking to tackle violence against women and girls offline are broadly to be commended, and I welcome a lot of the initiatives. The Minister must see sense and do the right thing by also addressing the harms faced online. We have a genuine opportunity in the Bill to prevent violence against women and girls online, or at least to diminish some of the harms they face. Will he please do the right thing?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister is right to raise the issue of women and girls being disproportionately—one might say overwhelmingly—the victims of certain kinds of abuse online. We heard my right hon. Friend the Member for Basingstoke, the shadow Minister and others set that out in a previous debate. The shadow Minister is right to raise the issue.

Tackling violence against women and girls has been a long-standing priority of the Government. Indeed, a number of important new offences have already been and are being created, with protecting women principally in mind—the offence of controlling or coercive behaviour, set out in the Serious Crime Act 2015 and amended in the Domestic Abuse Act 2021; the creation of a new stalking offence in 2012; a revenge porn offence in 2015; and an upskirting offence in 2019. All of those offences are clearly designed principally to protect women and girls who are overwhelmingly the victims of those offences. Indeed, the cyber-flashing offence created by clause 156—the first time we have ever had such an offence in this jurisdiction—will, again, overwhelmingly benefit women and girls who are the victims of that offence.

All of the criminal offences I have mentioned—even if they are not mentioned in schedule 7, which I will come to in a moment—will automatically flow into the Bill via the provisions of clause 52(4)(d). Criminal offences where the victim is an individual, which these clearly all are, automatically flow into the provisions of the Bill, including the offences I just listed, which have been created particularly with women in mind.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I hope I have made very clear in everything I have said, which I do not propose to repeat, that the way the Bill operates, in several different areas, and the way the criminal law has been constructed over the past 10 years, building on the work of previous Governments, is that it is designed to make sure that the crimes committed overwhelmingly against women and girls are prioritised. I think the Bill does achieve the objective of providing that protection, which every member of this Committee wishes to see delivered. I have gone through it in some detail. It is woven throughout the fabric of the Bill, in multiple places. The objective of new clause 23 is more than delivered.

In conclusion, we will be publishing a list of harms, including priority harms for children and adults, which will then be legislated for in secondary legislation. The list will be constructed with the vulnerability of women and girls particularly in mind. When Committee members see that list, they will find it reassuring on this topic. I respectfully resist the new clause, because the Bill is already incredibly strong in this important area as it has been constructed.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Bill is strong, but it could be stronger. It could be, and should be, a world-leading piece of legislation. We want it to be world-leading and we feel that new clause 23 would go some way to achieving that aim. We have cross-party support for tackling violence against women and girls online. Placing it on the face of the Bill would put it at the core of the Bill—at its heart—which is what we all want to achieve. With that in mind, I wish to press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Brought up, and read the First time.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move, That the clause be read a Second time.

This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content, also known as deepfakes. The report must contain particular reference to the harms caused to those working in the entertainment industry.

The Government define artificial intelligence as

“technologies with the ability to perform tasks that would otherwise require human intelligence, such as visual perception, speech recognition, and language translation”.

That kind of technology has advanced rapidly in recent years, and commercial AI companies can be found across all areas of the entertainment industries, including voice, modelling, music, dance, journalism and gaming—the list goes on.

One key area of development is AI-made performance synthetisation, which is the process of creating a synthetic performance. That has a wide range of applications, including automated audiobooks, interactive digital avatars and “deepfake” technology, which often, sadly, has more sinister implications. Innovation for the entertainment industry is welcome and, when used ethically and responsibly, can have various benefits. For example, AI systems can create vital sources of income for performers and creative workers. From an equalities perspective, it can be used to increase accessibility for disabled workers.

However, deepfake technology has received significant attention globally due to its often-malicious application. Deepfakes have been defined as,

“realistic digital forgeries of videos or audio created with cutting-edge machine learning techniques.”

An amalgamation of artificial intelligence, falsification and automation, deepfakes use deep learning to replicate the likeness and actions of real people. Over the past few years, deepfake technology has become increasingly sophisticated and accessible. Various apps can be downloaded for free, or at low cost, to utilise deepfake technology.

Deepfakes can cause short-term and long-term social harms to individuals working in the entertainment industry, and to society more broadly. Currently, deepfakes are mostly used in pornography, inflicting emotional and reputational damage, and in some cases violence towards the individual—mainly women. The US entertainment union, the Screen Actors Guild, estimates that 96% of deepfakes are pornographic and depict women, and 99% of deepfake subjects are from the entertainment industry.

However, deepfakes used without consent pose a threat in other key areas. For example, deepfake technology has the power to alter the democratic discourse. False information about institutions, policies, and public leaders, powered by a deepfake, can be exploited to spin information and manipulate belief. For example, deepfakes have the potential to sabotage the image and reputation of a political candidate and may alter the course of an election. They could be used to impersonate the identities of business leaders and executives to facilitate fraud, and also have the potential to accelerate the already declining trust in the media.

Alongside the challenges presented by deepfakes, there are issues around consent for performers and creative workers. In a famous case, the Canadian voiceover artist Bev Standing won a settlement after TikTok synthesised her voice without her consent and used it for its first ever text-to-speech voice function. Many artists in the UK are also having their image, voice or likeness used without their permission. AI systems have also started to replace jobs for skilled professional performers because using them is often perceived to be a cheaper and more convenient way of doing things.

Audio artists are particularly concerned by the development of digital voice technology for automated audiobooks, using the same technology used for digital voice assistants such as Siri and Alexa. It is estimated that within one or two years, high-end synthetic voices will have reached human levels. Equity recently conducted a survey on this topic, which found that 65% of performers responding thought that the development of AI technology poses a threat to employment opportunities in the performing arts sector. That figure rose to 93% for audio artists. Pay is another key issue; it is common for artists to not be compensated fairly, and sometimes not be paid at all, when engaging with AI. Many artists have also been asked to sign non-disclosure agreements without being provided with the full information about the job they are taking part in.

Government policy making is non-existent in this space. In September 2021 the Government published their national AI strategy, outlining a 10-year plan to make Britain a global AI superpower. In line with that strategy, the Government have delivered two separate consultations looking at our intellectual property system in relation to AI.

None Portrait The Chair
- Hansard -

Order. I am sorry, but I must interrupt the hon. Lady to adjourn the sitting until this afternoon, when Ms Rees will be in the Chair.

Before we leave the room, my understanding is that it is hoped that the Bill will report this afternoon. That is a matter for the usual channels; it is nothing to do with the Chair. However, of course, it is an open-ended session, so if you are getting close to the mark, you may choose to go on. If that poses a problem for Ms Rees, I am prepared to take the Chair again to see it through if we have to. On the assumption that I do not, thank you all very much indeed for the courtesy you have shown throughout this session, which has been exemplary. I also thank the staff; thank you very much.

Online Safety Bill (Seventeenth sitting) Debate

Full Debate: Read Full Debate

Alex Davies-Jones

Main Page: Alex Davies-Jones (Labour - Pontypridd)

Online Safety Bill (Seventeenth sitting)

Alex Davies-Jones Excerpts
Committee stage
Tuesday 28th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 28 June 2022 - (28 Jun 2022)
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

Before we adjourned, I was discussing the Government’s national artificial intelligence strategy and the two separate consultations launched by the Government to look at the intellectual property system in relation to AI. In those consultations, the Intellectual Property Office recognised that AI

“is playing an increasing role in...artistic creativity.”

However, specific questions about reviewing or enhancing performers’ rights were notably absent from both Government consultations. If the UK Government really want to make Britain a global AI and creative superpower, strengthening the rights of performers and other creatives must be at the heart of the national AI strategy.

Another key challenge is that our intellectual property framework is desperately out of date. Currently, performers have two sets of rights under the Copyright, Designs and Patents Act 1988: the right to consent to the making of a recording of a performance; and the right to control the subsequent use of such recordings, such as the right to make copies. However, as highlighted by Dr Mathilde Pavis, senior lecturer in law at the University of Exeter, AI-made performance synthetisation challenges our intellectual property framework because it reproduces performances without generating a recording or a copy, and therefore falls outside the scope of the Act. An unintended consequence is that people are left vulnerable to abuse and exploitation. Without effective checks and balances put in place by the Government, that will continue. That is why 93% of Equity members responding to a recent survey stated that the Government should introduce a new legal protection for performers, so that a performance cannot be reproduced by AI technology without the performer’s consent.

Advances in AI, including deepfake technology, have reinforced the urgent need to introduce image rights—also known as personality rights or publicity rights. That refers to

“the expression of a personality in the public domain”,

such as an individual’s name, likeness or other personal indicators. Provision of image rights in law enables performers to safeguard meaningful income streams, and to defend their artistic integrity, career choices, brand and reputation. More broadly, for society, it is an important tool for protecting privacy and allowing an individual to object to the use of their image without consent.

In the UK, there is no codified law of image rights or privacy. Instead, we have a patchwork of statutory and common-law causes of action, which an individual can use to protect various aspects of their image and personality. However, none of that is fit for purpose. Legal provision for image rights can be found around the world, so the Government here can and should do more. For example, some American states recognise the right through their statute, and some others through common law. California has both statutory and common-law strains of authority, which protect slightly different forms of the right.

The Celebrities Rights Act of 1985 was passed in California and extended the personality rights for a celebrity to 70 years after their death. In 2020, New York State passed a Bill that recognised rights of publicity for “deceased performers” and “deceased personalities”. Guernsey has created a statutory regime under which image rights can be registered. The legislation centres on the legal concept of a “personnage”—the person or character behind a personality that is registered. The image right becomes a property right capable of protection under the legislation through registration, which enables the image right to be protected, licensed and assigned.

The Minister will know that Equity is doing incredible work to highlight the genuine impact that this type of technology is having on our creative industry and our performers. He must therefore see the sense in our new clause, which would require the Government at least to consider the matter of synthetic media content, which thus far they have utterly failed to do.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship again, Ms Rees. I thank the shadow Minister, the hon. Member for Pontypridd, for raising the issues that she has done about synthetic and digitally manipulated content, which we are very conscious of. We are conscious of the risk of harm to those who work in the entertainment industry and of course, in particular, to victims of deepfake pornography.

We take intellectual property infringement extremely seriously. The Government have recently published a counter-infringement strategy, setting out a range of steps that we intend to take to strengthen the whole system approach to tackling infringement of intellectual property rights. It is widely acknowledged that the United Kingdom has an intellectual property framework that is genuinely world leading and considered among the best in the world. That includes strong protections for performers’ rights. We intend that to continue. However, we are not complacent and the law is kept under review, not least via the counter-infringement strategy I mentioned a moment ago.

Harmful synthetic media content, including the deepfakes that the hon. Member for Pontypridd mentioned, is robustly addressed by the safety duties set out in the Bill in relation to illegal content—much deepfake content, if it involves creating an image of someone, would be illegal—as well as content that could be harmful to children and content that will be on the “legal but harmful” adult list. Those duties will tackle the most serious and illegal forms of deepfake and will rightly cover certain threats that undermine our democracy. For example, a manipulated media image that contained incitement to violence, such as a deepfake of a politician telling people to attack poll workers because they are rigging an election, would obviously already fall foul of the Bill under the illegal duties.

In terms of reporting and codes of practice, the Bill already requires Ofcom to produce codes of practice setting out the ways in which providers can take steps to reduce the harm arising from illegal and harmful content, which could include synthetic media content such as deepfakes where those contain illegal content.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister uses the example of a deepfake of a politician inciting people to attack poll workers during an election. Given that some of the technology is so advanced that it is really difficult to spot when the deepfakes actually occur, could it be argued that Ofcom as regulator or even the platforms themselves would be averse to removing or reporting the content, as it could fall foul of the democratic content exemption in the Bill?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The democratic content protection that the shadow Minister refers to, in clause 15, is not an exemption; it is a duty to take into account content of democratic importance. That is on line 34 of page 14. When making a decision, it has to be taken into account—it is not determinative; it is not as if a politician or somebody involved in an election gets a free pass to say whatever they like, even if it is illegal, and escapes the provisions of the Bill entirely. The platform simply has to take it into account. If it was a deepfake image that was saying such a thing, the balancing consideration in clause 15 would not even apply, because the protection applies to content of democratic importance, not to content being produced by a fake image of a politician.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is important that we get this right. One of our concerns on clause 15, which we have previously discussed, relates to this discussion of deepfakes, particularly of politicians, and timeframes. I understand the Minister’s point on illegal content. If there is a deepfake of a politician—on the eve of poll, for example—widely spreading disinformation or misinformation on a platform, how can the Minister confidently say that that would be taken seriously, in a timely manner? That could have direct implications on a poll or an election. Would the social media companies have the confidence to take that content down, given clause 15?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The protections in clause 15—they are not exemptions—would only apply to content that is of bona fide, genuine democratic importance. Obviously, a deepfake of a politician would not count as genuine, democratic content, because it is fake. If it was a real politician, such as the hon. Lady, it would benefit from that consideration. If it was a fake, it would not, because it would not be genuine content of democratic importance.

It is also worth saying that if—well, I hope when—our work with the Law Commission to review the criminal law related to the non-consensual taking and sharing of intimate images is taken forward, that will then flow into the duties in the Bill. Deepfakes of intimate images are rightly a concern of many people. That work would fall into the ambit of the Bill, either via clause 52, which points to illegal acts where there is an individual victim, or schedule 7, if a new intimate image abuse offence were added to schedule 7 as a priority offence. There are a number of ways in which deepfakes could fall into the ambit of the Bill, including if they relate to extreme pornography.

The new clause would require the production of a report, not a change to the substantive duties in the Bill. It is worth saying that the Bill already provides Ofcom with powers to produce and publish reports regarding online safety matters. Those powers are set out in clause 137. The Bill will ensure that Ofcom has access to the information required to prepare those reports, including information from providers about the harm caused by deepfakes and how companies tackle the issue. We debated that extensively this morning when we talked about the strong powers that already exist under clause 85.

The hon. Lady has raised important points about intellectual property, and I have pointed to our counter-infringement strategy. She raised important points about deepfakes both in a political context and in the context of especially intimate images being generated by AI. I hope I have set out how the Bill addresses concerns in those areas. The Bill as drafted addresses those important issues in a way that is certainly adequate.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the Minister’s comments and I am grateful for his reassurance on some of the concerns that were raised. At this stage we will not press the matter to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 27

OFCOM: power to impose duties on regulated services

“OFCOM: power to impose duties on regulated services

(1) OFCOM may carry out an assessment of the risk of harm posed by any regulated service.

(2) Where OFCOM assess a service to pose a very high risk of harm, OFCOM may, notwithstanding the categorisation of the service or the number or profile of its users, impose upon the service duties equivalent to—

(a) the children’s risk assessment duties set out in sections 10 and 25 of this Act; and

(b) the safety duties protecting children set out in sections 11 and 26 of this Act.”—(Kirsty Blackman.)

This new clause enables Ofcom to impose on any regulated service duties equivalent to the children’s risk assessment duties and the safety duties protecting children.

Brought up, and read the First time.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

I drafted this new clause following a number of conversations and debates that we had in Committee about how the Act will be scrutinised. How will we see whether the Act is properly achieving what it is supposed to achieve? We know that there is currently a requirement in the Bill for a review to take place but, as has been mentioned already, that is a one-off thing; it is not a rolling update on the efficacy of the Act and whether it is achieving the duties that it is supposed to achieve.

This is particularly important because there are abilities for the Secretary of State to make changes to some of the Act. Presumably the Government would not have put that in if they did not think there was a possibility or a likelihood that changes would have to be made to the Act at some future point. The Bill is certainly not perfect, but even from the Government’s point of view it is not perfect for all time. There is a requirement for the Act to be updated; it will have to change. New priority harms may have to be added. New details about different illegal acts may have to be added to the duties. That flexibility is given, and the Secretary of State has that flexibility in a number of cases.

If the Act were just going to be a standing thing, if it were not going to be updated, it would never be future-proof; it would never work in the changing world that we have. We know that this legislation has taken a very long time to get here. We have been sadly lacking in significant regulation in the online world for more than 20 years, certainly. For a very long time we have not had this. Now that the Act is here—or it will be once the Bill passes through both Houses of Parliament—we want it to work.

That is the point of every amendment we have tabled: we are trying to make the Bill better so that it works and can keep people as safe as possible. At the moment, we do not know how safe the internet will be as a result of the Bill. Even once it begins to be implemented, we will not have enough information on the improvements it has created to be able to say, “Actually, this was a world-leading piece of legislation.”

It may be that the digital regulation committee that I am suggesting in this new clause has a look regularly at the implementation of the Bill going forward and says, “Yep, that’s brilliant.” The committee might look at the implementation and the increasing time we spend online, with all the harms that can come with that, and say, “Actually, you need to tweak that a bit” or, “That is not quite fulfilling what it was intended to.” The committee might also say, “This brand new technology has come in and it is not entirely covered by the Act as it is being implemented.” A digital regulation committee was proposed by the Joint Committee, I think, to scrutinise implementation of the legislation.

The Government will say that they will review—they always do. I have been in so many Delegated Legislation Committees that involve the Treasury and the Government saying, “Yes, we keep everything under review—we always review everything.” That line is used in so many of these Committees, but it is just not true. In January I asked the Department for Digital, Culture, Media and Sport

“how many and what proportion of (a) primary and (b) secondary legislation sponsored by (i) their Department…has undergone a post legislative review”.

It was a written question I put to a number of Departments including DCMS. The reply I got from the Minister here was:

“The number of post legislative reviews the Department has undertaken on primary and secondary legislation in each of the last five years is not held within the Department.”

The Government do not even know how many pieces of primary or secondary legislation they have reviewed. They cannot tell us that all of them have been reviewed. Presumably, if they could tell us that all of them have been reviewed, the answer to my written question would have been, “All of them.” I have a list of the number they sponsored. It was six in 2021, for example. If the Department had reviewed the implementation of all those pieces of legislation, I would expect it to be shouting that from the rooftops in response to a written question. It should be saying, “Yes, we are wonderful. We have reviewed all these and found that most of them are working exactly as we intended them to.”

I do not have faith in the Government or in DCMS—nor pretty much in any Government Department. I do not have faith in their ability or intention to adequately and effectively review the implementation of this legislation, to ensure that the review is done timeously and sent to the Digital, Culture, Media and Sport Committee, or to ensure those proper processes that are supposed to be in place are actually in place and that the Bill is working.

It is unfortunate for the Minister that he sent me that reply earlier in the year, but I only asked the question because I was aware of the significant lack of work the Government are doing on reviewing whether or not legislation has achieved its desired effect, including whether it has cost the amount of money they said it would, whether it has kept the amount of people safe that they said it would, and that it has done what it needs to do.

I have a lack of faith in the Government generally, but specifically on this issue because of the shifting nature of the internet. This is not to take away from the DCMS Committee, but I have sat on a number of Select Committees and know that they are very busy—they have a huge amount of things to scrutinise. This would not stop them scrutinising this Act and taking action to look at whether it is working. It would give an additional line of scrutiny, transparency and defence, in order to ensure that this world-leading legislation is actually world-leading and keeps people safe in the way it is intended to.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is an honour to support the new clause moved by the hon. Member for Aberdeen North. This was a recommendation from the Joint Committee report, and we believe it is important, given the sheer complexity of the Bill. The Minister will not be alarmed to hear that I am all in favour of increasing the scrutiny and transparency of this legislation.

Having proudly served on the DCMS Committee, I know it does some excellent work on a very broad range of policy areas, as has been highlighted. It is important to acknowledge that there will of course be cross-over, but ultimately we support the new clause. Given my very fond memories of serving on the Select Committee, I want to put on the record my support for it. My support for this new clause is not meant as any disrespect to that Committee. It is genuinely extremely effective in scrutinising the Government and holding them to account, and I know it will continue to do that in relation to both this Bill and other aspects of DCMS. The need for transparency, openness and scrutiny of this Bill is fundamental if it is truly to be world-leading, which is why we support the new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful for the opportunity to discuss this issue once again. I want to put on the record my thanks to the Joint Committee, which the hon. Member for Ochil and South Perthshire sat on, for doing such fantastic work in scrutinising the draft legislation. As a result of its work, no fewer than 66 changes were made to the Bill, so it was very effective.

I want to make one or two observations about scrutinising the legislation following the passage of the Bill. First, there is the standard review mechanism in clause 149, on pages 125 and 126, which provides for a statutory review not before two years and not after five years of the Bill receiving Royal Assent.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move, That the clause be read a Second time.

The new clause would give Ofcom the power to co-operate with other regulators for the purposes of tackling harm from illegal content, and for criminal investigations and proceedings. The Minister will be aware that the vast range of human and business activity covered online presents a complex map of potential harms. Some harms will fall into or be adjacent to the purview of other regulators with domain-specific expertise. The relationship formalised through the Digital Regulation Cooperation Forum is well known. Indeed, Ofcom already has a working relationship with the Advertising Standards Authority and the Internet Watch Foundation, among others. Within this regulatory web, Ofcom will have the most relevant powers and expertise, so many regulators will look to it for help in tackling online safety issues. The Minister must recognise that public protection will most effectively be achieved through regulatory interlock. To protect people, Ofcom should be empowered to co-operate with others and to share information. The Bill should, therefore, as much as it can, enable Ofcom to work with other regulators and share online safety information with them.

Ofcom should also be able to bring the immense skills of other regulators into its work. The Bill gives Ofcom the general ability to co-operate with overseas regulators, but, with the exception of references to consulting the Information Commissioner’s Office when drawing up codes of practice and various items of guidance, the Bill is largely silent on co-operation with UK regulators.

The Communications Act 2003 limits the UK regulators with which Ofcom can share information—excluding the ICO, for instance—yet the Online Safety Bill takes a permissive approach to overseas regulators. The Bill should extend co-operation and information sharing in respect of online safety to include regulators overseeing the offences in schedule 7, the primary priority and priority harms to children, and the priority harms to adults.

Elsewhere in regulation, the Financial Conduct Authority has a general duty to co-operate. The same should apply here. Increasing safety through co-operation between relevant regulators is most easily achieved through our new clause, which will allow Ofcom to co-operate more widely. That is limited to co-operation in respect of harmful illegal content, harms to children and priority harms to adults. It is implicit that Ofcom will share information only with the regulators responsible for those precise matters. We have spoken frequently about the importance of co-operation, collaboration and consultation. This simple new clause would help to remedy the slight limitations placed on Ofcom in the Bill.

Ms Rees, with your permission, at this point—because this is likely to be my last contribution to the Bill Committee—[Interruption.] For shame. I place on record my sincere thanks to you and Sir Roger for chairing these Committee sittings, as well as all the Hansard staff, the Clerks, the Table Office, our civil servants, the Doorkeepers, the tech staff and broadcasting team who enable our proceedings to be broadcast to the public, and all members of the Committee for allowing great scrutiny of this legislation to take place. I look forward to continuing that scrutiny on Report.

--- Later in debate ---
Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

On a point of order, Ms Rees. On behalf of the Back Benchers, I thank you and Sir Roger for your excellent chairpersonships, and the Minister and shadow Ministers for the very courteous way in which proceedings have taken place. It has been a great pleasure to be a member of the Bill Committee.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am content with the Minister’s assurance that the provisions of new clause 41 are covered in the Bill, and therefore do not wish to press it to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Schedule 2

Recovery of OFCOM’s initial costs

Recovery of initial costs

1 (1) This Schedule concerns the recovery by OFCOM of an amount equal to the aggregate of the amounts of WTA receipts which, in accordance with section 401(1) of the Communications Act and OFCOM’s statement under that section, are retained by OFCOM for the purpose of meeting their initial costs.

(2) OFCOM must seek to recover the amount described in sub-paragraph (1) (“the total amount of OFCOM’s initial costs”) by charging providers of regulated services fees under this Schedule (“additional fees”).

(3) In this Schedule—

“initial costs” means the costs incurred by OFCOM before the day on which section 75 comes into force on preparations for the exercise of their online safety functions;

“WTA receipts” means the amounts described in section 401(1)(a) of the Communications Act which are paid to OFCOM (certain receipts under the Wireless Telegraphy Act 2006).

Recovery of initial costs: first phase

2 (1) The first phase of OFCOM’s recovery of their initial costs is to take place over a period of several charging years to be specified in regulations under paragraph 7 (“specified charging years”).

(2) Over that period OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the total amount of OFCOM’s initial costs.

(3) OFCOM may not charge providers additional fees in respect of any charging year which falls before the first specified charging year.

(4) OFCOM may require a provider to pay an additional fee in respect of a charging year only if the provider is required to pay a fee in respect of that year under section 71 (and references in this Schedule to charging providers are to be read accordingly).

(5) The amount of an additional fee payable by a provider is to be calculated in accordance with regulations under paragraph 7.

Further recovery of initial costs

3 (1) The second phase of OFCOM’s recovery of their initial costs begins after the end of the last of the specified charging years.

(2) As soon as reasonably practicable after the end of the last of the specified charging years, OFCOM must publish a statement specifying—

(a) the amount which is at that time the recoverable amount (see paragraph 6), and

(b) the amounts of the variables involved in the calculation of the recoverable amount.

(3) OFCOM’s statement must also specify the amount which is equal to that portion of the recoverable amount which is not likely to be paid or recovered. The amount so specified is referred to in sub-paragraphs (4) and (5) as “the outstanding amount”.

(4) Unless a determination is made as mentioned in sub-paragraph (5), OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the outstanding amount.

(5) The Secretary of State may, as soon as reasonably practicable after the publication of OFCOM’s statement, make a determination specifying an amount by which the outstanding amount is to be reduced, and in that case OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the difference between the outstanding amount and the amount specified in the determination.

(6) Additional fees mentioned in sub-paragraph (4) or (5) must be charged in respect of the charging year immediately following the last of the specified charging years (“year 1”).

(7) The process set out in sub-paragraphs (2) to (6) is to be repeated in successive charging years, applying those sub-paragraphs as if—

(a) in sub-paragraph (2), the reference to the end of the last of the specified charging years were to the end of year 1 (and so on for successive charging years);

(b) in sub-paragraph (6), the reference to year 1 were to the charging year immediately following year 1 (and so on for successive charging years).

(8) Any determination by the Secretary of State under this paragraph must be published in such manner as the Secretary of State considers appropriate.

(9) Sub-paragraphs (4) and (5) of paragraph 2 apply to the charging of additional fees under this paragraph as they apply to the charging of additional fees under that paragraph.

(10) The process set out in this paragraph comes to an end in accordance with paragraph 4.

End of the recovery process

4 (1) The process set out in paragraph 3 comes to an end if a statement by OFCOM under that paragraph records that—

(a) the recoverable amount is nil, or

(b) all of the recoverable amount is likely to be paid or recovered.

(2) Or the Secretary of State may bring that process to an end by making a determination that OFCOM are not to embark on another round of charging providers of regulated services additional fees.

(3) The earliest time when such a determination may be made is after the publication of OFCOM’s first statement under paragraph 3.

(4) A determination under sub-paragraph (2)—

(a) must be made as soon as reasonably practicable after the publication of a statement by OFCOM under paragraph 3;

(b) must be published in such manner as the Secretary of State considers appropriate.

(5) A determination under sub-paragraph (2) does not affect OFCOM’s power—

(a) to bring proceedings for the recovery of the whole or part of an additional fee for which a provider became liable at any time before the determination was made, or

(b) to act in accordance with the procedure set out in section 120 in relation to such a liability.

Providers for part of a year only

5 (1) For the purposes of this Schedule, the “provider” of a regulated service, in relation to a charging year, includes a person who is the provider of the service for part of the year.

(2) Where a person is the provider of a regulated service for part of a charging year only, OFCOM may refund all or part of an additional fee paid to OFCOM under paragraph 2 or 3 by that provider in respect of that year.

Calculation of the recoverable amount

6 For the purposes of a statement by OFCOM under paragraph 3, the “recoverable amount” is given by the formula—

C – (F – R) – D

where—

C is the total amount of OFCOM’s initial costs,

F is the aggregate amount of the additional fees received by OFCOM at the time of the statement in question,

R is the aggregate amount of the additional fees received by OFCOM that at the time of the statement in question have been, or are due to be, refunded (see paragraph 5(2)), and

D is the amount specified in a determination made by the Secretary of State under paragraph 3 (see paragraph 3(5)) at a time before the statement in question or, where more than one such determination has been made, the sum of the amounts specified in those determinations.

If no such determination has been made before the statement in question, D = 0.

Regulations about recovery of initial costs

7 (1) The Secretary of State must make regulations making such provision as the Secretary of State considers appropriate in connection with the recovery by OFCOM of their initial costs.

(2) The regulations must include provision as set out in sub-paragraphs (3), (4) and (6).

(3) The regulations must specify the total amount of OFCOM’s initial costs.

(4) For the purposes of paragraph 2, the regulations must specify—

(a) the charging years in respect of which additional fees are to be charged, and

(b) the proportion of the total amount of initial costs which OFCOM must seek to recover in each of the specified charging years.

(5) The following rules apply to provision made in accordance with sub-paragraph (4)(a)—

(a) the initial charging year may not be specified;

(b) only consecutive charging years may be specified;

(c) at least three charging years must be specified;

(d) no more than five charging years may be specified.

(6) The regulations must specify the computation model that OFCOM must use to calculate fees payable by individual providers of regulated services under paragraphs 2 and 3 (and that computation model may be different for different charging years).

(7) The regulations may make provision about what OFCOM may or must do if the operation of this Schedule results in them recovering more than the total amount of their initial costs.

(8) The regulations may amend this Schedule or provide for its application with modifications in particular cases.

(9) Before making regulations under this paragraph, the Secretary of State must consult—

(a) OFCOM,

(b) providers of regulated user-to-user services,

(c) providers of regulated search services,

(d) providers of internet services within section 67(2), and

(e) such other persons as the Secretary of State considers appropriate.

Interpretation

8 In this Schedule—

“additional fees” means fees chargeable under this Schedule in respect of the recovery of OFCOM’s initial costs;

“charging year” has the meaning given by section 76;

“initial charging year” has the meaning given by section 76;

“initial costs” has the meaning given by paragraph 1(3), and the “total amount” of initial costs means the amount described in paragraph 1(1);

“recoverable amount” has the meaning given by paragraph 6;

“specified charging year” means a charging year specified in regulations under paragraph 7 for the purposes of paragraph 2.” —(Chris Philp.)

This new Schedule requires Ofcom to seek to recover their costs which they have incurred (before clause 75 comes into force) when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services.
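
By way of illustration only, the calculation set out in paragraph 6 of the new Schedule can be sketched, in Python, as follows. Every figure below is invented for the purposes of the example and has no basis in the Bill, in OFCOM’s accounts or in any regulations; the sketch simply applies the formula C – (F – R) – D as defined above.

    # Hypothetical sketch of the paragraph 6 calculation; all figures are invented.
    C = 100_000_000  # total amount of OFCOM's initial costs
    F = 40_000_000   # additional fees received at the time of the statement
    R = 2_000_000    # additional fees refunded, or due to be refunded
    D = 10_000_000   # sum of any reductions determined by the Secretary of State (0 if none)

    recoverable_amount = C - (F - R) - D
    print(recoverable_amount)  # 52,000,000 on these invented figures

On those invented figures, OFCOM’s statement under paragraph 3 would record a recoverable amount of £52 million; additional fees totalling the portion of that amount not likely to be paid or recovered (the “outstanding amount”), less any further reduction determined by the Secretary of State, would then be charged in the following charging year.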

Brought up, read the First and Second time, and added to the Bill.

None Portrait The Chair
- Hansard -

New schedule 1 was tabled by Carla Lockhart, who is not on the Committee. Does any Member wish to move new schedule 1? No.

We now come to the final Question in the proceedings. The Committee has finished its work.

Bill, as amended, to be reported.

Online Safety Bill

Alex Davies-Jones Excerpts
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

It is an honour to respond on the first group of amendments on behalf of the Opposition.

For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.

With that in mind, it must be said that time and again this Government claim that the legislation is world-leading but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.

I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.

We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.

Debbie Abrahams Portrait Debbie Abrahams
- Hansard - - - Excerpts

Not only that: people migrate from one platform to another, a fact that just has not been reflected on by the Government.

Alex Davies-Jones Portrait Alex Davies-Jones
- View Speech - Hansard - -

My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp) also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.

Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent advertising based on information that is

“reasonably available to a provider”,

with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, make it impractical for them reliably and consistently to identify illegal content.

The second problem arises from the fact that the platforms will need to have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.

That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.

Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made

“on the basis of all relevant information that is reasonably available to a provider.”

However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.

I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.

We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.

Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a

“tsunami of online child abuse”.

We now have the first ever opportunity to legislate for a safer world online for our children.

However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occurs on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, who they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.

I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:

“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”

I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.

It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.

Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that

“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.

Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad of powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary levels of interference and threats to the independence of Ofcom that arise from the powers of direction to Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.

Chris Philp Portrait Chris Philp
- View Speech - Hansard - - - Excerpts

I thank the shadow Minister for giving way—I will miss our exchanges across the Dispatch Box. She is making a point about the Secretary of State powers in, I think, clause 40. Is she at all reassured by the undertakings given in the written ministerial statement tabled by the Secretary of State last Thursday, in which the Government committed to amending the Bill in the Lords to limit the use of those powers to exceptional circumstances only, and precisely defined those circumstances as only being in connection with issues such as public health and public safety?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank the former Minister for his intervention, and I am grateful for that clarification. We debated at length in Committee the importance of the regulator’s independence and the prevention of overarching Secretary of State powers, and of Parliament having a say and being reconvened if required. I welcome the fact that that limitation on the power will be tabled in the other place, but it should have been tabled as an amendment here so that we could have discussed it today. We should not have to wait for the Bill to go to the other place for us to have our say. Who knows what will happen to the Bill tomorrow, next week or further down the line with the Government in utter chaos? We need this to be done now. The Minister must recognise that this is an unparalleled level of power, and one with which the sector and Back Benchers in his own party disagree. Let us work together and make sure the Bill really is fit for purpose, and that Ofcom is truly independent and without interference and has the tools available to it to really create meaningful change and keep us all safe online once and for all.

--- Later in debate ---
Andrew Percy Portrait Andrew Percy (Brigg and Goole) (Con)
- Hansard - - - Excerpts

While the shadow Minister is on the subject of exemptions for antisemites, will she say where the Opposition are on the issue of search? Search platforms and search engines provide some of the most appalling racist, Islamophobic and antisemitic content.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank the hon. Gentleman, who is absolutely right. In Committee, we debated at length the impact search engines have, and they should be included in the Bill’s categorisation of difficult issues. In one recent example on a search engine, the imagery that comes up when we search for desk ornaments is utterly appalling and needs to be challenged and changed. If we are to truly tackle antisemitism, racism and extremist content online, then the provisions need to be included in the Bill, and journalistic exemptions should not apply to this type of content. Often, they operate more discreetly and are less likely to attract sanctions. Furthermore, any amendment will provide no answer to the many extremist publishers who seek to exploit the terms of the exemption. For those reasons, we need to go further.

The amendments are not a perfect or complete solution. Deficiencies remain, and the amendments do not address the fact that the exemption continues to exclude dozens of independent local newspapers around the country on the arbitrary basis that they have no fixed address. The Independent Media Association, which represents news publishers, describes the news publisher criteria as

“punishing quality journalism with high standards”.

I hope the Minister will reflect further on that point. As a priority, we need to ensure that the exemption cannot be exploited by bad actors. We must not give a free pass to those propagating racist, misogynistic or antisemitic harm and abuse. By requiring some standards of accountability for news providers, however modest, the amendments are an improvement on the Bill as drafted. In the interests of national security and the welfare of the public, we must support the amendments.

Finally, I come to a topic that I have spoken about passionately in this place on a number of occasions and that is extremely close to my heart: violence against women and girls. Put simply, in their approach to the Bill the Government are completely failing and falling short in their responsibilities to keep women and girls safe online. Labour has been calling for better protections for some time now, yet still the Government are failing to see the extent of the problem. They have only just published an initial indicative list of priority harms to adults, in a written statement that many colleagues may have missed. While it is claimed that this will add to scrutiny and debate, the final list of harms will not be on the face of the Bill but will be included in secondary legislation after the Bill has received Royal Assent. Non-designated content that is harmful will not require action on the part of service providers, even though by definition it is still extremely harmful. How can that be acceptable?

Many campaigners have made the case that protections for women and girls are not included in the draft Bill at all, a concern supported by the Petitions Committee in its report on online abuse. Schedule 7 includes a list of sexual offences and aggravated offences, but the Government have so far made no concessions here and the wider context of violence against women and girls has not been addressed. That is why I urge the Minister to carefully consider our new clause 3, which seeks to finally name violence against women and girls as a priority harm. The Minister’s predecessor said in Committee that women and girls receive “disproportionate” levels of abuse online. The Minister in his new role will likely be well briefed on the evidence, and I know this is an issue he cares passionately about. The case has been put forward strongly by hon. Members on all sides of the House, and the message is crystal clear: women and girls must be protected online, and we see this important new clause as the first step.

Later on, we hope to see the Government move further and acknowledge that there must be a code of practice on tackling violence against women and girls content online.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

The hon. Lady raises the issue of codes of practice. She will recall that in Committee we talked about that specifically and pressed the then Minister on that point. It became very clear that Ofcom would be able to issue a code of practice on violence against women and girls, which she talked about. Should we not be seeking an assurance that Ofcom will do that? That would negate the need to amend the Bill further.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I welcome the right hon. Lady’s comments. We did discuss this at great length in Committee, and I know she cares deeply and passionately about this issue, as do I. It is welcome that Ofcom can issue a code of practice on violence against women and girls, and we should absolutely be urging it to do that, but we also need to make it a fundamental aim of the Bill. If the Bill is to be truly world leading, if it is truly to make us all safe online, and if we are finally to begin to tackle the scourge of violence against women and girls in all its elements—not just online but offline—then violence against women and girls needs to be named as a priority harm in the Bill. We need to take the brave new step of saying that enough is enough. Words are not enough. We need actions, and this is an action the Minister could take.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to tackle a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.

David Davis Portrait Mr David Davis (Haltemprice and Howden) (Con)
- View Speech - Hansard - - - Excerpts

I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.

I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.

We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from the hon. Member for Pontypridd (Alex Davies-Jones), who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.

There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.

One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend the Member for Croydon South (Chris Philp), the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.

When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.

Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practice raised by the Opposition Front Bencher, the hon. Member for Pontypridd should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.

The Bill’s factsheet, which is still on the Government’s website, states on page 1:

“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.

This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.

Online Safety Bill

Alex Davies-Jones Excerpts
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.

I talked about harmful communications, recognising that we could leave a potential gap in the criminal law. The Government have also decided not to repeal existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Along with planned changes to the harmful communications offence, we are making a number of additional changes to the Bill—that will come later, Mr Speaker, and I will not tread too much into that, as it includes the removal of the adult safety duties, often referred to as the legal but harmful provision. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.

The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make sure that we expect platforms to use age assurance technology when identifying the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access to those below their minimum age, and enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Which one?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.

--- Later in debate ---
Lindsay Hoyle Portrait Mr Speaker
- Hansard - - - Excerpts

I call the shadow Minister.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.

We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It may be a drop in the ocean to the likes of Elon Musk or Mark Zuckerberg—these multibillionaires who are taking over social media and using it as their personal plaything. They are not going to listen to fines; the only way they are going to listen, sit up and take notice is if criminal liability puts their neck on the line and makes them answer for some of the huge failures of which they are aware.

The right hon. and learned Member mentions that he shares the sentiment of the amendment but feels it could be wrong. We have an opportunity here to put things right and put responsibility where it belongs: with the tech companies, the platforms and the managers responsible. In a similar way to what happens in the financial sector or in health and safety regulation, it is vital that people be held responsible for issues on their platforms. We feel that criminal liability will make that happen.

David Davis Portrait Mr David Davis
- Hansard - - - Excerpts

May I intervene on a point of fact? The hon. Lady says that fines are a drop in the ocean. The turnover of Google is $69 billion; 10% of that is just shy of $7 billion. That is not a drop in the ocean, even to Elon Musk.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

We are looking at putting people on the line. It needs to be something that people actually care about. Money does not matter to these people, as we have seen with the likes of Google, Elon Musk and Mark Zuckerberg; what matters to them is actually being held to account. Money may matter to Government Members, but it will be criminal liability that causes people to sit up, listen and take responsibility.

While I am not generally in the habit of predicting the Minister’s response or indeed his motives—although my job would be a hell of a lot easier if I did—I am confident that he will try to peddle the line that it was the Government who introduced director liability for compliance failures in an earlier draft of the Bill. Let me be crystal clear in making this point, because it is important. The Bill, in its current form, makes individuals at the top of companies personally liable only when a platform fails to supply information to Ofcom, which misses the point entirely. Directors must be held personally liable when safety duties are breached. That really is quite simple, and I am confident that it would be effective in tackling harm online much more widely.

We also support new clause 28, which seeks to establish an advocacy body to represent the interests of children online. It is intended to deal with a glaring omission from the Bill, which means that children who experience online sexual abuse will receive fewer statutory user advocacy protections than users of a post office or even passengers on a bus. The Minister must know that that is wrong and, given his Government’s so-called commitment to protecting children, I hope he will carefully consider a new clause which is supported by Members on both sides of the House as well as the brilliant National Society for the Prevention of Cruelty to Children. In rejecting new clause 28, the Government would be denying vulnerable children a strong, authoritative voice to represent them directly, so I am keen to hear the Minister’s justification for doing so, if that is indeed his plan.

Members will have noted the bundle of amendments tabled by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) relating to Labour’s concerns about the unnecessary powers to overrule Ofcom that the Bill, as currently drafted, gives the Secretary of State of the day. During Committee evidence sessions, we heard from Will Perrin of the Carnegie UK Trust, who, as Members will know, is an incredibly knowledgeable voice when it comes to internet regulation. He expressed concern about the fact that, in comparison with other regulatory frameworks such as those in place for advertising, the Bill

“goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 117.]

Labour shares that concern. Ofcom must be truly independent if it is to be an effective regulator. Surely we have to trust it to undertake logical processes, rooted in evidence, to arrive at decisions once this regime is finally up and running. It is therefore hard to understand how the Government can justify direct interference, and I hope that the Minister will seriously consider amendments 23 to 30, 32, and 35 to 41.

Before I address Labour’s main concerns about the Government’s proposed changes to the Bill, I want to record our support for new clauses 29 and 30, which seek to bring media literacy duties back into the scope of the Bill. As we all know, media literacy is the first line of defence when it comes to protecting ourselves against false information online. Prevention is always better than cure. Whether it is a question of viral conspiracy theories or Russian disinformation, Labour fears that the Government’s approach to internet regulation will create a two-tier internet, leaving some more vulnerable than others.

However, I am sorry to say that the gaps in this Bill do not stop there. I was pleased to see that my hon. Friend the Member for Rotherham (Sarah Champion) had tabled new clause 54, which asks the Government to formally consider the impact that the use of virtual private networks will have on Ofcom’s ability to enforce its powers. This touches on the issue of future-proofing, which Labour has raised repeatedly in debates on the Bill. As we have heard from a number of Members, the tech industry is evolving rapidly, with concepts such as the metaverse changing the way in which we will all interact with the internet in the future. When the Bill was first introduced, TikTok was not even a platform. I hope the Minister can reassure us that the Bill will be flexible enough to deal with those challenges head-on; after all, we have waited far too long.

That brings me to what Labour considers to be an incredible overturn by the Government relating to amendment 239, which seeks to remove the new offence of harmful communications from the Bill entirely. As Members will know, the communications offence was designed by the Law Commission with the intention of introducing a criminal threshold for the most dangerous online harms. Indeed, in Committee it was welcome to hear the then Minister—the present Minister for Crime, Policing and Fire, the right hon. Member for Croydon South (Chris Philp)—being so positive about the Government’s consultation with the commission. In relation to clause 151, which concerns the communications offences, he even said:

“The Law Commission is the expert in this kind of thing…and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do.” ––[Official Report, Online Safety Public Bill Committee, 21 June 2022; c. 558.]

Less than six months down the line, we are seeing yet another U-turn from this Government, who are doing precisely the opposite of what was promised.

Removing these communications offences from the Bill will have real-life consequences. It will mean that harmful online trends such as hoax bomb threats, abusive social media pile-ons and fake news such as encouraging people to drink bleach to cure covid will be allowed to spread online without any consequence.

Christian Wakeford Portrait Christian Wakeford (Bury South) (Lab)
- Hansard - - - Excerpts

No Jewish person should have to log on and see Hitler worship, but what we have seen in recent weeks from Kanye West has been nothing short of disgusting, from him saying “I love Hitler” to inciting online pile-ons against Jewish people, and this is magnified by the sheer number of his followers, with Jews actually being attacked on the streets in the US. Does my hon. Friend agree that the Government’s decision to drop the “legal but harmful” measures from the Bill will allow this deeply offensive and troubling behaviour to continue?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank my hon. Friend for that important and powerful intervention. Let us be clear: everything that Kanye West said online is completely abhorrent and has no place in our society. It is not for any of us to glorify Hitler or his comments, or to praise what he did; that is absolutely abhorrent and it should never be online. Sadly, however, that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the Government’s swathes of changes to the Bill, meaning that it will be allowed to be seen by everybody. Kanye West has 30 million followers online. His followers will be able to look at, share, research and glorify that content without any consequence, because it will remain freely available online.

Margaret Hodge Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

Further to that point, it is not just that some of the content will be deeply offensive to the Jewish community; it could also harm wider society. Some further examples of postings that would be considered legal but harmful are likening vaccination efforts to Nazi death camps and alleging that NHS nurses should stand trial for genocide. Does my hon. Friend not agree that the changes the Government are now proposing will lead to enormous and very damaging impacts right through society?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.

We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.

Rosie Winterton Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - - - Excerpts

I call the Chair of the Select Committee.

ONLINE SAFETY BILL (First sitting)

Alex Davies-Jones Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

(1 year, 4 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 13 December 2022 - (13 Dec 2022)
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Lady makes an excellent point. This is not about mandating that platforms stop doing these things; it is about ensuring that they take this issue into account and that they agree—or that we as legislators agree—with the Royal College of Psychiatrists that we have a responsibility to tackle it. We have a responsibility to ask Ofcom to tackle it with platforms.

This comes back to the fact that we do not have a user advocacy panel, and groups representing children are not able to bring emerging issues forward adequately and effectively. Because of the many other inadequacies in the Bill, that is even more important than it was. I assume the Minister will not accept my amendment—that generally does not happen in Bill Committees—but if he does not, it would be helpful if he could give Ofcom some sort of direction of travel so that it knows it should take this issue into consideration when it deals with platforms. Ofcom should be talking to platforms about habit-forming features and considering the addictive nature of these things; it should be doing what it can to protect children. This threat has emerged only in recent years, and things will not get any better unless we take action.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

It is a privilege to see you back in the Chair for round 2 of the Bill Committee, Sir Roger. It feels slightly like déjà vu to return to line-by-line scrutiny of the Bill, which, as you said, Sir Roger, is quite unusual and unprecedented. Seeing this Bill through Committee is the Christmas gift that keeps on giving. Sequels are rarely better than the original, but we will give it a go. I have made no secret of my plans, and my thoughts on the Minister’s plans, to bring forward significant changes to the Bill, which has already been long delayed. I am grateful that, as we progress through Committee, I will have the opportunity to put on record once again some of Labour’s long-held concerns with the direction of the Bill.

I will touch briefly on clause 11 specifically before addressing the amendments to the clause. Clause 11 covers safety duties to protect children, and it is a key part of the Bill—indeed, it is the key reason many of us have taken a keen interest in online safety more widely. Many of us, on both sides of the House, have been united in our frustrations with the business models of platform providers and search engines, which have paid little regard to the safety of children over the years in which the internet has expanded rapidly.

That is why Labour has worked with the Government. We want to see the legislation get over the line, and we recognise—as I have said in Committee previously—that the world is watching, so we need to get this right. The previous Minister characterised the social media platforms and providers as entirely driven by finance, but safety must be the No. 1 priority. Labour believes that that must apply to both adults and children, but that is an issue for debate on a subsequent clause, so I will keep my comments on this clause brief.

The clause and Government amendments 1, 2 and 3 address the thorny issue of age assurance measures. Labour has been clear that we have concerns that the Government are relying heavily on the ability of social media companies to distinguish between adults and children, but age verification processes remain fairly complex, and that clearly needs addressing. Indeed, Ofcom’s own research found that a third of children have false social media accounts claiming to be aged over 18. This is an area we certainly need to get right.

I am grateful to the many stakeholders, charities and groups working in this area. There are far too many to mention, but a special shout-out should go to Iain Corby from the Age Verification Providers Association, along with colleagues at the Centre to End All Sexual Exploitation and Barnardo’s, and the esteemed John Carr. They have all provided extremely useful briefings for my team and me as we have attempted to unpick this extremely complicated part of the Bill.

We accept that there are effective age checks out there, and many have substantial anti-evasion mechanisms, but it is the frustrating reality that this is the road the Government have decided to go down. As we have repeatedly placed on the record, the Government should have retained the “legal but harmful” provisions that were promised in the earlier iteration of the Bill. Despite that, we are where we are.

I will therefore put on the record some brief comments on the range of amendments on this clause. First, with your permission, Sir Roger, I will speak to amendments 98, 99—

None Portrait The Chair
- Hansard -

Order. No, you cannot. I am sorry. I am perfectly willing to allow—the hon. Lady has already done this—a stand part debate at the start of a group of selections, rather than at the end, but she cannot have it both ways. I equally understand the desire of an Opposition Front Bencher to make some opening remarks, which is perfectly in order. With respect, however, you may not then go through all the other amendments. We are dealing now with amendment 98. If the hon. Lady can confine her remarks to that at this stage, that would be helpful.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Of course, Sir Roger. Without addressing the other amendments, I would like us to move away from the overly content-focused approach that the Government seem intent on taking in the Bill more widely. I will leave my comments there on the SNP amendment, but we support our SNP colleagues on it.

Paul Scully Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Sir Roger.

Being online can be a hugely positive experience for children and young people, but we recognise the challenge of habit-forming behaviour or designed addiction to some digital services. The Bill as drafted, however, would already deliver the intent of the amendment from the hon. Member for Aberdeen North. If service providers identify in their risk assessment that habit-forming or addictive-behaviour risks cause significant harm to an appreciable number of children on a service, the Bill will require them to put in place measures to mitigate and manage that risk under clause 11(2)(a).

To meet the child safety risk assessment duties under clause 10, services must assess the risk of harm to children from the different ways in which the service is used; the impact of such use; the level of risk of harm to children; how the design and operation of the service may increase the risks identified; and the functionalities that facilitate the presence or dissemination of content of harm to children. The definition of “functionality” at clause 200 already includes an expression of a view on content, such as applying a “like” or “dislike” button, as at subsection (2)(f)(i).

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The Bill’s key objective, above everything else, is the safety of young people online. That is why the strongest protections in the Bill are for children. Providers of services that are likely to be accessed by children will need to provide safety measures to protect child users from harmful content, such as pornography, and from behaviour such as bullying. We expect companies to use age verification technologies to prevent children from accessing services that pose the highest risk of harm to them, and age assurance technologies and other measures to provide children with an age-appropriate experience.

The previous version of the Bill already focused on protecting children, but the Government are clear that the Bill must do more to achieve that and to ensure that the requirements on providers are as clear as possible. That is why we are strengthening the Bill and clarifying the responsibilities of providers to provide age-appropriate protections for children online. We are making it explicit that providers may need to use age assurance to identify the age of their users in order to meet the child safety duties for user-to-user services.

The Bill already set out that age assurance may be required to protect children from harmful content and activity, as part of meeting the duty in clause 11(3), but the Bill will now clarify that it may also be needed to meet the wider duty in subsection (2) to

“mitigate and manage the risks of harm to children”

and to manage

“the impact of harm to children”

on such services. That is important so that only children who are old enough are able to use functionalities on a service that poses a risk of harm to younger children. The changes will also ensure that children are signposted to support that is appropriate to their age if they have experienced harm. For those reasons, I recommend that the Committee accepts the amendments.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I have a few questions regarding amendments 1 to 3, which, as I mentioned, relate to the thorny issue of age verification and age assurance, and I hope the Minister can clarify some of them.

We are unclear about why, in subsection (3)(a), the Government have retained the phrase

“for example, by using age verification, or another means of age assurance”.

Can that difference in wording be taken as confirmation that the Government want harder forms of age verification for primary priority content? The Minister will be aware that many in the sector are unclear about what that harder form of age verification may look like, so some clarity would be useful for all of us in the room and for those watching.

In addition, we would like the Minister to clarify his understanding of the distinction between age verification and age assurance. They are very different concepts in reality, so we would appreciate clarity on how each will be achieved and what technology will be required for different types of platform and content, particularly when we consider the different types of harm that the Bill will address and protect people from. I look forward to clarity from the Minister on that point.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

That is a good point. In essence, age verification is the hard check on access to a service. Age assurance ensures that the person who uses the service is the same person whose age was verified. Someone could use their parent’s debit card or something like that, so it is not necessarily the same person using the service right the way through. If we are to protect children, in particular, we have to ensure that we know there is a child at the other end whom we can protect from the harm that they may see.

On the different technologies, we are clear that our approach to age assurance or verification is not technology-specific. Why? Because otherwise the Bill would be out of date within around six months, and certainly by the time the legislation was fully implemented. That is why it is incumbent on the companies to be clear about the technology and processes they use. That information will be kept up to date, and Ofcom can then look at it.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

What I am saying is that the clause does not actually allow that middle step. It does not explicitly say that somebody could be stopped from accessing private messaging. The only options are being banned from certain content, or being banned from the entire platform.

I absolutely recognise the hard work that Ofcom has done, and I recognise that it will work very hard to ensure that risks are mitigated, but the amendment ensures what the Minister intended with this legislation. I am not convinced that he intended there to be just the two options that I outlined. I think he intended something more in line with what I am suggesting in the amendment. It would be very helpful if the Minister explicitly said something in this Committee that makes it clear that Ofcom has the power to say to platforms, “Your risk assessment says that there is a real risk from private messaging”—or from livestreaming—“so why don’t you turn that off for all users under 18?” Ofcom should be able to do that.

Could the Minister be clear that that is the direction of travel he is hoping and intending that Ofcom will take? If he could be clear on that, and will recognise that the clause could have been slightly better written to ensure Ofcom had that power, I would be quite happy to not push the amendment to a vote. Will the Minister be clear about the direction he hopes will be taken?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I rise to support my SNP colleagues’ amendments 99, 96 and 97, just as I supported amendment 98. The amendments are sensible and will ensure that service providers are empowered to take action to mitigate harms done through their services. In particular, we support amendment 99, which makes it clear that a service should be required to have the tools available to allow it to block access to parts of its service, if that is proportionate.

Amendments 96 and 97 would ensure that private messaging and livestreaming features were brought into scope, and that platforms and services could block access to them when that was proportionate, with the aim of protecting children, which is the ultimate aim of the Bill. Those are incredibly important points to raise.

In previous iterations of the Bill Committee, Labour too tabled a number of amendments to do with platforms’ responsibilities for livestreaming. I expressed concerns about how easy it is for platforms to host live content, and about how ready they were to screen that content for harm, illegal or not. I am therefore pleased to support our SNP colleagues. The amendments are sensible, will empower platforms and will keep children safe.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Clause 12 is extremely important because it outlines the platforms’ duties in relation to keeping adults safe online. The Government’s attempts to remove the clause through an amendment that thankfully has not been selected are absolutely shocking. In addressing Government amendments 18, 23, 24, 25, 32, 33 and 39, I must ask the Minister: exactly how will this Bill do anything to keep adults safe online?

In the original clause 12, companies had to assess the risk of harm to adults and the original clause 13 outlined the means by which providers had to report these assessments back to Ofcom. This block of Government amendments will make it impossible for any of us—whether that is users of a platform or service, researchers or civil society experts—to understand the problems that arise on these platforms. Labour has repeatedly warned the Government that this Bill does not go far enough to consider the business models and product design of platforms and service providers that contribute to harm online. By tabling this group of amendments, the Government are once again making it incredibly difficult to fully understand the role of product design in perpetuating harm online.

We are not alone in our concerns. Colleagues from Carnegie UK Trust, who are a source of expertise to hon. Members across the House when it comes to internet regulation, have raised their concerns over this grouping of amendments too. They have raised specific concerns about the removal of the transparency obligation, which Labour has heavily pushed for in previous Bill Committees.

Previously, service providers had been required to inform customers of the harms their risk assessment had detected, but the removal of this risk assessment means that users and consumers will not have the information to assess the nature of the risk on the platform. The Minister may point to the Government’s approach in relation to the new content duties in platforms’ and providers’ terms of service, but we know that there are risks arising from the fact that there is no minimum content specified for the terms of service for adults, although of course all providers will have to comply with the illegal content duties.

This approach, like the entire Bill, is already overly complex—that is widely recognised by colleagues across the House and is the view of many stakeholders too. In tabling this group of amendments, the Minister is showing his ignorance. Does he really think that all vulnerabilities to harm online simply disappear at the age of 18? By pushing these amendments, which seek to remove protections for adults from content that is legal but harmful, the Minister is, in effect, suggesting that adults are not susceptible to harm and that risk assessments are therefore simply not required. That is an extremely narrow-minded view to take, so I must push the Minister further. Does he recognise that many young, and older, adults are still highly likely to be impacted by suicide and self-harm messaging, eating disorder content, disinformation and abuse, which will all be untouched by these amendments?

Labour has been clear throughout the passage of the Bill that we need to see more, not less, transparency and protection from online harm for all of us—whether adults or children. These risk assessments are absolutely critical to the success of the Online Safety Bill and I cannot think of a good reason why the Minister would not support users in being able to make an assessment about their own safety online.

We have supported the passage of the Bill, as we know that keeping people safe online is a priority for us all and we know that the perfect cannot be the enemy of the good. The Government have made some progress towards keeping children safe, but they clearly do not consider it their responsibility to do the same for adults. Ultimately, platforms should be required to protect everyone: it does not matter whether they are a 17-year-old who falls short of being legally deemed an adult in this country, an 18-year-old or even an 80-year-old. Ultimately, we should all have the same protections and these risk assessments are critical to the online safety regime as a whole. That is why we cannot support these amendments. The Government have got this very wrong and we have genuine concerns that this wholesale approach will undermine how far the Bill will go to truly tackling harm online.

I will also make some comments on clause 55 and the other associated amendments. I will keep my comments brief, as the Minister is already aware of my significant concerns over his Department’s intention to remove the adult safety duties more widely. In the previous Bill Committee, Labour made it clear that it supports, and thinks it most important, that the Bill should clarify the specific content that is deemed to be harmful to adults. We have repeatedly raised concerns about missing harms, including health misinformation and disinformation, but this group of amendments, once again, touches on widespread concerns that the Government’s new approach will leave adults worse off online. The Government’s removal of the “legal but harmful” sections of the Online Safety Bill is a major weakening—not a strengthening—of the Bill. Does the Minister recognise that the only people celebrating these decisions will be the executives of big tech firms, and online abusers? Does he agree that this delay shows that the Government have bowed to vested interests over keeping users and consumers safe?

Labour is not alone in having these concerns. We are all pleased to see that child safety duties are still present in the Bill, but the NSPCC, among others, is concerned about the knock-on implications that may introduce new risks to children. Without adult safety duties in place, children will be at greater risk of harm if platforms do not identify and protect them as children. In effect, these plans will now place a significantly greater burden on platforms to protect children than to protect adults. As the Bill currently stands, there is a significant risk of splintering user protections that can expose children to adult-only spaces and harmful content, while forming grooming pathways for offenders, too.

The reality is that these proposals to deal with harms online for adults rely on the regulator ensuring that social media companies enforce their own terms and conditions. We already know and have heard that that can have an extremely damaging impact for online safety more widely, and we have only to consider the very obvious and well-reported case study involving Elon Musk’s takeover of Twitter to really get a sense of how damaging that approach is likely to be.

In late November, Twitter stopped taking action against tweets that were in violation of its coronavirus misinformation policy. The company had suspended at least 11,000 accounts under that policy, which was designed to remove accounts posting demonstrably false or misleading content relating to covid-19 that could lead to harm. The company operated a five-strike policy, and the impact on public health around the world of removing it is likely to be tangible. The situation also raises questions about the platform’s other misinformation policies. As of December 2022, they remain active, but for how long remains unclear.

Does the Minister recognise that as soon as they are inconvenient, platforms will simply change their terms and conditions, and terms of service? We know that simply holding platforms to account for their terms and conditions will not constitute robust enough regulation to deal with the threat that these platforms present, and I must press the Minister further on this point.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

My hon. Friend is making an excellent speech. I share her deep concerns about the removal of these clauses. The Government have taken this tricky issue of the concept of “legal but harmful”—it is a tricky issue; we all acknowledge that—and have removed it from the Bill altogether. I do not think that is the answer. My hon. Friend makes an excellent point about children becoming 18; the day after they become 18, they are suddenly open to lots more harmful and dangerous content. Does she also share my concern about the risks of people being drawn towards extremism, as well as disinformation and misinformation?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. That situation does not change as soon as a 17-year-old turns 18 on their 18th birthday—they are not suddenly exempt from the harm of seeing this horrendous content. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, if faced with a clearer focus on their terms of service, platforms and providers may choose to make their terms of service shorter, in an attempt to cut out harmful material that, if left undealt with, they may be held liable for.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s approach and overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what the content is or includes. Throwing the baby out with the bathwater is not the way to tackle that situation. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand in the Bill go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Throughout the consideration of the Bill, I have been clear that I do not want it to end up simply being the “keep MPs safe on Twitter” Bill. That is not what it should be about. I did not mean that we should therefore take out everything that protects adults; what I meant was that we need to have a big focus on protecting children in the Bill, which thankfully we still do. For all our concerns about the issues and inadequacies of the Bill, it will go some way to providing better protections for children online. But saying that it should not be the “keep MPs safe on Twitter” Bill does not mean that it should not keep MPs safe on Twitter.

I understand how we have got to this situation. What I cannot understand is the Minister’s being willing to stand up there and say, “We can’t have these clauses because they are a risk to freedom of speech.” Why are they in the Bill in the first place if they are such a big risk to freedom of speech? If the Government’s No. 1 priority is making sure that we do not have these clauses, why did they put them in it? Why did it go through pre-legislative scrutiny? Why were they in the draft Bill? Why were they in the Bill? Why did they agree with them in Committee? Why did they agree with them on Report? Why have we ended up in a situation where, suddenly, there is a massive epiphany that they are a threat to freedom of speech and therefore we cannot possibly have them?

What is it that people want to say that they will be banned from saying as a result of this Bill? What is it that freedom of speech campaigners are so desperate to say online? Do they want to promote self-harm on platforms? Is that what people want to do? Is that what freedom of speech campaigners are out for? They are now allowed to do that as a result of the Bill.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online and of being radicalised by being driven deeper and deeper into blacker and blacker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or being referred to the Prevent programme because they are at risk of committing terror attacks—are not so much at the far-right or religious extremes any more, but are in a situation where they have mixed-up or unclear ideological drivers. It is not the same situation as it was before, because people are being radicalised by the stuff that they find online. They are being radicalised into situations where they “must do something”—they “must take some action”—because of the culture change in society.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The hon. Member is making a powerful point. Just a few weeks ago, I asked the Secretary of State for Digital, Culture, Media and Sport, at the Dispatch Box, whether the horrendous and horrific content that led a man to shoot and kill five people in Keyham—in the constituency of my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard)—would be allowed to remain and perpetuate online as a result of the removal of these clauses from the Bill. I did not get a substantial answer then, but we all know that the answer is yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is the thing: this Bill is supposed to be the Online Safety Bill. It is supposed to be about protecting people from the harm that can be done to them by others. It is also supposed to be about protecting people from the radicalisation and harm that they can end up in. It is supposed to make a difference. It is supposed to be a game changer and a world leader.

Although, absolutely, I recognise the importance of the child-safety duties in the clauses and the change that that will have, when people turn 18 they do not suddenly become different humans. They do not wake up on their 18th birthday as a different person from the one that they were before. They should not have to go from that level of protection, prior to 18, to being immediately exposed to comments and content encouraging them to self-harm, and to all of the negative things that we know are present online.

ONLINE SAFETY BILL (Second sitting)

Alex Davies-Jones Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

(1 year, 4 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022 - (13 Dec 2022)
Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

In that case, I must have been mistaken in thinking that the hon. Member—who has probably said quite a lot of things, which is why his voice is as hoarse as it is—was criticising the former Minister for measures that were agreed in previous Committee sittings.

For me, the current proposals are a really disappointing, retrograde step. They will not protect the most vulnerable people in our communities, including offline—this harm is not just online, but stretches out across all our communities. What happens online does not take place, and stay, in an isolated space; people are influenced by it and take their cues from it. They do not just take their cues from what is said in Parliament; they see misogynists online and think that they can treat people like that. They see horrific abuses of power and extreme pornography and, as we heard from the hon. Member for Aberdeen North, take their cues from that. What happens online does not stay online.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

My hon. Friend makes an important point about what happens online and its influence on the outside world. We saw that most recently with Kanye West being reinstated to Twitter and allowed to spew his bile and abhorrent views about Jews. That antisemitism had a real-world impact in terms of the rise in antisemitism on the streets, particularly in the US. The direct impact of his being allowed to talk about that online was Jews being harmed in the real world. That is exactly what is happening.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I thank the shadow Minister for that intervention. She is absolutely right. We have had a discussion about terms of reference and terms of service. Not only do most people not actually fully read them or understand them, but they are subject to change. The moment Elon Musk took over Twitter, everything changed. Not only have we got Donald Trump back, but Elon Musk also gave the keys to a mainstream social media platform to Kanye West. We have seen what happened then.

That is the situation the Government will now not shut the door on. That is regrettable. For all the reasons we have heard today, it is really damaging. It is really disappointing that we are not taking the opportunity to lead in this area.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

In that case, having moved my amendment, I close my remarks.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is a pleasure to serve under your chairship, Dame Angela. With your permission, I will take this opportunity to make some broad reflections on the Government’s approach to the new so-called triple-shield protection that we have heard so much about, before coming on to the amendment tabled in my name in the group.

Broadly, Labour is disappointed that the system-level approach to content that is harmful to adults is being stripped from the Bill and replaced with a duty that puts the onus on the user to keep themselves safe. As the Antisemitism Policy Trust, among others, has argued, the two should be able to work in tandem. The clause allows a user to manage what harmful material they see by requiring the largest or most risky service providers to provide tools that, in effect, allow a person to reduce their likelihood of encountering certain types of material, or to be alerted to it. We have concerns about the Government’s overall approach, but Labour believes that important additions can be made to the list of content for which user empowerment tools must be in place, hence our amendment (a) to Government amendment 15.

In July, in a little-noticed written ministerial statement, the Government produced a prototype list of content that would be harmful to adults—priority content that category 1 services would need to address in their terms and conditions. The list included online abuse and harassment—mere disagreement with another’s point of view would not reach the threshold for harmful content, and so would not be covered; circulation of real or manufactured intimate images without the subject’s consent; content promoting self-harm; content promoting eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.

We have concerns about whether listing those harms in the Bill is the most effective mechanism, mostly because we feel that the list should be more flexible and able to change according to the issues of the day, but it is clear that the Government will continue to pursue this avenue despite some very worrying gaps. With that in mind, will the Minister clarify what exactly underpins that list if there have been no risk assessments? What was the basis for drawing up that specific list? Surely the Government should be implored to publish the research that determined the list, at the very least.

I recognise that the false communications offence has remained in the Bill, but the list in Government amendment 15 is not exhaustive. Without the additions outlined in our amendment (a) to amendment 15, the list will do little to tackle some of the most pressing harms of our time, some of which we have already heard about today.

I am pleased that the list from the written ministerial statement has more or less been reproduced in amendment 15, under subsection (2), but there is a key and unexplained omission that our amendment (a) to it seeks to correct: the absence of the last point, on harmful health content. Amendment (a) seeks to reinsert such important content into the Bill directly. It seems implausible that the Government failed to consider the dangerous harm that health misinformation can have online, especially given that back in July they seemed to have a grasp of its importance by including it in the original list.

We all know that health-related misinformation and disinformation can significantly undermine public health, as we have heard. We only have to cast our minds back to the height of the coronavirus pandemic to remind ourselves of how dangerous the online space was, with vaccine scepticism and anti-vax content rife. Many groups were impacted, including pregnant women, who received mixed messages about the safety of covid vaccination, causing widespread confusion, fear and inaction. By tabling amendment (a) to amendment 15, we wanted to understand why the Government have dropped that content from the list and on what exact grounds.

In addition to harmful health content, our amendment (a) to amendment 15 would also add to the list content that incites hateful extremism and content that provides false information about climate change, as we have heard. In its early written evidence, Carnegie outlined how serious a threat climate change disinformation is to the UK. Malicious actors spreading false information on social media could undermine collective action to combat the threats. At present, the Online Safety Bill is not designed to tackle those threats head on.

We all recognise that social media is an important source of news and information for many people, and evidence is emerging of its role in climate change disinformation. The Centre for Countering Digital Hate published a report in 2021 called “The Toxic Ten: How ten fringe publishers fuel 69% of digital climate change denial”, which explores the issue further. Further analysis of activity on Facebook around COP26 undertaken by the Institute for Strategic Dialogue demonstrates the scale of the challenge in dealing with climate change misinformation and disinformation. The research compared the levels of engagement generated by reliable, scientific organisations and climate-sceptic actors, and found that posts from the latter frequently received more traction and reach than the former, which is shocking. For example, in the fortnight in which COP26 took place, sceptic content garnered 12 times the level of engagement that authoritative sources did on the platform, and 60% of the sceptic posts analysed could be classified as actively and explicitly attacking efforts to curb climate change, which just goes to show the importance of ensuring that climate change disinformation is also included in the list in Government amendment 15.

Our two amendments—amendment (a) to amendment 15, and amendment (a) to amendment 16—seek to ensure that the long-standing omission from the Bill of hateful extremism is put right here as a priority. There is increasing concern about extremism leading to violence and death that does not meet the definition for terrorism. The internet and user-to-user services play a central role in the radicalisation process, yet the Online Safety Bill does not cover extremism.

Colleagues may be aware that Sara Khan, the former lead commissioner for countering extremism, provided a definition of extremism for the Government in February 2021, but there has been no response. The issue has been raised repeatedly by Members across the House, including by my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), following the tragic murders carried out by a radicalised incel in his constituency.

Amendment (a) to amendment 16 seeks to bring a formal definition of hateful extremism into the Bill and supports amendment (a) to amendment 15. The definition, as proposed by Sara Khan, who was appointed as Britain’s first countering extremism commissioner in 2018, is an important first step in addressing the gaps that social media platforms and providers have left open for harm and radicalisation.

Social media platforms have often been ineffective in removing other hateful extremist content. In November 2020, The Guardian reported that research from the Centre for Countering Digital Hate had uncovered how extremist merchandise had been sold on Facebook and Instagram to help fund neo-Nazi groups. That is just one of a huge number of instances, and it goes some way to suggest that a repeatedly inconsistent and ineffective approach to regulating extremist content is the one favoured by some social media platforms.

I hope that the Minister will seriously consider the amendments and will see the merits in expanding the list in Government amendment 15 to include these additional important harms.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you for chairing the meeting this afternoon, Dame Angela. I agree wholeheartedly with the amendments tabled by the Labour Front-Bench team. It is important that we talk about climate change denial and what we can do to ensure people are not exposed to that harmful conspiracy theory through content. It is also important that we do what we can to ensure that pregnant women, for example, are not told not to take the covid vaccine or that parents are not told not to vaccinate their children against measles, mumps and rubella. We need to do what we can to ensure measures are in place.

I appreciate the list in Government amendment 15, but I have real issues with this idea of a toggle system—of being able to switch off this stuff. Why do the Government think people should have to switch off the promotion of suicide content or content that promotes eating disorders? Why is it acceptable that people should have to make an active choice to switch that content off in order to not see it? People have to make an active choice to tick a box that says, “No, I don’t want to see content that is abusing me because of my religion,” or “No, I don’t want to see content that is abusing me because of my membership of the LGBT community.” We do not want people to have to look through the abuse they are receiving in order to press the right buttons to switch it off. As the hon. Member for Don Valley said, people should be allowed to say what they want online, but the reality is that the extremist content that we have seen published online is radicalising people and bringing them to the point that they are taking physical action against people in the real, offline world as well as taking action online.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I follow, but I do not agree. The categories of content in proposed new subsections (8C) and (8D), introduced by amendment 15, underpin a lot of this. I answered the question in an earlier debate when talking about the commercial impetus. I cannot imagine many mainstream advertisers wanting to advertise with a company that removed from its terms of service the exclusion of racial abuse, misogyny and general abuse. We have seen that commercial impetus really kicking in with certain platforms. For those reasons, I am unable to accept the amendments to the amendments, and I hope that the Opposition will not press them to a vote.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am grateful for the opportunity to push the Minister further. I asked him whether he could outline where the list in amendment 15 came from. Will he publish the research that led him to compile that specific list of priority harms?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The definitions that we have taken are ones that strike the right balance and have a comparatively high threshold, so that they do not capture challenging and robust discussions on controversial topics.

Amendment 8 agreed to.

Amendments made: 9, in clause 14, page 14, line 5, after “to” insert “effectively”.

This amendment strengthens the duty in this clause by requiring that the systems or processes used to deal with the kinds of content described in subsections (8B) to (8D) (see Amendment 15) should be designed to effectively increase users’ control over such content.

Amendment 10, in clause 14, page 14, line 6, leave out from “encountering” to “the” in line 7 and insert

“content to which subsection (2) applies present on”.

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Amendment 11, in clause 14, page 14, line 9, leave out from “to” to end of line 10 and insert

“content present on the service that is a particular kind of content to which subsection (2) applies”.—(Paul Scully.)

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to move amendment 102, in clause 14, page 14, line 12, leave out “made available to” and insert “in operation for”.

This amendment, and Amendment 103, relate to the tools proposed in Clause 14 which will be available for individuals to use on platforms to protect themselves from harm. This amendment specifically forces platforms to have these safety tools “on” by default.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 103, in clause 14, page 14, line 15, leave out “take advantage of” and insert “disapply”.

This amendment relates to Amendment 102.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The amendments relate to the tools proposed in clause 14, which as we know will be available for individuals to use on platforms to protect themselves from harm. As the Minister knows, Labour fundamentally disagrees with that approach, which will place the onus on the user, rather than the platform, to protect themselves from harmful content. It is widely recognised that the purpose of this week’s Committee proceedings is to allow the Government to remove the so-called “legal but harmful” clauses and replace them with the user empowerment tool option. Let us be clear that that goes against the very essence of the Bill, which was created to address the particular way in which social media allows content to be shared, spread and broadcast around the world at speed.

This approach could very well see a two-tier internet system develop, which leaves those of us who choose to utilise the user empowerment tools ignorant of harmful content perpetuated elsewhere for others to see. The tools proposed in clause 14, however, reflect something that we all know to be true: that there is some very harmful content out there for us all to see online. We can all agree that individuals should therefore have access to the appropriate tools to protect themselves. It is also right that providers will be required to ensure that adults have greater choice and control over the content that they see and engage with, but let us be clear that instead of focusing on defining exactly what content is or is not harmful, the Bill should focus on the processes by which harmful content is amplified on social media.

However, we are where we are, and Labour believes that it is better to have the Bill over the line, with a regulator in place with some powers, than simply to do nothing at all. With that in mind, we have tabled the amendment specifically to force platforms to have safety tools on by default. We believe that the user empowerment tools should be on by default and that they must be appropriately visible and easy to use. We must recognise that for people at a point of crisis—if a person is suffering with depressive or suicidal thoughts, or with significant personal isolation, for example—the tools may not be at the forefront of their minds if their mental state is severely impacted.

On a similar point, we must not patronise the public. Labour sees no rational argument why the Government would not support the amendment. We should all assume that if a rational adult is able to easily find and use these user empowerment tools, then they will be easily able to turn them off if they choose to do so.

The Minister knows that I am not in the habit of guessing but, judging from our private conversations, his rebuttal to my points may be that he believes it is not the Government’s role to impose rules directly on platforms, particularly when they impact their functionality. However, for Labour, the existence of harm and the importance of protecting people online tip the balance in favour of turning these user empowerment tools on by default. We see no good reason why that should not be the case, and we now have a simple amendment that could have a significantly positive impact.

I hope the Minister and colleagues will reflect strongly on these amendments, as we believe they are a reasonable and simple ask of platforms to do the right thing and have the user empowerment tools on by default.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Once again, this is a very smart amendment that I wish I had thought of myself and I am happy to support. The case made by those campaigning for freedom of speech at any cost is about people being able to say what they want to say, no matter how harmful that may be. It is not about requiring me, or anyone else, to read those things—the harmful bile, the holocaust denial or the promotion of suicide that is spouted. It is not freedom of speech to require someone else to see and read such content so I cannot see any potential argument that the Government could come up with against these amendments.

The amendments have nothing to do with freedom of speech or with limiting people’s ability to say whatever they want to say or to promote whatever untruths they want to promote. However, they are about making sure that people are protected and that they are starting from a position of having to opt in if they want to see harmful content. If I want to see content about holocaust denial—I do not want to see that, but if I did—I should have to clearly tick a button that says, “Yes, I am pretty extreme in my views and I want to see things that are abusing people. I want to see that sort of content.” I should have to opt in to be able to see that.

There are a significant number of newspapers out there. I will not even pick up a lot of them because there is so much stuff in them with which I disagree, but I can choose not to pick them up. I do not have that newspaper served to me against my will because I have the opportunity to choose to opt out from buying it. I do not have to go into the supermarket and say, “No, please do not give me that newspaper!” I just do not pick it up. If we put the Government’s proposal on its head and do what has been suggested in the Opposition amendments, everyone would be in a much better position.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

In the previous debate, I talked about amendment 15, which brought in a lot of protections against content that encourages and promotes, or provides instruction for, self-harm, suicide or eating disorders, and against content that is abusive or incites hate on the basis of race, religion, disability, sex, gender reassignment or sexual orientation. We have also placed a duty on the largest platforms to offer adults the option to filter out unverified users if they so wish. That is a targeted approach that reflects areas where vulnerable users in particular could benefit from having greater choice and control. I come back to the fact that that is the third shield and an extra safety net. A lot of the extremes we have heard about, which have been used as debating points, as important as they are, should very much be wrapped up by the first two shields.

We have a targeted approach, but it is based on choice. It is right that adult users have a choice about what they see online and who they interact with. It is right that this choice lies in the hands of those adults. The Government mandating that these tools be on by default goes against the central aim of users being empowered to choose for themselves whether they want to reduce their engagement with some kinds of legal content.

We have been clear right from the beginning that it is not the Government’s role to say what legal content adults should or should not view online or to incentivise the removal of legal content. That is why we removed the adult legal but harmful duties in the first place. I believe we are striking the right balance between empowering adult users online and protecting freedom of expression. For that reason, I am not able to accept the amendments from the hon. Member for Pontypridd.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

It is disappointing that the Government are refusing to back these amendments, which would set the toggle to “on” by default. It is something that we see as a safety net, as the Minister himself described it. Why would someone have to choose to have the safety net there? If someone does not want it, they can easily take it away. The choice should be that way around, because it is there to protect all of us.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I am sure that, like me, the shadow Minister will be baffled that the Government are against our proposals to have to opt out. Surely this is something that is of key concern to the Government, given that the former MP for Tiverton and Honiton might still be an MP if users had to opt in to watching pornography, rather than being accidentally shown it when innocently searching for tractors?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

My hon. Friend makes a very good point. It goes to show the nature of this as a protection for all of us, even MPs, from accessing content that could be harmful to our health or, indeed, profession. Given the nature of the amendment, we feel that this is a safety net that should be available to all. It should be on by default.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I should say that, in the spirit of choice, companies can also choose to default it to be switched on in the first place as well.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister makes the point that companies can choose to have it on by default, but we would not need this Bill in the first place if companies did the right thing. Let us be clear: we would not have had to be here debating this for the past five years—for me it has been 12 months—if companies were going to do the right thing and protect people from harmful content online. On that basis, I will push the amendments to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

This is an extremely large grouping so, for the sake of the Committee, I will do my best to keep my comments focused and brief where possible. I begin by addressing Government new clauses 3 and 4 and the consequential amendments.

Government new clause 3 introduces new duties that aim to ensure that the largest or most risky online service providers design systems and processes that ensure they cannot take down or restrict content in a way that prevents a person from seeing it without further action by the user, or ban users, except in accordance with their own terms of service or where the content breaks the law or contravenes the Online Safety Bill regime. This duty is referred to as the duty not to act against users except in accordance with terms of service. In reality, that will mean that the focus remains far too much on the banning, taking down and restriction of content, rather than on the systems and processes behind the platforms that perpetuate harm.

Labour has long held the view that the Government have gone down an unhelpful cul-de-sac on free speech. Instead of focusing on defining exactly which content is or is not harmful, the Bill should be focused on the processes by which harmful content is amplified on social media. We must recognise that a person posting a racist slur online that nobody notices, shares or reads is significantly less harmful than a post that can quickly go viral, and can within hours gain millions of views or shares. We have talked a lot in this place about Kanye West and the comments he has made on Twitter in the past few weeks. It is safe to say that a comment by Joe Bloggs in Hackney that glorifies Hitler does not have the same reach or produce the same harm as Kanye West saying exactly the same thing to his 30 million Twitter followers.

Our approach has the benefit of addressing the things that social media companies can control—for example, how content spreads—rather than the things they cannot control, such as what people say online. It reduces the risk to freedom of speech because it tackles how content is shared, rather than relying entirely on taking down harmful content. Government new clause 4 aims to improve the effectiveness of platforms’ terms of service in conjunction with the Government’s new triple shield, which the Committee has heard a lot about, but the reality is that they are ultimately seeking to place too much of the burden of protection on extremely flexible and changeable terms of service.

If a provider’s terms of service say that certain types of content are to be taken down or restricted, then providers must run systems and processes to ensure that that can happen. Moreover, people must be able to report breaches easily, through a complaints service that delivers appropriate action, including when the service receives complaints about the provider. This “effectiveness” duty is important but somewhat misguided.

The Government, having dropped some of the “harmful but legal” provisions, seem to expect that if large and risky services—the category 1 platforms—claim to be tackling such material, they must deliver on that promise to the customer and user. This reflects a widespread view that companies may pick and choose how to apply their terms of service, or implement them loosely and interchangeably, as we have heard. Those failings will lead to harm when people encounter things that they would not have thought would be there when they signed up. All the while, service providers that do not fall within category 1 need not enforce their terms of service, or may do so erratically or discriminatorily. That includes search engines, no matter how big.

This large bundle of amendments seems to do little to actually keep people safe online. I have already made my concerns about the Government’s so-called triple shield approach to internet safety clear, so I will not repeat myself. We fundamentally believe that the Government’s approach, which places too much of the onus on the user rather than the platform, is wrong. We therefore cannot support the approach that is taken in the amendments. That being said, the Minister can take some solace from knowing that we see the merits of Government new clause 5, which

“requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4”.

If this is the avenue that the Government insist on going down, it is absolutely vital that providers are advised by Ofcom on the processes they will be required to take to comply with these new duties.

Amendment 19 agreed to.

Amendment made: 20, in clause 18, page 19, line 33, at end insert

“, and

(b) section (Further duties about terms of service)(5)(a) (reporting of content that terms of service allow to be taken down or restricted).”—(Paul Scully.)

This amendment inserts a signpost to the new provision about content reporting inserted by NC4.

Clause 18, as amended, ordered to stand part of the Bill.

Clause 19

Duties about complaints procedures

Amendment made: 21, in clause 19, page 20, line 15, leave out “, (3) or (4)” and insert “or (3)”.—(Paul Scully.)

This amendment removes a reference to clause 20(4), as that provision is moved to NC4.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 22, in clause 19, page 20, line 27, leave out from “down” to “and” in line 28 and insert

“or access to it being restricted, or given a lower priority or otherwise becoming less likely to be encountered by other users,”.

NC2 states what is meant by restricting users’ access to content, and this amendment makes a change in line with that, to avoid any implication that downranking is a form of restriction on access to content.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

These amendments clarify the meaning of “restricting access to content” and “access to content” for the purposes of the Bill. Restricting access to content is an expression that is used in various provisions across the Bill, such as in new clause 2, under which providers of category 1 services will have a duty to ensure that they remove or restrict access to users’ content only where that is in accordance with their terms of service or another legal obligation. There are other such references in clauses 15, 16 and 17.

The amendments make it clear that the expression

“restricting users’ access to content”

covers cases where a provider prevents a user from accessing content without that user taking a prior step, or where content is temporarily hidden from a user. They also make it clear that this expression does not cover any restrictions that the provider puts in place to enable users to apply user empowerment tools to limit the content that they encounter, or cases where access to content is controlled by another user, rather than by the provider.

The amendments are largely technical, but they do cover things such as down-ranking. Amendment 22 is necessary because the previous wording of this provision wrongly suggested that down-ranking was covered by the expression “restricting access to content”. Down-ranking is the practice of giving content a lower priority on a user’s feed. The Government intend that users should be able to complain if they feel that their content has been inappropriately down-ranked as a result of the use of proactive technology. This amendment ensures consistency.

I hope that the amendments provide clarity as to the meaning of restricting access to content for those affected by the Bill, and assist providers with complying with their duties.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Again, I will keep my comments on clause 19 brief, as we broadly support the intentions behind the clause and the associated measures in the grouping. My hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) spoke at length about this important clause, which relates to the all-important complaints procedures available around social media platforms and companies, in the previous Bill Committee.

During the previous Committee, Labour tabled amendments that would have empowered more individuals to make a complaint about search content in the event of non-compliance. In addition, we wanted an external complaints option for individuals seeking redress. Sadly, all those amendments were voted down by the last Committee, but I must once again press the Minister on those points, particularly in the context of the new amendments that have been tabled.

Without redress for individual complaints, once internal mechanisms have been exhausted, victims of online abuse could be left with no further options. Consumer protections could be compromised and freedom of expression, with which the Government seem to be borderline obsessed, could be infringed for people who feel that their content has been unfairly removed.

Government new clause 2 deals with the meaning of references to

“restricting users’ access to content”,

in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14. We see amendments 22 and 59 as important components of new clause 2, and are therefore more than happy to support them. However, I reiterate to the Minister and place on the record once again the importance of introducing an online safety ombudsman, which we feel is crucial to new clause 2. The Joint Committee recommended the introduction of such an ombudsman, who would consider complaints when internal routes of redress had not resulted in resolution, had failed to address risk and had led to significant and demonstrable harm. As new clause 2 relates to restricting users’ access to content, we must also ensure that there is an appropriate channel for complaints if there is an issue that users wish to take up around restrictions in accessing content.

By now, the Minister will be well versed in my thoughts on the Government’s approach, and on the reliance on the user empowerment tool approach more broadly. It is fundamentally an error to pursue a regime that is so content-focused. Despite those points, we see the merits in Government amendments 22 and 59, and in new clause 2, so have not sought to table any further amendments at this stage.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am slightly confused, and would appreciate a little clarification from the Minister. I understand what new clause 2 means; if the hon. Member for Pontypridd says that she does not want to see content of a certain nature, and I put something of that nature online, I am not being unfairly discriminated against in any way because she has chosen to opt out of receiving that content. I am slightly confused about the downgrading bit.

I know that an awful lot of platforms use downgrading when there is content that they find problematic, or something that they feel is an issue. Rather than taking that content off the platform completely, they may just no longer put it in users’ feeds, for example; they may move it down the priority list, and that may be part of what they already do to keep people safe. I am not trying to criticise what the Government are doing, but I genuinely do not understand whether that downgrading would still be allowed, whether it would be an issue, and whether people could complain about their content being downgraded because the platform was a bit concerned about it, and needed to check it out and work out what was going on, or if it was taken off users’ feeds.

Some companies, if they think that videos have been uploaded by people who are too young to use the platform, or by a registered child user of the platform, will not serve that content to everybody’s feeds. I will not be able to see something in my TikTok feed that was published by a user who is 13, for example, because there are restrictions on how TikTok deals with and serves that content, in order to provide increased protection and the safety that they want on their services.

Will it still be acceptable for companies to have their own internal downgrading system, in order to keep people safe, when content does not necessarily meet an illegality bar or child safety duty bar? The Minister has not used the phrase “market forces”; I think he said “commercial imperative”, and he has talked a lot about that. Some companies and organisations use downgrading to improve the systems on their site and to improve the user experience on the platform. I would very much appreciate it if the Minister explained whether that will still be the case. If not, will we all have a worse online experience as a result?

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 30 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I will speak broadly to clause 20, as it is an extremely important clause, before making remarks about the group of Government amendments we have just voted on.

Clause 20 is designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, as Labour has repeatedly argued, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulations in less substantive ways. That is a genuine fear here.

We all want to see a Bill in place that protects free speech, but that cannot come at the expense of safety online. The situation with regard to content that is harmful to adults has become even murkier with the Government’s attempts to water down the Bill and remove adult risk assessments entirely.

The Minister must acknowledge that there is a balance to be achieved. We all recognise that. The truth is—and this is something that his predecessor, or should I say his predecessor’s predecessor, touched on when we considered this clause in the previous Bill Committee—that at the moment platforms are extremely inconsistent in their approach to getting the balance right. Although Labour is broadly supportive of this clause and the group of amendments, we feel that now is an appropriate time to put on record our concerns over the important balance between safety, transparency and freedom of expression.

Labour has genuine concerns over the future of platforms’ commitment to retaining that balance, particularly if the behaviours following the recent takeover of Twitter by Elon Musk are anything to go by. Since Elon Musk took over ownership of the platform, he has repeatedly used Twitter polls, posted from his personal account, as metrics to determine public opinion on platform policy. The general amnesty policy and the reinstatement of Donald Trump both emerged from such polls.

According to former employees, those polls are not only inaccurate representations of the platform’s user base, but are actually

“designed to be spammed and gamed”.

The polls are magnets for bots and other inauthentic accounts. This approach and the reliance on polls have allowed Elon Musk to enact and dictate his platform’s policy on moderation and freedom of expression. Even if he genuinely trusts the results of these polls and is not gaming them, they accurately represent neither the user base nor best practice for confronting disinformation and harm online.

Elon Musk uses the results to claim that “the people have spoken”, but they have not. Research from leading anti-hate organisation the Anti-Defamation League shows that far-right extremists and neo-Nazis encouraged supporters to actively re-join Twitter to vote in these polls. The impacts of platforming neo-Nazis on Twitter do not need to be stated. Such users are explicitly trying to promote violent and hateful agendas, and they were banned initially for that exact reason. The bottom line is that those people were banned in line with Twitter’s terms of service at the time, and they should not be re-platformed just because of the findings of one Twitter poll.

These issues are at the very heart of Labour’s concerns in relation to the Bill—that the duties around freedom of expression and privacy will be different for those at the top of the platforms. We support the clause and the group of amendments, but I hope the Minister will be able to address those concerns in his remarks.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I endorse the general approach set out by the hon. Member for Pontypridd. We do not want to define freedom of speech based on a personal poll carried out on one platform. That is exactly why we are enshrining it in this ground-breaking Bill.

We want to get the balance right. I have talked about the protections for children. We also want to protect adults and give them the power to understand the platforms they are on and the risks involved, while having regard for freedom of expression and privacy. That is a wider approach than one man’s Twitter feed. These clauses are important to ensure that the service providers interpret and implement their safety duties in a proportionate way that limits negative impact on users’ rights to freedom of expression. However, they also have to have regard to the wider definition of freedom of expression, while protecting users, which the rest of the Bill covers in a proportionate way.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

This goes to the heart of more than just one person’s Twitter feed, although we could say that that person is an incredibly powerful and influential figure on the platform. In the past 24 hours, Twitter has disbanded its trust and safety council. Members of that council included expert groups working to tackle harassment and child sexual exploitation, and to promote human rights. Does the Minister not feel that the council being disbanded goes to the heart of what we have been debating? It shows how a platform can remove or change at whim the very terms of service that are meant to prevent harm from being perpetrated on that platform.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that will drive users to alternative platforms. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.

Question put and agreed to.

Clause 20, as amended, accordingly ordered to stand part of the Bill.

Clause 21

Record-keeping and review duties

Amendments made: 32, in clause 21, page 23, line 5, leave out “, 10 or 12” and insert “or 10”.

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 33, in clause 21, page 23, line 45, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 34, in clause 21, page 24, line 6, leave out “section” and insert “sections”.

This amendment is consequential on Amendment 35.

Amendment 35, in clause 21, page 24, line 6, at end insert—

“, (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (duties about terms of service).”—(Paul Scully.)

This amendment ensures that providers have a duty to review compliance with the duties set out in NC3 and NC4 regularly, and after making any significant change to the design or operation of the service.

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Given that there are few changes to this clause from when the Bill was amended in the previous Public Bill Committee, I will be brief. We in the Opposition are clear that record-keeping and review duties on in-scope services make up an important function of the regulatory regime and sit at the very heart of the Online Safety Bill. We must push platforms to transparently report all harms identified and the action taken in response, in line with regulation.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I think we all agree that written records are hugely important. They are important as evidence in cases where Ofcom is considering enforcement action, and a company’s compliance review should be done regularly, especially before it makes changes to its service.

The Bill does not intend to place excessive burdens on small and low-risk businesses. As such, clause 21 provides Ofcom with the power to exempt certain types of service from the record-keeping and review duties. However, the details of any exemptions must be published.

To half-answer the point made by the hon. Member for Aberdeen North, the measures will be brought to the Lords, but I will endeavour to keep her up to date as best we can so that we can continue the conversation. We have served together on several Bill Committees, including on technical Bills that required us to spend several days in Committee—although they did not come back for re-committal—so I will endeavour to keep her and, indeed, the hon. Member for Pontypridd, up to date with developments.

Question put and agreed to. 

Clause 21, as amended, accordingly ordered to stand part of the Bill.

Clause 30

Duties about freedom of expression and privacy

Amendments made: 36, in clause 30, page 31, line 31, after “have” insert “particular”.

This amendment has the result that providers of regulated search services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

Amendment 37, in clause 30, page 31, line 34, after “have” insert “particular”.—(Paul Scully.)

This amendment has the result that providers of regulated search services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Clause 30, as amended, ordered to stand part of the Bill.

Clause 46

Relationship between duties and codes of practice

Amendments made: 38, in clause 46, page 44, line 27, after “have” insert “particular”.

This amendment has the result that providers of services who take measures other than those recommended in codes of practice in order to comply with safety duties must have particular regard to freedom of expression and users’ privacy.

Amendment 39, in clause 46, page 45, line 12, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 40, in clause 46, page 45, line 31, at end insert “, or

(ii) a duty set out in section 14 (user empowerment);”.—(Paul Scully.)

This amendment has the effect that measures recommended in codes of practice to comply with the duty in clause 14 are relevant to the question of whether a provider is complying with the duties in clause 20(2) and (3) (having regard to freedom of expression and users’ privacy).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I do not wish to repeat myself and test the Committee’s patience, so I will keep my comments brief. As it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in the Bill. However, providers could take alternative measures to comply, but as I said in previous Committee sittings, Labour remains concerned that the definition of “alternative measures” is far too broad. I would be grateful if the Minister elaborated on his assessment of the instances in which a service provider may seek to comply via alternative measures.

The codes of practice should be, for want of a better phrase, best practice. Labour is concerned that, to avoid the duties, providers may choose to take the “alternative measures” route as an easy way out. We agree that it is important to ensure that providers have a duty with regard to protecting users’ freedom of expression and personal privacy. As we have repeatedly said, the entire Online Safety Bill regime relies on that careful balance being at the forefront. We want to see safety at the forefront, but recognise the importance of freedom of expression and personal privacy, and it is right that those duties are central to the clause. For those reasons, Labour has not sought to amend this part of the Bill, but I want to press the Minister on exactly how he sees this route being used.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

It is important that service providers have flexibility, so that the Bill does not disincentivise innovation or force service providers to use measures that might not work for all business models or technological contexts. The tech sector is diverse and dynamic, and it is appropriate that companies can take innovative approaches to fulfilling their duties. In most circumstances, we expect companies to take the measures outlined in Ofcom’s code of practice as the easiest route to compliance. However, where a service provider takes alternative measures, Ofcom must consider whether those measures safeguard users’ privacy and freedom of expression appropriately. Ofcom must also consider whether they extend across all relevant areas of a service mentioned in the illegal content and children’s online safety duties, such as content moderation, staff policies and practices, design of functionalities, algorithms and other features. Ultimately, it will be for Ofcom to determine a company’s compliance with the duties, which are there to ensure users’ safety.

Question put and agreed to.

Clause 46, as amended, accordingly ordered to stand part of the Bill.

Clause 55 disagreed to.

Clause 56

Regulations under sections 54 and 55

Amendments made: 42, in clause 56, page 54, line 40, leave out subsection (3).

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 43, in clause 56, page 54, line 46, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 44, in clause 56, page 55, line 8, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 45, in clause 56, page 55, line 9, leave out

“or adults are to children or adults”

and insert “are to children”.—(Paul Scully.)

This amendment is consequential on Amendment 41 (removal of clause 55).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we know, the clause makes provision in relation to the making of regulations designating primary and priority content that is harmful to children, and priority content that is harmful to adults. The Secretary of State may specify a description of content in regulations only if they consider that there is a material risk of significant harm to an appreciable number of children or adults in the United Kingdom presented by user-generated or search content of that description, and must consult Ofcom before making such regulations.

In the last Bill Committee, Labour raised concerns that there were no duties that required the Secretary of State to consult others, including expert stakeholders, ahead of making these regulations. That decision cannot be for one person alone. When it comes to managing harmful content, unlike illegal content, we can all agree that it is about implementing systems that prevent people from encountering it, rather than removing it entirely.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

The fact that we are here again to discuss what one Secretary of State wanted to put into law, and what another is now seeking to remove before the law has even come into force, suggests that my hon. Friend’s point about protection, and about making sure that there are adequate measures within which the Secretary of State must operate, is absolutely valid.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree: we are now on our third Secretary of State, our third Minister and our third Prime Minister since we began considering this iteration of the Bill. It is vital that this does not come down to one person’s ideological beliefs. We have spoken at length about this issue; the hon. Member for Don Valley has spoken about his concerns that Parliament should be sovereign, and should make these decisions. It should not be for one individual or one stakeholder to make these determinations.

We also have issues with the Government’s chosen toggle approach—we see that as problematic. We have debated it at length, but our concerns regarding clause 56 are about the lack of consultation that the Secretary of State of the day, whoever that may be and whatever political party they belong to, will be forced to make before making widespread changes to a regime. I am afraid that those concerns still exist, and are not just held by us, but by stakeholders and by Members of all political persuasions across the House. However, since our proposed amendment was voted down in the previous Bill Committee, nothing has changed. I will spare colleagues from once again hearing my pleas about the importance of consultation when it comes to determining all things related to online safety, but while Labour Members do not formally oppose the clause, we hope that the Minister will address our widespread concerns about the powers of the Secretary of State in his remarks.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I appreciate the hon. Lady’s remarks. We have tried to ensure that the Bill is proportionate, inasmuch as the Secretary of State can designate content if there is material risk of significant harm to an appreciable number of children in the United Kingdom. The Bill also requires the Secretary of State to consult Ofcom before making regulations on the priority categories of harm.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The Minister has just outlined exactly what our concerns are. He is unable to give an exact number, figure or issue, but that is what the Secretary of State will have to do, without having to consult any stakeholders regarding that issue. There are many eyes on us around the world, with other legislatures looking at us and following suit, so we want the Bill to be world-leading. Many Governments across the world may deem that homosexuality, for example, is harmful to children. Because this piece of legislation sets a precedent, a Secretary of State in such a Government could determine that any platform in that country should take down all such content. Does the Minister not see our concerns in that scenario?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I was about to come on to the fact that the Secretary of State would be required to consult Ofcom before making regulations on the priority categories of harm. Indeed Ofcom, just like the Secretary of State, speaks to and engages with a number of stakeholders on this issue to gain a deeper understanding. Regulations designating priority harms would be made under the draft affirmative resolution procedure, but there is also provision for the Secretary of State to use the made affirmative resolution procedure in urgent scenarios, and this would be an urgent scenario. It is about getting the balance right.

--- Later in debate ---
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

As we know, this clause requires providers of relevant services to publish annual transparency reports and sets out Ofcom’s powers in relation to those reports. The information set out in transparency reports is intended to help users to understand the steps that providers are taking to help keep them safe and to provide Ofcom with the information required to hold them to account.

These duties on regulated services are very welcome indeed. Labour has long held the view that mandatory transparency reporting and reporting mechanisms are vital to hold platforms to account, and to understand the true nature of how online harm is driven and perpetuated on the internet.

I will reiterate the points that were made in previous Committee sittings about our concerns about the regularity of these transparency reports. I note that, sadly, those reports remain unchanged and therefore they will only have to be submitted to Ofcom annually. It is important that the Minister truly considers the rapid rate at which the online world can change and develop, so I urge him to reconsider this point and to make these reports a biannual occurrence. Labour firmly believes that increasing the frequency of the transparency reports will ensure that platforms and services remain on the pulse, and are forced to be aware of and act on emergent risks. In turn, that would compel Ofcom to do the same in its role as an industry regulator.

I must also put on the record some of our concerns about subsections (12) and (13), which state that the Secretary of State of the day could amend by regulation the frequency of the transparency reporting, having consulted Ofcom first. I hope that the Minister can reassure us that this approach will not result in our ending up in a position where, perhaps because of Ofcom’s incredible workload, transparency reporting becomes even less frequent than an annual occurrence. We need to see more transparency, not less, so I really hope that he can reassure me on this particular point.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Does my hon. Friend agree that transparency should be at the heart of this Bill and that the Government have missed an opportunity to accelerate the inclusion of a provision in the Bill, namely the requirement to give researchers and academics access to platform data? Data access must be prioritised in the Bill and without such prioritisation the UK will fall behind the rest of Europe in safety, research and innovation. The accessibility and transparency of that data from a research perspective are really important.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely agree with my hon. Friend. We both made the point at length in previous sittings of the Committee about the need to ensure transparency, access to the data, and access to reporting for academics, civil society and researchers.

That also goes to the point that it is not for this Committee or this Minister—it is not in his gift—to determine something that we have all discussed in this place at length, which is the potential requirement for a standalone Committee specifically to consider online harm. Such a Committee would look at whether this legislation is actively doing what we need it to do, whether it needs to be reviewed, whether it could look at the annual reports from Ofcom to determine the length and breadth of harm on the internet, and whether or not this legislation is actually having an impact. That all goes to the heart of transparency, openness and the review that we have been talking about.

I want to go further and raise concerns about how public the reports will be, as we have touched on. The Government claim that their so-called triple shield approach will give users of platforms and services more power and knowledge to understand the harms that they may discover online. That is in direct contradiction to the Bill’s current approach, which does not provide any clarity about exactly how the transparency reports will be made available to the public. In short, we feel that the Government are missing a significant opportunity. We have heard many warnings about what can happen when platforms are able to hide behind a veil of secrecy. I need only point to the revelations of whistleblowers, including Frances Haugen, to highlight the importance of that point.

As the Bill stands, once Ofcom has issued a notice, companies will have to produce a transparency report that

“must…be published in the manner and by the date specified in the notice”.

I want to press the Minister on that and ask him to clarify the wording. We are keen for the reports to be published publicly and in an accessible way, so that users, civil society, researchers and anyone else who wants to see them can make sense of them. The information contained in the transparency reports is critical to analysing trends and harms, so I hope that the Minister will clarify those points in his response.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Does my hon. Friend agree that if the Government are to achieve their objective—which we all share—for the Bill to be world-leading legislation, we cannot rely on whistleblowers to tell us what is really going on in the online space? That is why transparency is vital. This is the perfect opportunity to provide that transparency, so that we can do some proper research into what is going on out there. We cannot rely on whistleblowers to give us such information.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetuated there. If there is no transparency, how will we know whether the platforms are covertly breaking the rules, hiding content and getting round the rules? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to highlight exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I want to add to the brilliant points made by my hon. Friend the shadow Minister, in particular on the continually changing nature of market forces, which the Minister himself referenced. We want innovation. We want the tech companies to innovate—preferably ones in the UK—but we do not want to be playing catch-up as we are now, making legislation retrospectively to right wrongs that have taken place because our legislative process has been too slow to deal with the technological changes and the changes in social media, in apps, and with how we access data and communicate with one another online. The bare minimum is a biannual report.

Within six months, if a new piece of technology comes up, it does not simply stay with one app or platform; that technology will be leapfrogged by others. Such technological advances can take place at a very rapid pace. The transparency aspect is important, because people should have a right to know what they are using and whether it is safe. We as policy makers should have a right to know clearly whether the legislation that we have introduced, or the legislation that we want to amend or update, is effective.

If we look at any other approach that we take to protect the health and safety of the people in our country—the people we all represent in our constituencies —we always say that prevention is better than cure. At the moment, without transparency and without researchers being able to update the information we need to see, we will constantly be playing catch-up with digital tech.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The amendments to schedule 8 confirm that references to relevant content, consumer content and regulated user-generated content have the same meaning as established by other provisions of the Bill. Again, that ensures consistency, which will, in turn, support Ofcom in requiring providers of category 1 services to give details in their annual transparency reports of their compliance with the new transparency, accountability and freedom of expression duties.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I will keep my comments on this grouping brief, because I have already raised our concerns and our overarching priority on transparency reports in the previous debate, which was a good one, with all Members highlighting the need for transparency and reporting in the Bill. With the Chair’s permission, I will make some brief comments on Government amendment 72 before addressing Government amendments 73 and 75.

It will come as no surprise to the Minister that amendment 72, which defines relevant content for the purposes of schedule 8, has a key omission—specifying priority content harmful to adults. For reasons we have covered at length, we think that it is a gross mistake on the Government’s side to attempt to water down the Bill in this way. If the Minister is serious about keeping adults safe online, he must reconsider this approach. However, we are happy to see amendments 73 and 75, which define consumer content and regulated user-generated content. It is important for all of us—whether we are politicians, researchers, academics, civil society, stakeholders, platforms, users or anyone else—that these definitions are in the Bill so that, when it is passed, it can be applied properly and at pace. That is why we have not sought to amend this grouping.

I must press the Minister to respond on the issues around relevant content as outlined in amendment 72. We feel strongly that more needs to be done to address this type of content and its harm to adults, so I would be grateful to hear the Minister’s assessment of how exactly these transparency reports will report back on this type of harm, given its absence in this group of amendments and the lack of a definition.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am pleased to see the list included and the number of things that Ofcom can ask for more information on. I have a specific question about amendment 75. Amendment 75 talks about regulated user-generated content and says it has the same meaning as it does in the interpretation of part 3 under clause 50. The Minister may or may not know that there are concerns about clause 50(5), which relates to

“One-to-one live aural communications”.

One-to-one live aural communications are exempted. I understand that that is because the Government do not believe that telephony services, for example, should be part of the Online Safety Bill—that is a pretty reasonable position for them to take. However, allowing one-to-one live aural communications not to be regulated means that if someone is using voice chat in Fortnite, for example, and there are only two people on the team that they are on, or if someone is using voice chat in Discord and there are only two people online on the channel at that time, that is completely unregulated and not taken into account by the Bill.

I know that that is not the intention of the Bill, which is intended to cover user-generated content online. The exemption is purely in place for telephony services, but it is far wider than the Government intend it to be. With the advent of more and more people using virtual reality technology, for example, we will have more and more aural communication between just two people, and that needs to be regulated by the Bill. We cannot just allow a free-for-all.

If we have child protection duties, for example, they need to apply to all user-generated content and not exempt it specifically because it is a live, one-to-one aural communication. Children are still at significant risk from this type of communication. The Government have put this exemption in because they consider such communication to be analogous to telephony services, but it is not. It is analogous to telephony services if we are talking about a voice call on Skype, WhatsApp or Signal—those are voice calls, just like telephone services—but we are talking about a voice chat that people can have with people who they do not know, whose phone number they do not know and who they have no sort of relationship with.

Some of the Discord servers are pretty horrendous, and some of the channels are created by social media influencers or people who have pretty extreme views in some cases. We could end up with a case where the Discord server and its chat functions are regulated, but if aural communication or a voice chat is happening on that server, and there are only two people online because it is 3 o’clock in the morning where most of the people live and lots of them are asleep, that would be exempted. That is not the intention of the Bill, but the Government have not yet fixed this. So I will make one more plea to the Government: will they please fix this unintended loophole, so that it does not exist? It is difficult to do, but it needs to be done, and I would appreciate it if the Minister could take that into consideration.

ONLINE SAFETY BILL (Third sitting)

Alex Davies-Jones Excerpts
Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

(1 year, 4 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 15 December 2022 - (15 Dec 2022)
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

These Government amendments confer a duty on Ofcom to create and publish a list of companies that are approaching the category 1 threshold to ensure that it proactively identifies emerging high-reach, high-influence companies and is ready to add them to the category 1 register without delay. That is being done in recognition of the rapid pace of change in the tech industry, in which companies can grow quickly. The changes mean that Ofcom can designate companies as category 1 at pace. That responds to concerns that platforms could be unexpectedly popular and quickly grow in size, and that there could be delays in capturing them as category 1 platforms. Amendments 48 and 49 are consequential on new clause 7, which confers a duty on Ofcom to create and publish a list of companies that are approaching the category 1 threshold. For those reasons, I recommend that the amendments be accepted.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

It will come as no surprise to Members to hear that we have serious concerns about the system of categorisation and the threshold conditions for platforms and service providers, given our long-standing view that the approach taken is far too inflexible.

In previous sittings, we raised the concern that the Government have not provided enough clarity about what will happen if a service is required to shift from one category to another, and how long that will take. We remain unclear about that, about how shifting categories will work in practice, and about how long Ofcom will have to preside over such changes and decisions.

I have been following this Bill closely for just over a year, and I recognise that the online space is constantly changing and evolving. New technologies are popping up that will make this categorisation process even more difficult. The Government must know that their approach does not capture smaller, high-harm platforms, which we know—we have debated this several times—can be at the root of some of the most dangerous and harmful content out there. Will the Minister clarify whether the Government amendments will allow Ofcom to consider adding such small, high-harm platforms to category 1, given the risk of harm?

More broadly, we are pleased that the Government tabled new clause 7, which will require Ofcom to prepare and update a list of regulated user-to-user services that have 75% of the number of users of a category 1 service, and at least one functionality of a category 1 service, or one required combination of a functionality and another characteristic or factor of a category 1 service. It is absolutely vital that Ofcom, as the regulator, is sufficiently prepared, and that there is monitoring of regulated user-to-user services so that this regime is as flexible as possible and able to cope with the rapid changes in the online space. That is why the Opposition support new clause 7 and have not sought to amend it. Moreover, we also support Government amendments 48 and 49, which are technical amendments to ensure that new clause 7 references user-to-user services and assessments of those services appropriately. I want to press the Minister on how he thinks these categories will work, and on Ofcom’s role in that.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I agree with everything that the hon. Lady said. New clause 7 is important. It was missing from the earlier iterations of the Bill, and it makes sense to have it here, but it raises further concerns about the number of people who are required to use a service before it is classed as category 1. We will come later to our amendment 104 to schedule 11, which is about adding high-risk platforms to the categorisation.

I am still concerned that the numbers are a pretty blunt instrument for categorising something as category 1. The number may end up being particularly high. I think it would be very easy for the number to be wrong—for it to be too high or too low, and probably too high rather than too low.

If Twitter were to disappear, which, given the changing nature of the online world, is not outside the realms of possibility, we could see a significant number of other platforms picking up the slack. A lot of them might have fewer users, but the same level of risk as platforms such as Twitter and Facebook. I am still concerned that choosing a number is a very difficult thing to get right, and I am not totally convinced that the Government’s way of going about this is right.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My hon. Friend is absolutely right. All companies will still have to tackle the risk assessment, and will have to remove illegal content. We are talking about the extra bits that could take a disproportionate amount of resource from core functions that we all want to see around child protection.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I would push the Minister further. He mentioned that there will not be an onus on companies to tackle the “legal but harmful” duty now that it has been stripped from the Bill, but we know that disinformation, particularly around elections in this country, is widespread on these high-harm platforms, and they will not be in scope of category 2. We have debated that at length. We have debated the time it could take Ofcom to act and put those platforms into category 1. Given the potential risk of harm to our democracy as a result, will the Minister press Ofcom to act swiftly in that regard? We cannot put that in the Bill now, but time is of the essence.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. The Department has techniques for dealing with misinformation and disinformation as well, but we will absolutely push Ofcom to work as quickly as possible. As my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, has said, once an election is done, it is done and it cannot be undone.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

These Government amendments seek to change the approach to category 1 designation, following the removal from the Bill of the adult safety duties and the concept of “legal but harmful” content. Through the proposed new duties on category 1 services, we aim to hold companies accountable to their terms of service, as we have said. I seek to remove all requirements on category 1 services relating to harmful content, so it is no longer appropriate to designate them with reference to harm. Consequently, the amendments in this group change the approach to designating category 1 services, to ensure that only the largest companies with the greatest influence over public discourse are designated as category 1 services.

Specifically, these amendments will ensure that category 1 services are so designated where they have functionalities that enable easy, quick and wide dissemination of user-generated content, and the requirement of category 1 services to meet a number of users threshold remains unchanged.

The amendments also give the Secretary of State the flexibility to consider other characteristics of services, as well as other relevant factors. Those characteristics might include a service’s functionalities, the user base, the business model, governance, and other systems and processes. That gives the designation process greater flexibility to ensure that services are designated category 1 services only when they have significant influence over public discourse.

The amendments also seek to remove the use of criteria for content that is harmful to adults from category 2B, and we have made a series of consequential amendments to the designation process for categories 2A and 2B to ensure consistency.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I have commented extensively on the flaws in the categorisation process in this and previous Committees, so I will not retread old ground. I accept the amendments in this grouping. They show that the Government are prepared to broaden the criteria for selecting which companies are likely to be in category 1. That is a very welcome, albeit subtle, shift in the right direction.

The amendments bring the characteristics of a company’s service into consideration, which will be a slight improvement on the previous focus on size and functionality, so we welcome them. The distinction is important, because size and functionality alone are obviously very vague indicators of harm, or the threat of harm.

We are pleased to see that the Government have allowed for a list to be drawn up of companies that are close to the margins of category 1, or that are emerging as category 1 companies. This is a positive step for regulatory certainty, and I hope that the Minister will elaborate on exactly how the assessment will be made.

However, I draw the Minister’s attention to Labour’s long-held concern about the Bill’s over-reliance on powers afforded to the Secretary of State of the day. We debated this concern in a previous sitting. I press the Minister again on why these amendments, and the regulations around the threshold conditions, are ultimately only for the Secretary of State to consider, depending on characteristics or factors that only he or she, whoever they may be, deems relevant.

We appreciate that the regulations need some flexibility, but we have genuine concerns—indeed, colleagues from all parties have expressed such concerns—that the Bill will give the Secretary of State far too much power to determine how the entire online safety regime is imposed. I ask the Minister to give the Committee an example of a situation in which it would be appropriate for the Secretary of State to make such changes without any consultation with stakeholders or the House.

It is absolutely key for all of us that transparency should lie at the heart of the Bill. Once again, we fear that the amendments are a subtle attempt by the Government to impose on what is supposed to be an independent regulatory process the whim of one person. I would appreciate assurance on that point. The Minister knows that these concerns have long been held by me and colleagues from all parties, and we are not alone in those concerns. Civil society groups are also calling for clarity on exactly how decisions will be made, and particularly on what information will be used to determine a threshold. For example, do the Government plan on quantifying a user base, and will the Minister explain how the regime would work in practice, when we know that a platform’s user base can fluctuate rapidly? We have seen that already with Mastodon, whose users have increased dramatically as a result of Elon Musk’s takeover of Twitter. I hope that the Minister can reassure me about those concerns. He will know that this is a point of contention for colleagues from across the House, and we want to get the Bill right before we progress to Report.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move amendment 104, in schedule 11, page 213, line 11, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

I would say this, but I think that this is the most important amendment. The key area that the Government are getting wrong is the way in which platforms, providers or services will be categorised. The threshold is based on the number of users. It is the number of users “and” one of those other things, not the number of users “or” one of those other things; even that would make a significant difference.

The Secretary of State talked about the places that have a significant influence over public discourse. It is perfectly possible to have a significant influence over public discourse with a small number of users, or with a user base that does not run into the millions. We have seen the spread of conspiracy theories that have originated and been perpetuated on very small platforms—very small, shady places on the internet that none of us has experienced or even heard of. Those are the places that have a massive impact.

We know that one person can have a significant impact on the world and on people’s lives. We have heard about the physical harm that people can be incited to cause by the platforms they access, and the radicalisation and extremism they find themselves subject to. That can cause massive, damaging effects to anybody they choose to take physical action against, and to some of the most marginalised communities and groups in society. We are seeing an increase in the amount of hate crime and the number of people who believe conspiracy theories, and not all of that is because of the spread of those things on Facebook and Twitter. It is because of the breadcrumbing and the spread that there can be on smaller platforms.

The most extreme views do not necessarily tip over into “illegal” or “incitement”; they do not actually say, “Please go out and kill everybody in this particular group.” They say, “This particular group is responsible for all of the ills you feel and for every negative thing that is happening in your life”, and people are therefore driven to take extremist, terrorist action. That is a significant issue.

I want to talk about a couple of platforms. Kiwi Farms, which is no longer in existence and has been taken down, was a very small platform that dramatically damaged the lives of trans people in particular. It was a platform where people went to incite hatred and give out the addresses of folk who they knew were members of the trans community. Some of those people had to move to another continent to get away from the physical violence and attacks they faced as a result of the behaviour on that incredibly small platform, which very few people will have heard about.

Kiwi Farms has been taken down because the internet service providers decided that it was too extreme and they could not possibly host it any more. That was eventually recognised and change was made, but the influence that that small place had on lives—the difficulties and harm it caused—is untold. Some of that did tip over into illegality, but some did not.

I also want to talk about the places where there is a significant amount of pornography. I am not going to say that I have a problem with pornography online; the internet will always have pornography on it. It attracts a chunk of people to spend time online, and some of that pornography is on large mainstream sites. Searches for incest, underage girls, or black women being abused all get massive numbers of hits. There is a significant amount of pornography on these sites that is illegal, that pretends to be illegal or that acts against people with protected characteristics. Research has found that a significant proportion—significantly more than a half—of pornography on mainstream sites that involves black women also involves violence. That is completely and totally unacceptable, and has a massive negative impact on society, whereby it reinforces negativity and discrimination against groups that are already struggling with being discriminated against and that do not experience the privilege of a cis white man.

It is really grim that we are requiring a number of users to be specified, when we know the harm caused by platforms that do not have 10 million or 20 million United Kingdom users. I do not know what the threshold will be, but I know it will be too high to include a lot of platforms that have a massive effect. The amendment is designed specifically to give Ofcom the power to designate as category 1 any service that it thinks has a very high risk of harm; I have not set the bar particularly low. Now that the Minister has increased the levels of transparency that will be required for category 1 platforms, it is even more important that we subject extremist sites and platforms—the radicalising ones, which are perpetuating discrimination—to a higher bar and require them to have the transparency that they need as a category 1 service. This is a place where the Bill could really make a difference and change lives, and I am really concerned that it is massively failing to do so.

The reason I have said that it should be Ofcom’s responsibility to designate category 1 services is that it has the experts who will be looking at all the risk assessments, dealing with companies on a day-to-day basis, and seeing the harms and transparencies that the rest of us will not be able to see. The reporting mechanisms will be public for only some of the category 1 platforms, and we will not be able to find out the level of information that Ofcom has, so it is right that it should be responsible for designating sites as having a very high risk of harm. That is why I tabled the amendment, which would make a massive difference to people who are the most discriminated against as it is and who are the most at risk of harm from extremism. I urge the Minister to think again.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I rise briefly to support everything the hon. Member for Aberdeen North just said. We have long called for the Bill to take a harm-led approach; indeed, the Government initially agreed with us: in its first iteration, it was called the Online Harms Bill rather than the Online Safety Bill. Addressing harm must be a central focus of the Bill, as we know that extremist content is perpetuated on smaller, high-harm platforms; a harm-led approach is something that the Antisemitism Policy Trust and Hope not Hate have long called for with regard to the Bill.

I want to put on the record our huge support for the amendment. Should the hon. Lady be willing to push it to a vote—I recognise that we are small in number—we will absolutely support her.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I want to speak briefly to the amendment. I totally understand the reasons that the hon. Member for Aberdeen North has tabled it, but in reality, the kinds of activities she describes would be captured anyway, because most would fall within the remit of the priority illegal harms that all platforms and user-to-user services have to follow. If there were occasions when they did not, being included in category 1 would mean that they would be subject to the additional transparency of terms of service, but the smaller platforms that allow extremist behaviour are likely to have extremely limited terms of service. We would be relying on the priority illegal activity to set the minimum safety standards, which Ofcom would be able to do.

It would also be an area where we would want to move at pace. Even if we wanted to bring in extra risk assessments on terms of service that barely exist, the time it would take to do that would not give a speedy resolution. It is important that in the way Ofcom exercises its duties, it does not just focus on the biggest category 1 platforms but looks at how risk assessments for illegal activity are conducted across a wide range of services in scope, and that it has the resources needed to do that.

Even within category 1, it is important that is done. We often cite TikTok, Instagram and Facebook as the biggest platforms, but I recently spoke to a teacher in a larger secondary school who said that by far the worst platform they have to deal with in terms of abuse, bullying, intimidation, and even sharing of intimate images between children, is Snapchat. We need to ensure that those services get the full scrutiny they should have, because they are operating at the moment well below their stated terms of service, and in contravention of the priority illegal areas of harm.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We now come to Government amendments 54 and 55 to clause 115.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I do not wish to test the Committee’s patience. I know we need to get the Bill over the line quickly, so I do not wish to delay it by going over old ground that we covered in the previous Public Bill Committee on clauses that we support. We do support the Government on this clause, but I will make some brief comments because, as we know, clause 115 is important. It lists the enforceable requirements, failure to comply with which can trigger enforcement action.

None Portrait The Chair
- Hansard -

Order. I think the hon. Lady is speaking to clause 115. This is Government amendments 54 and 55 to clause 115. I will call you when we get to that place, which will be very soon, so stay alert.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Apologies, Dame Angela. I got carried away.

Amendments made: 54, in clause 115, page 98, leave out lines 35 and 36.

This amendment is consequential on Amendments 6 and 7 (removal of clauses 12 and 13).

Amendment 55, in clause 115, page 99, line 19, at end insert—

“Section (Duty not to act against users except in accordance with terms of service)

Acting against users only in accordance with terms of service

Section (Further duties about terms of service)

Terms of service”—(Paul Scully.)

This amendment ensures that OFCOM are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the new duties imposed by NC3 and NC4.

Question proposed, That the clause, as amended, stand part of the Bill.

None Portrait The Chair
- Hansard -

We now come to clause 115 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Thank you, Dame Angela—take 2.

Clause 115 focuses on the enforcement action that may be taken if a platform fails to comply. Given that the enforceable requirements may include, for example, duties to carry out and report on risk assessments and general safety duties, it is a shame that the Government have not seen the merits of going further with these provisions. I point the Minister to the previous Public Bill Committee, where Labour made some sensible suggestions for how to remedy the situation. Throughout the passage of the Bill, we have made it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment.

We cannot and should not rely solely on Ofcom to act as problems arise, when they could be spotted earlier by experts elsewhere. We have already heard the Minister outline the immense task that Ofcom has ahead of it to monitor risk assessments and platforms, ensuring that platforms comply and taking action where there is illegal content and a risk to children. It is important that Ofcom has at its disposal all the help it needs.

It would be helpful if there were more transparency about how the enforcement provisions work in practice. We have repeatedly heard that without independent researchers accessing data on relevant harm, platforms will have no real accountability for how they tackle online harm. I hope that the Minister can clarify why, once again, the Government have not seen the merit of encouraging transparency in their approach. It would be extremely valuable and helpful to both the online safety regime and the regulator as a whole, and it would add merit to the clause.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

We have talked about the fact that Ofcom will have robust enforcement powers. It can direct companies to take specific steps to come into compliance or to remedy failure to comply, as well as issue fines and apply to the courts for business disruption measures. Indeed, Ofcom can institute criminal proceedings against senior managers who are responsible for compliance with an information notice, when they have failed to take all reasonable steps to ensure the company’s compliance with that notice. That criminal offence will commence two months after Royal Assent.

Ofcom will be required to produce enforcement guidelines, as it does in other areas that it regulates, explaining how it proposes to use its enforcement powers. It is important that Ofcom is open and transparent, and that companies and people using the services understand exactly how to comply. Ofcom will provide those guidelines. People will be able to see who are the users of the services. The pre-emptive work will come from the risk assessments that platforms themselves will need to produce.

We will take a phased approach to bringing the duties under the Bill into effect. Ofcom’s initial focus will be on illegal content, so that the most serious harms can be addressed as soon as possible. When those codes of practice and guidelines come into effect, the hon. Member for Pontypridd will see some of the transparency and openness that she is looking for.

Question put and agreed to.

Clause 115, as amended, accordingly ordered to stand part of the Bill.

Clause 155

Review

Amendment made: 56, in clause 155, page 133, line 27, after “Chapter 1” insert “or 2A”.—(Paul Scully.)

Clause 155 is about a review by the Secretary of State of the regulatory framework established by this Bill. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My hon. Friend is absolutely right. The report, as it stands, obviously has to be laid before Parliament and will form part of the package of parliamentary scrutiny. But, yes, we will consider how we can utilise the expertise of both Houses in post-legislative scrutiny. We will come back on that.

Question put and agreed to.

Clause 155, as amended, accordingly ordered to stand part of the Bill.

Clause 169

Individuals providing regulated services: liability

Amendment made: 57, in clause 169, page 143, line 15, at end insert—

“(fa) Chapter 2A of Part 4 (terms of service: transparency, accountability and freedom of expression);”.—(Paul Scully.)

Clause 169 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6, so that individuals may be jointly and severally liable for the duties imposed by that Chapter.

Clause 169, as amended, ordered to stand part of the Bill.

Clause 183 ordered to stand part of the Bill.

Schedule 17

Video-sharing platform services: transitional provision etc

Amendments made: 94, in schedule 17, page 235, line 43, leave out paragraph (c).

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 95, in schedule 17, page 236, line 27, at end insert—

“(da) the duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (terms of service);”.—(Paul Scully.)

This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by NC3 and NC4 during the transitional period.

Question proposed, That the schedule, as amended, be the Seventeenth schedule to the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour welcomes schedule 17, which the Government introduced on Report. We see this schedule as clarifying exactly how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework. The schedule is fundamentally important for both providers and users, as it establishes the formal requirements on these platforms as we move those requirements into this new legislation.

We welcome the clarification in paragraph 1(1) of the definition of a qualifying video-sharing service. On that point, I would be grateful if the Minister clarified the situation around livestreaming video platforms and whether this schedule would also apply to them. Throughout this Bill Committee, we have heard just how dangerous and harmful live video-sharing platforms can be, so this is an important point to clarify.

I have spoken at length about the importance of capturing the harms on these platforms, particularly in the context of child sexual exploitation being livestreamed online, which, thanks to the brilliant work of International Justice Mission, we know is a significant and widespread issue. I must make reference to the IJM’s findings from its recent White Paper, which highlighted the extent of the issue in the Philippines, a country widely recognised as a source of livestreamed sexual exploitation of children. It found that traffickers often use cheap Android smartphones with pre-paid cellular data services to communicate with customers and produce and distribute explicit material. To reach the largest possible customer base, they often connect with sexually motivated offenders through everyday technology—the same platforms that the rest of us use to communicate with friends, family and co-workers.

One key issue in assessing the extent of online sexual exploitation of children is that we are entirely dependent on the detection of the crime, but most of the current technologies widely used to detect the various forms of online sexual exploitation of children are not designed to recognise livestreaming video services. This is an important and widespread issue, so I hope the Minister can assure me that the provisions in the schedule will apply to those platforms too.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

We are setting out in schedule 17 how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to these providers as they transition to the online safety framework. My understanding is that it does include livestreaming, but I will obviously write to the hon. Lady if I have got that wrong. I am not sure there is a significant legal effect here. To protect children and treat services fairly while avoiding unnecessary burdens on business, we are maintaining the current user protections in the VSP regime while the online safety framework is being implemented. That approach to transition avoids the duplication of regulation.

Question put and agreed to.

Schedule 17, as amended, accordingly agreed to.

Clause 203

Interpretation: general

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

It is the functionalities around it that enable the voice conversation to happen.

Question put and agreed to.

Clause 203, as amended, accordingly ordered to stand part of the Bill.

Clause 206

Extent

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I would like to welcome the Government’s clarification, particularly as an MP representing a devolved nation within the UK. It is important to clarify the distinction between the jurisdictions, and I welcome that this clause does that.

Question put and agreed to.

Clause 206 accordingly ordered to stand part of the Bill.

Clause 207

Commencement and transitional provision

Amendment made: 60, in clause 207, page 173, line 15, leave out “to” and insert “and”.—(Paul Scully.)

This amendment is consequential on amendment 41 (removal of clause 55).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour welcomes clause 207, which outlines the commencement and transitional provisions for the Bill to effectively come into existence. The Minister knows that Labour is concerned about the delays that have repeatedly held up the Bill’s progress, and I need not convince him of the urgent need for it to pass. I think contributions in Committee plus those from colleagues across the House as the Bill has progressed speak for themselves. The Government have repeatedly claimed they are committed to keeping children safe online, but have repeatedly failed to bring forward this legislation. We must now see commitments from the Minister that the Bill, once enacted, will make a difference right away.

Labour has specific concerns shared with stakeholders, from the Age Verification Providers Association to the Internet Watch Foundation, the NSPCC and many more, about the road map going forward. Ofcom’s plan for enforcement already states that it will not begin enforcement on harm to children from user-to-user content under part 3 of the Bill before 2025. Delays to the Bill as well as Ofcom’s somewhat delayed enforcement plans mean that we are concerned that little will change in the immediate future or even in the short term. I know the Minister will stand up and say that if the platforms want to do the right thing, there is nothing stopping them from doing so immediately, but as we have seen, they need convincing to take action when it counts, so I am not convinced that platforms will do the right thing.

Charlotte Nichols Portrait Charlotte Nichols (Warrington North) (Lab)
- Hansard - - - Excerpts

If the Government’s argument is that there is nothing to stop platforms taking such actions early, why are we discussing the Bill at all? Platforms have had many years to implement such changes, and the very reason we need this Bill is that they have not done so.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Exactly. My hon. Friend makes an incredibly important point that goes to the heart of why we are here in the first place. If the platforms were not motivated by commercial interest and we could trust them to do the right thing on keeping children safe and reducing harm on their platforms, we would not require this legislation at all. But sadly, we are where we are, which is why it is even more imperative that we get on with the job, that Ofcom is given the tools to act swiftly and to keep to a minimum the time before the provisions come into effect, and that this legislation is enacted so that it actually makes a lasting difference.

Ofcom has already been responsible for regulating video-sharing platforms for two years, yet still, despite being in year 3, it is only asking websites to provide a plan as to how they will be compliant. The reality is that we can therefore expect little on child protection before 2027-28, which creates a massive gap between that timetable and public expectations of what will happen once the Bill is passed. We raised these concerns previously and felt little assurance from the then Minister, so I wonder whether the current Minister can improve on his predecessor by setting out a short timeline for exactly when the Bill can be implemented and Ofcom can act.

We all understand the need for the Bill, as my hon. Friend the Member for Warrington North just pointed out. That is why we have been supportive in Committee and throughout the passage of the Bill. But the measures that the Bill introduces must come into force as soon as is reasonably possible. Put simply, the industry is ready, and users want to be protected online and are ready too. Sadly, it is only the Government and the regulator that could potentially hold up implementation of the legislation.

The Minister has failed to concede on any of the issues that we have raised in Committee, despite being sympathetic and supportive. His predecessor was also incredibly supportive and sympathetic on everything we raised in Committee, yet failed to take into account a single amendment or issue that we raised. I therefore make a plea to this Minister at least to recognise the need to press ahead, and the timescale that is needed here. We have not sought to formally amend this clause, so I seek the Minister’s assurance that this legislation will be dealt with swiftly. I urge him to work with Labour, SNP colleagues and colleagues across the House to ensure that the legislation and the provisions in it are enacted and that there are no further unnecessary delays.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Our intention is absolutely to get this regime operational as soon as possible after Royal Assent. We have to get to Royal Assent first, so I am looking forward to working with all parties in the other House to get the legislation to that point. After that, we have to ensure that the necessary preparations are completed effectively and that service providers understand exactly what is expected of them. To answer the point made by the hon. Member for Warrington North about service providers, the key difference from what happened in the years that led to this legislation being necessary is that they will now know exactly what is expected of them—and it is literally being expected of them, with legislation and with penalties coming down the line. They should not need to wait for the day one switch-on. They can test and work through things to ensure that the system works on day one, and they can do that months earlier.

The legislation does require some activity that can be carried out only after Royal Assent, such as public consultation or laying of secondary legislation. The secondary legislation is important. We could have put more stuff in primary legislation, but that would belie the fact that we are trying to make this as flexible as possible, for the reasons that we have talked about. It is so that we do not have to keep coming back time and again for fear of this being out of date almost before we get to implementation in the first place.

However, we are doing things at the moment. Since November 2020, Ofcom has been regulating harmful content online through the video-sharing platform regulatory regime. In December 2020, the Government published interim codes of practice on terrorist content and activity, and on sexual exploitation and abuse online. Those will help to bridge the gap until the regulator becomes operational. In June 2021, we published “safety by design” guidance and information, a one-stop shop for companies on protecting children online. In July 2021, we published the first Government online media literacy strategy. We do encourage stakeholders, users and families to engage with and help to promote that wealth of material to minimise online harms and the threat of misinformation and disinformation. But clearly, we all want this measure to be on the statute book and implemented as soon as possible. We have talked a lot about child protection, and that is the core of what we are trying to do here.

Question put and agreed to.

Clause 207, as amended, accordingly ordered to stand part of the Bill.

New Clause 1

OFCOM’s guidance: content that is harmful to children and user empowerment

“(1) OFCOM must produce guidance for providers of Part 3 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be—

(a) primary priority content that is harmful to children, or

(b) priority content that is harmful to children.

(2) OFCOM must produce guidance for providers of Category 1 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be, content to which section 14(2) applies (see section 14(8A)).

(3) Before producing any guidance under this section (including revised or replacement guidance), OFCOM must consult such persons as they consider appropriate.

(4) OFCOM must publish guidance under this section (and any revised or replacement guidance).”—(Paul Scully.)

This new clause requires OFCOM to give guidance to providers in relation to the kinds of content that OFCOM consider to be content that is harmful to children and content relevant to the duty in clause 14(2) (user empowerment).

Brought up, and read the First time.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

The Government are committed to empowering adults to have greater control over their online experience, and to protecting children from seeing harmful content online. New clause 1 places a new duty on Ofcom to produce and publish guidance for providers of user-to-user regulated services, in relation to the crucial aims of empowering adults and of ensuring that providers have effective systems and processes in place. The guidance will provide further clarity, including through

“examples of content or kinds of content that OFCOM consider to be…primary priority”

or

“priority content that is harmful to children.”

Ofcom will also have to produce guidance that sets out examples of content that it considers to be relevant to the user empowerment duties, as set out in amendment 15 to clause 14.

It is really important that expert opinion is considered in the development of this guidance, and the new clause places a duty on Ofcom to consult with relevant persons when producing sets of guidance. That will ensure that the views of subject matter experts are reflected appropriately.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour is pleased to see the introduction of the new clause, which clarifies the role of Ofcom in delivering guidance to providers about their duties. Specifically, the new clause will require Ofcom to give guidance to providers on the kind of content that Ofcom considers to be harmful to children, or relevant to the user empowerment duty in clause 14. That is a very welcome addition indeed.

Labour remains concerned about exactly how these so-called user empowerment tools will work in practice—we have discussed that at length—and let us face it: we have had little assurance from the Minister on that point. We welcome the new clause, as it clarifies what guidance providers can expect to receive from Ofcom once the Bill is finally enacted. We can all recognise that Ofcom has a colossal task ahead of it—the Minister said so himself—so it is particularly welcome that the guidance will be subject to consultation with those that it deems appropriate. I can hope only that that will include the experts, and the many groups that provided expertise, support and guidance on internet regulation long before the Bill even received its First Reading, a long time ago. There are far too many of those experts and groups to list, but it is fundamental that the experts who often spot online harms before they properly emerge be consulted and included in this process if we are to truly capture the priority harms to children, as the new clause intends.

We also welcome the clarification in subsection (2) that Ofcom will be required to provide “examples of content” that would be considered to be—or not be—harmful. These examples will be key to ensuring that the platforms have nowhere to hide when it comes to deciding what is harmful; there will be no grey area. Ofcom will have the power to show them exact examples of what could be deemed harmful.

We recognise, however, that there is subjectivity to the work that Ofcom will have to do once the Bill passes. On priority content, it is most important that providers are clear about what is and is not acceptable; that is why we welcome the new clause, but we do of course wish that the Government applied the same logic to harm pertaining to adults online.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am also happy to support new clause 1, but I have a couple of questions. It mentions that “replacement guidance” may be provided, which is important because, as we have said a number of times, things will change, and we will end up with a different online experience; that can happen quickly. I am glad that Ofcom has the ability to refresh and update the guidance.

My question is about timelines. There do not seem to be any timelines in the new clause for when the guidance is required to be published. It is key that the guidance be published before companies and organisations have to comply with it. My preference would be for it to be published as early as possible. There may well need to be more work, and updated versions of the guidance may therefore need to be published, but I would rather companies had an idea of the direction of travel, and what they must comply with, as soon as possible, knowing that it might be tweaked. That would be better than waiting until the guidance was absolutely perfect and definitely the final version, but releasing it just before people had to start complying with it. I would like an assurance that Ofcom will make publishing the guidance a priority, so that there is enough time to ensure compliance. We want the Bill to work; it will not work if people do not know what they have to comply with. Assurance on that would be helpful.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

That was some stretch of procedure, Dame Angela, but we got there in the end. This new clause is about child user empowerment duties. I am really pleased that the Government have user empowerment duties in the Bill—they are a good thing—but I am confused as to why they apply only to adult users, and why children do not deserve the same empowerment rights over what they access online.

In writing the new clause, I pretty much copied clause 14, before there were any amendments to it, and added a couple of extra bits: subsections (8) and (9). In subsection (8), I have included:

“A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.”

That would go a step further than the verification process and allow users to approve only people who are in their class at school, people with whom they are friends, or even certain people in their class at school, and to not have others on that list. I know that young people playing Fortnite—I have mentioned Fortnite a lot because people play it a lot—or Roblox are contacted by users whom they do not know, and there is no ability for young people to switch off some of the features while still being able to contact their friends. Users can either have no contact from anyone, or they can have a free-for-all. That is not the case for all platforms, but a chunk of them do not let users speak only to people on their friends list, or receive messages only from people on the list.

My proposed subsection (8) would ensure that children could have a “white list” of people who they believe are acceptable, and who they want to be contacted by, and could leave others off the list. That would help tackle not just online child exploitation, but the significant online bullying that teachers and children report. Children have spoken of the harms they experience as a result of people bullying them and causing trouble online; the perpetrators are mainly other children. Children would be able to remove such people from the list and so would not receive any content, messages or comments from those who make their lives more negative.

Subsection (9) is related to subsection (8); it would require a service to include

“features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.”

Adults looking to exploit children will use private messaging on platforms such as Instagram. Instagram has to know how old its users are, so anybody who is signed up to it will have had to provide it with their date of birth. It is completely reasonable for a child to say, “I want to filter out everything from an adult.” When we talk about children online, we are talking about anybody from zero to 18, which is a very wide age range. Some of those people will be working and paying bills, but will not have access to the empowerment features that adults have access to, because they have not yet reached that magical threshold. Some services may decide to give children access to user empowerment tools, but there is no requirement to. The only requirement in the Bill on user empowerment tools is for adults. That is not fair.

Children should have more control over the online environment. We know how many children feel sad as a result of their interactions online, and how many encounter content online that they wish they had never seen and cannot unsee. We should give them more power over that, and more power to say, “No, I don’t want to see that. I don’t want people I don’t know contacting me. I don’t want to get unsolicited messages. I don’t want somebody messaging me, pretending that they are my friend or that they go to another school, when they are in fact an adult, and I won’t realise until it is far too late.”

The Bill applies to people of all ages. All of us make pretty crappy decisions sometimes. That includes teenagers, but they also make great decisions. If there was a requirement for them to have these tools, they could choose to make their online experience better. I do not think this was an intentional oversight, or that the Government set out to disadvantage children when they wrote the adult user empowerment clauses. I think they thought that it would be really good to have those clauses in the Bill, in order to give users a measure of autonomy over their time and interactions online. However, they have failed to include the same thing for children. It is a gap.

I appreciate that there are child safety duties, and that there is a much higher bar for platforms that have child users, but children are allowed a level of autonomy; look at the UN convention on the rights of the child. We give children choices and flexibilities; we do not force them to do every single thing they do, all day every day. We recognise that children should be empowered to make decisions where they can.

I know the Government will not accept the provision—I am not an idiot. I have never moved a new clause in Committee that has been accepted, and I am pretty sure that it will not happen today. However, if the Government were to say that they would consider, or even look at the possibility of, adding child user empowerment duties to the Bill, the internet would be a more pleasant place for children. They are going to use it anyway; let us try to improve their online experience even more than the Bill does already.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

The hon. Member for Aberdeen North has outlined the case for the new clause eloquently and powerfully. She may not press it to a Division, if the Minister can give her assurances, but if she did, she would have the wholehearted support of the Opposition.

We see new clause 8 as complementing the child safety duties in the legislation. We fully welcome provisions that provide children with greater power and autonomy in choosing to avoid exposure to certain types of content. We have concerns about how the provisions would work in practice, but that issue has more to do with the Government’s triple-shield protections than the new clause.

The Opposition support new clause 8 because it aims to provide further protections, in addition to the child safety duties, to fully protect children from harmful content and to empower them. It would empower and enable them to filter out private messages from adults or non-verified users. We also welcome the measures in the new clause that require platforms and service providers to design accessible terms of service. That is absolutely vital to best protect children online, which is why we are all here, and what the legislation was designed for.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The aim of the user empowerment duty is to give adults more control over certain categories of legal content that some users will welcome greater choice over. Those duties also give adult users greater control over who they interact with online, but these provisions are not appropriate for children. As the hon. Member for Aberdeen North acknowledged, there are already separate duties on services likely to be accessed by children, in scope of part 3, to undertake comprehensive risk assessments and to comply with safety duties to protect children from harm. That includes requirements to assess how specific functionalities may facilitate the spread of harmful content, as outlined in clause 10(6)(e), and to protect children from harmful content, including content that has been designated as priority harmful content, by putting in place age-appropriate protections.

As such, children will not need to be provided with tools to control any harmful content they see, as the platform will need to put in place age-appropriate protections. We do not want to give children an option to choose to see content that is harmful to them. The Bill also outlines in clause 11(4)(f) that, where it is proportionate to do so, service providers will be required to take measures in certain areas to meet the child-safety duties. That includes functionalities allowing for control over content that is encountered. It would not be appropriate to require providers to offer children the option to verify their identity, due to the safeguarding and data protection risks that that would pose. Although we expect companies to use technologies such as age assurance to protect children on their service, such technologies would be used only to establish age, not identity.

The new clause would create provisions to enable children to filter out private messages from adults and users who are not on an approved list, but the Bill already contains provisions that address the risks of adults contacting children. There are also requirements on service providers to consider how their service could be used for grooming or child sexual exploitation and abuse, and to apply proportionate measures to mitigate those risks. The service providers already have to assess and mitigate the risks. They have to provide the risk assessment, and within it they could choose to mitigate risk by putting in place measures that prevent unknown users from contacting children.

For the reasons I have set out, the Bill already provides strong protections for children on services that they are likely to access. I am therefore not able to accept the new clause, and I hope that the hon. Member for Aberdeen North will withdraw it.

--- Later in debate ---
Rachel Maclean Portrait Rachel Maclean (Redditch) (Con)
- Hansard - - - Excerpts

I would like to build on the excellent comments from my colleagues and to speak about child sexual abuse material. I thank my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone for tabling the amendment. I am very interested in how we can use the excellent provisions in the Bill to keep children safe from child sexual abuse material online. I am sure the Committee is aware of the devastating impact of such material.

Sexual abuse imagery—of girls in particular—is increasingly prevalent. We know that 97% of this material in 2021 showed female children. The Internet Watch Foundation took down a record-breaking 252,000 URLs that had images of children being raped, and seven in 10 of those images were of children aged 11 to 13. Unfortunately, the National Crime Agency estimates that between 550,000 and 850,000 people in the UK are searching for such material on the internet. They are actively looking for it, and at the moment they are able to find it.

My concern is with how we use what is in the Bill already to instil a top-down culture in companies, because this is about culture change in the boardroom, so that safety is considered with every decision. I have read the proceedings from previous sittings, and I recognise that the Government and Ministers have said that we have sufficient provisions to protect children, but I think there is a little bit of a grey area with tech companies.

I want to mention Apple and the update it had been planning for quite a few years, which would have automatically scanned for child sexual abuse material. Apple withdrew it following a backlash from encryption and privacy experts, who claimed it would undermine the privacy and security of iCloud users and make people less safe on the internet. Having previously said that it would pause it to improve it, Apple now says that it has stopped it altogether and that it is vastly expanding its end-to-end encryption, even though law enforcement agencies around the world, including our own UK law enforcement agencies, have expressed serious concerns because it makes investigations and prosecution more challenging. Not all of us are technical experts, and I do not believe that we are in a position to judge how legitimate it is for Apple to have this pause. What we do know is that while there is this pause, the risks for children are still there, proliferating online.

We understand completely that countering this material involves a complicated balance and that the tech giants need to walk a fine line between keeping users safe and keeping their data safe. But the question is this: if Apple and others continue to delay or backtrack, will merely failing to comply with an information request, which is what is in the Bill now, be enough to protect children from harm? Could they delay indefinitely and still be compliant with the Bill? That is what I am keen to hear from the Minister. I would be grateful if he could set out why he thinks that individuals who have the power to prevent the harmful content that has torn apart the lives of so many young people and their families should not face criminal consequences if they fail to do so. Can he reassure us as to how he thinks that the Bill can protect so many children—it is far too many children—from this material online?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Labour supports new clause 9, as liability is an issue that we have repeatedly raised throughout the passage of the Bill—most recently, on Report. As colleagues will be aware, the new clause would introduce criminal liabilities for directors who failed to comply with their duties. This would be an appropriate first step in ensuring a direct relationship between the senior management of platforms and companies and their responsibilities to protect children from significant harm. As we have heard, this measure would drive a more effective culture of awareness and accountability in relation to online safety at the top of, and within, the entire regulated firm. It would go some way towards ensuring that online safety was at the heart of internal governance structures. The Bill must go further to actively promote cultural change and put online safety at the forefront of business models; it must ensure that these people are aware that it is about keeping people safe, and that that must come before any profit. A robust corporate and senior management liability scheme is needed, and it needs to be one that imposes personal liability on directors when they put children at risk.

The Minister knows as well as I do that the benefits of doing so would be strong. We have only to turn to the coroner’s comments in the tragic case of Molly Russell’s death—which I know we are all mindful of as we debate this Bill—to fully understand the damaging impact of viewing harmful content online. I therefore urge the Minister to accept new clause 9, which we wholeheartedly support.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The Government recognise that the intent behind the new clause is to create new criminal offences of non-compliance with selected duties. It would establish a framework for personal criminal offences punishable through fines or imprisonment. It would mean that providers committed a criminal offence if they did not comply with certain duties.

We all want this Bill to be effective. We want it to be on the statute book. It is a question of getting that fine balance right, so that we can properly hold companies to account for the safety of their users. The existing approach to enforcement and senior manager liability strikes the right balance between robust enforcement and deterrence, and ensuring that the UK remains an attractive place to do business. We are confident that the Bill as a whole will bring about the change necessary to ensure that users, especially younger users, are kept safe online.

This new clause tries to criminalise not complying with the Bill’s duties. Exactly what activity would be criminalised is not obvious from the new clause, so it could be difficult for individuals to foresee exactly what type of conduct would constitute an offence. That could lead to unintended consequences, with tech executives driving an over-zealous approach to content take-down for fear of imprisonment, and potentially removing large volumes of innocuous content and so affecting the ability for open debate to take place.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I look forward to continuing the debate on Report.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank you, too, Dame Angela. I echo the Minister’s sentiments, and thank all the Clerks, the Doorkeepers, the team, and all the stakeholders who have massively contributed, with very short turnarounds, to the scrutiny of this legislation. I have so appreciated all that assistance and expertise, which has helped me, as shadow Minister, to compile our comments on the Bill following the Government’s recommittal of it to Committee, which is an unusual step. Huge thanks to my colleagues who joined us today and in previous sittings, and to colleagues from across the House, and particularly from the SNP, a number of whose amendments we have supported. We look forward to scrutinising the Bill further when it comes back to the House in the new year.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank you, Dame Angela, as well as Sir Roger for chairing our debates. Recommittal has been a very odd and unusual process; it has been a bit like groundhog day, discussing things we have discussed previously. I very much appreciate the hard work of departmental and Ofcom staff that went into making this happen, as well as the work of the Clerks, the Doorkeepers, and the team who ensured that we have a room that is not freezing—that has been really helpful.

I thank colleagues from across the House, particularly the Labour Front-Bench spokespeople, who have been incredibly helpful in supporting our amendments. This has been a pretty good-tempered Committee and we have all got on fairly well, even though we have disagreed on a significant number of issues. I am sure we will have those arguments again on Report.

Online Safety Bill

Alex Davies-Jones Excerpts
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- View Speech - Hansard - -

I beg to move, That the clause be read a Second time.

Rosie Winterton Portrait Madam Deputy Speaker
- Hansard - - - Excerpts

With this it will be convenient to discuss the following:

New clause 2—Offence of failing to comply with a relevant duty

‘(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.

(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—

(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or

(b) was a person purporting to act in such a capacity.

(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).

(4) In this section, “relevant duty” means a duty provided for by section 11 of this Act.’

This new clause makes it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in clause 11. Where the offence is committed with the consent or connivance of a senior manager or other officer of the provider, or is attributable to their neglect, the officer, as well as the entity, is guilty of the offence.

New clause 3—Child user empowerment duties

‘(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.’

New clause 4—Safety duties protecting adults and society: minimum standards for terms of service

‘(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).

(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.

(3) OFCOM must, at least once a year, conduct a review of—

(a) the extent to which providers are meeting the minimum standards, and

(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.

(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.

(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.

(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.’

New clause 5—Harm to adult and society risk assessment duties

‘(1) This section sets out the duties about risk assessments which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).

(2) A duty to carry out a suitable and sufficient harm to adults and society risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep a harm to adults and society risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient harm to adults and society risk assessment relating to the impacts of that proposed change.

(5) A “harm to adults and society risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults and society (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults and society presented by different kinds of priority content that is harmful to adults and society;

(d) the level of risk of harm to adults and society presented by priority content that is harmful to adults and society which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults and society, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults and society;

(g) the nature, and severity, of the harm that might be suffered by adults and society from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 85 which relate to the risk of harm to adults and society presented by priority content that is harmful to adults and society.

(7) See also—

(a) section 19(2) (records of risk assessments), and

(b) Schedule 3 (timing of providers’ assessments).’

New clause 6—Safety duties protecting adults and society

‘(1) This section sets out the duties to prevent harms to adults and society which apply in relation to Category 1 services.

(2) A duty to summarise in the terms of service the findings of the most recent adults and society risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults and society).

(3) If a provider decides to treat a kind of priority content that is harmful to adults and society in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults and society which a provider decides to treat in one of those ways).

(4) These are the kinds of treatment of content referred to in subsection (3)—

(a) taking down the content;

(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content;

(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults and society (as identified in the most recent adults and society risk assessment of the service), by reference to—

(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults and society present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) In this section—

“harm to adults and society risk assessment” has the meaning given by section [harm to adults and society risk assessment duties];

“non-designated content that is harmful to adults and society” means content that is harmful to adults and society other than priority content that is harmful to adults and society.

(9) See also, in relation to duties set out in this section, section 18 (duties about freedom of expression and privacy).’

New clause 7—“Content that is harmful to adults and society” etc

‘(1) This section applies for the purposes of this Part.

(2) “Priority content that is harmful to adults and society” means content of a description designated in regulations made by the Secretary of State as priority content that is harmful to adults and society.

(3) “Content that is harmful to adults and society” means—

(a) priority content that is harmful to adults and society, or

(b) content, not within paragraph (a), of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.

(4) For the purposes of this section—

(a) illegal content (see section 53) is not to be regarded as within subsection (3)(b), and

(b) content is not to be regarded as within subsection (3)(b) if the risk of harm flows from—

(i) the content’s potential financial impact,

(ii) the safety or quality of goods featured in the content, or

(iii) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).

(5) References to “priority content that is harmful to adults and society” and “content that is harmful to adults and society” are to be read as—

(a) limited to content within the definition in question that is regulated user-generated content in relation to a regulated user-to-user service, and

(b) including material which, if it were present on a regulated user-to-user service, would be content within paragraph (a) (and this section is to be read with such modifications as may be necessary for the purpose of this paragraph).

(6) Sections 55 and 56 contain further provision about regulations made under this section.’

Government amendments 1 to 4.

Amendment 44, clause 11, page 10, line 17, at end insert ‘, and—

“(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”’

Amendment 82, page 10, line 25, at end insert—

‘(3A) Content under subsection (3) includes content that may result in serious harm or death to a child while crossing the English Channel with the aim of entering the United Kingdom in a vessel unsuited or unsafe for those purposes.’

This amendment would require proportionate systems and processes, including removal of content, to be in place to control the access by young people to material which encourages them to undertake dangerous Channel crossings where their lives could be lost.

Amendment 83, page 10, line 25, at end insert—

‘(3A) Content promoting self-harm, including content promoting eating disorders, must be considered as harmful.’

Amendment 84, page 10, line 25, at end insert—

‘(3A) Content which advertises or promotes the practice of so-called conversion practices of LGBTQ+ individuals must be considered as harmful for the purposes of this section.’

Amendment 45, page 10, line 36, leave out paragraph (d) and insert—

‘(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,’.

Amendment 47, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to livestreaming features.”’

Amendment 46, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to private messaging features.”’

Amendment 48, page 11, line 25, after ‘accessible’ insert ‘for child users.’

Amendment 43, clause 12, page 12, line 24, leave out ‘made available to’ and insert

‘in operation by default for’.

Amendment 52, page 12, line 30, after ‘non-verified users’ insert

‘and to enable them to see whether another user is verified or non-verified.’

This amendment would require Category 1 services to make visible to users whether another user is verified or non-verified.

Amendment 49, page 12, line 30, at end insert—

‘(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.’

Amendment 53, page 12, line 32, after ‘to’ insert ‘effectively’.

This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.

Amendment 55, page 18, line 15, at end insert—

‘(4A) Content that is harmful to adults and society.’

Amendment 56, clause 17, page 20, line 10, leave out subsection (6) and insert—

‘(6) The following kinds of complaint are relevant for Category 1 services—

(a) complaints by users and affected persons about content present on a service which they consider to be content that is harmful to adults and society;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—

(i) section [adults and society online safety],

(ii) section 12 (user empowerment),

(iii) section 13 (content of democratic importance),

(iv) section 14 (news publisher content),

(v) section 15 (journalistic content), or

(vi) section 18(4), (6) or (7) (freedom of expression and privacy);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is content that is harmful to adults and society;

(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be content that is harmful to adults and society.’

Amendment 57, clause 19, page 21, line 40, leave out ‘or 10’ and insert

‘, 10 or [harms to adults and society risk assessment duties]’.

Amendment 58, page 22, line 37, at end insert—

‘(ba) section [adults and society online safety] (adults and society online safety),’

Government amendment 5.

Amendment 59, clause 44, page 44, line 11, at end insert

‘or

(ba) section [adults and society online safety] (adults and society online safety);’

Government amendment 6.

Amendment 60, clause 55, page 53, line 43, at end insert—

‘(2A) The Secretary of State may specify a description of content in regulations under section [“Content that is harmful to adults and society” etc](2) (priority content that is harmful to adults and society) only if the Secretary of State considers that, in relation to regulated user-to-user services, there is a material risk of significant harm to an appreciable number of adults presented by content of that description that is regulated user-generated content.’

Amendment 61, page 53, line 45, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 62, page 54, line 8, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 63, page 54, line 9, leave out ‘are to children’ and insert

‘or adults are to children or adults and society’.

Government amendments 7 to 16.

Amendment 77, clause 94, page 85, line 42, after ‘10’ insert

‘, [Adults and society risk assessment duties]’.

Amendment 78, page 85, line 44, at end insert—

‘(iiia) section [Adults and society online safety] (adults and society online safety);’

Amendment 54, clause 119, page 102, line 22, at end insert—

‘Section [Safety duties protecting adults and society: minimum standards for terms of service]

Minimum standards for terms of service’



Amendment 79, page 102, line 22, at end insert—

‘Section [Harm to adults and society assessments]

Harm to adults and society risk assessments

Section [Adults and society online safety]

Adults and society online safety’



Government amendments 17 to 19.

Amendment 51, clause 207, page 170, line 42, after ‘including’ insert ‘but not limited to’.

Government amendments 20 to 23.

Amendment 81, clause 211, page 177, line 3, leave out ‘and 55’ and insert

‘, [“Content that is harmful to adults and society” etc] and 55’.

Government amendments 24 to 42.

Amendment 64, schedule 8, page 207, line 13, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 65, page 207, line 15, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 66, page 207, line 17, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 67, page 207, line 21, leave out ‘relevant content’ and insert

‘content that is harmful to adults and society, or other content which they consider breaches the terms of service.’

Amendment 68, page 207, line 23, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 69, page 207, line 26, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 70, page 208, line 2, leave out

‘or content that is harmful to children’

and insert

‘content that is harmful to children or priority content that is harmful to adults and society’.

Amendment 71, page 208, line 10, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 72, page 208, line 13, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 73, page 210, line 2, at end insert

‘“content that is harmful to adults and society” and “priority content that is harmful to adults and society” have the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 50, schedule 11, page 217, line 31, at end insert—

‘(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.’

Amendment 74, page 218, line 24, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 75, page 219, line 6, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 76, page 221, line 24, at end insert—

‘“priority content that is harmful to adults and society” has the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 80, page 240, line 35, in schedule 17, at end insert—

‘(ba) section [Harm to adults and society assessments] (Harm to adults and society assessments), and’.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Once again, it is a privilege to be back in the Chamber opening this debate—the third Report stage debate in recent months—on this incredibly important and urgently needed piece of legislation. I speak on behalf of colleagues across the House when I say that the Bill is in a much worse position than when it was first introduced. It is nevertheless vital that it now progresses to the other place. Although we are all pleased to see the Bill return today, the Government’s delays have been incredibly costly and we still have a long way to go before we see meaningful change for the better.

In December, during the last Report stage debate, we had the immense privilege to be joined in the Public Gallery by a number of the families who have all lost children in connection with online harms. It is these families whom we must keep in our mind when we seek to get the Bill over the line once and for all. As ever, I pay tribute to their incredible efforts in the most difficult of all circumstances.

Today’s debate is also very timely in that, earlier today, the End Violence Against Women and Girls coalition and Glitch, a charity committed to ending online abuse, handed in their petition, which calls on the Prime Minister to protect women and girls online. The petition has amassed more than 90,000 signatures and rising, so we know there is strong support for improving internet safety across the board. I commend all those involved on their fantastic efforts in raising this important issue.

It would be remiss of me not to make a brief comment on the Government’s last-minute U-turns in their stance on criminal sanctions. The fact that we are seeing amendments withdrawn at the last minute goes to show that this Government have absolutely no idea where they truly stand on these issues and that they are ultimately too weak to stand up against vested interests, whereas Labour is on the side of the public and has consistently put safety at the forefront throughout the Bill’s passage.

More broadly, I made Labour’s feelings about the Government’s highly unusual decision to send part of this Bill back to Committee a second time very clear during the previous debate. I will spare colleagues by not repeating those frustrations here, but let me be clear: it is absolutely wrong that the Government chose to remove safety provisions relating to “legal but harmful” content in Committee. That is a major weakening, not strengthening, of the Bill; everyone online, including users and consumers, will be worse off without those provisions.

The Government’s alternative proposal, to introduce a toggle to filter out harmful content, is unworkable. Replacing the sections of this Bill that could have gone some way towards preventing harm with an emphasis on free speech instead undermines the very purpose of the Bill. It will embolden abusers, covid deniers, hoaxers and others, who will feel encouraged to thrive online.

In Committee, the Government also chose to remove important clauses from the Bill that were in place to keep adults safe online. Without the all-important risk assessments for adults, I must press the Minister on an important point: exactly how will this Bill do anything to keep adults safe online? The Government know all that, but have still pursued a course of action that will see the Bill watered down entirely.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

Does my hon. Friend agree that, as we discussed in the Bill Committee, there is clear evidence that legal but harmful content is often the gateway to far more dangerous radicalisation and extremism, be it far-right, Islamist, incel or other? Will she therefore join me in supporting amendment 43 to ensure that by default such content is hidden from all adult users?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.

It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.

I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.

I will first address new clause 1, tabled in my name and that of my hon. Friend the Member for Manchester Central (Lucy Powell). This important addition to the Bill will go some way towards addressing the gaps around support for individual complaints. We in the Opposition have repeatedly queried Ministers and the Secretary of State on the mechanisms available to individuals who wish to make complaints or pursue appeals. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause will finally see the Secretary of State publishing a report on the options available to individuals.

We already know that the Bill in its current form fails to provide an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.

That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.

Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover at Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.

Andrew Gwynne Portrait Andrew Gwynne (Denton and Reddish) (Lab)
- Hansard - - - Excerpts

My hon. Friend is making an important point. She might not be aware of it, but I recently raised in the House the case of my constituents, whose 11-year-old daughter was groomed on the music streaming platform Spotify and was able to upload explicit photographs of herself on that platform. Thankfully, her parents found out and made several complaints to Spotify, which did not immediately remove that content. Is that not why we need the ombudsman?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I am aware of that case, which is truly appalling and shocking. That is exactly why we need such protections in the Bill: to stop those cases proliferating online, to stop the platforms from choosing their own terms of service, and to give Ofcom real teeth, as a regulator, to take on those challenges.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

Does the hon. Lady accept that the Bill does give Ofcom the power to set minimum safety standards based on the priority legal offences written into the Bill? That would cover almost all the worst kinds of offences, including child sexual exploitation, inciting violence and racial hatred, and so on. Those are the minimum safety standards that are set, and the Bill guarantees them.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

What is not in those minimum safety standards is all the horrendous and harmful content that I have described: covid disinformation, harmful content from state actors, self-harm promotion, antisemitism, misogyny and the incel culture, all of which is proliferating online and being amplified by the algorithms. This set of minimum safety standards can be changed overnight.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As the hon. Lady knows, foreign-state disinformation is covered because it is part of the priority offences listed in the National Security Bill, so those accounts can be disabled. Everything that meets the criminal threshold is in this Bill because it is in the National Security Bill, as she knows. The criminal thresholds for all the offences she lists are set in schedule 7 of this Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

That is just the problem, though, isn’t it? A lot of those issues would not be covered by the minimum standards—that is why we have tabled new clause 4—because they do not currently meet the legal threshold. That is the problem. There is a grey area of incredibly harmful but legal content, which is proliferating online, being amplified by algorithms and by influencers—for want of a better word—and being fed to everybody online. That content is then shared incredibly widely, and that is what is causing harm and disinformation.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Will the hon. Lady give way one more time?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

No, I will not. I need to make progress; we have a lot to cover and a lot of amendments, as I have outlined.

Under the terms of the Bill, platforms can issue whatever minimum standards they wish and then simply change them at will overnight. In tabling new clause 4, our intention is to ensure that the platforms are not able to avoid safety duties by changing their terms and conditions. As I have said, this group of amendments will give Ofcom the relevant teeth to act and keep everybody safe online.

We all recognise that there will be a learning curve for everyone involved once the legislation is enacted. We want to get that right, and the new clauses will ensure that platforms have specific duties to keep us safe. That is an important point, and I will continue to make it clear at every opportunity, because the platforms and providers have, for far too long, got away with zero regulation—nothing whatsoever—and enough is enough.

During the last Report stage, I made it clear that Labour considers individual liability essential to ensuring that online safety is taken seriously by online platforms. We have been calling for stronger criminal sanctions for months, and although we welcome some movement from the Government on that issue today, enforcement now ultimately rests on a narrower set of measures because the Government gutted much of the Bill before Christmas. That last-minute U-turn is another one to add to a long list, but to be frank, very little surprises me when it comes to this Government’s approach to law-making.

John Hayes Portrait Sir John Hayes (South Holland and The Deepings) (Con)
- Hansard - - - Excerpts

I have to say to the hon. Lady that to describe it as a U-turn is not reasonable. The Government have interacted regularly with those who, like her, want to strengthen the Bill. There has been proper engagement and constructive conversation, and the Government have been persuaded by those who have made a similar case to the one she is making now. I think that warrants credit, rather than criticism.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I completely disagree with the right hon. Member, because we voted on this exact amendment before Christmas in the previous Report stage. It was tabled in the name of my right hon. Friend the Member for Barking (Dame Margaret Hodge), and it was turned down. It was, word for word, exactly the same amendment. If that is not a U-turn, what is it?

I am pleased to support a number of important amendments in the names of the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I draw colleagues’ attention to new clause 3, which would improve the child empowerment duties in the Bill. The Government may think they are talking a good game on child safety, but it is clear to us all that some alarming gaps remain. The new clause would go some way to ensuring that the systems and processes behind platforms will go further in keeping children safe online.

In addition, we are pleased, as I have mentioned, to support amendment 43, which calls for the so-called safety toggle feature to be turned on by default. When the Government removed the clause relating to legal but harmful content in Committee, they instead introduced a requirement for platforms to give users the tools to reduce the likelihood of certain content appearing on their feeds. We have serious concerns about whether this approach is even workable, but if it is the route that the Government wish to take, we feel that these tools should at least be turned on by default.

Debbie Abrahams Portrait Debbie Abrahams (Oldham East and Saddleworth) (Lab)
- Hansard - - - Excerpts

Since my hon. Friend is on the point of safeguarding children, will she support Baroness Kidron as the Bill progresses to the other House in ensuring that coroners have access to data where they suspect that social media may have played a part in the death of children?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I can confirm that we will be supporting Baroness Kidron in her efforts. We will support a number of amendments that will be tabled in the Lords in the hope of strengthening this Bill further, because we have reached the limit of what we can do in this place. I commend the work that Baroness Kidron and the 5Rights Foundation have been doing to support children and to make this Bill work to keep everybody online as safe as possible.

Supporting amendment 43 would send a strong signal that our Government want to put online safety at the forefront of all our experiences when using the internet. For that reason, I look forward to the Minister giving this amendment serious consideration. Scottish National party colleagues can be assured of our support, as I have previously outlined, should there be a vote on that.

More broadly, I highlight the series of amendments tabled in my name and that of my hon. Friend the Member for Manchester Central that ultimately aim to reverse out of the damaging avenue that the Government have chosen to go down in regulating so-called legal but harmful content. As I have already mentioned, the Government haphazardly chose to remove those important clauses in Committee. They have chopped and changed this Bill more times than any of us can remember, and we are now left with a piece of legislation that is even more difficult to follow and, importantly, implement than when it was first introduced. We can all recognise that there is a huge amount of work to be done in making the Bill fit for purpose. Labour has repeatedly worked to make meaningful improvements at every opportunity, and it will be on the Government’s hands if the Bill is subject to even more delay. The Minister knows that, and I sincerely hope that he will take these concerns seriously. After all, if he will not listen to me, he would do well to listen to the mounting concerns raised by Members on his own Benches instead.

Several hon. Members rose—

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not have time, but I thank all Members who contributed to today’s debate. I pay tribute to my officials and to all the Ministers who have worked on this Bill over such a long time.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I beg to ask leave to withdraw the clause.

Clause, by leave, withdrawn.

Online Safety Bill

Alex Davies-Jones Excerpts
Consideration of Lords amendments
Tuesday 12th September 2023

(7 months, 2 weeks ago)

Commons Chamber
Read Full debate Online Safety Act 2023 Read Hansard Text Watch Debate Read Debate Ministerial Extracts Amendment Paper: Commons Consideration of Lords Amendments as at 12 September 2023 - (12 Sep 2023)
Roger Gale Portrait Mr Deputy Speaker (Sir Roger Gale)
- Hansard - - - Excerpts

I call the Opposition spokesperson.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- View Speech - Hansard - -

Before I address the amendments at hand, let me first put on record my thanks for the incredible efforts of our colleagues in the other place. The Bill has gone on a huge journey. The Government have repeatedly delayed its passage, and even went to great effort to recommit parts of the Bill to Committee in an attempt to remove important provisions on legal but harmful content. For those reasons alone, it is somewhat of a miracle that we have arrived at this moment, with a Bill that I am glad to say is in a much better place than when we last debated it here. That is thanks to the tireless work of so many individuals, charities and organisations, which have come together to coalesce around important provisions that will have a positive impact on people’s lives online.

Today, we have the real privilege of being joined by Ian Russell, Stuart Stephens, Emilia Stevens, Hollie Dance and Lisa Kenevan, who have all lost a child in connection with online harm. I want to take a moment to give my most heartfelt thanks to them all, and to the other families who have shared their stories, insights and experiences with colleagues and me as the Bill progressed. Today, in our thoughts are Archie, Isaac, Olly, Molly and all the other children who were taken due to online harm. Today, their legacy stands before us. We would not be here without you, so thank you.

We also could not have arrived at this point without the determination of colleagues in the other place, notably Baroness Kidron. Colleagues will know that she has been an extremely passionate, determined and effective voice for children throughout, and the Bill is stronger today thanks to her efforts. More broadly, I hope that today’s debate will be a significant and poignant moment for everyone who has been fighting hard for more protections online for many years.

It is good to see the Minister in his place. This is a complex Bill, and it has been the responsibility of many of his colleagues since its introduction to Parliament. That being said, it will come as no surprise that Labour is pleased with some of the significant concessions that the Government have made on the Bill. Many stem from amendments the Opposition sought to make early on in the Bill’s passage. Although his Department’s press release may try to claim a united front, let us be clear: the Bill has sadly fallen victim to Tory infighting from day one. The Conservatives truly cannot decide if they are the party of protecting children or of free speech, when they should be the party of both. Sadly, some colleagues on the Government Benches have tried to stop the Bill in its tracks entirely, but Labour has always supported the need for it. We have worked collaboratively with the Government and have long called for these important changes. It is a welcome relief that the Government have finally listened.

Let me also be clear that the Bill goes some way towards regulating the online space as it exists today, but it makes no effort to future-proof the regime or to anticipate emerging harms. The Labour party has repeatedly warned the Government of our concerns that, thanks to the Bill’s focus on content rather than social media platforms’ business models, it may not go far enough. With that in mind, I echo calls from across the House. Will the Minister commit to a review of the legislation within five years of enactment, to ensure that it has met their objective of making the UK the safest place in the world to be online?

Richard Burgon Portrait Richard Burgon (Leeds East) (Lab)
- Hansard - - - Excerpts

My hon. Friend is making an important speech. It is clear that the Government want to tackle harmful suicide and self-harm content. It is also clear that the Bill does not go far enough. Does she agree that we should support Samaritans’ suggested way forward after implementation? We need the Government to engage with people with lived experience of suicide and self-harm, to ensure that the new legislation makes things better. If it is shown—as we fear—not to go far enough, new legislative approaches will be required to supplement and take it further, to ensure that the internet is as safe as possible for vulnerable people of all ages.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

I thank my hon. Friend for that intervention. He has been a passionate advocate on that point, speaking on behalf of his constituent Joe Nihill and his family in calling for more protections in the Bill. It is clear that we need to know whether the legislation works in practice. Parliamentary oversight of that is essential, so I echo calls around the Chamber for that review. How will it take place? What will it look like? Parliament must have oversight, so that we know whether the legislation is fit for purpose.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

On the small high-harm platforms that are now in the scope of the Bill, will the Minister join me in thanking Hope Not Hate, the Antisemitism Policy Trust and CST, which have campaigned heavily on this point? While we have been having this debate, the CST has exposed BitChute, one of those small high-harm platforms, for geoblocking some of the hate to comply with legislation but then advertising loopholes and ways to get around that on the platform. Can the Minister confirm that the regulator will be able to take action against such proceedings?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will certainly look at that. Our intention is that, in any areas that might not fall within the user empowerment duties, especially those relating to children and their protection, we will look to make sure that the work of those organisations is reflected in what we are trying to achieve in the Bill.

We have talked about the various Ministers that have looked after the Bill during its passage, and the Secretary of State was left literally holding the baby in every sense of the word because she continued to work on it while she was on maternity leave. We can see the results of that with the engagement that we have had. I urge all Members on both sides of the House to consider carefully the amendments I have proposed today in lieu of those made in the Lords. I know every Member looks forward eagerly to a future in which parents have surety about the safety of their children online. That future is fast approaching.

I reiterate my thanks to esteemed colleagues who have engaged so passionately with the Bill. It is due to their collaborative spirit that I stand today with amendments that we believe are effective, proportionate and agreeable to all. I hope all Members will feel able to support our position.

Amendment (a) made to Lords amendment 182.

Lords amendment 182, as amended, agreed to.

Amendments (a) and (b) made to Lords amendment 349.

Lords amendment 349, as amended, agreed to.

Amendment (a) made to Lords amendment 391.

Lords amendment 391, as amended, agreed to.

Government consequential amendment (a) made.

Lords amendment 17 disagreed to.

Government amendments (a) and (b) made in lieu of Lords amendment 17.

Lords amendment 20 disagreed to.

Lords amendment 22 disagreed to.

Lords amendment 81 disagreed to.

Government amendments (a) to (c) made in lieu of Lords amendment 81.

Lords amendment 148 disagreed to.

Government amendment (a) made in lieu of Lords amendment 148.

Lords amendments 1 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181, 183 to 348, 350 to 390, and 392 to 424 agreed to, with Commons financial privileges waived in respect of Lords amendments 171, 180, 181, 317, 390 and 400.

Ordered, That a Committee be appointed to draw up Reasons to be assigned to the Lords for disagreeing to their amendments 20 and 22;

That Paul Scully, Steve Double, Alexander Stafford, Paul Howell, Alex Davies-Jones, Taiwo Owatemi and Kirsty Blackman be members of the Committee;

That Paul Scully be the Chair of the Committee;

That three be the quorum of the Committee.

That the Committee do withdraw immediately.—(Mike Wood.)

Committee to withdraw immediately; reasons to be reported and communicated to the Lords.