Online Safety Bill (Fifth sitting) Debate
Thank you, Sir Roger; it is a genuine privilege and an honour to serve under your chairship today and for the duration of the Committee. I join others in congratulating the right hon. Member for Basingstoke.
If you will indulge me, Sir Roger, this is the first time I have led for the Opposition in a Bill Committee of this magnitude. I am very much looking forward to sinking my teeth into the hours of important debate that we have ahead of us. I would also like to take this opportunity to place on record an early apology for any slight procedural errors I may inadvertently make as we proceed. However, I am very grateful to be joined by my hon. Friend the Member for Worsley and Eccles South, who is much more experienced in these matters, and I place on record my gratitude for her support. With her help and your guidance, Sir Roger, I expect that I will quickly pick up the correct parliamentary procedure as we make our way through this colossal piece of legislation. After all, we can all agree that it is a very important piece of legislation that we need to get right.
I want to say clearly that the Opposition welcome the Bill in principle; the Minister knows that, as we voted in favour of it at Second Reading. However, it will come as no surprise that we have a number of concerns about areas where we feel the Bill is lacking, which we will explore further. We have many reservations about how the Bill has been drafted. The structure and drafting push services into addressing harmful content—often in a reactive, rather than proactive, way—instead of harmful systems, business models and algorithms, which would be a more lasting and systemic approach.
Despite that, we all want the Bill to work and we know that it has the potential to go far. We also recognise that the world is watching, so the Opposition look forward to working together to do the right thing, making the internet a truly safe space for all users across the UK. We will therefore not oppose clause 1.
It is a pleasure to serve on the Committee. I want to apologise for missing the evidence sessions. Unfortunately, I came down with covid, but I have been following the progress of the Committee.
This is important legislation. We spend so much of our lives online these days, yet there has never been an attempt to regulate the space, or for democratically elected Members to contribute towards its regulation. Clause 1 gives a general outline of what to expect in the Bill. I have no doubt that this legislation is required, but also that it will not get everything right, and that it will have to change over the years. We may see many more Bills of this nature in this place.
I have concerns that some clauses have been dropped, and I hope that there will be future opportunities to amend the Bill, not least with regard to how we educate users and ensure that social media companies promote media literacy, so that information spread widely online is understood in context and recognised as not always correct or truthful. The Bill, I hope, will go some way towards ensuring that we can rely more on the internet, which should provide a safer space for all its users.
May I join others in welcoming line-by-line scrutiny of the Bill? I am sure that the Minister will urge us to ensure that we do not make the perfect the enemy of the good. This is a very lengthy and complex Bill, and a great deal of time and scrutiny has already gone into it. I am sure that we will all pay due regard to that excellent work.
The hon. Member for Pontypridd is absolutely right to say that in many ways the world is watching what the Government are doing regarding online regulation. This will set a framework for many countries around the world, and we must get it right. We are ending the myth that social media and search engines are not responsible for their content. Their use of algorithms alone demonstrates that, while they may not publish all of the information on their sites, they are the editors at the very least and must take responsibility.
We will no doubt hear many arguments about the importance of free speech during these debates and others. I would like gently to remind people that there are many who feel that their free speech is currently undermined by the way in which the online world operates. Women are subject to harassment and worse online, and children are accessing inappropriate material. There are a number of areas that require specific further debate, particularly around the safeguarding of children, adequate support for victims, ensuring that the criminal law is future-proof within this framework, and ensuring that we pick up on the comments made in the evidence sessions regarding the importance of guidance and codes of practice. It was slightly shocking to hear from some of those giving evidence that the operators did not know what was harmful, as much has been written about the harm caused by the internet.
I will listen keenly to the Minister’s responses on guidance and codes of practice, and secondary legislation more generally, because it is critical to how the Bill works. I am sure we will have many hours of interesting and informed debate on this piece of legislation. While there has already been a great deal of scrutiny, the Committee’s role is pivotal to ensure that the Bill is as good as it can be.
Question put and agreed to.
Clause 1 accordingly ordered to stand part of the Bill.
Clause 2
Key definitions
Question proposed, That the clause stand part of the Bill.
The hon. Lady is right to pick up on the nuance and the balance that we have to strike in legislation between freedom of speech and the protection of vulnerable individuals and children. I do not think there can be many people, particularly among those here today, who would want anything to trump the safeguarding of children. Will the Minister clarify exactly how the Bill works in relation to such important work?
Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software takes photographs of women, and only women, because its database relates only to female figures, and makes them appear to be completely naked. Does that software fall within the scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.
This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?
Let me briefly speak to the purpose of these clauses and then respond to some of the points made in the debate.
As the shadow Minister, the hon. Member for Pontypridd, touched on, clauses 2 and 3 define some of the key terms in the Bill, including “user-to-user services” and “search services”—key definitions that the rest of the Bill builds on. As she said, schedule 1 and clause 4 contain specific exemptions where we believe the services concerned present very low risk of harm. Schedule 2 sets out exemptions relating to the new duties that apply to commercial providers of pornography. I thank the shadow Minister and my right hon. Friend the Member for Basingstoke for noting the fact that the Government have substantially expanded the scope of the Bill to now include commercial pornography, in response to widespread feedback from Members of Parliament across the House and the various Committees that scrutinised the Bill.
The shadow Minister is quite right to say that the number of platforms to which the Bill applies is very wide. [Interruption.] Bless you—or bless my hon. Friend the Member for North West Durham, I should say, Sir Roger, although he is near sanctified already. As I was saying, we are necessarily trying to protect UK users, and with many of these platforms not located in the UK, we are seeking to apply these duties to those companies as well as to ones that are domestically located. When we come to discuss the enforcement powers, I hope the Committee will see that they are formidable.
The shadow Minister, the hon. Member for Liverpool, Walton and others asked about future technologies and whether the Bill will accommodate technologies that we cannot even imagine today. The metaverse is a good example: it did not exist when the Bill was first contemplated and the White Paper produced. Actually, I think Snapchat did not exist when the White Paper that preceded the Bill was first conceived. For that reason, the Bill is tech agnostic. We do not talk about specific technologies; we talk about the duties that apply to companies and the harms they are obligated to prevent.
The whole Bill is tech agnostic because we as parliamentarians today cannot anticipate future developments. When those future developments arise, as they inevitably will, the duties under the Bill will apply to them as well. The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic. That is an extremely important point to make.
The hon. Member for Aberdeen North asked about gaming. Parents are concerned because lots of children, including quite young children, use games. My own son has started playing Minecraft even though he is very young. To the extent that those games have user-to-user features—for example, user-to-user messaging, particularly where those messages can be sent widely and publicly—those user-to-user components are within the scope of the Bill.
The hon. Member for Aberdeen North also asked about the App Store. I will respond quickly to her question now rather than later, to avoid leaving the Committee in a state of tingling anticipation and suspense. The App Store, or app stores generally, are not in the scope of the Bill, because they are not providing, for example, user-to-user services, and the functionality they provide to basically buy apps does not count as a search service. However, any app that is purchased in an app store, to the extent that it has either search functionality, user-to-user functionality or purveys or conveys pornography, is in scope. If an app that is sold on one of these app stores turns out to provide a service that breaks the terms of the Bill, that app will be subject to regulatory enforcement directly by Ofcom.
The hon. Members for Aberdeen North and for Liverpool, Walton touched on media literacy, noting that there has been a change to the Bill since the previous version. We will probably debate this later, so I will be brief. The Government published a media literacy strategy, backed by funding, to address this point. It was launched about a year ago. Ofcom also has existing statutory duties—arising under the Communications Act 2003, I believe. The critical change made since the previous draft of the Bill—it was made in December last year, I believe—is that Ofcom published an updated set of policy intentions around media literacy that went even further than we had previously intended. That is the landscape around media literacy.
On the way that media literacy relates to misinformation and disinformation, we heard from William Moy, chief executive of Full Fact. His view was that the Bill does nothing to tackle disinformation and that another information incident, as we have seen with covid and Ukraine recently, is inevitable. Full Fact’s view was that the Bill should give the regulator the power to declare misinformation incidents. Is that something the Minister has considered?
In many ways, clause 6 is the central meat of the Bill. It brings into play a duty of care, which means that people operating online will be subject to the same rules as the rest of us when it comes to the provision of services. But when it comes to the detail, the guidance and codes that will be issued by Ofcom will play a central role. My question for the Minister is this: in the light of the evidence that we received, I think in panel three, in which the providers were unable to define what was harmful because they had not yet seen Ofcom’s codes of practice, can he update us on when those codes and that guidance might be available? I understand thoroughly why they may not be available at this point, and they certainly should not form part of the Bill, because they need to be flexible enough to be changed in future, but it is important that we know how the guidance and codes will work and that they work properly.
Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether the legislation is working.
Some of the evidence we heard suggested that the current precedent was that the Secretary of State had very little to do with independent regulators in this realm, but that the Bill overturns that precedent. Does the right hon. Lady have any concerns that the Bill hands too much power to the Secretary of State to intervene and influence regulators that should be independent?
I want to add my voice to the calls for ways to monitor the success or failure of this legislation. We are starting from a position of self-regulation, in which companies write the rules and regulate themselves. It is right that we are improving on that, but with it come further concerns about the powers of the Secretary of State and the effectiveness of Ofcom. As the issues are fundamental to freedom of speech and expression, and to the protection of vulnerable and young people, will the Minister consider how we better monitor whether the legislation does what it says on the tin?
Clause 5 simply provides an overview of part 3 of the Bill. Several good points have been raised in the course of this discussion. I will defer replying to the substance of a number of them until we come to the relevant clause, but I will address two or three of them now.
The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.
The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.
My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.
There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.
One of the challenges for this legislation will be the way it is enforced. Have my hon. Friend and her Front-Bench colleagues given consideration to the level of funding that Ofcom and the regulatory services may need?
That is a huge concern for us. As was brought up in our evidence sessions with Ofcom, it is recruiting, effectively, a fundraising officer for the regulator. That throws into question the potential longevity of the regulator’s funding and whether it is resourced effectively to properly scrutinise and regulate the online platforms. If that long-term resource is not available, how can the regulator effectively scrutinise and bring enforcement to bear against companies for enabling illegal activity?
The Committee will note that, at the moment, the hon. Lady is not moving amendment 70; she is only moving amendment 69. So the Question is, That that amendment be made.
I congratulate those on my own Front Bench on this important amendment. I would like the Minister to respond on the issue of transparency and the reason why only the regulator would have sight of these risk assessments. It is fundamental that civil society groups and academics have access to them. Her Majesty’s Revenue and Customs is an example of where that works very well. HMRC publishes a lot of its data, which is then used by academics and researchers to produce reports and documents that feed back into the policy-making process and HMRC’s work. It would be a missed opportunity if the information and data gathered by Ofcom were not widely available for public scrutiny.
I would reinforce the earlier points about accountability. There are too many examples—whether in the financial crash or the collapse of companies such as Carillion—where accountability was never there. Without this amendment and the ability to hold individuals to account for the failures of companies that are faceless to many people, the legislation risks being absolutely impotent.
Finally, I know that we will get back to the issue of funding in a later clause but I hope that the Minister can reassure the Committee that funding for the enforcement of these regulations will be properly considered.
Let me start by speaking to the question that clauses 6, 7, 21 and 22 stand part of the Bill. I will then address the amendments moved by the shadow Minister.
I want to talk about a few different things relating to the amendments. Speaking from the Opposition Front Bench, the hon. Member for Pontypridd covered in depth amendment 20, which relates to being directed to other content. Although this seems like a small amendment, it would apply in a significant number of different situations. Particular mention was made of Discord for gaming, but also of things such as moving from Facebook to Messenger—all those different directions that can happen. A huge number of those matter because those who seek to abuse children online try to move them from higher-regulation services, or ones with more foot traffic, to areas with less moderation, so as to attack children in more extreme ways.
I grew up on the internet and spent a huge amount of time speaking to people, so I am well aware that people can be anyone they want to be on the internet, and people do pretend to be lots of different people. If someone tells us their age on the internet, we cannot assume that it is in any way accurate. I am doing what I can to imprint that knowledge on my children in relation to any actions they take online. In terms of media literacy, which we will discuss in more depth later, I hope that one of the key things being taught to both children and adults is that it does not matter if people have pictures on their profile—they can be anybody they want to be online and could have taken those pictures from anywhere.
In relation to amendment 21 on collaboration, the only reasonable concern that I have heard relates to an action taken by Facebook, which employed an outside company in the US to place stories in local newspapers raising concerns about vile things that were happening on TikTok. Those stories were invented—they were made up—specifically to harm TikTok’s reputation. I am not saying for a second that collaboration is bad, but the argument that some companies may make, that collaboration is bad because it causes them problems and their opponents may use it against them, proves the need for a regulator. The point of having a regulator is to ensure that any information sharing or collaboration that is required happens in a way that allows the regulator to come down on any company that decides to use it with malicious intent. The regulator ensures that the collaboration we need, so that emergent issues are dealt with as quickly as possible, is done in a way that does not harm people; if it does harm people, the regulator is there to take action.
I want to talk about amendments 25 and 30 on the production of images and child sexual abuse content. Amendment 30 should potentially have an “or” at the end rather than an “and”. However, I am very keen to support both of those amendments, and all the amendments relating to the production of child sexual abuse content. On the issues raised by the Opposition about livestreaming, for example, we heard two weeks ago that 75% of child sexual abuse content is self-generated. That figure is absolutely huge.
If the Bill does not adequately cover the production of such content, whether by children and young people who have been coerced into producing it with their own cameras or in some other way, then it fails to adequately protect our children. Purely on the basis of that 75% statistic, which is so incredibly stark, it is completely reasonable that production be included. I would be happy to support the amendments in that regard; I think they are eminently sensible. When the Bill was first written, production was perhaps not nearly so much of an issue, but as things have moved on it has become a huge issue that needs tackling. Like Opposition Members, I do not feel that the Bill covers production in as much detail as it should in order to protect children.
Amendment 10 would create a duty to publish the illegal content risk assessment and proactively supply it to Ofcom. This is new legislation, really a trial, that will set international precedent, and a lot of the more prescriptive elements—which are necessary—are perhaps the most challenging parts of the Bill. The Minister has been very thoughtful on some of the issues, so I want to ask him, when we consider the landscape of how we regulate companies, where does he stand on transparency and accountability? How far is he willing to go, and how far does the Bill go, on issues of transparency? It is my feeling that the more companies are forced to publish and open up, the better. As we saw with the case of the Facebook whistleblower Frances Haugen, there is a lot to uncover.
Of course, Ofcom is able to request any of them if it wants to—if it feels that to be necessary—but receiving 25,000 risk assessments, including from tiny companies that pose pretty much no risk at all and that hardly anyone uses, would, I think, be an unreasonable and disproportionate requirement to impose. I do not think it is a question of the resources being inadequate; it is a question of being proportionate and reasonable.
The point I was trying to get the Minister to think about was the action of companies in going through the process of these assessments and then making that information publicly available to civil society groups; it is about transparency. It is what the sector needs; it is the way we will find and root out the problems, and it is a great missed opportunity in this Bill.
To reassure the hon. Member on the point about doing the risk assessment: all the companies have to do the risk assessment. That obligation is there. Ofcom can request any risk assessment. I would expect, and I think Parliament would expect, it to request risk assessments either where it is concerned about risk or where the platform is particularly large and has a very high reach—I am thinking of Facebook and companies like that. But hon. Members are talking here about requiring Ofcom to receive and, one therefore assumes, to consider every assessment, because what is the point of receiving an assessment unless it considers it? Receiving it and just putting it on a shelf without looking at it would be pointless, obviously. Requiring Ofcom to receive and look at potentially 25,000 risk assessments strikes me as a disproportionate burden. We should be concentrating Ofcom’s resources—and it should concentrate its activity, I submit—on those companies that pose a significant risk and those that have a very high reach and large numbers of users. If we imposed an obligation on it to receive and consider risk assessments from tiny companies that pose no risk, that would not be the best use of its resources, and it would take away resources that could otherwise be used on companies that do pose risk and have larger numbers of users.