(2 years ago)
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
It is always a pleasure to serve under your chairship, Mr Dowd. I am grateful to be here representing the Opposition in this important debate. This is the first time I have overwhelmingly agreed with every single excellent contribution in this Chamber. That goes to show that, as my friend the hon. Member for Aberdeen North (Kirsty Blackman) said, this does cross party lines and is not a political issue—at least, it should not be. There is huge cross-party consensus in this place, and the other place, about getting the Bill on the statute book and in action to protect everybody on the internet.
I pay particular tribute to the right hon. Member for East Hampshire (Damian Hinds) who, as a former Education Secretary, comes at this debate with a huge breadth of knowledge and experience. He is a former colleague of mine; we sat together on the Digital, Culture, Media and Sport Committee, where we scrutinised this legislation and these issues in depth. I know it is an issue he cares very deeply about. I echo his and other Members’ sentiments on the reappointment of the Minister, who comes at this with a breadth of experience and cares deeply. I am very pleased to see him in his post.
Regulation to tackle online abuse was first promised many years ago. In the initial White Paper, the Conservatives promised world-leading legislation. However, when the draft Online Safety Bill was published in May 2021, those proposals were totally watered down and incomplete. The Bill is no longer world leading. Since it was first announced that this Government intended to regulate the online space, seven jurisdictions have introduced online safety laws. Although those pieces of legislation are not perfect, they are in place. In that time, online crime has exploded, child sex abuse online has become rife and scams have continued to proliferate. The Minister knows that, and he may share my frustration and genuine concern at the cost that the delay is causing.
I recognise that we are living in turbulent political times, but when it comes to online harms, particularly in the context of children, we cannot afford to wait. Last week, the coroner’s report from the tragic death of Molly Russell brought into sharp relief the serious impact that harmful social media content is having on young people across the UK every day. Let me be clear; Molly Russell’s death is a horrific tragedy. I pay tribute to her father Ian and her family, who have, in the most harrowing of circumstances, managed to channel their energy into tireless campaigning that has quite rightly made us all sit up and listen.
Molly’s untimely death, to which, as the coroner announced last week, harmful social media content was a contributing factor, has stunned us all. It should force action from the Government. While I was pleased to note in the business statement last week that the Online Safety Bill will return to the House on Tuesday, I plead with the Minister to work with Labour, the SNP and all parties to get it through, with some important amendments. Without measures on legal but harmful content—or harmful but legal, as we are now referring to it—it is not likely that suicide and self-harm content such as that faced online by Molly or by Joe Nihill, the constituent of my hon. Friend the Member for Leeds East (Richard Burgon), will be dealt with.
Enough is enough. Children and adults—all of us—need to be kept safe online. Labour has long campaigned for stronger protections online for children and the public, to keep people safe, secure our democracy and ensure that everyone is treated with decency and respect. There is broad consensus that social media companies have failed to regulate themselves. That is why I urge the Minister to support our move to ensure that those at the top of multi-million-pound social media companies are held personally accountable for failures beyond those currently in the Bill relating to information notices.
The Online Safety Bill is our opportunity to do better. I am keen to understand why the Government have failed to introduce or support personal criminal liability measures for senior leaders who have fallen short on their statutory duty to protect us online. There are such measures in other areas, such as financial services. The same goes for the Government’s approach to the duties of care for adults under the Bill—what we call harmful but legal. The Minister knows that the Opposition has concerns over the direction of the Bill, as do other Members here today.
Freedom of speech is vital to our democracy, but it absolutely must not come at a harmful cost. The Bill Committee, which I was a member of, heard multiple examples of racist, antisemitic, extremist and other harmful publishers, from Holocaust deniers to white supremacists, which would stand to benefit from the recognised news publisher exemption as it currently stands, either overnight or by making minor administrative changes.
In Committee, in response to an amendment from my hon. Friend the Member for Batley and Spen (Kim Leadbeater), the Minister promised the concession that Russia Today would be excluded from the recognised news publisher exemption. I am pleased that the Government have indeed promised to exclude sanctioned news titles such as Russia Today through an amendment that they have said they will introduce at a later stage, but that does not go far enough. Disinformation outlets rarely have the profile of Russia Today. Often they operate more discreetly and are less likely to attract sanctions. For those reasons, the Government must go further. As a priority, we must ensure that the current exemption cannot be exploited by bad actors. The Government must not give a free pass to those propagating racist or misogynistic harm and abuse.
Aside from freedom of speech, Members have raised myriad harms that appear online, many of which we tried to tackle with amendments in Committee. A robust corporate and senior management liability scheme for routine failures was rejected. Basic duties that would have meant that social media companies had to publish their own risk assessments were rejected. Amendments to bring into scope small but high-harm platforms that we have heard about today were also rejected. The Government would not even support moves to name violence against women and girls as a harm in the Bill, despite the huge amount of evidence suggesting that women and people of colour are more at risk.
Recent research from the Centre for Countering Digital Hate has found that Instagram fails to act on nine out of 10 reports of misogyny over its direct messenger. One in 15 DMs sent to women by strangers was abusive or contained violent and sexual images. Of 330 examples reported on Twitter and Instagram, only nine accounts were removed. More than half of those that were reported continued to offend. The Government are letting down survivors and putting countless women and girls at risk of gendered harms, such as image-based sexual abuse—so-called revenge porn—rape threats, doxxing and tech abuse perpetrated by an abusive partner. What more will it take for meaningful change to be made?
I hope the Minister will address those specific omissions. Although I recognise that he was not in his role as the Bill progressed in Committee, he is in the unfortunate position of having to pick up the pieces. I hope he will today give us some reassurances, which I know many of us are seeking.
I must also raise with the Minister once again the issue of online discriminatory abuse, particularly in the context of sport. In oral questions I recently raised the very serious problem of rising discrimination faced not just by players but their families, referees, coaches, pundits, fans and others. I know the hon. Member for Barrow and Furness (Simon Fell) tried to make this point in his contribution. Abuse and harm faced online is not virtual; it is real and has a lasting impact. Labour Members believe it is essential that tech firms are held to account when harmful abuse and criminal behaviour appear on, are amplified by and therefore flourish on their platforms.
There are genuine issues with the Government’s approach to the so-called legal but harmful provisions in the Bill that will, in essence, fail to capture some of the most harmful content out there. We have long called for a more systems-based approach to the Bill, and we need only to look at the research that we have had from Kick It Out to recognise the extent of the issue. Research from that organisation used artificial intelligence to identify violent abuse that falls below the current criminal thresholds outlined in the current draft of the Bill. There is no need for me to repeat the vile language in this place today. We have only to cast our minds back to 2020 and the Euros to recall the disgraceful abuse—and more—targeted at members of the England team to know the realities of the situation online. But it does not have to be this way.
Labour colleagues have repeatedly raised concerns that the current AI moderation practices utilised by the big social media giants are seemingly incapable of adapting to the rapid rate at which new internet-based languages, emojis and euphemisms develop. It is wrong of the Government to pursue an online harms agenda that is so clearly focused on content moderation, rather than considering the business models that underpin those harmful practices. Worse still, we now know that that approach often underpins a wide range of the harmful content that we see online.
The Times recently reported that TikTok users were able to easily evade safety filters to share suicide and self-harm posts by using slang terms and simple misspellings. Some of the content in question had been online for more than a year, despite including direct advice on how to self-harm. TikTok’s community guidelines forbid content that depicts or encourages suicide or self-harm, and yet such content still remains online for everyone to see.
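To illustrate the evasion problem described here, the sketch below is purely illustrative: the blocklist entry and helper names are hypothetical, and it is not a description of TikTok's actual moderation systems. It shows why a filter that only matches blocked terms verbatim is defeated by trivial misspellings and spacing tricks, while even light text normalisation catches many of the variants.

```python
import re
import unicodedata

# Hypothetical blocklist entry used purely for illustration.
BLOCKED_TERMS = {"selfharm"}

def naive_filter(post: str) -> bool:
    """Flags a post only if a blocked term appears verbatim."""
    text = post.lower()
    return any(term in text for term in BLOCKED_TERMS)

def normalised_filter(post: str) -> bool:
    """Strips accents, spacing, punctuation and common character
    substitutions before matching, so simple misspellings are caught."""
    text = unicodedata.normalize("NFKD", post).encode("ascii", "ignore").decode().lower()
    # Undo common letter/number swaps (illustrative, not exhaustive).
    text = text.translate(str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "$": "s"}))
    # Drop spaces, dots and hyphens used to split a word apart.
    text = re.sub(r"[^a-z]", "", text)
    return any(term in text for term in BLOCKED_TERMS)

for post in ["thinking about s e l f h a r m", "s3lf-harm tips", "selfharm"]:
    print(post, "| naive:", naive_filter(post), "| normalised:", normalised_filter(post))
# The naive filter only flags the verbatim spelling; normalisation catches the
# spaced-out and substituted variants, though determined users will still find
# new workarounds - which is exactly the adaptation problem described above.
```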
We have concerns that the Government’s current approach will have little impact unless the big firms are held more accountable. What we really need is a consistent approach from the Government, and a commitment to tackling myriad online harms that is fit for the modern age and for emerging tech, too. There is a widespread political consensus on the importance of getting this right, and the Minister can be assured of success if only his Department is prepared to listen.
It is a pleasure to serve under your chairmanship, Mr Dowd. This is my first appearance as a Minister in Westminster Hall, and your first appearance in the Chair, so we are both making our debuts. I hope we have long and successful reigns in our respective roles.
It is a great pleasure to respond to the debate secured by my right hon. Friend the Member for East Hampshire (Damian Hinds) and to his excellent opening speech. He feels strongly about these issues—as he did both in Government and previously as a member of the Digital, Culture, Media and Sport Committee—and he has spoken up about them. I enjoyed working with him when he was a Minister at the Home Office and I chaired the prelegislative scrutiny Committee, which discussed many important features of the Online Safety Bill. One feature of the Bill, of course, is the inclusion of measures on fraud and scam advertising, which was a recommendation of the Joint Committee. It made my life easier that, by the time I became a Minister in the Department, the Government had already accepted that recommendation and introduced the exemption, and I will come on to talk about that in more detail.
My right hon. Friend, the hon. Member for Pontypridd (Alex Davies-Jones) and other Members raised the case of Molly Russell, and it is important to reflect on that case. I share the sentiments expressed about the tragedy of Molly’s death, its avoidable nature and the tireless work of the Russell family, and particularly her father, Ian Russell, whom I have met several times to discuss this. The Russell family pursued a very difficult and complicated case, which required a huge release of evidence from the social media companies, particularly Instagram and Pinterest, to demonstrate the sort of content to which Molly Russell was exposed.
One of the things Ian Russell talks about is the work done by the investigating officers in the coroner’s inquest. Tellingly, the inquest restricted the amount of time that people could be exposed to the content that Molly was exposed to, and ensured that police officers who were investigating were not doing so on their own. Yet that was content that a vulnerable teenage girl saw repeatedly, on her own, in isolation from those who could have helped her.
When online safety issues are raised with social media companies, they say things like, “We make this stuff very hard to find.” The lived experience of most teenagers is not searching for such material; it is such material being selected by the platforms and targeted at the user. When someone opens TikTok, their first exposure is not to content that they have searched for; it is to content recommended to them by TikTok, which data-profiles the user and chooses things that will engage them. Those engagement-based business models are at the heart of the way the Online Safety Bill works and has to work. If platforms choose to recommend content to users to increase their engagement with the platform, they make a business decision. They are selecting content that they think will make a user want to return more frequently and stay on the platform for longer. That is how free apps make money from advertising: by driving engagement.
It is a fair criticism that, at times, the platforms are not effective enough at recognising the kinds of engagement tools they are using, the content that is used to engage people and the harm that that can do. For a vulnerable person, the sad truth is that their vulnerability will probably be detected by the AI that drives the recommendation tools. That person is far more likely to be exposed to content that will make their vulnerabilities worse. That is how a vulnerable teenage girl can be held by the hand—by an app’s AI recommendation tools—and walked from depression to self-harm and worse. That is why regulating online safety is so important and why the protection of children is so fundamental to the Bill. As hon. Members have rightly said, we must also ensure that we protect adults from some of the illegal and harmful activity on the platforms and hold those platforms to account for the business model they have created.
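The engagement-driven selection described in this passage can be seen in a deliberately simplified sketch. This is a hypothetical toy model with invented names, not any platform's actual ranking system: when the only objective is predicted engagement for a profiled user, the content a vulnerable user dwells on is precisely the content that gets recommended back to them.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

@dataclass
class UserProfile:
    # Inferred interest weights, e.g. learned from past watch time (hypothetical).
    interests: dict[str, float]

def predicted_engagement(user: UserProfile, post: Post) -> float:
    """Toy engagement model: the more a user has dwelt on a topic before,
    the higher the predicted engagement for more of the same."""
    return user.interests.get(post.topic, 0.0)

def recommend(user: UserProfile, candidates: list[Post], k: int = 3) -> list[Post]:
    # Rank purely by predicted engagement - no notion of harm enters the objective.
    return sorted(candidates, key=lambda p: predicted_engagement(user, p), reverse=True)[:k]

candidates = [Post("a", "sport"), Post("b", "depression"), Post("c", "music"),
              Post("d", "depression"), Post("e", "cooking")]
# A user whose watch history already skews towards distressing content.
user = UserProfile(interests={"depression": 0.9, "music": 0.3})
print([p.post_id for p in recommend(user, candidates)])  # ['b', 'd', 'c']
# Because the objective is engagement alone, the feedback loop follows directly:
# whatever a vulnerable user engages with is what the ranker serves back, unless
# the system is explicitly designed to down-rank or suppress it.
```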
I take exception to the suggestion from the hon. Member for Pontypridd that this is a content-moderation Bill. It is not; it is a systems Bill. The content that we use, and often refer to, is an exemplar of the problem; it is an exemplar of things going wrong. On all the different areas of harm that are listed in the Bill, particularly the priority legal offences in schedule 7, our challenge to the companies is: “You have to demonstrate to the regulator that you have appropriate systems in place to identify this content, to ensure that you are not amplifying or recommending it and to mitigate it.” Mitigation could be suppressing the content—not letting it be amplified by their tools—removing it altogether or taking action against the accounts that post it. It is the regulator’s job to work with the companies, assess the risk, create codes of practice and then hold the companies to account for how they work.
There is criminal liability for the companies if they refuse to co-operate with the regulator. If they refuse to share information or evidence asked for by the regulator, a named company director will be criminally liable. That was in the original Bill. The recommendation in the Joint Committee report was that that should be commenced within months of the Bill being live; originally it was going to be two years. That is in the Bill today, and it is important that it is there so that companies know they have to comply with requests.
The hon. Member for Pontypridd is right to say that the Bill is world-leading, in the sense that it goes further than other people’s Bills, but other Bills have been enacted elsewhere in the world. That is why it is important that we get on with this.
The Minister is right to say that we need to get on with this. I appreciate that he is not responsible for the business of this House, but his party and his Government are, so will he explain why the Bill has been pulled from the timetable next week, if it is such an important piece of legislation?
As the hon. Lady knows, I can speak to the Bill; I cannot speak to the business of the House—that is a matter for the business managers in the usual way. Department officials—some here and some back at the Department—have been working tirelessly on the Bill to ensure we can get it through in a timely fashion. I want to see it complete its Commons stages and go to the House of Lords as quickly as possible. Our target is to ensure that it receives safe passage in this Session of Parliament. Obviously, I cannot talk to the business of the House, which may alter as a consequence of the changes to Government.
(2 years, 1 month ago)
Commons Chamber
Last weekend there was yet another case of vile online racist abuse being hurled at a professional footballer, on this occasion the Brentford striker Ivan Toney. Ironically, tomorrow we will all come together to recognise Show Racism the Red Card day. If the Government are at all serious about keeping people safe online, it is vital for those at the top of these multimillion-pound social media companies to be held personally accountable. The Online Safety Bill is our opportunity to do better. Can the Minister therefore tell us exactly why the Government have failed to introduce personal criminal liability measures for senior leaders who have fallen short on their statutory duty to protect us online?
I think it is about time the Opposition remembered that it is this Government who are introducing the Online Safety Bill. It is this Government who committed themselves to it in our manifesto. As I have already told Opposition Members, we will bring it back imminently. I am sure you agree, Mr Speaker, that it would not be proper for me to announce House business here today, but I can assure the hon. Member that this is my top priority. We will be coming back with the Bill shortly. I mean what I say, and I will do what I say.
(2 years, 4 months ago)
Commons Chamber
That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.
It is an honour to respond on the first group of amendments on behalf of the Opposition.
For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.
With that in mind, it must be said that time and again this Government claim that the legislation is world-leading but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.
I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.
We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.
Not only that: people migrate from one platform to another, a fact that just has not been reflected on by the Government.
My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.
I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.
I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp) also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.
Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent advertising based on information that is
“reasonably available to a provider”,
with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, make it impractical for them reliably and consistently to identify illegal content.
The second problem arises from the fact that the platforms will need to have
“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.
That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.
Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made
“on the basis of all relevant information that is reasonably available to a provider.”
However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.
I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.
We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.
Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a
“tsunami of online child abuse”.
We now have the first ever opportunity to legislate for a safer world online for our children.
However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occurs on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, who they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.
I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:
“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”
I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.
It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.
Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that
“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”
Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.
Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad of powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary levels of interference and threats to the independence of Ofcom that arise from the powers of direction to Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.
I thank the shadow Minister for giving way—I will miss our exchanges across the Dispatch Box. She is making a point about the Secretary of State powers in, I think, clause 40. Is she at all reassured by the undertakings given in the written ministerial statement tabled by the Secretary of State last Thursday, in which the Government committed to amending the Bill in the Lords to limit the use of those powers to exceptional circumstances only, and precisely defined those circumstances as only being in connection with issues such as public health and public safety?
I thank the former Minister for his intervention, and I am grateful for that clarification. We debated at length in Committee the importance of the regulator’s independence and the prevention of overarching Secretary of State powers, and of Parliament having a say and being reconvened if required. I welcome the fact that that limitation on the power will be tabled in the other place, but it should have been tabled as an amendment here so that we could have discussed it today. We should not have to wait for the Bill to go to the other place for us to have our say. Who knows what will happen to the Bill tomorrow, next week or further down the line with the Government in utter chaos? We need this to be done now. The Minister must recognise that this is an unparalleled level of power, and one with which the sector and Back Benchers in his own party disagree. Let us work together and make sure the Bill really is fit for purpose, and that Ofcom is truly independent and without interference and has the tools available to it to really create meaningful change and keep us all safe online once and for all.
While the shadow Minister is on the subject of exemptions for antisemites, will she say where the Opposition are on the issue of search? Search platforms and search engines provide some of the most appalling racist, Islamophobic and antisemitic content.
I thank the hon. Gentleman, who is absolutely right. In Committee, we debated at length the impact search engines have, and they should be included in the Bill’s categorisation of difficult issues. In one recent example on a search engine, the imagery that comes up when we search for desk ornaments is utterly appalling and needs to be challenged and changed. If we are to truly tackle antisemitism, racism and extremist content online, then the provisions need to be included in the Bill, and journalistic exemptions should not apply to this type of content. Often, such publishers operate more discreetly and are less likely to attract sanctions. Furthermore, any amendment will provide no answer to the many extremist publishers who seek to exploit the terms of the exemption. For those reasons, we need to go further.
The amendments are not a perfect or complete solution. Deficiencies remain, and the amendments do not address the fact that the exemption continues to exclude dozens of independent local newspapers around the country on the arbitrary basis that they have no fixed address. The Independent Media Association, which represents news publishers, describes the news publisher criteria as
“punishing quality journalism with high standards”.
I hope the Minister will reflect further on that point. As a priority, we need to ensure that the exemption cannot be exploited by bad actors. We must not give a free pass to those propagating racist, misogynistic or antisemitic harm and abuse. By requiring some standards of accountability for news providers, however modest, the amendments are an improvement on the Bill as drafted. In the interests of national security and the welfare of the public, we must support the amendments.
Finally, I come to a topic that I have spoken about passionately in this place on a number of occasions and that is extremely close to my heart: violence against women and girls. Put simply, in their approach to the Bill the Government are completely failing and falling short in their responsibilities to keep women and girls safe online. Labour has been calling for better protections for some time now, yet still the Government are failing to see the extent of the problem. They have only just published an initial indicative list of priority harms to adults, in a written statement that many colleagues may have missed. While it is claimed that this will add to scrutiny and debate, the final list of harms will not be on the face of the Bill but will be included in secondary legislation after the Bill has received Royal Assent. Non-designated content that is harmful will not require action on the part of service providers, even though by definition it is still extremely harmful. How can that be acceptable?
Many campaigners have made the case that protections for women and girls are not included in the draft Bill at all, a concern supported by the Petitions Committee in its report on online abuse. Schedule 7 includes a list of sexual offences and aggravated offences, but the Government have so far made no concessions here and the wider context of violence against women and girls has not been addressed. That is why I urge the Minister to carefully consider our new clause 3, which seeks to finally name violence against women and girls as a priority harm. The Minister’s predecessor said in Committee that women and girls receive “disproportionate” levels of abuse online. The Minister in his new role will likely be well briefed on the evidence, and I know this is an issue he cares passionately about. The case has been put forward strongly by hon. Members on all sides of the House, and the message is crystal clear: women and girls must be protected online, and we see this important new clause as the first step.
Later on, we hope to see the Government move further and acknowledge that there must be a code of practice on tackling violence against women and girls content online.
The hon. Lady raises the issue of codes of practice. She will recall that in Committee we talked about that specifically and pressed the then Minister on that point. It became very clear that Ofcom would be able to issue a code of practice on violence against women and girls, which she talked about. Should we not be seeking an assurance that Ofcom will do that? That would negate the need to amend the Bill further.
I welcome the right hon. Lady’s comments. We did discuss this at great length in Committee, and I know she cares deeply and passionately about this issue, as do I. It is welcome that Ofcom can issue a code of practice on violence against women and girls, and we should absolutely be urging it to do that, but we also need to make it a fundamental aim of the Bill. If the Bill is to be truly world leading, if it is truly to make us all safe online, and if we are finally to begin to tackle the scourge of violence against women and girls in all its elements—not just online but offline—then violence against women and girls needs to be named as a priority harm in the Bill. We need to take the brave new step of saying that enough is enough. Words are not enough. We need actions, and this is an action the Minister could take.
I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.
I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to tackle a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.
I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.
I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.
We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from the hon. Member for Pontypridd (Alex Davies-Jones), who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.
There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.
One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend the Member for Croydon South (Chris Philp), the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.
When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.
Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practice raised by the Opposition Front Bencher, the hon. Member for Pontypridd should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.
The Bill’s factsheet, which is still on the Government’s website, states on page 1:
“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.
This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.
(2 years, 5 months ago)
Public Bill Committees
Good morning, Sir Roger. As the Minister has outlined, clause 173 gives the Secretary of State the power to amend the list of fraud offences in what will be section 36 in relation to the duties about fraudulent advertising. Although we recognise that this power is subject to some constraints, Labour has concerns about what we consider to be an unnecessary power given to the Secretary of State to amend duties about fraudulent advertising on category 1 services.
We welcome the provisions outlined in clause 173(2), which lists the criteria that any new offences must meet before the Secretary of State may include them in the list of fraud offences in section 36. The Minister outlined some of those. Along the same lines, the provision in clause 173(3) to further limit the Secretary of State’s power to include new fraud offences—it lists types of offences that may not be added to section 36—is a positive step.
However, we firmly believe that delegated law making of this nature, even when there are these minor constraints in place, is a worrying course for the Government to pursue when we have already strongly verbalised our concerns about Ofcom’s independence. Can the Minister alleviate our concerns by clarifying exactly how this process will work in practice? He must agree with the points that colleagues from across the House have made about the importance of Ofcom being truly independent and free from any political persuasion, influence or control. We all want to see the Bill change things for the better, so I am keen to hear from the Minister the specific reasoning behind giving the Secretary of State the power to amend this important legislation through what will seemingly be a simple process.
As we all know, clause 174 allows the Secretary of State to make regulations to amend or repeal provisions relating to exempt content or services. Regulations made under this clause can be used to exempt certain content or services from the scope of the regulatory regime, or to bring them into scope. It will come as no surprise to the Minister that we have genuine concerns about the clause, given that it gives the Secretary of State of the day the power to amend the substantive scope of the regulatory regime. In layman’s terms, we see this clause as essentially giving the Secretary of State the power to, through regulations, exempt certain content and services from the scope of the Bill, or bring them into scope. Although we agree with the Minister that a degree of flexibility is crucial to the Bill’s success and we have indeed raised concerns throughout the Bill’s proceedings about the need to future-proof the Bill, it is a fine balance, and we feel that these powers in this clause are in excess of what is required. I will therefore be grateful to the Minister if he confirms exactly why this legislation has been drafted in a way that will essentially give the Secretary of State free rein on these important regulations.
Clauses 175 and 176 seek to give the Secretary of State additional powers, and again Labour has concerns. Clause 175 gives the Secretary of State the power to amend the list in part 2 of schedule 1, specifically paragraph 10. That list sets out descriptions of education and childcare relating to England; it is for the relevant devolved Ministers to amend the list in their respective areas. Although we welcome the fact that certain criteria must be met before the amendments can be made, this measure once again gives the Secretary of State of the day the ability substantively to amend the scope of the regime more broadly.
Those concerns are felt even more strongly when we consider clause 176, which gives the Secretary of State the power to amend three key areas in the Bill—schedules 5, 6 and 7, which relate to terrorism offences, to child sexual exploitation and abuse content offences—except those extending to Scotland—and to priority offences in some circumstances. Alongside stakeholders, including Carnegie, we strongly feel that the Secretary of State should not be able to amend the substantive scope of the regime at this level, unless moves have been initiated by Ofcom and followed by effective parliamentary oversight and scrutiny. Parliament should have a say in this. There should be no room for this level of interference in a regulatory regime, and the Minister knows that these powers are at risk of being abused by a bad actor, whoever the Secretary of State of the day may be. I must, once again, press the Minister to specifically address the concerns that Labour colleagues and I have repeatedly raised, both during these debates and on Second Reading.
I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.
My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?
I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.
Again, Labour has concerns about clause 177, which gives the Secretary of State a power to make consequential provisions relating to the Bill or regulations under the Bill. As we know, the power is exercised by regulation and includes the ability to amend the Communications Act 2003. I will spare the Committee a repetition of my sentiments, but we do feel that the clause is part of an extremely worrying package of clauses related to the Secretary of State’s powers, which we feel are broadly unnecessary.
We have the same concerns about clause 178, which sets out how the powers to make regulations conferred on the Secretary of State may be used. Although we recognise that it is important in terms of flexibility and future-proofing that regulations made under the Bill can make different provisions for different purposes, in particular relating to different types of service, we are concerned about the precedent that this sets for future legislation that relies on an independent regulatory system.
Labour supports amendment 160, which will ensure that the regulations made under new schedule 2, which we will debate shortly, are subject to the affirmative procedure. That is vital if the Bill is to succeed. We have already expressed our concerns about the lack of scrutiny of other provisions in the Bill, so we see no issue with amendment 160.
The Minister has outlined clause 179, and he knows that we welcome parliamentary oversight and scrutiny of the Bill more widely. We regard this as a procedural clause and have therefore not sought to amend it.
Question put and agreed to.
Clause 177 accordingly ordered to stand part of the Bill.
Clause 178 ordered to stand part of the Bill.
Clause 179
Parliamentary procedure for regulations
Amendment made: 160, in clause 179, page 146, line 13, at end insert “, or
(k) regulations under paragraph 7 of Schedule (Recovery of OFCOM’s initial costs),”—(Chris Philp.)
This amendment provides that regulations under NS2 are subject to the affirmative procedure.
Clause 179, as amended, ordered to stand part of the Bill.
Clause 180
“Provider” of internet service
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to consider the following:
Clauses 181 to 188 stand part.
Amendment 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.
This amendment clarifies the definition of “content” in the bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.
I will address clauses 180 to 182 together, before moving on to discuss our concerns about the remaining clauses in this group.
As we know, clause 180 determines who is the provider of an internet service and therefore who is subject to the duties imposed on providers. Labour has already raised concerns about the Bill’s lack of future-proofing and its inability to incorporate internet services that may include user-to-user models. The most obvious of those are user-to-user chat functions in gaming, which the hon. Member for Aberdeen North has raised on a number of occasions; we share her concerns.
Broadly, we think the Bill as it stands fails to capture the rapidity of technological advances, and the gaming industry is a key example of this. The Bill targets the providers that have control over who may use the user-to-user functions of a game, but in our view the clarity just is not there for emerging tech in the AI space in particular, so we would welcome the Minister’s comments on where he believes this is defined or specified in the Bill.
Clause 181 defines “user”, “United Kingdom user” and “interested person” in relation to regulated services. We welcome the clarification outlined in subsections (3) and (4) of the role of an employee at a service provider and their position when uploading content. We support the clarity on the term “internet service” in clause 182, and we welcome the provisions to capture services that are accessed via an app specifically, rather than just via an internet browser.
We welcome clause 183, which sets out the meaning of “search engine”. It is important to highlight the difference between search engines and user-to-user services, which has been attempted throughout the Bill. We heard from Google about its definition of “search”, and Labour agrees that, at their root, search services exist as an index of the web, and are therefore different from user-to-user services. We also fully appreciate the rapid nature of the internet—hundreds of web pages are created every single second—meaning that search services have a fundamental role to play in assisting users to find authoritative information that is most relevant to what they are seeking. Although search engines do not directly host content, they have an important role to play in ensuring that a delicate balance is maintained between online safety and access to lawful information. We are therefore pleased to support clause 183, which we feel broadly outlines the responsibilities placed on search services more widely.
On clause 184, Labour supports the need for proactive technology to be used by regulated service providers to comply with their duties on illegal content, content that is harmful to children, and fraudulent advertising. In our consideration of proactive technology elsewhere in the Bill, Labour has made it clear that we support measures to keep us all safe. When speaking to new clause 20, which we debated with clause 37, I made it clear that we disagree with the Bill’s stance on proactive technology. As it is, the Bill will leave Ofcom unable to proactively require companies to use technology that can detect child abuse. Sadly, I was not particularly reassured by the Minister’s response, but it is important to place on the record again our feeling that proactive technology has an important role to play in improving online safety more widely.
Clause 185 provides information to assist Ofcom in its decision making on whether, in exercising its powers under the Bill, content is communicated publicly or privately. We see no issues with the process that the clause outlines. It is fundamentally right that, in the event of making an assessment of public or private content, Ofcom has a list of factors to consider and a subsequent process to follow. We will therefore support clause 185, which we have not sought to amend.
Clause 186 sets out the meaning of the term “functionality”. Labour supports the clause, particularly the provisions in subsection (2), which include the detailed ways in which platforms’ functionality can affect subsequent online behaviours. Despite our support, I put on the record our concern that the definitions in the clause do little to imagine or capture the broad nature of platforms or, indeed, the potential for them to expand into the AI space in future.
The Minister knows that Labour has advocated a systems-based approach to tackling online safety that would put functionality at the heart of the regulatory system. It is a frustrating reality that those matters are not outlined until clause 186. That said, we welcome the content of the clause, which we have not sought to amend.
Clause 187 aims to define “harm” as “physical or psychological harm”. Again, we feel that that definition could go further. My hon. Friend the Member for Batley and Spen spoke movingly about her constituent Zach in an earlier debate, and made a compelling case for clarity on the interplay between the physical and psychological harm that can occur online. The Minister said that the Government consider the Bill to cover a range of physical and psychological harms, but many charities disagree. What does he say to them?
We will shortly be considering new clause 23, and I will outline exactly how Labour feels that the Bill fails to capture the specific harms that women and girls face online. It is another frustrating reality that the Government have not taken the advice of so many stakeholders, and of so many women and girls, to ensure that those harms are on the face of the Bill.
Labour agrees with the provisions in clause 188, which sets out the meaning of “online safety functions” and “online safety matters”, so we have not sought to amend the clause.
Would it be appropriate for me to speak to the SNP amendment as well, Sir Roger?
Not really. If the hon. Lady has finished with her own amendments, we should, as a courtesy, allow the SNP spokesperson to speak to her amendment first.
Thank you, Sir Roger. I thank the shadow Minister for running through some of our shared concerns about the clauses. Similarly, I will talk first about some of the issues and questions that I have about the clauses, and then I will speak to amendment 76. Confusingly, amendment 76 was tabled to clause 189, which we are not discussing right now. I should have raised that when I saw the provisional selection of amendments. I will do my best not to stray too far into clause 189 while discussing the amendment.
I have raised before with the Minister some of the questions and issues that I have. Looking specifically at clause 181, I very much appreciate the clarification that he has given us about users, what the clause actually means, and how the definition of “user” works. To be fair, I agree with the way the definition of “user” is written. My slight concern is that, in measuring the number of users, platforms might find it difficult to measure the number of unregistered users and the number of users who are accessing the content through another means.
Let us say, for example, that someone is sent a WhatsApp message with a TikTok link and they click on that. I do not know whether TikTok has the ability to work out who is watching the content, or how many people are watching it. Therefore, I think that TikTok might have a difficulty when it comes to the child safety duties and working out the percentage or number of children who are accessing the service, because it will not know who is accessing it through a secondary means.
I am not trying to give anyone a get-out clause. I am trying to ensure that Ofcom can properly ensure that platforms that have a significant number of children accessing them through secondary means are still subject to the child safety duties even though there may not be a high number of children accessing the platform or the provider directly. My major concern is assessing whether they are subject to the child safety duties laid out in the Bill.
I will move straight on to our amendment 76, which would amend the definition of “content” in clause 189. I have raised this issue with the Minister already. The clause, as amended, would state that
“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including but not limited to”—
and then a list. The reason I suggest that we should add those words “but not limited to” is that if we are to have a list, we should either make an exhaustive list or have clarity that there are other things that may not be on the list.
I understand that it could be argued that the word “including” suggests that the provision actually goes much wider than what is in the list. I understand that that is the argument the Minister may make, but can we have some more clarity from him? If he is not willing to accept the amendment but is willing to be very clear that the provision genuinely includes anything communicated by means of an internet service, including things that we have not thought of and that do not currently exist, that will be very helpful.
I think that the amendment would add something positive to the Bill. It is potentially the most important amendment that I have tabled in relation to future-proofing the Bill, because it does feel as though the definition of “content”, even though it says “including”, is unnecessarily restrictive and could be open to challenge should someone invent something that is not on the list and say, “Well, it’s not mentioned, so I am not going to have to regulate this in the way we have to regulate other types of content.”
I have other questions about the same provision in clause 189, but I will hold on to those until we come to the next grouping.
I rise briefly to support amendment 76, in the name of the hon. Member for Aberdeen North. Labour supports broadening the definition of “content” in this way. I refer the Minister to our earlier contributions about the importance of including newspaper comments, for example, in the scope of the Bill. This is a clear example of a key loophole in the Bill. We believe that a broadened definition of “content” would be a positive step forward to ensure that there is future-proofing, to prevent any unnecessary harm from any future content.
The shadow Minister, in her first contribution to the debate, introduced the broad purpose of the various clauses in this group, so I do not propose to repeat those points.
I would like to touch on one or two issues that came up. One is that clause 187 defines the meaning of “harm” throughout the Bill, although clause 150, as we have discussed, has its own internal definition of harm that is different. The more general definition of harm is made very clear in clause 187(2), which states:
“‘Harm’ means physical or psychological harm.”
That means that harm has a very broad construction in the Bill, as it should, to make sure that people are being protected as they ought to be.
Amendment 111 is not claimed; it has been tabled by the hon. Member for Stroud (Siobhan Baillie), who is not a member of the Committee. I am assuming that nobody wishes to take ownership of it and we will not debate it.
If the hon. Member for Aberdeen North wishes to move amendment 76, she will be able to do so at the end of the stand part debate.
Question proposed, That the clause stand part of the Bill.
As we know, the clause sets out the meanings of various terms used in the Bill. Throughout our Committee debates, Labour has raised fundamental concerns on a number of points where we feel the interpretation of the Bill requires clarification. We raised concerns as early as clause 8, when we considered the Bill’s ability to capture harm in relation to newly produced CSEA content and livestreaming. The Minister may feel he has sufficiently reassured us, but I am afraid that simply is not the case. Labour has no specific issues with the interpretations listed in clause 189, but we will likely seek to table further amendments on Report in the areas that we feel require clarification.
In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to
“one-to-one live aural communications”
in defining things that are excluded.
I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.
I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.
More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.
Labour has not tabled any amendments to clause 190, which lists the provisions that define or explain terms used in the Bill. However, it will come as no surprise that we dispute the Bill’s definition of harm, and I am grateful to my hon. Friend the Member for Batley and Spen for raising those important points in our lively debate about amendment 112 to clause 150. We maintain that the Minister has missed the point, in that the Bill’s definition of harm fails to truly capture physical harm caused as a consequence of being online. I know that the Minister has promised to closely consider that as we head to Report stage, but I urge him to bear in mind the points raised by Labour, as well as his own Back Benchers.
The Minister knows, because we have repeatedly raised them, that we have concerns about the scope of the Bill’s provisions relating to priority content. I will not repeat myself, but he will be unsurprised to learn that this is an area in which we will continue to prod as the Bill progresses through Parliament.
I have made points on those issues previously. I do not propose to repeat now what I have said before.
Question put and agreed to.
Clause 190 accordingly ordered to stand part of the Bill.
Clause 191 ordered to stand part of the Bill.
Clause 192
Extent
The clause provides that the Bill extends to England, Wales, Scotland and Northern Ireland, subject to the exceptions set out in subsections (2) to (7). We welcome clarification of how the devolved nations may be affected by the provisions of the Bill—that is of particular importance to me as a Welsh MP. It is important to clarify how amendments or appeals, as outlined in subsection (7), may work in the context of devolution more widely.
Labour also supports new clause 35 and Government amendment 141. Clearly, those working for Ofcom should have a defence to the offence of publishing obscene articles as, sadly, we see that as a core part of establishing the online safety regime in full. We know that having such a defence available is likely to be an important part of the regulator’s role and that of its employees. Labour is therefore happy to support this sensible new clause and amendment.
Amendment 139 was tabled by a Member who is not a member of the Committee, and nobody has claimed it, so we come to amendment 49.
I beg to move amendment 49, in clause 193, page 161, line 1, leave out subsection (2) and insert—
“(2) Subject to subsection (2A) below, the other provisions of this Act come into force on such day as the Secretary of State may by regulations appoint.
(2A) The provisions of Part 5 shall come into force at the end of the period of three months beginning with the day on which this Act is passed.”
This amendment would bring Part 5 into force three months after the Act is passed.
We all understand the need for the Bill, which is why we have been generally supportive in Committee. I hope we can also agree that the measures that the Bill introduces must come into force as soon as is reasonably possible. That is particularly important for the clauses introducing protections for children, who have been subject to the harms of the online world for far too long already. I was glad to hear the Minister say in our discussions of clauses 31 to 33 that the Government share the desire to get such protections in place quickly.
My hon. Friend the Member for Worsley and Eccles South also spoke about our concerns about the commencement and transitional provisions when speaking to clauses 170 to 172. We fundamentally believe that the provisions on pornography in part 5 cannot, and should not, be susceptible to further delay, because they require no secondary legislation. I will come to that point in my comments on the amendment. More broadly, I will touch briefly on the reasons why we cannot wait for the legislation and make reference to a specific case that I know colleagues across the House are aware of.
My hon. Friend the Member for Reading East (Matt Rodda) has been a powerful voice on behalf of his constituents Amanda and Stuart Stephens, whose beloved son Olly was tragically murdered in a field outside his home. A BBC “Panorama” investigation, shown only a few days ago, investigated the role that social media played in Olly’s death. It specifically highlighted disturbing evidence that some social media algorithms may still promote violent content to vulnerable young people. That is another example highlighting the urgent need for the Bill, along with a regulatory process to keep people safe online.
We also recognise, however, the important balance between the need for effective development of guidance by Ofcom, informed by consultation, and the need to get the duties up and going. In some cases, that will mean stipulating deadlines in the Bill itself, the absence of which we feel is a serious omission and oversight at present.
The amendment would bring part 5 of the Bill into force three months after it is enacted. The Minister knows how important part 5 is, so I do not need to repeat myself. The provisions of the amendment, including subsequent amendments that Labour and others will likely table down the line, are central to keeping people safe online. We have heard compelling evidence from experts and speeches from colleagues across the House that have highlighted how vital it is that the Bill goes further on pornographic content. The amendment is simple. It seeks to make real, meaningful change as soon as is practically possible. The Bill is long delayed, and providers and users are desperate for clarity and positive change, which is what led us to tabling the amendment.
In the interests of not having to make a speech in this debate, I want to let the hon. Member know that I absolutely support the amendment. It is well balanced, brings the most important provisions into force as soon as possible, and allows the Secretary of State to appoint dates for the others.
I welcome the hon. Member’s intervention, and I am grateful for her and her party’s support for this important amendment.
It is also worth drawing colleagues’ attention to the history of issues, which have been brought forward in this place before. We know there was reluctance on the part of Ministers when the Digital Economy Act 2017 was on the parliamentary agenda to commence the all-important part 3, which covered many of the provisions now in part 5. Ultimately, the empty promises made by the Minister’s former colleagues have led to huge, record failures, even though the industry is ready, having had years to prepare to implement the policy. I want to place on record my thanks to campaigning groups such as the Age Verification Providers Association and others, which have shown fierce commitment in getting us this far.
It might help if I cast colleagues’ minds back to the Digital Economy Act 2017, which received Royal Assent in April of that year. Following that, in November 2018, the then Minister of State for Digital and Creative Industries told the Science and Technology Committee that part 3 of the DEA would be in force “by Easter next year”. Then, in December 2018, both Houses of Parliament approved the necessary secondary legislation, the Online Pornography (Commercial Basis) Regulations 2018, and the required statutory guidance.
But shortly after, in April 2019, the first delay arose when the Government published an online press release stating that part 3 of the DEA would not come into force until 15 July 2019. However, June 2019 came around and still there was nothing. On 20 June, five days after it should have come into force, the then Under-Secretary of State told the House of Lords that the Government had failed to notify the European Commission of the statutory guidance, which would need to be done, and that that would result in a delay to the commencement of part 3
“in the region of six months”.—[Official Report, House of Lords, 20 June 2019; Vol. 798, c. 883.]
However, on 16 October 2019, the then Secretary of State announced via a written statement to Parliament that the Government
“will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography.”—[Official Report, 16 October 2019; Vol. 666, c. 17WS.]
A mere 13 days later, the Government called a snap general election. I am sure those are pretty staggering realities for the Minister to hear—and defend—but I am willing to listen to his defence. It really is not good enough. The industry is ready, the technology has been there for quite some time, and, given this Government’s fondness for a U-turn, there are concerns that part 5 of the Bill, which we have spent weeks deliberating, could be abandoned in a similar way as part 3 of the DEA was.
The Minister has failed to concede on any of the issues we have raised in Committee. It seems we are dealing with a Government who are ignoring the wide-ranging gaps and issues in the Bill. He has a relatively last-ditch opportunity to at least bring about some positive change, and to signify that he is willing to admit that the legislation as it stands is far from perfect. The provisions in part 5 are critical—they are probably the most important in the entire Bill—so I urge him to work with Labour to make sure they are put to good use in a more than reasonable timeframe.
On the implementation of part 3 of the Digital Economy Act 2017, all the events that the shadow Minister outlined predated my time in the Department. In fact, apart from the last few weeks of the period she talked about, the events predated my time as a Minister in different Departments, and I cannot speak for the actions and words of Ministers prior to my arrival in DCMS. What I can say, and I have said in Committee, is that we are determined to get the Bill through Parliament and implemented as quickly as we can, particularly the bits to do with child safety and the priority illegal content duties.
The shadow Minister commented at the end of her speech that she thought the Government had been ignoring parliamentary opinion. I take slight issue with that, given that we published a draft Bill in May 2021 and went through a huge process of scrutiny, including by the Joint Committee of the Commons and the Lords. We accepted 66 of the Joint Committee’s recommendations, and made other very important changes to the Bill. We have made changes such as addressing fraudulent advertising, which was previously omitted, and including commercial pornography—meaning protecting children—which is critical in this area.
The Government have made a huge number of changes to the Bill since it was first drafted. Indeed, we have made further changes while the Bill has been before the Committee, including amending clause 35 to strengthen the fraudulent advertising duties on large search companies. Members of Parliament, such as the right hon. Member for East Ham (Sir Stephen Timms), raised that issue on Second Reading. We listened to what was said at that stage and we made the changes.
There have also been quite a few occasions during these Committee proceedings when I have signalled—sometimes subtly, sometimes less so—that there are areas where further changes might be forthcoming as the Bill proceeds through both Houses of Parliament. I do not think the hon. Member for Pontypridd, or any member of the Committee, should be in any doubt that the Government are very open to making changes to the Bill where we are able to and where they are right. We have done so already and we might do so again in the future.
On the specifics of the amendment, we share the intention to protect children from accessing pornography online as quickly as possible. The amendment seeks to set a three-month timeframe within which part 5 must come into force. However, an important consideration for the commencement of part 5 will be the need to ensure that all kinds of providers of online pornography are treated the same, including those hosting user-generated content, which are subject to the duties of part 3. If we take a piecemeal approach, bringing into force part 5, on commercial pornography, before part 3, on user-to-user pornography, that may enable some of the services, which are quite devious, to simply reconfigure their services to circumvent regulation or cease to be categorised as part 5 services and try to be categorised as part 3 services. We want to do this in a comprehensive way to ensure that no one will be able to wriggle out of the provisions in the Bill.
Parliament has also placed a requirement on Ofcom to produce, consult on and publish guidance for in-scope providers on meeting the duties in part 5. The three-month timescale set out in the amendment would be too quick to enable Ofcom to properly consult on that guidance. It is important that the guidance is right; if it is not, it may be legally challenged or turn out to be ineffective.
I understand the need to get this legislation implemented quickly. I understand the scepticism that flows from the long delays and eventual cancellation of part 3 of the Digital Economy Act 2017. I acknowledge that, and I understand where the sentiment comes from. However, I think we are in a different place today. The provisions in the Bill have been crafted to address some of the concerns that Members had about the previous DEA measures—not least the fact that they are more comprehensive, as they cover user-to-user, which the DEA did not. There is therefore a clear commitment to getting this done, and getting it done fast. However, we also have to get it done right, and I think the process we have set out does that.
The Ofcom road map is expected before the summer. I hope that will give further reassurance to the Committee and to Parliament about the speed with which these things can get implemented. I share Members’ sentiments about needing to get this done quickly, but I do not think it is practical or right to do it in the way set out in amendment 49.
I am grateful for the Minister’s comments. However, I respectfully disagree, given the delays already since 2017. The industry is ready for this. The providers of the age verification services are ready for this. We believe that three months is an adequate timeframe, and it is vital that we get this done as quickly as possible. With that in mind, I will be pushing amendment 49 to a vote.
Question put, That the amendment be made.
New clause 42 introduces new schedule 2. New clause 43 provides that the additional fees charged to providers under new schedule 2 must be paid into the Consolidated Fund. We discussed that a few days ago. That is where the fees are currently destined, and I owe my right hon. Friend the Member for Basingstoke some commentary on this topic in due course. The Bill already provided that monetary penalties must be paid into the Consolidated Fund; those provisions are now placed in new clause 43.
New schedule 2, which is quite detailed, makes provision in connection with Ofcom’s ability to recover its initial costs, which we have previously debated. As discussed, it is important not only that the taxpayer is protected from the ongoing costs but also that the set-up costs are recovered. The taxpayer should not have to pay for the regulatory framework; the people who are being regulated should pay, whether the costs are incurred before or after commencement, in line with the “polluter pays” principle. Deep in new schedule 2 is the answer to the question that the hon. Member for Aberdeen North asked a day or two ago about the period over which set-up costs can be recovered, with that period specified as between three and five years. I hope that provides an introduction to the new clauses and new schedule.
We welcome this grouping, which includes two new clauses and a new schedule. Labour has raised concerns about the future funding of Ofcom more widely, specifically when we discussed groupings on clause 42. The Minister’s response did little to alleviate our concerns about the future of Ofcom’s ability to raise funds to maintain its position as the regulator. Despite that, we welcome the grouping, particularly the provisions in the new schedule, which will require Ofcom to seek to recover the costs it has incurred when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services. This is an important step, which we see as being broadly in line with the kind of mechanisms already in place for other, similar regulatory regimes.
Ultimately, it is right that fees charged to providers under new schedule 2 must be paid into the Consolidated Fund, and it is important that Ofcom can recover its costs before a full fee structure and governance process is established. However, I have some questions for the Minister. How many people has Ofcom hired into roles, and can any of those costs count towards the calculation of fees? We want to ensure that other areas of regulation do not lose out as a consequence. Broadly speaking, though, we are happy to support the grouping and have not sought to table amendments at this stage.
So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.
Question put and agreed to.
New clause 42 accordingly read a Second time, and added to the Bill.
New Clause 43
Payment of sums into the Consolidated Fund
“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.
(2) In subsection (1), after paragraph (i) insert—
‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;
(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’
(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.
(4) After subsection (3) insert—
‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’
(5) In the heading, omit ‘licence’.”—(Chris Philp.)
This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.
Brought up, read the First and Second time, and added to the Bill.
New Clause 3
Establishment of Advocacy Body
“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.
(2) A ‘child user’—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) ‘enforceable requirements’ relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)
This new clause creates a new advocacy body for child users of regulated internet services.
Brought up, and read the First time.
I am grateful to the Minister for his support for Labour legislation. Does he acknowledge that we have different Children’s Commissioners across the nations of the UK? Each would have the same rights to advocate for children, so we would have four, rather than one focusing on one specific issue, which is what the Children’s Commissioners across the UK are advocating for.
I do not have in front of me the relevant devolved legislation—I have only the Children Act 2004 directly in front of me—but I assume it is broadly similar. The hon. Member for Aberdeen North can correct me if I am wrong, but I assume it is probably broadly similar in the way—[Interruption.] She is not sure, so I do not feel too bad about not being sure either. I imagine it is similar. I am not sure that having similar statutory bodies with the same function—we would create another with the new clause—is necessarily helpful.
The Bill sets out formal processes that allow other organisations, such as the NSPCC, to raise complaints that have to be dealt with. That ensures that the voices of groups—including children, but not just children—will be heard. I suspect that if we have a children’s advocacy body, other groups will want them and might feel that they have been overlooked by omission.
The good thing about the way the super-complaint structure in clause 140 works is that it does not prescribe what the groups are. Although I am sure that children will be top of the list, there will be other groups that want to advocate and to be able to bring super-complaints. I imagine that women’s groups will be on that list, along with groups advocating for minorities and people with various sexual orientations. Clause 140 is not exclusive; it allows all these groups to have a voice that must be heard. That is why it is so effective.
My right hon. Friend the Member for Basingstoke and the hon. Member for Batley and Spen asked whether the groups have enough resources to advocate on issues under the super-complaint process. That is a fair question. The allocation of funding to different groups tends to be done via the spending review process. Colleagues in other Departments—the Department for Education or, in the case of victims, the Ministry of Justice—allocate quite a lot of money to third-sector groups. The victims budget was approximately £200 million a year or two ago, and I am told it has risen to £300 million for the current financial year. That is the sort of funding that can find its way into the hands of the organisations that advocate for particular groups of victims. My right hon. Friend asked whether the proceeds of fines could be applied to fund such work, and I have undertaken to raise that with the Treasury.
We already have a statutory advocate for children: the four Children’s Commissioners for the four parts of the United Kingdom. We have the super-complaints process, which covers more than children’s groups, crucial though they are. We have given Ofcom statutory duties to consult when developing its codes of practice, and we have money flowing via the Ministry of Justice, the DFE and others, into advocate groups. Although we agree with the intention behind new clause 3, we believe its objectives are very well covered via the mechanisms that I have just set out at some length.
(2 years, 5 months ago)
Public Bill CommitteesWith this it will be convenient to discuss the following:
Clause 119 stand part.
Government amendments 154 to 157.
Clauses 120 and 121 stand part.
Bore da, Ms Rees. It is, as ever, a pleasure to serve under your chairship. I rise to speak to clauses 118 to 121 and Government amendments 154 to 157.
As we all know, clause 118 is important and allows Ofcom to impose a financial penalty on a person who fails to complete steps that have been required by Ofcom in a confirmation decision. This is absolutely vital if we are to guarantee that regulated platforms take seriously their responsibilities in keeping us all safe online. We support the use of fines. They are key to overall behavioural change, particularly in the context of personal liability. We welcome clause 118, which outlines the steps Ofcom can take in what we hope will become a powerful deterrent.
Labour also welcomes clause 119. It is vital that Ofcom has these important powers to impose a financial penalty on a person who fails to comply with a notice that requires technology to be implemented to identify and deal with content relating to terrorism and child sexual exploitation and abuse on their service. These are priority harms and the more that can be done to protect us on these two points the better.
Government amendments 155 and 157 ensure that Ofcom has the power to impose a monetary penalty on a provider of a service who fails to pay a fee that it is required to pay under new schedule 2. We see these amendments as crucial in giving Ofcom the important powers it needs to be an effective regulator, which is something we all require. We have some specific observations around new schedule 2, but I will save those until we consider that schedule. For now, we support these amendments and I look forward to outlining our thoughts shortly.
We support clause 120, which allows Ofcom to give a penalty notice to a provider of a regulated service who does not pay the fee due to Ofcom in full. This is a vital provision that also ensures that Ofcom’s process to impose a penalty can progress only when it has given due notice to the provider and once the provider has had a fair opportunity to make representations to Ofcom. This is a fair approach and is central to the Bill, which is why we have not sought to amend the clause.
Finally, we support clause 121, which ensures that Ofcom must state the reasons why it is imposing a penalty, the amount of the penalty and any aggravating or mitigating factors. Ofcom must also state when the penalty must be paid. It is imperative that, when issuing a notice, Ofcom is required to publish information about the amount, any aggravating or mitigating factors, and when the penalty must be paid. We support this important clause and have not sought to amend it.
It is a pleasure to serve under your chairmanship once again, Ms Rees, and I congratulate Committee members on evading this morning’s strike action.
I am delighted that the shadow Minister supports the intent behind these clauses, and I will not speak at great length given the unanimity on this topic. As she said, clause 118 allows Ofcom to impose a financial penalty for failure to take specified steps by a deadline set by Ofcom. The maximum penalty that can be imposed is the greater of £18 million or 10% of qualifying worldwide revenue. In the case of large companies, it is likely to be a much larger amount than £18 million.
Clause 119 enables Ofcom to impose financial penalties if the recipient of a section 103 notice does not comply by the deadline. It is very important to ensure that section 103 has proper teeth. Government amendments 154 to 157 make changes that allow Ofcom to recover not only the cost of running the service once the Bill comes into force, now and into the future, but also the preparatory cost of setting up for the Bill to come into force.
As previously discussed, £88 million of funding is being provided to Ofcom in this financial year and next. We believe that something like £20 million of costs that predate these financial years have been funded as well. That adds up to around £108 million. However, the amount that Ofcom recovers will be the actual cost incurred. The figure I provided is simply an indicative estimate. The actual figure would be based on the real costs, which Ofcom would be able to recoup under these measures. That means that the taxpayer—our constituents—will not bear any of the costs, including the set-up and preparatory cost. This is an equitable and fair change to the Bill.
Clause 120 sets out that some regulated providers will be required to pay a regulatory fee to Ofcom, as set out in clause 71. Clause 120 allows Ofcom to impose a financial penalty if a regulated provider does not pay its fee by the deadline it sets. Finally, clause 121 sets out the information that needs to be included in these penalty notices issued by Ofcom.
With this it will be convenient to discuss:
Government amendment 158.
That schedule 12 be the Twelfth schedule to the Bill.
Labour supports clause 122 and schedule 12, which set out in detail the financial penalties that Ofcom may impose, including the maximum penalty that can be imposed. Labour has long supported financial penalties for those failing to comply with the duties in the Bill. We firmly believe that tough action is needed on online safety, but we feel the sanctions should go further and that there should be criminal liability for offences beyond just information-related failures. We welcome clause 122 and schedule 12. It is vital that Ofcom is also required to produce guidelines around how it will determine penalty amounts. Consistency across the board is vital, so we feel this is a positive step forward and have not sought to amend the clause.
Paragraph 8 of schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in new clause 43, together with the requirement to pay fees charged under new schedule 2 into the Consolidated Fund. We therefore support the amendments.
I have nothing further to add on these amendments. The shadow Minister has covered them, so I will not detain the Committee further.
Question put and agreed to.
Clause 122 accordingly ordered to stand part of the Bill.
Schedule 12
Penalties imposed by OFCOM under Chapter 6 of Part 7
Amendment made: 158, in schedule 12, page 206, line 43, leave out paragraph 8.—(Chris Philp.)
Paragraph 8 of Schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in NC43 together with the requirement to pay fees charged under NS2 into the Consolidated Fund.
Schedule 12, as amended, agreed to.
Clause 123
Service restriction orders
I beg to move amendment 50, in clause 123, page 106, line 36, at end insert—
“(9A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (5).”
This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.
With this it will be convenient to discuss amendment 51, in clause 125, page 110, line 20, at end insert—
“(7A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (6).”
This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.
With your permission, Ms Rees, I will speak to clause stand part and clauses 124 to 127 at the same time. Labour supports clause 123, which outlines the powers that Ofcom will have when applying to the court for business disruption measures. Business disruption measures are court orders that require third parties to withdraw services or block access to non-compliant regulated services. It is right that Ofcom has these tools at its disposal, particularly if it is going to be able to regulate effectively against the most serious instances of user harm. However, the Bill will be an ineffective regime if Ofcom is forced to apply for separate court orders when trying to protect people across the board from the same harms. We have already waited too long for change. Labour is committed to giving Ofcom the powers to take action, where necessary, as quickly as possible. That is why we have tabled amendments 50 and 51, which we feel will go some way in tackling these issues.
Amendment 50 would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for—and/or appeal through the courts against any—orders to block access or support services. The Bill currently requires Ofcom to seek a separate court order for each service against which it wishes to take enforcement action in the form of blocking access or services. That is the only effective mechanism for overseas websites. UK-based services will be subject to enforcement notices and financial penalties that can be enforced without having to go to court. That creates a disadvantage for UK sites, which can be more easily enforced against.
Given that there are 4 million to 5 million pornographic websites, for example, the requirement for separate court orders will prevent Ofcom from taking action at scale and creating a level playing field for all adult sites. Under the Bill, Ofcom must take action against each offending website or social media company individually. While we acknowledge that the Government have stated that enforcement action can be taken against multiple offending content providers, in our opinion that is not made clear in the Bill.
Moreover, we are concerned that some pornography websites would seek to avoid the Bill’s requirements by changing their domain name—domain hopping. That was threatened last year when Germany moved to issue a blocking order against major providers of internet pornography. That is why Ofcom must be granted clear enforcement powers to take swift action against multiple websites and content providers in one court action or order.
This group of amendments would also provide clarity and ease of enforcement for internet service providers, which will be expected to enforce court orders. Labour wants the Bill to be genuinely effective, and amendments 50 and 51 could ensure that Ofcom has the tools available to it to take action at pace. We urge the Minister to accept these small concessions, which could have a hugely positive impact.
Amendment 51 would give Ofcom the ability to take action against a schedule of non-compliant sites, while preserving the right of those sites to oppose an application for an order to block access or support services, or to appeal through the courts against any such order.
It will come as no surprise that Labour supports clause 124, which sets out the circumstances in which Ofcom may apply to the courts for an interim service restriction order. We particularly support the need for Ofcom to be able to take action when time is not on its side, or where, put plainly, the level of harm being caused means that it would be inappropriate to wait for a definite failure before taking action.
However, we hope that caution is exercised if Ofcom ever needs to consider such an interim order; we must, of course, get the balance right in our approach to internet regulation more widely. I would therefore be grateful if the Minister could outline his understanding of the specifics of when these orders may be applied. More broadly, Labour agrees that Ofcom should be given the power to act when time demands it, so we have not sought to amend clause 124 at this stage.
Labour also supports the need for Ofcom to have the power to apply to the courts for an access restriction order, as outlined in clause 125. It is vital that Ofcom is given the power to prevent, restrict or deter individuals in the UK from accessing a service from a non-compliant provider. We welcome the specific provisions on access via internet service providers and app stores. We all know from Frances Haugen’s testimony that harmful material can often be easily buried, so it is right and proper that those are considered as “access facilities” under the clause. Ultimately, we support the intentions of clause 125 and, again, have not sought to amend it at this stage.
We also support clause 126, which sets out the circumstances in which Ofcom may apply to the courts for an interim access restriction order. I will not repeat myself: for the reasons I have already outlined, it is key that Ofcom has sufficient powers to act, particularly on occasions when it is inappropriate to wait for a failure to be established.
We welcome clause 127, which clarifies how Ofcom’s enforcement powers can interact. We particularly welcome clarification that, where Ofcom exercises its power to apply to the courts for a business disruption order under clauses 123 to 126, it is not precluded from taking action under its other enforcement powers. As we have repeatedly reiterated, we welcome Ofcom’s having sufficient power to reasonably bring about positive change and increase safety measures online. That is why we have not sought to amend clause 127.
Thank you for chairing this morning’s sitting, Ms Rees.
I agree with the hon. Member for Pontypridd that these clauses are necessary and important, but I also agree that the amendments are important. It seems like this is a kind of tidying-up exercise, to give Ofcom the ability to act in a way that will make its operation smoother. We all want this legislation to work. This is not an attempt to break this legislation—to be fair, none of our amendments have been—but an attempt to make things work better.
Amendments 50 and 51 are fairly similar to the one that the National Society for the Prevention of Cruelty to Children proposed to clause 103. They would ensure that Ofcom could take action against a group of sites, particularly if they were facing the same kind of issues, they had the same kind of functionality, or the same kind of concerns were being raised about them.
Let me start with amendments 50 and 51, which were introduced by the shadow Minister and supported by the SNP spokesperson. The Government recognise the valid intent behind the amendments, namely to make sure that applications can be streamlined and done quickly, and that Ofcom can make bulk applications if large numbers of service providers violate the new duties to the extent that interim service restriction orders or access restriction orders become necessary.
We want a streamlined process, and we want Ofcom to deal efficiently with it, including, if necessary, by making bulk applications to the court. Thankfully, however, procedures under the existing civil procedure rules already allow so-called multi-party claims to be made. Those claims permit any number of claimants, any number of defendants or respondents and any number of claims to be covered in a single form. The overriding objective of the CPR is that cases are dealt with justly and proportionately. Under the existing civil procedure rules, Ofcom can already make bulk applications to deal with very large numbers of non-compliant websites and service providers in one go. We completely agree with the intent behind the amendments, but their content is already covered by the CPR.
It is worth saying that the business disruption measures—the access restriction orders and the service restriction orders—are intended to be a last resort. They effectively amount to unplugging the websites from the internet so that people in the United Kingdom cannot access them and so that supporting services, such as payment services, do not support them. The measures are quite drastic, although necessary and important, because we do not want companies and social media firms ignoring our legislation. It is important that we have strong measures, but they are last resorts. We would expect Ofcom to use them only when it has taken reasonable steps to enforce compliance using other means.
If a provider outside the UK ignores letters and fines, these measures are the only option available. As the shadow Minister, the hon. Member for Pontypridd, mentioned, some pornography providers probably have no intention of even attempting to comply with our regulations; they are probably not based in the UK, they are never going to pay the fine and they are probably incorporated in some obscure, offshore jurisdiction. Ofcom will need to use these powers in such circumstances, possibly on a bulk scale—I am interested in her comment that that is what the German authorities had to do—but the powers already exist in the CPR.
It is also worth saying that in its application to the courts, Ofcom must set out the information required in clauses 123(5) and 125(3), so evidence that backs up the claim can be submitted, but that does not stop Ofcom doing this on a bulk basis and hitting multiple different companies in one go. Because the matter is already covered in the CPR, I ask the shadow Minister to withdraw the amendment.
I am interested to know whether the Minister has anything to add about the other clauses. I am happy to give way to him.
I thank the shadow Minister for giving way. I do not have too much to say on the other clauses, because she has introduced them, but in my enthusiasm for explaining the civil procedure rules I neglected to respond to her question about the interim orders in clauses 124 and 126.
The hon. Lady asked what criteria have to be met for these interim orders to be made. The conditions for clause 124 are set out in subsections (3) and (4) of that clause, which states, first, that it has to be
“likely that the…service is failing to comply with an enforceable requirement”—
so it is likely that there has been a breach—and, secondly, that
“the level of risk of harm to individuals in the United Kingdom…and the nature and severity of that harm, are such that it would not be appropriate to wait to establish the failure before applying for the order.”
Similar language in clause 124(4) applies to breaches of section 103.
Essentially, if it is likely that there has been a breach, and if the resulting harm is urgent and severe—for example, if children are at risk—we would expect these interim orders to be used as emergency measures to prevent very severe harm. I hope that answers the shadow Minister’s question. She is very kind, as is the Chair, to allow such a long intervention.
I welcome the Minister’s comments about clauses 124 and 126 in answer to my questions, and also his comments about amendments 50 and 51, clarifying the CPR. If the legislation is truly to have any impact, it must fundamentally give clarity to service users, providers and regulators. That is why we seek to remove any ambiguity and to put these important measures in the Bill, and it is why I will press amendment 50 to a Division.
Question put, That the amendment be made.
The Minister and his Back Benchers will, I am sure, be tired of our calls for more transparency, but I will be kind to him and confirm that Labour welcomes the provisions in clause 128.
We believe that it is vital that, once Ofcom has followed the process outlined in clause 110 when issuing a confirmation decision outlining its final decision, that is made public. We particularly welcome provisions to ensure that when a confirmation decision is issued, Ofcom will be obliged to publish the identity of the person to whom the decision was sent, details of the failure to which the decision relates, and details relating to Ofcom’s response.
Indeed, the transparency goes further, as Ofcom will be obliged to publish details of when a penalty notice has been issued in many more areas: when a person fails to comply with a confirmation decision; when a person fails to comply with a notice to deal with terrorism content or child sexual exploitation and abuse content, or both; and when there has been a failure to pay a fee in full. That is welcome indeed. Labour just wishes that the Minister had committed to the same level of transparency on the duties in the Bill to keep us safe in the first place. That said, transparency on enforcement is a positive step forward, so we have not sought to amend the clause at this stage.
I am grateful for the shadow Minister’s support. I have nothing substantive to add, other than to point to the transparency reporting obligation in clause 64, which we have debated.
Question put and agreed to.
Clause 128 accordingly ordered to stand part of the Bill.
Clause 129
OFCOM’s guidance about enforcement action
The hon. Member asks for my assistance in interpreting legislative language. Generally speaking, “consult” means what it suggests. Ofcom will consult the Secretary of State, as it will consult the ICO, to ascertain the Secretary of State’s opinion, but Ofcom is not bound by that opinion. Unlike the power in a previous clause—I believe it was clause 40—where the Secretary of State could issue a direct instruction to Ofcom on certain matters, here we are talking simply about consulting. When the Secretary of State expresses an opinion in response to the consultation, it is just that—an opinion. I would not expect it to be binding on Ofcom, but I would expect Ofcom to pay proper attention to the views of important stakeholders, which in this case include both the Secretary of State and the ICO. I hope that gives the hon. Member the clarification he was seeking.
As we know, clause 129 requires Ofcom to publish guidance about how it will use its enforcement powers. It is right that regulated providers and other stakeholders have a full understanding of how, and in what circumstances, Ofcom will have the legislative power to exercise this suite of enforcement powers. We also welcome Government amendment 7, which will ensure that the Information Commissioner—a key and, importantly, independent authority—is included in the consultation before guidance is produced.
As we have just heard, however, the clause sets out that Secretary of State must be consulted before Ofcom produces guidance, including revised or replacement guidance, about how it will use its enforcement powers. We feel that that involves the Secretary of State far too closely in the enforcement of the regime. The Government should be several steps away from being involved, and the clause seriously undermines Ofcom’s independence—the importance of which we have been keen to stress as the Bill progresses, and on which Conservative Back Benchers have shared our view—so we cannot support the clause.
I repeat the point I made to the hon. Member for Liverpool, Walton a moment ago. This is simply an obligation to consult. The clause gives the Secretary of State an opportunity to offer an opinion, but it is just that—an opinion. It is not binding on Ofcom, which may take that opinion into account or not at its discretion. This provision sits alongside the requirement to consult the Information Commissioner’s Office. I respectfully disagree with the suggestion that it represents unwarranted and inappropriate interference in the operation of a regulator. Consultation between organs of state is appropriate and sensible, but in this case it does not fetter Ofcom’s ability to act at its own discretion. I respectfully do not agree with the shadow Minister’s analysis.
I beg to move amendment 57, in clause 130, page 115, line 4, leave out “18” and insert “6”
This amendment changes the period by which the advisory committee must report from 18 months to 6.
With this, it will be convenient to discuss the following: amendment 58, in clause 130, page 115, line 5, at end insert—
“(6) Following the publication of the report, OFCOM must produce a code of practice setting out the steps services should take to reduce disinformation across their systems.”
This amendment requires Ofcom to produce a code of practice on system-level disinformation.
Clause stand part.
Clause 130 sets up a committee to advise Ofcom on misinformation and disinformation; it is the only direct reference to misinformation and disinformation in the entire Online Safety Bill. However, the Bill gives the committee no identifiable powers or active role in tackling harmful misinformation and disinformation, meaning that it has limited practical purpose. It is also unclear how the advisory committee will fit with Ofcom's wider regulatory functions.
The remaining provisions in the Bill are limited and do not properly address harmful misinformation and disinformation. If tackling harmful misinformation and disinformation is left to this clause, the Bill will fail both to tackle harm properly, and to keep children and adults safe.
The clause risks giving a misleading impression that action is being taken. If the Government and Ofcom proceed with creating the committee, we need to see that its remit is strengthened and clarified, so that it more effectively tackles harmful disinformation and misinformation. That should include advising on Ofcom’s research, reporting on drivers of harmful misinformation and disinformation, and proportionate responses to them. There should also be a duty on Ofcom to consult the committee when drafting relevant codes of practice.
That is why we have tabled amendment 57. It would change the period by which the advisory committee must report from 18 months to six. This is a simple amendment that encourages scrutiny. Once again, the Minister surely has little reason not to accept it, especially as we have discussed at length the importance of the advisory committee having the tools that it needs to succeed.
Increasing the regularity of these reports from the advisory committee is vital, particularly given the ever-changing nature of the internet. Labour has already raised concerns about the lack of futureproofing in the Bill more widely, and we feel that the advisory committee has an important role and function to play in areas where the Bill itself is lacking. We are not alone in this view; the Minister has heard from his Back Benchers about just how important this committee is.
Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. Again, this amendment will come as no surprise to the Minister, given the concerns that Labour has repeatedly raised about the lack of provisions relating to disinformation in the Bill. It seems like an obvious omission that the Bill has failed to consider a specific code of practice around reducing disinformation, and the amendment would be a simple way to ensure that Ofcom actively encourages services to reduce disinformation across their platforms. The Minister knows that this would be a welcome step, and I urge him to consider supporting the amendment.
I want to briefly agree with the sentiments of the Opposition Front Bench, especially about the strength of the committee and the lack of teeth that it currently has. Given that the Government have been clear that they are very concerned about misinformation and disinformation, it seems odd that they are covered in the Bill in such a wishy-washy way.
The reduction of the time from 18 months to six months would also make sense. We would expect the initial report that the committee publishes in six months not to be as full as the ones it would publish after that. I do not see any issue with the committee being required to produce a report as soon as possible to assess how the Act is bedding in and beginning to work, rather than having to wait until the Act is, potentially, already working properly before making that assessment. We want to be able to pick up any teething problems that the Act might have.
We want the committee to be able to say, “Actually, this is not working quite as we expected. We suggest that Ofcom operates in a slightly different way or that the interaction with providers happens in a slightly different way.” I would rather that problems with the Act were tackled as early as possible. We will not know about problems with the Act, because there is no proper review mechanism. There is no agreement on the committee, for example, to look at how the Act is operating. This is one of the few parts of the Bill where we have got an agreement to a review, and it would make sense that it happen as early as possible.
We agree that misinformation and disinformation are very important matters that really need to be tackled, but there is just not enough clout in the Bill to allow Ofcom to properly tackle these issues that are causing untold harm.
The clause allows Ofcom to confer functions on the content board in relation to content-related functions under the Bill, but does not require it to do so. We take the view that how Ofcom manages its responsibilities internally is a matter for Ofcom. That may change over time. The clause simply provides that Ofcom may, if Ofcom wishes, ask its content board to consider online safety matters alongside its existing responsibilities. I trust that the Committee considers that a reasonable measure.
Labour welcomes the clause, which, as the Minister has said, sets out some important clarifications with respect to the Communications Act 2003. We welcome the clarification that the content board will have delegated and advisory responsibilities, and look forward to the Minister’s confirmation of exactly what those are and how this will work in practice. It is important that the content board and the advisory committee on disinformation and misinformation are compelled to communicate, too, so we look forward to an update from the Minister on what provisions in the Bill will ensure that that happens.
The shadow Minister has asked how this will work in practice, but as I said, the internal operation of Ofcom obviously is a matter for Ofcom. As Members have said in the recent past—indeed, in the last hour—they do not welcome undue Government interference in the operation of Ofcom, so it is right that we leave this as a matter for Ofcom. We are providing Ofcom with the power, but we are not compelling it to use that power. We are respecting Ofcom’s operational independence—a point that shadow Ministers and Opposition Members have made very recently.
Question put and agreed to.
Clause 131 accordingly ordered to stand part of the Bill.
Clause 132
Research about users’ experiences of regulated services
Question proposed, That the clause stand part of the Bill.
We support clause 132, which ensures that Ofcom is required to understand and measure public opinion concerning providers of regulated services, as well as the experiences and interests of those using the regulated services in question. The Bill in its entirety is very much a learning curve for us all, and I am sure we all agree that, as I have previously maintained, the world really is watching as we seek to develop and implement the legislation. That is why it is vital that Ofcom is compelled to conduct and arrange its own research, to ensure that we are getting an accurate picture of how our regulatory framework is affecting people. I stress to the Minister that it is imperative that Ofcom consults all service providers—big and small—a point that the CBI stressed to me in recent meetings.
We also welcome the provisions outlined in subsection (2) that confirm that Ofcom must include a statement of its research in its annual report to the Secretary of State and the devolved Administrations. It is important that Ofcom, as a regulator, takes a research-led approach, and Labour is pleased to see these provisions included in the Bill.
We welcome the inclusion of clause 133, which extends the communications panel's remit to include online safety. This will mean that the panel is able to give advice on matters relating to different types of online content under the Bill, and on the impacts of online content on UK users of regulated services. It is a welcome step forward, so we have not sought to amend the clause.
I want to make one short comment about clauses 132 and 133, which are really important. There is no intention to interfere with or fetter the way that Ofcom operates, but there is an obligation on this Committee, and on Parliament, to indicate what we would expect to see from Ofcom by way of the clauses, because they are an essential part of the transparency that we are trying to inject into the sector.
Research about users' experiences is hugely important, and such reports contain important insights into how platforms are used and the levels of misinformation and disinformation that people are exposed to. Ofcom already produces highly authoritative reports on various aspects of the online world, including the finding that three in four adults do not think about whether the online information that they see is truthful. Indeed, one in three adults believes that all or most information that they find online is truthful. We know that there is a significant gap between consumers' perceptions and reality, so it is important to ensure that such research has good exposure among those using the internet.
We do not often hear about the problems of how the online world works, and the level of disinformation and inaccuracy is not well known, so will the Minister elaborate on how he expects Ofcom to ensure that people are aware of the reality of the online world? Platforms will presumably be required to have regard to the content of Ofcom reports, but will Ofcom be required to publicise its reports? It is not clear that such a duty is in the Bill at the moment, so does the Minister expect Ofcom to have a role in educating people, especially children, about the problem of inaccurate data or other aspects of the online world?
We know that a number of platforms spend a great deal of money on going into schools and talking about their products, which may or may not entail accurate information. Does Ofcom not have an important role to play in this area? Educating users about the changes in the Bill would be another potential role for Ofcom in order to recalibrate users’ expectations as to what they might reasonably expect platforms to offer as a result of the legislation. It is important that we have robust regulatory frameworks in place, and this Bill clearly does that. However, it also requires users to be aware of the changes that have been made so that they can report the problems they experience in a timely manner.
I welcome the support of the hon. Member for Pontypridd for these clauses. I will turn to the questions raised by my right hon. Friend the Member for Basingstoke. First, she asked whether Ofcom has to publish these reports so that the public, media and Parliament can see what they say. I am pleased to confirm that Ofcom does have to publish the reports; section 15 of the Communications Act 2003 imposes a duty on Ofcom to publish reports of this kind.
Secondly, my right hon. Friend asked about educating the public on issues pertinent to these reports, which is what we would call a media literacy duty. Again, I confirm that, under the Communications Act, Ofcom has a statutory duty to promote media literacy, which would include matters that flow from these reports. In fact, Ofcom published an expanded and updated set of policies in that area at the end of last year, which is why the old clause 103 in the original version of this Bill was removed—Ofcom had already gone further than that clause required.
Thirdly, my right hon. Friend asked about the changes that might happen in response to the findings of these reports. Of course, it is open to Ofcom—indeed, I think this Committee would expect it—to update its codes of practice, which it can do from time to time, in response to the findings of these reports. That is a good example of why it is important for those codes of practice to be written by Ofcom, rather than being set out in primary legislation. It means that when some new fact or circumstance arises or some new bit of research, such as the information required in this clause, comes out, those codes of practice can be changed. I hope that addresses the questions my right hon. Friend asked.
The hon. Member for Liverpool, Walton asked about transparency, referring to Frances Haugen's testimony to the US Senate and her disclosures to The Wall Street Journal, as well as the evidence she gave to this House, both to the Joint Committee and to this Committee just before the Whitsun recess. I have also met her bilaterally to discuss these issues. The hon. Gentleman is quite right to point out that these social media firms, Facebook being one example although there are others, are extremely secretive about what they say in public, to the media and even to representative bodies such as the United States Congress. That is why, as he says, it is extremely important that they are compelled to be a lot more transparent.
The Bill contains a large number of provisions compelling or requiring social media firms to make disclosures to Ofcom as the regulator. However, it is important to have public disclosure as well. It is possible that the hon. Member for Liverpool, Walton was not in his place when we came to the clause in question, but if he turns to clause 64 on page 56, he will see that it includes a requirement for Ofcom to give every provider of a relevant service a notice compelling them to publish a transparency report. I hope he will see that the transparency obligation that he quite rightly refers to—it is necessary—is set out in clause 64(1). I hope that answers the points that Committee members have raised.
Question put and agreed to.
Clause 132 accordingly ordered to stand part of the Bill.
Clause 133 ordered to stand part of the Bill.
Clause 134
OFCOM’s statement about freedom of expression and privacy
Question proposed, That the clause stand part of the Bill.
As we all know, the clause requires Ofcom to publish annual reports on the steps it has taken, when carrying out online safety functions, to uphold users’ rights under articles 8 and 10 of the convention, as required by section 6 of the Human Rights Act 1998. It will come as no surprise to the Minister that Labour entirely supports this clause.
Upholding users’ rights is a central part of this Bill, and it is a topic we have debated repeatedly in our proceedings. I know that the Minister faces challenges of his own, as the Opposition do, regarding the complicated balance between freedom of speech and safety online. It is only right and proper, therefore, for Ofcom to have a specific duty to publish reports about what steps it is taking to ensure that the online space is fair and equal for all.
That being said, we know that we can and should go further. My hon. Friend the Member for Batley and Spen will shortly address an important new clause tabled in her name—I believe it is new clause 25—so I will do my best not to repeat her comments, but it is important to say that Ofcom must be compelled to publish reports on how its overall regulatory function is operating. Although Labour welcomes clause 134, and especially its commitment to upholding users' rights, we believe that, at a time when many feel excluded from the existing online space, Ofcom can do more in its annual reporting. For now, however, we support clause 134.
I welcome the shadow Minister’s continuing support for these clauses. Clause 134 sets out the requirement on Ofcom to publish reports setting out how it has complied with articles 8 and 10 of the European convention on human rights.
I will pause for a second, because my hon. Friend the Member for Don Valley and others have raised concerns about the implications of the Bill for freedom of speech. In response to a question he asked last week, I set out in some detail the reasons why I think the Bill improves the position for free speech online compared with the very unsatisfactory status quo. This clause further strengthens that case, because it requires this report and reminds us that Ofcom must discharge its duties in a manner compatible with articles 8 and 10 of the ECHR.
From memory, article 8 enshrines the right to respect for private and family life, and article 10 enshrines the right to freedom of expression, backed up by quite an extensive body of case law. The clause reminds us that the powers that the Bill confers on Ofcom must be exercised—indeed, can only be exercised—in conformity with the article 10 duties on free speech. I hope that that gives my hon. Friend additional assurance about the strength of free speech protection inherent in the Bill. I apologise for speaking at a little length on a short clause, but I think that was an important point to make.
Question put and agreed to.
Clause 134 accordingly ordered to stand part of the Bill.
Clause 135
OFCOM’s transparency reports
Question proposed, That the clause stand part of the Bill.
Again, Labour welcomes clause 135, which places a duty on Ofcom to produce its own reports based on information from the transparency reports that providers are required to publish. However, the Minister will know that Labour feels the Bill has much more work to do on transparency more widely, as we have repeatedly outlined through our debates. The Minister rejected our calls for increased transparency when we were addressing, I believe, clause 61. We are not alone in feeling that transparency reports should go further. The sector and his own Back Benchers are calling for it, yet so far his Department has failed to act.
It is a welcome step that Ofcom must produce its own reports based on information from the providers' transparency reports, but if those reports are to provide a truly accurate depiction of the situation online, they must ultimately be made public. I know the Minister has concerns around security, and of course no one wants to see users put at risk of harm unnecessarily. That is not what we are asking for here. I will refrain from repeating debates we have already had at length, but I wish to again put on the record our concerns about the transparency reporting process as it stands.
That being said, we support clause 135. It is right that Ofcom is compelled to produce its own reports; we just wish they were made public. As for the transparency reports coming from the providers, we simply wish they went further.
I have spoken to these points previously, so I do not want to tax the Committee’s patience by repeating what I have said.
Question put and agreed to.
Clause 135 accordingly ordered to stand part of the Bill.
Clause 136
OFCOM’s report about researchers’ access to information
Question proposed, That the clause stand part of the Bill.
Again, Labour welcomes clause 136, which is a positive step towards a transparent approach to online safety, given that it requires Ofcom to publish a report about the access that independent researchers have, or could have, to matters relating to the online safety of regulated services. As my hon. Friend the Member for Worsley and Eccles South rightly outlined in an earlier sitting, Labour strongly believes that the transparency measures in the Bill do not go far enough.
Independent researchers already play a vital role in regulating online safety. Indeed, there are far too many to list, but many have supported me, and I am sure the Minister, in our research on the Bill. That is why we have tabled a number of amendments on this point, as we sincerely feel there is more work to be done. I know the Minister says he understands and is taking on board our comments, but thus far we have seen little movement on transparency.
In this clause we are specifically talking about access to information for researchers. Obviously, the transparency matters were covered in clauses 64 and 135. There is consensus across both parties that access to information for bona fide academic researchers is important. The clause lays out a path to take us in the direction of providing that access by requiring Ofcom to produce a report. We debated the matter earlier. The hon. Member for Worsley and Eccles South—I hope I got the pronunciation right this time—
The hon. Lady made some points about the matter in an earlier sitting, as the shadow Minister just said. It is an area we are giving some careful thought to, because it is important that it is properly academically researched. Although Ofcom is being well resourced, as we have discussed, with lots of money and the ability to levy fees, we understand that it does not have a monopoly on wisdom—as good a regulator as it is. It may well be that a number of academics could add a great deal to the debate by looking at some of the material held inside social media firms. The Government recognise the importance of the matter, and some thought is being given to these questions, but at least we can agree that clause 136 as drafted sets out a path that leads us in this important direction.
Question put and agreed to.
Clause 136 accordingly ordered to stand part of the Bill.
Clause 137
OFCOM’s reports
Briefly, before I hand over to my hon. Friend the Member for Worsley and Eccles South, I should say that Labour welcomes clause 137, which gives Ofcom a discretionary power to publish reports about certain online safety measures and matters. Clearly, it is important to give Ofcom the power to redact or exclude confidential matters where needs be, and I hope that there will be a certain level of common sense and public awareness, should information of this nature be excluded. As I have previously mentioned—I sound a bit like a broken record—Labour echoes the calls for more transparency, which my hon. Friend the Member for Batley and Spen will come on to in her new clause. However, broadly, we support this important clause.
I would like to press the Minister briefly on how exactly the exclusion of material from Ofcom reports will work in practice. Can he outline any specific contexts or examples, beyond commercial sensitivity and perhaps matters of national security, where he can envision this power being used?
(2 years, 5 months ago)
Public Bill Committees
Under this chapter, Ofcom will have the power to direct companies to use accredited technology to identify child sexual exploitation and abuse content, whether communicated publicly or privately by means of a service, and to remove that content quickly. Colleagues will be aware that the Internet Watch Foundation is one group that assists companies in doing that by providing them with “hashes” of previously identified child sexual abuse material in order to prevent the upload of such material to their platforms. That helps stop the images of victims being recirculated again and again. Tech companies can then notify law enforcement of the details of who has uploaded the content, and an investigation can be conducted and offenders sharing the content held to account.
Those technologies are extremely accurate and, thanks to the quality of those datasets, ensure that companies are detecting only imagery that is illegal. There are a number of types of technology that Ofcom could consider accrediting, including image hashing. A hash is a unique string of letters and numbers that can be applied to an image and matched every time a user attempts to upload a known illegal image to a platform.
PhotoDNA is another type, created in 2009 in a collaboration between Microsoft and Professor Hany Farid of the University of California, Berkeley. PhotoDNA is a vital tool in the detection of CSEA online. It enables law enforcement, charities, non-governmental organisations and the internet industry to find copies of an image even when it has been digitally altered. It is one of the most important technical developments in online child protection. It is extremely accurate, with a failure rate of between one in 50 billion and one in 100 billion. That gives companies a high degree of certainty that what they are removing is illegal, and a firm basis for law enforcement to pursue offenders.
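To give a purely schematic sense of the hash-matching mechanism described above, the sketch below shows how an upload might be screened against a known-hash list. It is an illustration only, not PhotoDNA or any platform's actual implementation: the hash values are placeholders, the function names are invented, and a generic cryptographic hash stands in for the perceptual hashes used in practice.

import hashlib

# Hypothetical list of hashes of known illegal images, of the kind a body
# such as the Internet Watch Foundation supplies to platforms. Real systems
# use perceptual hashes (such as PhotoDNA), which still match an image after
# resizing or re-encoding; an ordinary cryptographic hash is used here only
# to keep the sketch self-contained.
KNOWN_ILLEGAL_HASHES = {
    "placeholder-hash-value-1",
    "placeholder-hash-value-2",
}

def hash_image(image_bytes: bytes) -> str:
    # Produce a fixed-length fingerprint of the uploaded file.
    return hashlib.sha256(image_bytes).hexdigest()

def should_block_upload(image_bytes: bytes) -> bool:
    # Compare the fingerprint against the known-hash list before the image
    # is published; a match would trigger blocking and onward reporting.
    return hash_image(image_bytes) in KNOWN_ILLEGAL_HASHES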
Lastly, there is webpage blocking. Most of the imagery that the Internet Watch Foundation removes from the internet is hosted outside the UK. While the foundation waits for content to be removed at source, it can disable public access to an image or webpage by adding it to its webpage blocking list, which search providers can use to de-index known webpages containing child sexual abuse material. I therefore ask the Minister, as we continue to explore this chapter, to confirm exactly how such technologies can be utilised once the Bill receives Royal Assent.
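Again purely as an illustrative sketch, and not a description of how the blocking list is actually distributed or consumed (the URL and function names here are invented), a search provider checking pages against such a list before indexing them might do something like the following.

from urllib.parse import urlsplit

# Hypothetical blocking list of webpages awaiting removal at source.
BLOCKED_PAGES = {
    "example.org/blocked-page",
}

def normalise(url: str) -> str:
    # Reduce a URL to host plus path so trivial variations still match.
    parts = urlsplit(url)
    return parts.netloc.lower() + parts.path.rstrip("/")

def should_deindex(url: str) -> bool:
    # Pages on the list are dropped from, or never added to, the index.
    return normalise(url) in BLOCKED_PAGES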
Labour welcomes clause 105, which confirms, in subsection (2), that where a service provider is already using technology on a voluntary basis but it is ineffective, Ofcom can still intervene and require a service provider to use a more effective technology, or the same technology in a more effective way. It is vital that Ofcom is given the power and opportunity to intervene in the strongest possible sense to ensure that safety online is kept at the forefront.
However, we do require some clarification, particularly on subsections (9) and (10), which explain that Ofcom will only be able to require the use of tools that meet the minimum standards for accuracy for detecting terrorism and/or CSEA content, as set out by the Secretary of State. Although minimum standards are of course a good thing, can the Minister clarify the exact role that the Secretary of State will have in imposing these minimum standards? How will this work in practice?
Once again, Labour does not oppose clause 106 and we have not sought to amend it at this stage. It is vital that Ofcom has the power to revoke a notice under clause 103(1) if there are reasonable grounds to believe that the provider is not complying with it. Only with these powers can we be assured that service providers will be implored to take their responsibilities and statutory duties, as outlined in the Bill, seriously.
I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.
The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.
Queries have been raised on this matter by external organisations—I am particularly thinking of the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning, and about ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into force.
During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.
The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.
If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.
I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?
I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.
I beg to move amendment 35, in clause 104, page 88, line 39, leave out “prevalence” and insert “presence”.
This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.
With this it will be convenient to discuss the following:
Amendment 36, in clause 104, page 88, line 43, leave out “prevalence” and insert “presence”.
This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.
Amendment 37, in clause 104, page 89, line 13, at end insert—
“(k) risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”
This amendment requires the Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.
Amendment 39, in clause 116, page 98, line 37, leave out “prevalence” and insert “presence”.
This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.
Amendment 40, in clause 116, page 98, line 39, leave out “prevalence” and insert “presence”.
This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.
Amendment 38, in clause 116, page 99, line 12, at end insert—
“(j) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”
This amendment requires Ofcom to consider risks to adults and children through the production, publication and dissemination of illegal content before imposing a proactive technology requirement.
Government amendment 6.
Clause stand part.
We welcome clause 104, but have tabled some important amendments that the Minister should closely consider. More broadly, the move away from requiring child sexual exploitation and abuse content to be prevalent and persistent before enforcement action can be taken is a positive one. It is welcome that Ofcom will have the opportunity to consider a range of factors.
Despite this, Labour—alongside the International Justice Mission—is still concerned about the inclusion of prevalence as a factor, owing to the difficulty in detecting newly produced CSEA content, especially livestreamed abuse. Amendments 35, 36, 39 and 40 seek to address that gap. Broadly, the amendments aim to capture the concern about the Bill’s current approach, which we feel limits its focus to the risk of harm faced by individuals in the UK. Rather, as we have discussed previously, the Bill should recognise the harm that UK nationals cause to people around the world, including children in the Philippines. The amendments specifically require Ofcom to consider the presence of relevant content, rather than its prevalence.
Amendment 37 would require Ofcom’s risk assessments to consider risks to adults and children through the production, publication and dissemination of illegal content—an issue that Labour has repeatedly raised. I believe we last mentioned it when we spoke to amendments to clause 8, so I will do my best to not repeat myself. That being said, we firmly believe it is important that video content, including livestreaming, is captured by the Bill. I remain unconvinced that the Bill as it stands goes far enough, so I urge the Minister to closely consider and support these amendments. The arguments that we and so many stakeholders have already made still stand.
I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.
I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing a duty on companies to remove every single piece of illegal content that has ever appeared online, because that would be asking the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in a way that makes them realistically achievable.
As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.
We all understand the intention behind these amendments, but we think the significant powers in clause 103 to compel companies to adopt certain technology should be engaged only where there is a reasonable level of risk. For example, if a single piece of content were present on a platform, it may not be reasonable or proportionate to force the company to adopt certain new technologies that it does not use at the moment. The use of “prevalence” ensures that the powers are used only where necessary.
It is clear—there is no debate—that in the circumstances where scanning technology is currently used, including on Facebook Messenger, there is an enormous prevalence of such material. To elaborate on a point I made in a previous discussion, anything that stopped that detection happening would be unacceptable and, in the Government's view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning for child sexual exploitation images.
However, we think that adopting the amendments and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or those presenting no significant risk, because, as drafted, a single example would be enough to engage the power. Although I understand the spirit of the amendments, they move away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.
Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding whether to use its clause 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider
“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.
Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.
Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already requires Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is already clearly in the Bill.
I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.
I welcome the Minister’s comments, but if we truly want the Bill to be world-leading, as the Government and the Minister insist it will be, and if it is truly to keep children safe, surely one image of child sexual exploitation and abuse on a platform is one too many. We do not need to consider prevalence over presence. I do not buy that argument. I believe we need to do all we can to make this Bill as strong as possible. I believe the amendments would do that.
Question put, That the amendment be made.
Labour welcomes clause 107, which requires Ofcom to issue guidance setting out the circumstances in which it could require a service provider in scope of the power to use technology to identify CSEA and/or terrorism content. It is undeniably important that Ofcom will have the discretion to decide on the exact content of the guidance, which it must keep under review and publish.
We also welcome the fact that Ofcom must have regard to its guidance when exercising these powers. Of course, it is also important that the Information Commissioner is included and consulted in the process. Ofcom has a duty to continually review its guidance, which is fundamental to the Bill’s success.
We also welcome clause 108. Indeed, the reporting of Ofcom is an area that my hon. Friend the Member for Batley and Spen will touch on when we come to new clause 25. It is right that Ofcom will have a statutory duty to lay an annual report in this place, but we feel it should ultimately go further. That is a conversation for another day, however, so we broadly welcome clause 108 and have not sought to amend it directly at this stage.
Clause 109 ensures that the definitions of “terrorism content” and “child sexual exploitation and abuse content” used in chapter 5 are the same as those used in part 3. Labour supports the clause and we have not sought to amend it.
I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 points out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers the interpretations.
The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which in evidence it committed to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. In order to avoid me giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.
I think it is fair to say that the Committee's feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.
Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.
Question put and agreed to.
Clause 107 accordingly ordered to stand part of the Bill.
Clauses 108 and 109 ordered to stand part of the Bill.
Clause 110
Provisional notice of contravention
Question proposed, That the clause stand part of the Bill.
I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.
The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline has been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved and that is why we welcome the provisions outlined in the clause.
I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.
Much of the meat of the Bill is in chapter 6. It establishes what many have called the “polluter pays” principle, under which an organisation that contravenes the regime can be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes the regime and will be fined, and fined heavily.
I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.
That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.
At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims' organisations, which do so much to help, in particular, those who do not end up in court before a judge because what they have suffered does not warrant that sort of intervention?
I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.
There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that the Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, particularly following the pandemic. Its revenue and funding are also somewhat hand to mouth.
Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims' organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.
This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.
I beg to move amendment 53, in clause 111, page 94, line 24, at end insert— “Section 136(7C) Code of practice on access to data”
This amendment is linked to Amendment 52.
With this it will be convenient to discuss amendment 52, in clause 136, page 118, line 6, at end insert—
“(7A) Following the publication of the report, OFCOM must produce a code of practice on access to data setting out measures with which regulated services are required to comply.
(7B) The code of practice must set out steps regulated services are required to take to facilitate access to data by persons carrying out independent research.
(7C) Regulated services must comply with any measures in the code of practice.”
This amendment would require Ofcom to produce a code of practice on access to data.
Labour welcomes this important clause, which lists the enforceable requirements. Failure to comply with those requirements can trigger enforcement action. However, the provisions could go further, so we urge the Minister to consider our important amendments.
Amendments 52 and 53 make it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment. We cannot rely solely on Ofcom to act as problems arise, when new issues could be spotted early by experts elsewhere. The entire regime depends on how bright a light we can shine into the black box of the tech companies, but only minimal data can be accessed.
The amendments would require Ofcom simply to produce a code of practice on access to data. We have already heard that without independent researchers accessing data on relevant harm, the platforms have no real accountability for how they tackle online harms. Civil society and researchers work hard to identify online harms from limited data sources, which can be taken away by the platforms if they choose. Labour feels that the Bill must require platforms, in a timely manner, to share data with pre-vetted independent researchers and academics. The EU’s Digital Services Act does that, so will the Minister confirm why such a provision is missing from this supposed world-leading Bill?
Clause 136 gives Ofcom two years to assess whether access to data is required, and it “may”, but not “must”, publish guidance on how its approach to data access might work. The process is far too slow and, ultimately, puts the UK behind the EU, whose legislation makes data access requests possible immediately. Amendment 52 would change the “may” to “must”, and would ultimately require Ofcom to explore how access to data should work, not whether it should happen in the first place.
Frances Haugen’s evidence highlighted quite how shadowy a significant number of the platforms are. Does the hon. Member agree that that hammers home the need for independent researchers to access as much detail as possible so that we can ensure that the Bill is working?
I agree 100%. The testimony of Frances Haugen, the Facebook whistleblower, highlighted the fact that expert researchers and academics will need to examine the data and look at what is happening behind social media platforms if we are to ensure that the Bill is truly fit for purpose and world leading. That process should be carried out as quickly as possible, and Ofcom must also be encouraged to publish guidance on how access to data will work.
Ultimately, the amendments make a simple point: civil society and researchers should be able to access data, so why will the Minister not let them? The Bill should empower independently verified researchers and civil society to request tech companies' data. Ofcom should be required to publish guidance as soon as possible—within months, not years—on how data may be accessed. That safety check would hold companies to account and make the internet a safer and less divisive space for everyone.
The process would not be hard or commercially ruinous, as the platforms claim. The EU has already implemented it through its Digital Services Act, which opens up the secrets of tech companies’ data to Governments, academia and civil society in order to protect internet users. If we do not have that data, researchers based in the EU will be ahead of those in the UK. Without more insight to enable policymaking, quality research and harm analysis, regulatory intervention in the UK will stagnate. What is more, without such data, we will not know Instagram’s true impact on teen mental health, nor the reality of violence against women and girls online or the risks to our national security.
We propose amending the Bill to accelerate data sharing provisions while mandating Ofcom to produce guidance on how civil society and researchers can access data, not just on whether they should. As I said, that should happen within months, not years. The provisions should be followed by a code of practice, as outlined in the amendment, to ensure that platforms do not duck and dive in their adherence to transparency requirements. A code of practice would help to standardise data sharing in a way that serves platforms and researchers.
The changes would mean that tech companies can no longer hide in the shadows. As Frances Haugen said of the platforms in her evidence a few weeks ago:
“The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 188, Q320.]
I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.
Researchers’ access to information is covered in clause 136, which the amendments seek to amend. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. When just talking about it casually, it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and what confidentiality obligations may apply to them.
Yes, I would agree that bona fide independent academic researchers do have something to offer in this area. The more we have highly intelligent, experienced and creative people looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. It is an area that does not exist already—at the moment, there is no concept of independent researchers getting access to the innards of social media companies' data vaults—so we need to make sure that it is done in the right way, which is why it is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. I hope we end up in the same place—indeed, a better place. The process may be slightly slower, but the result may well be better for the consideration and thought that will have to be given.
I appreciate where the Minister is coming from. It seems that he wants to back the amendment, so I am struggling to see why he will not, especially given that the DSA—the EU’s new legislation—is already doing this. We know that the current wording in the Bill is far too woolly. If providers can get away with it, they will, which is why we need to compel them, so that we are able to access this data. We need to put that on the face of the Bill. I wish that we did not have to do so, but we all wish that we did not have to have this legislation in the first place. Unless we put it in the Bill, however, the social media platforms will carry on regardless, and the internet will not be a safe place for children and adults in the UK. That is why I will push amendment 53 to a vote.
Question put, That the amendment be made.
I beg to move amendment 56, in clause 111, page 94, line 24, at end insert— “Section [Supply chain risk assessment duties] Supply chain risk assessments”
This amendment is linked to NC11.
With this it will be convenient to discuss new clause 11—Supply chain risk assessment duties—
“(1) This section sets out duties to assess risks arising in a provider’s supply chain, which apply to all Part 3 services.
(2) A duty to carry out a suitable and sufficient assessment of the risk of harm arising to persons employed by contractors of the provider, where the role of such persons is to moderate content on the service.
(3) A duty to keep the risk assessment up to date.
(4) Where any change is proposed to any contract for the moderation of content on the service, a duty to carry out a further suitable and sufficient risk assessment.
(5) In this section, the ‘risk of harm’ includes any risks arising from—
(a) exposure to harmful content; and
(b) a lack of training, counselling or support.”
This new clause introduces a duty to assess the risk of harm in the supply chain.
We know that human content moderation is the foundation of all content moderation for major platforms. It is the most important resource for making platforms safe. Relying on AI alone is an ineffective and risky way to moderate content, so platforms have to rely on humans to make judgment calls about context and nuance. I pay tribute to all human moderators for keeping us all safe by having to look at some of the most horrendous and graphic content.
The content moderation reviews carried out by humans, often at impossible speeds, are used to classify content to train algorithms that are then used to automatically moderate exponentially more content. Human moderators can be, and often are, exploited by human resource processes that do not disclose the trauma inherent in the work or properly support them in their dangerous tasks. There is little oversight of this work, as it is done largely through a network of contracted companies that do not disclose their expectations for staff or the support and training provided to them. The contractors are “off book” from the platforms and operate at arm’s length from the services they are supporting, and they are hidden by a chain of unaccountable companies. This creates a hazardous supply chain for the safety processes that platforms claim will protect users in the UK and around the world.
Not all online abuse in the UK happens in English, and women of many cultures and backgrounds in the UK are subject to horrific abuse that is not in the English language. The amendment would make all victim groups in the UK much safer.
To make the internet safer it is imperative to better support human content moderators and regulate the supply chain for their work. It is an obvious but overlooked point that content moderators are users of a platform, but they are also the most vulnerable group of users, as they are the frontline of defence in sifting out harmful content. Their sole job is to watch gruesome, traumatising and harmful content so that we do not have to. The Bill has a duty to protect the most vulnerable users, but it cannot do so if their existence is not even acknowledged.
Many reports in the media have described the lack of clarity about, and the exploitative nature of, the hiring process. Just yesterday, I had the immense privilege of meeting Daniel Motaung, the Facebook whistleblower from Kenya who has described the graphic and horrendous content that he was required to watch to keep us all safe, including live beheadings and children being sexually exploited. Members of the Committee cannot even imagine what that man has had to endure, and I commend him for his bravery in speaking out and standing up for his rights. He has also been extremely exploited by Facebook and the third party company by which he was employed. He was paid the equivalent of $2 an hour for doing that work, whereas human moderators in the US were paid roughly $18 an hour—again, nowhere near enough for what they had to endure.
Thank you. Clause 111 sets out and defines the “enforceable requirements” in this chapter—the duties that Ofcom is able to enforce against. Those are set out clearly in the table at subsection (2) and the requirements listed in subsection (3).
The amendment speaks to a different topic. It seeks to impose or police standards for people employed as subcontractors of the various companies that are in scope of the Bill, for example people that Facebook contracts; the shadow Minister, the hon. Member for Pontypridd, gave the example of the gentleman from Kenya she met yesterday. I understand the point she makes and I accept that there are people in those supply chains who are not well treated, who suffer PTSD and who have to do extraordinarily difficult tasks. I do not dispute at all the problems she has referenced. However, the Government do not feel that the Bill is the right place to address those issues, for a couple of reasons.
First, in relation to people who are employed in the UK, we have existing UK employment and health and safety laws. We do not want to duplicate or cut across those. I realise that they relate only to people employed in the UK, but if we passed the amendment as drafted, it would apply to people in the UK as much as it would apply to people in Kenya.
Secondly, the amendment would effectively require Ofcom to start paying regard to employment conditions in Kenya, among other places—indeed, potentially any country in the world—and it is fair to say that that sits substantially outside Ofcom’s area of expertise as a telecoms and communications regulator. That is the second reason why the amendment is problematic.
The third reason is more one of principle. The purpose of the Bill is to keep users safe online. While I understand the reasonable premise for the amendment, it seeks essentially to regulate working conditions in potentially any country in the world. I am just not sure that it is appropriate for an online safety Bill to seek to regulate global working conditions. Facebook, a US company, was referenced, but only 10% of its activity—very roughly speaking—is in the UK. The shadow Minister gave the example of Kenyan subcontractors. Compelling though her case was, I am not sure it is appropriate that UK legislation on online safety should seek to regulate the Kenyan subcontractor of a United States company.
The Government of Kenya can set their own employment regulations and President Biden’s Government can impose obligations on American companies. For us, via a UK online safety Bill, to seek to regulate working conditions in Kenya goes a long way beyond the bounds of what we are trying to do, particularly when we take into account that Ofcom is a telecommunications and communications regulator. To expect it to regulate working conditions anywhere in the world is asking quite a lot.
I accept that a real issue is being raised. There is definitely a problem, and the shadow Minister and the hon. Member for Aberdeen North are right to raise it, but for the three principal reasons that I set out, I suggest that the Bill is not the place to address these important issues.
The Minister mentions workers in the UK. I am a proud member of the Labour party and a proud trade unionist; we have strong protections for workers in the UK. There is a reason why Facebook and some of these other platforms, which are incredibly exploitative, will not have human moderators in the UK looking at this content: because they know they would be compelled to treat them a hell of a lot better than they do the workers around the world that they are exploiting, as they do in Kenya, Dublin and the US.
To me, the amendment speaks to the heart of the Bill. This is an online safety Bill that aims to keep the most vulnerable users safe online. People around the world are looking at content that is created here in the UK and having to moderate it; we are effectively shipping our trash to other countries and other people to deal with it. That is not acceptable. We have the opportunity here to keep everybody safe from looking at this incredibly harmful content. We have a duty to protect those who are looking at content created in the UK in order to keep us safe. We cannot let those people down. The amendment and new clause 11 give us the opportunity to do that. We want to make the Bill world leading. We want the UK to stand up for those people. I urge the Minister to do the right thing and back the amendment.
We support clause 112, which gives Ofcom the power to issue a confirmation decision if, having followed the required process—for example, in clause 110—its final decision is that a regulated service has breached an enforceable requirement. As we know, this will set out Ofcom’s final decision and explain whether Ofcom requires the recipient of the notice to take any specific steps and/or pay a financial penalty. Labour believes that this level of scrutiny and accountability is vital to an Online Safety Bill that is truly fit for purpose, and we support clause 112 in its entirety.
We also support the principles of clause 113, which outlines the steps that a person may be required to take either to come into compliance or to remedy the breach that has been committed. Subsection (5) in particular is vital, as it outlines how Ofcom can require immediate action when the breach has involved an information duty. We hope this will be a positive step forward in ensuring true accountability of big tech companies, so we are happy to support the clause unamended.
It is right and proper that Ofcom has powers when a regulated provider has failed to carry out an illegal content or children’s risk assessment properly or at all, and when it has identified a risk of serious harm that the regulated provider is not effectively mitigating or managing. As we have repeatedly heard, risk assessments are the very backbone of the Bill, so it is right and proper that Ofcom is able to force a company to take measures to comply in the event of previously failing to act.
Children’s access assessments, which are covered by clause 115, are a crucial component of the Bill. Where Ofcom finds that a regulated provider has failed to properly carry out an assessment, it is vital that it has the power and legislative standing to force the company to do more. We also appreciate the inclusion of a three-month timeframe, which would ensure that, in the event of a provider re-doing the assessment, it would at least be completed within a specific—and small—timeframe.
While we recognise that the use of proactive technologies may come with small issues, Labour ultimately feels that clause 116 is balanced and fair, as it establishes that Ofcom may require the use of proactive technology only on content that is communicated publicly. It is fair that content in the public domain is subject to those important safety checks. It is also right that under subsection (7), Ofcom may set a requirement forcing services to review the kind of technology being used. That is a welcome step that will ensure that platforms face a level of scrutiny that has certainly been missing so far.
Labour welcomes and is pleased to support clause 117, which allows Ofcom to impose financial penalties in its confirmation decision. That is something that Labour has long called for, as we believe that financial penalties of this nature will go some way towards improving best practice in the online space and deterring bad actors more widely.
The shadow Minister has set out the provisions in the clauses, and I am grateful for her support. In essence, clauses 112 to 117 set out the processes around confirmation decisions and make provisions to ensure that those are effective and can be operated in a reasonable and fair way. The clauses speak largely for themselves, so I am not sure that I have anything substantive to add.
Question put and agreed to.
Clause 112 accordingly ordered to stand part of the Bill.
Clauses 113 to 117 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Dean Russell.)
(2 years, 5 months ago)
Public Bill Committees
We start with amendment 127 to clause 69. It is up to the Committee, but I am minded to allow this debate to go slightly broader and take the stand part debate with it.
I beg to move amendment 127, in clause 69, page 60, line 26, after “must” insert—
“within six months of this Act being passed”.
As ever, it is a pleasure to serve under your chairship, Sir Roger. The thoughts and prayers of us all are with my hon. Friend the Member for Batley and Spen and all her friends and family.
Labour welcomes the clause, which sets out Ofcom’s duties to provide guidance to providers of internet services. It is apparent, however, that we cannot afford to kick the can down the road and delay implementation of the Bill any further than necessary. With that in mind, I urge the Minister to support the amendment, which would give Ofcom an appropriate amount of time to produce this important guidance.
It is a pleasure, once again, to serve under your august chairmanship, Sir Roger. I associate the Government with the remarks that you and the shadow Minister made, marking the anniversary of Jo Cox’s appalling murder, which shook the entire House when it happened. She will never be forgotten.
The Government are sympathetic to the intent of the amendment, which seeks to ensure that guidance for providers on protecting children from online pornography is put in place as quickly as possible. We of course sympathise with that objective, but we feel that the Secretary of State must retain the power to determine when to bring in the provisions of part 5, including the requirement under the clause for Ofcom to produce guidance, to ensure that implementation of the framework comprehensively and effectively regulates all forms of pornography online. That is the intention of the whole House and of this Committee.
Ofcom needs appropriate time and flexibility to get the guidance exactly right. We do not want to rush it and consequently see loopholes, which pornography providers or others might seek to exploit. As discussed, we will be taking a phased approach to bringing duties under the Bill into effect. We expect prioritisation for the most serious harms as quickly as possible, and we expect the duties on illegal content to be focused on most urgently. We have already accelerated the timescales for the most serious harms by putting priority illegal content in the various schedules to the Bill.
Ofcom is working hard to prepare for implementation. We are all looking forward to the implementation road map, which it has committed to produce before the summer. For those reasons, I respectfully resist the amendment.
Question put, That the amendment be made.
We welcome clause 77, which is an important clause that seeks to amend Ofcom’s existing general duties in the Communications Act 2003. Given the prevalence of illegal harms online, as we discussed earlier in proceedings, it is essential that the Communications Act is amended to reflect the important role that Ofcom will have as a new regulator.
As the Minister knows, and as we will discuss shortly when we reach amendments to clause 80, we have significant concerns about the Government’s approach to size versus harm when categorising service providers. Clause 77(4) amends section 3 of the Communications Act by inserting new subsection (4A). New paragraph (4A)(d) outlines measures that are proportionate to
“the size or capacity of the provider”,
and to
“the level of risk of harm presented by the service in question, and the severity of the potential harm”.
We know that harm, and the potential to access harmful content, is what is most important in the Bill—it says it in the name—so I am keen for my thoughts on the entire categorisation process to be known early on, although I will continue to press this issue with the Minister when we debate the appropriate clause.
Labour also supports clause 78. It is vital that Ofcom will have a duty to publish its proposals on strategic priorities within a set time period, and ensuring that that statement is published is a positive step towards transparency, which has been so crucially missing for far too long.
Similarly, Labour supports clause 79, which contains a duty to carry out impact assessments. That is vital, and it must be conveyed in the all-important Communications Act.
As the shadow Minister has set out, these clauses ensure that Ofcom’s duties under the Communications Act 2003 are updated to reflect the new duties that we are asking it to undertake—I think that is fairly clear from the clauses. On the shadow Minister’s comment about size and risk, I note her views and look forward to debating that more fully in a moment.
Question put and agreed to.
Clause 77 accordingly ordered to stand part of the Bill.
Clauses 78 and 79 ordered to stand part of the Bill.
Clause 80
Meaning of threshold conditions etc
Question proposed, That the clause stand part of the Bill.
Thank you for your efforts in chairing our meeting today, Sir Roger. My thoughts are with the hon. Member for Batley and Spen and her entire family on the anniversary of Jo Cox’s murder; the SNP would like to echo that sentiment.
I want to talk about my amendment, and I start with a quote from the Minister on Second Reading:
“A number of Members…have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered.”—[Official Report, 19 April 2022; Vol. 712, c. 133.]
I appreciate that the Minister may still be thinking about that. He might accept all of our amendments; that is entirely possible, although I am not sure there is any precedent. The possibility is there that that might happen.
Given how strong I felt that the Minister was on the issue on Second Reading, I am deeply disappointed that there are no Government amendments to this section of the Bill. I am disappointed because of the massive risk of harm caused by some very small platforms—it is not a massive number—where extreme behaviour and radicalisation are allowed to thrive. It is not just about the harm to those individuals who spend time on those platforms and who are radicalised, presented with misinformation and encouraged to go down rabbit holes and become more and more extreme in their views. It is also about the risk of harm to other people as a result of the behaviour inspired in those individuals. We are talking about Jo Cox today; she is in our memories and thoughts. Those small platforms are the ones that are most likely to encourage individuals towards extremely violent acts.
If the Bill is to fulfil its stated aims and take the action we all want to see to prevent the commission of those most heinous, awful crimes, it needs to be much stronger on small, very high-risk platforms. I will make no apologies for that. I do not care if those platforms make only small profits. They are encouraging and allowing the worst behaviours to thrive on their platforms. They should be held to a higher level of accountability. It is not too much to ask to class them as category 1 platforms. It is not too much to ask them to comply with a higher level of risk assessment requirements and a higher level of oversight from Ofcom. It is not too much to ask because of the massive risk of harm they pose and the massive actual harm that they create.
Those platforms should be punished for that. It is one thing to punish and criminalise the behaviour of users on those platforms—individual users create and propagate illegal content or radicalise other users—but the Bill does not go far enough in holding those platforms to account for allowing that to take place. They know that it is happening. Those platforms are set up as an alternative place—a place where people are allowed to be far more radical than they are on Twitter, YouTube, Twitch or Discord. None of those larger platforms have much moderation, but the smaller platforms encourage such behaviour. Links are put on other sites pointing to those platforms. For example, when people read vaccine misinformation, there are links posted to more radical, smaller platforms. I exclude Discord because, given its number of users, I think it would be included in one of the larger-platform categories anyway. It is not that there is not radical behaviour on Discord—there is—but I think the size of its membership excludes it, in my head certainly, from the category of the very smallest platforms that pose the highest risk.
We all know from our inboxes the number of people who contact us saying that 5G is the Government trying to take over their brains, or that the entire world is run by Jewish lizard people. We get those emails on a regular basis and those theories are propagated on the smallest platforms. Fair enough—some people may not take any action as a result of the radicalisation that they have experienced as a result of their very extreme views. But some people will take action, and that action may be simply enough to harm their friends or family; it may be simply enough to exclude them and drag them away from the society or community that they were previously members of; or it might, in really tragic cases, be far more extreme. It might lead people to cause physical or mental harm to others intentionally as a result of the beliefs that they have had created and fostered on those platforms.
That is why we have tabled the amendments. This is the one area that the Government have most significantly failed in writing this Bill, by not ensuring that the small, very high-risk platforms are held to the highest level of accountability and are punished for allowing these behaviours to thrive on their platforms. I give the Minister fair warning that unless he chooses to accept the amendments, I intend to push them to a vote. I would appreciate it if he gave assurances, but I do not believe that any reassurance that he could give would compare to having such a measure in the Bill. As I say, for me the lack of this provision is the biggest failing of the entire Bill.
I echo the comments of the hon. Member for Aberdeen North. I completely agree with everything she has just said and I support the amendments that she has tabled.
The Minister knows my feelings on the Government’s approach to categorising services; he has heard my concerns time and time again. However, it is not just me who believes that the Government have got their approach really wrong. It is also stakeholders far and wide. In our evidence sessions, we heard from HOPE not hate and the Antisemitism Policy Trust specifically on this issue. In its current form, the categorisation process is based on size versus harm, which is a fundamentally flawed approach.
The Government’s response to the Joint Committee that scrutinised the draft Bill makes it clear that they consider that reach is a key and proportional consideration when assigning categories and that they believe that the Secretary of State’s powers to amend those categories are sufficient to protect people. Unfortunately, that leaves many alternative platforms out of category 1, even if they host large volumes of harmful material.
The duty of care approach that essentially governs the Bill is predicated on risk assessment. If size allows platforms to dodge the entry criteria for managing high risk, there is a massive hole in the regime. Some platforms have already been mentioned, including BitChute, Gab and 4chan, which host extreme racist, misogynistic, homophobic and other extreme content that radicalises people and incites harm. And the Minister knows that.
I take this opportunity to pay tribute to my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), who has campaigned heavily on the issue since the horrendous and tragic shooting in Keyham in his constituency. One of my big concerns about the lack of focus on violence against women and girls in the Bill, which we have mentioned time and time again, is the potential for the rise of incel culture online, which is very heavily reported on these alternative platforms—these high-harm, high-risk platforms.
I will just give one example. A teacher contacted me about the Bill. She talked about the rise of misogyny and trying to educate her class on what was happening. At the end of the class, a 15-year-old boy came up to her. I appreciate that he is under 18 and is a child, so would come under a different category within the Bill, but I will still give the example. He said: “Miss, I need to chat to you. This is something I’m really concerned about. All I did was google, ‘Why can’t I get a girlfriend?’” He had been led down a rabbit hole into a warren of alternative platforms that tried to radicalise him with the most extreme content of incel culture: women are evil; women are the ones who are wrong; it is women he should hate; it is his birthright to have a girlfriend, and he should have one; and he should hate women. That is the type of content that is on those platforms that young, impressionable minds are being pointed towards. They are being radicalised and it is sadly leading to incredibly tragic circumstances, so I really want to push the Minister on the subject.
We share the overarching view of many others that this crucial risk needs to be factored into the classification process that determines which companies are placed in category 1. Otherwise, the Bill risks failing to protect adults from substantial amounts of material that causes physical and psychological harm. Schedule 10 needs to be amended to reflect that.
I appreciate the shadow Minister’s bringing that issue up. Would she agree that, given we have constraints on broadcast and newspaper reporting on suicide for these very reasons, there can be no argument against including such a measure in the Bill?
I completely agree. Those safeguards are in place for that very reason. It seems a major omission that they are not also included in the Online Safety Bill if we are truly to save lives.
The Bill’s own pre-legislative scrutiny Committee recommended that the legislation should
“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model.”
The Government replied that they
“want the Bill to be targeted and proportionate for businesses and Ofcom and do not wish to impose disproportionate burdens on small companies.”
It is, though, entirely appropriate to place a major regulatory burden on small companies that facilitate the glorification of suicide and the sharing of dangerous methods through their forums. It is behaviour that is extraordinarily damaging to public health and makes no meaningful economic or social contribution.
Amendment 82 is vital to our overarching aim of having an assessed risk of harm at the heart of the Bill. The categorisation system is not fit for purpose and will fail to capture so many of the extremely harmful services that many of us have already spoken about.
I want to remind Committee members of what my hon. Friend is talking about. I refer to the oral evidence we heard from Danny Stone, from the Antisemitism Policy Trust, on these small, high-harm platforms. He laid out examples drawn from the work of the Community Security Trust, which released a report called “Hate Fuel”. The report looked at
“various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads…with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement.”
A week or so before the evidence sitting,
“he targeted and killed 10 people in Buffalo. One of the things that he posted was:
‘Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/’—
which is a thread on the small 4chan platform—
‘then my motivation returns’.”
Danny Stone told us that the kind of material we are seeing, which is legal but harmful, is inspiring people to go out and create real-world harm. When my hon. Friend the Member for Pontypridd asked him how to amend this approach, he said:
“You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 128, Q203-204.]
I do hope that, as my hon. Friend urges, the Minister will look at all these options, because this is a very serious matter.
I completely agree with my hon. Friend. The evidence we heard from Danny Stone from the Antisemitism Policy Trust clearly outlined the real-world harm that legal but harmful content causes. Such content may be legal, but it causes mass casualties and harm in the real world.
There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.
Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.
We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?
We all know that the online space moves at speed, with new technologies and ways of functioning popping up all the time. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone-deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into so that harmful and dangerous content does not slip through the net.
Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established
“as soon as reasonably practicable”,
could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?
Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the same concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can, to press Ofcom to do all it can and to make these vital changes.
As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of the size of their user base, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.
It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.
It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.
I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.
I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.
We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.
The hon. Lady is correct. I am advised that, actually, the ruling has changed, so it can be. We will see—well, I won’t, but the hon. Lady will see what the Minister does on Report.
Schedule 10 agreed to.
Clauses 81 and 82 ordered to stand part of the Bill.
Clause 83
OFCOM’s register of risks, and risk profiles, of Part 3
I beg to move amendment 34, in clause 83, page 72, line 12, at end insert—
“(d) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”
This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.
Labour welcomes clause 83, which places a duty on Ofcom to carry out risk assessments to identify and assess a range of potential risks of harm presented by part 3 services. However, we are concerned about subsection (9), which says:
“OFCOM must from time to time review and revise the risk assessments and risk profiles so as to keep them up to date”.
That seems a fairly woolly concept even for the Minister to try to defend, so I would be grateful if he clarified exactly what demands will be placed on Ofcom to review those risk assessments and risk profiles. He will know that those are absolutely central to the Bill, so some clarification is required here. Despite that, Labour agrees that it will be a significant advantage for Ofcom to oversee the risk of harm presented by the regulated services.
However, harm should not be limited to those in the UK. Amendment 34 would therefore require Ofcom’s risk assessment to consider risks to adults and children, in the UK or elsewhere, through the production, publication and dissemination of illegal content. I have already spoken on this issue, in the debate on amendment 25 to clause 8, so I will keep my comments brief. As the Minister knows, online harms are global in nature, and amendment 34 seeks to ensure that the risk of harm presented by regulated services is not just limited to those in the UK. As we have mentioned previously, research shows us that there is some very damaging, often sexually violent, content being streamed abroad. Labour fears that the current provisions in the legislation will not be far-reaching enough to capture the true essence of the risk of harm that people may face when online.
Labour supports the intentions of clause 84, which outlines that Ofcom must produce guidance to assist providers in complying with their duties to carry out illegal content risk assessments
“As soon as reasonably practicable”.
Of course, the Minister will not be surprised that Labour has slight reservations about the timing around those important duties, so I would appreciate an update from the Minister on the conversations he has had with Ofcom about the practicalities of its duties.
I did not indicate at the start of the debate that I would take the clause stand part and clause 84 stand part together, but I am perfectly relaxed about it and very happy to do so, as the hon. Lady has spoken to them. If any other colleague wishes to speak to them, that is fine by me.
Perhaps I might start with amendment 34, which the shadow Minister just spoke to. We agree that it is very important to consider the risks posed to victims who are outside of the territory of the United Kingdom. However, for the reasons I will elaborate on, we believe that the Bill as drafted achieves that objective already.
First, just to remind the Committee, the Bill already requires companies to put in place proportionate systems and processes to prevent UK users from encountering illegal content. Critically, that includes where a UK user creates illegal content via an in-scope platform, but where the victim is overseas. Let me go further and remind the Committee that clause 9 requires platforms to prevent UK users from encountering illegal content no matter where that content is produced or published. The word “encounter” is very broadly defined in clause 189 as meaning
“read, view, hear or otherwise experience content”.
As such, it will cover a user’s contact with any content that they themselves generate or upload to a service.
Critically, there is another clause, which we have discussed previously, that is very important in the context of overseas victims, which the shadow Minister quite rightly raises. The Committee will recall that subsection (9) of clause 52, which is the important clause that defines illegal content, makes it clear that that content does not have to be generated, uploaded or accessed in the UK, or indeed to have anything to do with the UK, in order to count as illegal content towards which the company has duties, including risk assessment duties. Even if the illegal act—for example, sexually abusing a child—happens in some other country, not the UK, it still counts as illegal content under the definitions in the Bill because of clause 52(9). It is very important that those duties will apply to that circumstance. To be completely clear, if an offender in the UK uses an in-scope platform to produce content where the victim is overseas, or to share abuse produced overseas with other UK users, the platform must tackle that, both through its risk assessment duties and its other duties.
As such, the entirely proper intent behind amendment 34 is already covered by the Bill as drafted. The shadow Minister, the hon. Member for Pontypridd, has already referred to the underlying purpose of clauses 83 and 84. As we discussed before, the risk assessments are central to the duties in the Bill. It is essential that Ofcom has a proper picture of the risks that will inform its various regulatory activities, which is why these clauses are so important. Clause 84 requires Ofcom to produce guidance to services to make sure they are carrying out those risk assessments properly, because it is no good having a token risk assessment or one that does not properly deal with the risks. The guidance published under clause 84 will ensure that happens. As such, I will respectfully resist amendment 34, on the grounds that its contents are already covered by the Bill.
I am grateful for the Minister’s clarification. Given his assurances that its contents are already covered by the Bill, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 83 ordered to stand part of the Bill.
Clause 84 ordered to stand part of the Bill.
Clause 85
Power to require information
With this it will be convenient to discuss the following:
Clauses 86 to 91 stand part.
Schedule 11 stand part.
Labour supports clause 85, which gives Ofcom the power to require the provision of any information it requires in order to discharge its online safety functions. We strongly believe that, in the interests of transparency, Ofcom as the regulator must have sufficient power to require a service provider to share its risk assessment in order to understand how that service provider is identifying risks. As the Minister knows, we feel that that transparency should go further, and that the risk assessments should be made public. However, we have already had that argument during a previous debate, so I will not repeat those arguments—on this occasion, at least.
Labour also supports clause 86, and we particularly welcome the clarification that Ofcom may require the provision of information in any form. If we are to truly give Ofcom the power to regulate and, where necessary, investigate service providers, we must ensure that it has sufficient legislative tools to rely on.
The Bill gives some strong powers to Ofcom. We support the requirement in clause 87 to name a senior manager, but again, we feel those provisions should go further. Both users and Ofcom must have access to the full range of tools they need to hold the tech giants to account. As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator, and even then, those measures might not come in until two years after the Bill is in place. Surely the top bosses at social media companies should be held criminally liable for systemic and repeated failures to ensure online safety as soon as the Bill comes into force, so can the Minister explain the reasons for the delay?
The Minister will be happy to hear that Labour supports clause 88. It is important to have an outline on the face of the Bill of the circumstances in which Ofcom can require a report from a skilled person. It is also important that Ofcom has the power to appoint, or give notice to a provider requiring them to appoint, a skilled person, as Labour fears that without those provisions in subsections (3) and (4), the ambiguity around defining a so-called skilled person could be detrimental. We therefore support the clause, and have not sought to amend it at this stage.
Again, Labour supports all the intentions of clause 89 in the interests of online safety more widely. Of course, Ofcom must have the power to force a company to co-operate with an investigation.
Again, we support the need for clause 90, which gives Ofcom the power to require an individual to attend an interview. That is particularly important in the instances outlined in subsection (1), whereby Ofcom is carrying out an investigation into the failure or possible failure of a provider of a regulated service to comply with a relevant requirement. Labour has repeatedly called for such personal responsibility, so we are pleased that the Government are ensuring that the Bill includes sufficient powers for Ofcom to allow proper scrutiny.
Labour supports clause 91 and schedule 11, which outlines in detail Ofcom’s powers of entry, inspection and audit. I did not think we would support this much, but clearly we do. We want to work with the Government to get this right, and we see ensuring Ofcom has those important authorisation powers as central to it establishing itself as a viable regulator of the online space, both now and for generations to come. We will support and have not sought to amend the clauses or schedule 11 for the reasons set out.
I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.
The Minister will be pleased to hear that we, again, support these clauses. We absolutely support the Bill’s aims to ensure that information offences and penalties are strong enough to dissuade non-compliance. However, as we have said repeatedly, we feel that the current provisions are lacking.
As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator. I am grateful that the Minister has confirmed that the measures will come into force with immediate effect following Royal Assent, rather than waiting two years. That is welcome news. The Government should require that top bosses at social media companies be criminally liable for systemic and repeated failures on online safety, and I am grateful for the Minister’s confirmation on that point.
As these harms are allowed to persist, tech companies cannot continue to get away without penalty. Will the Minister confirm why the Bill does not include further penalties, in the form of criminal offences, should a case of systemic and repeated failures arise? Labour has concerns that, without stronger powers, Ofcom may not feel compelled or equipped to sanction those companies that are treading the fine line of doing just enough to satisfy the requirements outlined in the Bill as it stands.
Labour also welcomes clause 93, which sets out the criminal offences that can be committed by named senior managers in relation to their entity’s information obligations. It establishes that senior managers who are named in a response to an information notice can be held criminally liable for failing to prevent the relevant service provider from committing an information offence. Senior managers can only be prosecuted under the clause where the regulated provider has already been found liable for failing to comply with Ofcom’s information request. As I have already stated, we feel that this power needs to go further if we are truly to tackle online harm. For far too long, those at the very top have known about the harm that exists on their platforms, but they have failed to take action.
Labour supports clause 94 and we have not sought to amend it at this stage. It is vital that provisions such as those in subsection (3), which specify actions that a person may take to commit an offence of this nature, are set out in the Bill. We all want to see the Bill keep people safe online, and at the heart of doing so is demanding a more transparent approach from those in Silicon Valley. My hon. Friend the Member for Worsley and Eccles South made an excellent case for the importance of transparency earlier in the debate but, as the Minister knows, and as I have said time and again, the offences must go further than just applying to simple failures to provide information. We must consider a systemic approach to harm more widely, and that goes far beyond simple information offences.
There is no need to repeat myself. Labour supports the need for clause 95 as it stands and we support clause 96, which is in line with penalties for other information offences that already exist.
I am delighted to discover that agreement with the Government’s clauses continues to provoke a tsunami of unanimity across the Committee. I sense a gathering momentum behind these clauses.
As the shadow Minister mentioned, the criminal offences here are limited to information provision and disclosure. We have debated the point before. The Government’s feeling is that going beyond the information provision into other duties for criminal liability would potentially go a little far and have a chilling effect on the companies concerned.
Also, the fines that can be levied—10% of global revenue—run into billions of pounds, and there are the denial of service provisions, where a company can essentially be disconnected from the internet in extreme cases; these do provide more than adequate enforcement powers for the other duties in the Bill. The information duties are so fundamental—that is why personal criminal liability is needed. Without the information, we cannot really make any further assessment of whether the duties are being met.
The shadow Minister has set out what the other clauses do: clause 92 creates offences; clause 93 introduces senior managers’ liability; clause 94 sets out the offences that can be committed in relation to audit notices issued by Ofcom; clause 95 creates offences for intentionally obstructing or delaying a person exercising Ofcom’s power; and clause 96 sets out the penalties for the information offences set out in the Bill, which of course include a term of imprisonment of up to two years. Those are significant criminal offences, which I hope will make sure that executives working for social media firms properly discharge those important duties.
Question put and agreed to.
Clause 92 accordingly ordered to stand part of the Bill.
Clauses 93 to 95 ordered to stand part of the Bill.
Clause 96
Penalties for information offences
Amendment made: 2, in clause 96, page 83, line 15, leave out
“maximum summary term for either-way offences”
and insert
“general limit in a magistrates’ court”—(Chris Philp.)
Clause 96, as amended, ordered to stand part of the Bill.
Clause 97
Co-operation and disclosure of information: overseas regulators
Question proposed, That the clause stand part of the Bill.
Again, Labour supports the intentions of clause 97—the collegiality continues. We know that the Bill’s aims are to protect people across the UK, but we know that online harms often originate elsewhere. That is why it is vital that Ofcom has powers to co-operate with an overseas regulator, as outlined in subsection (1).
However, we do have concerns about subsection (2), which states:
“The power conferred by subsection (1) applies only in relation to an overseas regulator for the time being specified in regulations made by the Secretary of State.”
Can the Minister confirm exactly how that will work in practice? He knows that Labour Members have tabled important amendments to clause 123. Amendments 50 and 51, which we will consider later, aim to ensure that Ofcom has the power to co-operate and take action through the courts where necessary. The same issue applies here: Ofcom must be compelled and have the tools available at its disposal to work internationally where required.
Labour supports clause 98, which amends section 393 of the Communications Act 2003 to include new provisions. That is obviously a vital step, and we particularly welcome subsection (2), which outlines that, subject to the specific exceptions in section 393 of the 2003 Act, Ofcom cannot disclose information with respect to a business that it has obtained by exercising its powers under this Bill without the consent of the business in question. This is once again an important step in encouraging transparency across the board.
We support clause 99, which places a duty on Ofcom to consult the relevant intelligence service before Ofcom discloses or publishes any information that it has received from that intelligence service. For reasons of national security, it is vital that the relevant intelligence service is included in Ofcom’s reasoning and approach to the Bill more widely.
We broadly support the intentions of clause 100. It is vital that Ofcom is encouraged to provide information to the Secretary of State of the day, but I would be grateful if the Minister could confirm exactly how the power will function in reality. Provision of information to assist in the formulation of policy, as we know, covers a very broad spectrum in the Communications Act. We want to make sure the powers are not abused—I know that is a concern shared on his own Back Benches—so I would be grateful for the Minister’s honest assessment of the situation.
We welcome clause 101, which amends section 26 of the Communications Act and provides for publication of information and advice for various persons, such as consumers. Labour supports the clause as it stands. We also welcome clause 102, which, importantly, sets out the circumstances in which a statement given to Ofcom can be used in evidence against that person. Again, this is an important clause in ensuring that Ofcom has the powers it needs to truly act as a world-leading regulator, which we all want it to be. Labour supports it and has chosen not to table any amendments.
(2 years, 5 months ago)
Public Bill Committees
Good morning, Ms Rees; it is, as always, a pleasure to serve under your chairship.
Amendment 84 would remove the Secretary of State’s ability to modify Ofcom codes of practice
“for reasons of public policy”.
Labour agrees with the Carnegie UK Trust assessment of this: the codes are the fulcrum of the regulatory regime, and this power is a significant interference in Ofcom’s independence. Ofcom itself has noted that the “reasons of public policy” power to direct might weaken the regime. If Ofcom has undertaken a logical process, rooted in evidence, to arrive at a draft code, it is hard to see how a direction based on “reasons of public policy” is not irrational. That then creates a vulnerability to legal challenge.
On clause 40 more widely, the Secretary of State should not be able to give Ofcom specific direction on non-strategic matters. Ofcom’s independence in day-to-day decision making is paramount to preserving freedom of expression. Independence of media regulators is the norm in developed democracies. The UK has signed up to many international statements in that vein, including as recently as April 2022 at the Council of Europe. That statement says that
“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”
The Bill introduces powers for the Secretary of State to direct Ofcom on internet safety codes. These provisions should immediately be removed. After all, in broadcasting regulation, Ofcom is trusted to make powerful programme codes with no interference from the Secretary of State. Labour further notes that although the draft Bill permitted this
“to ensure that the code of practice reflects government policy”,
clause 40 now specifies that any code may be required to be modified
“for reasons of public policy”.
Although that is more normal language, it is not clear what in practice the difference in meaning is between the two sets of wording. I would be grateful if the Minister could confirm what that is.
The same clause gives the Secretary of State powers to direct Ofcom, on national security or public safety grounds, in the case of terrorism or CSEA—child sexual exploitation and abuse—codes of practice. The Secretary of State might have some special knowledge of those, but the Government have not demonstrated why they need a power to direct. In the broadcasting regime, there are no equivalent powers, and the Secretary of State was able to resolve the case of Russia Today, on national security grounds, with public correspondence between the Secretary of State and Ofcom.
Good morning, Ms Rees; it is a pleasure to serve under your chairmanship again. The SNP spokesman and the shadow Minister have already explained what these provisions do, which is to provide a power for the Secretary of State to make directions to Ofcom in relation to modifying a code of conduct. I think it is important to make it clear that the measures being raised by the two Opposition parties are, as they said, envisaged to be used only in exceptional circumstances. Of course the Government accept that Ofcom, in common with other regulators, is rightly independent and there should be no interference in its day-to-day regulatory decisions. This clause does not seek to violate that principle.
However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations, it may be appropriate for a direction to be given to modify a code of conduct. A recent and very real example would be a direction to reflect the latest medical advice during a public health emergency. Obviously, we saw in the last couple of years, during covid, some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy for covid. There was also the purported and entirely false connection between 5G phone masts and covid. There were issues on public policy grounds—in this case, medical grounds—on which it might have been appropriate to make sure that a code of conduct was appropriately modified.
Thank you, Ms Rees, for your hard work in chairing the Committee this morning; we really appreciate it. Amendment 89 relates to below-the-line comments on newspaper articles. For the avoidance of doubt, if we do not get amendment 89, I am more than happy to support the Labour party’s amendment 43, which has a similar effect but covers slightly fewer—or many fewer—organisations and places.
Below-the-line comments in newspaper articles are infamous. They are places that everybody fears to go. They are worse than Twitter. In a significant number of ways, below-the-line comments are an absolute sewer. I cannot see any reasonable excuse for them to be excluded from the Bill. We are including Twitter in the Bill; why are we not including below-the-line comments for newspapers? It does not make any sense to me; I do not see any logic.
We heard a lot of evidence relating to freedom of speech and a free press, and I absolutely, wholeheartedly agree with that. However, the amendment would not stop anyone writing a letter to the editor. It would not stop anyone engaging with newspapers in the way that they would have in the print medium. It would still allow that to happen; it would just ensure that below-the-line comments were subject to the same constraints as posts on Twitter. That is the entire point of amendment 89.
I do not think that I need to say much more, other than to add one more thing about the way in which comments can direct people to other, more radical and extreme pieces or bits of information. It is sometimes the case that the comments on a newspaper article will direct people to even more extreme views. The newspaper article itself may be just slightly derogatory, while some of the comments may have links or references to other pieces, and other places on the internet where people can find a more radical point of view. That is exactly what happens on Twitter, and is exactly some of the stuff that we are trying to avoid—sending people down an extremist rabbit hole. I do not understand how the Minister thinks that the clause, which excludes below-the-line newspaper comments, is justifiable or acceptable.
Having been contacted by a number of newspapers, I understand and accept that some newspapers have moderation policies for their comments sections, but that is not strong enough. Twitter has a moderation policy, but that does not mean that there is actually any moderation, so I do not think that subjecting below-the-line comments to the provisions of the Bill is asking too much. It is completely reasonable for us to ask for this to happen, and I am honestly baffled as to why the Minister and the Government have chosen to make this exemption.
Before I address the amendments, I will speak to clause 49 more broadly.
Labour has concerns about a number of subsections of the clause, including subsections (2), and (8) to (10)—commonly known as the news publisher content exemption, which I have spoken about previously. We understand that the intention of the exemption is to shield broadcasters and traditional newspaper publishers from the Bill’s regulatory effects. Clause 50(2) defines a “recognised news publisher” as a regulated broadcaster or any other publisher that publishes news, has an office, and has a standards code and complaints process. There is no detail about the latter two requirements, thus enabling almost any news publishing enterprise to design its own code and complaints process, however irrational, and so benefit from the exemption. “News” is also defined broadly, and may include gossip. There remains a glaring omission, which amendment 43 addresses and which I will come to.
During an earlier sitting of the Committee, in response to comments made by my hon. Friend the Member for Liverpool, Walton as we discussed clause 2, the Minister claimed that
“The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic.”––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 204.]
Clause 49 exempts one-to-one live aural communications from the scope of regulation. Given that much interaction in virtual reality is live aural communication, including between two users, it is hard to understand how that would be covered by the Bill.
There is also an issue about what counts as content. Most standard understandings would define “content” as text, video, images and audio, but one of the worries about interactions in VR is that behaviour such as physical violence will be able to be replicated virtually, with psychologically harmful effects. It is very unclear how that would be within the scope of the current Bill, as it does not clearly involve content, so could the Minister please address that point? As he knows, Labour advocates for a systems-based approach, and for risk assessments and systems to take place in a more upstream and tech-agnostic way than under the current approach. At present, the Bill would struggle to be expanded effectively enough to cover those risks.
Amendment 43 would remove comments sections operated by news websites whose publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content. If the Bill is to be effective in protecting the public from harm, the least it must accomplish is a system of accountability that covers all the largest platforms used by British citizens. Yet as drafted, the Bill would exempt some of the most popular social media platforms online: those hosted on news publisher websites, which are otherwise known as comments sections. The amendment would close that loophole and ensure that the comments sections of the largest newspaper websites are subject to the regime of regulation set out in the Bill.
Newspaper comments sections are no different from the likes of Facebook and Twitter, in that they are social media platforms that allow users to interact with one another. This is done through comments under stories, comments in response to other comments, and other interactions—for example, likes and dislikes on posts. In some ways, their capacity to cause harm to the public is even greater: for example, their reach is in many cases larger than even the biggest of social media platforms. Whereas there are estimated to be around 18 million users of Twitter in the UK, more than twice that number of British citizens access newspaper websites every month, and the harm perpetuated on those platforms is severe.
In July 2020, the rapper Wiley posted a series of antisemitic tweets, which Twitter eventually removed after an unacceptable delay of 48 hours, but under coverage of the incident in The Sun newspaper, several explicitly antisemitic comments were posted. Those comments contained holocaust denial and alleged a global Jewish conspiracy to control the world. They remained up and accessible to The Sun’s 7 million daily readers for the best part of a week. If we exempt comments sections from the Bill’s proposed regime and the duties that the Bill sets for platforms, we will send the message that that kind of vicious, damaging and harmful racism is acceptable.
Similarly, after an antisemitic attack in the German city of Halle, racist comments followed in the comments section under the coverage in The Sun. There are more examples: Chinese people being described as locusts and attacked with other racial slurs; 5G and Bill Gates conspiracy theories under articles on the Telegraph website; and of course, the most popular targets for online abuse, women in public life. Comments that described the Vice-President of the United States as a “rat” and “ho” appeared on MailOnline. A female union leader has faced dozens of aggressive and abusive comments about her appearance, and many such comments remain accessible on newspaper comments sections to this day. Some of them have been up for months, others for years.
Last week, the Committee was sent a letter from a woman who was the victim of comments section abuse, Dr Corinne Fowler. Dr Fowler said of the comments that she received:
“These comments contained scores of suggestions about how to kill or injure me. Some were general ideas, such as hanging, but many were gender specific, saying that I should be burnt at the stake like a witch. Comments focused on physical violence, one man advising that I should be slapped hard enough to make my teeth chatter”.
She added:
“I am a mother: without me knowing, my son (then 12 years old) read these reader comments. He became afraid for my safety.”
Without the amendment, the Bill cannot do anything to protect women such as Dr Fowler and their families from this vile online abuse, because comments sections will be entirely out of scope of the Bill’s new regime and the duties designed to protect users.
As I understand it, two arguments have been made to support the exemption. First, it is argued that the complaints handlers for the press already deal with such content, but the handler for most national newspapers, the Independent Press Standards Organisation, will not act until a complaint is made. It then takes an average of six months for a complaint to be processed, and it cannot do anything if the comments have not been moderated. The Opposition do not feel that that is a satisfactory response to the seriousness of the harms that we know occur, and which I have described. IPSO does not even have a code to deal with cases such as the antisemitic abuse that appeared in the comments section of The Sun. IPSO’s record speaks for itself in the examples that I have given, and the many more like them, and it has proven to be no solution to the severity of harms that appear in newspaper comments sections.
The second argument for an exemption is that publishers are legally responsible for what appears on comments sections, but that is only relevant for illegal harms. For everything else, from disinformation to racial prejudice and abuse, regulation is needed. That is why it is so important that the Bill does the job that we were promised. To keep the public safe from harm online, comments sections must be covered under the Bill.
The amendment is a proportionate solution to the problem of comments section abuse. It would protect users’ freedom of expression and, given that it is subject to a turnover threshold, ensure that duties and other requirements do not place a disproportionate burden on smaller publishers such as locals, independents and blogs.
I have reams and reams and reams of examples from comments sections that all amount to incredibly harmful abuse and should be covered by the Bill. I could be here for hours reading them all out, and while I do not think that anybody in Committee would like me to, I urge Committee members to take a look for themselves at the types of comments under newspaper articles and ask themselves whether those comments should be covered by the terms of the Bill. I think they know the answer.
On a point of order, Ms Rees. Are we considering clause 49 now? I know that it is supposed to be considered under the next set of amendments, but I just wondered, because I have separate comments to make on that clause that I did not make earlier because I spoke purely to the amendment.
Let me start by addressing the substance of the two amendments and then I will answer one or two of the questions that arose in the course of the debate.
As Opposition Members have suggested, the amendments would bring the comments that appear below the line on news websites such as The Guardian, MailOnline or the BBC into the scope of the Bill’s safety duties. They are right to point out that there are occasions when the comments posted on those sites are extremely offensive.
There are two reasons why comments below BBC, Guardian or Mail articles are excluded from the scope of the Bill. First, the news media publishers—newspapers, broadcasters and their representative industry bodies—have made the case to the Government, which we are persuaded by, that the comments section below news articles is an integral part of the process of publishing news and of what it means to have a free press. The news publishers—both newspapers and broadcasters that have websites—have made that case and have suggested, and the Government have accepted, that intruding into that space through legislation and regulation would represent an intrusion into the operation of the free press.
I am sorry, but I am having real trouble buying that argument. If the Minister is saying that newspaper comments sections are exempt in order to protect the free press because they are an integral part of it, why do we need the Bill in the first place? Social media platforms could argue in the same way that they are protecting free speech. They could ask, “Why should we regulate any comments on our social media platform if we are protecting free speech?” I am sorry; that argument does not wash.
There is a difference between random individuals posting stuff on Facebook and content generated by what we have defined as a “recognised news publisher”. We will debate that in a moment. We recognise that difference in the Bill. Although the Opposition are looking to make amendments to clause 50, they appear to accept that the press deserve special protection. Article 10 case law deriving from the European convention on human rights also recognises that the press have a special status. In our political discourse we often refer generally to the importance of the freedom of the press. We recognise that the press are different, and the press have made the case—both newspapers and broadcasters, all of which now have websites—that their reader engagement is an integral part of that free speech. There is a difference between that and individuals chucking stuff on Facebook outside the context of a news article.
There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.
Virality is an inherent design feature in social media sites. It is not an inherent design feature of the comments we get under the news website of the BBC, The Guardian or the Daily Mail. There is no way of generating virality in the same way as there is on Facebook and Twitter. Facebook and Twitter are designed to generate massive virality in a way that comments below a news website are not. The reach, and the ability for them to grow exponentially, is orders of magnitude lower on a news website comment section than on Facebook. That is an important difference, from a risk point of view.
No, I will let that particular weed die in the bed. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
Briefly, as with earlier clauses, the Labour party recognises the challenge in finding the balance between freedom of expression and keeping people safe online. Our debate on the amendment has illustrated powerfully that the exemptions as they stand in the Bill are hugely flawed.
First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many in my earlier contributions, and I have paid for that online.
Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions. I recall that Kyle from FairVote made that point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not itself be sufficient reason to treat such content differently from that which is posted by private citizens.
Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, and yet there is no parallel stipulation for non-broadcast media to be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media, the series of requirements are not rational, exclude many independent publishers and leave room for ambiguity.
I have a couple of questions that were probably too long for interventions. The Minister said that if comments on a site are the only user-generated content, they are not in scope. It would be really helpful if he explained what exactly he meant by that. We were talking about services that do not fall within the definition of “recognised news publishers”, because we were trying to add them to that definition. I am not suggesting that the Minister is wrong in any way, but I do not understand where the Bill states that those comments are excluded, and how this all fits together.
I made general comments about clause 50 during the debate on amendment 107; I will not try the Committee’s patience by repeating them, but I believe that in them, I addressed some of the issues that the shadow Minister, the hon. Member for Pontypridd, has raised.
On the hon. Member for Aberdeen North’s question about where the Bill states that sites with limited functionality—for example, functionality limited to comments alone—are out of scope, paragraph 4(1) of schedule 1 states that
“A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—
(a) posting comments or reviews relating to provider content;
(b) sharing such comments or reviews on a different internet service”.
Clearly, services where a user can share freely are in scope, but if they cannot share directly—if they can only share via another service, such as Facebook—that service is out of scope. This speaks to the point that I made to the hon. Member for Batley and Spen in a previous debate about the level of virality, because the ability of content to spread, proliferate, and be forced down people’s throats is one of the main risks that we are seeking to address through the Bill. I hope that paragraph 4(1) of schedule 1 is of assistance, but I am happy to discuss the matter further if that would be helpful.
Question put and agreed to.
Clause 50 accordingly ordered to stand part of the Bill.
Clause 51
“Search content”, “search results” etc
Question proposed, That the clause stand part of the Bill.
Labour does not oppose the intention of the clause. It is important to define “search content” in order to understand the responsibilities that fall within search services’ remits.
However, we have issues with the way that the Bill treats user-to-user services and search services differently when it comes to risk-assessing and addressing legal harm—an issue that we will come on to when we debate schedule 10. Although search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars are fundamentally their responsibility. We do, however, accept that over the past 20 years, Google, for example, has developed mechanisms to provide a safer search experience for users while not curtailing access to lawful information. We also agree that search engines are critical to the proper functioning of the world wide web; they play a uniquely important role in facilitating access to the internet, and enable people to access, impart, and disseminate information.
Question put and agreed to.
Clause 51 accordingly ordered to stand part of the Bill.
Clause 52
“Illegal content” etc
I beg to move amendment 61, in clause 52, page 49, line 5, at end insert—
“(4A) An offence referred to in subsection (4) is deemed to have occurred if it would be an offence under the law of the United Kingdom regardless of whether or not it did take place in the United Kingdom.”
This amendment brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.
With this it will be convenient to discuss the following:
Clause stand part.
That schedules 5 and 6 be the Fifth and Sixth schedules to the Bill.
With your permission, Ms Rees, I will speak to clause 52 before coming to amendment 61. Illegal content is defined in clause 52(2) as
“content that amounts to a relevant offence.”
However, as the Minister will know from representations from Carnegie UK to his Department—we share its concerns—the illegal and priority illegal regimes may not be able to operate as intended. The Bill requires companies to decide whether content “amounts to” an offence, with limited room for movement. We share concerns that that points towards decisions on an item-by-item basis; it means detecting intent for each piece of content. However, such an approach does not work at the scale on which platforms operate; it is bad regulation and poor risk management.
There seem to be two different problems relating to the definition of “illegal content” in clause 52. The first is that it is unclear whether we are talking about individual items of content or categories of content—the word “content” is ambiguous because it can be singular or plural—which is a problem for an obligation to design and run a system. Secondly, determining when an offence has taken place will be complex, especially bearing in mind mens rea and defences, so the providers are not in a position to get it right.
The use of the phrase “amounts to” in clause 52(2) seems to suggest that platforms will be required to identify accurately, in individual cases, where an offence has been committed, without any wriggle room drafted in, unlike in the draft Bill. As the definition now contains no space for error either side of the line, it could be argued that there are more incentives to avoid false negatives than false positives—providers can set higher standards than the criminal law—and that leads to a greater risk of content removal. That becomes problematic, because it seems that the obligation under clause 9(3) is then to have a system that is accurate in all cases, whereas it would be more natural to deal with categories of content. This approach seems not to be intended; support for that perspective can be drawn from clause 9(6), which recognises that there is a distinction between categories of content and individual items, and that the application of terms of service might specifically have to deal with individual instances of content. Critically, the “amounts to” approach cannot work in conjunction with a systems-based approach to harm reduction. That leaves victims highly vulnerable.
This problem is easily fixed by a combination of reverting to the draft Bill’s language, which required reasonableness, and using concepts found elsewhere in the Bill that enable a harm mitigation system to operate for illegal content. We also remind the Minister that Ofcom raised this issue in the evidence sessions. I would be grateful if the Minister confirmed whether we can expect a Government amendment to rectify this issue shortly.
More broadly, as we know, priority illegal content, which falls within illegal content, includes,
“(a) terrorism content,
(b) CSEA content, and
(c) content that amounts to an offence specified in Schedule 7”,
as set out in clause 52(7). Such content attracts a greater level of scrutiny and regulation. Situations in which user-generated content will amount to “a relevant offence” are set out in clause 52(3). Labour supports the inclusion of a definition of illegal content as outlined in the grouping; it is vital that service providers and platforms have a clear indication of the types of content that they will have a statutory duty to consider when building, or making changes to the back end of, their business models.
We have also spoken about the importance of parity between the online and offline spaces—what is illegal offline must be illegal online—so the Minister knows we have more work to do here. He also knows that we have broad concerns around the omissions in the Bill. While we welcome the inclusion of terrorism and child sexual exploitation content as priority illegal content, there remain gaps in addressing violence against women and girls content, which we all know is hugely detrimental to many online.
The UK Government stated that their intention for the Online Safety Bill was to make the UK the safest place to be online in the world, yet the Bill does not mention online gender-based violence once. More than 60,000 people have signed the Glitch and End Violence Against Women Coalition’s petition calling for women and girls to be included in the Bill, so the time to act is now. We all have a right to not just survive but thrive, engage and play online, and not have our freedom of expression curtailed or our voices silenced by perpetrators of abuse. The online space is just as real as the offline space. The Online Safety Bill is our opportunity to create safe digital spaces.
The Bill must name the problem. Violence against women and girls, particularly those who have one or multiple protected characteristics, is creating harm and inequality online. We must actively and meaningfully name this issue and take an intersectional approach to ending online abuse to ensure that the Bill brings meaningful change for all women. We also must ensure that the Bill truly covers all illegal content, whether it originated in the UK or not.
Amendment 61 brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content. The aim of the amendment is to clarify whether the Bill covers content created overseas that would be illegal if what was shown in the content took place in the UK. For example, animal abuse and cruelty content is often filmed abroad. The same can be said for dreadful human trafficking content and child sexual exploitation. The optimal protection would be if the Bill’s definition of illegal content covered matter that would be illegal in either the UK or the country it took place in, regardless of whether it originated in the UK.
I do not intend to make a speech, but I want to let the hon. Lady know that we wholeheartedly support everything that she has said on the clause and amendment 61.
I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.
The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about the overwhelming practical challenges it presents. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:
“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]
That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.
I rise to speak on clause 52 stand part, particularly—the Minister will not be surprised—the element in subsection (4)(c) around the offences specified in schedule 7. The debate has been very wide ranging throughout our sittings. It is extraordinary that we need a clause defining what is illegal. Presumably, most people who provide goods and services in this country would soon go out of business if they were not knowledgeable about what is illegal. The Minister is helping the debate very much by setting out clearly what is illegal, so that people who participate in the social media world are under no illusion as to what the Government are trying to achieve through this legislation.
The truth is that the online world has unfolded without a regulatory framework. New offences have emerged, and some of them are tackled in the Bill, particularly cyber-flashing. Existing offences have taken on a new level of harm for their victims, particularly when it comes to taking, making and sharing intimate images without consent. As the Government have already widely acknowledged, because the laws on that are such a patchwork, it is difficult for the enforcement agencies in this country to adequately protect the victims of that heinous crime, who are, as the Minister knows, predominately women.
As always, the right hon. Lady makes an incredibly powerful point. She asked specifically about whether the Bill is a suitable legislative vehicle in which to implement any Law Commission recommendations—we do not yet have the final version of that report—and I believe that that would be in scope. A decision about legislative vehicles depends on the final form of the Law Commission report and the Ministry of Justice response to it, and on cross-Government agreement about which vehicle to use.
I hope that addresses all the questions that have been raised by the Committee. Although the shadow Minister is right to raise the question, I respectfully ask her to withdraw amendment 61 on the basis that those matters are clearly covered in clause 52(9). I commend the clause to the Committee.
I am grateful to the Minister for his comments. The Labour party has concerns that clause 52(9) does not adequately get rid of the ambiguity around potential illegal online content. We feel that amendment 61 sets that out very clearly, which is why we will press it to a vote.
Just to help the Committee, what is it in clause 52(9) that is unclear or ambiguous?
We just feel that amendment 61 outlines matters much more explicitly and leaves no ambiguity by clearly defining any
“offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.”
I think they say the same thing, but we obviously disagree.
Question put, That the amendment be made.
(2 years, 5 months ago)
Public Bill Committees
Good morning, Ms Rees. It is a pleasure to serve once again under your chairmanship. I wondered whether the shadow Minister, the hon. Member for Pontypridd, wanted to speak first—I am always happy to follow her, if she would prefer that.
I do my best.
Clauses 17 and 27 have similar effects, the former applying to user-to-user services and the latter to search services. They set out an obligation on the companies to put in place effective and accessible content reporting mechanisms, so that users can report issues. The clauses will ensure that service providers are made aware of illegal and harmful content on their sites. In relation to priority illegal content, the companies must proactively prevent it in the first place, but in the other areas, they may respond reactively as well.
The clause will ensure that anyone who wants to report illegal or harmful content can do so in a quick and reasonable way. We are ensuring that everyone who needs to report content will be able to do so; the facility will be open to those who are affected by the content but who are not themselves users of the site. For example, that might be non-users who are the subject of the content, such as a victim of revenge pornography, or non-users who are members of a specific group with certain characteristics targeted by the content, such as a member of the Jewish community reporting antisemitic content. There is also facility for parents and other adults with caring responsibility for children, and adults caring for another adult, to report content. Clause 27 sets out similar duties in relation to search. I commend the clauses to the Committee.
I will talk about this later, when we come to a subsequent clause to which I have tabled some amendments—I should have tabled some to this clause, but unfortunately missed the chance to do so.
I appreciate the Minister laying out why he has designated the people covered by this clause; my concern is that “affected” is not wide enough. My logic is that, on the strength of these provisions, I might not be able to report racist content that I come across on Twitter if I am not the subject of that content—if I am not a member of a group that is the subject of the content or if I am not caring for someone who is the subject of it.
I appreciate what the Minister is trying to do, and I get the logic behind it, but I think the clause unintentionally excludes some people who would have a reasonable right to expect to be able to make reports in this instance. That is why I tabled amendments 78 and 79 to clause 28, about search functions, but those proposals would have worked reasonably for this clause as well. I do not expect a positive answer from the Minister today, but perhaps he could give consideration to my concern. My later amendments would change “affected person” to “any other person”. That would allow anyone to make a report, because if something is illegal content, it is illegal content. It does not matter who makes the report, and it should not matter that I am not a member of the group of people targeted by the content.
I report things all the time, particularly on Twitter, and a significant amount of it is nothing to do with me. It is not stuff aimed at me; it is aimed at others. I expect that a number of the platforms will continue to allow reporting for people who are outwith the affected group, but I do not want to be less able to report than I am currently, and that would be the case for many people who see concerning content on the internet.
The hon. Lady is making a really important point. One stark example that comes to my mind is when English footballers suffered horrific racist abuse following the penalty shootout at the Euros last summer. Hundreds of thousands of people reported the abuse that they were suffering to the social media platforms on their behalf, in an outcry of solidarity and support, and it would be a shame if people were prevented from doing that.
I absolutely agree. I certainly do not think I am suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or anything that starts after this legislation comes, could limit the ability to report on the basis of these clauses.
I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.
I think the shadow Minister wanted to intervene, unless I have answered her point already.
I wanted to reiterate the point that the hon. Member for Aberdeen North made, which the Minister has not answered. If he has such faith that the systems and processes will be changed and controlled by Ofcom as a result of the Bill, why is he so reluctant to put in an ombudsman? It will not be overwhelmed with complaints if the systems and processes work, and therefore protect victims. We have already waited far too long for the Bill, and now he says that we need to wait two to four years for a review, and even longer to implement an ombudsman to protect victims. Why will he not just put this in the Bill now to keep them safe?
Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.
As I have said, at the moment there is nothing at all. Platforms such as Facebook can and do arbitrarily censor content with little if any regard for freedom of speech. Some platforms have effectively cancelled Donald Trump while allowing the Russian state to propagate shocking disinformation about the Russian invasion of Ukraine, so there is real inconsistency and a lack of respect for freedom of speech. This at least establishes something where currently there is nothing. We can debate whether “have regard to” is strong enough. We have heard the other point of view from the other side of the House, which expressed concern that it might be used to allow otherwise harmful content, so there are clearly arguments on both sides of the debate. The obligation to have regard does have some weight, because the issue cannot be completely ignored. I do not think it would be adequate to simply pay lip service to it and not give it any real regard, so I would not dismiss the legislation as drafted.
I would point to the clauses that we have recently discussed, such as clause 15, under which content of democratic importance—which includes debating current issues and not just stuff said by an MP or candidate—gets additional protection. Some of the content that my hon. Friend the Member for Don Valley referred to a second ago would probably also get protection under clause 14, under which content of democratic importance has to be taken into account when making decisions about taking down or removing particular accounts. I hope that provides some reassurance that this is a significant step forwards compared with where the internet is today.
I share the Minister’s sentiments about the Bill protecting free speech; we all want to protect that. He mentions some of the clauses we debated on Tuesday regarding democratic importance. Some would say that debating this Bill is of democratic importance. Since we started debating the Bill on Tuesday, and since I have mentioned some of the concerns raised by stakeholders and others about the journalistic exemption and, for example, Tommy Robinson, my Twitter mentions have been a complete sewer—as everyone can imagine. One tweet I received in the last two minutes states:
“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”
in this country. Does the Minister agree that that is content of democratic importance, given we are debating this Bill, and that it should remain on Twitter?
That sounds like a very offensive tweet. Could the hon. Lady read it again? I did not quite catch it.
Yes:
“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”
in this country. It goes on:
“this is a toxic combination of bloc vote grubbing and woke”
culture, and there is a lovely GIF to go with it.
I do not want to give an off-the-cuff assessment of an individual piece of content—not least because I am not a lawyer. It does not sound like it meets the threshold of illegality. It most certainly is offensive, and that sort of matter is one that Ofcom will set out in its codes of practice, but there is obviously a balance between freedom of speech and content that is harmful, which the codes of practice will delve into. I would be interested if the hon. Lady could report that to Twitter and then report back to the Committee on what action it takes.
At the moment, there is no legal obligation to do anything about it, which is precisely why this Bill is needed, but let us put it to the test.
Question put and agreed to.
Clause 19 accordingly ordered to stand part of the Bill.
Clause 20
Record-keeping and review duties
Question proposed, That the clause stand part of the Bill.
(2 years, 5 months ago)
Public Bill Committees
It is a great pleasure to serve under your chairmanship, Ms Rees, and I am glad that this afternoon’s Committee proceedings are being broadcast to the world.
Before we adjourned this morning, I was in the process of saying that one of the challenges with publishing the full risk assessment, even for larger companies, is that the vulnerabilities in their systems, or the potential opportunities to exploit those systems for criminal purposes, would then be publicly exposed in a way that may not serve the public interest, and that is a reason for not requiring complete disclosure of everything.
However, I draw the Committee’s attention to the existing transparency provisions in clause 64. We will come on to them later, but I want to mention them now, given that they are relevant to amendment 10. The transparency duties state that, once a year, Ofcom must serve notice on the larger companies—those in categories 1, 2A and 2B—requiring them to produce a transparency report. That is not a power for Ofcom—it is a requirement. Clause 64(1) states that Ofcom
“must give every provider…a notice which requires the provider to produce…(a ‘transparency report’).”
The content of the transparency report is specified by Ofcom, as set out in subsection (3). As Members will see, Ofcom has wide powers to specify what must be included in the report. On page 186, schedule 8—I know that we will debate it later, but it is relevant to the amendment—sets out the scope of what Ofcom can require. It is an extremely long list that covers everything we would wish to see. Paragraph 1, for instance, states:
“The incidence of illegal content, content that is harmful to children and priority content that is harmful to adults on a service.”
Therefore, the transparency reporting requirement—it is not an option but a requirement—in clause 64 addresses the transparency point that was raised earlier.
Amendment 14 would require a provider’s board members or senior manager to take responsibility for the illegal content risk assessment. We agree with the Opposition’s point. Indeed, we agree with what the Opposition are trying to achieve in a lot of their amendments.
There is a “but” coming. We think that, in all cases apart from one, the Bill as drafted already addresses the matter. In the case of amendment 14, the risk assessment duties as drafted already explicitly require companies to consider how their governance structures may affect the risk of harm to users arising from illegal content. Ofcom will provide guidance to companies about how they can comply with those duties, which is very likely to include measures relating to senior-level engagement. In addition, Ofcom can issue confirmation decisions requiring companies to take specific steps to come into compliance. To put that simply, if Ofcom thinks that there is inadequate engagement by senior managers in relation to the risk assessment duties, it can require—it has the power to compel—a change of behaviour by the company.
I come now to clause 9—I think this group includes clause 9 stand part as well. The shadow Minister has touched on this. Clause 9 contains safety duties in relation to—
The hon. Lady raises an interesting point about time. However, clause 8(5)(d) uses the wording,
“the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content”
and so on. That presence can happen at any time, even fleetingly, as with Snapchat. Even when the image self-deletes after a certain period—so I am told, I have not actually used Snapchat—the presence has occurred. Therefore, that would be covered by clause 8(5)(d).
Will the Minister explain how we would be able to prove, once the image is deleted, that it was present on the platform?
The question of proof is a separate one, and that would apply however we drafted the clause. The point is that the clause provides that any presence of a prohibited image would fall foul of the clause. There are also duties on the platforms to take reasonable steps. In the case of matters such as child sexual exploitation and abuse images, there are extra-onerous duties that we have discussed before, for obvious and quite correct reasons.
Order. Minister, before you continue, before the Committee rose earlier today, there was a conversation about clause 9 being in, and then I was told it was out. This is like the hokey cokey; it is back in again, just to confuse matters further. I was confused enough, so that point needs to be clarified.
It is grouped, Chair. We were discussing clause 8 and the relevant amendments, then we were going to come back to clause 9 and the relevant amendments.
Obviously, I encourage the Committee to support those clauses standing part of the Bill. They impose duties on search services—we touched on search a moment ago—to assess the nature and risk to individuals of accessing illegal content via their services, and to minimise the risk of users encountering that illegal content. They are very similar duties to those we discussed for user-to-user services, but applied in the search context. I hope that that addresses all the relevant provisions in the group that we are debating.
I am grateful for the opportunity to speak to amendments to clause 9 and to clauses 23 and 24, which I did not speak on earlier. I am also very grateful that we are being broadcast live to the world and welcome that transparency for all who might be listening.
On clause 9, it is right that the user-to-user services will be required to have specific duties and to take appropriate measures to mitigate and manage the risk of harm to individuals and their likelihood of encountering priority illegal content. Again, however, the Bill does not go far enough, which is why we are seeking to make these important amendments. On amendment 18, it is important to stress that the current scope of the Bill does not capture the range of ways in which child abusers use social networks to organise abuse, including to form offender networks. They post digital breadcrumbs that signpost to illegal content on third-party messaging apps and the dark web, and they share child abuse videos that are carefully edited to fall within content moderation guidelines. This range of techniques, known as child abuse breadcrumbing, is a significant enabler of online child abuse.
Our amendment would give the regulator powers to tackle breadcrumbing and ensure a proactive upstream response. The amendment would ensure that tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material will be brought into regulatory scope. It will not leave that ambiguous. The amendment will also ensure that companies must tackle child abuse at the earliest possible stage. As it stands, the Bill would reinforce companies’ current focus only on material that explicitly reaches the criminal threshold. Because companies do not focus their approach on other child abuse material, abusers can exploit this knowledge to post carefully edited child abuse images and content that enables them to connect and form networks with other abusers. Offenders understand and can anticipate that breadcrumbing material will not be proactively identified or removed by the host site, so they are able to organise and link to child abuse in plain sight.
We all know that child abuse breadcrumbing takes many forms, but techniques include tribute sites where users create social media profiles using misappropriated identities of known child abuse survivors. These are used by offenders to connect with likeminded perpetrators to exchange contact information, form offender networks and signpost child abuse material elsewhere online. In the first quarter of 2021, there were 6 million interactions with such accounts.
Abusers may also use Facebook groups to build offender groups and signpost to child abuse hosted on third-party sites. Those groups are thinly veiled in their intentions; for example, as we heard in evidence sessions, groups are formed for those with an interest in children celebrating their 8th, 9th or 10th birthdays. Several groups with over 50,000 members remained live despite being reported to Meta, and algorithmic recommendations quickly suggested additional groups for those members to join.
Lastly, abusers can signpost to content on third-party sites. Abusers are increasingly using novel forms of technology to signpost to online child abuse, including QR codes, immersive technologies such as the metaverse, and links to child abuse hosted on the blockchain. Given the highly agile nature of the child abuse threat and the demonstrable ability of sophisticated offenders to exploit new forms of technology, this amendment will ensure that the legislation is effectively futureproofed. Technological change makes it increasingly important that the ability of child abusers to connect and form offender networks can be disrupted at the earliest possible stage.
Turning to amendment 21, we know that child abuse is rarely siloed on a single platform or app. Well-established grooming pathways see abusers exploit the design features of social networks to contact children before they move communication across to other platforms, including livestreaming sites, as we have already heard, and encrypted messaging services. Offenders manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with a large number of children. They can then use direct messages to groom them and coerce children into sending sexual images via WhatsApp. Similarly, as we heard earlier, abusers can groom children through playing videogames and then bringing them on to another ancillary platform, such as Discord.
The National Society for the Prevention of Cruelty to Children has shared details of an individual whose name has been changed, and whose case particularly highlights the problems that children are facing in the online space. Ben was 14 when he was tricked on Facebook into thinking he was speaking to a female friend of a friend, who turned out to be a man. Using threats and blackmail, he coerced Ben into sending abuse images and performing sex acts live on Skype. Those images and videos were shared with five other men, who then bombarded Ben with further demands. His mum, Rachel, said:
“The abuse Ben suffered had a devastating impact on our family. It lasted two long years, leaving him suicidal.
It should not be so easy for an adult to meet and groom a child on one site then trick them into livestreaming their own abuse on another app, before sharing the images with like-minded criminals at the click of a button.
Social media sites should have to work together to stop this abuse happening in the first place, so other children do not have to go through what Ben did.”
The current drafting of the Bill does not place sufficiently clear obligations on platforms to co-operate on the cross-platform nature of child abuse. Amendment 21 would require companies to take reasonable and proportionate steps to share threat assessments, develop proportionate mechanisms to share offender intelligence, and create a rapid response arrangement to ensure that platforms develop a coherent, systemic approach to new and emerging threats. Although the industry has developed a systemic response to the removal of known child abuse images, these are largely ad hoc arrangements that share information on highly agile risk profiles. The cross-platform nature of grooming and the interplay of harms across multiple services need to be taken into account. If it is not addressed explicitly in the Bill, we are concerned that companies may be able to cite competition concerns to avoid taking action.
On the topic of child abuse images, the hon. Member spoke earlier about livestreaming and those images not being captured. I assume that she would make the same point in relation to this issue: these live images may not be captured by AI scraping for them, so it is really important that they are included in the Bill in some way as well.
I completely agree with the hon. Member, and appreciate her intervention. It is fundamental for this point to be captured in the Bill because, as we are seeing, this is happening more and more. More and more victims are coming forward who have been subject to livestreaming that is not picked up by the technology available, and is then recorded and posted elsewhere on smaller platforms.
Legal advice suggests that cross-platform co-operation is likely to be significantly impeded by the negative interplay with competition law unless there is a clear statutory basis for enabling or requiring collaboration. Companies may legitimately have different risk and compliance appetites, or may simply choose to hide behind competition law to avoid taking a more robust form of action.
New and emerging technologies are likely to produce an intensification of cross-platform risks in the years ahead, and we are particularly concerned about the child abuse impacts in immersive virtual reality and alternative-reality environments, including the metaverse. A number of high-risk immersive products are already designed to be platform-agnostic, meaning that in-product communication takes place between users across multiple products and environments. There is a growing expectation that these environments will be built along such lines, with an incentive for companies to design products in this way in the hope of blunting the ability of Governments to pursue user safety objectives.
Separately, regulatory measures that are being developed in the EU, but are highly likely to impact service users in the UK, could result in significant unintended safety consequences. Although the interoperability provisions in the Digital Markets Act are strongly beneficial when viewed through a competition lens—they will allow interconnection and communication between multiple platforms—they could, without appropriate safety mitigations, provide new means for abusers to contact children across multiple platforms, significantly increase the overall profile of cross-platform risk, and actively frustrate a broad number of current online safety responses. Amendment 21 would provide corresponding safety requirements that can mitigate the otherwise significant potential for unintended consequences.
The Minister referred to clauses 23 and 24 in relation to amendments 30 and 31. We think a similar consideration should apply to search services as well as to user-to-user services. We urge him to accept the amendments, in order to prevent those harms from occurring.
I have already commented on most of those amendments, but one point that the shadow Minister made that I have not addressed was about acts that are essentially preparatory to acts of child abuse or the exchange of child sexual exploitation and abuse images. She was quite right to raise that issue as a matter of serious concern that we would expect the Bill to prevent, and I offer the Committee the reassurance that the Bill, as drafted, does so.
Schedule 6 sets out the various forms of child sexual exploitation and abuse that are designated as priority offences and that platforms have to take proactive steps to prevent. On the cross-platform point, that includes, as we have discussed, things that happen through a service as well as on a service. Critically, paragraph 9 of schedule 6 includes “inchoate offences”, which means someone not just committing the offence but engaging in acts that are preparatory to committing the offence, conspiring to commit the offence, or procuring, aiding or abetting the commission of the offence. The preparatory activities that the shadow Minister referred to are covered under schedule 6, particularly paragraph 9.
I thank the Minister for giving way. I notice that schedule 6 includes provision on the possession of indecent photographs of children. Can he confirm that that provision encapsulates the livestreaming of sexual exploitation?
Amendments 20, 26, 18 and 21 to clause 9 have already been debated. Does the shadow Minister wish to press any of them to a vote?
Amendments 20, 18 and 21.
Amendment proposed: 20, in clause 9, page 7, line 30, at end insert
“, including by being directed while on the service towards priority illegal content hosted by a different service;”—(Alex Davies-Jones.)
This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.
Question put, That the amendment be made.
Yes, and that is why governance is addressed in the clause as drafted. But the one thing that will really change the way the leadership of these companies thinks about this issue is the one thing they ultimately care about—money. The reason they allow unsafe content to circulate and do not rein in or temper their algorithms, and the reason we are in this situation, which has arisen over the last 10 years or so, is that these companies have consistently prioritised profit over protection. Ultimately, that is the only language they understand—it is that and legal compulsion.
While the Bill rightly addresses governance in clause 10 and in other clauses, as I have said a few times, what has to happen to make this change occur is the compulsion that is inherent in the powers to fine and to deny service—to pull the plug—that the Bill also contains. The thing that will give reassurance to our constituents, and to me as a parent, is knowing that for the first time ever these companies can properly be held to account. They can be fined. They can have their connection pulled out of the wall. Those are the measures that will protect our children.
The Minister is being very generous with his time, but I do not think he appreciates the nature of the issue. Mark Zuckerberg’s net worth is $71.5 billion. Elon Musk, who is reported to be purchasing Twitter, is worth $218 billion. Bill Gates is worth $125 billion. Money does not matter to these people.
The Minister discusses huge fines for the companies and the potential sanction of bringing down their platforms. They will just set up another one. That is what we are seeing with the smaller platforms: they are closing down and setting up new platforms. These measures do not matter. What matters and will actually make a difference to the safety of children and adults online is personal liability—holding people personally responsible for the direct harm they are causing to people here in the United Kingdom. That is what these amendments seek to do, and that is why we are pushing them so heavily. I urge the Minister to respond to that.
We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.
The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.
I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.
We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.
Question put and agreed to.
Clause 11 accordingly ordered to stand part of the Bill.
Clause 12
Adults’ risk assessment duties
I beg to move amendment 12, in clause 12, page 12, line 10, at end insert—
“(4A) A duty to publish the adults’ risk assessment and proactively supply this to OFCOM.”
This amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom.
The amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom. As my hon. Friend the Member for Worsley and Eccles South remarked when addressing clause 10, transparency and scrutiny of those all-important risk assessments must be at the heart of the Online Safety Bill. We all know that the Government have had a hazy record on transparency lately but, for the sake of all in the online space, I sincerely hope that the Minister will see the value in ensuring that the risk assessments are accurate, proactively supplied and published for us all to consider.
It is only fair that all the information about risks to personal safety be made available to users of category 1 services, which we know are the most popular and, often, the most troublesome services. We all want people to feel empowered to make their own decisions about their behaviour both online and offline. That is why we are pushing for a thorough approach to risk assessments more widely. Also, without a formal duty to publicise those risk assessments, I fear there will be little change in our safety online. The Minister has suggested that the platforms will look back at Hansard in years to come to determine whether they are doing the right thing. Unless we make that a statutory obligation in the Bill, I fear that reference will fall on deaf ears.
Once again, I agree with the point about transparency and the need to have those matters brought into the light of day. We heard from Frances Haugen how Facebook—now Meta—actively resisted doing so. However, I point to two provisions already in the Bill that deliver precisely that objective. I know we are debating clause 12, but there is a duty in clause 13(2) for platforms to publish in their terms of service—a public document—the findings of the most recent adults’ risk assessment. That duty is in clause 13—the next clause we are going to debate—in addition to the obligations I have referred to twice already in clause 64, where Ofcom compels those firms to publish their transparency reports. I agree with the points that the shadow Minister made, but suggest that through clause 13(2) and clause 64, those objectives are met in the Bill as drafted.
I thank the Minister for his comments, but sadly we do not feel that is appropriate or robust enough, which is why we will be pressing the amendment to a Division.
Question put, That the amendment be made.
The Committee divided.
While I am at risk of parroting my hon. Friend the Member for Worsley and Eccles South on clause 11, it is important that adults and the specific risks they face online are considered in the clause. The Minister knows we have wider concerns about the specific challenges of the current categorisation system. I will come on to that at great length later, but I thought it would be helpful to remind him at this relatively early stage that the commitments to safety and risk assessments for category 1 services will only work if category 1 encapsulates the most harmful platforms out there. That being said, Labour broadly supports this clause and has not sought to amend it.
I am eagerly awaiting the lengthy representations that the shadow Minister just referred to, as are, I am sure, the whole Committee and indeed the millions watching our proceedings on the live broadcast. As the shadow Minister said, clause 13 sets out the safety duties in relation to adults. This concerns content that is legal but potentially harmful to adults; for the topics specified in secondary legislation, the clause will require category 1 services to set out clearly what actions they might take—from the actions specified in subsection (4)—in relation to that content.
It is important to specify that the action taken is a choice for the platform. I know some people have raised issues concerning free speech and these duties, but I want to reiterate and be clear that this is a choice for the platform. Platforms have to be publicly clear about what choices they are making, and they must apply those choices consistently. That is a significant improvement on where we are now, where some of these policies are applied arbitrarily.
Question put and agreed to.
Clause 13 accordingly ordered to stand part of the Bill.
Clause 14
User empowerment duties
I beg to move amendment 46, in clause 14, page 14, line 12, after “non-verified users” insert
“and to enable them to see whether another user is verified or non-verified.”
This amendment would make it clear that, as part of the User Empowerment Duty, users should be able to see which other users are verified and which are non-verified.
With this it will be convenient to discuss the following:
Amendment 47, in clause 189, page 155, line 1, at end insert
“‘Identity Verification’ means a system or process designed to enable a user to prove their identity, for purposes of establishing that they are a genuine, unique, human user of the service and that the name associated with their profile is their real name.”
This amendment adds a definition of Identity Verification to the terms defined in the Bill.
New clause 8—OFCOM’s guidance about user identity verification—
“(1) OFCOM must produce guidance for providers of Category 1 services on how to comply with the duty set out in section 57(1).
(2) In producing the guidance (including revised or replacement guidance), OFCOM must have regard to—
(a) ensuring providers offer forms of identity verification which are likely to be accessible to vulnerable adult users and users with protected Characteristics under the Equality Act 2010,
(b) promoting competition, user choice, and interoperability in the provision of identity verification,
(c) protection of rights, including rights to privacy, freedom of expression, safety, access to information, and the rights of children,
(d) alignment with other relevant guidance and regulation, including with regards to Age Assurance and Age Verification.
(3) In producing the guidance (including revised or replacement guidance), OFCOM must set minimum standards for the forms of identity verification which Category 1 services must offer, addressing—
(a) effectiveness,
(b) privacy and security,
(c) accessibility,
(d) time-frames for disclosure to Law Enforcement in case of criminal investigations,
(e) transparency for the purposes of research and independent auditing,
(f) user appeal and redress mechanisms.
(4) Before producing the guidance (including revised or replacement guidance), OFCOM must consult—
(a) the Information Commissioner,
(b) the Digital Markets Unit,
(c) persons whom OFCOM consider to have technological expertise relevant to the duty set out in section 57(1),
(d) persons who appear to OFCOM to represent the interests of users including vulnerable adult users of Category 1 services, and
(e) such other persons as OFCOM considers appropriate.
(5) OFCOM must publish the guidance (and any revised or replacement guidance).”
This new clause would require Ofcom to set a framework of principles and minimum standards for the User Verification Duty.
The revised Bill seeks to address the problems associated with anonymity through requiring platforms to empower users, with new options to verify their identity and filter out non-verified accounts. This is in line with the approach recommended by Clean Up The Internet and also reflects the approach proposed in the Social Media Platforms (Identity Verification) Bill, which was tabled by the hon. Member for Stroud (Siobhan Baillie) and attracted cross-party support. It has the potential to strike a better balance between tackling the clear role that anonymity can play in fuelling abuse and disinformation, while safeguarding legitimate uses of anonymity, including by vulnerable users, for whom anonymity can act as a protection. However, Labour does share the concerns of stakeholders around the revised Bill, which we have sought to amend.
Amendment 46 aims to empower people to use this information about verification when making judgments about the reliability of other accounts and the content they share. This would ensure that the user verification duty helps disrupt the use of networks of inauthentic accounts to spread disinformation. Labour welcomes the inclusion in the revised Bill of measures designed to address harm associated with misuse of anonymous social media accounts. There is considerable evidence from Clean Up The Internet and others that anonymity fuels online abuse, bullying and trolling and that it is one of the main tools used by organised disinformation networks to spread and amplify false, extremist and hateful content.
Clause 14 falls short of truly empowering people to make the most well-informed decisions about the type of content they engage with. We believe that this would be a simple change from a design perspective. Category 1 platforms are already able to verify different types of accounts, whether they be personal or business accounts, so ensuring that people are equipped with this information more broadly would be an easy step for the big platforms to take. Indeed, the Joint Committee’s prelegislative scrutiny recommended that the Government consider, as part of Ofcom’s code of practice, a requirement for the largest and highest-risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category.
I know that there are concerns about verification, and there is a delicate balance between anonymity, free speech and protecting us all online. I somewhat sympathise with the Minister in being tasked with bringing forward this complex legislation, but the options for choosing what content and users we do and do not engage with are already there on most platforms. On Twitter, we are able to mute accounts—I do so regularly—or keywords that we want to avoid. Similarly, we can restrict individuals on Instagram.
In evidence to the Joint Committee, the Secretary of State said that the first priority of the draft Bill was to end all online abuse, not just that from anonymous accounts. Hopes were raised about the idea of giving people the option to limit their interaction with anonymous or non-verified accounts. Clearly, the will is there, and the amendment ensures that there is a way, too. I urge the Minister to accept the amendment, if he is serious about empowering users across the United Kingdom.
Now I move on to amendment 47. As it stands, the Bill does not adequately define “verification” or set minimum standards for how it will be carried out. There is a risk that platforms will treat this as a loophole in order to claim that their current, wholly inadequate processes count as verification. We also see entirely avoidable risks of platforms developing new verification processes that fail to protect users’ privacy and security or which serve merely to extend their market dominance to the detriment of independent providers. That is why it is vital that a statutory definition of identity verification is placed in the Bill.
I have already spoken at length today, and I appreciate that we are going somewhat slowly on the Bill, but it is complex legislation and this is an incredibly important detail that we need to get right if the Bill is to be truly world leading. Without a definition of identity verification, I fear that we risk allowing technology that can easily replicate the behaviour of a human being to run rife, which would essentially invalidate the process of verification entirely.
I have also spoken at length about my concerns relating to AI technologies, the lack of future proofing in the Bill and the concerns that could arise in the future. I am sure that the Minister is aware that that could have devastating impacts on our democracy and our online safety more widely.
New clause 8 would ensure that the user empowerment duty and user verification work as intended by simply requiring Ofcom to set out principles and minimum standards for compliance. We note that the new clause is entirely compatible with the Government’s stated aims for the Bill and would provide a clearer framework for both regulated companies and the regulator. It is vital that, in preparing the guidance, Ofcom ensures that the delicate balance I touched on earlier between freedom of expression, the right to privacy and safety online is kept in mind throughout.
We also felt it important that a collaborative approach should be taken in drawing up the guidance. Regulating the online space is a mammoth task, and while we have concerns about Ofcom’s independence, which I will gladly touch on later, we also know that it will be best for us all if Ofcom is required to draw on the expertise of other organisations in doing so.
There is a Division in the House, so I will suspend the sitting for 15 minutes.
When it comes to police investigations, if something is illegal and merits a report to the police, users should report it, regardless of whether someone is verified or not—whatever the circumstances. I would encourage any internet user to do that. That effectively applies on Twitter already; some people have blue ticks and some people do not, and people should report others to the police if they do something illegal, whether or not they happen to have a blue tick.
Amendment 47 seeks to create a definition of identity verification in clause 189. In addition, it would compel the person’s real name to be displayed. I understand the spirit of the amendment, but there are two reasons why I would not want to accept it and would ask hon. Members not to press it. First, the words “identity verification” are ordinary English words with a clear meaning, and we do not normally define in legislation ordinary English words with a clear meaning. Secondly, the amendment would add the new requirement that, if somebody is verified, their real name has to be displayed, but I do not think that is the effect of the Bill’s drafting as it stands. Somebody may be verified, and the company knows who they are—if the police go to the company, they will have the verified information—but there is no obligation, as the Bill is drafted, for that information to be displayed publicly. The effect of that part of the amendment would be to force users to choose between disclosing their identity to everyone or having no control over who they interact with. That may not have been the intention, but I am not sure that it would necessarily make sense.
New clause 8 would place requirements on Ofcom about how to produce guidance on user identity verification and what that guidance must contain. We already have provisions on that in clause 58, which we will no doubt come to, although probably not later on today—maybe on Thursday. Clause 58 allows Ofcom to include in its regulatory guidance the principles and standards referenced in the new clause, which can then assist service providers in complying with their duties. Of course, if they choose to ignore the guidelines and do not comply with their duties, they will be subject to enforcement action, but we want to ensure that there is flexibility for Ofcom, in writing those guidelines, and for companies, in following those guidelines or taking alternative steps to meet their duty.
This morning, a couple of Members talked about the importance of remaining flexible and being open to future changes in technology and a wide range of user needs. We want to make sure that flexibility is retained. As drafted, new clause 8 potentially undermines that flexibility. We think that the powers set out in clause 58 give Ofcom the ability to set the relevant regulatory guidance.
Clause 14 implements the proposals made by my hon. Friend the Member for Stroud in her ten-minute rule Bill and the proposals made, as the shadow Minister has said, by a number of third-party stakeholders. We should all welcome the fact that these new user empowerment duties have now been included in the Bill in response to such widespread parliamentary lobbying.
I am grateful to the Minister for giving way. I want to recount my own experience on this issue. He mentioned that anybody in receipt of anonymous abuse on social media should report it to the police, especially if it is illegal. On Thursday, I dared to tweet my opinions on the controversial Depp-Heard case in America. As a result of putting my head above the parapet, my Twitter mentions were an absolute sewer of rape threats and death threats, mainly from anonymous accounts. My Twitter profile was mocked up—I had devil horns and a Star of David on my forehead. It was vile. I blocked, deleted and moved on, but I also reported those accounts to Twitter, especially those that sent me rape threats and death threats.
That was on Thursday, and to date no action has been taken and I have not received any response from Twitter about any of the accounts I reported. The Minister said they should be reported to the police. If I reported all those accounts to the police, I would still be there now reporting them. How does he anticipate that this will be resourced so that social media companies can tackle the issue? That was the interaction resulting from just one tweet that I sent on Thursday, and anonymous accounts sent me a barrage of hate and illegal activity.
The shadow Minister raises a very good point. Of course, what she experienced on Twitter was despicable, and I am sure that all members of the Committee would unreservedly condemn the perpetrators who put that content on there. Once the Bill is passed, there will be legal duties on Twitter to remove illegal content. At the moment, they do not exist, and there is no legal obligation for Twitter to remove that content, even though much of it, from the sound of it, would cross one of various legal thresholds. Perhaps some messages qualify as malicious communication, and others might cross other criminal thresholds. That legal duty does not exist at the moment, but when this Bill passes, for the first time there will be that duty to protect not just the shadow Minister but users across the whole country.
Question put, That the amendment be made.
I will speak to clauses 15 and 16 and to new clause 7. The duties outlined in the clause, alongside clause 16, require platforms to have special terms and processes for handling journalistic and democratically important content. In respect of journalistic content, platforms are also required to provide an expedited appeals process for removed posts, and terms specifying how they will define journalistic content. There are, however, widespread concerns about both those duties.
As the Bill stands, we feel that there is too much discretion for platforms. They are required to define “journalistic” content, a role that they are completely unsuited to and, from what I can gather, do not want. In addition, the current drafting leaves the online space open to abuse. Individuals intent on causing harm are likely to try to take advantage of either of those duties, masquerading as journalists or claiming democratic importance for whatever harm they are causing, and that could apply to almost anything. In the evidence sessions, we also heard the concerns, expressed brilliantly by Kyle Taylor from Fair Vote and Ellen Judson from Demos, that the definitions in the Bill as it stands are broad and vague. However, we will come on to those matters later.
Ultimately, treating “journalistic” and “democratically important” content differently is unworkable, leaving platforms to make impossible judgments over, for example, when and for how long an issue becomes a matter of reasonable public debate, or in what settings a person is acting as a journalist. As the Minister knows, the duties outlined in the clause could enable a far-right activist who was standing in an election, or potentially even just supporting candidates in elections, to use all social media platforms. That might allow far-right figures to be re-platformed on to social media sites where they would be free to continue spreading hate.
The Bill indicates that content will be protected if created by a political party ahead of a vote in Parliament, an election or a referendum, or when campaigning on a live political issue—basically, anything. Can the Minister confirm whether the clause means that far-right figures who have already been de-platformed for hate speech must be reinstated if they stand in an election? Does that include far-right or even neo-Nazi political parties? Content and accounts that have been de-platformed from mainstream platforms for breaking terms of service should not be allowed to return to those platforms via this potentially dangerous loophole.
As I have said, however, I know that these matters are complex and, quite rightly, exemptions must be in place to allow for free discussion around matters of the day. What cannot be allowed to perpetuate is hate sparked by bad actors using simple loopholes to avoid any consequences.
On clause 16, the Minister knows about the important work that Hope not Hate is doing in monitoring key far-right figures. I pay tribute to it for its excellent work. Many of those figures self-define as journalists and could seek to exploit this loophole in the Bill to propagate hate online. Some of the most high-profile and dangerous far-right figures in the UK, including Stephen Yaxley-Lennon, also known as Tommy Robinson, now class themselves as journalists. There are also far-right and conspiracy-theory so-called “news companies” such as Rebel Media and Urban Scoop. Both replicate mainstream news publishers, but are used to spread misinformation and discriminatory content. Many of those individuals and organisations have already been de-platformed for consistently breaking the terms of service of major social media platforms, and the exemption could see them demand their return and have it granted.
New clause 7 would require the Secretary of State to publish a report reviewing the effectiveness of clauses 15 and 16. It is a simple new clause to require parliamentary scrutiny of how the Government’s chosen means of protecting content of democratic importance and journalistic content are working.
Hacked Off provided me with a list of people it found who have claimed to be journalists and who would seek to exploit the journalistic content duty, despite being banned from social media because they are racists or bad actors. First is Charles C. Johnson, a far-right activist who describes himself as an “investigative journalist”. Already banned from Twitter for saying he would “take out” a civil rights activist, he is also alleged to be a Holocaust denier.
Secondly, we have Robert Stacy McCain, who has been banned from Twitter for participating in targeted abuse. He was a journalist for The Washington Times, but is alleged to have also been a member of the League of the South, a far-right group known to include racists. Then there is Richard B. Spencer, a far-right journalist and former editor, who was only temporarily banned for using overlapping accounts. He was pictured making the Nazi salute and has repeated Nazi propaganda. When Trump became President, he encouraged people to “party like it’s 1933”. Sadly, the list goes on and on.
Transparency is at the very heart of the Bill. The Minister knows we have concerns about clauses 15 and 16, as do many of his own Back Benchers. We have heard from my hon. Friend the Member for Batley and Spen how extremist groups and individuals and foreign state actors are having a very real impact on the online space. If the Minister is unwilling to move on tightening up those concepts, the very least he could commit to is a review that Parliament will be able to formally consider.
I thank the shadow Minister for her comments and questions. I would like to pick up on a few points on the clauses. First, there was a question about what content of democratic importance and content of journalistic importance mean in practice. As with many concepts in the Bill, we will look to Ofcom to issue codes of practice specifying precisely how we might expect platforms to implement the various provisions in the Bill. That is set out in clause 37(10)(e) and (f), which appear at the top of page 37, for ease. Clauses 15 and 16 on content of democratic and journalistic importance are expressly referenced as areas where codes of practice will have to be published by Ofcom, which will do further work on and consult on that. It will not just publish it, but will go through a proper process.
The shadow Minister expressed some understandable concerns a moment ago about various extremely unpleasant people, such as members of the far right who might somehow seek to use the provisions in clauses 15 and 16 as a shield behind which to hide, to enable them to continue propagating hateful, vile content. I want to make it clear that the protections in the Bill are not absolute—it is not that if someone can demonstrate that what they are saying is of democratic importance, they can say whatever they like. That is not how the clauses are drafted.
I draw attention to subsection (2) of both clauses 15 and 16. At the end of the first block of text, just above paragraph (a), it says “taken into account”: the duty is to ensure that matters concerning the importance of freedom of expression relating to content of democratic importance are taken into account when making decisions. It is not an absolute prohibition on takedown or an absolute protection, but simply something that has to be taken into account.
If someone from the far right, as the shadow Minister described, was spewing out vile hatred, racism or antisemitism, and tried to use those clauses, the fact that they might be standing in an election might well be taken into account. However, in performing that balancing exercise, the social media platforms and Ofcom acting as enforcers—and the court if it ever got judicially reviewed—would weigh those things up and find that the content’s democratic importance was not sufficient to outweigh considerations around vile racism, antisemitism or misogyny.
The Minister mentions that it would be taken into account. How long does he anticipate it would be taken into account for, especially given the nature of an election? A short campaign could be a number of weeks, or something could be posted a day before an election, be deemed democratically important and have very serious and dangerous ramifications.
As I say, if content was racist, antisemitic or flagrantly misogynistic, the balancing exercise is performed and the democratic context may be taken into account. I do not think the scales would tip in favour of leaving the content up. Even during an election period, I think common sense dictates that.
To be clear on the timing point that the hon. Lady asked about, the definition of democratic importance is not set out in hard-edged terms. It does not say, “Well, if you are in a short election period, any candidate’s content counts as of democratic importance.” It is not set out in a manner that is as black and white as that. If, for example, somebody was a candidate but it was just racist abuse, I am not sure how even that would count as democratic importance, even during an election period, because it would just be abuse; it would not be contributing to any democratic debate. Equally, somebody might not be a candidate, or might have been a candidate historically, but might be contributing to a legitimate debate after an election. That might be seen as being of democratic importance, even though they were not actually a candidate. As I said, the concept is not quite as black and white as that. The main point is that it is only to be taken into account; it is not determinative.
I appreciate the Minister’s allowing me to come back on this. During the Committee’s evidence sessions, we heard of examples where bad-faith state actors were interfering in the Scottish referendum, hosting Facebook groups and perpetuating disinformation around the royal family to persuade voters to vote “Yes” to leave the United Kingdom. That disinformation by illegal bad-faith actors could currently come under both the democratic importance and journalistic exemptions, so would be allowed to remain for the duration of that campaign. Given the exemptions in the Bill, it could not be taken down but could have huge, serious ramifications for democracy and the security of the United Kingdom.
I understand the points that the hon. Lady is raising. However, I do not think that it would happen in that way.
No, I don’t. First of all, as I say, it is taken into account; it is not determinative. Secondly, on the point about state-sponsored disinformation, as I think I mentioned yesterday in response to the hon. Member for Liverpool, Walton, there is, as we speak, a new criminal offence of foreign interference being created in the National Security Bill. That will criminalise the kind of foreign interference in elections that she referred to. Because that would then create a new category of illegal content, that would flow through into this Bill. That would not be overridden by the duty to protect content of democratic importance set out here. I think that the combination of the fact that this is a balancing exercise, and not determinative, and the new foreign interference offence being created in the National Security Bill, will address the issue that the hon. Lady is raising—reasonably, because it has happened in this country, as she has said.
I will briefly turn to new clause 7, which calls for a review. I understand why the shadow Minister is proposing a review, but there is already a review mechanism in the Bill; it is to be found in clause 149, and it will, of course, include a review of the way that clauses 15 and 16 operate. They are important clauses; we all accept that journalistic content and content of democratic importance are critical to the functioning of our society. Case law relating to article 10 of the European convention on human rights, for example, recognises content of journalistic importance as being especially critical. These two clauses seek to ensure that social media firms, in making their decisions, and Ofcom, in enforcing against those firms, take account of that. However, it is no more than that: it is “take account”; it is not determinative.
Question put and agreed to.
Clause 15 accordingly ordered to stand part of the Bill.
Clause 16 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)