That the draft Regulations laid before the House on 16 December 2024 be approved.
Relevant document: 13th Report from the Secondary Legislation Scrutiny Committee
My Lords, as the Online Safety Act sets out, the Secretary of State must set thresholds for three categories of service: category 1, category 2A and category 2B. The services that fall into each of these categories must comply with additional duties, with category 1 services having the most duties placed on them. These duties are in addition to the core duties which apply to all user-to-user and search services in scope, including illegal content duties and child safety duties.
All categorised services must comply with transparency reporting duties. They must also have terms of service covering parents’ ability to access information about how their child used a service, in the tragic event that their child dies. Category 1 and 2A services also have additional duties to tackle paid-for fraudulent advertising. They will also have to comply with enhanced risk assessment and record-keeping duties.
The greatest number of additional obligations will fall on category 1 services. These are the services with the most users, and which spread content easily, quickly and widely. To the extent it is proportionate to do so, category 1 services must give adults more choice about who they interact with and the content they see. That includes suicide, self-harm and hate-inciting content. Additionally, category 1 services must protect journalistic and news publisher content and content of democratic importance. The duties will also hold these companies to account over their terms of service, making sure that they keep the promises they make to their users.
The Act requires the Secretary of State to take specific factors into account when deciding the thresholds for each category. The threshold conditions for user-to-user services, categories 1 and 2B, must be based on user numbers, functionalities and any other characteristics or factors relating to the user-to-user part of the service that the Secretary of State deems relevant. For category 2A, they must be based on the number of users of the search engine, plus any other factors or characteristics.
For category 1, the key consideration is the likely impact of the number of users of the user-to-user part of the service and its functionalities on how quickly, easily and widely regulated user-generated content is disseminated by means of the service. For category 2A, the key consideration is the likely impact of the number of users of the search engine on the level of risk of harm to individuals from search content that is illegal or harmful to children. For category 2B, the key consideration is the likely impact of the number of users of the user-to-user part of the service and its functionalities on the level of risk of harm to individuals from illegal content and content that is harmful to children disseminated by means of the services.
These considerations formed the basis of Ofcom’s independent research and advice, published in March last year, which the Secretary of State had to consider when setting threshold conditions. Once in force, these regulations will enable Ofcom to set up a public register of categorised services, which it expects to publish this summer. Ofcom will then consult on the remaining draft codes of practice and guidance, where relevant, for the additional duties.
In laying these regulations before Parliament, the Secretary of State has considered Ofcom’s advice and decided to follow it. I know that this decision will not please everyone, so let me set out why it was made.
Ofcom’s research concluded that, as the number of users of a service increases, so does how widely content spreads. The statutory consideration for category 1 under the Act is
“how easily, quickly and widely regulated user-generated content is disseminated by means of the service”.
Therefore, it was concluded that user numbers should not be ignored. Setting thresholds for category 1 that take into account the size and reach of services is also essential to make sure we avoid inadvertently categorising hundreds of small, low-risk services.
I turn now to the regret amendment that the noble Lord, Lord Clement-Jones, has tabled before the House. It is disappointing that a regret amendment has been tabled. I understand that it is because of the noble Lord’s view that risk should be the main consideration for category 1. He would ideally like to see so-called “small but risky” services, such as small suicide forums, brought into scope.
I also want to acknowledge that the successful amendment from the noble Baroness, Lady Morgan, made it possible to create threshold combinations by reference only to functionalities and any other factors or characteristics. However, in practice this was difficult to do at the time.
In setting the threshold conditions, the Secretary of State must act within the legal framework, which means he still must consider easy, quick and wide dissemination of user-generated content for category 1. He must also act within the powers afforded to him in setting the thresholds, which does not allow for sub-delegation to outside parties, such as coroners or Ofcom.
Unintended consequences were considered, including unintentionally categorising hundreds of small, low-risk services. I want to be very clear that the Government did consider options to bring small but risky services into scope, including those proposed by many thoughtful people on this complicated issue, but ultimately a workable and robust condition for capturing small but risky services was not found.
At end insert “but that this House regrets that the Regulations do not impose duties available under the parent Act on small, high-risk platforms where harmful content, often easily accessible to children, is propagated; calls on the Government to clarify which smaller platforms will no longer be covered by Ofcom’s illegal content code and which measures they will no longer be required to comply with; and calls on the Government to withdraw the Regulations and establish a revised definition of Category 1 services.”
My Lords, I am very pleased to see the Minister back in her place. I thank her for her introduction to this statutory instrument. Her disappointment at my tabling this regret amendment is exceeded only by my own disappointment at the SI. However, I hope that she will provide the antidote to the Government’s alarming tendency to pick unnecessary fights on so many important issues—a number of them overseen by her department.
Those of us who were intimately involved with its passage hoped that the Online Safety Act would bring in a new era of digital regulation, but the Government’s and Ofcom’s handling of small but high-risk platforms threatens to undermine the Act’s fundamental purpose of creating a safer online environment. That is why I am moving this amendment, and I am very grateful to all noble Lords who are present and to those taking part.
The Government’s position is rendered even more baffling by their explicit awareness of the risks. Last September, the Secretary of State personally communicated concerns to Ofcom about the proliferation of harmful content, particularly regarding children’s access. Despite this acknowledged awareness, the regulatory framework remains fundamentally flawed in its approach to platform categorisation.
The parliamentary record clearly shows that cross-party support existed for a risk-based approach to platform categorisation, which became enshrined in law. The amendment to Schedule 11 from the noble Baroness, Lady Morgan—I am very pleased to see her in her place—specifically changed the requirement for category 1 from a size “and” functionality threshold to a size “or” functionality threshold. This modification was intended to ensure that Ofcom could bring smaller, high-risk platforms under appropriate regulatory scrutiny.
Subsequently, in September 2023, on consideration of Commons amendments, the Minister responsible for the Bill, the noble Lord, Lord Parkinson—I am pleased to see him in his place—made it clear what the impact was:
“I am grateful to my noble friend Lady Morgan of Cotes for her continued engagement on the issue of small but high-risk platforms. The Government were happy to accept her proposed changes to the rules for determining the conditions that establish which services will be designated as category 1 or 2B services. In making the regulations, the Secretary of State will now have the discretion to decide whether to set a threshold based on either the number of users or the functionalities offered, or on both factors. Previously, the threshold had to be based on a combination of both”.—[Official Report, 19/9/23; col. 1339.]
I do not think that could be clearer.
This Government’s and Ofcom’s decision to ignore this clear parliamentary intent is particularly troubling. The Southport tragedy serves as a stark reminder of the real-world consequences of inadequate online regulation. When hateful content fuels violence and civil unrest, the artificial distinction between large and small platforms becomes a dangerous regulatory gap. The Government and Ofcom seem to have failed to learn from these events.
At the heart of this issue seems to lie a misunderstanding of how harmful content proliferates online. The impact on vulnerable groups is particularly concerning. Suicide promotion forums, incel communities and platforms spreading racist content continue to operate with minimal oversight due to their size rather than their risk profile. This directly contradicts the Government’s stated commitment to halving violence against women and girls, and protecting children from harmful content online. The current regulatory framework creates a dangerous loophole that allows these harmful platforms to evade proper scrutiny.
The duties avoided by these smaller platforms are not trivial. They will escape requirements to publish transparency reports, enforce their terms of service and provide user empowerment tools. The absence of these requirements creates a significant gap in user protection and accountability.
Perhaps the most damning is the contradiction between the Government’s Draft Statement of Strategic Priorities for Online Safety, published last November, which emphasises effective regulation of small but risky services, and their and Ofcom’s implementation of categorisation thresholds that explicitly exclude these services from the highest level of scrutiny. Ofcom’s advice expressly disregarded—“discounted” is the phrase it used—the flexibility brought into the Act via the Morgan amendment, and advised that regulations should be laid that brought only large platforms into category 1. Its overcautious interpretation of the Act creates a situation where Ofcom recognises the risks but fails to recommend for itself the full range of tools necessary to address them effectively.
This is particularly important in respect of small, high-risk sites, such as suicide and self-harm sites, or sites which propagate racist or misogynistic abuse, where the extent of harm to users is significant. The Minister, I hope, will have seen the recent letter to the Prime Minister from a number of suicide, mental health and anti-hate charities on the issue of categorisation of these sites. This means that platforms such as 4chan, 8chan and Telegram, despite their documented role in spreading harmful content and co-ordinating malicious activities, escape the full force of regulatory oversight simply due to their size. This creates an absurd situation where platforms known to pose significant risks to public safety receive less scrutiny than large platforms with more robust safety measures already in place.
The Government’s insistence that platforms should be “safe by design”, while simultaneously exempting high-risk platforms from category 1 requirements based solely on size metrics, represents a fundamental contradiction and undermines what we were all convinced—and still are convinced—the Act was intended to achieve. Dame Melanie Dawes’s letter, in the aftermath of Southport, surely gives evidence enough of the dangers of some of the high-risk, smaller platforms.
Moreover, the Government’s approach fails to account for the dynamic nature of online risks. Harmful content and activities naturally migrate to platforms with lighter regulatory requirements. By creating this two-tier system, they have, in effect, signposted escape routes for bad actors seeking to evade meaningful oversight. This short-sighted approach could lead to the proliferation of smaller, high-risk platforms designed specifically to exploit these regulatory gaps. As the Minister mentioned, Ofcom has established a supervision task force for small but risky services, but that is no substitute for imposing the full force of category 1 duties on these platforms.
The situation is compounded by the fact that, while omitting these small but risky sites, category 1 seems to be sweeping up sites that are universally accepted as low-risk despite the number of users. Many sites with over 7 million users a month—including Wikipedia, a vital source of open knowledge and information in the UK—might be treated as category 1 services, regardless of actual safety considerations. Again, we raised concerns during the passage of the Bill and received ministerial assurances. Wikipedia is particularly concerned about a potential obligation on it, if classified in category 1, to build a system that allows verified users to modify Wikipedia without any of the customary peer review.
Under Section 15(10), all verified users must be given an option to
“prevent non-verified users from interacting with content which that user generates, uploads or shares on the service”.
Wikipedia says that doing so would leave it open to widespread manipulation by malicious actors, since it depends on constant peer review by thousands of individuals around the world, some of whom would face harassment, imprisonment or physical harm if forced to disclose their identity purely to continue doing what they have done, so successfully, for the past 24 years.
This makes it doubly important for the Government and Ofcom to examine, and make use of, powers to more appropriately tailor the scope and reach of the Act and the categorisations, to ensure that the UK does not put low-risk, low-resource, socially beneficial platforms in untenable positions.
There are key questions that Wikipedia believes the Government should answer. First, is a platform caught by the functionality criteria so long as it has any form of content recommender system anywhere on UK-accessible parts of the service, no matter how minor, infrequently used and ancillary that feature is?
Secondly, the scope of
“functionality for users to forward or share regulated user-generated content on the service with other users of that service”
is unclear, although it appears very broad. The draft regulations provide no guidance. What do the Government mean by this?
Thirdly, will Ofcom be able to reliably determine how many users a platform has? The Act does not define “user”, and the draft regulations do not clarify how the concept is to be understood, notably when it comes to counting non-human entities incorporated in the UK, as the Act seems to say would be necessary.
The Minister said in her letter of 7 February that the Government are open to keeping the categorisation thresholds under review, including the main consideration for category 1, to ensure that the regime is as effective as possible—and she repeated that today. But, at the same time, the Government seem to be denying that there is a legally robust or justifiable way of doing so under Schedule 11. How can both those propositions be true?
Can the Minister set out why the regulations, as drafted, do not follow the will of Parliament—accepted by the previous Government and written into the Act—that thresholds for categorisation can be based on risk or size? Ofcom’s advice to the Secretary of State contained just one paragraph explaining why it had ignored the will of Parliament—or, as the regulator called it, the
“recommendation that allowed for the categorisation of services by reference exclusively to functionalities and characteristics”.
Did the Secretary of State ask to see the legal advice on which this judgment was based? Did DSIT lawyers provide their own advice on whether Ofcom’s position was correct, especially in the light of the Southport riots?
How do the Government intend to assess whether Ofcom’s regulatory approach to small but high-harm sites is proving effective? Have any details been provided on Ofcom’s schedule of research about such sites? Do the Government expect Ofcom to take enforcement action against small but high-harm sites, and have they made an assessment of the likely timescales for enforcement action?
My Lords, I thank the Minister for her engagement on this issue, not just with me but with Members across the House. It has been very much appreciated, including when she was not here because she was dealing with her own health issues.
When I talk about what we do here in the House of Lords, one of the great successes I point to is the scrutiny that we gave to the Online Safety Act. We did it in a cross-party way, eventually managing to persuade the Government, as well as Ofcom, about the changes that were needed. Those changes were then taken back to the House of Commons, and Ministers there conceded them. As a result of that working together, we ended up with a much stronger Bill that will do much to protect vulnerable and young people and those most at risk of harmful content online. So it is a matter of great regret that, the first time we are debating a statutory instrument of substantive interest under this Act, we—all of us, I suspect—have to say that we are deeply disappointed by the drafting that we have seen.
On 19 July 2023, I moved a very small amendment and was grateful to the House for its support. I said at the time that one change of one word—from “and” to “or”—made for a small but powerful amendment. The noble Lord, Lord Clement-Jones, set out brilliantly and comprehensively why that change was so important, so in the time available, I will not repeat what he said. The House clearly voted for change and the Minister’s own party supported that change, for which I was deeply grateful.
The other interesting thing is that Ofcom said to me that it did not object to that change. However, in its note today—I am sure that it sent the note to other Members—Ofcom talked about the harms-based approach that it is following when recommending to the Government how they should legislate under the Act. But that harms-based approach rings hollow when—through Ofcom’s interpretation, which it has given to the Government—it has ridden roughshod over looking at the risk of the small but high-harm platforms.
The draft statutory instrument is based on the number of users, and this House in its amendment made it very clear that, with harmful platforms, it is not just about the number of users they have but absolutely about the content, the functionalities and the risks that those sites will raise.
As the noble Baroness set out, Ofcom is relying on paragraph 1(5) of Schedule 11, looking at
“how easily, quickly and widely regulated user-generated content is disseminated by means of the service”.
But that paragraph says that the Secretary of State “must take into account” those things, not that the Secretary of State is bound solely by those criteria. Our criticism tonight of the statutory instrument is not just that Ofcom has chosen to rely on those words—I would say that Ofcom, in not objecting to my amendment, was being disingenuous if it already knew that it was going to rely on that sub-paragraph. The bigger question for the noble Baroness tonight is that the Secretary of State did not have to accept the advice that Ofcom gave him.
The noble Lord, Lord Clement-Jones, talked, as no doubt others will, about the risk and the harm that we have seen from platforms. We will talk about the fact that for the Southport victims it took only one person to be radicalised by a site that they were looking at to cause untold misery and devastation for families. This House voted recently on the harm caused by deepfake pornographic abuse. Again, it does not take many people to utterly ruin a victim’s life, and what about those platforms that promote suicide and self-harm content? It is not sufficient to say that this Act will impose greater burdens on illegal content. We all know from debates on the Act that there is content which is deliberately not illegal but which is deeply harmful both to victims and to the vulnerable.
As Jeremy Wright MP said in the debate on these regulations in Committee in the House of Commons, the Government are going to want or need these category 1 powers to apply to smaller, high-harm platforms before too long. Indeed, the Government’s own strategic statement published last year specifically says:
“The government would like to see Ofcom keep this approach”—
that is, the approach it has to small, risky services—
“under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online”.
The Government and the Secretary of State already know that there are small but high-harm platforms causing immense risk which will not be caught by these regulations. As we have also heard, the flight therefore to these small, high-harm, risky platforms absolutely will happen as those who want to punt out harmful content seek to find platforms that are not bound by the most stringent regulations.
I will stop there because I know that others wish to speak. I will support the regret amendment tonight should the noble Lord, Lord Clement-Jones, decide to put it to a vote. It has taken far too long to get to this point. I understand the Government’s desire to make progress with these regulations, but the regret amendment states that it
“calls on the Government to withdraw the Regulations and establish a revised definition of Category 1 services”.
I ask the Minister to take that opportunity, because these regulations absolutely do not reflect the will of this House in that amendment. That is a great source of disappointment given the cross-party work that we all did to make sure the Online Safety Act was as comprehensive as it could be.
My Lords, I remind the House of my interests, particularly as chair of 5Rights and as adviser to the Institute for Ethics in AI at Oxford. I wholeheartedly agree with both the previous speakers, and in fact, they have put the case so forcefully that I hope that the Government are listening.
I wanted to use my time to speak about the gap between the Act that we saw pass through this House and the outcome. What worries me the most is how we should understand the purpose of an Act of Parliament and the hierarchy of the instructions it contains. I ask this because, as the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Morgan, have already said, the Government of the day, with the express support of Members of this House, including the Front Bench of the Labour Party, agreed that categorisation would be a question of risk or size, not simply size. That was the decision of the House, it was supported in the other place, and it is in the text of the Act. So, it would be useful to understand, in the view of His Majesty’s Government, whether the text of an Act and, separately, a statement made by a Minister from the Dispatch Box, have any authority. If they do, I cannot understand how Ofcom is allowed to overturn that, or how the Secretary of State, without taking action to amend the Act, has been able to allow it to do so.
It is essential to get a clear answer from the Minister about the status of the text of the Act, because this is a pattern of behaviour where the regulator and government appear to be cherry-picking which bits of the Online Safety Act are convenient and ignoring those they consider too difficult, too disruptive, or—I really hope not—too onerous for tech companies. Ofcom has similarly determined not to observe the provisions in the OSA about functionalities contained throughout the Act; for example, at Sections 9(5), 10(4) and 11(6)—I could go on; on extended use, at Section 11(6)(f); and on the requirement to consider the needs of children in different age groups which, like functionalities, run through the Act like a golden thread.
Ofcom’s own illegal harms register risk management guidance states that
“certain ‘functionalities’ stand out as posing particular risks because of the prominent role they appear to play in the spread of illegal content and the commission and facilitation of … offences”.
Ofcom then says its regulatory framework is intended to ensure service providers put in place safeguards to manage the risks posed by functionalities. It lists end-to-end encryption, pseudonymity and anonymity, live-streaming, content recommender systems, and, quite rightly, generative AI, all as functionality that it considers to be high risk. Specifically in relation to grooming, functionalities Ofcom considers risky include network expansion prompts, direct messaging, connection lists and automated information displays.
Despite acknowledgement that functionalities create heightened risk, a clear statement that addressing risk forms part of its regulatory duties, and the clearly expressed intent of Parliament and the wording of the Act, Ofcom has failed to comprehensively address functionalities both in the published illegal harms code and the draft children’s code, and it has chosen to overrule Parliament by ignoring the requirement in Schedule 11 to consider functionalities in determining which services should be designated as category 1 services.
Meanwhile, paragraph 4(a)(vii) of Schedule 4 is crystal clear in stating the objective of the Act that user-to-user services
“be designed and operated in such a way that … the different needs of children at different ages are taken into account”.
Ofcom has chosen to ignore that. Volume 5 of its draft children’s code says
“our proposals focus at this stage on setting the expectation of protections for all children under the age of 18”.
Any child, any parent and anyone who has spent time with children knows that five and 15 are not the same. The assertion from Ofcom in its narrative about the children’s code is blinding in its stupidity. If common sense cannot prevail, perhaps 100 years or more of child development study that sets out the ages and stages by which children can be expected to have the emotional and intellectual capacity to understand something could inform the regulator—and similarly, the age and stage by which we cannot expect a child to understand or have the intellectual capacity to deal with something.
The whole basis of child protection is that we should support the children on their journey from dependence to autonomy because we know that they do not have the capacity to do it for themselves in all contexts, because of the vulnerabilities associated with ages and development stages. Ofcom knows that the Act says that it should reflect this but somehow feels empowered to ignore or overrule the will of Parliament and, just as with categorisation, the Government appear to condone it.
My Lords, this is a regret amendment, and the conduct of Ofcom and the Government on this matter is surely deeply regrettable, for all the reasons that have been given by the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Morgan and Lady Kidron. The treatment of small but high-risk services in these regulations simply frustrates the amendment of the noble Baroness, Lady Morgan, to Schedule 11, which was approved by this House and accepted by the Government in the Commons. It contradicts what the Minister, Mr Scully, said in the Commons when he accepted the amendment of the noble Baroness, Lady Morgan, approved by this House, and it fails to address the mischief in this context, which the noble Lord, Lord Clement-Jones, and others have clearly identified. I, too, would like to see or even to understand what possible legal advice has led to this lamentable position. The impact of the service does not—it cannot—depend only on the number of users. That was the whole point of the amendment of the noble Baroness, Lady Morgan.
The Minister suggested two arguments, as I understood her, but it is not good enough for her to say—if I may respectfully say so—that small services are still prohibited from acting in an illegal manner. The Act is, of course, designed to provide further regulation—especially so because the criminal law is, regrettably, a blunt and slow instrument. Nor am I persuaded by the Minister’s suggestion that it is simply too difficult to draft regulations to address small but high-risk services. I simply do not accept that the expertise of the department and parliamentary counsel cannot come up with an appropriate regulation to address this mischief.
My Lords, I wish to speak to a point made by the noble Lord, Lord Clement-Jones, in relation to Wikipedia in particular. Noble Lords who took part in Committee on the Bill will recall that on several occasions I asked the Minister at the time—now my noble friend sitting on the opposition Front Bench—whether Wikipedia would be in scope of the regulation and, if so, whether it would have consequences which would make it impossible for Wikipedia, a charity, to continue with its existing model. My noble friend was unable at the time to say that; he said it would be a matter for the regulations and, indeed, for the regulator. Now here we are, nearly two years later, and we have some regulations, and I have the same question to put to the Minister on the Front Bench today. It appears to me—I must say that I have no interest to declare other than that I am an inveterate user of Wikipedia—and as the noble Lord, Lord Clement-Jones, said, that we are still left in a state of confusion about this. Regulation 3 says that for large sites—those with more than 34 million users—two criteria have to be met. One is that it has that number of users or more, and the other is that it
“uses a content recommender system”.
In paragraph (2), a content recommender system is broadly defined; for example, it says that it is not simply algorithms by means of machine learning but algorithms by machine learning or “other techniques”. The verb is not simply “determines” but
“determines, or otherwise affects, the way in which regulated user-generated content of a user, whether alone or with other content, may be encountered by other users of the service”.
Wikipedia indeed uses techniques for sending people articles and information that relate to what they have shown an interest in in the past. Would it be caught or not? What are the consequences of Wikipedia being caught? There are many, but I would like to test one out on noble Lords. I do not claim that this is definitive law, because, I suspect, much of the Act will need to be determined in the courts before we know what the definitive interpretation is.
Let us take as an example the case of some loathsome foreign dictator or other such character whose article on Wikipedia is less flattering than he might wish it to appear and he has a complaint about this. Wikipedia will consider it and then probably throw it in the waste-paper basket. If he seeks by some means to change the content of the article, of course, the editors of Wikipedia, who are a distributed network largely of volunteers, will intervene to change it back and try to ensure that it still reflects what is known to be reality. But under Section 64 of the Online Safety Act, one may apply to become a verified user. Obviously, I do not expect the loathsome person himself to apply to become a verified user; there will be some stooge, some student, some trainee or some character somewhere willing to register on their behalf who could then change the article, but because they are a verified user, under Section 15(10)(a) of the Act, they would acquire immunity to peer review. What they wrote on Wikipedia could not then be changed by the editors, because they were a verified user and had that protection.
I offer that as a genuine possibility. Noble Lords know that I am not a lawyer. This could be tested in the courts and found otherwise but, on the face of it, it appears that this sort of consequence would accrue. So I come back to the same question that I have been asking to no real effect now for two years. Perhaps when she comes to reply, the Minister can give me a definitive answer. Is Wikipedia in scope of this regulation? Is it covered by Regulation 3 or not? We would like to know.
My Lords, often in this House one is tempted to wander down memory lane and is filled with wonderful memories of good times and shared experiences, but none so present as the one that was referred to by the noble Baroness when she spoke earlier about the Online Safety Bill. I felt resonances up and down my back as I remembered the moment at which I decided that there was no point in reading my speech at Second Reading, which was full of sound and fury, full of anger, full of things that I was determined to see in the Bill, but realised that we all agreed about it and that the best thing was to say simply that we would work together to get the best Bill that we could out of the resources available across the House—and they are significant. As we have heard today, that worked—or it did until today.
I am very sad that I feel I will have to support the noble Lord, Lord Clement-Jones, in only my second appearance against my party. I felt very strongly that we had an agreement in the last Parliament, signed, sealed and signified by both Houses and agreed to by the noble Lord, Lord Parkinson, who is in his place. It bound any successor Government to operate within the terms of that Act. I find it egregious that the Government are seeking a way of not doing that, for reasons that I can only guess at but seem to be more about winning friends in strange places across the Atlantic than seeing the best for our people, particularly our children, in the United Kingdom.
There is an irony in that there would have been a way of avoiding this. I do not want to embarrass the noble Lord, Lord Parkinson, again, but we adopted towards the end of the Bill what I called the Parkinson rule, and rightly so because I felt that he was brave in proposing it. It was not the convention of the time, nor a structure or system that fitted well within our current procedures in this House. The intention was to recognise the complexity and difficulty in the Online Safety Bill, now Act, and to invite the Government to share with the Select Committees of both Houses—the SIT Committee in the Commons and the Communications and Digital Committee in the Lords—draft material relating to the Online Safety Bill, because we had a hunch that there would be issues that would need to be hammered out more clearly and more effectively than the arrangements for dealing with secondary legislation in this House currently allow. That might change, but until it does there is no way in which we can debate and discuss except through a regret amendment—or, as one might have been tempted to do on this occasion, through a fatal Motion—to an instrument which has clearly come out wrong, does not reflect the wishes of the House and may do damage which ultimately will end up in people’s lives. The responsibility will lie with the Government if they do not listen to what we are saying today.
The Parkinson rule was accepted by the noble Lord, Lord Parkinson. I quote from Hansard, although not entirely because there are some reservations which I want to skip over, though I am sure that they can be checked out. He said that the Government would
“ensure that the relevant committees have every chance to play a part in that consultation by informing them that the process is open”—
which is good—and that they would
“where possible, share draft statutory instruments directly with the relevant committees ahead of the formal laying process … on a case-by-case basis, considering what is appropriate and reasonably practical”.—[Official Report, 19/7/23; cols. 2351-52.]
That system has not been implemented by the Government.
I wrote to my noble friend the Minister while she was ill, and she has very kindly responded to me. She says she feels that the spirit of the agreement has been carried out in how the Government told both committees that there were statutory instruments on the way and that this was sufficient to meet the implications of the Parkinson rule. Given that three days’ notice was given before they were laid, that does not meet the requirement.
My Lords, I rise briefly to illustrate why we are as concerned as we are. One of the platforms that would not come under the categorisation that we would wish it to is Telegram. Last month, on 16 January, a 19-year-old man, Cameron Finnigan, a member of a Satanist extremist group called 764, was sentenced to six years in prison on charges including encouraging suicide and possessing indecent images of a child.
764 originates in the United States; Telegram has been used to disseminate it across the Atlantic. The FBI describes 764 as
“a network of violent extremists who seek to normalize the production, sharing, and possession of child pornography and gore material to desensitize and corrupt youth toward future acts of violence. Members of 764 gain notoriety by systematically targeting, grooming, and extorting victims through online social media platforms”,
particularly the small ones. It continues:
“Members demand that victims engage in and share media of self-mutilation, sexual acts, harm to animals, acts of random violence, suicide, and murder, all for the purpose of accelerating chaos and disrupting society and the world order”.
On that basis, you can understand completely why Ofcom thinks this is fine.
This is unacceptable and the Government really should look at this again. Above all, it is incumbent on Ofcom to recognise that to, apparently wilfully, diverge from the clear stated will of both Houses of Parliament, and what is written in the Act, is not simply inappropriate but, as other noble Lords have suggested, may well be illegal, and that should be looked into.
My Lords, I will be incredibly brief, having not been part of the collective of Peers who worked on the parent Act to this statutory instrument. The key question that has been highlighted is, what is the Government’s interpretation now of the powers in the Act? The Government’s and the Official Opposition’s interpretation at the time it was passed was that it had the power to include in category 1 providers on the basis of risk, not size. I am incredibly concerned because, in the debate in the Commons, the Minister said that
“as things stand, the Secretary of State does not have the power to include them”.—[Official Report, Commons, Third Delegated Legislation Committee, 4/2/25; col. 16.]
That was a reference to small but risky providers, and actually the Minister seemed slightly outraged at the implication that they were not acting where they should be. So can the Minister clarify for this debate whether it is the Government’s position that they would like to include them and that that is the intention that they thought the Act had given them, but they cannot under the law as it is written; or that they do have the powers but have chosen not to, which is our understanding of their decision-making?
The reason that is so important is that the Minister has committed to reviewing these thresholds in future, but such reviews will have very little power if the Act itself is faulty and does not give them the ability to designate on the basis of risk, or the review is pointless because they already have the powers and the evidence of the risk of these providers but are choosing not to act.
I have another point on legal advice. In the debate in the Commons, the Minister committed to writing, including a letter from government lawyers, setting out in great detail what she was saying
“in relation to the powers of the Secretary of State in setting the categories”.—[Official Report, Commons, Third Delegated Legislation Committee, 4/2/25; col. 19.]
In other words, the letter would clarify for people what the interpretation, which has so shifted from the original debate, is from the Government. I may have missed that letter—maybe it was placed in the House of Commons Library—but perhaps the Minister could say whether the letter was written and share its content with this Chamber also, because I think that gets to the heart of what we are regretting today from the Government.
I just want to say very briefly that, having served alongside my noble friend Lord Stevenson on the Front Bench during the passage of this Act, I thoroughly endorse what he has said. I am very proud of the work that we did together—I echo what the noble Baroness, Lady Morgan, said—to try to create a piece of legislation that could work in a very complex area, and I think we did a good job.
My fear is that, now that Ofcom, the regulator, has published its road map, it is like a juggernaut: it has just got on with delivering what it was always going to deliver and has ignored what we in this House amended the Bill to do. In that respect, it is treating us with contempt, and it is important that we express our regret in one way or another this evening about the way that we have been treated. I came in wanting to be convinced by my noble friend the Minister; I am afraid that so far she has not done it.
My Lords, I am very grateful to the Minister for introducing the regulations and to the noble Lord, Lord Clement-Jones, for tabling his amendment and for moving it in the way that he did, because it has given us the opportunity to have this very important debate on this landmark Act of Parliament.
My noble friend Lady Morgan of Cotes was right to begin her remarks by reminding your Lordships that the passage of that Act was a shining example of this House doing its job very well indeed, giving careful, considered and non-partisan scrutiny to legislation before us. The noble Lord, Lord Stevenson of Balmacara, rightly recalls the cross-party spirit that he did so much to foster from Second Reading, and it was a pleasure working with noble Lords from across the House in that spirit to make sure that the Act found its way to the statute book in the improved way that it did.
We are here tonight because of a number of amendments made to the Bill as it went through this House. The Delegated Powers and Regulatory Reform Committee of your Lordships’ House recommended in its report on the Bill that the first regulations setting the category 1 thresholds should be subject to the affirmative procedure. I was glad to accept that recommendation when I was the Minister taking the Bill through, and I am glad to be here for the debate on it, albeit speaking from a different Dispatch Box.
The noble Lord, Lord Stevenson, does indeed embarrass me by citing the Parkinson rule. I said at the time that Cyril Northcote Parkinson has the better reputation for Parkinson’s laws. But that undertaking was an important one that I was happy to make to ensure that Parliament had the ongoing scrutiny. We all recognised as we passed this law that this was a fast-moving area of technology, that legislatures across the world were struggling to keep up, and that it would be important for the post-legislative scrutiny to take place in the same agile and consensual way in which we sought to pass the Act.
We are also here because of an amendment made to the Bill on Report by my noble friend Lady Morgan. Both she and the noble Lord, Lord Clement-Jones, were too gracious to recall that it took me a little longer to get there. That amendment was made despite my arguments to the contrary. My noble friend pressed her amendment, defeated me and the previous Government and changed the Bill. When the Bill was in another place, the Government accepted her point.
I was helped along the way in that legislative journey by clear exhortations from noble Lords on the Labour Front Bench who were then in opposition. In our debate on my noble friend Lady Morgan’s amendment on 19 July 2023, the noble Lord, Lord Knight of Weymouth, who I am glad to see in his place, albeit now on the Back Benches, said that my noble friend’s amendment was a “no-brainer”. He pointed out that the Bill, as it stood,
“requires Ofcom to … be mindful of size”,
but argued that:
“We need to be more nuanced”.—[Official Report, 19/7/23; col. 2344.]
and that it was right to give Ofcom leeway or flexibility in the categorisation and to bring providers into the safety regime.
Those points were echoed in another place by Alex Davies-Jones, the Member of Parliament for Pontypridd, who is now a Minister at the Ministry of Justice with responsibility for tackling violence against women and girls, rape and serious sexual offences, child sexual abuse and many other very serious matters. In opposition, following that debate, she made the point that:
“Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet”.—[Official Report, Commons, 12/7/22; col. 168.]
I wonder what Ms Davies-Jones says now that she is at the Ministry of Justice.
I am very grateful to Ofcom. I had a helpful phone call last week with Robert Brown and Mark Bunting of Ofcom to understand its approach. My criticisms are directed at the Government, not at Ofcom. Without wanting to rehearse my old job, I will help the Minister by pointing out that many of the concerns raised are covered by the Bill.
The Bill is very clear that the duties to act on illegal content and to protect children apply to services of every size. Some of the points made, including the very moving and harrowing examples given by the noble Lord, Lord Russell of Liverpool, may well be covered by the illegal duties and the protection of children duties, and the Minister was right to point that out. But there is a shift in approach from the commitments I made at the Dispatch Box when I was a Minister and the decision that Parliament took in backing my noble friend Lady Morgan’s amendment. I am interested in why the Government have changed their mind, particularly having been so strongly in favour of making those changes to the Bill when in opposition.
In her opening remarks, the Minister used the ubiquitous phrase “unintended consequences”. She mentioned that the Government did not want unintentionally to categorise hundreds of small and non-risky services, but would that necessarily be the case? Surely a granular case-by-case categorisation would not bring in so many hundreds. It seems that she and the Government are leaning rather heavily on other parts of the Act that talk about the quick, easy and wide dissemination of material online. I wonder whether the “and wide” part of that is doing a lot of heavy lifting here. Is that what is making the Government make the connection to the size? Is the width of dissemination driving the policy decision here? And it is a policy decision. The Government are not bound to follow the advice that Ofcom has provided; they can disagree with it.
In the debate in another place on these regulations, my right honourable friend Sir Jeremy Wright, a former law officer, said it would not be right to ask the Government to provide the legal advice they have had on these matters, but like the noble Lord, Lord Pannick, I would be very interested in seeing that. I wonder whether the Minister is able to say a bit more about the legal basis on which they have decided that they are unable to disagree, or are not inclined to disagree, with Ofcom on this. I hope she will be able to give a very clear answer to the very clear question posed by my noble friend Lady Penn, who put very well the question about legal advice and the Government’s room for manoeuvre here.
My Lords, I acknowledge all the hard work, and the cross-party consensus, that went into creating the Online Safety Act. For all the questions that noble Lords are raising today, it is still seen as being a global leader on online safety, so it is certainly nothing we should be ashamed of. I still believe it will be transformative when it is rolled out in the next few weeks and months, when it really will begin to have an impact. I pay tribute to those who did all that work at the time.
There has been a suggestion that we have just kowtowed in some way. I cannot tell noble Lords for how many hours, days and weeks my office and the Secretary of State’s office have pored over the detail of this to make sure that we feel we are doing the best we can to implement the Act in the way that was intended. Noble Lords who have read the draft statement of strategic priorities, which we sent to Ofcom, will see that we are reiterating a lot of the issues that colleagues around the Chamber are raising today. They are our priorities as well. It came down to the practicalities of some of the issues we were being asked to enforce. I hope that in my responses now I can address some of those questions.
I should be specific about the user number thresholds that have been chosen. In response to the noble Baroness, Lady Morgan, the noble Lord, Lord Parkinson, and others, just to put it on the record, I note that Ofcom recommended threshold combinations for category 1 of either: more than 7 million UK users, together with both the functionality of forwarding or resharing user-generated content and the characteristic of a content recommender system; or more than 34 million UK users, together with a content recommender system.
Ofcom specifically set out in its research and advice, published last March, that it considered but discounted a recommendation that allowed for the categorisation of services for category 1 by reference exclusively to functionalities and characteristics. That was because the research indicated that user reach has an important role to play in content dissemination. Ofcom made a regulatory judgment on where to set the user number thresholds, based on an assessment of what comprised targeted and proportionate regulatory action. Ofcom also undertook sensitivity testing on the thresholds.
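For clarity, the two recommended combinations can be expressed as a simple decision rule. The sketch below is purely illustrative—the function and parameter names are assumptions, not terms drawn from the regulations—and uses the figures from Ofcom’s advice described above:

    # Illustrative sketch of the recommended category 1 threshold combinations.
    # Names are assumptions for illustration; figures are from Ofcom's advice
    # as described above, not a statement of the statutory drafting.

    def meets_category_1_thresholds(uk_users: int,
                                    has_forward_or_reshare: bool,
                                    has_content_recommender: bool) -> bool:
        # Combination 1: more than 7 million UK users, plus both the
        # forwarding/resharing functionality and a content recommender system.
        combination_1 = (uk_users > 7_000_000
                         and has_forward_or_reshare
                         and has_content_recommender)
        # Combination 2: more than 34 million UK users, plus a content
        # recommender system.
        combination_2 = uk_users > 34_000_000 and has_content_recommender
        return combination_1 or combination_2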
In this debate it has been clear that some, such as the noble Lord, Lord Clement-Jones, think there will be services—particularly, as we have been debating, small but risky services—that evade the core duties of the Act. I want to assure noble Lords that the legislation does not allow for that. All regulated user-to-user services and search engines, no matter what their size, will be subject to the existing illegal content duties and, where relevant, the child safety duties; the categories do not change that.
The codes on illegal content duties, which were laid in Parliament, have passed the objection period and may now be issued by Ofcom. The duties should be in effect next month. They will force services to put in place systems and processes to tackle illegal content and require services to name a senior person accountable for compliance. If a service is likely to be accessed by children, the child safety duties will require services to conduct a child safety risk assessment and provide safety measures for child users. We expect that these duties will come into effect this summer, on the basis that the codes for the duties will be passed by then. Together, the illegal content and child safety duties will mark the biggest material change in online safety for UK citizens since the internet era began. By Ofcom’s own assessment, the Act may cover up to 100,000 services of various sizes, showing that the legislation reaches far and wide to ensure important protections for users, particularly children, online.
The noble Lord, Lord Clement-Jones, my noble friend Lord Stevenson, and the noble Baronesses, Lady Morgan and Lady Kidron, asked why category 1 thresholds are not risk-based. I will now turn to that.
The reasoning behind the Secretary of State’s decision to set the categorisation thresholds as per Ofcom’s recommendations, rather than deviating from its research, was as follows. When the OSA was introduced, category 1 thresholds were due to be assessed based on the level of risk of harm to adults from priority content disseminated by means of the service. As noble Lords will know, this was removed during the passage of the Bill by the then Government and replaced with consideration of the likely impact of the number of users of the service, its functionalities, and how easily, quickly and widely user-generated content is disseminated. This was a significant change and, while the risk of harm may be seen to be a more relevant factor, this is the position under the Act as it now stands.
As I have already acknowledged, the successful amendment from the noble Baroness, Lady Morgan—which was raised by the noble Lords, Lord Clement-Jones and Lord Parkinson—did make it possible to require threshold conditions on functionality and characteristics to be met without user numbers. However, as I have set out, the considerations within the Act, Ofcom’s research and advice, and the risk of unintended consequences have meant that it is not currently workable to ignore user numbers when setting a threshold for category 1.
The Minister is setting out a clear case, with which I, and I think many others in this House, disagree. To cut to the chase, the Minister has just said that the Government understand the amendment passed in this House on 19 July 2023 but have decided, on the advice of Ofcom, that that amendment does not work and therefore should be ignored. We should be clear that that is what has happened. The Government should own that decision and the House, when it votes on the amendment tonight, will decide whether it thinks that is an acceptable way to behave or an unacceptable way to behave.
I can only reiterate what I have already said: we took Ofcom’s advice after a great deal of scrutiny of why it had come to that piece of advice. Its advice was that the key factor to be taken into account was how easily, quickly and widely content is disseminated. That is the basis on which we made that decision.
Sorry to interrupt but, to return to the point made by the noble Baroness, Lady Morgan, is it the Government’s position that, although the law says it is permissible, and indeed was expected, that in making their decision about category 1 the Government would require Ofcom to ensure that both reach and risk were taken account of, the Government have decided that only reach will be taken account of?
Ofcom’s advice was that how easily, quickly and widely content is disseminated are the key factors that it needed to make the judgment. I cannot say anything more than that.
I am sorry to interrupt, but maybe this would be a good moment to answer my question about the hierarchy of text in an Act versus the regulator’s advice. It was my understanding, when the House agreed to that amendment, that it was an instruction to the regulator rather than something “nice to have” if it decided later that it did not like it.
The SI before us today, based on Ofcom’s advice, is the best way that we can find, in terms of practicality, of enforcing what was written in the Act.
Does the Minister accept that the Act does not oblige the Secretary of State to follow Ofcom’s advice, and that the Government have a separate decision-making moment—a process—to consider that advice and reach their own decision? So it is not on Ofcom; it is on the Government. It is the Government who think it is the correct way forward to ignore what was previously in the Act.
The noble Baroness is right that that is a factor that we considered. The Secretary of State received Ofcom’s advice, duly reflected on it, looked at all the evidence and decided that we would abide by Ofcom’s advice on the issue. It was the Secretary of State’s decision, and that is why we have this SI in front of us today.
The Minister heard the example that I gave and is aware of the harm that was done as a result of using the small channel Telegram. For harm to be done, the material does not need to be widely disseminated; it is disseminated through a very small group of hardcore believers in some of these strange cults, and that is how the harm is done. The fact that it is not widely disseminated is completely irrelevant. One person taking that onboard and then doing something unmentionable should be against the Act as it was written and as we understood it would be legislated for, with the approval of both Houses of Parliament. The breadth and extent of dissemination and the number of users are irrelevant.
My Lords, the whole “small but risky” issue that the noble Lord is raising is very close to our hearts. We have engaged with Ofcom and pressed it to take more action on the sort of small but risky services that he is talking about. Our view is that they do not necessarily have to be dealt with under the categorisation process; there are other ways. Ofcom has assured us, in its response to us, that there are other ways in which it is addressing them.
It is not as though they have been discarded. It is an absolute priority for this Government that we address the “small but risky” issue, and we are doing so. We are working with Ofcom to make sure that that is followed through. As I said when I opened this debate, we have worked with Ofcom and it is setting up a task force to look at this, while separately we are looking at these issues and at what more we can do. On the position at the moment regarding the rollout of the SI and the categorisation, the reality is that Ofcom’s research and advice, and the risk of unintended consequences, mean that it is not currently workable to ignore user numbers when setting the category 1 threshold.
The Minister rightly said “currently” but, even if that is the case, why are the Government closing the door to having this option available to them and to Ofcom later? She is right that Ofcom is doing a lot of work in ways other than categorisation, but surely she and her colleagues in government can see that this is a useful tool to have in the armoury in the fight against the sorts of harms noble Lords have been raising. Why are the regulations written so tightly as to close that off, setting aside the concession that was so hard won by my noble friend Lady Morgan and others when the Bill went through Parliament?
My Lords, I can only say what I have already said on this. We are looking at “small but risky”. Ofcom is working hard on this, and we are working hard on this. We can review whether the categorisation process is working; as I have already set out, that option is available to us further down the line. But, at the moment, as with other parts of the Online Safety Act, we felt we needed to get on with it and put these measures into place. The categorisation provisions will in any case take another year or 18 months to come into effect, so this is not the most imminent part of the implementation of the Act. I hear what noble Lords say. None of these issues is off the table, but we wanted to get the Act rolled out as quickly as we could.
If I could move on, in response to the questions raised by the noble Baroness, Lady Kidron, and the noble Lords, Lord Pannick and Lord Parkinson, I am not able to share the legal advice but, as I have said, the Secretary of State must act within the legal framework. The current thresholds are legally valid and have been considered by the Joint Committee on Statutory Instruments. On small but risky services, even though in principle there is a provision that allows a user number threshold not to be met, the Act does not, for example, allow for sub-delegation to other parties such as coroners, which was another concern raised by the amendment from the noble Baroness, Lady Morgan.
The decision on the categorisation thresholds has led some, as I have just been saying, to assume that certain small high-risk services are being overlooked by the legislation. However, this is not the case, as they will be subject to the stringent illegal harm and child safety duties. I know that Members are aware that categorising small but risky services would also not prevent or deter users who were determined to access harmful content on dedicated forums. Moreover, the noble Lord, Lord Clement-Jones, raised the question of small but risky services evading the core duties, such as the terms of service and user empowerment duties. Services that exist solely to host abusive or pro-suicide content, for example, will not have terms of service banning such content, so enforcing those terms would be ineffective in reducing harm.
In addition, the user empowerment tools will enable adult users of category 1 services to avoid certain types of content, such as harmful suicide content. We anticipate that these duties will be most beneficial when services have commercial incentives to prohibit harmful content and where users wish to avoid content they may otherwise see, but not where users are actively seeking out harmful content.
I hope that begins to explain the Secretary of State’s decision. I have said, and have to say again, that it was a difficult one. While we acknowledge the possibility of deviating from Ofcom’s advice and using the option to set threshold conditions without a user number, doing so would not have meaningfully reduced harm on small but risky services, but it would have risked regulating hundreds of small, low-risk services.
Regarding Ofcom’s “small but risky” supervision task force, which the noble Lord, Lord Clement-Jones, asked about, I am confident that Ofcom can use that task force effectively to address these issues. Ofcom already has plans to ensure compliance with the first duties that go live under the Act. These include taking targeted enforcement action against small, risky services where there is evidence of a significant ongoing risk of harm to users, especially children, and an apparent lack of safety measures. In serious cases, Ofcom can seek a court order imposing business disruption measures if there is evidence of continued non-compliance. This could mean requiring a third party to withdraw its services from the provider, or requiring an internet service provider to limit access.
I hope that, as the child safety and illegal content duties come into force this year and the work of the task force begins, those in this House who are concerned will be able to see how these services will not evade their responsibilities under the Act.
Regarding Wikipedia, in response to the questions raised by the noble Lords, Lord Clement-Jones and Lord Moylan, the Government are not in a position to confirm which services will be designated as category 1. Indeed, making those designations is Ofcom’s statutory obligation once the regulations have passed and are in force. It is worth noting that many of the duties on categorised services are subject to the principle of proportionality. This requires Ofcom to consider which measures are technically feasible for providers of a certain size or capacity. Where a code of practice is relevant to a duty, Ofcom must have regard to the principle of proportionality. What is proportionate for one kind of service might not be proportionate for another.
The noble Lords, Lord Clement-Jones and Lord Moylan, also queried how Ofcom could make assessments against the definitions of certain functionalities, characteristics and user number thresholds in the statutory instrument. Once the regulations have been approved by Parliament, Ofcom will issue requests for information and will start assessing services against the threshold conditions.
I also understand that there has been concern that small low-risk platforms, such as local community forums, are being overburdened by the Act and its duties. I must reiterate that these platforms, often run by a small number of users, will not be captured by the categorisation thresholds debated today. At the same time, I acknowledge that the new illegal content and child safety duties will require some additional work from these types of services.
I assure those here today that the principles of proportionality and risk are embedded in the duties on services and on Ofcom in relation to the codes of practice. This means that small, low-risk services should not be overburdened by the duties in the Online Safety Act. In an effort to ease the process for small services, Ofcom is providing support to online services to help them understand their responsibilities under the UK’s new online safety laws. That support can be found on Ofcom’s website.
My noble friend Lord Stevenson raised the question of engagement with relevant committees. I agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act and welcome the expertise that Members of both Houses bring. The Government agree that it is vital that regulators are accountable, including through existing annual reports and reporting requirements. We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as any other parliamentary committees that may have an interest in the Act. I am more than happy to meet my noble friend Lord Stevenson to discuss how that could be progressed further.
In response to the noble Baroness, Lady Penn, I want to put on record that a letter was shared with the Delegated Powers and Regulatory Reform Committee in response to concerns raised during the Commons debate.
I must again stress that the Secretary of State will keep these thresholds and the wider regulatory framework under review going forward, and the Government will take whatever action is necessary to tackle risky services of any size.
I would finally like to thank all those who have contributed today: the noble Lords, Lord Clement-Jones, Lord Pannick, Lord Moylan, Lord Stevenson, Lord Russell and Lord Knight, and the noble Baronesses, Lady Morgan, Lady Kidron, Lady Penn—and of course the noble Lord, Lord Parkinson, who continues to put valuable work, expertise and energy into making the UK a safer place, both online and in the material world. I specifically thank user safety groups that have engaged with the Government on this matter and, of course, the noble Lord, Lord Clement-Jones, for his dedication to his work on these issues.
I recognise that there are some who would like to see changes to this instrument and some who believe that the decisions of the Government do not align with the intentions of the Act. I hope they understand that every decision made by this Government has been made with the intention of bringing the Act into effect in a timely way. For too long, children and adults in this country have had to grapple with an unsafe online environment, and the instrument that we have debated today represents real progress.
I do not shy away from the challenge we face in navigating the ever-changing online world. I recognise that the Act is imperfect. However, it is not the destination but a significant step in the right direction. There will always be more that we can do. Years of delay and lack of progress have come at an unfathomable cost for vulnerable children and adults, with lives cut short and families’ worlds turned upside down. It is time to deliver change. I hope noble Lords will consider the time pressure and the fact that we have to get on with the rollout of the Act. I urge noble Lords to approve this vital legislation today.
I raised a number of questions and I would be grateful, if the Minister is not going to answer them now, if she could write to me about the Joint Committee, the hierarchy of the Act, and statements from the Dispatch Box versus this decision and other decisions.
My Lords, if I have not covered any issues, I will of course write to noble Lords to clarify any matters that are outstanding.
My Lords, I shall be extremely brief. I thank all noble Lords who have contributed this evening. The noble Lord, Lord Stevenson, used the expression “emotions raised”. That is exactly what this regret amendment has done. There is real anger about the way in which this statutory instrument has been put together. I think many noble Lords who were involved in the passage of the Act were extremely proud of our work, as has been expressed.
The Minister has made a valiant attempt, but I am afraid that she has been given a hospital pass. It is quite clear that the Secretary of State did not have to accept the advice from Ofcom. That advice about functionalities, as the noble Baroness, Lady Kidron, made absolutely clear, the evidence that the noble Lord, Lord Russell of Liverpool, put forward and the evidence from the anti-Semitism foundation all indicate that there is a considerable belief around this House that we are not dealing with the high-risk but smaller sites such as Telegram, 8chan and 4chan.
In these circumstances, as I believe is accepted by many noble Lords across the House, the Government have got this completely wrong and it needs rethinking. Therefore, I would like to test the opinion of the House.