Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025

Monday 24th February 2025

Lords Chamber
Baroness Morgan of Cotes (Non-Afl)

My Lords, I thank the Minister for her engagement on this issue, not just with me but with Members across the House. It has been very much appreciated, including when she was not here because she was dealing with her own health issues.

When I talk about what we do here in the House of Lords, one of the great successes I point to is the scrutiny that we gave to the Online Safety Act. We did it in a cross-party way, eventually managing to persuade the Government, as well as Ofcom, about the changes that were needed. Those changes were then taken back to the House of Commons, and Ministers there conceded them. As a result of that working together, we ended up with a much stronger Bill that will do much to protect vulnerable and young people and those most at risk of harmful content online. So it is a matter of great regret that, the first time we are debating a statutory instrument of substantive interest under this Act, we—all of us, I suspect—have to say that we are deeply disappointed by the drafting that we have seen.

On 19 July 2023, I moved a very small amendment and was grateful to the House for its support. I said at the time that one change of one word—from “and” to “or”—made for a small but powerful amendment. The noble Lord, Lord Clement-Jones, set out brilliantly and comprehensively why that change was so important, so in the time available, I will not repeat what he said. The House clearly voted for change and the Minister’s own party supported that change, for which I was deeply grateful.

The other interesting thing is that Ofcom said to me that it did not object to that change. However, in its note today—I am sure that it sent the note to other Members—Ofcom talked about the harms-based approach that it is following when recommending to the Government how they should legislate under the Act. But that harms-based approach rings hollow when, through the interpretation that Ofcom has given to the Government, it has ridden roughshod over the risks posed by the small but high-harm platforms.

The draft statutory instrument is based on the number of users, but this House in its amendment made it very clear that, with harmful platforms, it is not just about the number of users they have but absolutely about the content, the functionalities and the risks that those sites pose.

As the noble Baroness set out, Ofcom is relying on paragraph 1(5) of Schedule 11, looking at

“how easily, quickly and widely regulated user-generated content is disseminated by means of the service”.

But that paragraph says that the Secretary of State “must take into account” those things, not that the Secretary of State is bound solely by those criteria. Our criticism tonight of the statutory instrument is not just that Ofcom has chosen to rely on those words—I would say that Ofcom, in not objecting to my amendment, was being disingenuous if it already knew that it was going to rely on that sub-paragraph. The bigger question for the noble Baroness tonight is that the Secretary of State did not have to accept the advice that Ofcom gave them.

The noble Lord, Lord Clement-Jones, talked, as no doubt others will, about the risk and the harm that we have seen from platforms. We will talk about the fact that, for the Southport victims, it took only one person to be radicalised by a site that they were looking at to cause untold misery and devastation for families. This House voted recently on the harm caused by deepfake pornographic abuse; again, it does not take many people to utterly ruin a victim’s life. And what about those platforms that promote suicide and self-harm content? It is not sufficient to say that this Act will impose greater burdens in respect of illegal content. We all know from debates on the Act that there is content which is deliberately not illegal but which is deeply harmful both to victims and to the vulnerable.

As Jeremy Wright MP said in the debate on these regulations in Committee in the House of Commons, the Government are going to want or need these category 1 powers to apply to smaller, high-harm platforms before too long. Indeed, the Government’s own strategic statement published last year specifically says:

“The government would like to see Ofcom keep this approach”—

that is, the approach it has to small, risky services—

“under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online”.

The Government and the Secretary of State already know that there are small but high-harm platforms causing immense risk which will not be caught by these regulations. As we have also heard, the flight to these small, high-harm, risky platforms will therefore absolutely happen, as those who want to punt out harmful content seek out platforms that are not bound by the most stringent regulations.

I will stop there because I know that others wish to speak. I will support the regret amendment tonight should the noble Lord, Lord Clement-Jones, decide to put it to a vote. It has taken far too long to get to this point. I understand the Government’s desire to make progress with these regulations, but the regret amendment states that it

“calls on the Government to withdraw the Regulations and establish a revised definition of Category 1 services”.

I ask the Minister to take that opportunity, because these regulations absolutely do not reflect the will of this House in that amendment. That is a great source of disappointment given the cross-party work that we all did to make sure the Online Safety Act was as comprehensive as it could be.

Baroness Kidron (CB)

My Lords, I remind the House of my interests, particularly as chair of 5Rights and as adviser to the Institute for Ethics in AI at Oxford. I wholeheartedly agree with both the previous speakers, and in fact, they have put the case so forcefully that I hope that the Government are listening.

I wanted to use my time to speak about the gap between the Act that we saw pass through this House and the outcome. What worries me the most is how we should understand the purpose of an Act of Parliament and the hierarchy of the instructions it contains. I ask this because, as the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Morgan, have already said, the Government of the day, with the express support of Members of this House, including the Front Bench of the Labour Party, agreed that categorisation would be a question of risk or size, not simply size. That was the decision of the House, it was supported in the other place, and it is in the text of the Act. So it would be useful to understand whether, in the view of His Majesty’s Government, the text of an Act and, separately, a statement made by a Minister from the Dispatch Box have any authority. If they do, I cannot understand how Ofcom is allowed to overturn that, or how the Secretary of State, without taking action to amend the Act, has been able to allow it to do so.

It is essential to get a clear answer from the Minister about the status of the text of the Act, because this is a pattern of behaviour in which the regulator and the Government appear to be cherry-picking which bits of the Online Safety Act are convenient and ignoring those they consider too difficult, too disruptive or—I really hope not—too onerous for tech companies. Ofcom has similarly determined not to observe the provisions on functionalities that appear throughout the Act, for example at Sections 9(5), 10(4) and 11(6)—I could go on; the provision on extended use, at Section 11(6)(f); and the requirement to consider the needs of children in different age groups which, like functionalities, runs through the Act like a golden thread.

Ofcom’s own illegal harms register risk management guidance states that

“certain ‘functionalities’ stand out as posing particular risks because of the prominent role they appear to play in the spread of illegal content and the commission and facilitation of … offences”.

Ofcom then says its regulatory framework is intended to ensure that service providers put in place safeguards to manage the risks posed by functionalities. It lists end-to-end encryption, pseudonymity and anonymity, live-streaming, content recommender systems and, quite rightly, generative AI, all as functionalities that it considers to be high risk. Specifically in relation to grooming, the functionalities Ofcom considers risky include network expansion prompts, direct messaging, connection lists and automated information displays.

Despite its acknowledgement that functionalities create heightened risk, a clear statement that addressing risk forms part of its regulatory duties, and the clearly expressed intent of Parliament and the wording of the Act, Ofcom has failed to address functionalities comprehensively in both the published illegal harms code and the draft children’s code, and it has chosen to overrule Parliament by ignoring the requirement in Schedule 11 to consider functionalities when determining which services should be designated as category 1 services.

Meanwhile, paragraph 4(a)(vii) of Schedule 4 is crystal clear that it is an objective of the Act that user-to-user services

“be designed and operated in such a way that … the different needs of children at different ages are taken into account”.

Ofcom has chosen to ignore that. Volume 5 of its draft children’s code says

“our proposals focus at this stage on setting the expectation of protections for all children under the age of 18”.

Any child, any parent and anyone who has spent time with children knows that five and 15 are not the same. The assertion from Ofcom in its narrative about the children’s code is blinding in its stupidity. If common sense cannot prevail, perhaps 100 years or more of child development study, which sets out the ages and stages by which children can be expected to have the emotional and intellectual capacity to understand something—and, similarly, the ages and stages by which we cannot expect a child to understand or have the intellectual capacity to deal with something—could inform the regulator.

The whole basis of child protection is that we should support children on their journey from dependence to autonomy, because we know that they do not have the capacity to do it for themselves in all contexts, given the vulnerabilities associated with different ages and stages of development. Ofcom knows that the Act says it should reflect this but somehow feels empowered to ignore or overrule the will of Parliament and, just as with categorisation, the Government appear to condone it.

--- Later in debate ---
Baroness Morgan of Cotes (Non-Afl)

The Minister is setting out a clear case, with which I, and I think many others in this House, disagree. To cut to the chase, the Minister has just said that the Government understand the amendment passed in this House on 19 July 2023 but have decided, on the advice of Ofcom, that that amendment does not work and therefore should be ignored. We should be clear that that is what has happened. The Government should own that decision and the House, when it votes on the amendment tonight, will decide whether it thinks that is an acceptable way to behave or an unacceptable way to behave.

Baroness Jones of Whitchurch (Lab)

I can only reiterate what I have already said: we took Ofcom’s advice after a great deal of scrutiny of how it had come to that advice. Its advice was that the key factor to be taken into account was how easily, quickly and widely content is disseminated. That is the basis on which we made that decision.

Lord Clement-Jones (LD)

My Lords, a key aspect of data protection rests in how it restricts the use of personal data once it has been collected. The public need confidence that their data will be used for the purposes for which they shared it and not further used in ways that breach their legitimate expectations—otherwise they will become suspicious about providing their data at all. The underlying theme that we heard on the previous group was the danger of losing public trust, which very much applies in the area of law enforcement and national security.

However, Schedules 4 and 5 would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact that this would have on their rights, for the purposes of national security, crime detection and prevention, safeguarding, or answering a request by a public authority. Data used for the purposes listed in these schedules would not need to undergo either a balancing test under Article 6(1)(f) or a compatibility test under Article 6(4) of the UK GDPR. The combined effect of these provisions would be to authorise almost unconditional data sharing for law enforcement and other public security purposes while, at the same time, reducing accountability and traceability over how the police use the information shared with them.

As with the previous DPDI Bill, Clauses 87 to 89 of this Bill grant the Home Secretary and the police powers to view and use people’s personal data through the use of national security certificates and designation notices; they are substantially the same as Clauses 28 to 30 of the DPDI Bill. This risks further eroding trust in law enforcement authorities. Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure that individuals’ rights and expectations are not disproportionately impacted. It is baffling why the Government are so slavishly following their predecessor in believing that these new and unaccountable powers are necessary.

By opposing the Question that Clause 81 stand part, I seek to retain the requirement for police forces to record the reason they are accessing data from a police database. The public need more, not less, transparency and accountability over how, why and when police staff and officers access and use records about them. Just recently, the Met Police admitted that they had investigated more than 100 staff over the inappropriate accessing of information in relation to Sarah Everard. This shows that police can and do access information inappropriately, and there may well be less prominent cases where police abuse their power by accessing information without fear of the consequences.

Regarding Amendments 126, 128 and 129, Rights and Security International has repeatedly argued that the Bill would violate the UK’s obligations under the European Convention on Human Rights. On Amendment 126, the logging requirements in the EU law enforcement directive are, principally, to capture in all cases the justification for personal data being examined, copied, amended or disclosed when it is processed for a law enforcement purpose—the objective is clearly to ensure that data is processed only for a legitimate purpose—and, secondarily, to identify when, how and by whom the data has been accessed or disclosed. This ensures that individual accountability is captured and recorded.

Law enforcement systems in use in the UK typically capture some of the latter information in logs, but they very rarely capture the former. Nor, I am informed, do many commodity IT solutions on the market capture by default why data was accessed or amended. For this reason, a long period was allowed under the law enforcement directive to modify legacy systems installed before May 2016, which in the UK included services such as the police national computer and the police national database, along with many others at force level. This transitional relief extended to 6 May 2023, but UK law enforcement did not, in general, make the required changes. Nor, it seems, did it ensure that all IT systems procured after 6 May 2016 included a strict requirement for LED-aligned logging. By adopting and using commodity and hyperscaler cloud services, it has exacerbated this problem.

In early April 2023, the Data Protection Act 2018 (Transitional Provision) Regulations 2023 were laid before Parliament. These regulations had the effect of unilaterally extending the UK’s transitional relief period under the law enforcement directive from May 2023 to May 2026. The Government now wish to strike out completely the requirement to capture the justification for any access to data, on the basis that this would free up to 1.5 million hours a year of valuable police time for our officers, so that they can focus on tackling crime on our streets rather than being bogged down by administration, and that it would save approximately £42.8 million per year in taxpayers’ money.

This is a serious legislative issue on two counts: it removes important evidence that may identify whether a person was acting with malicious intent when accessing data, as well as the deterrent effect of having to record a justification; and it directly deviates from a core part of the law enforcement directive and will clearly have an impact on UK data adequacy. The application of effective control over access to data is very much a live issue in policing, and changing the logging requirement in this way does nothing to improve police data management. Rather, it excuses and perpetuates bad practice. Nor does it increase public confidence.

Clause 87(7) introduces new Section 78A into the 2018 Act. This lays down a number of exemptions and exclusions from Part 3 of that Act where the processing is deemed to be in the interests of national security. These exemptions are wide-ranging and include the ability to suspend or ignore principles 2 to 6 of Part 3, and they thus run directly contrary to the provisions and expectations of the EU law enforcement directive. Ignoring those principles in itself also negates many of the controls and clauses in Part 3 in its entirety. As a result, these exemptions will almost certainly result in the immediate loss of EU law enforcement adequacy.

I welcome the ministerial letter of 6 November from the noble Lord, Lord Hanson of Flint, to the noble Lord, Lord Anderson, but was he really saying that all the national security exemption clause does is bring the 2018 Act into conformity with the GDPR? I very much hope that the Minister will set out for the record whether that is really the case and whether it is really necessary to safeguard national security. Although it is, of course, appropriate and necessary for the UK to protect its national security interests, it is imperative that a balance remains to protect the rights of data subjects. These proposals do not, as far as we can see, strike that balance.

Clause 88 introduces the ability of law enforcement, competent authorities and intelligence agencies to act as joint controllers in some circumstances. If Clause 88 and associated clauses go forward to become law, they will almost certainly again result in withdrawal of UK law enforcement adequacy and will quite likely impact on the TCA itself.

Amendment 127 is designed to draw attention to the fact that there are systemic issues with UK law enforcement’s new use of hyperscaler cloud service providers to process personal data. These issues stem from the fact that the providers’ standard contracts and terms of service fail to meet the requirements of Part 3 of the UK’s Data Protection Act 2018 and the EU law enforcement directive. UK law enforcement agencies are subject to stringent data protection laws, including Part 3 of the DPA and the UK GDPR. These laws dictate how personal data, including that of victims, witnesses, suspects and offenders, can be processed. Part 3 specifically addresses data transfers to third countries, with a presumption against such transfers unless strictly necessary. This contrasts with the UK GDPR, which allows routine overseas data transfer with appropriate safeguards.

Cloud service providers routinely process data outside the UK and lack the necessary contractual guarantees and legal undertakings required by Part 3 of the DPA. As a result, their use for law enforcement data processing is, on the face of it, not lawful. This non-compliance creates significant financial exposure for the UK, including potential compensation claims from data subjects for distress or loss. The sheer volume of data processed by law enforcement, particularly body-worn video footage, exacerbates the financial risk. If only a small percentage of cases result in claims, the compensation burden could reach hundreds of millions of pounds annually. The Government’s attempts to change the law highlight the issue and suggest that past processing on cloud service providers has not been in conformity with the UK GDPR and the DPA.

The current effect of Section 73(4)(b) of the Data Protection Act is to prevent competent authorities—which may have a legitimate operating need to make such transfers, and should possess the internal capability to assess that need—from making transfers to recipients that are not relevant authorities or international organisations, including cloud service providers. This amendment is designed to probe what impact the removal of this restriction would have and whether it would enable competent authorities to make such transfers where justified and necessary. I beg to move.

Baroness Morgan of Cotes (Non-Afl)

My Lords, I will speak to Amendment 124. I am sorry that I was not able to speak on this issue at Second Reading. I am grateful to the noble and learned Lord, Lord Thomas of Cwmgiedd, for his support, and I am sorry that he has not been able to stay, due to a prior engagement.

Eagle-eyed Ministers and the Opposition Front Bench will recognise that this was originally tabled as an amendment to the Data Protection and Digital Information (No. 2) Bill. It is still supported by the Police Federation. I am grateful to the former Member of Parliament for Loughborough for originally raising this with me, and I thank the Police Federation for its assistance in briefing us and in preparing this draft clause. The Police Federation understands that the Home Secretary is supportive of the objective of this amendment, so I shall listen with great interest to what the Minister has to say.

This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise that arises when the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. Given that this issue was discussed on the previous Bill, I do not intend to go into huge amounts of detail, because we rehearsed the arguments there, but I hope very much that, with the new Government, there might be a willingness to entertain this change in the law.

Electronic Media: False Information

Thursday 12th September 2024

Lords Chamber
Baroness Jones of Whitchurch (Lab)

My Lords, I agree with my noble friend that we must protect the UK’s democratic integrity. Our Defending Democracy Taskforce safeguards our democratic institutions and processes from threats, including misinformation and disinformation. Sharing best practice and strategic insights with international partners helps industry and Government to protect our democracy from media threats. Under the Online Safety Act, companies must act against illegal content, including the incitement of violence, hate speech and state-backed disinformation, and remove it. Where hateful content or misinformation and disinformation are prohibited in the largest platforms’ terms of service, those platforms must remove it.

Baroness Morgan of Cotes (Con)

My Lords, false information is as likely to be spread through online platforms with smaller numbers of users as through those with many users. We have heard about the role of Telegram in spreading disinformation during this summer’s disorder, as well as about the terrible suicide forums. I was very pleased to see the Secretary of State’s letter to Ofcom this week on “small but risky” online services. Will the Minister meet me to discuss the issue of platform categorisation, given the amendment I proposed to the then Online Safety Bill, which this House passed in July 2023?

Baroness Jones of Whitchurch (Lab)

My Lords, of course I am very happy to meet the noble Baroness to discuss this further, and I pay tribute to the work she has done on this issue in the past. On “small but risky” services, as she knows, the Secretary of State has written to Melanie Dawes, the CEO of Ofcom, and a very detailed reply was received today from Ofcom. We are still absorbing everything that it is proposing, but it is clear that it is taking this issue very seriously. That will give us the focus for our discussion when we meet.