Online Safety Bill Debate
Baroness Merron (Labour, Life peer), debate with the Department for Digital, Culture, Media & Sport
Lords Chamber

My Lords, I want to say “Hallelujah”. With this Bill, we have reached a landmark moment after the disappointments and obstacles that we have had over the last six years. It has been a marathon but we are now in the final straight with the finishing line in sight, after the extraordinary efforts by noble Lords on all sides of the House. I thank the Secretary of State for her commitment to this ground-breaking Bill, and the Minister and his officials for the effort they have put into it. The Minister is one of my “Play School” babies, who has done his utmost to make a difference in changing the online world. That makes me very happy.
We know that the eyes of the world are watching us because legislators around the world are looking for ways to extend the rule of law into the online world, which has become the Wild West of the 21st century, so it is critical that in our haste to reach the finishing post we do not neglect the question of enforcement. That is why I have put my name to Amendment 268C in the name of the noble Lord, Lord Weir: without ensuring that Ofcom is given effective powers for this task of unprecedented scale, the Bill we are passing may yet become a paper tiger.
The impact assessment for the Bill estimated that 25,000 websites would be in scope. Only last week, in an encouraging report by the National Audit Office on Ofcom’s readiness, we learned that the regulator’s own research has increased that estimate to 100,000, and the figure could be significantly higher. The report went on to point out that the great majority of those websites will be based overseas and will not have been regulated by Ofcom before.
The noble Lord, Lord Bethell, raised his concerns on the final day of Committee, seeking to amend the Bill to make it clear that Ofcom could take a schedule of a thousand sites to court and get them all blocked in one go. I was reassured when the Minister repeated the undertaking given by his counterpart in Committee in the other place that the Civil Procedure Rules already allow such multiparty claims. Will the Minister clarify once again that such enforcement at scale is possible and would not expose Ofcom to judicial review? That would give me peace of mind.
The question that remains for many is whether Ofcom will act promptly enough when children are at risk. I am being cautious because my experience in this area with regulators has led me not to assume that, simply because this Parliament passes a law, it will be implemented. We all know the sorry tale of Part 3 of the Digital Economy Act, when Ministers took it upon themselves not to decide when it should come into force but to ask whether it should come into force at all. When they announced that that should be never, the High Court took a dim view and allowed judicial review to proceed. Interestingly, the repeal of Part 3 and the clauses that replaced it may not have featured in this Bill were it not for that case—I always say that everything happens for a reason. The amendment is a reminder to Ofcom that Parliament expects it to act, and to do so from the day when the law comes into force, not after a year’s grace period, six months or more of monitoring, or a similar period of supervision before it contemplates any form of enforcement.
Many of the sites we are dealing with will not comply because this is the law; they will do so only when the business case makes compliance cheaper than the consequences of non-compliance, so this amendment is a gentle but necessary provision. If for any reason Ofcom does not think that exposing a significant number of children in this country to suicide, health harm, eating disorder or pornographic content—which is a universal plague—merits action, it will need to write a letter to the Secretary of State explaining why.
We have come too far to risk the Bill not being implemented in the most robust way, so I hope my noble friends will join me in supporting this belt-and-braces amendment. I look forward to the Minister’s response.
My Lords, we welcome the government amendments in this group to bring child sexual exploitation and abuse failures into the scope of the senior manager liability and enforcement regime but consider that they do not go far enough. On the government amendments, I have a question for the Minister about whether, through Clause 122, it would be possible to require a company that was subject to action to do some media literacy as part of its harm reduction; in other words, would it be possible for Ofcom to use its media literacy powers as part of the enforcement process? I offer that as a helpful suggestion.
We share the concerns expressed previously by the noble Lord, Lord Bethell, about the scope of the senior manager liability regime, which does not cover all the child safety duties in the Bill. We consider that Amendment 268, in the name of my noble friend Lord Stevenson, would provide greater flexibility, giving the possibility of expanding the list of duties covered in the future. I have a couple of brief questions to add to my first question. Will the Minister comment on how the operation of the senior manager liability regime will be kept under review? This has, of course, been something of a contentious issue in the other place, so could the Minister perhaps tell your Lordships’ House how confident he is that the current position is supported there? I look forward to hearing from the Minister.
I did not quite finish writing down the noble Baroness’s questions. I will do my best to answer them, but I may need to follow up in writing because she asked a number of them at the end, which is perfectly reasonable. On her question about whether confirmation decision steps could include media literacy, yes, that is a good idea; they could.
Amendment 268, tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to enable the Secretary of State, through regulation, to add to the list of duties which are linked to the confirmation decision offence. We are very concerned at the prospect of allowing an unconstrained expansion of the confirmation decision offence. In particular, as I have already set out, we would be concerned about expansion of those related to search services. There is also concern about unconstrained additions of any other duties related to user-to-user services as well.
We have chosen specific duties which will tackle effectively key issues related to child safety online and tackling child abuse while ensuring that the confirmation decision offence remains targeted. Non-compliance with a requirement imposed by a confirmation decision in relation to such duties warrants the prospect of criminal enforcement on top of Ofcom’s extensive civil enforcement powers. Making excessive changes to the offence risks shifting the regime towards a more punitive and disproportionate enforcement model, which would represent a significant change to the framework as a whole. Furthermore, expansion of the confirmation decision offence could lead to services taking an excessively cautious approach to content moderation to avoid the prospect of criminal liability. We are also concerned that such excessive expansion could significantly increase the burden on Ofcom.
I am grateful to the noble Lord, Lord Weir of Ballyholme, and the noble Baroness, Lady Benjamin, for the way they set out their Amendment 268C. We are concerned about this proposal because it is important that Ofcom can respond to issues on a case-by-case basis: it may not always be appropriate or proportionate to use a specific enforcement power in response to a suspected breach. Interim service restriction orders are some of the strongest enforcement powers in the Bill and will have a significant impact on the service in question. Their use may be disproportionate in cases where there is only a minor breach, or where a service is taking steps to deal with a breach following a provisional notice of contravention.
My Lords, I can be very brief. My noble friend Lady Benjamin and the noble Baronesses, Lady Harding, Lady Morgan and Lady Fraser, have all very eloquently described why these amendments in this group are needed.
It is ironic that we are still having this debate right at the end of Report. It has been a running theme throughout the passage of the Bill, both in Committee and on Report, and of course it ran right through our Joint Committee work. It is the whole question of safety by design, harm from functionalities and, as the noble Baroness, Lady Morgan, said, understanding the operation of the algorithm. And there is still the question: does the Bill adequately cover what we are trying to achieve?
As the noble Baroness, Lady Harding, said, Clause 1 now does set out the requirement for safety by design. So, in the spirit of amity, I suggested to the Minister that he might run a check on the Bill during his free time over the next few weeks to make sure that it really does cover it. But, in a sense, there is a serious point here. Before Third Reading there is a real opportunity to run a slide rule over the Bill to see whether the present wording really is fit for purpose. So many of us around this House who have lived and breathed this Bill do not believe that it yet is. The exhortation by the ethereal presences of the noble Baronesses, Lady Kidron and Lady Harding, to keep pressing to make sure that the Bill is future-proofed and contains the right ingredients is absolutely right.
I very much hope that once again the Minister will go through the hoops and explain whether this Bill really captures functionality and design and not just content, and whether it adequately covers the points set out in the purpose of the Bill which is now there.
My Lords, as we have heard, the noble Baroness, Lady Harding, made a very clear case in support of these amendments, tabled in the name of the noble Baroness, Lady Kidron, and supported by noble Lords from across the House. The noble Baroness, Lady Morgan, gave wise counsel to the Minister, as did the noble Lord, Lord Clement-Jones, that it is worth stepping back and seeing where we are in order to ensure that the Bill is in the right place. I urge the Minister to find the time and the energy that I know he has—he certainly has the energy and I am sure he will match it with the time—to speak to noble Lords over the coming Recess to agree a way to incorporate systems and functionality into the Bill, for all the reasons we have heard.
On Monday, my noble friend Lord Knight spoke of the need for a review of loot boxes in video games. When we checked Hansard, we saw that the Minister had promised that such a review would be offered in the coming months. In an unusual turn of events, the Minister beat that timescale. We did not have to hear the words “shortly”, “in the summer”, “in the spring” or anything like that, because it was announced the very next day that the department would keep legislative options under review.
I make that point simply to thank the Minister for the immediate response to my noble friend Lord Knight. But, if we are to have such a review, does this not point very much to the fact that functionality and systems should be included in the Bill? The Minister has a very nice hook to hang this on and I hope that he will do so.
My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.
The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.
My noble friend Baroness Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords as we conclude Report that we have not made it worse by so doing. The Bill will require services to take a safety by design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.
We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.
Secondly, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems and processes, and algorithms, mentioned elsewhere in the Bill cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.
Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.
Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system and process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user—it is the fact that harmful content may be pushed to a user, or content pushed in such a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.
Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—