Online Safety Bill Debate
Baroness Harding of Winscombe (Conservative - Life peer)
Lords Chamber
My Lords, I promise I will be brief. I, too, welcome what the Minister has said and the amendments that the Government have proposed. This is the full package which we have been seeking in a number of areas, so I am very pleased to see it. My noble friend Lady Newlove and the noble Baroness, Lady Kidron, are not in their places, but I know I speak for both of them in wanting to register that, although the thoughtful and slow-and-steady approach has some benefits, there are also some real costs to it. The UK Safer Internet Centre estimates that some 340,000 individuals in the UK will have no recourse for action in the next two years if a platform’s complaints mechanism does not work for them. That is quite a large number of people, so I have one very simple question for the Minister: if I have exhausted the complaints procedure with an existing platform in the next two years, where do I go? I cannot go to Ofcom. My noble friend Lord Grade was very clear in front of the committee I sit on that it is not Ofcom’s job. Where do I go if I have a complaint that I cannot get resolved in the next two years?
My Lords, I declare an interest as chair of Trust Alliance Group, which operates the energy and communications ombudsman schemes, so I have a particular interest in the operation of these ADR schemes. I thank the Minister for the flexibility that he has shown in the provision about the report by Ofcom and in having backstop powers for the Secretary of State to introduce such a scheme.
Of course, I understand that the noble Baroness, Lady Newlove, and the UK Safer Internet Centre are very disappointed that this is not going to come into effect immediately, but there are advantages in not setting out the scheme at this very early point before we know what some of the issues arising are. I believe that Ofcom will definitely want to institute such a scheme, but it may be that, in the initial stages, working out the exact architecture is going to be necessary. Of course, I would have preferred to have a mandated scheme, in the sense that the report will look not at the “whether” but the “how”, but I believe that at the end of the day it will be absolutely obvious that there needs to be such an ADR scheme in order to provide the kind of redress the noble Baroness, Lady Harding, was talking about.
I also agree with the noble Baroness, Lady Morgan, that the kinds of complaints that this would cover should include fraudulent adverts. I very much hope that the Minister will be able to answer the questions that both noble Baronesses asked. As my noble friend said, will he reassure us that the department and Ofcom will not take their foot off the pedal, whatever the Bill may say?
My Lords, if I may, I shall speak very briefly, in the absence of my noble friend Lady Kidron, and because I am one of the signatories of this amendment, alongside the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. Amendment 240, together with a number of amendments that we will be debating today, turns on a fundamental issue that we have not yet resolved.
I came in this morning being told that we would be voting on this amendment and that other amendments later today would be consequential—I am a novice at this level of parliamentary procedure, so forgive me if I have got myself confused during the day—but I now understand that my noble friend considers this amendment to be consequential but, strangely, the amendments right at the end of the day are not. I just wanted to flag to the House that they all cover the same fundamental issue of whether harms can be unrelated to content, whether the harms of the online world can be to do with functionality—the systems and processes that drive the addiction that causes so much harm to our children.
It is a fundamental disagreement. I pay tribute to the amount of time the department, the Secretary of State and my noble friend have spent on it, but it is not yet resolved and, although I understand that I should now say that I beg leave to move the amendment formally, I just wanted to mark, with apologies, the necessity, most likely, of having to bring the same issue back to vote on later today.
My Lords, His Majesty’s Government indeed agree that this is consequential on the other amendments, including Amendment 35, which the noble Baroness, Lady Kidron, previously moved on Report. We disagreed with them, but we lost that vote; this is consequential, and we will not force a Division on it.
We will have further opportunity to debate the fundamental issues that lie behind it, to which my noble friend Lady Harding just referred. Some of the amendments on which we may divide later were tabled by the noble Baroness, Lady Kidron, after she defeated the Government the other day, so we cannot treat them as consequential. We look forward to debating them; I will urge noble Lords not to vote for them, but we will have the opportunity to discuss them later.
My Lords, I rise to speak in favour of my noble friend Lord Moylan’s amendment. Given that I understand he is not going to press it, and while I see Amendment 255 as the ideal amendment, I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, for their Amendments 256, 257 and 259, and the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, for Amendments 258 and 258ZA.
I will try to be as brief as I can. I think about two principles—unintended consequences and the history of technology transfer. The point about technology transfer is that once a technology is used it becomes available to other people quickly, even bad guys, whether that was intended or not. There is obviously formal technology transfer, where you have agreement or knowledge transfer via foreign investment, but let us think about the Cold War and some of the great technological developments—atomic secrets, Concorde and the space shuttle. In no time at all, the other side had that access, and that was before the advent of the internet.
If we are to open a door for access to encrypted messages, that technology will be available to the bad guys in no time at all, and they will use it against dissidents, many of whom will be in contact with journalists and human rights organisations in this country and elsewhere. Therefore, the unintended consequence may well be that in seeking to protect children in this country by accessing encrypted messages or unencrypted messages, we may well be damaging the childhoods of children in other countries when their parents, who are dissidents, are suddenly taken away and maybe the whole family is wiped out. Let us be careful about those unintended consequences.
I also welcome my noble friend Lord Parkinson’s amendments about ensuring journalistic integrity, such as Amendment 257D and others. They are important. However, we must remember that once these technologies are available, everyone has a price and that technology will be transferred to the bad guys.
Given that my noble friend Lord Moylan will not press Amendment 255, let us talk about some of the other amendments—I will make some general points rather than go into specifics, as many noble Lords have raised these points. These amendments are sub-optimal, but at least they provide some accountability for Ofcom’s use of this power and help ensure that it is used sensibly and proportionately. One of the things that has run throughout this Bill and other Bills is “who regulates the regulators?” and ensuring that regulators are accountable. The amendments proposed by the noble Lords, Lord Stevenson and Lord Clement-Jones, and by the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, go some way towards ensuring that safeguards are in place. If the Government are not prepared to have an explicit statement that they will not allow access to encrypted messages, I hope that there will be some support for the noble Lords’ amendments.
My Lords, I promise to speak very briefly. I welcome the Government’s amendments. I particularly welcome that they appear to mirror partly some of the safeguards that are embedded in the Investigatory Powers Act 2016.
I have one question for my noble friend the Minister about the wording, “a skilled person”. I am worried that “a skilled person” is a very vague term. I have been taken all through the course of this Bill by the comparison with the Investigatory Powers Act and the need to think carefully about how we balance the importance of privacy with the imperative of protecting our children and being able to track down the most evil and wicked perpetrators online. That is very similar to the debates that we had here several years ago on the Investigatory Powers Act.
The IPA created the Technical Advisory Board. It is not a decision-making body. Its purpose is to advise the Investigatory Powers Commissioner and judicial commissioners on the impact of changing technology and the development of techniques to use investigatory powers while maintaining privacy. It is an expert panel constituted to advise the regulator—in this case, the judicial commissioner—specifically on technology interventions that must balance this really difficult trade-off between privacy and child protection. Why have we not followed the same recipe? Rather than having a skilled person, why would we not have a technology advisory panel of similar standing, where it is clear to all who the members are? Those members would be required to produce a regular report. It might not need to be as regular as the IPA one, but it would just take what the Government have already laid one step further towards institutionalising the independent check that is really important if these Ofcom powers were ever to be used.
My Lords, I added my name to some amendments on this issue in Committee. I have not done so on Report, not least because I have been so occupied with other things and have not had the time to focus on this. However, I remain concerned about this part of the Bill. I am sympathetic to my noble friend Lord Moylan’s Amendment 255, but listening to this debate and studying all the amendments in this group, I am a little confused and so have some simple questions.
First, I heard my noble friend the Minister say that the Government have no intention to require the platforms to carry out general monitoring, but is that now made explicit in any of the amendments that he has tabled? Regarding the amendments which would bring further safeguards around the oversight of Ofcom’s use of this power, like my noble friend Lady Harding, I have always been concerned that the oversight approach should be in line with that for the Investigatory Powers Act and could never understand why it was not in the original version of the Bill. Like her, I am pleased that the Government have tabled some amendments, but I am not yet convinced that they go far enough.
That leads me to the amendments that have been tabled by the noble Lords, Lord Stevenson and Lord Clement-Jones, and particularly that in the name of the noble Lord, Lord Allan of Hallam. As his noble friend Lord Clement-Jones has added his name to it, perhaps he could answer my question when he gets up. Would the safeguards that are outlined there—the introduction of the Information Commissioner—meet the concerns of the big tech companies? Do we know whether it would meet their needs and therefore lead them not to feel it necessary to withdraw their services from the UK? I am keen to understand that.
There is another thing that might be of benefit for anyone listening to this debate who is not steeped in the detail of this Bill, and I look to any of those winding up to answer it—including my noble friend the Minister. Is this an end to end-to-end encryption? Is that what is happening in this Bill? Or is this about ensuring that what is already permissible in terms of the authorities being able to use their powers to go after suspected criminals is somehow codified in this Bill to make sure it has proper safeguards around it? That is still not clear. It would be very helpful to get that clarity from my noble friend, or others.
My Lords, I note that the noble Lord, Lord Stevenson, is no longer in his place, but I promise to still try to live by his admonition to all of us to speak briefly.
I will speak to Amendments 281BA, 281FA, 286A and 281F, which has already been debated but is central to this issue. These amendments aim to fix a problem we repeatedly raised in Committee and on Report. They are also in the name of the noble Baroness, Lady Kidron, and the noble Lords, Lord Stevenson and Lord Clement-Jones, and build on amendments in Committee laid by the noble Lord, Lord Russell, my noble friend Lord Bethell and the right reverend Prelate the Bishop of Oxford. This issue has broad support across the whole House.
The problem these amendments seek to solve is that, while the Government have consistently asserted that this is a systems and processes Bill, the Bill is constructed in a manner that focuses on content. Because this is a rerun of previous debates, I will try to keep my remarks short, but I want to be clear about why this is a real issue.
I am expecting my noble friend the Minister to say, as he has done before, that this is all covered; we are just seeing shadows, we are reading the Bill wrong and the harms that we are most concerned about are genuinely in the Bill. But I really struggle to understand why, if they are in the Bill, stating them clearly on the face of the Bill creates the legal uncertainty that seems to be the Government’s favourite problem with each of the amendments we have been raising today.
My noble friend—sorry, my friend—the noble Baroness, Lady Kidron, commissioned a legal opinion that looked at the statements from the Government and compared them to the text of the Bill. That opinion, like that of the many noble Lords I have just mentioned, is that the current language in the Bill about features and functionalities pertains only in so far as it relates to harmful content. All roads in this game of Mornington Crescent lead back to content.
Harmful content is set out in a schedule to the Bill, and this set of amendments ensures that the design of services, irrespective of content, is required to be safe by design. If the Government were correct in their assertion that this is already covered, then these amendments really should not pose any threat at all, and I have yet to hear the Government enunciate what the real legal uncertainty actually is in stating that harm can come from functionality, not just from content.
My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.
The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.
My noble friend Lady Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords as we conclude Report that we have not made it worse by so doing. The Bill will require services to take a safety by design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.
We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.
Second, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems and processes, and algorithms, mentioned elsewhere in the Bill, cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.
Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.
Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system and process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user—it is the fact that harmful content may be pushed to a user, or content pushed in such a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.
Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—
Can I just double-check what my noble friend has just said? I was lulled into a possibly false sense of security until we got to the point where he said “harmful” and then the dreaded word “content”. Does he accept that there can be harm without there needing to be content?
My Lords, being the understudy for the noble Baroness, Lady Kidron, is quite a stressful thing. I am, however, reliably informed that she is currently offline in the White House, but I know that she will scrutinise everything I say afterwards and that I will receive a detailed school report tomorrow.
I am extremely grateful to my noble friend the Minister for how he has just summed up, but I would point out two things in response. The first is the circularity of the legal uncertainty. What I think I have heard is that we are trying to insert into the Bill some clarity because we do not think it is clear, but the Government’s concern is that by inserting clarity, we then imply that there was not clarity in the rest of the Bill, which then creates the legal uncertainty—and round we go. I am not convinced that we have really solved that problem, but I may be one step further towards understanding why the Government think that it is a problem. I think we have to keep exploring that and properly bottom it out.
My second point is about what I think will for evermore be known as the marshmallow problem. We have just rehearsed across the House a really heartfelt concern that just because we cannot imagine it today, it does not mean that there will not be functionality that causes enormous harm which does not link back to a marshmallow, multiple marshmallows or any other form of content.
Those two big issues are the ones we need to keep discussing: what is really causing the legal uncertainty and how we can be confident that unimaginable harms from unimaginable functionality are genuinely going to be captured in the Bill. Provided that we can continue, maybe it is entirely fitting at the end of what I think has been an extraordinarily collaborative Report, Committee and whole process of the Bill going through this House—which I have felt incredibly proud and privileged to be a part of—that we end with a commitment to continue said collaborative process. With that, I beg leave to withdraw the amendment.