Online Safety Bill Debate
Lord Clement-Jones (Liberal Democrat - Life peer)
Debate with the Department for Digital, Culture, Media & Sport
(1 year, 5 months ago)
Lords Chamber

My Lords, I rise briefly to welcome the fact that there is a series of amendments here where “bot” is replaced by “bot or other automated tool”.
I point out that there is often a lot of confusion about what a bot is or is not. The term was largely coined in the context of a particular service—Twitter—where we understand that there are Twitter bots: accounts that have been created to pump out lots of tweets. On other services there is similar behaviour, but the mechanism is different. It seems to me that the word “bot” may turn out to be one of those terms that was common and popular at the end of the 2010s and in the early 2020s but that, in five years, we will not be using at all. It will have served its time and expired, and we will be using other language to describe what we want to capture: a human being creating some kind of automated tool—very context-dependent, depending on the nature of the service—and using it to pump out material. It is very clear that we want to make sure that such behaviour is in scope and that the person cannot hide behind the fact that it was an automated tool, because we are interested in the mens rea of the person sitting behind the tool.
I recognise that the Government have been very wise in making sure that whenever we refer to a bot we are adding that “automated tool” language, which will make the Bill inherently much more future-proof.
My Lords, I just want to elucidate whether the Minister has any kind of brief on my Amendment 152A. I suspect that he does not; it is not even grouped—it is so recent that it is actually not on today’s groupings list. However, just so people know what will be coming down the track, I thought it would be a good idea at this stage to say that it is very much about exactly the question that the noble Baroness, Lady Harding, was asking. It is about the interaction between a provider environment and a user, with the provider environment being an automated bot—or “tool”, as my noble friend may prefer.
It seems to me that we have an issue here. I absolutely understand what the Minister has done, and I very much support Amendment 153, which makes it clear that user-generated content can include bots. But this is not so much about a human user using a bot or instigating a bot; it is much more about a human user encountering content that is generated in an automated way by a provider, and then the user interacting with that in a metaverse-type environment. Clearly, the Government are apprised of that with regard to Part 5, but there could be a problem as regards Part 3. This is an environment that the provider creates, but it is interacted with by a user as if that environment were another user.
I shall not elaborate or make the speech that I was going to make, because that would be unfair to the Minister, who needs to get his own speaking note on this matter. But I give him due warning that I am going to degroup and raise this later.
My Lords, I warmly welcome this group of amendments. I am very grateful to the Government for a number of amendments that they are bringing forward at this stage. I want to support this group of amendments, which are clearly all about navigating forward and future-proofing the Bill in the context of the very rapid development of artificial intelligence and other technologies. In responding to this group of amendments, will the Minister say whether he is now content that the Bill is sufficiently future-proofed, given the hugely rapid development of technology, and whether he believes that Ofcom now has sufficient powers to risk assess for the future and respond, supposing that there were further parallel developments in generative AI such as we have seen over the past year?
My Lords, this has been a very interesting debate, and a real contrast. We have one set of amendments which says that the net is too wide and another which says that the net is not wide enough, and I agree with both of them. After all, we are trying to fine-tune the Bill to get it to deal with the proper risks—the word “risk” has come up quite a lot in this debate—that it should. Whether or not we make a specific exemption for public interest services, public information services, limited functionality services or non-commercial services, we need to find some way to deal with the issue raised by my noble friend and the noble Lord, Lord Moylan, in their amendments. All of us are Wikipedia users; we all value the service. I particularly appreciated what was said by the noble Baroness, Lady Kidron: Wikipedia does not push its content at us—it is not algorithmically based.
What the noble Lord, Lord Russell, said resonated with me, because I think he has found a thundering great hole in the Bill. This infinite scrolling and autoplay is where the addiction of so much of social media lies, and the Bill absolutely needs to deal with it systemically and functionally. So, on the one hand, we have a service which does not rely on that infinite scrolling and algorithmic pushing of content and, on the other hand, we are trying to identify services which have that quality.
I very much hope the Minister is taking all this on board, because on each side we have identified real issues. Whether or not, when we come to the light at the end of the tunnel of Amendment 245 from the noble Baroness, Lady Morgan, it will solve all our problems, I do not know. All I can say is that I very much hope that the Minister will consider both sets of amendments and find a way through this that is satisfactory to all sides.
My Lords, much like the noble Lord, Lord Clement-Jones, I started off being quite certain I knew what to say about these amendments. I even had some notes—unusual for me, I know—but I had to throw them away, which I always do with my notes, because the arguments have been persuasive. That is exactly why we are here in Parliament discussing things: to try to reach common solutions to difficult problems.
We started with a challenge to the Minister to answer questions about scope, exemptions and discretion in relation to a named service—Wikipedia. However, as the debate went on, we came across the uncomfortable feeling that, having got so far into the Bill and agreed a lot of amendments today improving it, we are still coming up against quite stubborn issues that do not fit neatly into the categorisation and structures that we have. We do not seem to have the right tools to answer the difficult questions before us today, let alone the myriad questions that will come up as the technology advances and new services come in. Why have we not already got solutions to the problems raised by Amendments 281, 281A and 281B?
There is also the rather difficult idea we have from the noble Lord, Lord Russell, of dark patterns, which we need to filter into our thinking. Why does that not fit into what we have got? Why are we still worried about Wikipedia—a service for the public good which clearly carries risks, and is sometimes capable of making terrible mistakes, but is definitely a good thing—being threatened by having to conform with a structure and a system designed to deal with some of the biggest and most egregious companies that are pushing stuff at us in the way that we have talked about?
I have a series of questions which I do not have the answers to. I am looking forward to the Minister riding to my aid on a white charger of enormous proportions and great skill which will take us out without having to fall over any fences.
If I may, I suggest to the Minister a couple of things. First, we are stuck on the word “content”. We will come back to that in the future, as we still have an outstanding problem about exactly where the Bill sets it. Time and again in discussions with the Bill team and with Ministers we have been led back to the question of where the content problem lies and where the harms relate to that, but this little debate has shown beyond doubt that harm can occur independent of and separate from content. We must have a solution to that, and I hope it will be quick.
Secondly, when approaching anybody or anything or any business or any charity that is being considered in scope for this Bill, we will not get there if we are looking only at the question of its size and its reach. We have to look at the risks it causes, and we have to drill down hard into what risks we are trying to deal with using our armoury as we approach these companies, because that is what matters to the children, vulnerable people and adults who would suffer otherwise, and not the question of whether or not these companies are big or small. I think there are solutions to that and we will get there, but, when he comes to respond, the Minister needs to demonstrate to us that he is still willing to listen and think again about one or two issues. I look forward to further discussions with him.
I do not think that it is, but it will be helpful to have a debate on categorisation later on Report, when we reach Amendment 245, to probe this further. It is not possible for me to say that a particular service will certainly be categorised one way or another, because that would give it carte blanche and we do not know how it may change in the future—estimable though I may think it is at present. That is the difficulty of setting the precise parameters that the noble Baroness, Lady Fox, sought in her contribution. We are setting broad parameters, with exemptions and categorisations, so that the burdens are not unduly heavy on services which do not cause us concern, and with the proviso for the Secretary of State to bring further exemptions before Parliament, as circumstances strike her as fit, for Parliament to continue the debate we are having now.
The noble Baroness, Lady Kidron, in her earlier speech, asked about the functionalities of user-to-user services. The definitions of user-to-user services are broad and flexible, to capture new and changing services. If a service has both user-to-user functionality and a search engine, it will be considered a combined service, with respective duties for the user-to-user services which form part of its service and search duties in relation to the search engine.
I reassure my noble friend Lady Harding of Winscombe that the Bill will not impose a disproportionate burden on services, nor will it impede the public’s access to valuable content. All duties on services are proportionate to the risk of harm and, crucially, to the capacity of companies. The Bill’s proportionate design means that low-risk services will have to put in place only measures which reflect the risk of harm to their users. Ofcom’s guidance and codes of practice will clearly set out how these services can comply with their duties. We expect that it will set out a range of measures and steps for different types of services.
Moreover, the Bill already provides for wholesale exemptions for low-risk services and for Ofcom to exempt in-scope services from requirements such as record-keeping. That will ensure that there are no undue burdens to such services. I am grateful for my noble friend’s recognition, echoed by my noble friend Lady Stowell of Beeston, that “non-profit” does not mean “not harmful” and that there can be non-commercial services which may pose harms to users. That is why it is important that there is discretion for proper assessment.
Amendment 30 seeks to allow Ofcom to withdraw the exemptions listed in Schedule 1 from the Bill. I am very grateful to my noble friend Lord Moylan for his time earlier this week to discuss his amendment and others. We have looked at it, as I promised we would, but I am afraid that we do not think that it would be appropriate for Ofcom to have this considerable power—my noble friend is already concerned that the regulator has too much.
The Bill recognises that it may be necessary to remove certain exemptions if there is an increased risk of harm from particular types of services. That is why the Bill gives the Secretary of State the power to remove particular exemptions, such as those related to services which have limited user-to-user functionality and those which offer one-to-one live aural communications. These types of services have been carefully selected as areas where future changes in user behaviour could necessitate the repeal or amendment of an exemption in Schedule 1. This power is intentionally limited to only these types of services, meaning that the Secretary of State will not be able to remove exemptions for comments on recognised news publishers’ sites. That is in recognition of the Government’s commitment to media freedom and public debate. It would not be right for Ofcom to have the power to repeal those exemptions.
Amendments 281 and 281B, in the name of the noble Lord, Lord Russell of Liverpool, are designed to ensure that the lists of features under the definition of “functionality” in the Bill apply to all regulated services. Amendment 281A aims to add additional examples of potentially addictive functionalities to the Bill’s existing list of features which constitute a “functionality”. I reassure him and other noble Lords that the list of functionalities in the Bill is non-exhaustive. There may be other functionalities which could cause harm to users and which services will need to consider as part of their risk assessment duties. For example, if a provider’s risk assessment identifies that there are functionalities which risk causing significant harm to an appreciable number of children on its service, the Bill will require the provider to put in place measures to mitigate and manage that risk.
He and other noble Lords spoke about the need for safety by design. I can reassure them that this is already built into the framework of the Bill, which recognises how functionalities—including many of the things mentioned today—can increase the risk of harm to users, and which will encourage the safe design of platforms.
Amendments 281 and 281B have the effect that regulated services would need to consider the risk of harm of functionalities that are not relevant for their kind of service. For example, sharing content with other users is a functionality of user-to-user services, which is not as relevant for search services. The Bill already outlines specific features that both user-to-user and search services should consider, which are the most relevant functionalities for those types of service. Considering these functionalities would create an unnecessary burden for regulated services which would detract from where their efforts can best be focused. That is why I am afraid I cannot accept the amendments that have been tabled.
My Lords, surely it is the role of the regulators to look at functionalities of this kind. The Minister seemed to be saying that it would be an undue burden on the regulator. Is not that exactly what we are meant to be legislating about at this point?
Perhaps I was not as clear as I could or should have been. The regulator will set out in guidance the duties that fall on the businesses. We do not want the burden on the business to be unduly heavy, but there is an important role for Ofcom here. I will perhaps check—
But these functionalities are a part of their business model, are they not?
Hence Ofcom will make the assessments about categorisation based on that. Maybe I am missing the noble Lord’s point.
I think we may need further discussions on the amendment from the noble Lord, Lord Russell.
I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.
I shall be brief, my Lords, because I know we have a Statement to follow. It is a pleasure to follow the noble Lord, Lord Russell. I certainly share his concern about the rise of incel culture, and this is a very appropriate point to raise it.
This is all about choices. The Minister, in bringing forward his amendments in response not only to the Joint Committee but also to the overwhelming view in Committee that this was the right thing to do, has done the right thing. I thank him for that, with the qualification that we must make sure that the red and amber lights are used—just as my noble friend Lord Allan and the noble Baroness, Lady Stowell, qualified their support for what the Minister has done. At the same time, I make absolutely clear that I very much support the noble Baroness, Lady Kidron. I was a bit too late to put my name to her amendment, but it would be there otherwise.
I very much took to what the right reverend Prelate had to say about the ethics of the online world and nowhere more should they apply than in respect of children and young people. That is the place where we should apply these ethics, as strongly as we can. With some knowledge of artificial intelligence, how it operates and how it is increasingly operating, I say that what the noble Baroness wants to add to the Minister’s amendment seems to be entirely appropriate. Given the way in which algorithms are operating and the amount of misinformation and disinformation that is pouring into our inboxes, our apps and our social media, this is a very proportionate addition. It is the future. It is already here, in fact. So I very strongly support Amendment 174 from the noble Baroness and I very much hope that after some discussion the Minister will accept it.
My Lords, like the noble Baroness, Lady Harding, I want to make it very clear that I think the House as a whole welcomes the change of heart by the Government to ensure that we have in the Bill the two sides of the question of content that will be harmful to children. We should not walk away from that. We made a big thing of this in Committee. The Government listened and we have now got it. The fact that we do not like it—or do not like bits of it—is the price we pay for having achieved something which is, probably on balance, good.
The shock comes from trying to work out why it is written the way it is, and how difficult it is to see what it will mean in practice when companies working to Ofcom’s instructions will take this and make this happen in practice. That lies behind, I think I am right in saying, the need for the addition to Amendment 172 from the noble Baroness, Lady Kidron, which I have signed, along with the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. Both of them have spoken well in support of it and I do not need to repeat those points.
Somehow, in getting the good of Amendments 171 and 172, we have lost the flexibility that we also want. The flexibility does exist, because the Government have retained powers to amend and change both primary priority content that is harmful to children and priority content. Therefore, subject to approval through the secondary legislation process, this House will continue to have oversight of that—indeed, both Houses will.
Somehow, however, that does not get to quite where the concern comes from. The concern encompasses the good points made by the noble Lord, Lord Russell—I should have caught him in the gap and said that I had already mentioned that we had been together at the meeting. He found some additional points to make which I hope will also be useful to future discussion; I am glad he has done that. He is making a very good point about cultural context and the work that needs to go on—which we have talked about in earlier debates—to make this live: in other words, to enable those who are responsible for delivering this through Ofcom, and those who are delivering it through companies, to understand the wider context. In that sense, we clearly need the misinformation and disinformation side of that; it is part and parcel of the problems we have. But more important even than that is the need to address the functionality issues, to which we keep coming back. This Bill is about risk. The process we will go through is about risk assessment: making sure that the risks are understood by those who deliver services, with penalties following any failure of the risk assessment process, so as to deliver the change that we want to see in society.
However, it is not just about content. We keep saying that, but we do not see the changes around it. The best thing that could happen today would be for the Minister, in responding, to accept that these clauses are good—“Tick, we like them”—but not to finalise them until we have seen the other half of the picture: what are the other risks to which users of the services we have referred to and discussed are exposed through the systemic design processes that are designed to take them in different directions? Only when we see the two together will we have addressed the concern properly.
I may have got this wrong, but the only person who can tell us is the Minister, because he is the only one who really understands what is going on in the Bill. Am I not right in saying—I am going to say I am right; he will say no, I am not, but I am, aren’t I?—that we will get to Clauses 208 and 209, or the clauses that used to be 208 and 209, one of which deals with harms from content and the other with functionality? We may need to look at the way in which those clauses are framed in order to come back and understand better how they lie and how they interact. I may have got the numbers wrong—the Minister is looking a bit puzzled, so I probably have—but the sense is that this will probably not come up until day 4. While I do not want to hold back the Bill, we may need to look at some of the issues hidden in the interstices of this set of amendments, in order to make sure that the totality is better for those who have to use it.