Online Safety Bill Debate
Lord Stevenson of Balmacara (Labour - Life peer), in debate with the Department for Digital, Culture, Media & Sport
(1 year, 8 months ago)
Lords Chamber

The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from providing material that would have caused problems for the services to which they allow access. Can he confirm that?
Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.
It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?
Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has powers of enforcement set out, which require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.
My Lords, this has been a very strange debate. It has been the tail end of the last session and a trailer for a much bigger debate coming down the track. It was very odd.
We do not want to see everything behind an age-gating barrier, so I agree with my noble friend. However, as the noble Baroness, Lady Kidron, reminded us, it is all about the risk profile, and that then leads to the kind of risk assessment that a platform is going to be required to carry out. There is a logic to the way that the Bill is going to operate.
When you look at Clause 11(3), you see that it is not disproportionate. It deals with “primary priority content”. This is not specified in the Bill but it is self-harm and pornography—major content that needs age-gating. Of course we need to have the principles for age assurance inserted into the Bill as well, and of course it will be subject to debate as we go forward.
There is technology to carry out age verification which is far more sophisticated than it ever was, so I very much look forward to that debate. We started that process in Part 3 of the Digital Economy Act. I was described as an internet villain for believing in age verification. I have not changed my view, but the debate will be very interesting. As regards the tail-end of the previous debate, of course we are sympathetic on these Benches to the Wikipedia case. As we said on the last group, I very much hope that we will find a way, whether it is in Schedule 1 or in another way, of making sure that Wikipedia is not affected overly by this—maybe the risk profile that is drawn up by Ofcom will make sure that Wikipedia is not unduly impacted.
Like others, I had prepared quite extensive notes to respond to what I thought the noble Lord was going to say about his amendments in this group, and I have not been able to find anything left that I can use, so I am going to have to extemporise slightly. I think it is very helpful to have a little non-focused discussion about what we are about to talk about in terms of age, because there is a snare and a delusion in quite a lot of it. I was put in mind of that in the discussions on the Digital Economy Act, which of course precedes the Minister but is certainly still alive in our thinking: in fact, we were talking about it earlier today.
The problem I see is that we have to find a way of squaring two quite different approaches. One is to prevent those who should not see material from seeing it, because it is illegal for them to see it. The other is to find a way of ensuring that we do not end up with an age-gated internet, which I am grateful to find that we are all, I think, agreed about: that is very good to know.
Age is very tricky, as we have heard, and it is not the only consideration we have to bear in mind in wondering whether people should be able to gain access to areas of the internet which we know will be bad and difficult for them. That leads us, of course, to the question about legal but harmful, now resolved—or is it? We are going to have this debate about age assurance and what it is. What is age verification? How do they differ? How does it matter? Is 18 a fixed and final point at which we are going to say that childhood ends and adulthood begins, and therefore one is open for everything? It is exactly the point made earlier about how to care for those who should not be exposed to material which, although legal for them by a number called age, is not appropriate for them in any of the circumstances which, clinically, we might want to bring to bear.
I do not think we are going to resolve these issues today—I hope not. We are going to talk about them for ever, but at this stage I think we still need a bit of thinking outside a box which says that age is the answer to a lot of the problems we have. I do not think it is, but whether the Bill is going to carry that forward I have my doubts. How we get that to the next stage, I do not know, but I am looking forward to hearing the Minister’s comments on it.
My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.
The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services—would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective. The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance and it is platforms, not community moderators, who will face enforcement action if they fail to do so.