Lord Allan of Hallam (LD)

My Lords, it falls to me to inject some grit into what has so far been a very harmonious debate, as I will raise some concerns about Amendments 2 and 22.

I again declare my interest: I spent 10 years working for Facebook, doing the kind of work that we will regulate in this Bill. At this point noble Lords are probably thinking, “So it’s his fault”. I want to stress that, if I raise concerns about the way the regulation is going, it is not that I hold those views because I used to work for the industry; rather, I felt comfortable working in the industry because I always had those views, back to 2003 when we set up Ofcom. I checked the record, and I said things then that are remarkably consistent with how I feel today about how we need to strike the balance between the power of the state and the power of the citizen to use the internet.

I also should declare an interest in respect of Amendment 2, in that I run a blog called regulate.tech. I am not sure how many children are queueing up to read my thoughts about regulation of the tech industry, but they would be welcome to do so. The blog’s strapline is:

“How to regulate the internet without breaking it”.


It is very much in that spirit that I raise concerns about these two amendments.

I certainly understand the challenges for content that is outside of the user-to-user or search spaces. I understand entirely why the noble Baroness, Lady Kidron, feels that something needs to be done about that content. However, I am not sure that this Bill is the right vehicle to address that kind of content. There are principled and practical reasons why it might be a mistake to extend the remit here.

The principle is that the Bill’s fundamental purpose is to restrict access to speech by people in the United Kingdom. That is what legislation such as this does: it restricts speech. We have a framework in the Human Rights Act, which tells us that when we restrict speech we have to pass a rigorous test to show that those restrictions are necessary and proportionate to the objective we are trying to achieve. Clearly, when dealing with children, we weight that test very heavily in favour of the welfare of the children, but we cannot do away with the test altogether.

It is clear that the Government have applied that test over the years that they have been preparing this Bill and determined that there is a rationale for intervention in the context of user-to-user services and search services. At the same time, we see in the Bill that the Government’s decision is that intervention is not justified in all sorts of other contexts. Email and SMS are excluded. First-party publisher content is excluded, so none of the media houses will be included. We have a Bill that is very tightly and specifically framed around dealing with intermediaries, whether that is user-to-user intermediaries who intermediate in user-generated content, or search as an intermediary, which scoops up content from across the internet and presents it to you.

This Bill is about regulating the regulators; it is not about regulating first-party speakers. A whole world of issues will come into play if we move into that space. It does not mean that it is not important, just that it is different. There is a common saying that people are now bandying around, which is that freedom of speech is not freedom of reach. To apply a twist to that, restrictions on reach are not the same as restrictions on speech. When we talk about restricting intermediaries, we are talking about restricting reach. If I have something I want to say and Facebook or Twitter will not let me say it, that is a problem and I will get upset, but it is not the same as being told that I cannot say it anywhere on the internet.

My concern about Amendment 2 is that it could lead us into a space where we are restricting speech across the internet. If we are going to do that—there may be a rationale for doing it—we will need to go back and look at our necessity and proportionality test. It may play out differently in that context from user-to-user or intermediary-based services.

From a practical point of view, we have a Bill that, we are told, will give Ofcom the responsibility of regulating 25,000 more or less different entities. They will all be asked to pay money to Ofcom and will all be given a bunch of guidance and duties that they have to fulfil. Again, those duties, as set out in painful length in the Bill, are very specifically about the kind of things that an intermediary should do to its users. If we were to be regulating blogs or people’s first-party speech, or publishers, or the Daily Telegraph, or whoever else, I think we would come up with a very different set of duties from the duties laid out in the Bill. I worry that, however well-motivated, Amendment 2 leads us into a space for which this Bill is not prepared.

I have a lot of sympathy with the views of the noble Baroness, Lady Harding, around the app stores. They are absolutely more like intermediaries, or search, but again the tools in the Bill are not necessarily dedicated to how one would deal with app stores. I was interested in the comments of the noble Baroness, Lady Stowell, on what will be happening to our competition authorities; a lot will be happening in that space. On app stores, I worry about what is in Amendment 22: we do not want app stores to think that it is their job to police the content of third-party services. That is Ofcom’s job. We do not want the app stores to get in the middle, not least because of these commercial considerations. We do not want Apple, for instance, thinking that, to comply with UK legislation, it might determine that WhatsApp is unsafe while iMessage is safe. We do not want Google, which operates Play Store, to think that it would have a legal rationale for determining that TikTok is unsafe while YouTube is safe. Again, I know that this is not the noble Baroness’s intention or aim, but clearly there is a risk that we open that up.

There is something to be done about app stores but I do not think that we can roll over the powers in the Bill. When we talk about intermediaries such as user-to-user services and search, we absolutely want them to block bad content. The whole thrust of the Bill is about forcing them to restrict bad content. When it comes to app stores, the noble Baroness set out some of her concerns, but I think we want something quite different. I hesitate to say this, as I know that my noble friend is supportive of it, but I think that it is important as we debate these issues that we hear some of those concerns.

Lord Knight of Weymouth (Lab)

Could it not be argued that the noble Lord is making a case for regulation of app stores? Let us take the example of Apple’s dispute with “Fortnite”, where Apple is deciding how it wants to police things. Perhaps if this became a more regulated space Ofcom could help make sure that there was freedom of access to some of those different products, regardless of the commercial interests of the people who own the app stores.

Lord Allan of Hallam (LD)

The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.

There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.

Baroness Harding of Winscombe (Con)

Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?

Lord Allan of Hallam (LD)

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

Baroness Kidron (CB)

I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.

I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.

I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if a service is likely to be accessed by children under that test, and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.

Lord Allan of Hallam (LD)

I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.

My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.

That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.

Baroness Healy of Primrose Hill (Lab)

I strongly support the amendments in the name of the noble Baroness, Lady Kidron, because I want to see this Bill implemented but strengthened in order to fulfil the admirable intention that children must be safe wherever they are online. This will not be the case unless child safety duties are applicable in all digital environments likely to be accessed by children. This is not overly ambitious or unrealistic; the platforms need clarity as to these new responsibilities and Ofcom must be properly empowered to enforce the rules without worrying about endless legal challenges. These amendments will give that much-needed clarity in this complex area.

As the Joint Committee recommended, this regulatory alignment would simplify compliance with businesses while giving greater clarity to people who use the service and greater protection for children. It would give confidence to parents and children that they need not work out if they are in a regulated or unregulated service while online. The Government promised that the onus for keeping young people safe online would sit squarely on the tech companies’ shoulders.

Without these amendments, there is a real danger that a loophole will remain whereby some services, even those that are known to harm, are exempt, leaving thousands of children exposed to harm. They would also help to future-proof the Bill. For example, some parts of the metaverse as yet undeveloped may be out of scope, but already specialist police units have raised concerns that abuse rooms, limited to one user, are being used to practise violence and sexual violence against women and girls.

We can and must make this good Bill even better and support all the amendments in this group.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has powers of enforcement set out, which require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.

Lord Allan of Hallam (LD)

My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.

Lord Parkinson of Whitley Bay (Con)

I will be very happy to set that out in more detail.

Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.

--- Later in debate ---
Lord Moylan (Con)

My Lords, before I speak to my Amendment 9, which I will be able to do fairly briefly because a great deal of the material on which my case rests has already been given to the Committee by the noble Baroness, Lady Fox of Buckley, I will make the more general and reflective point that there are two different views in the Committee that somehow need to be reconciled over the next few weeks. There is a group of noble Lords who are understandably and passionately concerned about child safety. In fact, we all share that concern. There are others of us who believe that this Bill, its approach and the measures being inserted into it will have massive ramifications outside the field of child safety, for adults, of course, but also for businesses, as the noble Baroness explained. The noble Baroness and I, and others like us, believe that these are not sufficiently taken into account either by the Bill or by those pressing for measures to be harsher and more restrictive.

Some sort of balance needs to be found. At Second Reading my noble friend the Minister said that the balance had been struck in the right place. It is quite clear that nobody really agrees with that, except on the principle, which I think is always a cop-out, that if everyone disagrees with you, you must be right, which I have never logically understood in any sense at all. I hope my noble friend will not resort to claiming that he has got it right simply because everyone disagrees with him in different ways.

My amendment is motivated by the considerations set out by the noble Baroness, which I therefore do not need to repeat. It is the Government’s own assessment that between 20,000 and 25,000 businesses will be affected by the measures in this Bill. A great number of those—some four-fifths—are small businesses or micro-businesses. The Government appear to think in their assessment that only 120 of those are high risk. The reason they think they are high risk is not that they are engaged in unpleasant activities but simply that they are engaged in livestreaming and contacting new people. That might be for nefarious purposes but equally, it might not, so the 120 we need to worry about could actually be a very small number. We handle this already through our own laws; all these businesses would still be subject to existing data protection laws and complying with the law generally on what they are allowed to publish and broadcast. It would not be a free-for-all or a wild west, even among that very small number of businesses.

My Amendment 9 takes a slightly different approach to dealing with this. I do not in any way disagree with or denigrate the approach taken by the noble Baroness, Lady Fox, but my approach would be to add two categories to the list of exemptions in the schedules. The first of these is services provided by small and medium-sized enterprises. We do not have to define those because there is already a law that helps define them for us: Section 33 of the Small Business, Enterprise and Employment Act 2015. My proposal is that we take that definition, and that those businesses that comply with it be outside the scope of the Bill.

The second area that I would propose exempting was also referred to by the noble Baroness, Lady Fox of Buckley: community-based services. The largest of these, and the one that frequently annoys us because it gets things wrong, is Wikipedia. I am a great user of Wikipedia but I acknowledge that it does make errors. Of course, most of the errors it makes, such as saying, “Lord Moylan has a wart on the end of his nose”, would not be covered by the Bill anyway. Nothing in the Bill will force people to correct factual statements that have been got wrong—my year of birth or country of birth, or whatever. That is not covered. Those are the things they usually get wrong and that normally annoy us when we see them.

However, I do think that these services are extremely valuable. Wikipedia is an immense achievement and a tremendous source of knowledge and information for people. The fact that it has been put together in this organic, community-led way over a number of years, in so many languages, is a tremendous advantage and a great human advance. Yet, under the proposed changes, Wikipedia would not be able to operate its existing model of people posting their comments.

Currently, you go on Wikipedia and you can edit it. Now, I know this would not apply to any noble Lords but, in the other place, it has been suggested that MPs have discovered how to do this. They illicitly and secretly go on and edit their own pages, usually in a flattering way, so it is possible to do this. There is no prior restraint, and no checking in advance. There are moderators at Wikipedia—I do not know whether they are employed—who review what has been done over a period, but they do not do what this Bill requires, which is checking in advance.

It is not simply about Wikipedia; there are other community sites. Is it sensible that Facebook should be responsible if a little old lady alters the information on a community Facebook page about what is happening in the local parish? Why should Facebook be held responsible for that? Why would we want it to be responsible for it—and how could it do it without effectively censoring ordinary activities that people want to carry out, using the advantages of the internet that have been so very great?

What I am asking is not dramatic. We have many laws in which we very sensibly create exemptions for small and medium-sized enterprises. I am simply asking that this law be considered under that heading as well, and similarly for Wikipedia and community-based sites. It is slightly unusual that we have had to consider that; it is not normal, but it is very relevant to this Bill and I very much hope the Government will agree to it.

The answer that I would not find satisfactory—I say this in advance for the benefit of my noble friend the Minister, in relation to this and a number of other amendments I shall be moving in Committee—is that it will all be dealt with by Ofcom. That would not be good enough. We are the legislators and we want to know how these issues will be dealt with, so that the legitimate objectives of the Bill can be achieved without causing massive disruption, cost and disbenefit to adults.

Lord Allan of Hallam (LD)

My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.

The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.

There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.

The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.

Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.

Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.

Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?

Baroness Kidron (CB)

I will speak to a couple of the amendments in this group. First, small is not safe, and you cannot necessarily see these platforms in isolation. For example, there is an incel group that has only 4,000 active users, but it posts a great deal on YouTube and has 24.2 million users in that context. So we have to be clear that small and safe are not the same thing.

However, I am sympathetic to the risk-based approach. I should probably have declared an interest as someone who has given money to Wikipedia on several occasions to keep it going. I ask the Minister for some clarity on the systems and processes of the Bill, and whether the risk profile of Wikipedia—which does not entice you in and then follow you for the next six months once you have looked at something—is far lower than something very small that gets hold of you and keeps on going. I say that particularly in relation to children, but I feel it for myself also.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. If one is permitted to say this in the digital age, I am on exactly the same page as she is.

There are two elements to the debate on this group. It is partly about compliance, and I absolutely understand the point about the costs of that, but I also take comfort from some of the things that the noble Lord, Lord Vaizey, said about the way that Ofcom is going to deliver the regulation and the very fact that this is going to be largely not a question of interpretation of the Act, when it comes down to it, but is going to be about working with the codes of practice. That will be a lot more user-friendly than simply having to go to expensive expert lawyers, as the noble Baroness, Lady Fox, said—not that I have anything against expensive expert lawyers.

I am absolutely in agreement with the noble Baroness, Lady Kidron, that small is not safe. As the noble Baroness, Lady Harding, described, small can become big. We looked at this in our Joint Committee and recommended to the Government that they should take a more nuanced approach to regulation, based not just on size and high-level functionality but on factors such as risk, reach, user base, safety performance and business model. All those are extremely relevant but risk is the key, right at the beginning. The noble Baroness, Lady Fox, also said that Reddit should potentially be outside, but Reddit has had its own problems, as we know. On that front, I am on absolutely the same page as those who have spoken about keeping us where we are.

The noble Lord, Lord Moylan, has been very cunning in the way that he has drawn up his Amendment 9. I am delighted to be on the same page as my noble friend—we are making progress—but I agree only with the first half of the amendment because, like the noble Baroness, Lady Kidron, I am a financial contributor to Wikipedia. A lot of us depend on Wikipedia; we look up the ages of various Members of this House when we see them in full flight and think, “Good heavens!” Biographies are an important part of this area. We have all had Jimmy Wales saying, as soon as we get on to Wikipedia, “You’ve already looked at Wikipedia 50 times this month. Make a contribution”, and that is irresistible. There is quite a strong case there. It is risk-based so it is not inconsistent with the line taken by a number of noble Lords in all this. I very much hope that we can get something out of the Minister—maybe some sort of sympathetic noises for a change—at this stage so that we can work up something.

I must admit that the briefing from Wikimedia, which many of us have had, was quite alarming. If the Bill means that we do not have users in high-risk places then we will find that adults get their information from other sources that are not as accurate as Wikipedia—maybe from ChatGPT or GPT-4, which the noble Lord, Lord Knight, is clearly very much an expert in—and that marginalised websites are shut down.

Lord Allan of Hallam (LD)

For me, one of the features of the schedule’s list of exempted sites is foreign state entities. Therefore, we could end up in the absurd situation where you could not read about the Ukraine war on Wikipedia, but you would be able to read about the Ukraine war on the Russian Government website.

Lord Clement-Jones (LD)

My Lords, if we needed an example of something that gave us cause for concern, that would be it; but a very good case has been made, certainly for the first half of the amendment in the name of the noble Lord, Lord Moylan, and we on these Benches support it.

--- Later in debate ---
Lord Moylan (Con)

My Lords, I have to start with a slightly unprofessional confession. I accepted the Bill team’s suggestion on how my amendments might be grouped after I had grouped them rather differently. The result is that I am not entirely clear why some of these groupings are quite as they are. As my noble friend the Minister said, my original idea of having Amendments 9, 10 and 11 together would perhaps have been better, as it would have allowed him to give a single response on Wikipedia. Amendments 10 and 11 in this group relate to Wikipedia and services like it.

I am, I hope, going to cause the Committee some relief as I do not intend to repeat remarks made in the previous group. The extent to which my noble friend wishes to amplify his comments in response to the previous group is entirely a matter for him, since he said he was reserving matter that he would like to bring forward but did not when commenting on the previous group. If I do not speak further on Amendments 10 and 11, it is not because I am not interested in what my noble friend the Minister might have to say on the topic of Wikipedia.

To keep this fairly brief, I turn to Amendment 26 on age verification. I think we have all agreed in the Chamber that we are united in wanting to see children kept safe. Clause 11(3), on page 10 of the Bill, states that there will be a duty to

“prevent children of any age from encountering”

this content—“prevent” them “encountering” is extremely strong. We do not prevent children encountering the possibility of buying cigarettes or encountering the possibility of being injured crossing the road, but we are to prevent children from these encounters. It is strongly urged in the clause—it is given as an example—that age verification will be required for that purpose.

Of course, age verification works only if it applies to everybody: one does not ask just the children to prove their age; one has to ask everybody online. Unlike when I go to the bar in a pub, my grey hair cannot be seen online. So this provision will almost certainly have to extend to the entire population. In Clause 11(3)(b), we have an obligation to protect. Clearly, the Government intend a difference between “prevent” and “protect”, or they would not have used two different verbs, so can my noble friend the Minister explain what is meant by the distinction between “prevent” and “protect”?

My amendment would remove Clause 11(3) completely. But it is, in essence, a probing amendment and what I want to hear from the Government, apart from how they interpret the difference between “prevent” and “protect”, is how they expect this duty to be carried out without having astonishingly annoying and deterring features built into every user-to-user platform and website, so that every time one goes on Wikipedia—in addition to dealing with the GDPR, accepting cookies and all the other nonsense we have to go through quite pointlessly—we then have to provide age verification of some sort.

What mechanism that might be, I do not know. I am sure that there are many mechanisms available for age verification. I do not wish to get into a technical discussion about what particular techniques might be used—I accept that there will be a range and that they will respond and adapt in the light of demand and technological advance—but I would like to know what my noble friend the Minister expects and how wide he thinks the obligation will be. Will it be on the entire population, as I suspect? Focusing on that amendment—and leaving the others to my noble friend the Minister to respond to as he sees fit—and raising those questions, I think that the Committee would like to know how the Government imagine that this provision will work. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.

There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.

Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not create all kinds of new legal operations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.

On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.

It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.

Baroness Kidron (CB)

Far be it from me to suggest that all the amendments tabled by the noble Lord, Lord Moylan, are in the wrong place, but I think that Amendment 26 might have been better debated with the other amendments on age assurance.

On community moderation, I underscore the point that Ofcom must have a risk profile as part of its operations. When we get to that subject, let us understand what Ofcom intends to do with it—maybe we should instruct Ofcom a little about what we would like it to do with it for community moderation. I have a lot of sympathy—but do not think it is a get-out clause—with seeing some spaces as less risky, or, at least, for determining what risky looks like in online spaces, which is a different question. This issue belongs in the risk profile: it is not about taking things out; we have to build it into the Bill we have.

On age assurance and AV, I do not think that today is the day to discuss it in full. I disagree with the point that, because we are checking kids, we have to check ourselves—that is not where the technology is. Without descending into technical arguments, as the noble Lord, Lord Moylan, asked us not to, we will bring some of those issues forward.

The noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford have a package of amendments which are very widely supported across the Committee. They have put forward a schedule of age assurance that says what the rules of the road are. We must stop pretending that age assurance is something that is being invented now in this Bill. If you log into a website with your Facebook login, it shares your age—and that is used by 42% of people online. However, if you use an Apple login, it does not share your age, so I recommend using Apple—but, interestingly, it is harder to find that option on websites, because websites want to know your age.

So, first, we must not treat age assurance as if it has just been invented. Secondly, we need to start to have rules of the road, and ask what is acceptable, what is proportionate, and when we will have zero tolerance. Watching faces around the Committee, I say that I will accept zero tolerance for pornography and some other major subjects, but, for the most part, age assurance is something that we need to have regulated. Currently, it is being done to us rather than in any way that is transparent or agreed, and that is very problematic.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.

Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.

Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.

It is also important that Ofcom is fully appraised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.

These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.

Lord Allan of Hallam (LD)

My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.

The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.

Baroness Kidron (CB)

I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited only to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.

My second question is to do with Amendments 64A and 88A. It seems to me—forgive me if I am wrong—that the Bill previously stipulated that all regulated search and user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but now it seems to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.