All 5 Debates between Lord Allan of Hallam and Lord Knight of Weymouth

Wed 19th Jul 2023
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 2
Thu 25th May 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 2
Thu 25th May 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 1
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 1

Online Safety Bill

Debate between Lord Allan of Hallam and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I am grateful to the noble Baroness, Lady Newlove, and the noble Lord, Lord Clement-Jones, for adding their names to Amendment 270A, and to the NSPCC for its assistance in tabling this amendment and helping me to think about it.

The Online Safety Bill has the ambition, as we have heard many times, of making the UK the safest place for a child to be online. Yet, as drafted, it could pass into legislation without a system to ensure that children’s voices themselves can be heard. This is a huge gap. Children are experts in their own lives, with a first-hand understanding of the risks that they face online. It is by speaking to, and hearing from, children directly that we can best understand the harms they face online—what needs to change and how the regulation is working in practice.

User advocates are commonplace in most regulated environments and are proven to be effective. Leading children’s charities such as 5Rights, Barnardo’s and YoungMinds, as well as organisations set up by bereaved parents campaigning for child safety online, such as the Molly Rose Foundation and the Breck Foundation, have joined the NSPCC in calling for the introduction of this advocacy body for children, as set out in the amendment.

I do not wish to detain anyone. The Minister’s response when this was raised in Committee was, in essence, that this should go to the Children’s Commissioner for England. I am grateful to her for tracking me down in a Pret A Manger in Russell Square on Monday and having a chat. She reasonably pointed out that much of the amendment reads a bit like her job description, but she also could see that it is desirable to have an organisation such as the NSPCC set up a UK-wide helpline. There are children’s commissioners for Scotland, Wales and Northern Ireland who are supportive of a national advocacy body for children. She was suggesting—if the Minister agrees that this seems like a good solution—that they could commission a national helpline that works across the United Kingdom, and then advises a group that she could convene, including the children’s commissioners from the other nations of the United Kingdom. If that seems a good solution to the Minister, I do not need to press the amendment, we are all happy and we can get on with the next group. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I just want to make some brief comments in support of the principle of what the noble Lord, Lord Knight, is aiming at in this amendment.

The Bill is going to have a profound impact on children in the United Kingdom. We hope that the most profound impact will be that it will significantly advance their interests in terms of safety online. But it will also potentially have a significant impact on what they can access online and the functionality of different services. They are going to experience new forms of age assurance, about which they may have very strong views. For example, the use of their biometric data to estimate their age will be there to protect them, but they may still have strong views about that.

I have said many times that there may be some measures in the Bill that will encourage services to become 18-plus only. That is not adult in the sense of adult content. Ordinary user-to-user social media services may look at the obligations and say, “Frankly, we would much rather restrict ourselves to users from the UK who identify as being 18-plus, rather than have to take on board all the associated liabilities in respect of children”—not because they are irresponsible, but precisely because they are responsible, and they can see that there is a lot of work to do in order to be legally and safely available to those under 18. For all those reasons, it is really important that the child advocacy body looks at things such as the United Nations Convention on the Rights of the Child and the rights of children to access information, and that it is able to take a view on them.

The reason I think that is important—as will any politician who has been out and spoken in schools—is that very often children are surprising in terms of what they see as their priorities. We make assumptions about their priorities, which can often be entirely wrong. There has been some really good work done on this. There was a project called EU Kids Online, back in the days of the EU, which used to look at children right across the European Union and ask them what their experience of being online was like and what was important to them. There are groups such as Childnet International, which for years has been convening groups of children and taking them to places such as the Internet Governance Forum. That always generates a lot of information that we here would not have thought of, about what children feel is really important to them about their online experience.

For all those reasons, it really would be helpful to institutionalise this in the new regime as some kind of body that looks in the round at children’s interests—their interests to stay safe, but also their interests to be able to access a wide variety of online services and to use the internet as they want to use it. I hope that that strengthens the case the noble Lord, Lord Knight, has made for such a body to exist in some kind of coalition-like format.

Online Safety Bill

Debate between Lord Allan of Hallam and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.

We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.

Lord Allan of Hallam (LD)

Can the noble Lord confirm whether he generated those comments himself, or was he on his phone while we were speaking?

Lord Knight of Weymouth (Lab)

I do not have an invisible earpiece feeding me my lines—that was all human-generated. I beg leave to withdraw the amendment.

Online Safety Bill

Debate between Lord Allan of Hallam and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, as we have said many times, this is a complex Bill. As we reflect on the priorities for Report, we can be more relaxed about some of the specifics on how Ofcom may operate, thereby giving it more flexibility—the flexibility it needs to be agile in the online world—if we as a Parliament trust Ofcom. Building trust, I believe, is a triangulation. First, there is independence from government—as discussed in respect of Secretary of State powers. Secondly, we need proper scrutiny by Parliament. Earlier today I talked about my desire for there to be proper post-legislative scrutiny and a permanent Joint Committee to do that. The third leg of the stool is the transparency to assist that scrutiny.

Clause 68 contains the provisions which would require category 1, 2A and 2B services to produce an annual transparency report containing information described by Ofcom in a notice given to the service. Under these provisions, Ofcom would be able to require these services to report on, among other things: information about the incidence of illegal content and content that is harmful to children; how many users are assumed to have encountered this content by means of the service; the steps and processes for users to report this content; and the steps and processes which a provider uses for dealing with this content.

We welcome the introduction of transparency reporting in relation to illegal content and content that is harmful to children. We agree with the Government that effective transparency reporting plays a crucial role in building Ofcom’s understanding of online harms and empowering users to make a more informed choice about the services they use.

However, despite the inclusion of transparency reporting in the Bill representing a step in the right direction, we consider that these requirements could and should be strengthened to do the trust building we think is important. First, the Bill should make clear that, subject to appropriate redactions, companies will be required to make their transparency reports publicly available—to make them transparent—hence Amendment 160A.

Although it is not clear from the Bill whether companies will be required to make these reports publicly available, we consider that, in most instances, such a requirement would be appropriate. As noted, one of the stated purposes of transparency reporting is that it would enable service users to make more informed choices about their own and their children’s internet use—but they can only do so if the reports are published. Moreover, in so far as transparency reporting would facilitate public accountability, it could also act as a powerful incentive for service providers to do more to protect their users.

We also recognise that requiring companies to publish the incidences of CSEA content on their platforms, for instance, may have the effect of encouraging individuals seeking such material towards platforms on which there are high incidences of that content—that must be avoided. I recognise that simply having a high incidence of CSEA content on a platform does not necessarily mean that that platform is problematic; it could just mean that it is better at reporting it. So, as ever with the Bill, there is a balance to be struck.

Therefore, we consider that the Bill should make it explicit that, once provided to Ofcom, transparency reports are to be made publicly available, subject to redactions. To support this, Ofcom should be required to produce guidance on the publication of transparency reports and the redactions that companies should make before making reports publicly accessible. Ofcom should also retain the power to stop a company from publishing a particular transparency report if it considers that the risk of directing individuals to illegal materials outweighs the benefit of making a report public—hence Amendments 160B and 181A.

Amendments 165 and 229 are in my noble friend Lord Stevenson’s name. Amendment 165 would broaden the transparency requirements around user-to-user services’ terms of service, ensuring that information can be sought on the scope of these terms, not just their application. As I understand it, scope is important to understand, as it is significant in informing Ofcom’s regulatory approach. We are trying to guard against minimal terms of service where detail is needed for users and Ofcom.

The proposed clause in Amendment 229 probes how Ofcom will review the effectiveness of the transparency requirements in the Bill. It would require Ofcom to undertake a review of the effectiveness of transparency reports within three years and every five years thereafter, and it would give the Secretary of State powers to implement any recommendations made by the regulator. The Committee should note that we also include a requirement that a Select Committee, charged by the relevant House, must consider and report on the regulations, with an opportunity for Parliament to debate them. So we link the three corners of the triangle rather neatly there.

If we agree that transparency is an important part of building trust in Ofcom in doing this difficult and innovative regulatory job—it is always good to see the noble Lord, Lord Grade, in his place; I know he is looking forward to getting on with this—then this proposed clause is sensible. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.

In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments—there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.

For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.

The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.

The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.

It is also relevant to Amendment 165 and the terms of service that the more granular and detailed your terms of service are, the better they are for public information, but there are opportunities to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.

Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that once you produced one, nobody was interested. For fear of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, having produced a transparency report, every time we published it, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.

I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.

Online Safety Bill

Debate between Lord Allan of Hallam and Lord Knight of Weymouth
Lord Allan of Hallam (LD)

I am grateful for that intervention as well. That summarises the core questions that we have for the Minister. Of the three areas that we have for him, the first is the question of scope and the extent to which he can assure us that the Bill as drafted will be robust in covering the metaverse and bots, which are the issues that have been raised today. The second is on behaviours and relates to the two interventions that we have just had. We have been asking whether, with the behaviours that are criminal today, that criminality will stretch to new, similar forms of behaviour taking place in new environments—let us put it that way. The behaviour, the intent and the harm are the same, but the environment is different. We want to understand the extent to which the Government are thinking about that, where that thinking is happening and how confident they are that they can deal with that.

Finally, on the question of agency, how do the Government expect to deal with the fact that we will have machines operating in a user-to-user environment when the connection between the machine and another individual user is qualitatively different from anything that we have seen before? Those are just some small questions for the Minister on this Thursday afternoon.

Lord Knight of Weymouth (Lab)

My Lords, the debate on this group has been a little longer, deeper and more important than I had anticipated. It requires all of us to reflect before Report on some of the implications of the things we have been talking about. It was introduced masterfully by the noble Baroness, Lady Harding, and her comments—and those from the noble Baronesses, Lady Finlay and Lady Berridge—were difficult to listen to at times. I also congratulate the Government Whip on the way he handled the situation so that innocent ears were not subject to some of that difficult listening. But the questions around the implications of virtual reality, augmented reality and haptic technology are really important, and I hope the Minister will agree to meet with the noble Baroness, Lady Berridge, and the people she referenced to reflect on some of that.

Online Safety Bill

Debate between Lord Allan of Hallam and Lord Knight of Weymouth
Lord Allan of Hallam (LD)

My Lords, it falls to me to inject some grit into what has so far been a very harmonious debate, as I will raise some concerns about Amendments 2 and 22.

I again declare my interest: I spent 10 years working for Facebook, doing the kind of work that we will regulate in this Bill. At this point noble Lords are probably thinking, “So it’s his fault”. I want to stress that, if I raise concerns about the way the regulation is going, it is not that I hold those views because I used to work for the industry; rather, I felt comfortable working in the industry because I always had those views, back to 2003 when we set up Ofcom. I checked the record, and I said things then that are remarkably consistent with how I feel today about how we need to strike the balance between the power of the state and the power of the citizen to use the internet.

I also should declare an interest in respect of Amendment 2, in that I run a blog called regulate.tech. I am not sure how many children are queueing up to read my thoughts about regulation of the tech industry, but they would be welcome to do so. The blog’s strapline is:

“How to regulate the internet without breaking it”.

It is very much in that spirit that I raise concerns about these two amendments.

I certainly understand the challenges for content that is outside of the user-to-user or search spaces. I understand entirely why the noble Baroness, Lady Kidron, feels that something needs to be done about that content. However, I am not sure that this Bill is the right vehicle to address that kind of content. There are principled and practical reasons why it might be a mistake to extend the remit here.

The principle is that the Bill’s fundamental purpose is to restrict access to speech by people in the United Kingdom. That is what legislation such as this does: it restricts speech. We have a framework in the Human Rights Act, which tells us that when we restrict speech we have to pass a rigorous test to show that those restrictions are necessary and proportionate to the objective we are trying to achieve. Clearly, when dealing with children, we weight very heavily in that test whether something is necessary and proportionate in favour of the interest of the welfare of the children, but we cannot do away with the test altogether.

It is clear that the Government have applied that test over the years that they have been preparing this Bill and determined that there is a rationale for intervention in the context of user-to-user services and search services. At the same time, we see in the Bill that the Government’s decision is that intervention is not justified in all sorts of other contexts. Email and SMS are excluded. First-party publisher content is excluded, so none of the media houses will be included. We have a Bill that is very tightly and specifically framed around dealing with intermediaries, whether that is user-to-user intermediaries who intermediate in user-generated content, or search as an intermediary, which scoops up content from across the internet and presents it to you.

This Bill is about regulating the regulators; it is not about regulating first-party speakers. A whole world of issues will come into play if we move into that space. It does not mean that it is not important, just that it is different. There is a common saying that people are now bandying around, which is that freedom of speech is not freedom of reach. To apply a twist to that, restrictions on reach are not the same as restrictions on speech. When we talk about restricting intermediaries, we are talking about restricting reach. If I have something I want to say and Facebook or Twitter will not let me say it, that is a problem and I will get upset, but it is not the same as being told that I cannot say it anywhere on the internet.

My concern about Amendment 2 is that it could lead us into a space where we are restricting speech across the internet. If we are going to do that—there may be a rationale for doing it—we will need to go back and look at our necessity and proportionality test. It may play out differently in that context from user-to-user or intermediary-based services.

From a practical point of view, we have a Bill that, we are told, will give Ofcom the responsibility of regulating 25,000 more or less different entities. They will all be asked to pay money to Ofcom and will all be given a bunch of guidance and duties that they have to fulfil. Again, those duties, as set out in painful length in the Bill, are very specifically about the kind of things that an intermediary should do to its users. If we were to be regulating blogs or people’s first-party speech, or publishers, or the Daily Telegraph, or whoever else, I think we would come up with a very different set of duties from the duties laid out in the Bill. I worry that, however well-motivated, Amendment 2 leads us into a space for which this Bill is not prepared.

I have a lot of sympathy with the views of the noble Baroness, Lady Harding, around the app stores. They are absolutely more like intermediaries, or search, but again the tools in the Bill are not necessarily dedicated to how one would deal with app stores. I was interested in the comments of the noble Baroness, Lady Stowell, on what will be happening to our competition authorities; a lot will be happening in that space. On app stores, I worry about what is in Amendment 22: we do not want app stores to think that it is their job to police the content of third-party services. That is Ofcom’s job. We do not want the app stores to get in the middle, not least because of these commercial considerations. We do not want Apple, for instance, thinking that, to comply with UK legislation, it might determine that WhatsApp is unsafe while iMessage is safe. We do not want Google, which operates Play Store, to think that it would have a legal rationale for determining that TikTok is unsafe while YouTube is safe. Again, I know that this is not the noble Baroness’s intention or aim, but clearly there is a risk that we open that up.

There is something to be done about app stores but I do not think that we can roll over the powers in the Bill. When we talk about intermediaries such as user-to-user services and search, we absolutely want them to block bad content. The whole thrust of the Bill is about forcing them to restrict bad content. When it comes to app stores, the noble Baroness set out some of her concerns, but I think we want something quite different. I hesitate to say this, as I know that my noble friend is supportive of it, but I think that it is important as we debate these issues that we hear some of those concerns.

Lord Knight of Weymouth (Lab)

Could it not be argued that the noble Lord is making a case for regulation of app stores? Let us take the example of Apple’s dispute with “Fortnite”, where Apple is deciding how it wants to police things. Perhaps if this became a more regulated space Ofcom could help make sure that there was freedom of access to some of those different products, regardless of the commercial interests of the people who own the app stores.

Lord Allan of Hallam (LD)

The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.

There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.