Online Safety Bill Debate
Lord Knight of Weymouth (Labour - Life peer)
Lords Chamber

My Lords, it falls to me to inject some grit into what has so far been a very harmonious debate, as I will raise some concerns about Amendments 2 and 22.
I again declare my interest: I spent 10 years working for Facebook, doing the kind of work that we will regulate in this Bill. At this point noble Lords are probably thinking, “So it’s his fault”. I want to stress that, if I raise concerns about the way the regulation is going, it is not that I hold those views because I used to work for the industry; rather, I felt comfortable working in the industry because I always had those views, back to 2003 when we set up Ofcom. I checked the record, and I said things then that are remarkably consistent with how I feel today about how we need to strike the balance between the power of the state and the power of the citizen to use the internet.
I also should declare an interest in respect of Amendment 2, in that I run a blog called regulate.tech. I am not sure how many children are queueing up to read my thoughts about regulation of the tech industry, but they would be welcome to do so. The blog’s strapline is:
“How to regulate the internet without breaking it”.
It is very much in that spirit that I raise concerns about these two amendments.
I certainly understand the challenges for content that is outside of the user-to-user or search spaces. I understand entirely why the noble Baroness, Lady Kidron, feels that something needs to be done about that content. However, I am not sure that this Bill is the right vehicle to address that kind of content. There are principled and practical reasons why it might be a mistake to extend the remit here.
The principle is that the Bill’s fundamental purpose is to restrict access to speech by people in the United Kingdom. That is what legislation such as this does: it restricts speech. We have a framework in the Human Rights Act, which tells us that when we restrict speech we have to pass a rigorous test to show that those restrictions are necessary and proportionate to the objective we are trying to achieve. Clearly, when dealing with children, we weight that test very heavily in favour of the welfare of the child, but we cannot do away with the test altogether.
It is clear that the Government have applied that test over the years that they have been preparing this Bill and determined that there is a rationale for intervention in the context of user-to-user services and search services. At the same time, we see in the Bill that the Government’s decision is that intervention is not justified in all sorts of other contexts. Email and SMS are excluded. First-party publisher content is excluded, so none of the media houses will be included. We have a Bill that is very tightly and specifically framed around dealing with intermediaries, whether that is user-to-user intermediaries who intermediate in user-generated content, or search as an intermediary, which scoops up content from across the internet and presents it to you.
This Bill is about regulating the intermediaries; it is not about regulating first-party speakers. A whole world of issues will come into play if we move into that space. It does not mean that it is not important, just that it is different. There is a common saying that people are now bandying around, which is that freedom of speech is not freedom of reach. To apply a twist to that, restrictions on reach are not the same as restrictions on speech. When we talk about restricting intermediaries, we are talking about restricting reach. If I have something I want to say and Facebook or Twitter will not let me say it, that is a problem and I will get upset, but it is not the same as being told that I cannot say it anywhere on the internet.
My concern about Amendment 2 is that it could lead us into a space where we are restricting speech across the internet. If we are going to do that—there may be a rationale for doing it—we will need to go back and look at our necessity and proportionality test. It may play out differently in that context from user-to-user or intermediary-based services.
From a practical point of view, we have a Bill that, we are told, will give Ofcom the responsibility of regulating some 25,000 different entities. They will all be asked to pay money to Ofcom and will all be given a bunch of guidance and duties that they have to fulfil. Again, those duties, as set out at painful length in the Bill, are very specifically about the kind of things that an intermediary should do for its users. If we were regulating blogs, or people’s first-party speech, or publishers, or the Daily Telegraph, or whoever else, I think we would come up with a very different set of duties from those laid out in the Bill. I worry that, however well motivated, Amendment 2 leads us into a space for which this Bill is not prepared.
I have a lot of sympathy with the views of the noble Baroness, Lady Harding, around the app stores. They are absolutely more like intermediaries, or search, but again the tools in the Bill are not necessarily suited to dealing with app stores. I was interested in the comments of the noble Baroness, Lady Stowell, on what will be happening to our competition authorities; a lot will be happening in that space. On app stores, I worry about what is in Amendment 22: we do not want app stores to think that it is their job to police the content of third-party services. That is Ofcom’s job. We do not want the app stores to get in the middle, not least because of commercial considerations. We do not want Apple, for instance, thinking that, to comply with UK legislation, it might determine that WhatsApp is unsafe while iMessage is safe. We do not want Google, which operates the Play Store, to think that it would have a legal rationale for determining that TikTok is unsafe while YouTube is safe. Again, I know that this is not the noble Baroness’s intention or aim, but clearly there is a risk that we open that up.
There is something to be done about app stores, but I do not think that we can simply roll over the powers in the Bill. When we talk about intermediaries such as user-to-user services and search, we absolutely want them to block bad content; the whole thrust of the Bill is about forcing them to restrict bad content. When it comes to app stores, the noble Baroness set out some of her concerns, but I think we want something quite different. I hesitate to say this, as I know that my noble friend is supportive of these amendments, but I think it is important, as we debate these issues, that we hear some of those concerns.
Could it not be argued that the noble Lord is making a case for regulation of app stores? Let us take the example of Apple’s dispute with “Fortnite”, where Apple is deciding how it wants to police things. Perhaps if this became a more regulated space Ofcom could help make sure that there was freedom of access to some of those different products, regardless of the commercial interests of the people who own the app stores.
The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.
There is a role for regulating app stores, one which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or at least inappropriate, set of tools for app stores.
My Lords, I echo the comments of the noble Lord, Lord Clement-Jones. This is an important group of amendments, and it has been a useful debate. I was slightly concerned when I heard the noble Baroness, Lady Harding, talk about using her daughter’s device to see whether it could access porn sites, given what that will do to her daughter’s algorithm and what it will now feed her. I will put that concern to one side, but any future report on that would be most welcome.
Amendments 2, 3 and 5, introduced so well by the noble Baroness, Lady Kidron, test what should be in scope to protect children. Clearly, we have a Bill that has evolved over some time, under many Ministers, to cover unambiguously social media, as user-to-user content, and search. I suspect that we will spend a lot more time discussing social media than search, but I get the rationale that those are perhaps the two main access points for a lot of the content we are concerned about. However, I would argue that apps are also main access points. I will come on to discuss the amendments in the name of the noble Baroness, Lady Harding, which I have also signed. If we are going to go with access points, it is worth probing and testing the Government’s intent in excluding some of these other things. The noble Lord, Lord Storey, raises in his amendments the issue of games, as others have done. Games are clearly a point of access for lots of children, as well as adults, and plenty of harm can be created as a result of consuming them.
Along with some other noble Lords, some time ago I attended an all-party group which looked at the problems related to incel harm online and how people are breadcrumbed from mainstream sites to quite small websites to access the really problematic, most hateful and most dangerous content. Those small websites, as far as I can see, are currently excluded from the regime in the Bill, but the amendments in the name of the noble Baroness, Lady Kidron, potentially would bring them into scope. That meeting also discussed cloud services and the supply chain of technical infrastructure that such sites, including incel forums and others, rely on. Why are cloud services not included in some context in terms of the harms that might be created?
Questions have been asked about large language model AIs such as ChatGPT. These are future technologies that have now arrived, which lots of people are talking about and variously freaking out about or getting excited by. There is an important need to bring those quite quickly into the scope of regulation by Ofcom. ChatGPT is a privately owned platform—a privately owned technology—that is offering up not only access to the range of knowledge that is online but, essentially, the range of human concepts that are online in interaction with that knowledge—privately owned versions of truth.
What is to stop any very rich individual deciding to start their own large language model with their own version of the truth, perhaps using their own platform? Former President Trump comes to mind as someone who could do that and I suggest that, if truth is now a privatised thing, we might want to have some regulation here.
The future-proofing issues are why we should be looking very seriously at the amendments in the name of the noble Baroness, Lady Kidron. I listened carefully to the noble Lord, Lord Allan, as always, and I have reflected a lot on his very useful car safety and plane safety regulation analogy from our previous day in Committee. The proportionality issue that he raised in his useful contribution this time is potentially addressed by the proposed new clause we discussed last time. If the Bill sets out quite clearly the aim of the legislation, that would set the frame for the regulator and for how it would regulate proportionately the range of internet services that might be brought into scope by this set of amendments.
I also support Amendment 92, on bringing in safety by design and the regime that has been so successful in respect of the age-appropriate design code and the likelihood of access by children, rather than what is set out in the Bill.
I turn to Amendments 19, 22, 298 and 299 in the names of the noble Baronesses, Lady Harding and Lady Stowell, the noble Lord, Lord Clement-Jones, and myself. Others, too, have drawn the analogy between app stores and corner shops selling alcohol, and it makes sense to think about the distribution points in the system—the pinch points that all users go through—and to see whether there is a viable way of protecting people and regulating through those pinch points. The Bill seeks to protect us by imposing regulation, risk assessments and so on on the platforms that host and promote content, but it makes a lot of sense to add app stores, given how we now consume the internet.
I remember, all those years ago, having CD drives—floppy disk drives, even—in computers, and going off to buy software from a retail store and having to install it. I do not go quite as far back as the right reverend Prelate the Bishop of Oxford, but I remember those days well. Nowadays, as consumers, almost all of us access our software through app stores, be it software for our phones or for our laptops. That is the distribution point for mobiles, and essentially it is, as others have said, a duopoly that we hope will be addressed by the Digital Markets, Competition and Consumers Bill.
As others have said, 50% of children under 10 in this country use smartphones and tablets. When you get to the 12 to 15 bracket, you find that 97% of them use mobile phones and tablets. We have, as noble Lords have also said, Google Family Link and the Apple Family Sharing function. That is something we use in my family. My stepdaughter is 11—she will be 12 in June—and in most cases I appear to be the regulator who has to give her the Family Link code to go on to Google Classroom when she does her homework, and who has to allow her to download an app or add another contact—there is a whole range of things on her phone for which I provide the gatekeeper function. But you have to be relatively technically competent and confident to do all those things and to manage her screen time, and I would like to see more protection for those who do not have that confidence—and indeed for myself as well, because maybe I would not have to be bothered quite as often.
It is worth noting that the vast majority of children in this country who have smartphones—the last time I looked at the stats, it was around 80%—have iPhones; there must be a lot of old iPhones that have been recycled down the family. To have an iCloud account, if you are under 13, you have to go through a parent or other suitable adult. However, if you are over 13, you can get on with it; that raises a whole set of issues and potential harms for children over the age of 13.
If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.
On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantial” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so the amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.
The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill for emerging technologies. As he knows, the Bill has been designed to be technology-neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.
I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?
They are when they form part of services that enable users to share content online and interact with each other, or part of search services; they are then subject to the other duties set out in the Bill.
Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.
As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.
We do not think that this would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in scope of the Bill. Those services must already comply with their duties to keep children safe or face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.