Moved by
16: Clause 8, page 7, line 16, after “governance,” insert “terms of service,”
Member’s explanatory statement
This amendment makes clear that “design and operation of a service” includes its terms of service.
Lord Stevenson of Balmacara (Lab)

My Lords, this group of amendments concerns terms of service. All the amendments either have the phrase “terms of service” in them or imply that we wish to see more use of the phrase in the Bill, and seek to try to tidy up some of the other bits around that which have crept into the Bill.

Why are we doing that? Rather late in the day, terms of service have suddenly become a key fulcrum on which much of the activity relating to people’s use of social media and other internet services—and how they view the material coming to them—will turn. With the loss of the adult “legal but harmful” provisions, we also lost quite a considerable amount of what would have been primary legislation, which no doubt would have been backed up by codes of practice. What we are left with, and what we need to look at very closely, is the triple shield at the heart of the new obligations on companies and, in particular, on their terms of service. That is set out primarily in Clauses 64, 65, 66 and 67, and it is the subject to which my amendments largely refer.

Users of the services would be more confident that the Government have got their focus on terms of service right if the terms actually said on the tin what should be said, as the expression goes. If the terms of service were written and implemented so that material which should be taken down was indeed taken down, they would become a reliable way of judging whether or not a service is the one people want, and the free market would be seen to be working to empower people to make their own decisions about the level of risk they assume by using a service. That is a major change from the way the Bill was originally envisaged. Because this was done late, we are left with one or two of the matters to which I have referred already, which means that the amendments focus on changing what is currently in the Bill.

It is also true that the changes were not consulted upon; I do not recall there being any document from government about whether this was a good way forward. The changes were certainly not considered by the Joint Committee, of which several of those present were members—we did not discuss it in the Joint Committee and made no recommendation on it. The level of scrutiny we have enjoyed on the Bill has been absent in this area. The right reverend Prelate the Bishop of Oxford will speak shortly to amendments about terms of service, and we will be able to come back to it. I think it would have been appropriate had the earlier amendment in the name of the noble Lord, Lord Pickles, been in this group because the issue was the terms of service, even though it had many other elements that were important and that we did discuss.

The main focus of my speech is that the Government have not managed to link this new idea of terms of service and the responsibilities that will flow from that to the rest of the Bill. It does not seem to fit into the overall architecture. For example, it is not a design feature, and does not seem to work through in that way. This is a largely self-contained series of clauses. We are trying to ask some of the world’s largest companies, on behalf of the people who use them, to do things on an almost contractual basis. Terms of service are not a contract that you sign up to, but you certainly click something—or occasionally click it, if you remember to—by which you consent to the company operating in a particular set of ways. In a sense, that is a contract, but is it really a contract? At the heart of that contract between companies and users is whether the terms of service are well captured in the way the Bill is organised. I think there are gaps.

The Bill does have something that we welcome and want to hold on to, which is that the process under which the risks are assessed and decisions taken about how companies operate and how Ofcom relates to those decisions is about the design and operation of the service—both the design and the operation, something that the noble Baroness, Lady Kidron, is very keen to emphasise at all times. It all starts and ends with design, and the operation is a consequence of design choices. Other noble Baronesses have mentioned in the debate that small companies get it right and so, when they grow, can be confident that what they are doing is something that is worth doing. Design, and operating that design to make a service, is really important. Are terms of service part of that or are they different, and does it matter? It seems to me that they are downstream from the design: something can be designed and then have terms of service that were not really part of the original process. What is happening here?

My Amendments 16, 21, 66DA, 75 and 197 would ensure that the terms of service are included within the list of matters that constitute “design and operation” of the service at each point where it occurs. I have had to go right through the Bill to add it in certain areas—in a rather irritating way, I am sure, for the Bill team—because sometimes we find that what I think should be a term of service is actually described as something else, such as “a publicly available statement”, whatever that is. It would be an advantage if we went through it again, defined terms of service, and made sure that that was what we were talking about.

Amendments 70 to 72, 79 to 81 and 174 seek to help the Government and their officials with tidying up the drafting, which probably has not been scrutinised enough to pick up these issues. It may not matter, at the end of the day, but what is in the Bill is going to be law and we may as well try to get it right as best we can. I am sure the Minister will say we really do not need to worry about this because it is all about risks and outcomes, and that, if a company does not protect children or has illegal content, or the user-empowerment duties—the toggling—do not work, Ofcom will find a way of driving the company to sort it out. What does that mean in practice? Does it mean that Ofcom has a role in defining what terms of service are? It is not in the Bill and may not reach the Bill, but it is something that will be a bit of a problem if we do not resolve what we mean by it, even if it is not by changing the legislation.

If the Minister were to disagree with my approach, it would be quite nice to have that said at the Dispatch Box so that we can look at it. The key question is: are terms of service an integral part of the design and operation of a service and, if so, can we extend the term to make sure that all aspects of the services people consume are covered by adequate and effective terms of service? There will probably be division in the way we approach this because, clearly, whether they are called terms of service or have another name, the actual enforcement of the illegal content and children’s duties will be effected by Ofcom, irrespective of the wording of the Bill—I do not want to question that. However, there is obviously an overlap into questions about adults and others who are affected by the terms of service. If you cannot identify what the terms of service say in relation to something you might not wish to receive, because the terms of service are imprecise, how on earth are you going to operate the services, the toggles and things, around them? If you look at that and accept that there will be pressure within the market to get these terms of service right, there will be a lot of dialogue with Ofcom. I accept that all that will happen, but it would be good if the position of terms of service was clarified in the Bill before it becomes law, and if Ofcom’s powers in relation to them were made clear: do they or do they not have the chance to review terms of service if those turn out to be ineffective in practice? If that is the case, how are we going to see this work out in practice in terms of what people will be able to do about it, either through redress or by taking the issue to court? I beg to move.

Baroness Kidron (CB)

I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them, so one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am grateful for this short and focused debate, which has been helpful, and for the points made by the noble Lords, Lord Stevenson and Lord Allan, and the noble Baroness, Lady Kidron. I think we all share the same objective: ensuring that terms of service promote accountability and transparency, and empower users.

One of the Bill’s key objectives is to ensure that the terms of service of user-to-user platforms are suitable and effective. Under the Bill, companies will be required both to set out clearly how they will tackle illegal content and protect children and to ensure that their terms of service are properly enforced. The additional transparency and accountability duties on category 1 services will further ensure that users know what to expect on the largest platforms. This will put an end to these services arbitrarily removing content or, conversely, failing to remove content that they profess to prohibit.

The Bill will also ensure that search services are clear to their users about how they are complying with their adult and child safety duties under this new law. Given the very different way in which search services operate, however, this will be achieved through a publicly available statement rather than through terms of service. The two are intended to be distinct.

Noble Lords are right to point to the question of intelligibility. It struck me that, if it takes 10 days to read terms of service, perhaps we should have a race during the 10 days allotted to this Committee stage to see which is quicker—but I take the point. The noble Lord, Lord Allan, is also right that the further requirements imposed through this Bill will only add to that.

The noble Baroness, Lady Kidron, asked a fair question about what “accessibility” means. The Bill requires all platforms’ terms of service for illegal content and child safety duties to be clear and accessible. Ofcom will provide guidance on what that means, including ensuring that they are suitably prominent. The same applies to terms of service for category 1 services relating to content moderation.

I will focus first on Amendments 16, 21, 66DA, 75 and 197, which seek to ensure that both Ofcom and platforms consider the risks associated with platforms’ terms of service with regard to the illegal content and child safety duties in the Bill. We do not think that these amendments are needed. User-to-user services will already be required to assess the risks regarding their terms of service for illegal content. Clause 8 requires companies to assess the “design and operation” of a service in relation to illegal content. As terms of service are integral to how a service operates, they would be covered by this provision. Similarly, Clause 10 sets out that companies likely to be accessed by children will be required to assess the “design and operation” of a service as part of their child risk assessments, which would include the extent to which their terms of service may reduce or increase the risk of harm to children.

In addition to those risk assessment duties, the safety duties will require companies to take proportionate measures to manage and mitigate effectively the risks of harm to people which they have identified through their risk assessments. This will include making changes to their terms of service, if appropriate. The Bill does not impose duties on search services relating to terms of service, as search services’ terms of service play a less important role in determining how users can engage on a platform. I will explain this point further when responding to the specific amendments relating to search services, but I can assure the noble Lord, Lord Stevenson, that search services will have comprehensive duties to understand and mitigate how the design and operation of their service affects risk.

Amendment 197 would require Ofcom to assess how platforms’ terms of service affect the risk of harm to people that the sector presents. While I agree that this is an important risk factor which Ofcom must consider, it is already provided for in Clause 89, which requires Ofcom to undertake an assessment of risk across regulated services. That requires Ofcom to consider which characteristics of regulated services give rise to harm. Given how integral terms of service are to how many technology companies function, Ofcom will necessarily consider the risk associated with terms of service when undertaking that risk assessment.

However, elevating terms of service above the other systems and processes mentioned in Clause 89 would imply that Ofcom needs to take more account of the risk of harm they pose on regulated services than it does of other safety-by-design systems and processes or content moderation processes, for instance. That may not be suitable, particularly as service delivery methods will inevitably change over time. Instead, Clause 89 has been written to give Ofcom scope to organise its risk assessment, risk register and risk profiles as it thinks suitable. That is appropriate, given that it is best placed to develop detailed knowledge of the matters in question as they evolve over time.

Amendments 70, 71, 72, 79, 80, 81, 174 and 302 seek to replace the Bill’s references to publicly available statements, in relation to search services, with terms of service. This would mean that search services would have to publish how they are complying with their illegal content and child protection duties in terms of service rather than in publicly available statements. I appreciate the spirit in which the noble Lord has tabled and introduced these amendments. However, they do not consider the very different ways in which search services operate.

User-to-user services’ terms of service fulfil a very specific purpose. They govern a user’s behaviour on the service and set rules on what a user is allowed to post and how they can interact with others. If a user breaks these terms, a service can block his or her access or remove his or her content. Under the status quo, users have very few mechanisms by which to hold user-to-user platforms accountable to these terms, meaning that users can arbitrarily see their content removed with few or no avenues for redress. Equally, a user may choose to use a service because its terms and conditions lead them to believe that certain types of content are prohibited while in practice the company does not enforce the relevant terms.

The Bill’s duties relating to user-to-user services’ terms of service seek to redress this imbalance. They will ensure that people know what to expect on a platform and enable them to hold platforms accountable. In contrast, users of search services do not create content or interact with other users. Users can search for anything without restriction from the search service provider, although a search term may not always return results. It is therefore not necessary to provide detailed information on what a user can and cannot do on a search service. The existing duties on such services will ensure that search engines are clear to users about how they are complying with their safety duties. The Bill will require search services to set out how they are fulfilling them, in publicly available statements. Their actions must meet the standards set by Ofcom. Using these statements will ensure that search services are as transparent as user-to-user services about how they are complying with their safety duties.

The noble Lord’s Amendment 174 also seeks to expand the transparency reporting requirements to cover the scope and application of the terms of service set out by search service providers. This too is unnecessary because, via Schedule 8, the Bill already ensures transparency about the scope and application of the provisions that search services must make publicly available. I hope that gives the noble Lord some reassurance that the concerns he has raised are already covered. With that, I invite him to withdraw Amendment 16.

Lord Stevenson of Balmacara (Lab)

My Lords, I am very grateful to the Minister for that very detailed response, which I will have to read very carefully because it was quite complicated. That, in a sense, is the answer to my question: terms of service will not be very easy to identify, because, to answer my questions, he has had to pray in aid issues that Ofcom will necessarily have to assess—terms of service—to get at whether the companies are performing the duties that the Bill requires of them.

I will not go further on that. We know that there will be enough there to answer the main questions I had about this. I take the point about search being distinctively different in this area, although a tidy mind like mine likes to see all these things in one place and understand all the words. Every time I see “publicly available statement”, I do not know why but I think about people being hanged in public rather than a term of service or a contract.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I would like to raise one issue that I forgot to mention earlier; I think it would be more efficient to pose the question to the Minister now rather than to interject when he is speaking.

On the Government’s Amendments 136A, 136B and 136C on the immigration offences, the point I want to make is that online services can be literal life-savers for people who are engaged in very dangerous journeys, including journeys across the Channel. I hope the Minister will be clear that the intention here is to require platforms to deal only with content, for example, from criminals who are offering trafficking services, and that there is no intention to require platforms somehow to withdraw services from the victims of those traffickers when they are using those services in the interest of saving their own lives or seeking advice that is essential to preserving their own safety.

That would create—as I know he can imagine—real ethical and moral dilemmas, and we should not be giving any signal that we intend to require platforms to withdraw services from people who are in desperate need of help, whatever the circumstances.

Lord Stevenson of Balmacara (Lab)

My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.

I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.

We have covered so much ground that I do not want to try and summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and myself must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points which I think we all benefited from.

I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises an issue that I am not competent on, but it is a very important one—we need to get the right balance between what is causing the alarm and difficulty outside in relation to what is happening on the internet, and I think we all agree with her that we should not put any barrier in the way of dealing with that.

Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.

The later amendments, particularly from the noble Lord, Lord Clement-Jones, are taking us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe it will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to function for getting advice on how others have to make decisions that are theirs to rule on at the end if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how you deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem that we do not really have an answer to.

My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about is Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise lies in the area I want to talk about: fraud—cyber fraud in particular—and how it is going to be brought into the Bill. The issue, which I think has been raised by Which? but which a number of other people have also written to us about, is that Clauses 170 and 171 of the Bill try to establish how a platform should identify illegal content in relation to fraud—but they are quite prescriptive. In particular, they go into some detail, which I will leave for the Minister to respond to, but uniquely they set out a specific way of gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.

One of the points that have to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is there in the Bill; that is really the basis for the tabling of our amendments. Clearly, one would hope that the end result is whether or not illegality has taken place, not how that information has been gathered. If one must make concessions to the process of law because a judgment is made that, because it is automated, it is in some way not as valid as if it had been done by a human moderator, there seems to be a whole world there that we should not be going into. I certainly hope that that is not going to be the case if we are talking about illegality concerning children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.

There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; the fact that it is not the top priority for everybody, let alone the Government, is extraordinary. It is something like the equivalent of consumers being scammed at the rate of around £7.5 billion per year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other things tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it is in any way infringed on by there being imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.

As a whole, this is an interesting question as we move away from what a crime is towards how people judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.

When the Joint Committee considered this issue, we spent a long time talking about why we were concerned to have certainty on the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful”, which seemed too subjective and too subject to difficulties. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments on things that are set in stone and that you can work to, but that you then have to interpret.

There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.

In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

My Lords, as well as throwing up some interesting questions of law, this debate has provoked some interesting tongue-twisters. The noble Lord, Lord Allan of Hallam, offered a prize to the first person to pronounce the Netzwerkdurchsetzungsgesetz; I shall claim my prize in our debate on a later group when inviting him to withdraw his amendment.

Lord Stevenson of Balmacara (Lab)

A clean shirt, perhaps?

Lord Parkinson of Whitley Bay (Con)

Yes, that would be welcome.

--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

My Lords, over the last few hours I have praised us for having developed a style of discussion and debate that is certainly relatively new and not often seen in the House, where we have tried to reach out to each other and find common ground. That was not a problem in this last group of just over an hour; I think we are united around the themes that were so brilliantly introduced in a very concise and well-balanced speech by the noble Baroness, Lady Kidron, who has been a leading and inspirational force behind this activity for so long.

Although different voices have come in at different times and asked questions that still need to be answered, I sense that we have reached a point in our thinking, if not in our actual debates, where we need a plan. I too reached this point; that was exactly the motivation I had in tabling Amendment 1, which was discussed on the first day. Fine as the Bill is—it is a very impressive piece of work in every way—it lacks what we need as a Parliament to convince others that we have understood the issues and have the answers to their questions about what this Government, or this country as a whole, are going to do about this tsunami of change in the way we do our business and live our lives these days, which has arrived in the wake of the social media companies and search engines. There is consensus, but it is slightly different from the consensus we had in earlier debates, where we were reassuring ourselves about the issues we were talking about but were not reaching out to the Government to change anything so much as being happy that we were speaking the same language and that they were in the same place as we are gradually coming to as a group, in a way.

Just before we came back in after the lunch break, I happened to talk to the noble Lord, Lord Grade, who is the chair of Ofcom and is listening to most of our debates and discussions when his other duties allow. I asked him what he thought about it, and he said that it was fascinating for him to recognise the level of expertise and knowledge that was growing up in the House, and that it would be a useful resource for Ofcom in the future. He was very impressed by the way in which everyone was engaging and not getting stuck in the niceties of the legislation, which he admitted he was experiencing himself. I say that softly; I do not want to embarrass him in any way because he is an honourable man. However, the point he makes is really important.

I say to the Minister that I do not think we are very far apart on this. He knows that, because we have discussed it at some length over the last six to eight weeks. What I think he should take away from this debate is that this is a point where a decision has to be taken about whether the Government are going to go with the consensus view being expressed here and put deliberately into the Bill a repetitive statement, but one that is clear and unambiguous, about the intention behind the Government’s reason for bringing forward the Bill and for us, the Opposition and other Members of this House, supporting it, which is that we want a safe internet for our children. The way we are going to do that is by having in place, up front and clearly in one place, the things that matter when the regulatory structure sits in place and has to deal with the world as it is, of companies with business plans and business models that are at variance with what we think should be happening and that we know are destroying the lives of people we love and the future of our country—our children—in a way that is quite unacceptable when you analyse it down to its last detail.

It is not a question of saying back to us across the Dispatch Box—I know he wants to but I hope he will not—“Everything that you have said is in the Bill; we don’t need to go down this route, we don’t need another piece of writing that says it all”. I want him to forget that and say that actually it will be worth it, because we will have written something very special for the world to look at and admire. It is probably not in its perfect form yet, but that is what the Government can do: take a rough and ready potential diamond, polish it, chamfer it, and bring it back and set it in a diadem we would all be proud to wear—Coronations excepted—so that we can say, “Look, we have done the dirty work here. We’ve been right down to the bottom and thought about it. We’ve looked at stuff that we never thought in our lives we would ever want to see and survived”.

I shake at some of the material we were shown that Molly Russell was looking at. But I never want to be in a situation where I will have to say to my children and grandchildren, “We had the chance to get this right and we relied on a wonderful piece of work called the Online Safety Act 2023; you will find it in there, but it is going to take you several weeks and a lot of mental harm and difficulty to understand what it means”.

So, let us make it right. Let us not just say “It’ll be alright on the night”. Let us have it there. It is almost right but, as my noble friend Lord Knight said, it needs to be patched back into what is already in the Bill. Somebody needs to look at it and say, “What, out of that, will work as a statement to the world that we care about our kids in a way that will really make a difference?” I warn the Minister that, although I said at Second Reading that I wanted to see this Bill on the statute book as quickly as possible, I will not accept a situation where we do not have more on this issue.

Lord Parkinson of Whitley Bay (Con)

I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.

Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revision of the Bill’s approach to content that is harmful to children. They would set a new schedule of harmful content and risks to children—the four Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.

I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.

I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice, including the four Cs framework on the online risks of harm to children; we want this to be world-leading legislation. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.

As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.

It is also clear that there is broad agreement across the House that the Bill should tackle content that is harmful to children, such as content that promotes eating disorders; illegal behaviour, such as grooming; and risk factors for harm, such as the method by which content is disseminated and the frequency of alerts. I am pleased to be able to put on record that, in the Government’s opinion, the Bill as drafted already does this and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.

First, it is important to understand how the Bill defines content, because that question of definition has been a confusing factor in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. This has been the source of some confusion. The Bill defines “content”, for the purposes of this legislation, in Clause 207 extremely broadly as

“anything communicated by means of an internet service”.

Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as a content harm. The distinctions that the four Cs framework makes between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.

Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.