Lord Stevenson of Balmacara (Lab)

My Lords, this group of amendments concerns terms of service. All the amendments either have the phrase “terms of service” in them or imply that we wish to see more use of the phrase in the Bill, and they seek to tidy up some of the other wording around that which has crept into the Bill.

Why are we doing that? Rather late in the day, terms of service have suddenly become a key fulcrum around which much of the activity relating to people’s use of social media and other services on the internet, and how they view the material coming to them, will turn. With the loss of the adult “legal but harmful” provisions, we also lost quite a considerable amount of what would have been primary legislation, which no doubt would have been backed up by codes of practice. The situation we are left with, and which we need to look at very closely, is the triple shield at the heart of the new obligations on companies and, in particular, on their terms of service. That is set out primarily in Clauses 64, 65, 66 and 67, and it is the subject to which my amendments largely refer.

Users of the services would be more confident that the Government have got their focus on terms of service right if those terms actually did what it says on the tin, as the expression goes. If terms of service were written and implemented so that material which should be taken down was indeed taken down, they would become a reliable means of judging whether or not a service is one people want to use, and the free market would be seen to be working to empower people to make their own decisions about the level of risk they assume by using a service. That is a major change from the way the Bill was originally envisaged. Because this was done late, we have one or two of the matters to which I have already referred, which means that the amendments focus on changing what is currently in the Bill.

It is also true that the changes were not consulted upon; I do not recall there being any document from government about whether this was a good way forward. The changes were certainly not considered by the Joint Committee, of which several of those present were members—we did not discuss it in the Joint Committee and made no recommendation on it. The level of scrutiny we have enjoyed on the Bill has been absent in this area. The right reverend Prelate the Bishop of Oxford will speak shortly to amendments about terms of service, and we will be able to come back to it. I think it would have been appropriate had the earlier amendment in the name of the noble Lord, Lord Pickles, been in this group because the issue was the terms of service, even though it had many other elements that were important and that we did discuss.

The main focus of my speech is that the Government have not managed to link this new idea of terms of service and the responsibilities that will flow from that to the rest of the Bill. It does not seem to fit into the overall architecture. For example, it is not a design feature, and does not seem to work through in that way. This is a largely self-contained series of clauses. We are trying to ask some of the world’s largest companies, on behalf of the people who use them, to do things on an almost contractual basis. Terms of service are not a contract that you sign up to, but you certainly click something—or occasionally click it, if you remember to—by which you consent to the company operating in a particular set of ways. In a sense, that is a contract, but is it really a contract? At the heart of that contract between companies and users is whether the terms of service are well captured in the way the Bill is organised. I think there are gaps.

The Bill does have something that we welcome and want to hold on to: the process under which risks are assessed and decisions are taken about how companies operate, and how Ofcom relates to those decisions, is about the design and operation of the service—both the design and the operation, something that the noble Baroness, Lady Kidron, is very keen to emphasise at all times. It all starts and ends with design, and the operation is a consequence of design choices. Other noble Baronesses have mentioned in the debate that small companies which get it right can, when they grow, be confident that what they are doing is worth doing. Design, and operating that design to make a service, is really important. Are terms of service part of that or are they different, and does it matter? It seems to me that they are downstream from the design: something can be designed and then given terms of service that were not really part of the original process. What is happening here?

My Amendments 16, 21, 66DA, 75 and 197 would ensure that the terms of service are included within the list of matters that constitute the “design and operation” of the service at each point where it occurs. I have had to go right through the Bill to add it in certain areas—in a rather irritating way, I am sure, for the Bill team—because sometimes we find that what I think should be a term of service is actually described as something else, such as “a publicly available statement”, whatever that is. It would be an advantage if we went through it again, defined terms of service and made sure that that was what we were talking about.

Amendments 70 to 72, 79 to 81 and 174 seek to help the Government and their officials with tidying up the drafting, which probably has not been scrutinised enough to pick up these issues. It may not matter at the end of the day, but what is in the Bill is going to be law and we may as well try to get it right as best we can. I am sure the Minister will say that we really do not need to worry about this because it is all about risks and outcomes, and that if a company does not protect children or hosts illegal content, or if the user-empowerment duties—the toggling—do not work, Ofcom will find a way of driving the company to sort it out. What does that mean in practice? Does it mean that Ofcom has a role in defining what terms of service are? That is not in the Bill and may not reach the Bill, but it is something that will be a bit of a problem if we do not resolve what we mean by it, even if it is not by changing the legislation.

If the Minister were to disagree with my approach, it would be quite nice to have that said at the Dispatch Box so that we can look at it. The key question is: are terms of service an integral part of the design and operation of a service and, if so, can we extend the term to make sure that all aspects of the services people consume are covered by adequate and effective terms of service? There is probably going to be division in the way we approach this because, clearly, whether they are called terms of service or have another name, the actual enforcement of the illegal content and children’s duties will be effected by Ofcom, irrespective of the wording of the Bill—I do not want to question that. However, there is obviously an overlap into questions about adults and others who are affected by the terms of service. If you cannot identify what the terms of service say in relation to something you might not wish to receive, because those terms are imprecise, how on earth are you going to operate the services, the toggles and so on, around them? If you look at that and accept that there will be pressure within the market to get these terms of service right, there will be a lot of dialogue with Ofcom. I accept that all that will happen, but it would be good if the position of terms of service were clarified in the Bill before it becomes law, and if Ofcom’s powers in relation to them were clarified—does it or does it not have the chance to review terms of service if they turn out to be ineffective in practice? If so, how will this work out in practice in terms of what people will be able to do about it, whether through redress or by taking the issue to court? I beg to move.

Baroness Kidron (CB)

I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday, when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them; one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.

Lord Allan of Hallam (LD)

My Lords, I am very pleased that the noble Lord, Lord Stevenson, has given us the opportunity to talk about terms of service, and I will make three points again, in a shorter intervention than on the previous group.

First, terms of service are critical because their impact, in terms of the amount of intervention that occurs on content, will generally be much greater than that of the law. Terms of service create, in effect, a body of private law for a community, and they are nearly always a superset of the public law—indeed, it is very common for the first item of a terms of service to say, “You must not do anything illegal”. This raises the interesting question of “illegal where?”; what it generally means is that you must not do anything illegal in the jurisdiction in which the service provider is established. The terms of service will say, “Do not do anything illegal”, and then they will give a whole list of other things, beyond illegality, that you cannot do on the platform. I think this is right, because they have different characteristics.

--- Later in debate ---
Baroness Buscombe (Con)

My Lords, before speaking to my Amendment 137, I want to put a marker down to say that I strongly support Amendment 135 in the name of my noble friend Lord Moylan. I will not repeat anything that he said but I agree with absolutely every word.

Amendment 137 is in my name and those of my noble and learned friend Lord Garnier and the noble Lord, Lord Moore of Etchingham. This amendment is one of five which I have tabled with the aim of meeting a core purpose of the Bill. In the words of my noble friend the Minister in response to Amendment 1, it is

“to protect users of all ages from being exposed to illegal content”—[Official Report, 19/4/23; col. 724.]

—in short, to ensure that what is illegal offline is illegal online.

If accepted, this small group of amendments would, I strongly believe, make a really important difference to millions of people’s lives—people who are not necessarily listed in Clause 12. I therefore ask the Committee to allow me briefly to demonstrate the need for these amendments through the prism of the millions of people and their families working and living in rural areas. They are often quite isolated, working alone in remote communities, and are increasingly at risk of, or are already suffering, awful online abuse and harassment. This abuse often goes way beyond suffering; it destroys businesses and a way of life.

I find it extraordinary that the Bill seems to contain nothing to do with livelihoods. It is all about focusing on feelings, which of course are important—and the most important focus is children—but people’s businesses and livelihoods are being destroyed through online abuse.

Research carried out by the Countryside Alliance has revealed a deeply disturbing trend online that appears to be disproportionately affecting people who live in rural areas and who are involved in rural pursuits. Beyond direct abuse, a far more insidious tactic that activists have adopted involves targeting businesses involved in activities of which they disapprove, such as livestock farming or hosting shoots. They post fake reviews on platforms including Tripadvisor and Google Maps, and their aim is to damage the victim, their business and their reputation by, to put it colloquially, trashing their business and thereby putting off potential customers. This is what some call trolling.

Let me be clear that I absolutely defend, to my core, the right to freedom of expression and speech, and indeed the right to offend. Just upsetting someone is way below the bar for the Bill, or any legislation. I am deeply concerned about the hate crime—or non-crime—issue we debated yesterday; in fact, I put off reading the debate because I so disagree with this nonsense from the College of Policing.

Writing a negative review directly based on a negative experience is entirely acceptable in my book, albeit unpleasant for the business targeted. My amendments seek to address something far more heinous and wrong, which, to date, can only be addressed as libel and, therefore, through the civil courts. Colleagues in both your Lordships’ House and in another place shared with me tremendously upsetting examples from their constituents and in their neighbourhoods of how anonymous activists are ruining the lives of hard-working people who love this country and are going the extra mile to defend our culture, historic ways of life and freedoms.

Fortunately, through the Bill, the Government are taking an important step by introducing a criminal offence of false communications. With the leave of the Committee, I will briefly cite and explain the other amendments in order to make sense of Amendment 137. One of the challenges of the offence of false communications is the need to recognise that so much of the harm that underpins the whole reason why the Bill is necessary is the consequence of allowing anonymity. It is so easy to destroy and debilitate others by remaining anonymous and using false communications. Why be anonymous if you have any spine at all to stand up for what you believe? It is not possible offline—when writing a letter to a newspaper, for example—so why is it acceptable online? The usual tech business excuse of protecting individuals in rogue states is no longer acceptable, given the level of harm that anonymity causes here at home.

Therefore, my Amendment 106 seeks to address the appalling effect of harm, of whatever nature, arising from false or threatening communications committed by unverified or anonymous users—this is what we refer to as trolling. Amendments 266 and 267, in my name and those of my noble and learned friend Lord Garnier and my noble friend Lord Leicester, would widen the scope of this new and welcome offence of false communications to include financial harm, and harm to the subject of the false message arising from its communication to third parties.

The Bill will have failed unless we act beyond feelings and harm to the person and include loss of livelihood. As I said, I am amazed that it is not front and centre of the Bill after safety for our children. Amendment 268, also supported by my noble and learned friend, would bring within the scope of the communications offences the instigation of such offences by others—for example, Twitter storms, which can involve inciting others to make threats without doing so directly. Currently, we are unsure whether encouraging others to spread false information—for example, by posting fake reviews of businesses for ideologically motivated reasons—would become an offence under the Bill. We believe that it should, and my Amendment 268 would address this issue.

I turn briefly to the specifics of my Amendment 137. Schedule 7 lists a set of “priority offences” that social media platforms must act to prevent, and they must remove messages giving rise to certain offences. However, the list does not include the new communications offences created elsewhere in Part 10. We believe that this is a glaring anomaly. If there is a reason why the new communications offences are not listed, it is important that we understand why. I hope that my noble friend the Minister can explain.

The practical effect of Amendment 137 would be to include the communications offences introduced in the Bill and communications giving rise to them within the definition of “relevant offence” and “priority illegal content” for the purposes of Clause 53(4) and (7) and otherwise.

Baroness Kidron (CB)

I ask the Committee to have a level of imagination here because I have been asked to read the speech of the noble Viscount, Lord Colville—

Baroness Stowell of Beeston (Con)

I do not know who advised the noble Baroness—and forgive me for getting up and getting all former Leader on her—but this is a practice that we seem to have adopted in the last couple of years and that I find very odd. It is perfectly proper for the noble Baroness to deploy the noble Viscount’s arguments, but to read his speech is completely in contravention of our guidance.

Baroness Kidron (CB)

I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.

The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.

I will make a couple of points on that thought. Clause 170(6) directs that a provider must have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,

but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with what or who will make judgments on illegality.

If a human moderator makes the decision, it will depend on the resources and time available to them as to how much information they gather in order to make that judgment. Unlike in a court case, when a wide range of information and context can be gathered, when it comes to decisions about content online, these resources are very rarely available to human moderators, who have a vast amount of content to get through.

If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) provides the requirement for the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.

I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.

Baroness Fox of Buckley (Non-Afl)

My Lords, I support the amendments in this group that probe how removing illegal material is understood and will be used under the Bill. The noble Lord, Lord Moylan, explained a lot of my concerns, as indeed did the noble Viscount, Lord Colville, via his avatar. We have heard a range of very interesting contributions that need to be taken seriously by the Government. I have put my name to a number of amendments.

The identification of illegal material might be clear and obvious in some cases—even many cases. It sounds so black and white: “Don’t publish illegal material”. But defining communications of this nature can be highly complex, so much so that it is traditionally reserved for law enforcement bodies and the judicial system. We have already heard from the noble Lord, Lord Moylan, that, despite Home Secretaries, this House, regulations and all sorts of laws having indicated that non-crime hate incidents, for example, should not be pursued by the police, they continue to pursue them as though they are criminal acts. That is exactly the kind of issue we have.

--- Later in debate ---
Lord Moylan (Con)

My Lords, it is genuinely difficult to summarise such a wide-ranging debate, which was of a very high standard. Only one genuinely bright idea has emerged from the whole thing: as we go through Committee, each group of amendments should be introduced by the noble Lord, Lord Allan of Hallam, because it is only after I have heard his contribution on each occasion that I have begun to understand the full complexity of what I have been saying. I suspect I am not alone in that and that we could all benefit from hearing the noble Lord before getting to our feet. That is not meant to sound the slightest bit arch; it is absolutely genuine.

The debate expressed a very wide range of concerns. Concerns about gang grooming and recruiting were expressed on behalf of the right reverend Prelate the Bishop of Derby, and my noble friend Lady Buscombe expressed concerns about the trolling of country businesses. However, I think it is fair to say that most speakers focused on the following issues. The first was the definition of illegality, which was so well explicated by the noble Lord, Lord Allan of Hallam. The second was the judgment bar that providers have to pass to establish whether something should be taken down. The third was the legislative mandating of private foreign companies to censor free speech rights that are so hard-won here in this country. These are the things that mainly concern us.

I was delighted that I found myself agreeing so much with what the noble Baroness, Lady Kidron, said, even though she was speaking in another voice or on behalf of another person. If her own sentiments coincide with the sentiments of the noble Viscount—

Baroness Kidron (CB)

I am sorry to intrude, but I must say now on the record that I was speaking on my own behalf. The complication of measuring and those particular things are terribly important to establish, so I am once again happy to agree with the noble Lord.

Lord Moylan (Con)

I am delighted to hear the noble Baroness say that, and it shows that the pool of common ground we share is widening every time we get to our feet. However, the pool is not particularly widening—at least in respect of myself; other noble Lords may have been greatly reassured—as regards my noble friend the Minister, who I am afraid has not in any sense addressed the issues about free speech that I and many other noble Lords raised. On some issues, we in the Committee are finding a consensus that is drifting away from the Minister. We probably need to put our heads together more closely on some of these issues as Committee progresses.

My noble friend also did not say anything that satisfied me in respect of the practical operation of these obligations for smaller sites. He speaks smoothly and persuasively of risk-based proactive approaches without saying that, for a large number of sites, this legislation will mean a complete re-engineering of their business model. Take Wikipedia operating in a minority language: Welsh Wikipedia, for example, is the largest Welsh-language website in the world. If its model is to monitor what is put out by the community and correct it as it goes along, rather than having a model designed in advance to prevent things being put there in the first place, it is very likely to close down. If that is one of the consequences of this Bill, the Government will soon hear about it.

Finally, although I remain concerned about public order offences, I have to say to the Minister that if he is so concerned about the dissemination of alarm among the population under the provisions of the Public Order Act, what does he think that His Majesty’s Government were doing on Sunday at 3 pm? I beg leave to withdraw the amendment.

--- Later in debate ---
Moved by
20: Clause 10, page 9, line 11, leave out paragraphs (a) to (h) and insert—
“(a) the level of risk that children who are users of the service encounter the harms as outlined in Schedule (Online harms to children) by means of the service;
(b) any of the level of risks to children encountered singularly or in combination, having regard to—
(i) the design of functionalities, algorithms and other features that present or increase risk of harm, such as low-privacy profile settings by default;
(ii) the business model, revenue model, governance, terms of service and other systems and processes or mitigation measures that may reduce or increase the risk of harm;
(iii) risks which can build up over time;
(iv) the ways in which level of risks can change when experienced in combination with others;
(v) the level of risk of harm to children in different age groups;
(vi) the level of risk of harm to children with certain characteristics or who are members of certain groups; and
(vii) the different ways in which the service is used, including but not limited to via virtual and augmented reality technologies, and the impact of such use on the level of risk of harm that might be suffered by children;
(c) whether the service has shown regard to the rights of children as set out in the United Nations Convention on the Rights of the Child (see general comment 25 on children’s rights in relation to the digital environment).”
Member’s explanatory statement
This amendment would require providers to look at and assess risks on their platform in the round and in line with the 4 Cs of online risks to children (content, contact, conduct and contractual/commercial risks). Although these risks will not be presented on every service, this amendment requires providers to reflect on these risks, so they are not forgotten and can be built into future development of the service.
Baroness Kidron (CB)

My Lords, this amendment and Amendments 74, 93 and 123 are part of a larger group that have been submitted as a package loosely referred to as the AV and harms package. They have been the subject of much private debate with the Government, for which we are grateful, and among parliamentarians, and have featured prominently in the media. The amendments are in my name and those of the noble Lord, Lord Bethell, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, but enjoy the support of a vast array of Members of both Houses. I thank all those who have voiced their support.

The full package of amendments defines and sets out the rules of the road for age assurance, including the timing of its introduction, and the definition of terms such as age verification and age assurance. They introduce the concept of measuring the efficacy of systems with one eye on the future so that we as parliamentarians can indicate where and when we feel that proportionality is appropriate and where it is simply not—for example, in relation to pornography. In parallel, we have developed a schedule of harms, which garners rather fewer column inches but is equally important in establishing Parliament’s intention. It is that schedule of harms that is up for debate today.

Before I lay out the amendment, I thank the 26 children’s charities which have so firmly got behind this package and acknowledge, in particular, Barnardo’s, CEASE and 5Rights, of which I am chair, which have worked tirelessly to ensure that the full expertise of children’s charities has been embedded in these amendments. I also pay tribute to the noble Baroness, Lady Benjamin, who in this area of policy has shown us all the way.

The key amendment in this group is Amendment 93, which would place a schedule of harms to children in the Bill. There are several reasons for doing so, the primary one being that by putting them in the Bill we are stating the intention of Parliament, which gives clarity to companies and underlines the authority of Ofcom to act on these matters. Amendments 20, 74 and 123 ensure that the schedule is mirrored in risk assessments and task Ofcom with updating its guidance every six months to capture new and emerging harms; as such, they are self-evident.

The proposed harms schedule is centred around the four Cs, a widely used and understood taxonomy of harm used in legislation and regulation around the globe. Importantly, rather than articulate individual harms that may change over time, it sets its sight on categories of harm: content, contact, conduct and contract, which is sometimes referred to as commercial harm. It also accounts for cumulative harms, where two or more risk factors create a harm that is greater than any single harm or is uniquely created by the combination. The Government’s argument against the four Cs is that they are not future-proof, which I find curious since the very structure of the four Cs is to introduce broad categories of harm to which harms can be added, particularly emerging harms. By contrast, the Government are adding an ever-growing list of individual harms.

I wish to make three points in favour of our package of amendments relating first to language, secondly to the nature of the digital world, and finally to clarity of purpose. It is a great weakness of the Bill that it consistently introduces new concepts and language—for example, the terms “primary priority content”, “priority content” and “non-designated content”. These are not terms used in other similar Bills across the globe, they are not evident in current UK law and they do not correlate with established regimes, such as equalities legislation or children’s rights under the convention, more of which in group 7.

The question of language is non-trivial. It is the central concern of those who fight CSAE around the world, who frequently find that enforcement against perpetrators or takedown is blocked by legal systems that define child sexual abuse material differently—not differently in some theoretical sense but because the same image can be categorised differently in two countries and then be a barrier to enforcement across jurisdictions. Leadership from WeProtect, the enforcement community and representatives that I recently met from Africa, South America and Asia have all made this point. It undermines the concept of UK leadership in child protection that we are wilfully and deliberately rejecting accepted language which is embedded in treaties, international agreements and multilateral organisations to start again with our own, very likely with the same confused outcome.

Secondly, I am concerned that while both the Bill and the digital world are predicated on system design, the harms are all articulated as content with insufficient emphasis on systems harms, such as careless recommendations, spreading engagement and the sector-wide focus on maximising engagement, which are the very things that create the toxic and dangerous environment for children. I know, because we have discussed it, that the Minister will say that this is all in the risk assessment, but the risk assessment asks regulated companies to assess how a number of features contribute to harm, mostly expressed as content harm.

What goes through my mind is the spectre of Meta’s legal team, which I watched for several days during Molly Russell’s inquest; they stood in a court of law and insisted that hundreds, in fact thousands, of images of cut bodies and depressive messages did not constitute harm. Rather, they regarded them as cries for help or below the bar of harm as they interpreted it. Similarly, there was material that featured videos of people jumping off buildings—some of them sped-up versions of movie clips edited to suggest that jumping was freedom—and I can imagine a similar argument that says that kind of material cannot be considered harmful, because in another context it is completely legitimate. Yet this material was sent to Molly at scale.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

My Lords, I really appreciated the contribution from the noble Baroness, Lady Ritchie of Downpatrick, because she asked a lot of questions about this group of amendments. Although I might be motivated by different reasons, I found it difficult to fully understand the impact of the amendments, so I too want to ask a set of questions.

Harm is defined in the Bill as “physical or psychological harm”, and there is no further explanation. I can understand the frustration with that and the attempts therefore to use what are described as the

“widely understood and used 4 Cs of online risk to children”.

They are not widely understood by me, and I have ploughed my way through it. I might well have misunderstood lots in it, but I want to look at and perhaps challenge some of the contents.

I was glad that Amendment 20 recognises the level of risk of harm to different age groups. That concerns me all the time when we talk about children and young people and then end up treating four year-olds, 14 year-olds and 18 year-olds as though they were all the same. I am glad that that recognition is there, and I hope that we will look at it again in future.

I want to concentrate on Amendment 93 and reflect and comment more generally on the problem of a definition, or a lack of definition, of harm in the Bill. For the last several years that we have been considering bringing this Bill to this House and to Parliament, I have been worried about the definition of psychological harm. That is largely because this category has become ever more expansive and quite subjective in our therapeutic age. It is a matter of some discussion and quite detailed work by psychologists and professionals, who worry that there is an ever-expanding concept of what is considered harmful and of what psychological harm really means.

As an illustration, I was invited recently to speak to a group of sixth-formers and was discussing things such as trigger warnings. They said, “Well, you know, you’ve got to understand what it’s like”—they were 16 year-olds—“When we encounter certain material, it makes us have PTSD”. I was thinking, “No, it doesn’t really, does it?” Post-traumatic stress disorder is something that you might well develop if you have been in the middle of a war zone. The whole concept of triggering came from psychological and medical insights from the First World War, which you can understand: if you hear a car backfiring, you think it is somebody shooting at you. But the idea here is that we should have trigger warnings on great works of literature and that, if we do not, it will lead to PTSD.

I am not being glib, because an expanded, elastic and pathologised view of harm is being used quite cavalierly and casually in relation to young people and protecting them, often by the young people themselves. It is routinely used to close down speech as part of the cancel culture wars, which, as noble Lords know, I am interested in. Is there not a danger that this concept of harm is not as obvious as we think, and that the psychological harm issue makes it even more complicated?

The other thing is that Amendment 93 says:

“The harms in this Schedule are a non-exhaustive list of categories and other categories may be relevant”.

As with the discussion on whose judgment decides the threshold for removing illegal material, I think that judging what is harmful is even more tricky for the young in relation to psychological harm. I was reminded of that when the noble Baroness, Lady Kidron, complained that what she considered to be obviously and self-evidently harmful, Meta did not. I wondered whether that is just the case with Meta, or whether views will differ when it comes to—

Baroness Kidron (CB)

The report found—I will not give a direct quotation—that social media contributed to the death of Molly Russell, so it was the court’s judgment, not mine, that Meta’s position was indefensible.

Baroness Fox of Buckley (Non-Afl)

I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.

In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 is posing this programme with naked adults and children as educational by saying that it is introducing children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.

The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominately young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and performing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online at lot, because much of this is discussed, advertised or promoted online.

This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.

The amendment states that the Bill should target any platform that posts

“links to, or … encourages child users to seek”

out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

I appreciate that this is the case we all have in the back of our minds. I am asking whether, when Meta says it is content agnostic, the Bill is the appropriate place for us to list the topics that we consider harmful. If we are to do that, I was giving examples of contentious, harmful topics. I might have got this wrong—

Baroness Kidron (CB)

I will answer the noble Baroness more completely when I wind up, but I just want to say that she is missing the point of the schedule a little. Like her, I am concerned about the way we concentrate on content harms, but she is bringing it back to content harms. If she looks at it carefully, a lot of the provisions are about contact and conduct: it is about how the system is pushing children to do certain things and pushing them to certain places. It is about how things come together, and I think she is missing the point by continually going back to individual pieces of content. I do not want to take the place of the Minister, but this is a systems and processes Bill; it is not going to deal with individual pieces of content in that way. It asks, “Are you creating these toxic environments for children? Are you delivering this at scale?”, and that is the way we must look at this amendment.

Baroness Fox of Buckley (Non-Afl)

I will finish here, because we have to get on, but I did not introduce content; it is in the four Cs. One of the four Cs is “content” and I am reacting to amendments tabled by the noble Baroness. I do not think I am harping on about content; I was responding to amendments in which content was one of the key elements.

Baroness Kidron (CB)

Let us leave it there.

Baroness Benjamin (LD)

My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.

The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:

“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does”.—[Official Report, 19/4/23; cols. 274-75.]

This is excellent and I thank the Government for saying it. But the full range of harms and risks to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.

The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13 year-old boy was recommended a video about the top 10 porn-making countries, and that a 13 year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.

Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.

It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.

The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.

Baroness Kidron (CB)

I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that

“primary priority content harmful to children”

will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I have tried to outline the Bill’s definition of content, which I think will give some reassurance that other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that; she can then perhaps intervene again if I have not dealt with it to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.

For now, I know there have been concerns from some noble Lords that, if functionalities are not labelled as harms in the legislation, they will not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as, for instance, an algorithm, which without content cannot pose a risk of harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children as well as other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affects risk, how the service is used and how the use of algorithms could increase the risks to children.

Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times, or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond now to the points she made earlier. The definition of harm in the Bill, under Clause 205, makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content: content which is not harmful per se, but which, if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.

This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risks in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill, and the safety measures they will be required to put in place to manage the service’s risks, will take account of this bigger-picture risk profile.

The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.

The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority lists of content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is, content that meets the threshold of harmful content to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.

I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.

Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.

I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect for a large number of clauses and timing. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the progress of the Bill through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.

Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.

We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.

Baroness Kidron (CB)

I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.

I am grateful and pleased that the Minister has said that the Government are moving in this direction, but there are a couple of things that I have to come back on. First, I have swiftly read Clause 205’s definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm: it is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority content, priority content and so on. Those could sit within the schedule of content, so the two things are not actually mutually exclusive. I would be very happy to have a think about that.

What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.

Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.

Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like a Marx Brothers movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that through the Bill.