Lords Chamber
My Lords, it is a great privilege to speak after the noble Lord, Lord Forsyth. I declare my interest as a director of a film and TV production company that works regularly with studio streamers and broadcasters, including the PSBs.
I welcome the Bill and, like others, wish it had been with us a little earlier. The focus of my remarks will be to question whether the Bill has kept up with changes in the media landscape. First, however, I add my support to recommendations made by the pre-legislative committee in the other place. At Second Reading, Dame Caroline Dinenage, the chair of the committee, said that the removal of the genre list
“was something that the PSBs themselves did not want to linger on in their evidence to us”.—[Official Report, Commons, 21/11/23; col. 248.]
The fact that they did not wish to linger on it should make us nervous. Their silence is evidence that a narrowing of the list will likely result in a downgrading of religious, arts, science and children’s programming, among others. It is true that the PSBs can take a somewhat creative view of what constitutes such programming; I once directed a film about sex workers in New York for the religious strand of Channel 4 television, so, clearly, the genres cannot be claimed to be too restrictive. However, the fact that this broad remit exists and is enshrined in legislation is at the heart of what is most distinctive about our public broadcasting ecology.
I know that the Minister has a deep concern about the arts and hope that the House already has his ear on that issue, but when he responds, can he explain both the rationale for narrowing the genres and where the Government imagine broadcasters will find room for religious or children’s programming? If we downgrade the breadth of what PSB means, in effect, we downgrade much of what the Bill seeks to protect. These categories are central to our collective understanding of how we see ourselves and our world.
Similarly, I fully support the committee in wishing prominence to be “significant” rather than “appropriate”. Ministers in the other place argued that it may not always be appropriate to make something prominent, let alone significant. Even if you can work out what that means and it is occasionally the case, it can be dealt with by the overarching duty in the Regulators’ Code for Ofcom to be proportionate. Meanwhile, the cost of not requiring significant prominence may, over time, render the prominence measure entirely ineffective. Let us imagine, in the near future, that broadcast is consumed using connected eyewear allowing us to walk into immersive environments or that each of us has an AI-derived personal programme; “significant” would drive innovative solutions as technology changes, while “appropriate” serves up an unimaginative status quo.
The same future-proof reasoning should mean that both digital on-demand services and app stores are in scope of the legislation. If, as is often the case, an app store is the gatekeeper or first port of call for content, but its terms require a 20% or 30% cut in revenue, the Bill, in effect, gives poor digital real estate to the PSBs and leaves the most lucrative sites profiteering from their content or carrying none at all.
That leads me to my final point: simply, that I am not sure that the Bill represents a vision of media fit for our age. We are about to suffer a tsunami of synthetic material in which the guesswork of large language models provides a further fragmentation of any consensus about the truth—witness last week’s pause on Google’s Gemini image generator after it generated images of German soldiers from World War II that incorrectly featured a black man and an Asian woman. Those of us in this Chamber know how preposterous that is, but that is simply not the case across all UK demographics or user groups. Similarly, damaging disinformation from all quarters about the war in Gaza is circulating in our schools, and the false citations and assertions swamping our academic community undermine the very rigour on which it stakes its reputation.
In this picture, we know that the consumption of news and PSB content is falling rapidly, particularly among children. Yet the Bill does not even begin to tackle the provenance or labelling of media content, does not set out expectations about misinformation or disinformation, and does not contain a must-carry component for YouTube, app stores, Facebook or Instagram. While those things appear to be out of scope of the Bill, as a veteran of the then Online Safety Bill, the digital markets Bill and the data Bill—and because Ministers have already promised that there will be no AI Bill—I ask the Minister to tell the House where they sit.
Finally, I understand that the BBC is not held in the same high regard by all in government as it is by the public. But in a world in which media is so fractured and toxic, the Bill could have usefully reimagined the role of our national broadcaster as a scaled-up alternative to the platforms. Imagine the UK offering a PSB to educate, entertain and inform across a broad range of genres—news, entertainment, education and digital services—from a genuinely trusted media voice. It would be a real alternative to those chasing advertising revenue to the detriment of the quality, social cohesion and security of our ever more fractured world, in which the audience is seen principally as a user/consumer rather than as a citizen. This is an investment that should have been made a decade ago but even now the BBC remains one of the few public assets that could be a global phenomenon. It could be world beating.
Neither culture nor politics is a zero-sum game. It does not follow that if social media or streamers have content, we need none of it in our collective hands; nor does it follow that, because this generation of the young has been hijacked by the persuasive design strategies of an advertising business model, that should form our blueprint for the next generation. The PSB system offers the opportunity of a contemporary and collective vision of what binds us. This is a crucial time in which money rules, politics is discredited, nation states are weakened, and the international community is divided by layers of self-interest and proxy wars. It is a time in which something that can be shared may also, at its finest, allow us to discern a collective path to a very much brighter future.
Lords Chamber
My Lords, I want to thank the Minister and other noble colleagues for such kind words. I really appreciate it.
I want to say very little. It has been an absolute privilege to work with people across both Houses on this. It is not every day that one keeps the faith in the system, but this has been a great pleasure. In these few moments that I am standing, I want to pay tribute to the bereaved parents, the children’s coalition, the NSPCC, my colleagues at 5Rights, Barnardo’s, and the other people out there who listen and care passionately that we get this right. I am not going to go through what we got right and wrong, but I think we got more right than we got wrong, and I invite the Minister to sit with me on Monday in the Gallery to make sure that those last little bits go right—because I will be there. I also remind the House that we have some work in the data Bill vis-à-vis the bereaved parents.
In all the thanks—and I really feel that I have had such tremendous support on my area of this Bill—I pay tribute to the noble Baroness, Lady Benjamin. She was there before many people were and suffered cruelly in the legislative system. Our big job now is to support Ofcom, hold it to account and help it in its task, because that is Herculean. I really thank everyone who has supported me through this.
My Lords, I am sure that your Lordships would not want the Bill to pass without hearing some squeak of protest and dissent from those of us who have spent so many days and weeks arguing for the interests of privacy and free speech, to which the Bill remains a very serious threat.
Before I come to those remarks, I associate myself with what other noble Lords have said about what a privilege it has been, for me personally and for many of us, to participate over so many days and weeks in what has been the House of Lords at its deliberative best. I almost wrote down that we have conducted ourselves like an academic seminar, but when you think about what most academic seminars are like—with endless PowerPoint slides and people shuttling around, and no spontaneity whatever—we exceeded that by far. The conversational tone that we had in the discussions, and the way in which people who did not agree were able to engage—indeed, friendships were made—meant that the whole thing was done with a great deal of respect, even for those of us who were in the small minority. At this point, I should perhaps say on behalf of the noble Baroness, Lady Fox of Buckley, who participated fully in all stages of the Bill, that she deeply regrets that she cannot be in her place today.
I am not going to single out anybody except for one person. I made the rather frivolous proposal in Committee that all our debates should begin with the noble Lord, Lord Allan of Hallam; we learned so much from every contribution he made that he really should have kicked them all off. We would all have been a great deal more intelligent about what we were saying, and understood it better, had we heard what he had to say. I certainly have learned a great deal from him, and that was very good.
I will raise two issues only that remain outstanding and are not assuaged by the very odd remarks made by my noble friend as he moved the Third Reading. The first concerns encryption. The fact of the matter is that everybody knows that you cannot do what Ofcom is empowered by the Bill to do without breaching end-to-end encryption. It is as simple as that. My noble friend may say that that is not the Government’s intention and that it cannot be forced to do it if the technology is not there. None of that is in the Bill, by the way. He may say that at the Dispatch Box but it does not address the fact that end-to-end encryption will be breached if Ofcom finds a way of doing what the Bill empowers it to do, so why have we empowered it to do that? How do we envisage that Ofcom will reconcile those circumstances where platforms say that they have given their best endeavours to doing something and Ofcom simply does not believe that they have? Of course, it might end up in the courts, but the crucial point is that that decision, which affects so many people—and so many people nowadays regard it as a right to have privacy in their communications—might be made by Ofcom or by the courts but will not be made in this Parliament. We have given it away to an unaccountable process and democracy has been taken out of it. In my view, that is a great shame.
I come back to my second issue—I will not be very long. I constantly ask about Wikipedia. Is Wikipedia in scope of the Bill? If it is, is it going to have to do prior checking of what is posted? That would destroy its business model and make many minority language sites—I instanced Welsh—totally unviable. My noble friend said at the Dispatch Box that, in his opinion, Wikipedia was not going to be in scope of the Bill. But when I asked why we could not put that in the Bill, he said it was not for him to decide whether it was in scope and that the Government had set up this wonderful structure whereby Ofcom will tell us whether it is—almost without appeal, and again without any real democratic scrutiny. Oh yes, and we might have a Select Committee, which might write a very good, highly regarded report, which might be debated some time within the ensuing 12 months on the Floor of your Lordships’ House. However, we will have no say in that matter; we have given it away.
I said at an earlier stage of the Bill that, for privacy and censorship, this represents the closest thing to a move back to the Lord Chamberlain and Lady Chatterley’s Lover that you could imagine but applied to the internet. That is bad, but what is almost worse is this bizarre governance structure where decisions of crucial political sensitivity are being outsourced to an unaccountable regulator. I am very sad to say that I think that, at first contact with reality, a large part of this is going to collapse, and with it a lot of good will be lost.
Lords Chamber
My Lords, I shall speak to my Amendment 275A in this group. It would place a duty on Ofcom to report annually on areas where our legal codes need clarification and revision to remain up to date as new technologies emerge—and that is to cover technologies, some of which we have not even thought of yet.
Government Amendments 206 and 209 revealed the need for an amendment to the Bill and how it would operate, as they clarify that reference to pornographic content in the Bill includes content created by a bot. However, emerging technologies will need constant scrutiny.
As the noble Lord, Lord Clement-Jones, asked, what about provider content, which forms the background to the user interaction and may include many harms? For example, would a game backdrop that includes anti-Semitic slurs, a concentration camp, a sex shop or a Ku Klux Klan rally be caught by the Bill?
The Minister confirmed that “content” refers to anything communicated by means of an internet service and that an encounter includes any content that individuals read, view, hear or otherwise experience, making providers liable for the content that they publish. Is that liability civil, regulatory or criminal?
As Schedule 1 goes to some lengths to exempt some service-to-provider content, can the Minister for the record provide chapter and verse, as requested by the noble Lord, Lord Clement-Jones, on provider liability and, in particular, confirm whether such content would be dealt with by the Part 3 duties under the online safety regime, or whether users would have to rely on existing law and pursue claims at their own expense through the courts, or the police would carry the burden of further enforcement?
Last week, the Minister confirmed that “functionality” captures any feature enabling interactions of any description between service users, but are avatars or objects created by the provider of a service, not by an individual user, in scope and therefore subject to risk assessments and their mitigation requirements? If so, will these functionalities also be added to user empowerment tools, enabling users to opt out of exposure to them, or will they be caught only by child safety duties? Are environments provided by a service provider, such as a backdrop to immersive environments, in scope through the definition of “functionality”, “content” or both? When this is provider content and not user-generated content, will this still hold true?
All this points to a deeper issue. Internet services have become more complex and vivid, with extremely realistic avatars and objects indistinguishable from people and objects in the real world. This amendment avoids focusing on negatives associated with AI and new technologies but tries to ensure that the online world is as safe as the offline world should be. It is worth noting that Interpol is already investigating how to deal with criminals in the metaverse and anticipating crimes against children, data theft, money laundering, fraud and counterfeit, ransomware, phishing, sexual assault and harassment, among other things. Many of these behaviours operate in grey areas of the law where it is not clear whether legal definitions extend to the metaverse.
Ofcom has an enormous task ahead, but it is best placed to consider the law’s relationship to new technological developments and to inform Parliament. Updating our laws through the mechanisms proposed in Amendment 275A will provide clarity to the courts, judges, police and prosecution service. I urge the Minister to provide as full an answer as possible to the many questions I have posed. I am grateful to him for all the work he has been doing. If he cannot accept my amendment as worded, will he provide an assurance that he will return to this with a government amendment at Third Reading?
My Lords, I will speak to Amendment 191A in my name. I also support Amendment 186A in the name of the noble Lord, Lord Moylan, Amendment 253 in the name of the noble Lord, Lord Clement-Jones, and Amendment 275A in the name of my noble friend Lady Finlay. I hope that my words will provide a certain level of reassurance to the noble Lord, Lord Moylan.
In Committee and on Report, the question was raised as to how to support the coronial system with information, education and professional development to keep pace with the impact of the fast-changing digital world. I very much welcome the Chief Coroner’s commitment to professional development for coroners but, as the Minister said, this is subject to funding. While it is right that the duty falls to the Chief Coroner to honour the independence and expert knowledge associated with his roles, this amendment seeks to support his duties with written guidance from Ofcom, which has no such funding issue since its work will be supported by a levy on regulated companies—a levy that I argue could usefully and desirably contribute to the new duties that benefit coroners and bereaved parents.
The role of a coroner is fundamental. They must know what preliminary questions to ask and how to triage the possibility that a child’s digital life is relevant. They must know that Ofcom is there as a resource and ally and how to activate its powers and support. They must know what to ask Ofcom for, how to analyse information they receive and what follow-up questions might be needed. Importantly, they must feel confident in making a determination and describing the way in which the use of a regulated service has contributed to a child’s death, in the case that that is indeed their finding. They must be able to identify learnings that might prevent similar tragedies happening in the future. Moreover, much of the research and information that Ofcom will gather in the course of its other duties could be usefully directed at coroners. All Amendment 191A would do is add to the list of reports that Ofcom has to produce with these issues in mind. In doing so, it would do the Chief Coroner the service of contributing to his own needs and plans for professional development.
I turn to Amendment 186A in the name of the noble Lord, Lord Moylan, who makes a very significant point in bringing it forward. Enormous effort goes into creating an aura of exceptionality for the tech sector, allowing it to avoid laws and regulations that routinely apply to other sectors. These are businesses that benefit from our laws, such as intellectual copyright or international tax law. However, they have negotiated a privileged position in which they have privatised the benefits of our attention and data while outsourcing most of the costs of their service to the public purse or, indeed, their users.
Terms and conditions are a way in which a company enters into a clear agreement with its users, who then “pay” for access with their attention and their data: two of the most valuable commodities in today’s digital society. I am very sympathetic to the noble Lord’s wish to reframe people, both adults and children, from a series of euphemisms that the sector employs—such as “users”, “community members”, “creators” or “participants”—to acknowledge their status as consumers who have rights and, in particular, the right to expect the product they use to be safe and for providers to be held accountable if it is not. I join the noble Lord in asserting that there are now six weeks before Third Reading. This is a very valuable suggestion that is worthy of government attention.
Amendment 253 in the name of the noble Lord, Lord Clement-Jones, puts forward a very strong recommendation of the pre-legislative committee. We were a bit bewildered and surprised that it was not taken up at the time, so I will be interested to hear what argument the Minister makes to exclude it, if indeed he does so. I say to him that I have already experienced the frustration of being bumped from one regulator to another. Although my time as an individual or the organisational time of a charity is minor in the picture we are discussing, it is costly in time and resources. I point to the time, resources and potential effectiveness of the regulatory regime. However well oiled and well funded the regulatory regime of the Online Safety Bill is, I do not think it will be as well oiled and well funded as those that it seeks to regulate.
I make it clear that I accept the arguments of not wanting to create a super-regulator or slow down or confuse existing regulators which each have their own responsibilities, but I feel that the noble Lord, Lord Clement-Jones, has approached this with more of a belt-and-braces approach rather than a whole realignment of regulators. He simply seeks to make it explicit that regulators can, should and do have a legal basis on which to work singularly or together when it suits them. As I indicated earlier, I cannot quite understand why that would not be desirable.
Finally, in what is truly a miscellaneous group, I will refer to the amendment in the name of my noble friend Lady Finlay. I support the intent of this amendment and sincerely hope that the Minister will be able to reassure us that this is already in the Bill and will be done by Ofcom under one duty or another. I hope that he will be able to point to something that includes this. I thank my noble friend for raising it, as it harks back to an amendment in Committee in my name that sought to establish that content deemed harmful in one format would be deemed harmful in all formats—whether synthetic, such as AI, the metaverse or augmented reality. As my noble friend alluded to, it also speaks to the debate we had last week in relation to the amendment from the noble Lord, Lord Clement-Jones, about provider content in the metaverse.
My Lords, let me add to this miscellany by speaking to the government amendments that stand in my name as part of this group. The first is Amendment 288A, which we mentioned on the first group of amendments on Report because it relates to the new introductory clause, Clause 1, and responds to the points raised by the noble Lord, Lord Stevenson of Balmacara. I am very happy to say again that the Government recognise that people with multiple and combined characteristics suffer disproportionately online and are often at greater risk of harm. This amendment therefore adds a provision in the new interpretation clause, Clause 1, to put beyond doubt that all the references to people with “a certain characteristic” throughout the Bill include people with a combination of characteristics. We had a good debate about the Interpretation Act 1978, which sets that out, but we are happy to set it out clearly here.
In his Amendment 186A, my noble friend Lord Moylan seeks to clarify a broader issue relating to consumer rights and online platforms. He got some general support—certainly gratitude—for raising this issue, although there was a bit of a Committee-style airing of it and a mixture of views on whether this is the right way or the right place. The amendment seeks to make it clear that certain protections for consumers in the Consumer Rights Act 2015 apply when people use online services and do not pay for them but rather give up their personal data in exchange. The Government are aware that the application of the law in that area is not always clear in relation to free digital services and, like many noble Lords, express our gratitude to my noble friend for highlighting the issue through his amendment.
We do not think that the Bill is the right vehicle for attempting to provide clarification on this point, however. We share some of the cautions that the noble Lord, Lord Allan of Hallam, raised and agree with my noble friend Lady Harding of Winscombe that this is part of a broader question about consumer rights online beyond the services with which the Bill is principally concerned. It could be preferable that the principle that my noble friend Lord Moylan seeks to establish through his amendment should apply more widely than merely to category 1 services regulated under the Bill. I assure him that the Bill will create a number of duties on providers which will benefit users and clarify that they have existing rights of action in the courts. We discussed these new protections in depth in Committee and earlier on Report. He drew attention to Clause 65(1), which puts a requirement on all services, not just category 1 services, to include clear and accessible provisions in their terms of service informing users about their right to bring a claim for breach of contract. Therefore, while we are grateful, we agree with noble Lords who suggested that this is a debate for another day and another Bill.
Amendment 191A from the noble Baroness, Lady Kidron, would require Ofcom to issue guidance for coroners and procurators fiscal to aid them in submitting requests to Ofcom to exercise its power to obtain information from providers about the use of a service by a deceased child. While I am sympathetic to her intention, I do not think that her amendment is the right answer. It would be inappropriate for an agency of the Executive to issue guidance to a branch of the judiciary. As I explained in Committee, it is for the Chief Coroner to provide detailed guidance to coroners. This is written to assist coroners with the law and their legal duties and to provide commentary and advice on policy and practice.
The amendment tabled by the noble Baroness cuts across the role of the Chief Coroner and risks compromising the judicial independence of the coroner, as set out in the Constitutional Reform Act 2005. As she is aware, the Chief Coroner has agreed to consider issuing guidance to coroners on social media and to consider the issues covered in the Bill. He has also agreed to explore whether coroners would benefit from additional training, with the offer of consultation with experts including Ofcom and the Information Commissioner’s Office. I suggest that the better approach would be for Ofcom and the Information Commissioner’s Office to support the Chief Coroner in his consideration of these issues where he would find that helpful.
I agree with the noble Lord, Lord Allan, that coroners must have access to online safety expertise given the technical and fast-moving nature of this sector. As we have discussed previously, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest following a request from a coroner which will provide that expertise. I hope that this reassures the noble Baroness.
I understand the report on a specific death, which is very welcome and part of the regime as we all see it. The very long list of things that the coroner may not know that they do not know, as I set out in the amendment, is the issue which I and other noble Lords are concerned about. If the Government could find a way to make that possible, I would be very grateful.
We are keen to ensure that coroners have access to the information and expertise that they need, while respecting the independence of the judicial process to decide what they do not know and would like to know more about and the role of the Chief Coroner there. It is a point that I have discussed a lot with the noble Baroness and with my noble friend Lady Newlove in her former role as Victims’ Commissioner. I am very happy to continue doing so because it is important that there is access to that.
The noble Lord, Lord Stevenson, spoke to the amendments tabled by the noble Baroness, Lady Merron, about supposedly gendered language in relation to Clauses 141 and 157. As I made clear in Committee, I appreciate the intention—as does Lady Deben—of making clear that a person of either sex can perform the role of chairman, just as they can perform the role of ombudsman. We have discussed in Committee the semantic point there. The Government have used “chairman” here to be consistent with terminology in the Office of Communications Act 2002. I appreciate that this predates the Written Ministerial Statement which the noble Lord cited, but that itself made clear that the Government at the time recognised that in practice, parliamentary counsel would need to adopt a flexible approach to this change—for example, in at least some of the cases where existing legislation originally drafted in the former style is being amended.
The noble Lord may be aware of a further Written Ministerial Statement, made on 23 May last year, following our debates on gendered language on another Bill, when the then Lord President of the Council and Leader of the House of Commons said that the Office of the Parliamentary Counsel would update its drafting guidance in light of that. That guidance is still forthcoming. However, importantly, the term here will have no bearing on Ofcom’s decision-making on who will chair the advisory committees it must establish; that could indeed be a person of either sex.
Amendment 253 seeks to enable co-operation, particularly via information-sharing, between Ofcom and other regulators within the UK. I reassure noble Lords that Section 393 of the Communications Act 2003 already includes provisions for sharing information between Ofcom and other regulators in the UK.
As has been noted, Ofcom already co-operates effectively with other domestic regulators. That has been strengthened by the establishment of the Digital Regulation Co-operation Forum. By promoting greater coherence, the forum helps to resolve potential tensions, offering clarity for people and the industry. It ensures collaborative work across areas of common interest to address complex problems. Its outputs have already delivered real and wide-ranging impacts, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues, and horizon-scanning activities on new regulatory challenges. We will continue to assess how best to support collaboration between digital regulators and to ensure that their approaches are joined up. We therefore do not think that Amendment 253 is necessary.
My Lords, before I talk to the amendments I had intended to address, I will make a very narrow point in support of the noble Baroness, Lady Fraser. About 10 years ago, when I started doing work on children, I approached Ofcom and asked why all its research goes to 24, when childhood finishes at 18 and the UNCRC says that a child needs special consideration. Ofcom said, “Terribly sorry, but this is our inheritance from a marketing background”. The Communications and Digital Committee later wrote formally to Ofcom and asked if it could do its research up to 18 and then from 18 to 24, but it appeared to be absolutely impossible. I regret that I do not know what the current situation is and I hope that, with the noble Lord, Lord Grade, in place it may rapidly change overnight. My point is that the detailed description that the noble Baroness gave the House about why it is important to stipulate this is proven by that tale.
I also associate myself with the remarks of the noble Lord, Lord Allan, who terrified me some 50 minutes ago. I look forward to hearing what will be said.
I in fact rose to speak to government Amendments 196 and 199, and the group of amendments on access to data for researchers. I welcome the government amendments to which I added my name. I really am delighted every time the Government inch forward into the area of the transparency of systemic and design matters. The focus of the Bill should always be on the key factor that separates digital media from other forms of media, which is the power to determine, manipulate and orchestrate what a user does next, what they see, how they behave and what they think. That is very different and is unique to the technology we are talking about.
It will not surprise the Minister to hear that I would have liked this amendment to cover the design of systems and processes, and features and functionalities that are not related to content. Rather than labouring this point, on this occasion I will just draw the Minister’s attention to an article published over the weekend by Professor Henrietta Bowden-Jones, the UK’s foremost expert on gambling and gaming addiction. She equates the systems and processes involved in priming behaviours on social media with the more extreme behaviours that she sees in her addiction clinics, with ever younger children. Professor Bowden-Jones is the spokesperson on behavioural addictions for the Royal College of Psychiatrists, and we ignore her experience of the loops of reward and compulsion that manipulate behaviour, particularly the behaviour of children, at our peril.
I commend the noble Lord, Lord Bethell, for continuing to press the research issue and coming back, even in the light of the government amendment, with a little more. Access to good data about the operation of social media is vital in holding regulated companies to account, tracking the extent of harms, building an understanding of them and, importantly, building knowledge about how they might be sensibly and effectively addressed.
The benefit of having a period of time between the last day of Report on Wednesday and Third Reading is that it gives the Minister, the Bill team and parliamentary counsel time to reflect on the kind of power that could be devised. The wording could be devised, perhaps in a general way, and I would have thought that six weeks would be quite adequate for that. After all, this is not a power that is immediately going to be used; it is a general power that could be brought into effect by regulation. Surely it is not beyond the wit to devise something suitable.
Sit down or stand up—I cannot remember.
I wonder whether the department has looked at the DSA and other situations where this is being worked out. I recognise that it takes a period of time, but it is not without some precedent that a pathway should be described.
We do not think that six weeks is enough time for the evidence base to develop sufficiently; our assessment is that to endow the Secretary of State with that power at this point would be premature.
Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.
Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.
Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.
On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.
My Lords, I will speak to the government Amendments 274B and 274C. I truly welcome a more detailed approach to Ofcom’s duties in relation to media literacy. However, as is my theme today, I raise two frustrations. First, having spent weeks telling us that it is impossible to include harms that go beyond content, and having opposed amendments on that point, the Government now include in their media literacy amendments a duty to help users to understand the harmful ways in which regulated services may be used, in addition to understanding the nature and impact of harmful content. It appears to suggest that it is users who are guilty of misusing products and services, rather than putting any emphasis on the design or processes that determine how a service is most often used.
I believe that all of us, including children, are participants in creating an online culture and that educating and empowering users of services is essential. However, it should not be a substitute for designing a service that is safe by design and default. To make my point absolutely clear, I recount the findings of researchers who undertook workshops in 28 countries with more than 1,000 children. The researchers were at first surprised to find that, whether in Kigali, São Paulo or Berlin, to an overwhelming extent children identified the same problems online—harmful content, addiction, lack of privacy and so on. The children’s circumstances were vastly different—country and town, Africa and the global north et cetera—but when the researchers did further analysis, they realised that the children had such similar experiences because they were using the same products. The products were more determining of the outcome than anything to do with religion, education, status, age, the family or even the country. The only other factor that loomed large, which I admit that the Government have recognised, was gender. Those were the two most crucial findings. It is an abdication of adult responsibility to place the onus on children to keep themselves safe. The amendment and the Bill, as I keep mentioning, should focus on the role of design, not on how a child uses a service.
My second point, which is of a similar nature, is that I am very concerned that a lot of digital literacy—for adults as well as children, but my particular concern is in schools—is provided by the tech companies themselves. Therefore, once again their responsibility, their role in the system and process of what children might find from reward loops, algorithms and so on, is very low down on the agenda. Is it possible at this late stage to consider that Ofcom might have a responsibility to consider the system design as part of its literacy review?
My Lords, this has been a very interesting short debate. Like other noble Lords, I am very pleased that the Government have proposed the new clauses in Amendments 274B and 274C. The noble Baroness, Lady Bull, described absolutely the importance of media literacy, particularly for disabled people and for the vulnerable. This is really important for them. It is important also not to fall into the trap described by the noble Baroness, Lady Kidron, of saying, “You are a child or a vulnerable person. You must acquire media literacy—it’s your obligation; it’s not the obligation of the platforms to design their services appropriately”. I take that point, but it does not mean that media literacy is not extraordinarily important.
However, sadly, I do not believe that the breadth of the Government’s new media literacy amendments is as wide as the original draft Bill. If you look back at the draft Bill, that was a completely new and upgraded set of duties right across the board, replacing Section 11 of the Communications Act and, in a sense, fit for the modern age. The Government have made a media literacy duty which is much narrower. It relates only to regulated services. This is not optimum. We need something broader which puts a bigger and broader duty for the future on to Ofcom.
It is also deficient in two respects. The noble Lord, Lord Knight, will speak to his amendments, but it struck me immediately when looking at that proposed new clause that we were missing all the debate about functionalities and so on that the noble Baroness, Lady Kidron, debated the other day, regarding design, and that we must ensure that media literacy encompasses understanding the underlying functionalities and systems of the platforms that we are talking about.
I know that your Lordships will be very excited to hear that I am going to refer again to the Joint Committee. I know that the Minister has read us from cover to cover, but in paragraph 381 of our report on the draft Bill we said, and it is still evergreen:
“If the Government wishes to improve the UK’s media literacy to reduce online harms, there must be provisions in the Bill to ensure media literacy initiatives are of a high standard. The Bill should empower Ofcom to set minimum standards for media literacy initiatives that both guide providers and ensure the information they are disseminating aligns with the goal of reducing online harm”.
I had a very close look at the clause. I could not see that Ofcom is entitled to set minimum standards. The media literacy provisions sadly are deficient in that respect.
My Lords, it is all quite exciting now, is it not? I can say “hear, hear!” a lot; everyone is talking about freedom of expression. I cannot tell noble Lords how relieved and pleased I was both to hear the speeches and to see Amendment 228 from the noble Lord, Lord Allan of Hallam, and the noble Viscount, Lord Colville of Culross, who both explained well why this is so important. I am so glad that, even late in our discussions on Report, it has returned as an important issue.
We have already discussed how, in many cases, especially when it comes to what is seen as illegal speech, decisions about illegality are very complicated. They are complicated in the law courts and offline, and that is when they have the full power of lawyers, the criminal justice system and so on trying to make decisions. Leaving it up to people who, through no fault of their own, are not qualified but who work in a social media company to try to make that decision in a climate of quite onerous obligations—and having phrases such as “reasonable grounds to infer”—will lead to lawful expression being overmoderated. Ultimately, online platforms will act with an abundance of caution, which will lead to a lot of important speech—perfectly lawful, if not always worthy, speech; the public’s speech and the ability to speak freely—being removed. That is not a trivial side issue; it will discredit the Bill, if it has not done so already.
Whenever noble Lords make contributions about why a wide range of amendments and changes are needed—particularly in relation to protecting children, harm and so on—they constantly tell us that the Bill should send an uncompromising message. The difficulty I have is with the danger that the Bill will send an uncompromising message that freedom of expression is not important. I urge the Minister to look carefully at the amendment, because the message should be that, while the Bill is trying to tackle online harm and to protect children in particular—which I have never argued against—huge swathes of it might inadvertently silence people and deprive them of the right to information that they should be able to have.
My Amendment 229—I am not sure why it is in this group, but that is nothing new in the way that the groupings have worked—is about lawful speech and about what content is filtered by users. I have already argued for the replacement of the old legal but harmful duty, but the new duty of user empowerment is welcome, and at face value it puts users in the driving seat and allows adults to judge for themselves what they want and do not want to see. But—and it is a large but—that will work only if users and providers agree about when content should be filtered and what content is filtered.
As with all decisions on speech, as I have just mentioned, particularly in the context of a heightened climate of confusion and sensitivity regarding identity politics and the cancel-culture issues that we are all familiar with, there are some problems with the way that things stand in the Bill. I hope I am using the term “reasonable grounds to infer” in a better way than it is used in relation to illegality. My amendment specifies that companies need to have reasonable grounds to infer that content is abusive or inciting hatred when filtering out content with those user empowerment tools. Where a user chooses to filter out hateful content based on race, on being a woman or whatever, the filter should catch only content that genuinely falls under those headings. There is a risk that, without this amendment, technologies or individuals working for companies could operate in a heavy-handed way in filtering out legitimate content.
I shall give a couple of examples. Say that someone chooses to filter out abusive content targeting the protected characteristic of race. I imagine that they would have a reasonable expectation that that filter would target aggressive, unpleasant content demeaning to a person because of their race, but does the provider agree with that? Will it interpret my filtering choice as a user in the most restrictive way possible in a bid to protect my safety or by seeing my sensibilities as having a low threshold for what it might consider to be abuse?
The race issue illustrates where we get into difficulties. Will the filterers take their cue from the document that has just been revealed, which was compiled by the Diocese of St Edmundsbury and Ipswich, which the anti-racist campaigning group Don’t Divide Us has just released, and which is being used in 87 schools? Under the heading of racism, we find that “passive racism” includes agreeing that
“There are two sides to every story”,
or if you deny white privilege or if you start a sentence saying, “Not all white people”. “Veiled racism” in this document—which, as I say, is being used in schools for that particular reason by the Church of England—includes a “Euro-centric curriculum” or “cultural appropriation”. “Racist discrimination” includes “anti-immigration policies”, which, as I pointed out before, would indicate that some people would call the Government’s own Bill tonight racist.
The reason why I mention that is that you might think, “I am going to have racism filtered out”, but if there is too much caution then you will have filtered out very legitimate discussions on immigration and cultural appropriation. You will be protected, but if, for example, the filterer follows certain universities that have deemed the novels of Walter Scott, the plays of William Shakespeare or Enid Blyton’s writing as racist, then you can see that we have some real problems. When universities have said there is misogynistic bullying and sexual abuse in “The Great Gatsby” and Ovid’s “Metamorphoses”, I just want to make sure that we do not end up in a situation where there is oversensitivity by the filterers. Perhaps the filtering will take place by algorithm, machine learning and artificial intelligence, but the EHRC has noted that algorithms just cannot cope with the context, cultural difference and complexity of language within the billions of items of content produced every day.
Amendment 229 ensures that there is a common standard—a standard of objective reasonableness. It is not perfect at all; I understand that reasonableness itself is open to interpretation. However, it is an attempt to ensure that the Government’s concept of user empowerment is feasible by at least aspiring to a basic shared understanding between users and providers as to what will be filtered and what will not, and a check against providers’ filter mechanisms removing controversial or unpopular content in the name of protecting users. Just as I indicated in terms of sending a message, if the Government could indicate to the companies that, rather than taking a risk-averse attitude, they had to bear in mind freedom of expression and not be oversensitive or overcautious, we might begin to get some balance. Otherwise, an awful lot of lawful material will be removed that is not even harmful.
My Lords, I support Amendment 228. I spoke on this issue to the longer amendment in Committee. To decide whether something is illegal without the entire apparatus of the justice system, in which a great deal of care is taken to decide whether something is illegal, at high volume and high speed, is very worrying. It strikes me as amusing because someone commented earlier that they like a “must” instead of a “maybe”. In this case, where the Bill says that a provider must treat the content as content of the kind in question accordingly, I caution that something a little softer is needed, not a cliff edge that ends up in horrors around illegality, where someone who has acted in self-defence is accused of a crime of violence, as happens to many women, and so on and so forth. I do not want to labour the point. I just urge a gentle landing rather than, as it is written, a cliff edge.
My Lords, this has been a very interesting debate. Beyond peradventure, my noble friend Lord Allan, the noble Viscount, Lord Colville, and the noble Baroness, Lady Fox, have demonstrated powerfully the perils of this clause. “Lawyers’ caution” is one of my noble friend’s messages to take away, as are the complexities in making these judgments. It was interesting when he mentioned the sharing for awareness’s sake of certain forms of content and the judgments that must be taken by platforms. His phrase “If in doubt, take it out” is pretty chilling in free speech terms—I think that will come back to haunt us. As the noble Baroness, Lady Fox, said, the wrong message is being delivered by this clause. It is important to have some element of discretion here and not, as the noble Baroness, Lady Kidron, said, a cliff edge. We need a gentler landing. I very much hope that the Minister will land more gently.
My Lords, I pay tribute to the noble Baroness, Lady Harding, for her role in bringing this issue forward. I too welcome the government amendments. It is important to underline that adding the potential role of app stores to the Bill is neither an opportunity for other companies to fail to comply and wait for the gatekeepers to do the job nor a one-stop shop in itself. It is worth reminding ourselves that digital journeys rarely start and finish in one place. In spite of the incredible war for our attention, in which products and services attempt to keep us rapt on a single platform, it is quite important for everyone in the ecosystem to play their part.
I have two minor points. First, I was not entirely sure why the government amendment requires the Secretary of State to consult rather than Ofcom. Can the Minister reassure me that, whoever undertakes the consultation, it will include children and children’s organisations as well as tech companies? Secondly, like the noble Baroness, Lady Harding, I was a little surprised that the amendment does not define an app store but relies on “the ordinary meaning of” the term, which seems to leave room for change. If there is a good reason for that—I am sure there is—then we must ensure that app stores cannot suddenly rebrand as something else and that that gatekeeper function is kept absolutely front and centre.
Notwithstanding those comments, and associating myself with the idea that nothing should wait until 2025-26, I am very grateful to the Government for bringing this forward.
My Lords, I will make a brief contribution because I was the misery guts when this was proposed first time round. I congratulate the noble Baroness, Lady Harding, not just on working with colleagues to come up with a really good solution but on seeking me out. If I heard someone be as miserable as I was, I might try to avoid them. She did not; she came and asked me, “Why are you miserable? What is the problem here?”, and took steps to address it. Through her work with the Government, their amendments address my main concerns.
My first concern, as we discussed in Committee, was that we would be asking large companies to regulate their competitors, because the app stores are run by large tech companies. She certainly understood that concern. The second was that I felt we had not necessarily yet clearly defined the problem. There are lots of problems. Before you can come up with a solution, you need a real consensus on what problem you are trying to address. The government amendment will very much help in saying, “Let’s get really crunchy about the actual problem that we need app stores to address”.
Finally, I am a glass-half-full kind of guy as well as a misery guts—there is a contradiction there—and so I genuinely think that these large tech businesses will start to change their behaviour and address some of the concerns, such as getting age ratings correct, just by virtue of our having this regulatory framework in place. Even if today the app stores are technically outside, the fact that the sector is inside and that this amendment tells them that they are on notice will, I think and hope, have a hugely positive effect and we will get the benefits much more quickly than the timescale envisaged in the Bill. That feels like a true backstop. I sincerely hope that the people in those companies, who I am sure will be glued to our debate, will be thinking that they need to get their act together much more quickly. It is better for them to do it themselves than wait for someone to do it to them.
(1 year, 5 months ago)
Lords Chamber
My Lords, a lot of positive and interesting things have been said that I am sympathetic to, but this group of amendments raises concerns about a democratic deficit: if too much of the Bill is either delegated to the Secretary of State or open to interference by the Secretary of State in Ofcom’s work, who decides what those priorities are? I will ask for a couple of points of clarification.
I am glad to see that the term “public policy” has been replaced, because what did that mean? Everything. But I am not convinced that saying that the Secretary of State can decide not just on national security but on public safety and public health is reassuring in the present circumstances. The noble Lord, Lord Allan, has just pointed out what it feels like to be leaned on. We had a very recent example internationally of Governments leaning on big tech companies in relation to Covid policies, lockdowns and so on, and removing material that was seen to contradict official public health advice—often public health advice that turned out not to be accurate at all. There should at least have been a lot more debate about what were political responses to a terrible virus. Noble Lords will know that censorship became a matter of course during that time, and Governments interfering in or leaning on big tech directly was problematic. I am not reassured that the Government hold to themselves the ability to lean on Ofcom around those issues.
It is also worth remembering that the Secretary of State already has a huge amount of power to designate, as we have discussed previously. They can designate what constitute priority illegal offences and priority content harmful to children, and that can all change beyond what we have discussed here. We have already seen that there is a constant expansion of what those harms can be, and having those decisions taken using only secondary legislation, unaccountable to Parliament or to public scrutiny, really worries me. It is likely to give a green light to every identity group and special-interest NGO to demand that the list of priority harms and so on be expanded. That is likely to make the job of the Secretary of State in responding to “something must be done” moral panics all the more difficult. If that is going to happen, we should have parliamentary scrutiny of it; it cannot just be allowed to happen elsewhere.
It is ironic that the Secretary of State is more democratic, because they are elected, than an unelected regulator. I just feel that there is a danger in so much smoke and mirrors. When the Minister very kindly agreed to see the noble Lord, Lord Moylan, and me, I asked in a rather exasperated way why Ofcom could not make freedom of expression a priority, with codes of practice so that it would have to check on freedom of speech. The Minister said, “It’s not up to me to tell Ofcom what to do”, and I thought, “The whole Bill is telling Ofcom what to do”. That did not seem to make any sense.
I had another exchange with the present Secretary of State—again, noble Lords will not be surprised to hear that it was not a sophisticated intervention on my part—in which I said, “Why can’t the Government force the big tech companies to put freedom of expression in their terms and conditions or terms of service?” The Minister said, “They are private companies; we’re not interfering in what they do”. So you just end up thinking, “The whole Bill is telling companies that they’re going to be compelled to act in relation to harm and safety, but not on freedom of expression”. What that means is that you feel all the time as though the Government are saying that they are outsourcing this to third parties, which means that you cannot hold anyone to account.
The civil liberties campaigner Guy Herbert compared this to what is happening with the banks at the moment: they are being blamed by the Government and held to account for things such as their treatment of politically exposed persons, and for terms and conditions that overconcentrate on values such as EDI and ESG, which may be leading to citizens of this country having their bank accounts closed down. The Government say that they will tell the regulator that it has to act and that the banks cannot behave in this way, but this all came from legislation—it is not as though the regulator was doing it off its own bat. Maybe it overinterpreted the legislation, and the banks then overinterpreted it again and overremoved.
The obvious analogy for me is that there is a danger here that we will not be able to hold anyone to account for overremoval of legitimate democratic discussion from the online world, because everyone is pointing the finger at everyone else. At the very least, the amendments are trying to say that any changes beyond what we have discussed so far on this Bill must come before Parliament. That is very important for any kind of democratic credibility to be attached to this legislation.
My Lords, I too express my admiration to the noble Baroness, Lady Stowell, for her work on this group with the Minister and support the amendments in her name. To pick up on what the noble Baroness, Lady Harding, said about infinite ping-pong, it can be used not only to avoid making a decision but as a form of power and of default decision-making—if you cannot get the information back, you are where you are. That is a particularly important point and I add my voice to those who have supported it.
I have a slight concern that I want to raise in public, so that I have said it once, and get some reassurance from the Minister. New subsection (B1)(d) in Amendment 134 concerns the Secretary of State directing Ofcom to change codes that may affect
“relations with the government of a country outside the United Kingdom”.
Many of the companies that will be regulated sit in America, which has been very forceful about protecting its sector. Without expanding on this too much, when it was suggested that senior managers would face some sort of liability in international fora, various parts of the American Government and state apparatus certainly made their feelings clearly known.
I am sure that the channels between our Government and the US are much more straightforward than any that I have witnessed, but it is absolutely definite that more than one Member of your Lordships’ House was approached about senior management liability and told, “This is a worry to us”. I believe that where we have landed is very good, but I would like the Minister to say what the limits of that power are and to acknowledge that it could get into a bit of a muddle between the economic outcomes we were talking about earlier—we celebrated that they had been taken off the list—and government relations. That was the thing that slightly worried me in the government amendments, which, in all other ways, I welcome.
My Lords, this has been a consistent theme ever since the Joint Committee’s report. It was reported on by the Delegated Powers and Regulatory Reform Committee, and the Communications and Digital Committee, chaired by the noble Baroness, Lady Stowell, has rightly taken up the issue. Seeing some movement from the Minister, particularly on Clause 29 and specifically in Amendments 134 to 137, is very welcome and consistent with some of the concerns that have been raised by noble Lords.
There are still questions to answer about Amendment 138, which my noble friend has raised. I have also signed the amendments to Clause 38 because I think the timetabling is extremely welcome. However, like other noble Lords, I believe we need to have Amendments 139, 140, 144 and 145 in place, as proposed by the noble Baroness, Lady Stowell of Beeston. The phrase “infinite ping-pong” makes us all sink into gloom in current circumstances—it is a very powerful phrase. I think the Minister really does have to come back with something better; I hope he will give us that assurance, and that his discussions with the noble Baroness, Lady Stowell, will bear further fruit.
I may not agree with the noble Lord, Lord Moylan, about the Clause 39 issues, but I am glad he raised issues relating to Clause 159. It is notable that of all the recommendations by the Delegated Powers and Regulatory Reform Committee, the Government accepted four out of five but did not accept the one related to what is now Clause 159. I have deliberately de-grouped the questions of whether Clauses 158 and 159 should stand part of the Bill, so I am going to pose a few questions which I hope, when we get to the second group which contains my clause stand part proposition, the Minister will be able to tell me effortlessly what he is going to do. This will prevent me from putting down further amendments on those clauses, because it seems to me that the Government are being extraordinarily inconsistent in terms of how they are dealing with Clauses 158 and 159 compared with how they have amended Clause 39.
For instance, Clause 158 allows the Secretary of State to issue a direction to Ofcom where the Secretary of State has reasonable grounds for believing that there is a threat to public health and safety or to national security. They can direct Ofcom to set objectives for how it uses its media literacy powers under Section 11 of the Communications Act for a specific period to address the threat, and can make Ofcom issue a public statement notice. That is rather extraordinary. I will not go into great detail at this stage, and I hope the Minister can save me from having to make a long speech further down the track, but the Government should not be in a position to direct a media regulator on a matter of content. The Secretary of State has no powers over Ofcom on the content of broadcast regulation—indeed, they have only limited powers of direction over radio spectrum and wires—and there is no provision for parliamentary involvement, although I accept that the Secretary of State must publish reasons for the direction. There is also the general question of whether the threshold is high enough to justify this kind of interference. So Clause 158 is not good news at all. It raises a number of questions which I hope the Minister will start to answer today, and maybe we can avoid a great debate further down the track.
My Lords, I rise briefly to support the noble Baroness, Lady Morgan, to welcome the government amendment and to say that this is a moment of delight for many girls—of all varieties. I echo the noble Baroness, Lady Fox, on the issue of having a broad consultation, which is a good idea. While our focus during the passage of this Bill was necessarily on preventing harm, I hope this guidance will be part of the rather more aspirational and exciting part of the digital world that allows young people to participate in social and civic life in ways that do not tolerate abuse and harm on the basis of their gender. In Committee, I said that we have a duty not to allow digital tech to be regressive for girls. I hope that this is a first step.
My Lords, on behalf of my party, all the groups mentioned by the noble Baroness, Lady Morgan, and potentially millions of women and girls in this country, I briefly express my appreciation for this government amendment. In Committee, many of us argued that a gender-neutral Bill would not achieve strong enough protection for women and girls as it would fail to recognise the gendered nature of online abuse. The Minister listened, as he has on many occasions during the passage of the Bill. We still have differences on some issues—cyberflashing, for instance—but in this instance I am delighted that he is amending the Bill, and I welcome it.
Why will Ofcom be required to produce guidance and not a code, as in the amendment originally tabled by the noble Baroness, Lady Morgan? Is there a difference, or is it a case of a rose by any other name? Is there a timescale by which Ofcom should produce this guidance? Are there any plans to review Ofcom’s guidance once produced, just to see how well it is working?
We all want the same thing: for women and girls to be free to express themselves online and not to be harassed, abused and threatened as they are today.
My Lords, I am most grateful to the noble Lord, Lord Clement-Jones, for tabling the amendment. If I had been quicker, I would have added my name to it, because he may—I use the word “may” advisedly, because I am not sure—have identified quite a serious gap in terms of future-proofing. As far as I understand it, in a somewhat naive way, the amendment probes whether there is a gap between provider-generated content and user-generated content and whether provider-generated content could lead to a whole lot of ghastly stuff on the metaverse without any way of tackling it because it is deemed to have fallen outside the scope of the Bill.
I am grateful to Carnegie UK for having tried to talk me through this—it is pretty complicated. As a specific example, I understand that a “Decentraland” avatar pops up on gaming sites, and it is useful because it warns you about the dangers of gambling and what it can lead to. But then there is the problem about the backdrop to this avatar: at the moment, it seems to be against gambling, but you can see how those who have an interest in gambling would be quite happy to have the avatar look pretty hideous but have a backdrop of a really enticing casino with lots of lights and people streaming in, or whatever. I am not sure where that would fit, because it seems that this type of content would be provider-generated. When it comes to the metaverse and these new ways of interacting with 3D immersion, I am not clear that we have adequately caught within the Bill some of these potentially dangerous applications. So I hope that the Minister will be able to clarify it for us today and, if not, possibly to write between now and the next time that we debate this, because I have an amendment on future-proofing, but it is in a subsequent group.
My Lords, I am interested to hear what the Minister says, but could he also explain to the House the difference in status of this sort of material in Part 5 versus Part 3? I believe that the Government brought in a lot of amendments that sorted it out and that many of us hoped were for the entire Bill, although we discovered, somewhat to our surprise, that they were only in Part 5. I would be interested if the Minister could expand on that.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content
“generated directly on the service by a user”,
which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content
“uploaded to or shared on the service by a user”,
which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.
A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.
Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—which is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, in which the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect the user from these virtual characters interacting with users in virtual worlds?
We talked before about bots controlled by service providers; the noble Lord, Lord Knight, asked questions on this. The Bill is designed to make online service providers responsible for the safety of their users in light of harmful activities that their platforms might facilitate. Providers of a user-to-user service will need to adhere to their duties of care, which apply to all user-generated content present on their service. The Bill does not, however, regulate content published by user-to-user providers themselves. That is because the providers are liable for the content they publish on the service themselves. The one exception to this—as the noble Baroness, Lady Kidron, alluded to in her contribution—is pornography, which poses a particular risk to children and is regulated by Part 5 of the Bill.
I am pleased to reassure the noble Lord, Lord Clement-Jones, that the Bill—
I thank the noble Lord for giving way. The Minister just said that private providers will be responsible for their content. I would love to understand what mechanism makes a provider responsible for their content.
I will write to noble Lords with further information and will make sure that I have picked up correctly the questions that they have asked.
On Amendment 152A, which the noble Lord, Lord Clement-Jones, has tabled, I am pleased to assure him that the Bill already achieves the intention of the amendment, which seeks to add characters and objects that might interact with users in the virtual world to the Bill’s definition of user-generated content. Let me be clear again: the Bill already captures any service that facilitates online user-to-user interaction, including in the metaverse or other augmented reality or immersive online worlds.
The Bill broadly defines “content” as
“anything communicated by means of an internet service”,
so it already captures the various ways in which users may encounter content. Clause 211 makes clear that “encounter” in relation to content for the purposes of the Bill means to,
“read, view, hear or otherwise experience”
content. That definition extends to the virtual worlds which noble Lords have envisaged in their contributions. It is broad enough to encompass any way of encountering content, whether that be audio-visually or through online avatars or objects.
In addition, under the Bill’s definition of “functionality”,
“any feature that enables interactions of any description between users of the service”
will be captured. That could include interaction between avatars or interaction by means of an object in a virtual world. All in-scope services must therefore consider a range of functionalities as part of their risk assessment and must put in place any necessary measures to mitigate and manage any risks that they identify.
I hope that that provides some assurance to the noble Lord that the concerns that he has raised are covered, but I shall happily write on his further questions before we reach the amendment that the noble Baroness, Lady Finlay, rightly flagged in her contribution.
My Lords, I strongly support Amendment 180, tabled by the noble Baroness, Lady Merron. I will also explain why I put forward Amendment 180A. I pay tribute to the noble Baroness, Lady Hayman, who pursued this issue with considerable force through her Question in the House.
There is clearly an omission in the Bill. One of its primary aims is to protect children from harmful online content, and animal cruelty content causes harm to the animals involved and, critically, to the people who view it, especially children. In Committee, in the Question and today, we have referred to the polling commissioned by the RSPCA, which found that 23% of 10 to 18 year-olds had seen animal cruelty on social media sites. I am sure that the numbers have increased since that survey in 2018. A study published in 2017 found—if evidence were needed—that:
“There is emerging evidence that childhood exposure to maltreatment of companion animals is associated with psychopathology in childhood and adulthood.”
The noble Baroness made an extremely good case, and I do not think that I need to add to it. When the Bill went through the Commons, assurances were given by the former Minister, Damian Collins, who acknowledged that the inclusion of animal cruelty content in the Bill deserves further consideration as the Bill progresses through its parliamentary stages. We need to keep up that pressure, and we will be very much supporting the noble Baroness if she asks for the opinion of the House.
Turning to my Amendment 180A, like the noble Baroness, I pay tribute to the Social Media Animal Cruelty Coalition, which is a very large coalition of organisations. We face a global extinction crisis which the UK Government themselves have pledged to reverse. Algorithmic amplification tools and social media recommendation engines have driven an explosive growth in online wildlife trafficking. A National Geographic article from 2020 quoted US wildlife officials describing the dizzying scale of the wildlife trade on social media. The UK’s national wildlife crime units say that cyber-enabled wildlife crime has become their priority focus, since virtually all wildlife cases they now investigate have a cyber component to them, usually involving social media or e-commerce platforms. In a few clicks it is easy to find pages, groups and postings selling wildlife products made from endangered species, such as elephant ivory, rhino horn, pangolin scales and marine turtle shells, as well as big cats, reptiles, birds, primates and insects for the exotic pet trade. This vast, unregulated trade in live animals and their parts is not only illegal but exacerbates the risk of another animal/human spillover event such as the ones that caused Ebola, HIV and the Covid-19 pandemic.
In addition to accepting the animal welfare amendment tabled by the noble Baroness, which I hope they do, the Government should also add offences under the Control of Trade in Endangered Species Regulations 2018 to Schedule 7 to the Bill. This would definitely help limit the role of social media platforms in enabling wildlife trafficking, helping to uphold the UK’s commitments to tackling global wildlife crime.
My Lords, I rise very briefly to support the noble Baroness, Lady Merron, and to make only one point. As someone who has the misfortune of seeing a great deal of upsetting material of all kinds, I have to admit that it sears an image on your mind. I have had the misfortune to see the interaction of animal and human cruelty in the same sequences, again and again. In making the point that there is a harm to humans in witnessing and normalising this kind of material, I offer my support to the noble Baroness.
My Lords, Amendments 180 and 180A seek to require the Secretary of State to conduct a review of existing legislation and how it relates to certain animal welfare offences and, contingent on this review, to make them priority offences under the regulatory framework.
I am grateful for this debate on the important issue of protecting against animal cruelty online, and all of us in this House share the view that it is important to do so. As the House has discussed previously, this Government are committed to strong animal welfare standards and protections. In this spirit, this Government recognise the psychological harm that animal cruelty content can cause to children online. That is why we tabled an amendment that lists content that depicts real or realistic serious violence or injury against an animal, including by fictional creatures, as priority content that is harmful to children. This was debated on the first day of Report.
In addition, all services will need proactively to tackle illegal animal cruelty content where this amounts to an existing offence such as extreme pornography. User-to-user services will be required swiftly to remove other illegal content that targets an individual victim once made aware of its presence.
The noble Baroness asked about timing. We feel it is important to understand how harm to animals as already captured in the Bill will function before committing to the specific remedy proposed in the amendments.
As discussed in Committee, the Bill’s focus is rightly on ensuring that humans, in particular children, are protected online, which is why we have not listed animal offences in Schedule 7. As many have observed, this Bill cannot fix every problem associated with the internet. While we recognise the psychological harm that can be caused to adults by seeing this type of content, listing animal offences in Schedule 7 is likely to dilute providers’ resources away from protecting humans online, which is the Bill’s main purpose.
However, I understand the importance of taking action on animal mistreatment when committed online, and I am sympathetic to the intention of these amendments. As discussed with the noble Baroness, Defra is confident that the Animal Welfare Act 2006 and its devolved equivalents can successfully bring prosecutions for the commission and action of animal torture when done online in the UK. These Acts do not cover acts of cruelty that take place outside the UK. I know from the discussion we have had in this House that there are real concerns that the Animal Welfare Act 2006 cannot tackle cross-border content, so I wish to make a further commitment today.
The Government have already committed to consider further how the criminal law can best protect individuals from harmful communications, alongside other communications offences, as part of changes made in the other place. To that end, we commit to include the harm caused by animal mistreatment communications as part of this assessment. This will then provide a basis for the Secretary of State to consider whether this offence should be added to Schedule 7 to the OSB via the powers in Clause 198. This work will commence shortly, and I am confident that this, in combination with animal cruelty content listed as priority harms to children, will safeguard users from this type of content online.
For the reasons set out, I hope the noble Baroness and the noble Lord will consider not pressing their amendments.
My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.
On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat their material.
Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts. They have committed an offence in doing so. As somebody who works inside a company, your computer has to be quarantined and taken off and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.
One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.
I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.
As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.
My Lords, I rise to make a slightly lesser point, but I also welcome these amendments. I want to ask the Minister where the consultation piece of this will lie and to check that all the people who have been in this space for many years will be consulted.
My Lords, as ever, my noble friend Lord Allan and the noble Baroness, Lady Kidron, have made helpful, practical and operational points that I hope the Minister will be able to answer. In fact, the first half of my noble friend’s speech was really a speech that the Minister himself could have given in welcoming the amendment, which we do on these Benches.
(1 year, 5 months ago)
Lords Chamber
My Lords, as I set out in Committee, the Government are bringing forward a package of amendments to address the challenges that bereaved parents and coroners have faced when seeking to access data after the death of a child.
These amendments have been developed after consultation with those who, so sadly, have first-hand experience of these challenges. I thank in particular the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens and Frankie Thomas for raising awareness of the challenges they have faced when seeking access to information following the heartbreaking cases involving their children. I am also grateful to the noble Baroness, Lady Kidron, for championing this issue in Parliament and more widely. I am very happy to say that she is supporting the government amendments in this group.
The loss of any life is heartbreaking, but especially so when it involves a child. These amendments will create a more straightforward and humane process for accessing data and will help to ensure that parents and coroners receive the answers they need in cases where a child’s death may be related to online harms. We know that coroners have faced challenges in accessing relevant data from online service providers, including information about a specific child’s online activity, where that might be relevant to an investigation or inquest. It is important that coroners can access such information.
As such, I turn first to Amendments 246, 247, 249, 250, 282, 283 and 287, which give Ofcom an express power to require information from regulated services about a deceased child’s online activity following a request from a coroner. This includes the content the child had viewed or with which he or she had engaged, how the content came to be encountered by the child, the role that algorithms and other functionalities played, and the method of interaction. It also covers any content that the child generated, uploaded or shared on the service.
Crucially, this power is backed up by Ofcom’s existing enforcement powers, so that, where a company refuses to provide information requested by Ofcom, companies may be subject to enforcement action, including senior management liability. To ensure that there are no barriers to Ofcom sharing information with coroners, first, Amendment 254 enables Ofcom to share information with a coroner without the prior consent of a business to disclose such information. This will ensure that Ofcom is free to provide information it collects under its existing online safety functions to coroners, as well as information requested specifically on behalf of a coroner, where that might be useful in determining whether social media played a part in a child’s death.
Secondly, coroners must have access to online safety expertise, given the technical and fast-moving nature of the industry. As such, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest, following a request from a coroner. This may include, for example, information about a company’s systems and processes, including how algorithms have promoted specific content to a child. To this end, the Chief Coroner’s office will consider issuing non-statutory guidance and training for coroners about social media as appropriate, subject to the prioritisation of resources. We are confident that this well-established framework provides an effective means to provide coroners with training on online safety issues.
It is also important that we address the lack of transparency from large social media services about their approach to data disclosure. Currently, there is no common approach to this issue, with some services offering memorialisation or contact-nomination processes, while others seemingly lack any formal policy. To tackle this, a number of amendments in this group will require the largest services—category 1, 2A and 2B services—to set out policies relating to the disclosure of data regarding the online activities of a deceased child in a clear, accessible and sufficiently detailed format in their terms of service. These companies will also be required to provide a written response to data requests in a timely manner and must provide a dedicated helpline, or similar means, for parents to communicate with the company, in order to streamline the process. This will address the painful radio silence experienced by many bereaved parents. The companies must also offer options so that parents can complain when they consider that a platform is not meeting its obligations. These must be easy to access, easy to use and transparent.
The package of amendments will apply not only to coroners in England and Wales but also to those in Northern Ireland, and to equivalent investigations in Scotland, where similar sad events have occurred.
The Government will also address other barriers which are beyond the scope of this Bill. For example, we will explore measures to introduce data rights for bereaved parents who wish to request information about their deceased children through the Data Protection and Digital Information Bill. We are also working, as I said in Committee, with our American counterparts to clarify and, where necessary, address unintended barriers to information sharing created by the United States Stored Communications Act. I beg to move.
My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, but I know that Bereaved Families for Online Safety also appreciates it. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.
I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and to apply the same pragmatic, thoughtful approach to those as they have to this group of amendments. It makes a huge difference.
Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and, at the coroner’s request, the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer those questions and any follow-ups. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.
Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.
I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it would be easier than a broad attempt to change the Stored Communications Act.
I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners and the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving this new route to regulated companies via Ofcom without giving them training on how to use it may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.
My Lords, I also welcome this group of amendments. I remember a debate led by the noble Baroness, Lady Kidron, some time ago in the Moses Room, where we discussed this, and I said at the time I thought it would get fixed in the Online Safety Bill. I said that in a spirit of hope, not knowing any of the detail, and it is really satisfying to see the detail here today. As she said, it is testimony to the families, many of whom got in touch with me at that time, who have persisted in working to find a solution for other families—as the noble Baroness said, it is too late for them, but it will make a real difference to other families—and it is so impressive that, at a time of extreme grief and justifiable anger, people have been able to channel that into seeking these improvements.
The key in the amendments, which will make that difference, is that there will be a legal order to which the platforms know they have to respond. The mechanism that has been selected—the information notice—is excellent because it will become well known to every one of the 25,000 or so platforms that operate in the United Kingdom. When they get an information notice from Ofcom, that is not something that they will have discretion over; they will need to comply with it. That will make a huge difference.
My Lords, I apologise for speaking once more today. I shall introduce Amendments 100 and 101 on the child user condition. They are very technical in nature and simply align the definition of “significant” in the Bill with the ICO’s age-appropriate design code to ensure regulatory alignment and to ensure the protection of the greatest number of children.
The Minister has stated on the record that the child-user condition is the same as the age-appropriate design code; however, in Clause 30(3) of the Bill, a service is “likely to be accessed” by children if
“(a) there is a significant number of children who are users of the service or of that part of it, or (b) the service, or that part of it, is of a kind likely to attract a significant number of users who are children”.
At Clause 30(4),
“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service or … part of a service”.
That is the key phrase: “in proportion”. By contrast, the ICO’s age-appropriate design code states that a service is “likely to be accessed” if
“children form a substantive and identifiable user group”.
That is quite a different threshold.
In addition, the ICO’s draft guidance on “likely to be accessed” sets out a list of factors that should be taken into consideration when making this assessment. These factors are far more extensive than Clause 30(4) and specifically state:
“‘Significant’ in this context does not mean that a large number of children must be using the service or that children form a substantial proportion of your users. It means that there are more than a de minimis or insignificant number of children using the service”.
In other words, it is possibly quite a small group, or a stand-alone group, that is not in proportion to the users. I will stop here to make the point that sometimes users are in their millions or tens of millions, so a small proportion could be many hundreds of thousands of children—just to be really clear that this matters and I am not quite dancing on the head of a pin here.
Amendment 101 mirrors the ICO’s draft guidance on age assurance on this point. If the Government’s intention is that these two things align, I really struggle to see why this would not be a simple technical amendment to which they can say yes, so that we can move on.
I finish by reminding the House that the legal opinion of my noble and learned friend Lord Neuberger, the former head of the Supreme Court, which I shared with the Government, highlights the importance of regulatory alignment, clarity and consistency, particularly in new areas of law where phrases such as “likely to be accessed” appear in more than one Act.
My noble and learned friend states:
“As the Minister rightly says, simplicity and clarity are desirable in a statute, and it serves both simplicity and clarity if the same expression is used in the two statutes, and it is made clear that the same meaning is intended … The currently drafted reference in the Bill to ‘a significant number of children’ appears to me to be something of a recipe for uncertainty, especially when compared with the drafting of section 123 of the DPA”.
With that, I beg to move.
My Lords, Amendments 100 and 101 seek further to define the meaning of “significant” in the children’s access assessment, with the intention of aligning this with the meaning of “significant” in the Information Commissioner’s draft guidance on the age-appropriate design code.
I am grateful to the noble Baroness, Lady Kidron, for the way in which she has set out the amendments and the swiftness with which we have considered them. The test in the access assessment in the Bill is already aligned with the test in the code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for all providers. The Information Commissioner’s Office has liaised with Ofcom on its new guidance on the “likely to be accessed” test for the code, with the intention of aligning the two regulatory regimes while reflecting that they seek to do different things. In turn, the Bill will require Ofcom to consult the ICO on its guidance to providers, which will further support alignment between the tests. So, while we agree about the importance of alignment, we think that it is already catered for.
With regard to Amendment 100, Clause 30(4)(a) already states that
“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service”.
There is, therefore, already provision in the Bill for this being a significant number in and of itself.
On Amendment 101, the meaning of “significant” must already be more than insignificant by its very definition. The amendment also seeks to define “significant” with reference to the number of children using a service rather than seeking to define what is a significant number.
I hope that that provides some reassurance to the noble Baroness, Lady Kidron, and that she will be content to withdraw the amendment.
I am not sure that, at this late hour, I completely understood what the Minister said. On the basis that we are seeking to align, I will withdraw my amendment, but can we check that we are indeed aligned? My speech came directly from a note from officials that showed a difference. On that basis, I am happy to withdraw.
(1 year, 5 months ago)
Lords Chamber
My Lords, I reiterate what the noble Lord, Lord Bethell, has said and thank him for our various discussions between Committee and Report, particularly on this set of amendments to do with age verification. I also welcome the Government’s responsiveness to the concerns raised in Committee. I welcome these amendments, which are a step forward.
In Committee, I was arguing that there should be a level playing field for regulating any online platform with pornographic content, whether it falls under Part 3 or Part 5 of the Bill. I welcome the Government’s significant changes to Clauses 11 and 72 to ensure that robust age verification or estimation must be used and that standards are consistent across the Bill.
I have a few minor concerns that I wish to highlight. I am thoughtful about whether Clause 25 requires enough of search services in preventing young people from accessing pornography. I recognise that the Government believe they have satisfied the need, and they may indeed have done enough in the short term, but there is a real concern that this clause is not sufficiently future-proofed. Of course, only time will tell. Perhaps the Minister could advise us further in that regard.
In Committee, I also argued that the duties in respect of pornography in Parts 3 and 5 must come into effect at the same time. I welcome the government commitment to placing a timeframe for the codes of practice and guidance on the face of the Bill through amendments including Amendment 230. I hope that the Minister will reassure us today that it is the Government’s intention that the duties in Clauses 11 and 72 will come into effect at the same time. Subsection (3) of the new clause proposed in Amendment 271 specifically states that the duties could come into effect at different times, which leaves a loophole for pornography to be regulated differently, even if only for a short time, between Part 3 and Part 5 services. This would be extremely regrettable.
I would also like to reiterate what I said last Thursday, in case the Minister missed my contribution when he intervened on me. I say once again that I commend the Minister for the announcement of the review of the regulation, legislation and enforcement of pornography offences, which I think was this time last week. I once again ask the Minister: will he set out a timetable for publishing the terms of reference and details of how this review will take place? If he cannot set out that timetable today, will he write to your Lordships setting out the timetable before the Recess, and ensure a copy is placed in the Library?
Finally, all of us across the House have benefited from the expertise of expert organisations as we have considered this Bill. I repeat my request to the Minister that he consider establishing an external reference group to support the review, consisting of those NGOs with particular and dedicated expertise. Such groups would have much to add to the process—they have much learning and advice, and there is much assistance there to the Government in that regard.
Once again, I thank the Minister for listening and responding. I look forward to seeing the protections for children set out in these amendments implemented. I shall watch implementation very closely, and I trust and hope that the regulator will take robust action once the codes of practice and guidance are published. Children above all will benefit from a safer internet.
My Lords, I welcome the government amendments in this group, which set out the important role that age assurance will play in the online safety regime. I particularly welcome Amendment 210, which states that companies must employ systems that are “highly effective” at correctly determining whether a particular user is a child, in order to prevent access to pornography. I also welcome Amendment 124, which sets out in a code of practice the principles that must be followed when implementing age assurance—principles that ensure alignment of standards and protections with the ICO’s age-appropriate design code and include, among other things, that age assurance systems should be easy to use, proportionate to the risk and easy to understand, including for those with protected characteristics, as well as aiming to be interoperable. The code is a first step from current practice, in which age verification is opaque, used to further profile children and related adults, and highly ineffective, towards a world in which children are offered age-appropriate services by design and default.
I pay tribute again to the noble Lord, Lord Bethell, and the noble Baroness, Lady Benjamin, and I associate myself with the broad set of thanks that the noble Lord, Lord Bethell, gave in his opening speech. I also thank colleagues across your Lordships’ House and the other place for supporting this cause with such clarity of purpose. On this matter, I believe that the UK is world-beating, and it will be a testament to all those involved to see the UK’s age verification and estimation laws built on a foundation of transparency and trust so that those impacted feel confident in using them—and we ensure their role in delivering the online world that children and young people deserve.
I have a number of specific questions about government Amendment 38 and Amendment 39. I would be grateful if the Minister were able to answer them from the Dispatch Box and in doing so give a clear sign of the Government’s intent. I will also speak briefly to Amendments 125 and 217 in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, as well as Amendment 184 in the names of the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. All three amendments address privacy.
Government Amendment 38, to which I have added my name, offers exemptions in new subsections (3A) and (3B) that mean that a regulated company need not use age verification or estimation to prevent access to primary priority content if it already prevents such access by means of its terms of service. First, I ask the Minister to confirm that these exemptions apply only if a service effectively upholds its terms of service on a routine basis, and that failure to do so would trigger enforcement action and/or an instruction from Ofcom to apply age assurance.
My Lords, I shall follow on directly from some of the comments of the noble Baroness, Lady Kidron, around privacy. I shall not repeat the arguments around children and pornography but touch on something else, which is the impact of these amendments on the vast majority of internet users, the 85%-plus who are 18 or older. Of course, when we introduce age assurance measures, they will affect everyone: we should not kid ourselves that it is only about children, because everyone will have to pass through these gateways.
I shall speak particularly to Amendments 184 and 217 on privacy. I am sure that most adults will support extra safety measures for children, but they also want to be able to access a wide range of online services with the least possible friction and the lowest risk to their own personal data. We can explore how this might work in practice by looking at something that we might all do in this Chamber. Looking round, I believe that we are all at least 18 years old, and we might find ourselves idly passing the time creating an account on a new user-to-user or search service that has been recommended. We should consider this group of amendments by how that might play out. In future, the services will have to check that we are in the United Kingdom—there is a range of ways in which they can do that. Having confirmed that, they will need to understand whether we are 18-plus or a child user so that they can tailor their service appropriately.
I hope we all agree that the services should not be asking us for passports or driving licences, for example, as that would be entirely contrary to the thrust of privacy regulations and would be a huge gateway to fraud and other problems. The most efficient way would be for them to ask us for some sort of digital certificate—a certificate held on our devices showing that we have proven to a trusted third party that we are 18-plus. The certificate does not need to contain any personal data; it simply confirms that we are of age. That is very similar to the way in which secure websites work: they send a digital certificate to your browser, you verify that certificate with a trusted third party—a certificate authority—and then you can open an encrypted connection. Here, we are reversing the flow: the service will ask the user for a certificate and then verify it before granting access. A user may in future have a setting on their device where they confirm whether they are happy for their 18-plus certificate to be given to any service or whether they would like to be asked every time; there will be a new set of privacy controls.
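The reversed flow described here can be sketched in miniature. Everything in this sketch is hypothetical—the issuer name, the key handling, the attestation format—and an HMAC stands in for the public-key signature a real certificate scheme would use, purely to keep the example self-contained:

```python
# Illustrative sketch only: a device presents an age attestation containing no
# personal data beyond an over-18 flag, and the service verifies it against a
# trusted issuer before granting access. An HMAC with a shared secret stands in
# for a real public-key certificate signature; all names are hypothetical.
import hashlib
import hmac
import json

# Stand-in for the trusted third parties (certificate authorities) the service accepts.
ISSUER_KEYS = {"trusted-issuer": b"issuer-secret-key"}

def issue_attestation(issuer: str, over_18: bool) -> dict:
    """Issuer signs a minimal claim: who issued it, and whether the holder is 18-plus."""
    claim = json.dumps({"iss": issuer, "over_18": over_18}, sort_keys=True)
    sig = hmac.new(ISSUER_KEYS[issuer], claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def service_accepts(attestation: dict) -> bool:
    """Service checks the issuer is trusted and the signature is genuine before granting access."""
    payload = json.loads(attestation["claim"])
    key = ISSUER_KEYS.get(payload.get("iss"))
    if key is None:  # unknown issuer: reject
        return False
    expected = hmac.new(key, attestation["claim"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["sig"]) and payload["over_18"]
```

The point of the design is that the service learns only one bit of information—over 18 or not—and a tampered or forged claim fails verification.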
Building the infrastructure for this is non-trivial. Many things could go wrong, but at least the kind of model I am describing has some hope of achieving widespread adoption. It is very good for adult users, as they can continue to have a frictionless experience as long as they are happy for their device to send a certificate to new services. It is good for the market of internet services if new services can bring users on easily. It is good for privacy, avoiding lots of services each collecting personal data, as most people access a multiplicity of services. Perhaps most importantly in terms of the Bill’s objectives, it is good for children if services can separate out the vast majority of their users who are 18-plus and then focus their real efforts on tailoring the service for the minority of users who will be children, for whom the Bill will introduce a whole set of new obligations.
We should not underestimate the scale of the challenge in practice; it will work only if major internet companies are willing to play the game and get into the market of offering 18-plus certificates. Companies such as Google, Meta, Amazon, Apple and Microsoft—the ones we normally love to hate—will potentially provide the solution, as well as not-for-profits. There will be foundations for those who object to the big internet companies, but it is those big internet companies which will have the reach; they each have millions of users in the United Kingdom. This is not to fly the flag for those companies; it is simply a question of efficiency. I suspect that everyone in the Chamber uses a combination of services from those big providers. We already share with them the personal data necessary for age assurance, and there would be no additional sharing of data. If they were willing to provide a certificate, they could do so at the kind of scale necessary for the 50 million or so adult internet users in the United Kingdom to be able to get one easily and then pass it to services when they choose to access them.
There may be some discomfort with big tech playing this role, but I cannot see the kind of aggressive targets that we are setting in the amendments working unless we take advantage of those existing platforms and use them to make this work. Amendment 230 tells us that we have about 18 months, which is very soon in terms of trying to build something. We should be clear that if we are to deliver this package it will depend on persuading some of those big names in tech to create age certification schemes for UK users.
For this to have widespread adoption and a competitive market, we need it to be free of direct financial cost both to individual users and to the services choosing to age-verify, as we have asked them to do. We need to think very carefully about that, as it raises a whole series of competition questions that I am sure Ofcom and the Competition and Markets Authority will have to address, not least because we will be asking companies to provide, free of charge, age certification that will be used by their existing and future competitors to meet their compliance requirements.
There may be some listening who think that we can rely on small age-assurance start-ups. Some of them have a really important role to play, and we should be proud of our home-grown industry, but we should be realistic that they will reach scale only if they work with and through the large service providers. Many are already seeking those kinds of relationships.
As a test case, we might think of an application such as Signal, a messaging app that prides itself on being privacy-first. It does not want to collect any additional information from its users, which is perfectly reasonable, given where it is coming from. It will be really interesting to see how comfortable such a service will be with working with certification schemes, under which it can prove that users are over 18 by taking advantage of the data held by other services which collect significant amounts of data and have a very good idea of how old we are.
I have not focused on under-18s but, once this system is in place, application providers will be thinking very carefully about the pros and cons of allowing under-18s on at all. I know that the noble Baroness, Lady Kidron, is also concerned about this. There will be services whose providers, if they find that the vast majority of their users are 18-plus, will think very carefully about the extent to which they want to put time and effort into tailoring their services for users under 18. We do not intend that outcome from the Bill, but we need realistically to consider it.
Just to be clear, I say that the purpose of my question to the Minister was to get at the fact that, for low-risk situations, there can be age assurance that is a lot less effective or intrusive, for that very reason.
I agree; that is very helpful. I think Amendments 74, 93 and 99 also talk about the exclusion, as the noble Baroness raised, of services from the child safety duties if they can show that they are only 18-plus. It will be quite material and critical to know at what level they can demonstrate that.
I have avoided talking about pornography services directly, but there are interesting questions around what will happen if this model develops, as it likely will. If big tech is now starting to provide age certification for the kinds of mainstream services we may all want to access, they may be much less comfortable providing that same certification to pornography providers, for reputational reasons. A mainstream provider would not want to enter that market. Ofcom will need to take a view on this. We have talked about interoperability in the framework we have created, but it is a big question for Ofcom whether it wants to steer all age certification providers also to provide 18-plus certification for pornography providers or, effectively, to allow two markets to develop—one for mainstream certification and one for certification for pornography.
I have taken a few minutes because this is a very high-risk area for the Bill. There are material risks in willing into existence a model that depends on technical infrastructure that has not yet been built. The noble Lord, Lord Bethell, referred to prior experience; one of the reasons why we have not delivered age assurance before is that the infrastructure was not there. We now want it built, so must recognise that it is quite a high-risk endeavour. That does not mean it is not worth attempting, but we must recognise the risks and work on them.
If the implementation is poor, it will frustrate adult users, which may bring the Bill into disrepute. We need to recognise that as a genuine risk. There are people out there already saying that the Bill means that every internet service in the world will ask you for your passport. If that is not the case, we need to stress that we do not expect that to happen. There are also potentially significant impacts on the market for online services available to both adults and children in the UK, depending on the design of this system.
The purpose of thinking about some of these risks today is not to create a doom-laden scenario and say that it will not work. It is entirely the opposite. Very good reasons have been articulated, and a huge amount of work has gone ahead, for moving into a world in which children are protected from harmful content and services can tailor access to the age of the child. If we are to get there, we have to be able to take the 18-plus users out of that, put them into a separate box and do so in a really easy, straightforward manner. If not, the 18-plus will end up dragging down what we want to do for the under-18s.
I hope that explanation helps in the context of these amendments. We will need to test them against it as implementation happens over the next few months.
My Lords, I rise to speak to all the amendments in this group. It is a cause of great regret that, despite many private meetings with officials, government lawyers and Ministers, we have not yet come to an agreement that would explicitly include in the Bill harm that does not derive from content. I will be listening very carefully to the Minister, if he should change his mind during the debate.
The amendments in this group fall into three categories. First, there is a series of amendments in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford: Amendments 35, 36, 37A and 85. I had hoped the Government would accept them as consequential but, in meetings last week, they would not accept that harm to children can arise from the functionality and design of services and not just from the content. Each of these amendments simply makes it clear that harm can arise in the absence of content: nothing more, nothing less. If the Minister agrees that harm may derive from the design of products and services, can he please explain, when he responds, why these amendments are not acceptable? Simply put, it is imperative that the features, functionalities or behaviours that are harmful to children, including those enabled or created by the design or operation of a service, are in scope of the Bill. This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.
The Government have defined primary priority harmful content, priority content and non-designated harmful content, the latter being a category that is yet to be defined, but not the harm that emerges from how a regulated company designs its service. Examples include the many hundreds of small reward loops that make up a doomscroll or make a game addictive; commercial decisions, such as the one Pokémon famously made for a time, to end every game in a McDonald’s car park; or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them while pushing children into siloed groups. They deliberately push 13 year-old boys towards Andrew Tate—not for any content reason, but simply on the basis that 13 year-old boys are like each other and one of them has already been on that site.
The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers. To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.
The focus on content is old-fashioned and looks backwards. The Bill is drafted as if it has particular situations and companies in mind but does not think about how fast the business moves. When we started the Bill, none of us thought about the impact of TikTok; last week, we saw a new service, Threads, go from zero to 70 million users in a single day. It is an act of stunning hubris to be so certain of the form of harm. To be unprepared to admit that some harm is simply design means that, despite repeated denials, this is just a content Bill. The promise of systems and processes being at the heart of the Bill has been broken.
The second set of amendments in this group are in the name of my noble friend Lord Russell. Amendments 46 and 90 further reveal the attitude of the Government, in that they are protecting the companies rather than putting them four-square in the middle of their regime. The Government specifically exempt the manner of dissemination from the safety duties. My noble friend Lord Russell’s amendment would leave that out and ensure that the manner of dissemination, which is fundamental to the harm that children experience, is included. Similarly, Amendment 240 would take out “presented by content” so that harm that is the result of the design decisions is included in the Bill.
The third set are government Amendments 281C and 281D, and Amendment 281F, in my name. For the avoidance of doubt, I am totally supportive of government Amendments 281C to 281E, which acknowledge cumulative harms; for example, those that Molly Russell experienced as she was sent more and more undermining and harmful content. In so far as they are a response to my entreaties, and those of other noble Lords, that we ensure that cumulative harmful content is the focus of our concerns, I am grateful to the Government for tabling them. However, I note that the Government have conceded the role of cumulative harm only for content. Amendments 281D and 281E once again talk about content as the only harm to children.
The noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names to Amendment 281F, and I believe I am right in saying that if there were not a limit to four names, there were a great many Peers who would have added their names also. For the benefit of the House, I will quote directly from the amendment:
“When in relation to children, references to harm include the potential impact of the design and operation of a regulated service separately and additionally from harms arising from content, including the following considerations … the potential cumulative impact of exposure to harm or a combination of harms … the potential for harm to result from features, functionalities or behaviours enabled or created by the design and operation of services … the potential for some features and functionalities within a service to be higher risk than other aspects of the service … that a service may, when used in conjunction with other services, facilitate harm to a child on a different service … the potential for design strategies that exploit a child’s developmental vulnerabilities to create harm, including validation metrics and compulsive reward loops … the potential for real time services, features and functionalities such as geolocation, livestream broadcasts or events, augmented and virtual environments to put children at immediate risk … the potential for content neutral systems that curate or generate environments, content feeds or contacts to create harm to children … that new and emerging harms may arise from artificial intelligence, machine generated and immersive environments”.
Before I continue, I ask noble Lords to consider which of those things they would not like for their children, grandchildren or, indeed, other people’s children. I have accepted that the Government will not add the schedule of harms as I first laid it: the four Cs of content, conduct, contact and commercial harms. I have also accepted that the same schedule, written in the less comfortable language of primary priority, priority and non-designated harms, has also been rejected. However, the list that I just set out, and the amendment to the duties that reflect those risks, would finally put the design of the system at the heart of the Bill. I am afraid that, in spite of all our conversations, I cannot accept the Government’s argument that all harm comes from content.
Even if we are wrong today—which we are most definitely not—in a world of AI, immersive tech and augmented reality, is it not dangerous and, indeed, foolish, to exclude harm that might come from a source other than content? I imagine that the Minister will make the argument that the features are covered in the risk assessment duties and that, unlike content, features may be good or bad so they cannot be characterised as harmful. To that I say: if the risk assessment is the only process that matters, why do the Government feel it necessary to define the child safety duties and the interpretation of harm? The truth is, they have meaning. In setting out the duty of a company to a child, why would the Government not put the company’s design decisions right at the centre of that duty?
As for the second part of the argument, a geolocation feature may of course be great for a map service but less great if it shows the real-time location of a child to a predator, and livestreaming from a school concert is very different from livestreaming from your bedroom. Just as the noble Lord, Lord Allan, explained on the first day on Report, there are things that are red lines and things that are amber; in other words, they have to be age-appropriate. This amendment does not seek—nor would it mean—that individual features or functionalities would be prevented, banned or stopped. It would mean that a company had a duty to make sure that their features and functionalities were age-appropriate and did not harm children—full stop. There would be no reducing this to content.
Finally, I want to repeat what I have said before. Sitting in the court at Molly Russell’s inquest, I watched the Meta representative contest content that included blood cascading down the legs of a young woman, messages that said, “You are worthless”, and snippets of film of people jumping off buildings. She said that none of those things met the bar of harmful content according to Meta’s terms and conditions.
Like others, I believe that the Online Safety Bill could usher in a new duty of care towards children, but it is a category error not to see harm in the round. Views on content can always differ but the outcome on a child is definitive. It is harm, not harmful content, that the Bill should measure. If the Minister does not have the power to accede, I will, with great regret, be testing the opinion of the House. I beg to move.
My Lords, as so often in the course of the Bill, I associate myself wholeheartedly with the comments that the noble Baroness, Lady Kidron, just made. I, too, thank my noble friend the Minister and the Secretary of State for listening to our debates in Committee on the need to be explicit about the impact of cumulative harmful content. So I support Amendments 281C, 281D and 281E, and I thank them for tabling them.
My Lords, I thank everybody who has spoken for these amendments. I also thank the Minister for our many discussions and apologise to the House for the number of texts that I sent while we were trying to get stand-alone harms into the Bill—unfortunately, we could not; we were told that it was a red line.
It is with some regret that I ask the House to walk through the Lobbies. Before I do so, I acknowledge that the Government have met me on very many issues, for which I am deeply grateful. There are no concessions on this Bill, only making it better. From my perspective, there is no desire to push anybody anywhere, only to protect children and give citizens the correct relationship with the digital world.
I ask those who were not here when I said this before: please think about your children and grandchildren and other people’s children and grandchildren before you vote against these amendments. They are not only heartfelt, as the Minister said, but have been drafted with reference to many experts and people in the business, who, in their best practice, meet some of these things already. We do not want the Bill, by concentrating on content, to be a drag on what we are pushing forward. We want it to be aspirational and to push the industry into another culture and another place. At a personal level, I am very sorry to the Minister, for whom I have a great deal of respect, but I would like to test the opinion of the House.
My Lords, as somebody who is only five feet and two inches, I have felt that size does not matter for pretty much all my life and have long wanted to say that in a speech. This group of amendments is really about how size does not matter; risk does. I will briefly build on the speech just given by the noble Lord, Lord Allan, very eloquently as usual, to describe why risk matters more than size.
First, there are laws for which size does matter—small companies do not need to comply with certain systems and processes—but not those concerned with safety. I have in my mind’s eye the small village fête, where we expect a risk assessment if we are to let children ride on rides. That was not the case 100 years ago, but is today because we recognise those dangers. One of the reasons why we stumbled into thinking that size should matter in this Bill is that we are not being honest about the scale of the risk for our children. If the risk is large enough, we should not be worrying about size; we should be worrying about that risk. That is the first reason why we have to focus on risk and not size.
The second reason follows from what I have just said: the principles of the physical world should apply to the online world. That is one of the core tenets of this Bill. It means that, if you recognise the real and present risks of the digital world, you have to say that it does not matter how few people are affected. If it is a small business, it still has an obligation not to put people in harm’s way.
Thirdly, small becomes big very quickly—unfortunately, that has not been true for me, but it is true in the digital world as Threads has just shown us. Fourthly, we also know that in the digital world re-engineering something once it has got very big is really difficult. There is also a practical reason why you want engineers to think about the risks before they launch services rather than after the event.
We keep being told, rightly, that this is a Bill about systems and processes. We want not just the outcomes that the noble Lord, Lord Allan, referred to, in terms of services in the UK genuinely being safer; we are trying to effect a culture change. I would argue that one of the most important culture changes is that any bright young tech entrepreneur has to start by thinking about the risks, and therefore the safety procedures they need to put in place, as they build their tech business from the ground up, not once they have reached some artificial size threshold.
My Lords, I have to admit that it was incompetence rather than lack of will that meant I did not add my name to Amendment 39 in the name of the noble Lord, Lord Bethell, and I would very much like the Government to accept his argument.
In the meantime, I wonder whether the Minister would be prepared to make it utterly clear that proportionality does not mean a little bit of porn to a large group of children, or a lot of porn to a small group of children; rather, it means that high-risk situations require effective measures, and that measures in low-risk situations should be proportionate to that lower risk. On that theme, I say to the noble Lord, Lord Allan, whose points I broadly agree with, that while we would all wish to see companies brought into the fold rather than left out of it, it rather depends on their risk.
This brings me neatly to Amendments 43 and 87 from the noble Lord, Lord Russell, to which I managed to add my name. They make a very similar point to Amendment 39 but across safety duties. Amendment 242 in my name, to which the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names, makes the same point—yet again—in relation to Ofcom’s powers.
All these things point in the same direction as Amendment 245 in the name of the noble Baroness, Lady Morgan, which I keep trumpeting from these Benches and which offers an elegant solution. I urge the Minister to consider Amendment 245 before day four of Report. If the Government were to accept it, it would focus company resources and Ofcom resources and, as we discussed on the first day of Report, would permit companies which do not fit the risk profile of the regime, and which are unable to comply with something that does not fit their model yet leaves them vulnerable to enforcement, also to be treated in an appropriate way.
Collectively, the ambition is to make sure that we are treating things in proportion to the risk and that proportionate does not start meaning something else.
My Lords, I agree with the noble Baroness, Lady Kidron, that all these amendments are very much heading in the same direction, and from these Benches I am extremely sympathetic to all of them. It may well be that this is very strongly linked to the categorisation debate, as the noble Baroness, Lady Kidron, said.
The amendment from the noble Lord, Lord Bethell, matters even more when we are talking about pornography, given that the child safety duties are based on risk. I cannot for the life of me see why we should try to contradict that by adding in capacity and size and so on.
My noble friend made a characteristically thoughtful speech about the need for Ofcom to regulate in the right way and make decisions about risk and the capacity challenges of new entrants and so on. I was very taken by what the noble Baroness, Lady Harding, had to say. This is akin to health and safety and, quite frankly, it is a cultural issue for developers. What after all is safety by design if it is not advance risk assessment of the kinds of algorithm that you are developing for your platform? It is a really important factor.
My Lords, I rise briefly to note that, in the exchange between the noble Lords, Lord Allan and Lord Moylan, there was this idea about where you can complain. The independent complaints mechanism would be as advantageous to people who are concerned about freedom of speech as it would be for any other reason. I join and add my voice to other noble Lords who expressed their support for the noble Baroness, Lady Fox, on Amendment 162 about the Public Order Act.
My Lords, we are dangerously on the same page this evening. I absolutely agree with the noble Baroness, Lady Kidron, about demonstrating the need for an independent complaints mechanism. The noble Baroness, Lady Stowell, captured quite a lot of the need to keep the freedom of expression aspect under close review, as we go through the Bill. The noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, have raised an important and useful debate, and there are some crucial issues here. My noble friend captured it when he talked about the justifiable limitations and the context in which limitations are made. Some of the points made about the Public Order Act offences are extremely valuable.
I turn to one thing that surprised me. It was interesting that the noble Lord, Lord Moylan, quoted the Equality and Human Rights Commission, which said it had reservations about the protection of freedom of expression in the Bill. As we go through the Bill, it is easy to keep our eyes on the ground and not to look too closely at the overall impact. In its briefing, which is pretty comprehensive, paragraph 2.14 says:
“In a few cases, it may be clear that the content breaches the law. However, in most cases decisions about illegality will be complex and far from clear. Guidance from Ofcom could never sufficiently capture the full range or complexity of these offences to support service providers comprehensively in such judgements, which are quasi-judicial”.
I am rather more optimistic than that, but we need further assurance on how that will operate. Its life would probably be easier if we did not have the Public Order Act offences in Schedule 7.
I am interested to hear what the Minister says. I am sure that there are pressures on him, from his own Benches, to look again at these issues to see whether more can be done. The EHRC says:
“Our recommendation is to create a duty to protect freedom of expression to provide an effective counterbalance to the duties”.
The noble Lord, Lord Moylan, cited this. There is a lot of reference in the Bill but not to the Ofcom duties. So this could be a late contender to settle the horses, so to speak.
This is a difficult Bill; we all know that so much nuance is involved. We really hope that there is not too much difficulty in interpretation when it is put into practice through the codes. That kind of clarity is what we are trying to achieve, and, if the Minister can help to deliver that, he will deserve a monument.
(1 year, 5 months ago)
Lords Chamber
My Lords, I will speak to Amendment 1, to which I was happy to add my name alongside that of the Minister. I too thank the noble Lord, Lord Stevenson, for tabling the original amendment, and my noble and learned friend Lord Neuberger for providing his very helpful opinion on the matter.
I am especially pleased to see that ensuring that services are safe by design and offer a higher standard of protection for children is foundational to the Bill. I want to say a little word about the specificity, as I support the noble Baroness, Lady Merron, in trying to get to the core issue here. Those of your Lordships who travel to Westminster by Tube may have seen TikTok posters saying that
“we’re committed to the safety of teens on TikTok. That’s why we provide an age appropriate experience for teens under 16. Accounts are set to private by default, and their videos don’t appear in public feeds or search results. Direct messaging is also disabled”.
It might appear to the casual reader that TikTok has suddenly decided unilaterally to be more responsible, but each of those things is a direct response to the age-appropriate design code passed in this House in 2018. So regulation does work and, on this first day on Report, I want to say that I am very grateful to the Government for the amendments that they have tabled, and “Please do continue to listen to these very detailed matters”.
With that, I welcome the amendment. Can the Minister confirm that having safety by design in this clause means that all subsequent provisions must be interpreted through that lens, and that it will inform all the decisions on Report, the decisions of Ofcom, and the Secretary of State’s approach to setting and enforcing standards?
My Lords, I too thank my noble friend the Minister for tabling Amendment 1, to which I add my support.
Very briefly, I want to highlight one word in it, to add to what the noble Baroness, Lady Kidron, has just said. The word is “activity”. It is extremely important that in Clause 1 we are setting out that the purpose is to
“require providers of services regulated by this Act to identify, mitigate and manage”
not just illegal or harmful content but “activity”.
I very much hope that, as we go through the few days on Report, we will come back to this and make sure that in the detailed amendments that have been tabled we genuinely live up to the objective set out in this new clause.
My Lords, I will speak briefly to Amendments 5C and 7A in this group. I welcome the Government’s moves to criminalise cyberflashing. It is something that many have campaigned for in both Houses and outside for many years. I will not repeat the issues so nobly introduced by the noble Baroness, Lady Burt, and I say yet again that I suspect that the noble Baroness, Lady Featherstone, is watching, frustrated that she is still not able to take part in these proceedings.
It is worth making the point that, if actions are deemed to be serious enough to require criminalisation and for people potentially to be prosecuted for them, I very much hope that my noble friend the Minister will be able to say in his remarks that this whole area of the law will be kept under review. There is no doubt that women’s and girls’ faith in the criminal justice system, both law enforcement and the Crown Prosecution Service, is already very low. If we trumpet the fact that this offence has been introduced, and then there are no prosecutions because the hurdles have not been cleared, that is even worse than not introducing the offence in the first place. So I hope very much that this will be kept under review, and no doubt there will be opportunities to return to it in the future.
I do not want to get into the broader debate that we have just heard, because we could be here for a very long time, but I would just say to the noble Baronesses, Lady Kennedy and Lady Fox, that we will debate this in future days on Report and there will be specific protection and mention of women and girls on the face of the Bill—assuming, of course, that Amendment 152 is approved by this House. The guidance might not use the words that have been talked about, but the point is that that is the place to have the debate—led by the regulator with appropriate public consultation—about the gendered nature of abuse that the noble Baroness, Lady Kennedy, has so eloquently set out. I hope that will also be a big step forward in these matters.
I look forward to hearing from the Minister about how this area of law will be kept under review.
My Lords, I understand that, as this is a new stage of the Bill, I have to declare my interests: I am the chair of 5Rights Foundation, a charity that works around technology and children; I am a fellow at the computer science department at Oxford University; I run the Digital Futures Commission, in conjunction with the 5Rights Foundation and the London School of Economics; I am a commissioner on the Broadband Commission; I am an adviser for the AI ethics institute; and I am involved in Born in Bradford and the Lancet commission, and I work with a broad number of civil society organisations.
My comments will be rather shorter. I want to make a detailed comment about Amendment 5B, which I strongly support and which is in the name of the noble Lord, Lord Allan. It refers to,
“a genuine medical, scientific or educational purpose, … the purposes of peer support”
I would urge him to put “genuine peer support”. That is very important because there is a lot of dog whistling that goes on in this area. So if the noble Lord—
My working assumption would be that that would be contestable. If somebody claimed the peer support defence and it was not genuine, that would lead to them becoming liable. So I entirely agree with the noble Baroness. It is a very helpful suggestion.
I also want to support the noble Baroness, Lady Kennedy. The level of abuse to women online and the gendered nature of it has been minimised; the perpetrators have clearly felt immune to the consequences of law enforcement. What worries me a little in this discussion is the idea or conflation that anything said to a woman is an act of violence. I believe that the noble Baroness was being very specific about the sorts of language that could be caught under her suggestions. I understand from what she said that she has been having conversations with the Minister. I very much hope that something is done in this area, and that it is explored more fully, as the noble Baroness, Lady Morgan, said, in the guidance. However, I just want to make the point that online abuse is also gamified: people make arrangements to abuse people in groups in particular ways that are not direct. If they threaten violence, that is quite different to a pile-in saying that you are a marvellous human being.
My Lords, I too must declare my interests on the register—I think that is the quickest way of doing it to save time. We still have time, and I very much hope that the Minister will listen to this debate and consider it. Although we are considering clauses that, by and large, come at the end of the Bill, there is still time procedurally—if the Minister so decides—to come forward with an amendment later on Report or at Third Reading.
We have heard some very convincing arguments today. My noble friend explained that the Minister did not like the DPP solution. I have looked back again at the Law Commission report, and I cannot for the life of me see the distinction between what was proposed for the offence in its report and what is proposed by the Government. There is barely a cigarette paper, if we are still allowed to use that analogy, between them, but the DPP is recommended, perhaps not on a personal basis—I do not know quite what distinction the Law Commission makes there—yet the Minister clearly did not like that. My noble friend has come back with some specifics, and I very much hope that the Minister will put on the record that, in those circumstances, there would not be a prosecution. As we heard in Committee, 130 different organisations had strong concerns, and I hope that the Minister will respond to those concerns.
As regards my other noble friend’s amendment, she has again creatively come back with a proposal for including reckless behaviour. The big problem here is that many people believe that, unless you include “reckless” or “consent”, the “for a laugh” defence operates. As the Minister knows, expert advice has been taken on this subject. I hope the Minister continues his discussions. I very much support my noble friend in this respect. I hope he will respond to her in respect of timing and monitoring—the noble Baroness, Lady Morgan, mentioned the need for the issue to be kept under review—even if at the end of the day he does not respond positively with an amendment.
Everybody believes that we need a change of culture—even the noble Baroness, Lady Fox, clearly recognises that—but the big difference is whether or not we believe that these particular amendments should be made. We very much welcome what the Law Commission proposed and what the Government have put into effect, but the question at the end of the day is whether we truly are making illegal online what is illegal offline. That has always been the Government’s test. We must be mindful of that in trying to equate online behaviour with offline behaviour. I do not believe that we are there yet, however much moral leadership we are exhorted to display. I very much take the point of the noble Baroness, Lady Morgan, about the violence against women and girls amendment that the Government are coming forward with. I hope that will have a cultural change impact as well.
As regards the amendments of the noble Baroness, Lady Kennedy, I very much take the point she made, both at Committee and on Report. She was very specific, as the noble Baroness, Lady Kidron, said, and was very clear about the impact, which as men we severely underestimate if we do not listen to what she said. I was slightly surprised that the noble Baroness, Lady Fox, really underestimates the impact of that kind of abuse—particularly that kind of indirect abuse.
I was interested in what the Minister had to say in Committee:
“In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007”.—[Official Report, 22/6/23; col. 424.]
Is that still the Government’s position? Has that been explained to the noble Baroness, Lady Kennedy, who I would have thought was pretty expert in the 2007 Act? If she does not agree with the Minister, that is a matter of some concern.
Finally, I agree that we need to consider the points raised at the outset by the noble and learned Lord, Lord Garnier, and I very much hope that the Government will keep that under review.
I also welcome these amendments and want to pay tribute to Maria Miller in the other place for her work on this issue. It has been extraordinary. I too was going to raise the issue of the definition of “photograph”, so perhaps the Minister could confirm, or, even better, put in the Bill, that it does extend to those other contexts.
My main point is about children. We do not want to criminalise children, but this behaviour is pervasive among under-18s. I want to make the distinction between those under-18s who intentionally harm another under-18 and have to be responsible for what they have done within the meaning of the law as the Minister set it out, and those who are under the incredible pressure—I do not mean coercion, because that is another out-clause—of the oversharing that is inherent in the design of many of these services. That is an issue I am sure we are going to come back to later today. I would love to hear the Minister say something about the Government’s intention from the Dispatch Box: that it is preventive first and that there is a balance between education and punishment for under-18s who find themselves unavoidably in this situation.
Very briefly, before I speak to these amendments, I want to welcome them. Having spoken to and introduced some of the threats of sharing intimate images under the Domestic Abuse Act 2021, I think it is really welcome that everything has been brought together in one place. Again, I pay tribute to the work of Dame Maria Miller and many others outside who have raised these as issues. I also want to pay tribute to the Ministry of Justice Minister Edward Argar, who has also worked with my noble friend the Minister on this.
I have one specific question. The Minister mentioned this in his remarks, but could he be absolutely clear, because these amendments do not specifically mention the lifetime anonymity of claimants or the special measures in relation to giving evidence that apply to witnesses. That came up in the last group of amendments as well. Because they are not actually in this drafting, it would be helpful if he could put on record the relationship with the provisions in the Sexual Offences Act 2003. I know that would be appreciated by campaigners.
I believe I misspoke when I asked my question. I referred to under-18s. Of course, if they are under 18 then it is child sexual abuse. I meant someone under the age of 18 with an adult image. I put that there for the record.
If the noble Baroness misspoke, I understood what she intended. I knew what she was getting at.
With that, I hope noble Lords will be content not to press their amendments and that they will support the government amendments.
(1 year, 5 months ago)
Lords Chamber
My Lords, I will speak to my Amendments 281 to 281B. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to them. I will deal first with Amendments 281 and 281B, then move to 281A.
On Amendments 281 and 281B, the Minister will recall that in Committee we had a discussion around how functionality is defined in the Bill and that a great deal of the child risk assessments and safety duties must have regard to functionality, as defined in Clause 208. However, as it is currently written, this clause appears to separate out functionalities of user-to-user services and search services. These two amendments are designed to adjust that slightly, to future-proof the Bill.
Why is this necessary? First, it reflects that it is likely that, in the future, many of the functionalities that we currently see on user-to-user services will become present on search services, and possibly vice versa. Therefore, we need to take account of how the world is likely to move. Secondly, this is already happening, and it poses a risk to children. Research done by the 5Rights Foundation has found that “predictive search”, counted in the Bill as a search service functionality, is present on social media websites, leading one child using a search bar to be presented, in nanoseconds, with prompts associated with eating disorders. In Committee, the Minister noted that the functionalities listed in this clause are non-exhaustive. At the very least, it would be helpful to clarify this in the Bill language.
Amendment 281A would add specific functionalities which we know are addictive or harmful to children and put them in the Bill. We have a great deal of research and evidence demonstrating how persuasive certain design strategies are with children. These features are designed solely to keep users on the platform, at any cost, for as long as possible. The more that children are on the platform, the more harm they are likely to suffer. Given that the purpose of this Bill is for services to be safe by design, as set out usefully in Amendment 1, please can we make sure that where we know—and we do know—that risk exists, we are doing our utmost to tackle it?
The features listed in this amendment are known as “dark patterns”, and they are known as “dark patterns” for a very good reason. They are persuasive and pervasive design techniques, deliberately baked into the design of digital services and products to capture and hold, in this case, children’s attention, and to create habitual, even compulsive behaviours. The damage this does to children is proven and palpable. For example, one of the features mentioned is infinite scroll, which is now ubiquitous on most major social media platforms. The inventor of infinite scroll, Aza Raskin, who probably thought it was a brilliant idea at the time, has said publicly that he now deeply regrets ever introducing it because of the effect it is having on children.
One of the young people who spoke to the researchers at 5Rights said of the struggle they have daily with the infinite scroll feature:
“Scrolling forever gives me a sick feeling in my stomach. I’m so aware of how little control I have and the feeling of needing to be online is overwhelming and consuming”.
Features designed to keep users—adults, maybe fine, but children not fine—online at any cost are taking a real toll. Managing public and frequent interactions online, which the features encourage, creates the most enormous pressures for young people, and with that comes anxiety, low self-esteem and mental health challenges. This is only increasing, and unless we are very specific about these, they are going to continue.
We have the evidence. We know what poses harm and risk to children. Please can we make sure that this is reflected accurately in the Bill?
My Lords, I rise briefly to support many of the amendments in this group. I will start with Amendments 281, 281A and 281B in the name of my noble friend Lord Russell, to which I have added my name. The noble Lord set out the case very well. I will not reiterate what he said, but it is simply the case that the features and functionalities of regulated companies should not be separated by search and user-to-user services but should apply across any regulated company that has a given feature. There is no need to worry about a company that does not have one of the features on the list, but it is much more dangerous to have a feature absent from the list than to have a single list and hold companies responsible for the features they do have.
Only this morning, Meta released Threads as its challenger to Twitter. In the last month, Snapchat added generative AI to its offering. Instagram now does video, and TikTok does shopping. All these companies are moving into a place where they would like to be the one that does everything. That is their commercial endgame, and that is where the Bill should set its sights.
Separating out functionality and, as the noble Lord, Lord Russell, said, failing to add what we already know, puts the Bill in danger of looking very old before the ink is dry. I believe it unnecessarily curtails Ofcom in being able to approach the companies for what they are doing, rather than for what the Bill thought they might be doing at this point. So, if the Minister is not in a position to agree to the amendment, I urge him at least to take it away and have a look at it, because it is a technical rather than an ideological matter. It would be wonderful to fix it.
To follow on from that, we are talking about the obligation to bring exemptions to Parliament. Well, we are in Parliament and we are bringing exemptions. The noble Lord is recommending that we bring very specific exemptions while those that the noble Lord, Lord Moylan, and I have been recommending may be rather broad—but I thought we were bringing exemptions to Parliament. I am not being facetious. The point I am making is, “Why can’t we do it now?” We are here now, doing this. We are saying, as Parliament, “Look at these exemptions”. Can the Minister not look at them now instead of saying that we will look at them some other time?
I may as well intervene now as well, so that the Minister can get a good run at this. I too am concerned at the answer that has been given. I can see the headline now, “Online Safety Bill Age-Gates Wikipedia”. I cannot see how it does not, by virtue of some of the material that can be found on Wikipedia. We are trying to say that there are some services that are inherently in a child’s best interests—or that are in their best interests according to their evolving capacity, if we had been allowed to put children’s rights into the Bill. I am concerned that that is the outcome of the answer to the noble Lord, Lord Allan.
I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.
My Lords, I shall speak briefly to Amendment 174 in my name and then more broadly to this group—I note that the Minister got his defence in early.
On the question of misinformation and disinformation, I recognise what he said and I suppose that, in my delight at hearing the words “misinformation and disinformation”, I misunderstood to some degree what he was offering at the Dispatch Box, but I make the point that this poses an enormous risk to children. As an example, children are the fastest-growing group of far-right believers/activists online, and there are many areas in which we are going to see an exponential growth in misinformation and disinformation as large language models become the norm. So I ask him, in a tentative manner, to look at that.
On the other issue, I have to push back at the Minister’s explanation. Content classification around sexual content is a well-established norm. The BBFC does it and has done it for a very long time. There is an absolute understanding that what is suitable for a U, a PG, a 12 or a 12A are different things, and that as children’s capacities evolve, as they get older, there are things that are more suitable for older children, including, indeed, stronger portrayals of sexual behaviour as the age category rises. So I cannot accept that this opens a new can of worms: this is something that we have been doing for many, many years.
I think it is a bit wrongheaded to imagine that if we “solve” the porn problem, we have solved the problem—because there is still sexualisation and the commercialisation of sex. Now, if you say something about feet to a child, they start to giggle uproariously because, in internet language, you get paid for taking pictures of feet and giving them to strange people. There are such detailed and different areas that companies should be looking at. This amendment in my name and the names of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, should be taken very seriously. It is not new ground, so I would ask the Minister to reconsider it.
More broadly, the Minister will have noticed that I liberally added my name to the amendments he has brought forward to meet some of the issues we raised in Committee, and I have not added my name to the schedule of harms. I want to be nuanced about this and say I am grateful to the Government for putting them in the Bill, I am grateful that the content harms have been discussed in this Chamber and not left for secondary legislation, and I am grateful for all the conversations around this. However, harm cannot be defined only as content, and the last grouping got to the core of the issue in the House. Even when the Minister was setting out this amendment, he acknowledged that the increase in harm to users may be systemic and by design. In his explanation, he used the word “harm”; in the Bill, it always manifests as “harmful content”.
While the systemic risk of increasing the presence of harmful content is consistently within the Bill, which is excellent, the concept that the design of a service may in and of itself be harmful is absent. In failing to address that, the Government, and therefore the Bill, have missed the bull’s-eye. The bull’s-eye is what is particular about this method of communication that creates harm—and what is particular are the features, functionalities and design. I draw noble Lords back to the debate about Wikipedia. It is not that we all love Wikipedia adoringly; it is that it does not pursue a system of design for commercial purposes that entraps people within its grasp. Those are the harms we are trying to get at. I am grateful for the conversations I have had, and I look forward to some more. I have laid down some other amendments for Monday and beyond that would, I hope, deal with this—but until that time, I am afraid this is an incomplete picture.
My Lords, I have a comment about Amendment 174 in the name of the noble Baroness, Lady Kidron. I have no objection to the insertion of subsection (9B), but I am concerned about (9A), which deals with misinformation and disinformation. It is far too broad and political, and if we start at this late stage to try to run off into these essentially political categories, we are going to capsize the Bill altogether. So I took some heart from the fact that my noble friend on the Front Bench appeared disinclined to accept at least that limb of the amendment.
I did want to ask briefly some more detailed questions about Amendment 172 and new subsection (2) in particular. This arises from the danger of having clauses added at late stages of the Bill that have not had the benefit of proper discussion and scrutiny in Committee. I think we are all going to recognise the characteristics that are listed in new subsection (2) as mapping on to the Equality Act, which appears to be their source. I note in passing that it refers in that regard to gender reassignment. I would also note that most of the platforms, in their terms and conditions, refer not to gender reassignment but to various other things such as gender identity, which are really very different, or at least different in detail. I would be interested to ask my noble friend how effectively he expects the words used in English statute to be applied by what are, essentially, foreign platforms when they are operating for an audience in the United Kingdom—I am going to come back to this in a further amendment later.
My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.
The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible to do at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would excessively apply those duties and adversely affect children’s rights online.
I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.
To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.
I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.
My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While it uses the same language, it does not extend the Equality Act to this part of the Bill. In particular, it does not create a new offence.
The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—
With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?
I will leap to the Minister’s defence on this occasion. I remind noble colleagues that this is not about individual pieces of content; there would have to be a consistent flow of such information being proffered to children before Ofcom would ask for a change.
My Lords, these words have obviously appeared in the Bill in one of those unverified sections; I have clicked the wrong button, so I cannot see them. Where does it say in Amendment 172 that it has to be a consistent flow?
My Lords, first, I want to recognise the bravery of the families of Olly, Breck, Molly, Frankie and Sophie in campaigning for the amendments we are about to discuss. I also pay tribute to Mia, Archie, Isaac, Maia and Aime, whose families I met this morning on their way to the House. It is a great privilege to stand alongside them and witness their courage and dignity in the face of unimaginable grief. On behalf of myself, my co-signatories—the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Baroness, Lady Morgan—and the huge number of Peers and MPs who have supported these amendments, I thank them for their work and the selflessness they have shown in their determination to ensure that other families do not suffer as they have.
This group includes Amendments 198, 199, 215 and 216, which, together, would create a pathway for coroners and, by extension, families to get access to information relevant to the death of a child from technology services. The amendments would put an end to the inhumane situation whereby coroners and families in crisis are forced to battle faceless corporations to determine whether a child’s engagement with a digital service contributed to their death. Bereaved families have a right to know what happened to their children, and coroners have a duty to ensure that lessons are learned and that those who have failed in their responsibilities are held accountable.
Since the Minister is going to be the bearer of good news this afternoon, I will take the time to make arguments for the amendments as they stand. I simply say that, while parents have been fighting for access to information, those same companies have continued to suggest friends, material and behaviours that drive children into places and spaces in which they are undermined, radicalised into despair and come to harm. In no other circumstance would it be acceptable to withhold relevant information from a court procedure. It is both immoral and a failure of justice if coroners cannot access and review all relevant evidence. For the families, it adds pain to heartbreak as they are unable to come to terms with what has happened because there is still so much that they do not know.
I am grateful to the Government for agreeing to bring forward on Report amendments that will go a very long way towards closing the loopholes that allow companies to refuse coroners’ demands and ignore parents’ entreaties. The Government’s approach is somewhat different from that in front of us, but it covers the same ground. These amendments are the result of the considerable efforts of Ministers and officials from DSIT and the Ministry of Justice, with the invaluable support of the right honourable Sajid Javid MP. I wish to note on the record the leadership of the Secretary of State, who is currently on leave, and the Minister here, the noble Lord, Lord Parkinson.
The Government’s amendments will create an express power for Ofcom to require information from services about a deceased child user’s online activity following the receipt of a Schedule 5 request from a coroner. This will vastly increase the reach and power of that coroner. Information that Ofcom can request from regulated companies under the Online Safety Bill is extremely wide and includes detailed data on what is recommended; the amount of time the child spent on the service when they accessed it; their user journey; what content they liked, shared, rewatched, paused and reported; and whether other users raised red flags about the child’s safety or well-being before their death.
Information notices prompted by a Schedule 5 request from a coroner will be backed by Ofcom’s full enforcement powers and will apply to all regulated companies. If a service fails to comply, it may be subject to enforcement action, including senior management liability and fines of up to £18 million or 10% of global turnover—vastly different from the maximum fine of £1,000 under the Coroners and Justice Act 2009. Moreover, these amendments will give coroners access to Ofcom’s expertise and understanding of how online services work and of online services’ safety duties to children. Also, there will be provisions empowering Ofcom to share information freely to assist coroners in their inquiries. Companies must provide a dedicated means of communication to manage requests for information from bereaved parents and provide written responses to those requests. I look forward to the Minister setting out that these will be operated by a team of experts and backed up by Ofcom in ensuring that the communication is adequate, timely and not obstructive. Importantly, if the communication is not adequate, bereaved families will be able to notify Ofcom.
There are a small number of outstanding questions. We remain concerned that only larger companies will be required to set out their policies on disclosure. Sadly, children are often coerced and nudged into smaller sites that have less robust safety mechanisms. Small is not safe. A further issue is to ensure that a coroner is able, via a Schedule 5 notice given to Ofcom, to compel senior management to appear at an inquest. This is a crucial ask of the legal community, who battled and failed to get companies to attend inquests, notably Wattpad at the Frankie Thomas inquest and Snap Inc at Molly Russell’s inquest. Can the Minister undertake to close these gaps before Report?
A number of matters sit outside the scope of the Online Safety Bill. I am particularly grateful to the Secretary of State for committing in writing to further work beyond the Bill to ensure that the UK’s approach is comprehensive and watertight. The Government will be exploring ways in which the Data Protection and Digital Information (No. 2) Bill can support and complement these provisions, including the potential for a code that requires data preservation if a parent or enforcement officer contacts a helpline or if there is constructive knowledge, such as when a death has been widely reported, even before a Schedule 5 notice has been delivered.
The Government are engaging with the Chief Coroner to provide training in order to ensure that coroners have the knowledge they need to carry out inquests where children’s engagement with online services is a possible factor in their death. I am concerned about the funding of this element of the Government’s plans and urge the Minister to indicate whether this could be part of Ofcom’s literacy duties and therefore benefit from the levy. Possibly most importantly, the Secretary of State has undertaken to approach the US Government to ensure that coroners can review private messages that fall outside the scope of this Bill in cases where a child’s death is being investigated. I am grateful to the noble Lord, Lord Allan, for his support in articulating the issue, and accept the invitation to work alongside the department to achieve this.
There are only two further things to say. First, delivery is in the drafting, and I hope that when he responds, the Minister will assure the House that we will see the proposed amendments well before Report so that we can ensure that this works as we have all agreed. Secondly, the Government are now looking very carefully at other amendments which deal with prevention of harm in one way or another. I share the gratitude of Bereaved Parents for Online Safety for the work that has gone into this set of amendments. However, we want to see safety by design; a comprehensive list of harms to children in the Bill, including harms caused or amplified by the design of service; principles for age assurance which ensure that the systems put in place by regulated services are measurable, secure and fit for purpose; and a proper complaints service, so that children have somewhere to turn when things go wrong. What we have been promised is a radical change of status for the coroner and for the bereaved families. What we want is fewer dead children. I beg to move.
My Lords, some of the issues that we have been dealing with in this Bill are more abstract or generic harms, but here we are responding to a specific need of families in the UK who are facing the most awful of circumstances.
I want to recognise the noble Baroness, Lady Kidron, for her direct support for many of those families, and for her persistent efforts to use policy and the tools we have available to us here to improve the situation for families who, sadly, will face similar tragedies in future. I appreciate the time that she has spent with me in the spirit of finding workable solutions. It is an alliance that might seem improbable, given our respective responsibilities, which have sometimes placed us in publicly adversarial roles. However, one of the strengths of this Committee process is that it has allowed us to focus on what is important and to find that we have more in common than separates us. Nothing could be more important than the issue we are dealing with now.
I am pleased that it looks like we will be able to use this Bill to make some significant improvements in this area to address the challenges faced by those families, some of whom are here today, challenges which add to their already heart-wrenching distress. The first challenge these families face is to find someone at an online service who is willing and able to answer their questions about their loved one’s use of that platform. This question about contacts at online platforms is not limited to these cases but comes up in other areas.
As noble Lords will know, I used to work for Facebook, where I was often contacted by all sorts of Governments asking me to find people in companies, often smaller companies, concerning very serious issues such as terrorism. Even when they were dealing with the distribution of terrorist content, they would find it very challenging. There is a generic problem around getting hold of people at platforms. A real strength of the Online Safety Bill is that it will necessarily require Ofcom to develop contacts at all online services that offer user-to-user and search services to people in the UK. The Government estimate that 25,000 entities are involved. We are talking about Ofcom building a comprehensive database of pretty much any service that matters to people in the UK.
Primarily, these contacts will be safety focused, as their main responsibility will be to provide Ofcom with evidence that the service is meeting its duties of care under the Bill, so again, they will have the right people in the right companies on their database in future. Importantly, Ofcom will have a team of several hundred people, paid for by a levy on these regulated services, to manage the contacts at the right level. We can expect that, certainly for the larger services, there may be a team of several people at Ofcom dedicated to working with them, whereas for the smaller services it may be a pooled arrangement whereby one Ofcom staff member deals with a group. However, in all cases there will be someone at the regulator with a responsibility for liaising with those companies. We do not expect Ofcom to use those contacts to resolve questions raised by individuals in the UK as a matter of course, but it makes sense to make this channel available where there is a relatively small number of highly impactful cases such as we are dealing with here.
I do indeed welcome it. I do not feel I can do justice to all the speakers; I think I will cry, as I did when the noble Baroness, Lady Newlove, was speaking. I shall not do that, but I will thank all noble Lords from the bottom of my heart and will speak to just a couple of technical matters.
First, I accept the help of the noble Lord, Lord Allan, on the progress of the data protection negotiations with the US Government. That will be very helpful. I want to put on the record that there has been a lot of discussion about the privacy of other users and ensuring that it is central, particularly because other young people are in these interactions and we have to protect them, too. That is very much in our mind.
I welcome and thank the Minister. He said a couple of things, including that he hoped that what he will bring forward will rise to the expectation—so do I. The expectation is set high, and I hope that the Government rise to it. In relation to that, I note that a number of noble Lords carefully planted their expectations in Hansard. I will be giving the noble Lord a highlighter so that he can find them. I note that laying down the things she expected to see was a particular skill of the ex-Secretary of State for DCMS.
I understood “exploring” and “in our mind”; the Government have certain things in their mind. I understand the context of that because we are talking about other Bills and things that are yet to come. I want to make a statement—I do not know whether it is a promise or a threat; I rather suspect it is both. I will not rest until this entire ecosystem is sorted. This is not about winning an amendment or a concession. This is about putting it right for families and, indeed, for coroners, who are not doing a good job under the current regime.
Finally, I echo those who have pointed out the other amendments that we are seeking on safety by design, age assurance and having the harms in the Bill. I believe I speak for Bereaved Parents for Online Safety; that is what they wish to see come from their pain. It has been the privilege of my life to deal with these parents and these families and I thank the Committee for its support. With my conditions set out, I wish to withdraw my amendment.
My Lords, I will speak briefly to Amendment 218JA, spoken to by the noble Lord, Lord Allan. My name is attached to it online but has not made it on to the printed version. He introduced it so ably and comprehensively that I will not say much more, but I will be more direct with my noble friend the Minister.
This amendment would remove Clause 133(11). The noble Lord, Lord Allan, mentioned that BT has raised with us—I am sure that others have too—that the subsection gives examples of access facilities, such as ISPs and application stores. However, as the noble Lord said, there are other ways that services could use operating systems, browsers and VPNs to evade these access restriction orders. While it is convention for me to say that I would support this amendment should it be moved at a later stage, this is one of those issues that my noble friend the Minister could take off the table this afternoon—he has had letters about it to which there have not necessarily been replies—just by saying that subsection (11) does not give the whole picture, that there are other services and that it is misleading to give just these examples. Will he clarify at the Dispatch Box and on the record, for the benefit of everyone using the Bill now and in future, what broader services are caught? We could then take the issue off the table on this 10th day of Committee.
My Lords, I will be even more direct than the noble Baroness, Lady Morgan, and seek some confirmation. I understood from our various briefings in Committee that, where content is illegal, it is illegal anywhere in the digital world—it is not restricted simply to user to user, search and Part 5. Can the Minister say whether I have understood that correctly? If I have, will he confirm that Ofcom will be able to use its disruption powers on a service out of scope, as it were, such as a blog or a game with no user-to-user aspect, if it were found to be persistently hosting illegal content?
My Lords, this has been an interesting debate, though one of two halves, if not three.
The noble Lord, Lord Bethell, introduced his amendment in a very measured way. My noble friend Lady Benjamin really regrets that she cannot be here, but she strongly supports it. I will quote her without taking her speech entirely on board, as we have been admonished for that previously. She would have said that
“credit card companies have claimed ignorance using the excuse of how could they be expected to know they are supporting porn if they were not responsible for maintaining porn websites … This is simply not acceptable”.
Noble Lords must forgive me—I could not possibly have delivered that in the way that my noble friend would have done. However, I very much took on board what the noble Lord said about how this makes breaches transparent to the credit card companies. It is a right to be informed, not an enforcement power. The noble Lord described it as a simple and proportionate measure, which I think is fair. I would very much like to hear from the Minister why, given the importance of credit card companies in the provision of pornographic content, this is not acceptable to the Government.
The second part of this group is all about effective enforcement, which the noble Lord, Lord Bethell, spoke to as well. This is quite technical; it is really important that these issues have been raised, in particular by the noble Lord. The question is whether Ofcom has the appropriate enforcement powers. I was very taken by the phrase
“pre-empt a possible legal challenge”,
as it is quite helpful to get your retaliation in first. Underlying all this is that we need to know what advice the Minister and Ofcom are getting about the enforcement powers and so on.
I am slightly more sceptical about the amendments from the noble Lord, Lord Curry. I am all in favour of the need for speed in enforcement, particularly having argued for it in competition cases, where getting ex-ante powers is always a good idea—the faster one can move, the better. However, restricting the discretion of Ofcom in those circumstances seems to me a bit over the top. Many of us have expressed our confidence in Ofcom as we have gone through the Bill. We may come back to this in future; none of us thinks the Bill will necessarily be the perfect instrument, and it may prove that we do not have a sufficiently muscular regulator. I entirely respect the noble Lord’s track record and experience in regulation, but Ofcom has so far given us confidence that it will be a muscular regulator.
I turn now to the third part of the group. I was interested in the context in which my noble friend placed enforcement; it is really important and supported by the noble Baroness, Lady Morgan. It is interesting what questions have been asked about the full extent of the Government’s ambitions in this respect: are VPNs going to be subject to these kinds of notices? I would hope so; if VPNs are really the gateway to some of the unacceptable harms that we are trying to prevent, we should know about that. We should be very cognisant of the kind of possible culture being adopted by some of the social media and regulated services, and we should tailor our response accordingly. I will be interested to hear what the Government have to say on that.
My Lords, I too want to support this group of amendments, particularly Amendment 234, and will make just a couple of brief points.
First, one of the important qualities of the online safety regime is transparency, and this really speaks to that point. It is beyond clear that we are going to need all hands on deck, and again, this speaks to that need. I passionately agree with the noble Baroness, Lady Fox, on this issue and ask, when does an independent researcher stop being independent? I have met quite a lot on my journey who suddenly find ways of contributing to the digital world other than their independent research. However, the route described here offers all the opportunities to put those balancing pieces in place.
Secondly, I am very much aware of the fear of the academics in our universities. I know that a number of them wrote to the Secretary of State last week saying that they were concerned that they would be left behind their colleagues in Europe. We do not want to put up barriers for academics in the UK. We want the UK to be at the forefront of governance of the digital world; this amendment speaks to that, and I see no reason for the Government to reject it.
Finally, I want to emphasise the importance of research. Revealing Reality did research for 5Rights called Pathways, in which it built avatars for real children and revealed the recommendation loops in action. We could see how children were being offered self-harm, suicide, extreme diets and livestream porn within moments of them arriving online. Frances Haugen has already been mentioned. She categorically proved what we have been asserting for years, namely that Instagram impacts negatively on teenage girls. As we put this regime in place, it is not adequate to rely on organisations that are willing to work in the grey areas of legality to get their research or on whistleblowers—on individual acts of courage—to make the world aware.
One of the conversations I remember happened nearly five years ago, when the then Secretary of State asked me what the most important thing about the Bill was. I said, “To bring a radical idea of transparency to the sector”. This amendment goes some way to doing just that.
My Lords, I, too, support Amendments 233 and 234, and Amendment 233A, from the noble Lord, Lord Allan. As the noble Baroness, Lady Kidron, said, it has been made clear in the past 10 days of Committee that there is a role for every part of society to play to make sure that we see the benefits of the digital world but also mitigate the potential harms. The role that researchers and academics can play in helping us understand how the digital world operates is critical—and that is going to get ever more so as we enter a world of large language models and AI. Access to data in order to understand how digital systems and processes work will become even more important—next week, not just in 10 years’ time.
My noble friend Lord Bethell quite rightly pointed out the parallels with other regulators, such as the MHRA and the Bank of England. A number of people are now comparing the way in which the MHRA and other medical regulators regulate the development of drugs with how we ought to think about the emergence of regulation for AI. This is a very good read-across: we need to set the rules of the road for researchers and ensure, as the noble Baroness, Lady Kidron, said—nailing it, as usual—that we have the most transparent system possible, enabling people to conduct their research in the light, not in the grey zone.
It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.
We are sympathetic to the amendment. It is complex, and this has been a useful debate—
I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.
We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications of our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.
My Lords, I congratulate the noble Baroness on having elucidated this arcane set of amendments. Unfortunately, though, it makes me deeply suspicious when I see what the amendments seem to do. I am not entirely clear about whether we are returning to some kind of merits-based appeal. If so, since the main litigators are going to be the social media companies, it will operate for their benefit to reopen every single thing that they possibly can on the basis of the original evidence that was taken into account by Ofcom, as opposed to doing it on a JR basis. It makes me feel quite uncomfortable if it is for their benefit, because I suspect it is not going to be for the ordinary user who has been disadvantaged by a social media company. I hope our brand spanking new independent complaints system—which the Minister will no doubt assure us is well on the way—will deal with that, but this strikes me as going a little too far.
My Lords, I enter the fray with some trepidation. In a briefing, Carnegie, which we all love and respect, and which has been fantastic in the background on Committee days, shared some concerns. As I interpret its concerns, when Ofcom was created in 2003 its decisions could be appealed on their merits, as the noble Lord has just suggested, to the Competition Appeal Tribunal, and I believe that this was seen as a balancing measure against an untested regime. What followed was that the broad basis on which appeal was allowed led to Ofcom defending 10 appeals per year, which really frustrated its ability as a regulator to take timely decisions. It turned out that the appeals against Ofcom made up more than 80% of the workload of the Competition Appeal Tribunal, whose work was supposed to cover a whole gamut of matters. When there was a consultation on the fringes of the DEA, it was decided to restrict appeals to judicial review and appeal on process. I just want to make sure that we are not opening up a huge and unnecessary delaying tactic.
I thank all those who have spoken, and I very much appreciate the spirit in which the amendments were tabled. They propose changes to the standard of appeal, the standing to appeal and the appeals process itself. The Government are concerned that enabling a review of the full merits of cases, as proposed by Amendments 243 and 245, could prove burdensome for the courts and the regulator, since a full-merits approach, as we have been hearing, has been used by regulated services in other regulatory regimes to delay intervention, undermining the effectiveness of the enforcement process. With deep-pocketed services in scope, allowing for a full-merits review could incentivise speculative appeals, both undermining the integrity of the system and slowing the regulatory process.
While the Government are fully committed to making sure that the regulator is properly held to account, we feel that there is not a compelling case for replacing the decisions of an expert and well-resourced regulator with those of a tribunal. Ofcom will be better placed to undertake the complex analysis, including technical analysis, that informs regulatory decisions.
Amendment 245 would also limit standing and leave to appeal only to providers and those entities determined eligible to make super-complaints under Clause 150. This would significantly narrow the eligibility requirements for appeals. For appeals against Ofcom notices, we assess that the broader, well-established standard in civil law of sufficient interest is more appropriate. Super-complaints fulfil a very different function from appeals. Unlike appeals, which will allow regulated services to challenge decisions of the regulator, super-complaints will allow organisations to advocate for users, including vulnerable groups and children, to ensure that systemic issues affecting UK users are brought to Ofcom’s attention. Given the entirely distinct purposes of these functions, it would be inappropriate to impose the eligibility requirements for super-complaints on the appeals system.
I am also concerned about the further proposal in Amendment 245 to allow the tribunal to replace Ofcom’s decision with its own. Currently, the Upper Tribunal is able to dismiss an appeal or quash Ofcom’s decision. Quashed decisions must be remitted to Ofcom for reconsideration, and the tribunal may give directions that it considers appropriate. Amendment 245 proposes instead allowing the Upper Tribunal to
“impose or revoke, or vary the amount of, a penalty … give such directions or take such other steps as OFCOM could itself have given or taken, or … make any other decision which OFCOM could itself have made”.
The concern is that this risks undermining Ofcom’s independence and discretion in applying its powers and issuing sanctions, and challenging the regulator’s credibility and authority. It may also further incentivise well-resourced providers to appeal opportunistically, with a view to securing a more favourable outcome at a tribunal.
On that basis, I fear that the amendments tabled by the noble Lord would compromise the fundamental features of the current appeals provisions, without any significant benefits, and risk introducing a range of inadvertent consequences. We are confident that the Upper Tribunal’s judicial review process, currently set out in the Bill, provides a proportionate, effective means of appeal that avoids unnecessary expense and delays, while ensuring that the regulator’s decisions can be thoroughly scrutinised. It is for these reasons that I hope the noble Baroness will withdraw the amendment.
The fact that I labelled it as being AI-generated helped your Lordships to understand, and the transparency eases the debate. I beg to move.
My Lords, I thank the noble Lord, Lord Knight, for laying out the amendment and recognise that there was a very thoughtful debate on the subject of machine-generated content on Amendment 125 in my name on a previous day of Committee.
I appreciate that the concept of labelling or watermarking machine-generated material is central to recent EU legislation, but I am equally aware that there is more than one school of thought on the efficacy of that approach among AI experts. On the one hand, as the noble Lord, Lord Knight, beautifully set out—with the help of his artificial friend—there are those who believe that visibly marking the division between real and altered material is a cue for the public to look more carefully at what they are seeing, and that labelling might provide an opportunity for both creators and digital companies to give greater weight to “human-created material”. For example, it could be that the new BBC Verify brand is given greater validity by the public, or that Google’s search results promote it above material labelled as machine-generated as a more authentic source. On the other hand, there are those who feel that the scale of machine-generated material will be so vast that this labelling will be impossible, or that labelling will downgrade the value of very important machine-generated material in the public imagination, when in the very near future it is likely that most human activity will be a blend of generated material and human interaction.
I spent the first part of this week locked in a room with others at the Institute for Ethics in AI in Oxford debating some of these issues. While this is a very live discussion, one thing is clear: if we are to learn from history, we must act now before all is certain, and we should act with pragmatism and a level of humility. It may be that either or both sets of experts are correct.
Industry has clearly indicated that there is an AI arms race, and many companies are launching services that they do not understand the implications of. This is not my view but one told to me by a company leader, who said that the speed of distribution was so great that the testing was confined to whether deploying large language models crashed the platforms; there was no testing for safety.
The noble Lord, Lord Stevenson, says in his explanatory statement that this is a probing amendment. I therefore ask the Minister whether we might meet before Report and look once again at the gaps that might be covered by some combination of Amendment 125 and the amendment in front of us, to make certain that the Bill adequately reflects the concerns raised by the enforcement community and reflects the advice of those who best understand the latest iterations of the digital world.
The Communications Act 2003 made a horrible mistake in not incorporating digital within it; let us not do the same here. Adding explicit safety duties to AI and machine learning would not slow down innovation but would ensure that innovation is not short-sighted and dangerous for humanity. It is a small amendment for what may turn out to be an unimaginably important purpose.
My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I will try to keep my remarks brief.
It is extremely helpful that we have the opportunity to talk about this labelling question. I see it more as a kind of aperitif for our later discussion of AI regulation writ large. Given that it is literally aperitif hour, I shall just offer a small snifter as to why I think there may be some challenges around labelling—again, perhaps that is not a surprise to the noble Baroness.
When we make rules, as a general matter we tend to assume that people are going to read them and respond in a rationalist, conformist way. In reality, particularly in the internet space, we often see that there is a mixed environment and there will be three groups. There are the people who will look at the rules and respond in that rational way to them; a large group of people will just ignore them—they will simply be unaware and not at all focused on the rules; and another group will look for opportunities to subvert them and use them to their own advantage. I want to comment particularly on that last group by reference to cutlery and call centres, two historic examples of where rules have been subverted.
On the cutlery example, I am a Sheffielder, and “Made in Sheffield” used to mean that you had made the entire knife in Sheffield. Then we had this long period when we went from knives being made in Sheffield to bringing them to Sheffield and silver-plating them, to eventually just sharpening them and putting them in boxes. That is relevant in the context of AI. Increasingly, if there is an advantage to be gained by appearing to be human, people will look at what kind of finishing you need, so: “The content may have been generated by AI but the button to post it was pushed by a human, therefore we do not think it is AI because we looked at it and posted it”. On the speech of the noble Lord, Lord Knight, does the fact that my noble friend intervened on him and the noble Lord had to use some of his own words now mean that his speech in Hansard would not have to be labelled “AI-generated” because we have now departed from it? Therefore, there is that question of individuals who will want something to appear human-made even if it was largely AI-generated, and whether they will find the “Made in Sheffield” way of bypassing it.
Interestingly, we may see the phenomenon flipping the other way, and this is where my call centres come in. If people go to a popular search engine and type in “SpinVox”, they will see the story of a tech company that promised to transcribe voicemails into written text. This was a wonderful use of technology, and it was valued on the basis that it had developed that fantastic technology. However, it turned out—or at least there were claims, which I can repeat here under privilege—that it was using call centres in low-cost, low-wage environments to type those messages out. Therefore, again, we may see, curiously, some people seeing an advantage to presenting content as AI-generated when it is actually made by humans. That is just to flag that up—as I say, it is a much bigger debate that we are going to have. It is really important that we are having it, and labelling has a role to play. However, as we think about it, I urge that we remember those communities of people who will look at whatever rules we come up with and say, “Aha! Where can I get advantage?”, either by claiming that something is human when it is generated by AI or claiming that it is generated by AI if it suits them when it was actually produced by humans.
My Lords, we already had a long debate on this subject earlier in Committee. In the interim, many noble Lords associated with these amendments have had conversations with the Government, which I hope will bear some fruit before Report. Today, I want to reiterate a few points that I hope are clarifying to the Committee and the department. In the interests of everyone’s evening plans, the noble Lord, Lord Bethell, and the noble Baroness, Lady Harding, wish to associate themselves with these remarks so that they represent us in our entirety.
For many years, we thought age verification was a gold standard, primarily because it involved a specific government-issued piece of information such as a passport. By the same token, we thought age estimation was a lesser beast, given that it is an estimate by its very nature and that the sector primarily relied on self-declarations with very few checks and balances. In recent years, many approaches to age checking have flourished. Some companies provide age assurance tokens based on facial recognition; others use multiple signals of behaviour, friendship group, parental controls and how you move your body in gameplay; and, only yesterday, I saw the very impressive on-device privacy-preserving age-verification system that Apple rolled out in the US two weeks ago. All of these approaches, used individually and cumulatively, have a place in the age-checking ecosystem, and all will become more seamless over time. But we must ensure that, when they are used, they are adequate for the task they are performing and are quality controlled so that they do not share information about a child, are secure and are effective.
That is why, at the heart of the package of measures put forward in my name and that of the noble Lords, Lord Stevenson and Lord Bethell, and the right reverend Prelate the Bishop of Oxford, are two concepts. First, the method of measuring age should be tech neutral so that all roads can be used. Secondly, there must be a robust mechanism for measuring effectiveness, so that only effective systems can be used in high-risk situations, particularly those involving primary priority harms such as self-harm and pornography, and so that such measurement is determined by Ofcom, not industry.
From my work over the last decade and from recent discussion with industry, I am certain that any regime of age assurance must be measurable and must hold to certain principles. We cannot create a situation where children’s data is loosely held and liberally shared; we cannot have a system that discriminates against, or does not have automatic appeal mechanisms for, children of colour or those who are 17 or 19, for whom the likelihood of error is greatest. Systems should aim to be interoperable and private, and should not leave traces as children go from one service to another.
Each of the principles of our age-verification package set out in the schedule is of crucial importance. I hope that the Government will see the sense in that because, without them, this age checking will not be trusted. Equally, I urge the Committee to embrace the duality of age verification and estimation that the Government have put forward because, if a child uses an older sibling’s form of verification and a company understands through the child’s behaviour that they are indeed a child, we do not want to set up a perverse situation in which the verification is considered of a higher order and the company cannot take action based on estimation; ditto, if estimation in gameplay is more accurate than tokens that verify whether someone is over or under 18, it may well be that estimation gives greater assurance that the company will treat the child according to their age.
I hope and believe that, in his response, the Minister will confirm that definitions of age assurance and age estimation will be on the face of the Bill. I also urge him to make a generous promise to accept the full gamut of our concerns about age checking and bring forward amendments in his name on Report that reflect them in full. I beg to move.
My Lords, I associate these Benches with the introduction by the noble Baroness, Lady Kidron, support her amendments and, likewise, hope that they form part of the package that is trundling on its way towards us.
I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, including, as is well illustrated, in this short debate this evening.
As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.
The Bill already includes a definition of age assurance in Clause 207, which is
“measures designed to estimate or verify the age or age-range of users of a service”.
As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.
This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.