Lords Chamber

My Lords, I shall speak very briefly. Earlier—I suppose it was this morning—we talked about child criminal exploitation at some length, thanks particularly to the work of the noble Baroness, Lady Casey, and Professor Jay. Essentially, what we are talking about in this group of amendments is child commercial exploitation. All these engines, all these technologies, are there for a commercial purpose. They have investors who are expecting a return and, to maximise the return, these technologies are designed to drive traffic, to drive addiction, and they do it very successfully. We are way behind the curve—we really are.
I echo what the noble Baroness, Lady Morgan, said about the body of knowledge within Parliament, in both Houses, that was very involved in the passage of the Online Safety Act. There is a very high level of concern, in both Houses, that we were perhaps too ambitious in assuming that a regulator that had not previously had any responsibilities in this area would be able to live up to the expectations held, and indeed some of the promises made, by the Government during the passage of that Act. I think we need to face up to that: we need to accept that we have not got it off to as good a start as we wanted and hoped, and that the technologies we have been hearing about are now racing ahead so quickly that we are finding it hard to catch up. Indeed, looking at the expressions on the faces of your Lordships in the Chamber as some of this has been described: if it is having that effect on us, imagine what effect it is having on the children who in many cases are the subjects of these technologies.
I plead with the Minister to work very closely with his new ministerial colleague, the noble Baroness, Lady Lloyd, and DSIT. We really need to get our act together and focus; otherwise, we will have repeats of these sorts of discussions where we raise issues that are happening at an increasing pace, not just here but all around the world. I fear that we are going to be holding our hands up, saying “We’re doing our best and we’re trying to catch up”, but that is not good enough. It is not good enough for my granddaughter and not good enough for the extended families of everybody here in this Chamber. We really have to get our act together and work together to try to catch up.
My Lords, I too support the amendments in this group, particularly those tabled by my noble friend Lord Nash on security software and by the noble Baroness, Lady Kidron, on AI-generated child sexual abuse material. I declare my interest as a trustee of the Royal Society for Public Health.
As others have noted, the Online Safety Act was a landmark achievement and, in many ways, something to be celebrated, but technology has not stood still—we said it at the time—and nor can our laws. It is important that we revisit it in examining this legislation, because generative AI presents risks to our children that were barely imaginable even two years ago when we were discussing that Act. These amendments would ensure that our regulatory architecture keeps pace.
Amendment 266 on AI CSAM risk assessment is crucial. It addresses a simple but profound question: should the provider of a generative AI service be required to assess whether that service could be used to create or facilitate child sexual abuse material? Surely the answer is yes. This is not a theoretical risk, as we have heard in testimony from many noble Lords. We know that AI can generate vivid images, trained on datasets scraped from the open internet—including images of children themselves—and that it can be prompted to create CSAM-like content. On this, there is no ambiguity at all. We know that chatbots trained on vast corpora of text from children can be manipulated to generate grooming scripts and sexualised narratives that engage children and make them semi-addicted to those conversations. We know that these tools are increasingly accessible, easy to use and almost impossible to monitor for parents and, it seems, regulators.
Grand Committee

My Lords, I will speak to my Amendments 198A and 198C to 198F. I also support Amendments 197, 198 and 198B, to which I have added my name, all of which address the issue of data for researchers.
As was put very thoughtfully by the noble Baroness, Lady Kidron, platforms are not making decisions about their services with due regard to product safety or with independent oversight. Ofcom's work enforcing the Online Safety Act will, in some part, significantly shift towards accountability, but it makes no provision at the moment for researchers' data access, despite civil society and academic researchers having been at the forefront of highlighting online harms for a decade. The anecdotes that the noble Baroness just gave were a very powerful testimony to the importance of that. We are, in fact, flying completely blind, making policy and, in this Room, legislation without data, facts and insight about the platforms and algorithms that we seek to address. Were it not for the whistleblowers, we would not have anything to go on, and we cannot rely on whistleblowers to guide our hands.
Rectifying this omission is in the Bill, and I am enormously grateful to the Minister and to my noble friend Lord Camrose for their role in putting it there. It is particularly important because the situation with data for researchers has deteriorated considerably, even in the last 18 months—with Meta shutting CrowdTangle and X restricting researchers' access to its API. The noble Baroness, Lady Kidron, spoke about what the whistleblowers think, and they think that this is going to get a lot worse in the future.
I welcome the inclusion of these provisions in the Bill. They will be totally transformational to this sector, bringing a level of access to serious analysts and academics, so we can better understand the impact of the digital world, for both good and bad. A good example of the importance of robust research to inform policy-making was the Secretary of State’s recent announcement that the Government were launching a
“research project to explore the impact of social media on young people’s wellbeing and mental health”.—[Official Report, Commons, 20/11/24; col. 250.]
That project will not be very effective if the researchers cannot access the data, so I very much hope that these provisions will be in force before the Government start spending money on it.
To have the desired effect, we need to ensure that the data for researchers regime, as described in the Bill, is truly effective and cannot easily be brushed off. That is why the Government need to accept the amendments in this group: to bring some clarity and to close loopholes in the scheme as it is outlined in the Bill.
I will briefly summarise the provisions in the amendments in my name. First, we need to make researcher access regulations enforceable in the same way as other requirements in the Online Safety Act. The enforcement provisions in that Act were strengthened considerably as it passed through this House, and I believe that the measures for data for researchers need to be given the same rocket boosters. Amendment 198D will mean that regulated services are required to adhere to the regime, and it will give Ofcom the power to take proper remedial action if regulated services are obfuscating or non-compliant.
Secondly, we need to ensure that any contractual provision of use, such as a platform’s terms of service, is unenforceable if it would prevent
“research into online safety matters”,
as defined in the regulations. This is an important loophole that needs to be closed. It will protect UK researchers carrying out public interest research from nefarious litigation over terms of service violations as platforms seek to obfuscate access to data. We have seen this practice in other areas.
Thirdly, we need to clarify that researchers carrying out applicable research into online safety matters in the UK will be able to access information under the regime, regardless of where they are located. This is a basic point. Amendment 198E would bring the regime in line with the Digital Services Act of the EU and allow the world’s best researchers to study potential harm to UK users.
Ensuring robust researcher access to data contributes to a healthy ecosystem of investigation and scrutiny that will help to secure effective application of the law, while also guarding against overreach in the moderation of speech. It is time to back UK civil society and academic researchers to ensure that policy-making and regulatory enforcement are as informed as possible. That is why I ask the Minister to support these measures.
My Lords, I will speak briefly. I added my name in support of Amendments 197 and 198, tabled by the noble Baroness, Lady Kidron. We do not need to rehearse the arguments as to why children are a distinct group who need to be looked at in a distinctive way, so I will not repeat those arguments.
I turn to the excellent points made in the amendments in the name of the noble Lord, Lord Bethell. Data access for researchers is fundamental. The problem with statutory bodies, regulators and departments of state is that they are not designed and set up to be experts in researching some of the more arcane areas in which these algorithms are developed. This is leading-edge stuff. The employees in these platforms—the people who are designing and tweaking these very clever algorithms—are coming from precisely the academic and research institutions that are best placed to go into those companies and find out what they are doing. In many cases, it is their own graduates and PhDs who are doing it. They are the best qualified people to look at what is going on, because they will understand what is going on. If somebody tries to obfuscate, they will see through them immediately, because they can understand that highly sophisticated language.
If we do not allow this, we will be in the deeply uncomfortable position of relying on brave people such as Frances Haugen to run the huge reputational, employability and financial risks of becoming a whistleblower. A whistleblower who takes on one of those huge platforms that has been employing them is a very brave person indeed. I would feel distinctly uncomfortable if I thought that we were guarding our citizens, and particularly our children, against what some of these algorithms are trying to do by relying on the goodwill, and the chance, of a whistleblower showing us what was going on. I support all these amendments very strongly.