Data (Use and Access) Bill [HL] Debate
Grand Committee

My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is far preferable in many ways to mine.
Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children up to the age of 18, for the purposes of UK law and Convention 108, and to pupils as defined by the Education Act 1996, who in the education sector may be up to the age of 19 or, with special educational needs, up to 25. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:
“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.
The duties of settings and data processors, and the rights appropriate to the stage of education and to children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,
“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.
The educational setting differs from purely commercial interactions, and not only because the data subjects are children. It is more complex because of the disempowered environment and its imbalance of power between the authority, the parents and the child. A further complication is that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:
“Parents have a prior right to choose the kind of education that shall be given to their children.”
A code is needed because the explicit safeguards that the GDPR requires in several places are missing, having been left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the necessary safeguards of GDPR Article 23(1) for children. Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. The clauses on further processing and on changes to purpose limitation are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of the identifying personal data collected in educational settings only ever increase, while the protections are only ever reduced.
Obligations specific to children’s data, especially
“solely automated decision-making and profiling”
and exceptions, need to be consistent with clear safeguards by design where they restrict fundamental freedoms. What does that mean in practice for children, for whom teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, and health and safety standards, among others proportionate to the risks of data processing, and respect for the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended by 5Rights in the Digital Futures Commission’s blueprint for education data governance.
The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that
“children have the right to be heard and participate in decisions affecting them”.
They recognise that
“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”
Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.
Young people themselves were not invited to participate in the development of this Bill, and their views have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work with the DfE in 2024. The findings back those of Defend Digital Me’s Survation poll in 2018 and show that parents do not know that the DfE already holds named pupil records without their knowledge or permission, and that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. The report stated:
“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”
A code of practice is needed to explain the law and make it work as intended for everyone. The aim of a code of practice for educational settings would be for adherence to the code to create a mechanism for controllers and processors to demonstrate compliance with the legislation or to approve certification methods. It would give providers confidence in consistent and clear standards, which would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice so that their rights are freely met by design and default.
Further, schools give children’s personal data to many commercial companies during a child’s education—not based on consent but assumed for the performance of a task carried out in the public interest. A code should clarify any boundaries of this lawful basis for commercial purposes, where it is an obligation on parents to provide the data and what this means for the child on reaching maturity or after leaving the educational setting.
Again, a code should help companies understand “data protection by design and default” in practice, what amounts to a “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding how the responsibilities of controllers and processors bear on the execution or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that serve adverts in use.
I hope that I have explained exactly why we believe that a code of practice is required in educational settings. I beg to move.
My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.
Both these amendments propose a code of practice to address the use of children’s data in the context of education; indeed, they have much in common. Having heard the noble Lord, Lord Clement-Jones, I find that I have much in common with what he said. I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools, and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.
Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.
Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.
Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to protect—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.
Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.
A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.
My Lords, Amendment 138 tabled by the noble Lord, Lord Clement-Jones, and Amendment 141, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, would both require the ICO to publish a code of practice for controllers and processors on the processing of personal data by educational technologies in schools.
I say at the outset that I welcome this debate and the contributions of noble Lords on this important issue. As various noble Lords have indicated, civil society organisations have also been contacting the Department for Science, Innovation and Technology and the Department for Education directly to highlight their concerns about this issue. It is a live issue.
I am grateful to my noble friend Lord Knight, who talked about the important and valuable role that technology can play in supporting children’s development and guiding teaching interventions. We have to get the balance right, but we understand and appreciate that schoolchildren, parents and schoolteachers must be able to trust the way that services use children’s personal data. That is at the heart of this debate.
A great deal of work is going on on this issue, some of which noble Lords have referred to. The Department for Education is already exploring ways to engage with the edtech market to reinforce the importance of evidence-based, quality products and services in education. On my noble friend Lord Knight’s comments on AI, the Department for Education is developing a framework outlining safety expectations for AI products in education and creating resources for teachers and leaders on safe AI use.
I recognise why noble Lords consider that a dedicated ICO code of practice could help ensure that schools and edtech services are complying with data protection legislation. The Government are open-minded about exploring the merits of this further with the ICO, but it would be premature to include these requirements in the Bill. As I said, there is a great deal of work going on and the findings of the recent ICO audits of edtech service providers will help to inform whether a code of practice is necessary and what services should be in scope.
I hope that we will bear that in mind and engage on it. I would be happy to continue discussions with noble Lords, the ICO and colleagues at the Department for Education, outside of the Bill’s processes, about the possibility of future work on this, particularly as the Secretary of State has powers under the Data Protection Act 2018 to require the ICO to produce new statutory codes, as noble Lords know. Considering the explanation that I have given, I hope that the noble Lord, Lord Clement-Jones, will consider withdrawing his amendment at this stage.
My Lords, I thank the Minister for her response and all speakers in this debate. On the speech from the noble Lord, Lord Knight, I entirely agree with the Minister and the noble Viscount, Lord Camrose, that it is important to remind ourselves about the benefits that can be achieved by AI in schools. The noble Lord set out a number of those. The noble Lord, Lord Russell, also reminded us that this is not a purely domestic issue; it is international across the board.
However, all noble Lords reminded us of the disbenefits and risks. In fact, the noble Lord, Lord Knight, used the word “dystopian”, which was quite interesting, although he gets very close to science fiction sometimes. He said that
“we have good reason to be concerned”,
particularly because of issues such as the national pupil database, where the original purpose may not have been fulfilled and was, in many ways, changed. He gave an example of procurement during Covid, where the choice was either Google or Microsoft—Coke or Pepsi. That is an issue across the board in competition law, as well.
There are real issues here. The noble Lord, Lord Russell, put it very well when he said that there is any number of pieces of guidance for schools but it is important to have a code of conduct. We are all, I think, on the same page in trying to find—in the words of the noble Baroness, Lady Kidron—a fairer and more equitable set of arrangements for children in schools. We need to navigate our way through this issue; of course, organisations such as Defend Digital Me and 5rights are seriously working on it.
My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.
We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.
The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the statute when passed, that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in the intervening years; it desperately and urgently needs to be updated because, for want of amendment, it is currently putting every citizen in this nation at risk. That is the purpose of Amendments 156A and 156B.
The Computer Misuse Act 1990 is not only out of date but inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They oftentimes work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.
Let us take just two examples: vulnerability research, and threat intelligence assessment and analysis. Both could see a cybersecurity professional falling foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and its ability, and that of the cybersecurity professional community, to meet those threats.
These amendments, in essence, perform one simple but critical task: to afford a legal defence for legitimate cybersecurity activities. That is all, but it would have such a profound impact for those whom we have asked to keep us safe and for the safety they can thus deliver to every citizen in our society.
Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?
It is not merely time; it is well past time for these amendments to become part of our law. If not now, when? If not these amendments, which amendments? If the Government do not accept these amendments, what will they say to all those people who will continue to be put in harm’s way for want of these protective provisions? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.
My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.
As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.
I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,
“a catalytic effect on innovation”
within the UK’s cybersecurity sector, which possesses “considerable growth potential”.
My Lords, I rise briefly but strongly to support my noble friend Lord Holmes. The CyberUp campaign has been banging this drum for a long time now. I remember taking part in the debates in another place on the Computer Misuse Act 34 years ago. It was the time of dial-up modems, fax machines and bulletin boards. This is the time to act, and it is the opportunity to do so.
My Lords, we ought to congratulate the noble Lord on having been made parliamentarian of the year as a result of his campaigning activities.
Could the Minister say a few words on some of those points of contention, to give the Committee some flavour of the type of issues on which there is no consensus, as well as the extent of the gap between some of those perspectives?
Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful as well, in unpacking what were clearly extremely well-informed recommendations. It should, no doubt, be taken extremely seriously.
Yes, the Government accepted the recommendations in full.
Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.
My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as CSA. It is not often in life that in a short space of time one is afforded the opportunity in government of bringing much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.
There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were to do this only for their much-spoken-of growth agenda, it would have an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see if anything further can be done but, from my perspective and others’, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.
My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.
I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. In my opinion, it was never done properly in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out the will of Parliament as expressed in the legislation it passed. We have a problem, but we can remedy it by putting “must” in the Bill. Then, we can hold the Executive to account.
My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.
Like all of us, I warmly welcome the inclusion in Clause 123 of the Bill of data access provisions for researchers studying online safety matters. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, for the Bill to have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the width of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.
The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.
My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be drawn carefully to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to shield themselves simply by calling themselves “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase is used in that context.
My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.
Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would make researcher access enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.
I am still left with some curiosity about some of these amendments, so I will indicate where I have specific questions for those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and through what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.
I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.
I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?
Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?
My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.
I hope that the Minister has similar agility in being able to readjust the Government’s position on this. Rightly, this was an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.
My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who have drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it would specifically become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.
This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors from creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse offer tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.
The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, it documented the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government for legislation to strengthen legal frameworks, to better address the evolving landscape of AI-generated CSAM and to enhance preventive measures against its creation and distribution. It specifically recommended:
“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.
The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.
Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.
We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not address non-consensual intimate images effectively. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and face no legal consequences, as long as they do not share, or threaten to share, it.
This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.
I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.
My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.
Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.
Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.
That reservation concerns proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of such collation can be done by innocent people using innocent materials, which are then corrupted or given more poisonous aspects through further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.
Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.
My Lords, I very much support these amendments. I declare an interest as an owner of written copyright in the Good Schools Guide and as a father of an illustrator. In both contexts, it is very important that we get intellectual property right, as I think the Government recognised in what they put out yesterday. However, I share the scepticism of those who have spoken as to whether the Government’s ideas can be made to work.
It is really important that we get this straight. For those of us operating at the small end of the scale, IP is under continual threat from established media. I write maybe 10 or a dozen letters a year to large media outfits reminding them of the boundaries, the latest to the Catholic Herald—it appears not even the 10 commandments have force on them. But what AI can do is a huge measure more difficult to deal with. I can see absolutely, by talking to Copilot, that it has gone through my paywall and absorbed the contents of the Good Schools Guide, but who am I supposed to go after for this? Who has actually done the trespassing? Who is responsible for it? Where is the ownership? It is difficult to enforce copyright, even by writing a polite letter to someone saying, “Please don’t do this”. The Government appear to propose a system of polite letters saying, “Oh dear, it looks as if you might have borrowed my copyright. Please, can you give it back?”
This is not practically enforceable, and it will not result in people who care about IP locating their businesses here. Quite clearly, we do not have ownership of the big AI systems, and it is unlikely that we ever will—all that will be overseas. What we can do is create IP. If we run a system in which we do not defend the IP that we produce, then fairly rapidly those IP creators who are capable of being mobile will go elsewhere, to places that will defend their IP. That is something that a Government who are interested in growth really ought to be interested in defending. I hope that we will see some real progress in the course of the Bill going through the House.
My Lords, I declare my AI interests as set out in the register. I will speak in support of Amendments 204, 205 and 206, which have been spoken to so inspiringly by the noble Baroness, Lady Kidron, and so well by the noble Lords, Lord Freyberg, Lord Lucas and Lord Hampton, the noble Earl, Lord Clancarty, and the noble Viscount, Lord Colville. Each demonstrated different facets of the issue.
I co-chair the All-Party Parliamentary Group on AI and chaired the AI Select Committee a few years ago. I wrote a book earlier this year on AI regulation, which received a namecheck from the noble Baroness, Lady Jones, at Question Time, for which I was very grateful. Before that, I had a career as an IP lawyer, defending copyright and creativity, and in this House I have been my party’s creative industries spokesperson. The question of IP and the training of generative AI models is a key issue for me.
This is the case not just in the UK but around the world. Getty and the New York Times are suing in the United States, as are many writers, artists and musicians. It was at the root of the Hollywood actors’ and writers’ strikes last year. It is one thing to use the tech—many of us are AI enthusiasts—but it is another to be at the mercy of it.
Close to home, the FT has pointed out, using the index published online by the creator of an unlicensed dataset called Books3, that it is possible to identify over 85 books written by 33 Members of the House of Lords that have been pirated to train AI models by household names such as Meta, Microsoft and Bloomberg. Although it is absolutely clear that the unlicensed use of copyrighted works to train AI models is contrary to UK copyright law, the laws around the transparency of these activities have not caught up. As we have heard, as well as using pirated e-books in their training data, AI developers scrape the internet for valuable professional journalism and other media, in breach of both the terms of service of websites and copyright law, to train commercial AI models. At present, developers can do this without declaring their identity, or they may take IP scraped for inclusion in a search index and use it for the completely different commercial purpose of training AI models.
How can rights owners opt out of something that they do not know about? AI developers will often scrape websites or access other pirated material before they launch an LLM in public. This means that there is no way for IP owners to opt out of their material being taken before its inclusion in these models. Once the material has been used to train these models, the commercial value, as we have heard, has already been extracted from IP scraped without permission, and there is no way to delete data from these models.
The next wave of AI models responds to user queries by browsing the web to extract valuable news and information from professional news websites. This is known as retrieval-augmented generation—RAG. Without payment for extracting this commercial value, AI agents built by companies such as Perplexity, Google and Meta will, in effect, free-ride on the professional hard work of journalists, authors and creators. At present, such crawlers are hard to block. There is no market failure; there are well-established licensing solutions. There is no uncertainty around the existing law; the UK is absolutely clear that commercial organisations, including gen AI developers, must license the data that they use to train their large language models.
Here, as the Government’s intentions become clearer, the political, business and creative temperature is rising. Just this week, we have seen the creation of a new campaign, the Creative Rights in AI Coalition—CRAIC—across the creative and news industries and, recently, a statement organised by Ed Newton-Rex reached more than 30,000 signatories from among creators and creative organisations.
The noble Lord has enormous experience in these areas and will be particularly aware of the legal difficulties in enforcing rights. Given what he said, with which I entirely agree—indeed, I agree with all the speakers in supporting these amendments—and given the extraordinary expense of litigating to enforce rights, how does he envisage there being an adequate system to allow those who have had their data scraped in the way that he describes to obtain redress or, rather, suitable remedies?
I thank the noble Lord for that. He is anticipating a paragraph in my notes, which says that, although it is not set out in the amendments, robust enforcement of these provisions will be critical to their success. This includes oversight from an expert regulator that is empowered to issue significant penalties, including fines for non-compliance. There is a little extra work to do there, and I would very much like to see the Intellectual Property Office gain some teeth.
I am going to close. We are nearly at the witching hour, but it is clear that AI developers are seeking to use their lobbying clout—the noble Baroness, Lady Kidron, mentioned the Kool-Aid—to persuade the Government that new copyright law is required. Instead, these amendments would clarify that UK copyright law applies to gen AI developers. The creative industries, and noble Lords from across the House as their supporters, will rally around these amendments and vigorously oppose government plans for a new text and data-mining exception.
My Lords, if I may just interject, I have seen this happen elsewhere, not just in the Horizon scandal. Several years ago, the banks were saying that you could not possibly find out someone’s PIN and were therefore refusing to refund people who had had stuff stolen from them. It was not until the late Professor Ross Anderson, of the computer science department at Cambridge University, proved that they had been misidentifying to the courts which counter was being read, and explained exactly how you could get the system to default back to a different set of counters, that the banks eventually had to give way. But they went on lying to the courts for a long time. I am afraid that this is something that keeps happening again and again, and an amendment like this is essential for future justice for innocent people.
My Lords, it is a pity that this debate is taking place so late. I thank the noble Lord, Lord Arbuthnot, for his kind remarks, but my work ethic feels under considerable pressure at this time of night.
All I will say is that this is a much better amendment than the one that the noble Baroness, Lady Kidron, put forward for the Data Protection and Digital Information Bill, and I very strongly support it. Not only is this horrifying in the context of the past Horizon cases, but I read a report about the Capture software, which is likely to have created shortfalls that led to sub-postmasters being prosecuted as well. This is an ongoing issue. The Criminal Cases Review Commission is reviewing five Post Office convictions in which the Capture IT system could be a factor, so we cannot say that this is about just Horizon, as there are the many other cases that the noble Baroness cited.
We need to change this common law presumption even more in the face of a world in which AI use, with all its flaws and hallucinations, is becoming ever present, and we need to do it urgently.
My Lords, I thank the noble Baroness, Lady Kidron, for tabling her amendment. We understand its great intentions, which we believe are to prevent another scandal similar to that of Horizon and to protect innocent people from having to endure what thousands of postmasters have undergone and suffered.
However, while this amendment would make it easier to challenge evidence derived from, or produced by, a computer or computer system, we are concerned that, should it become law, it could be misused by defendants to challenge good evidence. Our fear is that, in determining the reliability of such evidence, we may create a battle of the expert witnesses. This would not only substantially slow down trials but result in higher costs. Litigation is already expensive, and we should not introduce additional costs to an already costly process unless absolutely necessary.
From our perspective, the underlying problem in the Horizon scandal was not that computer systems alone were critically wrong or that people alone were wrong, but that the two in combination drove the terrible outcomes that we have unfortunately seen. In many industries, regulations require firms to conduct formal systems validation, with serious repercussions and penalties should companies fail to do so. It seems to us that the disciplines of systems validation, as required in other industries, would be both a powerful protection and considerably less disruptive than potentially far-reaching changes to the law.
My Lords, having a system such as this would really focus the public sector on how we can generate more datasets. As I said earlier, education is an obvious one, but so is mobile phone data. All these companies have their licences. If a condition of the licence were that the data on how people move around the UK became a public asset, that would be hugely beneficial to policy formation. If we really understood how, why and when people move, we would make much better decisions. We could save ourselves huge amounts of money. This really ought to be a deep focus of government policy.
My Lords, I have far too little time to do justice to this subject. We on these Benches welcome this amendment. It is entirely consistent with the sovereign health fund proposed by Future Care Capital and, indeed, with the proposals from the Tony Blair Institute for Global Change on a similar concept called the national data trust. Indeed, this concept formed part of our Liberal Democrat manifesto at the last general election, so of course I support the amendment.
It would be very useful to hear more about the national data library, including on its purpose and operation, as the noble Baroness, Lady Kidron, said. I entirely agree with her that there is a great need for a sovereign cloud service or services. Indeed, the inability to guarantee that data on the cloud is held in this country is a real issue that has not yet been properly addressed.
My Lords, I thank the noble Baroness, Lady Kidron, for moving this amendment. As she rightly identified, the UK has a number of publicly held data assets, many of which contain extremely valuable information. This data—I flag, by way of an example, NHS data specifically—could be extremely valuable to certain organisations, such as pharmaceutical companies.
We are drawn to the idea of licensing such data—indeed, we believe that we could charge an extremely good price—but we have a number of concerns. Most notably, what additional safeguards would be required, given its sensitivity? What would be the limits and extent of the licensing agreement? Would this status close off other routes to monetising the data? Would other public sector bodies be able to use the data for free? Can this not already be done without the amendment?
Although His Majesty’s Official Opposition of course recognise the wish to ensure that the UK taxpayer gets a fair return on our information assets held by public bodies and arm’s-length organisations, and we certainly agree that we need to look at licensing, we are not yet sure that this amendment is either necessary or sufficient. We once again thank the noble Baroness, Lady Kidron, for moving it. We look forward to hearing both her and the Minister’s thoughts on the matter.
My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The reason for this group of amendments is to try to get an increased focus on the range of issues they touch on.
I turn to Amendment 211B first. It seems at least curious to have a data Bill that does not talk about data centres in terms of their power usage, their environmental impact and the Government’s view of the current PUE (power usage effectiveness) standard. Is it a standard that they think gives the right measure of confidence to consumers and citizens across the country, in terms of how data centres are being operated and their impacts?
Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject, but you have to go only one or two steps back in any supply chain to find deep opacity. With this amendment, I am seeking more clarity on data supply chains and on the role of data across all supply chains. Through the combination of data and AI, we could potentially transform our supply chains in real time. That would give us so much more flexibility to secure economic and environmental benefits. I look forward to the Minister’s response.
I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into standards and approaches in this area, which could potentially form part of the conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.
I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.
Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.
My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on that. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes, the actions of the Government seem to go contrary to that. I support the noble Lord, Lord Holmes, in his call for essentially realising the benefits of AI while making sure that we maintain public trust.
My Lords, I thank my noble friend Lord Holmes of Richmond for tabling these amendments. As we all appreciate, taking stock of the effects of legislation is critical, as it allows us to see what has worked and what has not. Amendment 211B would require the Secretary of State to launch a consultation into the implications of the provisions of the Bill for the power usage and energy efficiency of data centres. His Majesty’s Official Opposition have no objection to the amendment’s aims, but we wonder to what extent it is actually possible. By what means or benchmark can we identify whether a spike in energy usage is specifically due to a provision of this legislation, rather than the result of some other factor? I should be most grateful if my noble friend could provide further detail on this matter in his closing speech.
Regarding Amendment 211C, we understand that much could be learned from a review of all data regulations and standards pertaining to the supply chains for financial, trade, and legal documents and products, although we wonder if this needs to happen the moment this Bill passes. Could this review not happen at any stage? By all means, let us do it sooner rather than later, but is it necessary to set a date in statute?
Moving on to Amendment 211D, we should certainly look to regulate the AI large language model sector to ensure that there are standards for the input and output of data for LLMs. However, this must be done in a way that does not stifle growth in this emerging industry.
Finally, we have some concerns about Amendment 211E. A national consultation on the use of individuals’ data is perhaps just too broad.
My Lords, listening to the noble Lord, Lord Lucas, is often an education, and today is no exception. I had no idea what local environmental records centres were, so I shall be very interested to hear what the Minister has to say in response.
My Lords, I thank my noble friend Lord Lucas for tabling Amendment 211F and all noble Lords for their brief contributions to this group.
Amendment 211F would ensure that all the biodiversity data collected by or in connection with government is collated in local environmental records centres, so that records are as good as possible, and that the data is then used by or in connection with government, so that it is put to the best possible use.
The importance of sufficient and high-quality record collection cannot and must not be overstated. With this in mind, His Majesty’s Official Opposition support the sentiment of the amendment in my noble friend’s name. These Benches will always champion matters related to biodiversity and nature recovery. In fact, many of my noble friends have raised concerns about biodiversity in Committee debates in your Lordships’ House on the Crown Estate Bill, the Water (Special Measures) Bill and the Great British Energy Bill. Indeed, they have tabled amendments to ensure that matters related to biodiversity appear at the forefront of draft legislation.
With that in mind, I am grateful to my noble friend Lord Lucas for introducing provisions, via Amendment 211F, which would require any planning application involving biodiversity net gain to include a data search report from the relevant local environmental records centre. I trust that the Minister has listened to the concerns raised in the debate on this brief group. We must recognise the importance of good data collection and ensure that such data is used in the best possible way.