Committee (4th Day)
Relevant documents: 3rd Report from the Constitution Committee, 9th Report from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
15:45
The Deputy Chairman of Committees (Baroness Pitkeathley) (Lab)

My Lords, as usual, if there is a Division in the Chamber while we are sitting, the Committee will adjourn as soon as the Division Bells are rung and resume after 10 minutes.

Clause 90: Duties of the Commissioner in carrying out functions

Amendment 135 not moved.
Clause 90 agreed.
Amendment 135A not moved.
Clause 91: Codes of practice for the processing of personal data
Clause 91 agreed.
Clause 92: Codes of practice: panels and impact assessments
Amendment 136
Moved by
136: Clause 92, page 117, line 24, leave out from “of” to the end of line 27 and insert “—
(a) a code prepared under section 124A, or
(b) an amendment of such a code,
that is specified or described in the regulations.”
Member’s explanatory statement
New section 124B(11) of the Data Protection Act 2018 provides that the Information Commissioner’s duty to establish a panel to consider draft codes of practice may be disapplied or modified by regulations. This amendment ensures that regulations can make provision in relation to a particular code or amendment or a type of code or amendment.
Amendment 136 agreed.
Clause 92, as amended, agreed.
Amendment 137 not moved.
Amendment 138
Moved by
138: After Clause 92, insert the following new Clause—
“Code on processing personal data in education where it concerns a child or pupil
(1) The Information Commissioner must consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the United Kingdom, within the meaning of the Education Act 1996, the Education (Scotland) Act 1996, and the Education and Libraries (Northern Ireland) Order 1986; and on standards on the rights of those children as data subjects which are appropriate to children’s capacity and stage of education.
(2) For the purposes of subsection (1), the rights of data subjects must include—
(a) measures related to responsibilities of the controller, data protection by design and by default, and security of processing,
(b) safeguards and suitable measures with regard to automated decision-making, including profiling and restrictions,
(c) the rights of data subjects including to object to or restrict the processing of their personal data collected during their education, including any exemptions for research purposes, and
(d) matters related to the understanding and exercising of rights relating to personal data and the provision of education services.”
Member’s explanatory statement
This amendment requires the Commissioner to consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the UK.
Lord Clement-Jones (LD)

My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is far preferable in many ways to mine.

Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children up to the age of 18 for the purposes of UK law and Convention 108, and pupils as defined by the Education Act 1996, who may be up to the age of 19 or, with special educational needs, up to 25 in the education sector. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:

“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.


The duties of settings and data processors, and the rights appropriate to the stage of education and children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,

“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.

The educational setting is different from a purely commercial interaction, and not only because the data subjects are children. It is more complex because of the disempowered environment and its imbalance of power between the authority, the parents and the child. A further factor is that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:

“Parents have a prior right to choose the kind of education that shall be given to their children.”


A code is needed because explicit safeguards that the GDPR requires in several places were left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the necessary safeguards of GDPR Article 23(1) for children. Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. The clauses on additional further processing, or on changes to purpose limitation, are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of identifying personal data collected in educational settings only ever increases, while the protections are only ever reduced.

Obligations specific to children’s data, especially

“solely automated decision-making and profiling”

and exceptions, need to be consistent with clear safeguards by design where they restrict fundamental freedoms. What does that mean for children in practice, where teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, health and safety, among other standards proportionate to the risks of data processing and respecting the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended in the five rights in the Digital Futures Commission’s blueprint for educational data governance.

The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that

“children have the right to be heard and participate in decisions affecting them”.

They recognise that

“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”

Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.

Young people themselves were not invited to participate in the development of this Bill, and their views have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work with the DfE in 2024. The findings back up those of Defend Digital Me’s Survation poll in 2018 and show that parents do not know that the DfE already holds named pupil records without their knowledge or permission, and that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. It stated:

“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”


A code of practice is needed to explain the law and make it work as intended for everyone. The aim of a code of practice for educational settings would be that adherence creates a mechanism for controllers and processors to demonstrate compliance with the legislation or approved certification methods. It would give providers confidence in consistent and clear standards and would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice so that their rights are freely met by design and default.

Further, schools give children’s personal data to many commercial companies during a child’s education—not based on consent but assumed for the performance of a task carried out in the public interest. A code should clarify any boundaries of this lawful basis for commercial purposes, where it is an obligation on parents to provide the data and what this means for the child on reaching maturity or after leaving the educational setting.

Again, a code should help companies understand “data protection by design and default” in practice, what counts as a “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding the effect of the responsibilities of controllers and processors on the execution or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that serve adverts in use.

I hope that I have explained exactly why we believe that a code of conduct is required in educational settings. I beg to move.

Baroness Kidron (CB)

My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.

Both these amendments propose a code of practice to address the use of children’s data in the context of education; indeed, they have much in common. Having heard the noble Lord, Lord Clement-Jones, I also have much in common with what he said: I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools, and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.

Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.

Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.

Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to protect—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.

Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.

A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.

16:00
I have raised the issues of edtech—the lack of privacy, the lack of evidence for learning outcomes and, in particular, some very serious known problems with safeguarding tech—with the Department for Education several times and with several Ministers. Each meeting is met with a level of shock at the evidence I produce and a determination to act, but then the department decides that providing schools with more guidance is the answer: guidance on data protection, guidance on AI, guidance on safeguarding for teachers and schools to understand and implement. There is nothing for the regulator, nothing for the companies and nothing that responds to the well-established fact that products need to be designed for privacy and safety by default. Given the known power imbalance between a company such as Microsoft or Google and a school DPO, or the skills and transparency gap between a product developer and a school safeguarding lead, heaping more burden and responsibility on teachers, rather than using the tools of good government—law, regulation, certification and procurement power—to foster ethical innovation is, I think, a failure of common sense if not leadership.
For example, many schools in East Anglia were recently persuaded to purchase costly visitor management software with high recurring annual subscription fees as a substitute for visitor registration books which, the company suggested, did not comply with GDPR. This expensive and unnecessary system includes biometric storage of visitors’ facial images, which raises questions of consent. I have described to the House before how a similar system, trained on white faces, was unable to take a photograph because it did not recognise a black visitor as human. The waterfall of implications is extensive: for privacy, for fairness and for school budgets. A code of minimum standards for management products would avoid that.
Similarly, a code would bring clarity about how to handle and share student data. Between 2018 and 2020, the Education and Skills Funding Agency permitted access to the Learning Records Service database of some 28 million students. The data was used to build an age-verification system that was offered to online gambling companies. Research by UCL suggests it resulted in targeted gambling adverts:
“Early evidence from our study indicates evidence of participants creating gambling accounts while underage, with some spending up to £400 on these platforms”.
This is among the more egregious examples, but it is by no means the only one. A code could help us deal with that.
Similarly, a code could bring clarity to research. The Minister suggested last week that those objecting to the Government’s broadening of “scientific research” did not understand the role of research. I dispute that, and I look forward to her letter saying whether or not making a product more addictive to children could reasonably be said to be scientific, given that it involves A/B testing of children at scale. The code suggested by this amendment would clarify the distinction between research and product development in edtech by outlining when research ethics should apply and by delineating institutional responsibilities when engaging in collaborative projects.
Those of us who support the introduction of technology but want it to be mindful of rights holders are often cast in the role of tech detractors, but it is a mischaracterisation. We simply want to create a fairer and more equitable set of arrangements to protect human vulnerability, in this case, of children; or to respond to institutional struggle, in this case of schools; and to protect against commercially predatory behaviour, in this case of more than a few edtech companies. An edtech code would stop the muddle of research, social provision and commercial exploitation that happens now in our schools with no rules attached.
I also believe that having an edtech code would likely give birth to an industry of standards and certification schemes, as in so many areas from wifi protocols to the strength of our car windscreens. Perhaps most important of all, it would give a basis for DfE procurement. School communities are at once furious at and overwhelmed by the number of duties foisted on them. Procurement standards would free up their time and ease their anxiety. The misery of realising that the state-of-the-art filtering and monitoring system a school bought at great expense last year is no longer fit for purpose is quite devastating, and I have seen it repeatedly.
In fact, almost 50% of school monitoring and filtering services now cannot recognise harmful content from gen AI, and some services make it possible to turn off the filters for illegal content when that should be prohibited, not a question of choice. I have raised this with Ministers, but in spite of my entreaties—and those of Judy and Andy Thomas, whose daughter Frankie took her own life after accessing pro-suicide content on a school iPad because the filter was not on—the department continues to stonewall. A code that covered all edtech would give safeguarding teams confidence in the products they buy and the protocols to use them.
I have run out of time, but I say finally that the edtech code must cover early learning. The early learning communities are
“dismayed that nobody is advocating for the needs of the youngest and most vulnerable children”.
A group of 55 early years professionals wrote to me to say that
“it is alarming to many early childhood development experts, who are left confused and frustrated that the DFE have opted not to include Online safety as a statutory reference in the Early Years Statutory Framework despite our repeated representations”.
A code would help everybody.
Lord Knight of Weymouth (Lab)

My Lords, I was unsure whether to support Amendment 141, let alone speak to it, simply because I have a number of interests in this area and I should be clear about those. I chair Century-Tech Ltd, which is an AI edtech company; I am on the board of Educate Ventures Research Ltd, which offers advice to educators and schools on the use of AI in education; and I am a trustee of the Good Future Foundation, which does something similar.

I start by reminding the Committee of some of the benefits of technology and AI for education, so that there is a balance both in my speech and in the debate. Exciting practice is already taking place in the area of flipped learning, for example, where—putting issues of the digital divide to one side—in those classes and communities where there is good access to technology at home, the instructional element of learning can take place at home and school becomes a much more profoundly human endeavour, with teachers being able to save the time spent on the instructional element of teaching to bring that learning to life. I have some issues with AI in the world of tutoring in certain circumstances, but some of that can be very helpful in respect of flipped learning.

Project-based learning also becomes much more possible. It is very hard to teach, but becomes much more feasible when AI tools are used to help link what is being learned in projects through to the curriculum. Teacher time can be saved and, by taking care of a lot of administrative tasks through AI, we can in turn make a significant contribution to the teacher retention crisis that is currently bedevilling our schools. There are novel assessment methods that can now be developed using AI, in particular making the traditional assessment method of the viva much more affordable and reliable. It is hard to use AI to cheat if you are being assessed orally.

Finally, an important element is preparation for work: if we want these young people to be able to leave school and thrive in a labour market where they must be able to collaborate effectively with machines, we need them to be able to experience that in a responsible and taught fashion in school.

However, dystopian issues can arise from an overdependence on technology and from some of the potential impacts of using AI in education, too. I mentioned the digital divide—the 7.5 million families in this country who are not connected to the internet or confident in using it—and we discovered during Covid the device and data poverty that exists in this country. There is a possibility that poorer kids end up being taught by machines and not by human teachers at all. There is a danger that we do not shift our schools away from the slightly Victorian system that we have at the moment, which the noble Baroness, Lady Kidron, referenced at Second Reading. If we do not, we will end up with our children being outcompeted by machines. That overreliance on AI could also end up as privatisation by stealth because, if all the AI, technology and data are held by the private sector, and we are dependent on it, we will be beholden to the private sector however much we believe in the importance of the public good in our schools.

There are also problems of system design; I mentioned the Victorian system. I am hopeful that the curriculum and assessment review and the Children’s Wellbeing and Schools Bill that was published this week will help us. Whichever direction that review and those reforms take, we can be confident that edtech will respond. That is what it does; it responds to whatever regulation we pass, including in this Bill, over time and to whatever changes take place in the education system.

But tech needs data and it needs diversity of data. There is a danger that, if we close off access to data in this country, we will all end up using lots of AI that has been developed by using Chinese data, where they do not have the same misgivings about privacy, sharing each other’s data and acquiring data. We have to find a regime that works.

I do a bunch of work in international schooling as chair of COBIS—the Council of British International Schools—and I know of one large international school group, which I do not advise, that has done a deal with Microsoft around sharing all its pupil data, so that it can be used for Copilot. Obviously, Microsoft has a considerable interest in OpenAI, and we do not know exactly where that data is going. That points to some of the concerns that the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, have talked about.

During Covid, schools were strongly encouraged by the then Government to use either Google Classroom or Microsoft 365. Essentially, everyone was given a binary choice, and lots of data was therefore captured by those two large American corporations, which assisted them to develop further products. Any British alternative was, in essence, cut out, so we have good reason to be concerned in this area. That is why in the end I added my name and support to Amendment 141 in the name of the noble Baroness, Lady Kidron.

Children need privacy and they need digital rights. At the moment, those are exercised through parental consent for the use of these platforms and the capture of data, but I think it would be helpful to put that in a codified form, so that all those concerns have some sense of security about the regimes around which this works.

Ever since the abolition of Becta back in 2010, school leaders have been missing advice. Becta advice was used around the globe, as it was the authority on what works in technology and education. Sadly, the coalition got rid of it, and school leaders are now operating kind of blindfolded. We have 25,000 different school leaders buying technology, and very few of them really know what they are doing when faced with slick salespeople. Giving them some protection with a code would help their procurement.

The proof of the pudding will of course be in the eating—in the detail of the code—but I urge my noble friend the Minister to reflect carefully on the need for this, to talk to the DfE about it and to try to get some agreement. The DfE itself does not have the greatest track record on data and data protection. It has got into trouble with the ICO on more than one occasion.

My final cautionary tale, thanks to Defend Digital Me, is on the national pupil database, which was agreed in 2002 on the basis that children’s data would be kept private, protected and used only for research purposes—all the things that we are hearing in the debates on this Bill. Ten years later, that was all changed and 2,500 data-sharing arrangements followed that use that data, including for universal credit fraud detection. When parents allow their children’s data to be shared, they do not expect it to be used, down the line, to check universal credit entitlement. I do not think that was in the terms and conditions. There is an important issue here, and I hope that the Government are listening so that we make some progress.

16:15
Lord Russell of Liverpool (CB)

I shall speak very briefly, because the previous three speakers have covered the ground extremely well and made some extremely powerful arguments.

The noble Baroness, Lady Kidron, put her finger on it. The default position of departments such as the DfE, if they recognise there is a problem, is to issue guidance. Schools are drowning in guidance. If you talk to any headmaster or headmistress, or to the staff in charge of technology who are trying to keep on top of it, they will tell you they are drowning in guidance. They are basically flying blind when being asked to take some quite major decisions, whether about purchasing, about the safeguards around usage or about measuring the effectiveness of some of the educational technology skills that are being acquired.

There is a significant difference between guidance and a clear and concrete code. We were talking the other day, on another group, about the need to have guardrails, boundaries and clarity. We need clarity for schools and for the educational technology companies themselves to know precisely what they can and cannot do. We come back again to the issue of the necessity of measuring outcomes, not just processes and inputs, because they are constantly changing. It is very important for the companies themselves to have clear guardrails.

The research to which the noble Baroness, Lady Kidron, referred, which is being done by a variety of organisations, found problems in the areas that we are talking about in this country, the United States, Iceland, Denmark, Sweden, the Netherlands, Germany and France—and that is just scratching the surface. Things are moving very quickly and AI is accelerating that even more. With a code you are drawing a line in the sand and declaring very clearly what you expect and do not expect, what is permissible and not permissible. Guidance is simply not sufficient.

Lord Kirkhope of Harrogate (Con)

My Lords, I make a brief intervention. I am not against these amendments—they are very useful in the context of the Bill. However, I am reflecting on the fact that, when we drafted GDPR, we took a six-year process and failed in the course of doing so to really accommodate AI, which keeps popping up every so often in this Bill. Every part of every amendment seems to have a new subsection referring to automated decisions or to AI generally.

Obviously, we are moving on to have legislation in due course on AI, and I am sure that a number of pieces of legislation, no doubt including this one, will be able to be used as part of our overall package when we deal with the regulation of AI. However, although it is true that the UK GDPR gives, in theory, a higher standard of protection for children, it is important to consider that, in the context of AI, the protections that we need will have to be much greater—we know that. But if there is going to be a code of practice for children and educational areas, we need also to consider vulnerable and disabled people and other categories of people who are equally entitled to help, particularly with regard to the AI elements. That is going to be very difficult. Most adults whom I know know less about AI than do children approaching the age of 18, who are much more knowledgeable. They are also more knowledgeable about the restrictions that will have to be put in place than are adults, who appear to be completely at sea, not even understanding what AI is about.

I make a precautionary point. With AI dotted all the way through this Bill, we should be very careful that, when we specify a particular element—in this case, for children—we remain aware of the need to have protection in place for other groups, particularly in the context of this Bill and, indeed, future legislation.

Lord Lucas (Con)

My Lords, I very much support the thrust of these amendments and what the noble Lord, Lord Knight, said in support of and in addition to them. I declare an interest as a current user of the national pupil database.

The proper codification of safeguards would be a huge help. As the noble Baroness, Lady Kidron, said, it would give us a foundation on which to build. I hope that, if they are going to go in this direction, the Government will take an immediate opportunity to do so because what we have here, albeit much more disorganised, is a data resource equivalent to what we have for the National Health Service. If we used all the data on children that these systems generate, we would find it much easier to know what works and in what circumstances, as well as how to keep improving our education system.

The fact that this data is tucked away in little silos—it is not shared and is not something that can be used on a national basis—is a great pity. If we have a national code as to how this data is handled, we enable something like the use of educational data in the way that the NHS proposes to use health data. Safeguards are needed on that level but the Government have a huge opportunity; I very much hope that it is one they will take.

Viscount Camrose (Con)

I start by thanking all noble Lords who spoke; I enjoyed the vivid examples that were shared by so many of them. I particularly enjoyed the comment from the noble Lord, Lord Russell, about the huge gulf between guidance, of which there is far too much, and a code that actually drives matters forward.

I will speak much more briefly because this ground has been well covered already. Both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, seek to introduce codes of practice to protect the data of children in education services. Amendment 138 in the name of the noble Lord seeks to introduce a code on processing personal data in education. This includes consultation for the creation of such a code—a highly important element because the safety of this data, as well as its eventual usage, is of course paramount. Amendment 141 in the name of the noble Baroness, Lady Kidron, also seeks to set out a code of practice to provide heightened protections for children in education.

Those amendments are absolutely right to include consultation. This is a particularly important area of legislation, and it is important that a code does not restrict what schools can do with their data to improve the quality and productivity of their work. I was very appreciative of the words of the noble Lord, Lord Knight, when he sketched out some of the possibilities of what becomes educationally possible when these technologies are wisely and safely used. With individual schools often responsible for the selection of technologies and their procurement, the landscape is—at the risk of understatement—often more complex than we would wish.

Alongside that, the importance of the AI Safety Institute’s role in consultation cannot be overstated. The way in which tech and AI have developed in recent years means that its expertise on how safely to provide AI to this particularly vulnerable group is invaluable.

I very much welcome the emphasis that these amendments place on protecting children’s data, particularly in the realm of education services. Schools are a safe place. That safety being jeopardised by the rapid evolution of technology that the law cannot keep pace with would, I think we can all agree, be unthinkable. As such, I hope that the Government will give careful consideration to the points raised as we move on to Report.

The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, Amendment 138 tabled by the noble Lord, Lord Clement-Jones, and Amendment 141, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, would both require the ICO to publish a code of practice for controllers and processors on the processing of personal data by educational technologies in schools.

I say at the outset that I welcome this debate and the contributions of noble Lords on this important issue. As various noble Lords have indicated, civil society organisations have also been contacting the Department for Science, Innovation and Technology and the Department for Education directly to highlight their concerns about this issue. It is a live issue.

I am grateful to my noble friend Lord Knight, who talked about some of the important and valuable contributions that technology can play in supporting children’s development and guiding teaching interventions. We have to get the balance right, but we understand and appreciate that schoolchildren, parents and schoolteachers must have the confidence to trust the way that services use children’s personal data. That is at the heart of this debate.

There is a lot of work going on, on this issue, some of which noble Lords have referred to. The Department for Education is already exploring ways to engage with the edtech market to reinforce the importance of evidence-based quality products and services in education. On my noble friend Lord Knight’s comments on AI, the Department for Education is developing a framework outlining safety expectations for AI products in education and creating resources for teachers and leaders on safe AI use.

I recognise why noble Lords consider that a dedicated ICO code of practice could help ensure that schools and edtech services are complying with data protection legislation. The Government are open-minded about exploring the merits of this further with the ICO, but it would be premature to include these requirements in the Bill. As I said, there is a great deal of work going on and the findings of the recent ICO audits of edtech service providers will help to inform whether a code of practice is necessary and what services should be in scope.

I hope that we will bear that in mind and engage on it. I would be happy to continue discussions with noble Lords, the ICO and colleagues at the Department for Education, outside of the Bill’s processes, about the possibility of future work on this, particularly as the Secretary of State has powers under the Data Protection Act 2018 to require the ICO to produce new statutory codes, as noble Lords know. Considering the explanation that I have given, I hope that the noble Lord, Lord Clement-Jones, will consider withdrawing his amendment at this stage.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for her response and all speakers in this debate. On the speech from the noble Lord, Lord Knight, I entirely agree with the Minister and the noble Viscount, Lord Camrose, that it is important to remind ourselves about the benefits that can be achieved by AI in schools. The noble Lord set out a number of those. The noble Lord, Lord Russell, also reminded us that this is not a purely domestic issue; it is international across the board.

However, all noble Lords reminded us of the disbenefits and risks. In fact, the noble Lord, Lord Knight, used the word “dystopian”, which was quite interesting, although he gets very close to science fiction sometimes. He said that

“we have good reason to be concerned”,

particularly because of issues such as the national pupil database, where the original purpose may not have been fulfilled and was, in many ways, changed. He gave an example of procurement during Covid, where the choice was either Google or Microsoft—Coke or Pepsi. That is an issue across the board in competition law, as well.

There are real issues here. The noble Lord, Lord Russell, put it very well when he said that there is any number of pieces of guidance for schools but it is important to have a code of conduct. We are all, I think, on the same page in trying to find—in the words of the noble Baroness, Lady Kidron—a fairer and more equitable set of arrangements for children in schools. We need to navigate our way through this issue; of course, organisations such as Defend Digital Me and 5rights are seriously working on it.

16:30
I welcome what the Minister had to say. She said that this is a welcome debate on a live issue and that there is a great deal of work happening in the DfE, which is working on a framework outlining expectations. Are we a gnat’s whisker away from a code of conduct? That was not entirely clear. She also said—this is always a bit of a red flag—that it is premature to start thinking about that in terms of this Bill, and that there is an ICO audit of edtech service providers.
I was a member of Sir Anthony Seldon’s Institute for Ethical AI in Education, whose advisory board I chaired. The noble Lord, Lord Knight, was an extremely valuable member of that advisory board but that was some years ago—back in 2019 or 2020, I think. We have not moved much further on the kinds of guidance that are needed in the world of AI and data in schools. The Minister may say that thinking about this is premature, but we need to ratchet up the speed if we are really going to grapple with this issue. Schools are already grappling with it: AI tools are now commonplace. We must seize this and we must make sure that there is a code on which schools can rely.
I turn to the words of the noble Baroness, Lady Kidron: products must be designed for privacy and security by default, so here we are addressing not only schools but those who supply these products. We must get the procurement right in all of this. There is, to some degree, a sense of acceptance that work is going on, but I very much hope that, as we go forward, the Minister can persuade us that we are going to press our foot on the accelerator in this respect. In the meantime, I beg leave to withdraw my amendment.
Amendment 138 withdrawn.
Amendments 139 to 141 not moved.
Clauses 93 and 94 agreed.
Clause 95: Notices from the Commissioner
Amendments 142 and 143 not moved.
Clause 95 agreed.
Amendments 144 and 144A not moved.
Clauses 96 to 100 agreed.
Clause 101: Annual report on regulatory action
Amendment 145 not moved.
Clause 101 agreed.
Clause 102 agreed.
Schedule 10 agreed.
Clause 103: Court procedure in connection with subject access requests
Amendments 146 to 150 not moved.
Clause 103 agreed.
Amendments 151 and 152 not moved.
Clause 104 agreed.
Amendment 153 not moved.
Clauses 105 to 107 agreed.
Amendments 154 to 156 not moved.
Amendment 156A
Moved by
156A: After Clause 107, insert the following new Clause—
“Data use: definition of unauthorised access to computer programs or data
In section 17 of the Computer Misuse Act 1990, at the end of subsection (5) insert—
“(c) they do not reasonably believe that the person entitled to control access of the kind in question to the program or data would have consented to that access if they had known about the access and the circumstances of it, including the reasons for seeking it, and
(d) they are not empowered by an enactment, by a rule of law, or by order of a court or tribunal to access of the kind in question to the program or data.””
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.

We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.

The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the Act—the statute when passed—that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in those intervening years; it needs, desperately and urgently, to be updated because it is currently putting every citizen in this nation at risk for want of being amended. This is the purpose of Amendments 156A and 156B.

The Computer Misuse Act 1990 is not only out of date but inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They oftentimes work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.

Let us take just two examples: vulnerability research, and threat intelligence assessment and analysis. Both could see a cybersecurity professional falling foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and the ability of the NCSC and the wider cybersecurity professional community to meet those threats.

These amendments, in essence, perform one simple but critical task: to afford a legal defence for legitimate cybersecurity activities. That is all, but it would have such a profound impact for those whom we have asked to keep us safe and for the safety they can thus deliver to every citizen in our society.

Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?

It is not time; it is well over time that these amendments become part of our law. If not now, when? If not these amendments, which amendments? If they do not accept these amendments, what will the Government say to all those people who will continue to be put in harm’s way for want of these protective provisions being passed? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.

Lord Clement-Jones (LD)

My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.

As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.

I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,

“a catalytic effect on innovation”

within the UK’s cybersecurity sector, which possesses “considerable growth potential”.

16:45
The current situation puts the UK at a disadvantage compared to countries such as France, Israel and the United States, which have already updated their legislation to include similar defences, allowing their cybersecurity industries to thrive. The absence of such a defence in the UK creates an uneven playing field and hinders the growth of the domestic cybersecurity sector. The noble Lord, Lord Holmes, has rightly mentioned the CyberUp campaign, which advocates for reforming the Act and emphasises the need to update the definitions of key provisions in the legislation. This would provide much greater clarity for researchers and ensure that legitimate cybersecurity activities are not unduly hampered by the fear of legal repercussions.
Despite ongoing discussions and consultations, progress towards amending the Act has been slow. The long-awaited review of the Act—which started in 2021—reported last year, and we have had a consultation which concluded this April. When will we see the Act amended? This is glacial progress on an issue important for innovation and growth. What is the hold-up? This inaction inhibits innovation in a sector crucial to national security and economic growth.
The call for reform is not limited to industry groups; many others, including legal experts, academics and Members of both Houses, have expressed support for updating the Act. This consensus underscores the widespread recognition of the Act’s inadequacy in addressing the current cyber threat landscape. As the noble Lord, Lord Holmes, mentioned, the need for these amendments, and the support for them, was highlighted by the National Cyber Security Centre in its recent annual review.
I believe the noble Lord, Lord Holmes, and the CyberUp campaign have made an overwhelming case for amending the Computer Misuse Act 1990. By agreeing to these, the Government could provide much-needed clarity and legal protection for cybersecurity professionals, enabling them to contribute effectively to the UK’s security and economic prosperity.
Lord Kirkhope of Harrogate (Con)

My Lords, following on from what I said on earlier amendments, this is worse than the situation that the noble Lord, Lord Clement-Jones, has just described. Indeed, I fully support the amendments of my noble friend Lord Holmes. However, this demonstrates, yet again, that unless we pull ourselves together, with better, smarter legislation that moves faster, we will never catch up with developments in technology and AI. That has been demonstrated dramatically by these amendments. My concern is that the Government move at the pace that government always moves at, but in this particular field that is not going to work. We are going to be disadvantaged and in serious trouble unless we can move a bit faster.

Lord Arbuthnot of Edrom (Con)

My Lords, I rise briefly but strongly to support my noble friend Lord Holmes. The CyberUp campaign has been banging this drum for a long time now. I remember taking part in the debates in another place on the Computer Misuse Act 34 years ago. It was the time of dial-up modems, fax machines and bulletin boards. This is the time to act, and it is the opportunity to do so.

Lord Clement-Jones (LD)

My Lords, we ought to take a moment to congratulate the noble Lord on having been made parliamentarian of the year as a result of his campaigning activities.

Lord Arbuthnot of Edrom (Con)

My Lords, it has taken 34 years.

Lord Bethell (Con)

My Lords, I rise to make a brief but emphatic comment from the health constituency. We in the NHS have been victims of appalling cyber-hacking. The pathology labs in south London were hacked and that cost many lives. It is an example of where the world is going in the future unless we act promptly. The emphatic call for quick action so that government keeps up with world changes is really well made. I ask the Minister to reflect on that.

Viscount Camrose (Con)

My Lords, I, too, shall speak very briefly, which will save valuable minutes in which I can order my CyberUp Christmas mug.

Amendments 156A and 156B add to the definition of unauthorised access so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for it, and where that person is not empowered to access it by an enactment. Amendment 156B introduces defences to this new charge. Given the amount of valuable personal data held by controllers as our lives have moved increasingly online—as many speakers in this debate have vividly brought out—there is absolutely clear merit not just in this idea but in the pace implied, which many noble Lords have called for. There is a need for real urgency here, and I look forward to hearing more detail from the Minister.

Baroness Jones of Whitchurch (Lab)

My Lords, I turn to Amendments 156A and 156B, tabled by the noble Lord, Lord Holmes. I understand the strength of feeling and the need to provide legal protections for legitimate cybersecurity activities. I agree with the noble Lord that the UK should have the right legislative framework to allow us to tackle the harms posed by cybercriminals. We have heard examples of some of those threats this afternoon.

I reassure the noble Lord that this Government are committed to ensuring that the Computer Misuse Act remains up to date and effective in tackling criminality. We will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies to consider whether there are workable proposals on this. The noble Lord will know that this is a complex and ongoing issue being considered as part of the review of the Computer Misuse Act being carried out by the Home Office. We are considering improved defences by engaging extensively with the cybersecurity industry, law enforcement agencies, prosecutors and system owners. However, engagement to date has not produced a consensus on the issue, even within the industry, and that is holding us back at this moment—but we are absolutely determined to move forward with this and to reach a consensus on the way forward.

I think the noble Lord, Lord Clement-Jones, said in the previous debate that the amendments were premature, and here that is certainly the case. The specific amendments that the noble Lord has tabled are premature, because we need a stronger consensus on the way forward, notwithstanding all the good reasons that noble Lords have given for why it is important that we have updated legislation. With these concerns and reasons in mind, I hope that the noble Lord will feel able to withdraw his amendment.

Lord Holmes of Richmond (Con)

Could the Minister say a few words on some of those points of discourse and non-consensus, to give the Committee some flavour of the type of issues where there is no consensus as well as the extent of the gap between some of those perspectives?

Lord Clement-Jones (LD)

Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful as well, in unpacking what were clearly extremely well-informed recommendations. It should, no doubt, be taken extremely seriously.

Baroness Jones of Whitchurch (Lab)

I can tell the noble Lord, Lord Holmes, that we published our analysis of the consultation responses to the previous Home Office investigation in November 2023, so all those mixed responses are on the record. It was therefore concluded by the Government that further work needed to be done on this. On my noble friend’s report, was there a government response?

Lord Vallance of Balham (Lab)

Yes, the Government accepted the recommendations in full.

Lord Clement-Jones (LD)

Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as Government Chief Scientific Adviser. It is not often that, within a short space of time, a Government are afforded the opportunity to bring much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.

There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were to act only in service of their much-spoken-of growth agenda, this would bring an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see if anything further can be done but, from my perspective and others’, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.

Amendment 156A withdrawn.
Amendments 156B and 157 not moved.
Schedule 11 agreed.
Clause 108 agreed.
Clause 109: Interpretation of the PEC Regulations
Amendment 158 not moved.
Clause 109 agreed.
Clauses 110 and 111 agreed.
Schedule 12: Storing information in the terminal equipment of a subscriber or user
Amendment 159 had been withdrawn from the Marshalled List.
Amendments 159A to 160 not moved.
Schedule 12 agreed.
Clauses 112 and 113 agreed.
Schedule 13 agreed.
Clause 114 agreed.
Amendments 161 and 162 not moved.
Clause 115 agreed.
Schedule 14: The Information Commission
Amendments 163 to 192 not moved.
Schedule 14 agreed.
Clauses 116 to 119 agreed.
Schedule 15: Information standards for health and adult social care in England
Amendments 193 to 195 not moved.
Schedule 15 agreed.
Clause 120 agreed.
Schedule 16 agreed.
17:00
Clauses 121 and 122 agreed.
Amendment 196 not moved.
Clause 123: Information for research about online safety matters
Amendment 197
Moved by
197: Clause 123, page 153, line 6, leave out “may by regulations” and insert “must, as soon as reasonably practicable and no later than 12 months after the day on which this Act is passed, make and lay regulations to”
Member’s explanatory statement
This amendment removes the Secretary of State’s discretion on whether to lay regulations under Clause 123 and sets a time limit for laying them before Parliament.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I shall also speak to Amendment 198 in my name and register my support for the amendments in the name of the noble Lord, Lord Bethell, to which I have added my name. Independent research access is a very welcome addition to the Bill by the Government. It was a key recommendation of the pre-legislative scrutiny committee on the Online Safety Bill in 2021 and I know that I speak for many colleagues in the academic field, as well as many civil society organisations, who are delighted by its swift and definitive inclusion in the Bill.

The objective of these amendments is not to derail the Government’s plans, but rather to ensure that they happen and to make the regime work for children and the UK’s world-class academic institutions and stellar civil society organisations, ensuring that we can all do high-quality research about emergent threats to children and society more broadly.

Amendment 197 would ensure that the provisions in Clause 123 are acted on by removing the Government’s discretion as to whether or not they introduce regulations. It would also impose a deadline of 12 months for the Government to do so. I have said this before, but I have learnt the hard way that good intentions and warm words from the Dispatch Box are a poor substitute for clear provisions in law. A quick search of the Bill reveals that there are 119 uses of the word “must” and 262 uses of the word “may”. Clearly, they are being used to create different obligations or expectations. The Minister may say that this amendment is not needed and that, for all intents and purposes, we can take the word “may” as a “must” or a “will”, but I would prefer to see it in black and white. In fact, if the Government have reserved discretion on this point, I would like to understand exactly what that means for research.

Amendment 198 seeks to ensure that the regulations will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users including children. We have already discussed the fact that online harms are not experienced equally by users: those who are most vulnerable offline are often the most vulnerable online. In an earlier debate, I talked about the frustrations experienced when tech companies do not report data according to age groups. In failing to do so, it is possible to hide the reality that children are disproportionately impacted by certain risks and harms. This amendment would ensure that children and other vulnerable groups can be studied in isolation, rather than leaving independent researchers to pick through generalised datasets to uncover where harm is amplified and for whom.

I will leave the noble Lord, Lord Bethell, to explain his amendments, but I will just say why it is so important that we have a clear path to researcher access. It is fundamental to the success of the online safety regime.

Many will remember Frances Haugen, the Facebook whistleblower, who revealed the extent to which Meta knew, through its own detailed internal research, how harmful its platforms actually are to young people. Meta’s own research showed that:

“We make body image issues worse for one in three girls”.


Some 32% of teen girls said that, when they have felt bad about their bodies, Instagram has made them feel worse. Were it not for a whistleblower, this research would never have been made public.

After a series of evidence disclosures to US courts as a result of the legal action by attorneys-general at state level, we have heard whistleblowers suggest, in evidence given to the EU, that there will be a new culture in some Silicon Valley firms—no research and no emails. If you have something to say, you will have to say it in person so that it cannot be used against them in court. The irony of that is palpable given the struggle that we are having about user privacy, but it points to the need for our research regime to be watertight. If the companies are not looking at the impact of their own services, we must. I hope that the Government continue their leadership on this issue and accept the amendments in the spirit in which they are put forward.

I have another point that I want the Minister to clarify. I apologise, because I raised this in a private meeting but I have forgotten the answer. Given the number of regulatory investigations, proceedings and civil litigations in which tech companies are engaged, I would like some comfort about the legal exemption in these clauses. I want to understand whether it applies only to advice from and between lawyers or exempts data that may negatively impact companies’ defence or surface evidence of safety failures or deficiencies. The best way that I have of explaining my concern is: if it is habitual for tech companies to cc a lawyer in all their communications on product safety, trust and safety, and so on, would that give them legal privilege?

Finally, I support the noble Lord, Lord Clement-Jones, in his desire for a definition of independent researchers. I would be interested to hear what the Minister has to say on that. I beg to move.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

My Lords, I will speak to my Amendments 198A and 198C to 198F. I also support Amendments 197, 198 and 198B, to which I have added my name, all of which address the issue of data for researchers.

As was put very thoughtfully by the noble Baroness, Lady Kidron, platforms are not making decisions about their services with due regard to product safety or with independent oversight. Ofcom’s work enforcing the Online Safety Act will, in some part, shift matters significantly towards accountability, but it makes no provision at the moment for researchers’ data access, despite civil society and academic researchers being at the forefront of highlighting online harms for a decade. The anecdotes that the noble Baroness just gave were very powerful testimony to the importance of that. We are, in fact, flying completely blind, making policy and, in this Room, legislation without data, facts and insight about the performance of the algorithms that we seek to address. Were it not for the whistleblowers, we would not have anything to go on, and we cannot rely on whistleblowers to guide our hands.

Rectifying this omission is in the Bill, and I am enormously grateful to the Minister and to my noble friend Lord Camrose for their roles in putting it there. It is particularly important because the situation with data for researchers has deteriorated considerably, even in the last 18 months—with Meta shutting CrowdTangle and X restricting researchers’ access to its API. The noble Baroness, Lady Kidron, spoke about what the whistleblowers think, and they think that this is going to get a lot worse in the future.

I welcome the inclusion of these provisions in the Bill. They will be totally transformational to this sector, bringing a level of access to serious analysts and academics, so we can better understand the impact of the digital world, for both good and bad. A good example of the importance of robust research to inform policy-making was the Secretary of State’s recent announcement that the Government were launching a

“research project to explore the impact of social media on young people’s wellbeing and mental health”.—[Official Report, Commons, 20/11/24; col. 250.]

That project will not be very effective if the researchers cannot access the data, so I very much hope that these provisions will be in force before the Government start spending money on it.

To have the desired effect, we need to ensure that the data for researchers regime, as described in the Bill, is truly effective and cannot easily be brushed off. That is why the Government need to accept the amendments in this group: to bring some clarity and to close loopholes in the scheme as it is outlined in the Bill.

I will briefly summarise the provisions in the amendments in my name. First, we need to make researcher access regulations enforceable in the same way as other requirements in the Online Safety Act. The enforcement provisions in that Act were strengthened considerably as it passed through this House, and I believe that the measures for data for researchers need to be given the same rocket boosters. Amendment 198D will mean that regulated services are required to adhere to the regime, and it will give Ofcom the power to take proper remedial action if regulated services are obfuscating or non-compliant.

Secondly, we need to ensure that any contractual provision of use, such as a platform’s terms of service, is unenforceable if it would prevent

“research into online safety matters”,

as defined in the regulations. This is an important loophole that needs to be closed. It will protect UK researchers carrying out public interest research from nefarious litigation over terms of service violations as platforms seek to obfuscate access to data. We have seen this practice in other areas.

Thirdly, we need to clarify that researchers carrying out applicable research into online safety matters in the UK will be able to access information under the regime, regardless of where they are located. This is a basic point. Amendment 198E would bring the regime in line with the Digital Services Act of the EU and allow the world’s best researchers to study potential harm to UK users.

Ensuring robust researcher access to data contributes to a great ecosystem of investigation and scrutiny that will help to ensure the effective application of the law, while also guarding against overreach in moderating speech. It is time to back UK civil society and academic researchers to ensure that policy-making and regulatory enforcement are as informed as possible. That is why I ask the Minister to support these measures.

Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- Hansard - - - Excerpts

My Lords, I will speak briefly. I added my name in support of Amendments 197 and 198, tabled by the noble Baroness, Lady Kidron. We do not need to rehearse the arguments as to why children are a distinct group who need to be looked at in a distinctive way, so I will not repeat those arguments.

I turn to the excellent points made in the amendments in the name of the noble Lord, Lord Bethell. Data access for researchers is fundamental. The problem with statutory bodies, regulators and departments of state is that they are not designed and set up to be experts in researching some of the more arcane areas in which these algorithms are developed. This is leading-edge stuff. The employees in these platforms—the people who are designing and tweaking these very clever algorithms—are coming from precisely the academic and research institutions that are best placed to go into those companies and find out what they are doing. In many cases, it is their own graduates and PhDs who are doing it. They are the best qualified people to look at what is going on, because they will understand what is going on. If somebody tries to obfuscate, they will see through them immediately, because they can understand that highly sophisticated language.

If we do not allow this, we will be in the deeply uncomfortable position of relying on brave people such as Frances Haugen to run the huge reputational, employability and financial risks of becoming a whistleblower. A whistleblower who takes on one of those huge platforms that has been employing them is a very brave person indeed. I would feel distinctly uncomfortable if I thought that we were trying to guard our citizens, and particularly our children, against what some of these algorithms are trying to do by relying on the good wishes and chances of a whistleblower showing us what was going on. I support all these amendments very strongly.

17:15
Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- Hansard - - - Excerpts

My Lords, I shall speak very briefly. I have a great deal of agreement with what the noble Baroness, Lady Kidron, the noble Lord, Lord Russell, and my noble friend Lord Bethell have said. I am rising to nitpick; I apologise for that, but I suppose that is what Committee is for.

The final line of proposed new subsection (da), to be inserted by Amendment 198, refers to

“different characteristics including gender, race, ethnicity, disability, sexuality, gender”.

On our first day in Committee, I raised the importance of the issue of sex, which is different from gender or sexuality. We need to make sure that we get the wording of this amendment, if it were to be accepted by the Government, absolutely right.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

My Lords, I shall also speak extremely briefly, as one of the three veterans of the Joint Committee present in Committee today, to reinforce my support for these amendments. The Government should be congratulated on Clause 123. It is welcome to see this movement but we want to see this done quickly. We want to ensure that it is properly enforceable, that terms of service cannot be used to obstruct access to researchers, as the noble Lord, Lord Bethell, said, and that there is proper global access by researchers, because, of course, these are global tech companies and UK users need to be protected through transparency. It is notable that, in the government consultation on copyright and AI published yesterday, transparency is a core principle of what the Government are arguing for. It is this transparency that we need in this context, through independent researchers. I strongly commend these amendments to the Minister.

Earl of Erroll Portrait The Earl of Erroll (CB)
- Hansard - - - Excerpts

My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.

I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. It was never done properly, in my opinion, in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out the will of Parliament as expressed in the legislation it has passed. We have a problem, but I think that we can remedy it by putting “must” in the Bill. Then, we can hold the Executive to account.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.

Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the breadth of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.

The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.

My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to take advantage of this regime simply by describing themselves as “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.

Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would make researcher access enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.

I am still left with some curiosity about some of these amendments, so I will indicate where I have specific questions for those who have tabled them, and I hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and through what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.

I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.

I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?

Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank noble Lords who have welcomed the provisions in the Bill. I very much appreciate that we have taken on board the concerns that were raised in the debates on the previous legislation. I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Clement-Jones, for their amendments.

I will speak first to Amendment 197, tabled by the noble Baroness, Lady Kidron, which would compel the Secretary of State to create a framework and to do so within 12 months of the Act’s passage. I understand and share her desire to ensure that a framework allowing researcher access is put in place promptly. This is precisely why we brought forward this provision. I reassure her that the department will consult on the framework as soon as possible after the publication of Ofcom’s report.

Turning to Amendments 198 and 198B, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, respectively, Clause 123 provides the Secretary of State with the power to make regulations relating to researchers’ access to data. I can reassure noble Lords that it does not limit the regulations to the non-exhaustive list of examples provided. I agree that fair and proportionate criteria for who is considered a researcher are critical to the success of the future framework. I reassure noble Lords that in the provision as currently written the Secretary of State can include in the design of the framework the specific requirements that a person must meet to be considered a researcher.

Turning to Amendments 198A and 198D, tabled by the noble Lord, Lord Bethell, while I am sympathetic to his desire to provide a future framework with the robust enforcement powers of the OSA, I assure him that as the provision is written, the Secretary of State can already use the existing enforcement powers of the OSA to support a future framework. Furthermore, should the evidence suggest that additional or different measures would be more effective and appropriate, this provision allows the Secretary of State the flexibility to introduce them.

Turning next to Amendments 198C and 198E, tabled by the noble Lord, Lord Bethell, I understand the spirit of these amendments and note the importance of this issue, given the global nature of the online world. It is entirely reasonable to allow researchers who are not based in the UK to utilise our researcher access framework, as long as the subject of their research is the experience of UK users online. I reassure him that the provisions as drafted already allow the Secretary of State to make regulations permitting non-UK-based researchers to use the framework where appropriate. We plan to use the evidence gathered through our own means and through Ofcom’s report to set out who will be eligible to use the framework in the secondary legislation.

Finally, turning to Amendment 198F, I am aware of the concern that researchers have encountered blockages to conducting research, and I am sympathetic to the intentions behind the amendment. We must ensure that researchers can use the future framework without fear of legal action or other consequences. I am conscious that the noble Baroness, Lady Kidron, asked me a specific question about legal exemptions; I will write to her to make that answer much clearer. I reassure noble Lords that the Government are considering the specific issues that the noble Lord raises. For these reasons, I ask that the amendments not be pressed while the Government consider them further; I am, of course, happy to engage with noble Lords in the meantime.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I thank the Minister and everyone who spoke. I do not think I heard an answer to the may/must issue, and I need to say that just relying on Ofcom’s report to set the framework for the regime is not adequate, for two reasons. First, it is no news to the Committee that there is a considerable amount of disquiet about how the Online Safety Act has been reinterpreted in ways that Parliament did not intend. During the passage of this Bill, we are trying to be really clear on the face of the Bill—we will win some and we will lose some—about what Parliament’s intention is, so that the regulator really does what we agree, because that subject is currently quite contentious.

This is a new area, and a lot of the issues that the Minister and, indeed, the noble Viscount, Lord Camrose, raised are here to be sorted out to make sure that we understand collectively what it will look like. Having said that, I would like the Government to have heard that we do not wish to rest on the actions of whistleblowers, but we will be increasingly forced to do so if we do not have a good regime. We must understand the capacity of this sector to go to court: it is in court everywhere, all over the world, and it has deep pockets.

Finally, I welcome the nitpicking of the noble Lord, Lord Arbuthnot. Long may he nitpick. We will make sure that he is content before Report. With that, I beg leave to withdraw the amendment.

Amendment 197 withdrawn.
17:30
Amendments 198 to 198F not moved.
Clause 123 agreed.
Clauses 124 to 126 agreed.
Amendment 199
Moved by
199: After Clause 126, insert the following new Clause—
“Data risks from systemic competitors and hostile actors(1) The Secretary of State, in consultation with the Information Commissioner, must conduct a risk assessment on the data privacy risks associated with genomics and DNA companies that are headquartered in countries the government determines to be systemic competitors and hostile actors.(2) Within 12 months of the day on which this Act is passed, the Secretary of State must present a report on the risk assessment in subsection (1) to Parliament and consult the intelligence and security agencies on the findings, taking into account the need to not make public information critical to national defence or ongoing operations.(3) This risk assessment must evaluate—(a) the degree of access granted to foreign entities, particularly those linked to systemic competitors and hostile actors, to genomic and DNA data collected within the United Kingdom,(b) the potential for genomic and DNA data to be exfiltrated outside of the United Kingdom,(c) the potential misuse of United Kingdom genomic and DNA data for dual-use or nefarious purposes,(d) the potential for such data to be used in a manner that could compromise the privacy or security of United Kingdom citizens or undermine national security and strategic advantage.(4) The risk assessment must consider and include, but is not limited to—(a) an analysis of the data handling and storage practices of genomics companies that are based in countries designated as systemic competitors and hostile actors, (b) an independent audit, including digital and physical forensic examination, at any company site that could have access to United Kingdom genomics data, and(c) evidence of clear disclosure statements to consumers of products and services from genomics companies subject to data sharing requirements in the countries where they are headquartered.(5) This risk assessment must be conducted as frequently as deemed necessary by the Secretary of State or the Information Commissioner to address evolving threats and ensure continued protection of the genomics sector from entities controlled, directly or indirectly, by countries designated as systemic competitors and hostile actors.(6) The Secretary of State may issue directives or guidelines based on the findings of the risk assessment to ensure compliance by companies or personnel operating within the genomics sector in the United Kingdom, safeguarding against identified risks and vulnerabilities to data privacy.”Member’s explanatory statement
This amendment seeks to ensure sufficient scrutiny of emerging national security and data privacy risks related to advanced technology and areas of strategic interest for systemic competitors and hostile actors. It aims to inform the development of regulations or guidelines necessary to mitigate risks and protect the data privacy of UK citizens’ genomics data and the national interest. It seeks to ensure security experts can scrutinise malign entities and guide researchers, consumers, businesses, and public bodies.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, the UK is a world leader in genomics research. This research will no doubt result in many benefits, particularly in the healthcare space. However, genomics data can be, and increasingly is, exploited for deeply concerning purposes, including geostrategic ones.

Western intelligence agencies are reportedly becoming increasingly concerned about China using genomic data and biotechnology for military purposes. The Chinese Government have made it clear that genomics plays a key part in their military-civil fusion doctrine. The 13th five-year plan for military-civil fusion calls for the cross-pollination of military and civilian technology such as biotechnology. This statement, taken in conjunction with reports that the Beijing Genomics Institute—the BGI—in collaboration with the People’s Liberation Army, is looking to make ethnically Han Chinese soldiers less susceptible to altitude sickness, makes for worrying reading. Genetically engineered soldiers appear to be moving out of fiction and towards reality.

The global genomics industry has grown substantially as a result of the Covid-19 pandemic, and the gene giant BGI Group and its affiliate MGI Tech have acquired large databases of DNA. Further, I note that BGI has widespread links to the Chinese state: it operates the Chinese Government’s key laboratories and national gene bank, itself a vast repository of DNA data drawn from all over the world. A Reuters investigation found that a prenatal test, NIFTY, sold by BGI to expectant mothers, gathered millions of women’s DNA data. This prenatal test was developed in collaboration with the Chinese military.

For these reasons, I think we must become far more protective of genomic data gathered from our population. While many researchers use genomic data to find cures for terrible diseases, many others, I am afraid, would use it to do us harm. To this end, I have tabled Amendment 199 to require the Secretary of State and the Information Commissioner to conduct frequent risk assessments of the data privacy risks associated with genomics and DNA companies headquartered in countries that are systemic competitors or hostile actors. I believe this will go some way to preventing the transfer of genomic data out of the UK to countries such as China that may use it for military purposes. I beg to move.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

My Lords, I strongly support this amendment. As a former Minister, I was at the front line of genomic data and know how powerful it currently is and can be in the future. Having discussed this with the UK Biobank, I know that the issue of who stores and processes genomic data in the UK is a subject of huge and grave concern. I emphasise that the American Government have moved on this issue already and emphatically. There is the possibility that we will be left behind in global standards and will one day be an outlier if we do not close this important and strategically delicate loophole. For that reason, I strongly support this amendment.

Earl of Erroll Portrait The Earl of Erroll (CB)
- Hansard - - - Excerpts

My Lords, I was involved in an ethics committee that looked at genomics and cancer research some years ago, and this is very important. If research could be done on different genomic and racial types, it could be used against us adversely at some point. So there is a lot of sense in this.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank the noble Viscount, Lord Camrose, for moving this amendment, which raises this important question about our genomics databases, and for the disturbing examples that he has drawn to our attention. He is right that the opportunities from harnessing genomic data come with very real risks. This is why the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data. We plan to brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year. Following that, I look forward to engaging with the noble Viscount on its outcome and on how we intend to take these issues forward. As he says, this is a vital issue, but in the meantime I hope he is prepared to withdraw his amendment.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the Minister for her answer, and I very much accept her offer of engagement. I will make a few further brief comments about the importance of this amendment, as we go forward. I hope that other noble Lords will consider it carefully before Report.

I will set out a few reasons why I believe this amendment can benefit both the Bill and this country. The first is its scope. The amendment will allow the Secretary of State and the Information Commissioner to assess data security risks across the entirety of the genomic sector, covering consumers, businesses, citizens and researchers who may be partnering with state-linked genomics companies.

The second reason is urgency. DNA is regularly described as the “new gold” and it represents our most permanent identifier, revealing physical and mental characteristics, family medical history and susceptibility to diseases. Once it has been accessed, the damage from potential misuse cannot be reversed, and this places a premium on proactively scrutinising the potential risks to this data.

Thirdly, there are opportunities for global leadership. This amendment offers the UK an opportunity to take a world-leading role and become the first European country to take authoritative action to scrutinise data vulnerabilities in this area of critical technology. Scrutinising risks to UK genomic data security also provides a foundation to foster domestic genomics companies and solutions.

Fourthly, this amendment would align the UK with key security partners, particularly, as my noble friend Lord Bethell mentioned, the United States, which has already blacklisted certain genomics companies linked to China and taken steps to protect American citizens’ DNA from potential misuse.

The fifth and final reason is protection of citizens and consumers. This amendment would provide greater guidance and transparency to citizens and consumers whose DNA data is exposed to entities linked to systemic competitors. With all of that said, I thank noble Lords for their consideration and beg leave to withdraw my amendment.

Amendment 199 withdrawn.
Clauses 127 to 132 agreed.
Amendments 200 to 202 not moved.
Amendment 203
Moved by
203: After Clause 132, insert the following new Clause—
“Offence to use personal data or digital information to create digital models or files that facilitate the creation of AI- or computer-generated child sexual abuse material(1) A person commits an offence if they—(a) collect, scrape, possess, distribute or otherwise process personal data or digital information with the intention of using it, or attempting to use it, to create or train a digital model which enables the creation of AI- or computer-generated child sexual abuse material or priority illegal content;(b) use personal data or digital information to create, train or distribute or attempt to create, train or distribute a digital file or model that has been trained on child sexual abuse material or priority illegal content, or which enables the creation of AI- or computer-generated child sexual abuse material or priority illegal content;(c) collate, or attempt to collate, digital files or models based on personal data or digital information that, when combined, enable the creation of AI- or computer-generated child sexual abuse material or priority illegal content;(d) possess, or attempt to possess, a digital file or model based on personal data or digital information with the intention of using it to produce or gain access to AI- or computer-generated child sexual abuse material or priority illegal content.(2) For the purposes of this section, “AI- or computer-generated child sexual abuse material or priority illegal content” includes images, videos, audio including voice, chatbots, material generated by large language models, written text, computer files and avatars. (3) A person who commits an offence under subsection (1) is liable to the sentences set out in section 160 of the Criminal Justice Act 1988 (possession of indecent photograph of child) and section 6 of the Protection of Children Act 1978 (punishments) for the equivalent offences.(4) For the purposes of this section, “priority illegal content” is content that meets the definition of “priority illegal content” set out in section 59 of the Online Safety Act 2023.”Member's explanatory statement
It is illegal in the UK to possess or distribute child sexual abuse material including AI- or computer-generated child sexual abuse material. However, while the content is clearly covered by existing law, the mechanism that enables their creation – i.e. the files trained on or trained to create such material – is not. This amendment seeks to address that gap.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, Amendment 203 is in my name and the names of the noble Lords, Lord Bethell, Lord Stevenson and Lord Clement-Jones. I thank noble Lords wholeheartedly for their support for this measure through two versions of this Bill. I believe that I speak for all signatories in recognising the support of a huge number of colleagues in both Houses and all parties who have expressed their support for this amendment.

It is my understanding that we are going to hear good news from the Dispatch Box. In the event that I am wrong, I shall have more to say once we have heard from the Minister. In the meantime, I want to explain what the problem is that the amendment seeks to solve.

It is illegal in the UK to possess or distribute child sexual abuse material, including AI-generated or computer-generated child sexual abuse material. The laws that the police use to enforce against CSAM are Section 1 of the Protection of Children Act 1978 and Section 160 of the Criminal Justice Act 1988, both of which create offences in respect of indecent photographs or pseudo-photographs of a child. AI content depicting child sexual abuse is illegal under these laws, but creating and distributing the software models needed to generate them is not, which means that those building and distributing software that allows paedophiles to generate bespoke child sexual abuse material have operated with impunity.

There are many services that allow anyone to take any public image and put it in a false situation. Although I have argued elsewhere that AI images should carry a mark of provenance, these services are not the subject of this amendment. This amendment is laser focused on criminalising AI models that are trained on or trained to create child sexual abuse material. They are specific, specialist and currently beyond the reach of the police. The models blend images of children—known children, stock photos, images scraped from social media, school websites or synthetic, fabricated AI depictions of children—with existing CSAM or pornography, and they allow paedophiles to generate bespoke CSAM scenarios of unimaginable depravity, as they are unmitigated by any restrictions that organise the reality of the world. If someone can think, type or say it, they can make it so.

Many of the generative models are distributed for free, but more specialist models are provided on subscription for less than £50 per month. This payment provides any child sexual offender with the ability to generate limitless—and I do mean limitless—child sexual abuse images. But while the police can take action against those who possess those images, they are unable to take action against those who make it possible to do so: the means of production.

A surprising number of people think that AI abuse is a victimless crime, and I want to make it clear that it is not. First, who would be comfortable with the image of their child or grandchild or their neighbour’s child being used in this way? Anyone, adult or child, can appear in AI-generated CSAM. I am not going to say how it can be done, because I do not want my words to be a set of instructions on the public record—but the reality is, any one of us, woman or man, though 99% are women, boy or girl, though it is mostly girls, is a potential victim. If your image is used in this way, you are a victim; if you are forced to watch or copy such imagery, you are a victim; and if you are a child whose real-life abuse is not caught because you are lost in a sea of AI-generated material, you are a victim. Then there is the normalisation of sexual violence against children, which poisons relationships—intimate, familial, across generations, genders and sexes. This is not a victimless crime.

17:45
I have been aware of the industrial scale of this issue, in part because of the efforts of a specialist police unit that—day in and day out—occupies the synthetic worlds created to humiliate, objectify and abuse children. I have had the privilege of meeting many of the unit in person and a smaller group on many occasions. For obvious reasons, I do not want to name them, but I take this opportunity to thank them and recognise all on the front line of fighting against CSAM. It is an unbearably hard task.
Before I sit down, I have two brief points to make. First, although the proposed amendments are definitively focused on those who deliberately create child sexual abuse, I put on notice those companies and services that do not take sufficient care to prevent it happening accidentally. I know some image generator companies have gone out of their way to create guardrails and others have taken a “hurt first, fix later” approach. We have data law, we have the OSA, and I anticipate that in the new year we will have further offences, each of which will be robustly used to stop the careless creation of abuse. That should be the number one concern of GenAI companies.
Secondly, I am of course delighted to win a battle for children. I am happy to recognise that the previous Government promised it and the efforts of the noble Viscount, Lord Camrose, in agreeing to this at an earlier date. I also recognise the efforts of the civil servants in the Home Office and the Safeguarding Minister, Jess Phillips, all of whom have made considerable efforts.
Last Friday, however, we had a completely unacceptable answer from the Lords MoJ Minister on the related issue of non-consensual sexually explicit images and videos during a debate on the PMB of the noble Baroness, Lady Owen. I had written that line before the noble Baroness decided to lay some amendments that we will discuss in only a moment. I will let her explain her intentions, but I want to put on record my full support for her campaign, her Private Member’s Bill and her amendments, and for including them in today’s debate.
It should not be possible for the Home Office to manage this and for the MoJ not to manage it. We need a Government where all departments work on behalf of all victims. I will wait to hear what the Minister says, and I very much hope I can congratulate her when I stand up again. I beg to move.
Baroness Owen of Alderley Edge Portrait Baroness Owen of Alderley Edge (Con)
- Hansard - - - Excerpts

My Lords, I rise today in support of Amendment 203 in the name of the noble Baroness, Lady Kidron. I declare an interest as a recent guest of Google at its Future Forum policy conference. I apologise for not being able to make Second Reading and for not being present for my last amendment; as a newer Peer, I am very new to this and still learning as I go. I am very grateful to the noble Baroness, Lady Kidron, for stepping in.

I commend the wording of the noble Baroness’s amendment, which tackles the full process of training these models, from the collection of data or images to use as training data, all the way through to possessing a model. With these apps easily downloadable on app stores, there is a lack of friction in the process. This means that we have seen horrific cases of children using these apps in schools across the world with devastating consequences. In summer, I met the father of a little girl who had been bullied in this way and sadly took her own life.

I am very grateful to the noble Baroness, Lady Kidron, for this thoughtful and comprehensive amendment, which seeks to future-proof with its inclusion of avatars. We have already seen these threats evolving in the metaverse. I encourage the Government to adopt this amendment so that we can begin to see an end to this abusive market.

I turn to my Amendment 211G. I am very grateful to the noble Lords, Lord Clement-Jones and Lord Browne of Ladyton, and the noble Baroness, Lady Kidron, for putting their names to it. Noble Lords may recognise it from my Private Member’s Bill on non-consensual sexually explicit images and videos. I will keep my remarks brief as many of your Lordships were present on Friday.

The amendment seeks to create offences for the non-consensual creation of sexually explicit content and to close the gaps in the Sexual Offences Act. It is, vitally, consent-based, meaning that victims do not have to suffer the trauma of proving the motivation of their perpetrator. It includes solicitation, to prevent any creation laws being circumvented by asking those in other jurisdictions to create such content for you through the uploading of clothed images to forums. Finally, it includes forced deletion, so that victims can clearly see their rights to have the content destroyed from any devices or cloud-based programmes and do not have to live in fear that their perpetrator is still in possession of their content.

This amendment is inspired by the lived experience of victim survivors. The Government have repeatedly said that they are looking for the most suitable legislative vehicle to fulfil their commitment to criminalise the creation of sexually explicit deepfakes. It seems they did not think my Private Member’s Bill was the right vehicle, but it is my firm belief that the most appropriate legislative vehicle is the one that gets there quickest. I am hopeful that the Government will be more receptive to an amendment to their legislation, given the need urgently to tackle this rapidly proliferating form of abuse.

Amendment 211H addresses the problem of sexually explicit audio, which the noble Baroness, Lady Gohir, spoke about so movingly in Friday’s debate. We have seen satirical voice cloning, such as of Gareth Southgate at the 2024 Euros. However, the most state-of-the-art systems now require only around three seconds of voice audio data to create speech at parity with a human voice. This could be data from a short phone call or a TikTok video. As we reach the point where less data is required to create high-quality audio, this has the potential to be weaponised. There is a real risk that, if we do not future-proof against this while we have the opportunity, it could develop as rapidly as sexually explicit deepfake images have. We are already seeing signs of new sexually explicit audio online. Its ease of use, combined with its accessibility, could create a huge risk in future.

Henry Ajder, the researcher who pioneered the study of non-consensual deepfake image abuse, said:

“2024 has seen AI generated voice audio widely used in spreading political disinformation and new forms of fraud, but much less attention has been paid to its potential as a tool for digital sexual abuse”.


In his research in 2018, he observed several cases of online communities experimenting with voice-cloning capabilities, targeting celebrities to create non-consensual “synthetic phone sex” content. This Bill could be a key opportunity to future-proof against this problem before it becomes widespread.

Baroness Gohir Portrait Baroness Gohir (CB)
- Hansard - - - Excerpts

My Lords, I declare my interests as set out in the register, particularly as CEO of Muslim Women’s Network UK, which operates a national helpline. I also apologise for not being here at Second Reading, but I felt compelled to speak today after the noble Baroness, Lady Owen, put forward her amendments. Before I speak to them, I support all the amendments from the noble Baroness, Lady Kidron—everything she says is always very powerful.

The noble Baroness, Lady Owen, made her case powerfully today, as she did last week. I too spoke in that debate. We were disappointed across the House that the Government were not very supportive of the Bill, but they hinted that its amendments and recommendations could be integrated into another Bill. This Bill could be it.

I will focus my comments on audio recordings, which I raised last week. This element gets overlooked, because we tend to focus on sexually explicit images and video recordings. However, perpetrators will also record audio of sexual activities without consent and either share or threaten to share it. As the noble Baroness, Lady Owen, mentioned, people can create deepfakes very easily with new technologies. A person’s voice is recognisable to the people who know them, so this must be addressed and it can be in this Bill.

Perpetrators of intimate image and intimate audio abuse can instil fear, humiliate and make victims feel unsafe without even sharing the material or threatening to share it. They can manipulate and control their victims simply by making them aware that they have recorded or created these images and recordings.

The Muslim Women’s Network’s helpline has had women call to say that, when relationships have broken down, husbands and boyfriends have made secret audio recordings and then threatened them with those recordings. Sometimes, they have shared them online or with family members and friends. Just knowing that they possess these recordings makes these women feel very unsafe and live in fear. In some communities and cultures where people will be worried about honour-based abuse, women will be even more fearful of the repercussions of these audio recordings being shared.

Whether it is original audio or digitally created deepfake audio, the law needs to be amended to prevent this type of abuse. If the Labour Party and the Government are serious about halving abuse against women and girls, they must shut down every avenue of abuse and accept these amendments.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

My Lords, I will speak in support of Amendment 203, which I have signed, and Amendments 211G and 211H in my noble friend Lady Owen’s name.

At Second Reading, the mood of the House was to consider and support the enormous opportunity that comes from AI and to acknowledge the dangers of overregulation that might, somehow, smother this massive opportunity. I endorse that sentiment. However, Amendment 203 addresses computer-generated child sexual abuse material, which I regard as a red line that we should not cross. If we leave this amendment out of the Bill and cannot tackle this one massive issue of CSAM generated by AI, we will leave the whole question of the integrity and purpose of AI vulnerable to misuse by criminals and perverts.

The scale of the issue is already enormous. The Internet Watch Foundation found 275,000 webpages containing child sexual abuse content. On just one forum, 20,000 AI-generated images were posted in a single month, over 3,000 of which depicted criminal acts of child sexual abuse. This is not a hypothetical problem or some kind of visioneering or dystopian imagination; it is happening right now. There are offices filled with people generating this material for their pleasure and for commercial reasons. That is why it is urgent that we move immediately.

Any of us who has heard the testimony of the many victims of sexual abuse will realise that the experience creates lasting anxiety and gut-wrenching trauma. These are not just pictures or videos; they often represent real harm to real people. That is why urgency is so important and this amendment is so critical.

Shockingly, the explosion of this kind of material is enabled by publicly available tools, as the noble Baroness, Lady Kidron, pointed out. The case of Hugh Nelson is a very good example. He was sentenced to 18 years in prison for creating AI videos of children being physically and sexually abused. The tool he used was Daz 3D, AI software that any of us could access from this Room. It is inconceivable that this technology remains unregulated while being weaponised by people such as Hugh Nelson to inflict huge harm. Currently, our law focuses on the possession and distribution of CSAM but fails to address the mechanisms of its creation. That is a loophole and why I support these amendments. I do so for three key reasons.

First, Amendment 203 would criminalise the creation, training and distribution of AI models that can create CSAM. That would mean that Daz and other sites like it must introduce safety-by-design measures to stop their use for creating illegal content. That is not to smother the great and bountiful explosion of beneficial AI; it is to create the most basic guardrail that should be embedded in any of these dangerous tools.

18:00
Secondly, under the amendment it would become an offence to train models using CSAM or illegal content to generate images. These systems are trained on massive quantities of tagged images. This data is generally outsourced. AI training models are likely scraping data from the internet without authorisation or supervision. Protecting personal data is absolutely necessary to stop its misuse for creating deepfakes and other CSAM content, training AI models, or creating extreme content.
Thirdly, this amendment would make it an offence to possess digital files or AI models that are intended to produce CSAM. This measure will curb the spread of these tools and reduce the availability of such content.
Together, these measures reduce the ease with which people can currently abuse publicly available tools for their perverse sexual gratification or to destroy the reputation of others. It is no longer enough to focus solely on the content; we must also hold to account the platforms and the tools that enable this abuse. The amendment is meant to send a message to, and create legal jeopardy for, major corporations such as Microsoft, Google and AWS, so that they do not enable those who create this horrible content.
The recent debate on deepfakes, led by my noble friend Lady Owen, gave a very clear sense of where the mood of the House is. Urgency is imperative—the technology is moving more quickly than our legislative response. I hope the Minister will realise that this is an opportunity to set a new milestone for legislative responses to a new technological threat and seize it. The explosion of computer-generated CSAM is a pressing threat to our society, so supporting the amendment is a vital step towards safeguarding thousands more from online abuse.
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.

I hope that the Minister has similar agility in being able to readjust the Government’s position on this. It is right that this was an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should accept and grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it would specifically become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.

This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.

The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its report of July, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government regarding legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and enhance preventive measures against its creation and distribution. It specifically recommended:

“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.


The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.

Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.

We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.

This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.

I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.

Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.

Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.

That reservation comes in proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.

Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank the noble Baroness, Lady Kidron, for her Amendment 203. It goes without saying that the Government treat all child sexual abuse material with the utmost seriousness. I can therefore confirm to her and the Committee that the Government will bring forward legislative measures to address the issue in this Session and that the Home Office will make an announcement on this early in the new year.

On Amendments 211G and 211H, tabled by the noble Baroness, Lady Owen, the Government share concerns that more needs to be done to protect women from deepfake image abuse. This is why the Government committed in their manifesto to criminalise the creation of sexually explicit deepfake images of adults. I reassure the noble Baroness and the whole Committee that we will deliver on our manifesto commitment in this Session. The Government are fully committed to protecting the victims of tech-enabled sexual abuse. Tackling intimate audio would be a new area of law, but we will continue to keep the law in this area under review.

I also say to the noble Baroness that there is already a process under Section 153 of the Sentencing Act 2020 for the court to deprive a convicted offender of property, including images that have been used for the purpose of committing or facilitating any criminal offence. As well as images, that includes computers and mobile phones that the offender either used to commit intimate image offences or intended to use for that purpose in future. For those reasons and the reassurances I have given today, I hope that noble Lords will feel able to withdraw or not press their amendments.

18:15
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, first, I thank the speakers for what were really powerful and largely unequivocal contributions.

I am grateful to the Minister. I was expecting something a tiny bit more expansive, but I will take, on the record, that we are going to make it a new offence for a person to make, adapt, possess, supply or offer to supply a CSA image generator, including any service, program or information in electronic form that is made, or adapted for use, to create or facilitate the creation of CSA material. I am expecting something that covers all that and I am expecting it shortly, as the Minister said. I again thank the Safeguarding Minister, Jess Phillips, for her tremendous efforts, as well as some of the civil servants who helped make it leap from one Government to the next. We can be content with that.

I feel less comfortable about the Minister’s answer to the noble Baroness, Lady Owen. We, women victims, experience the gaps in the law. If there are gaps in the law, it is our job, in this Committee and in the other place, to fix them. We all want the same thing; I know the Minister well enough to know that she wants the same thing. So I am going to push back and say that I will support the noble Baroness, Lady Owen, in trying to bring this measure back through this Bill. I believe that the mood of the Committee is with her, so whatever mistakes there are on her part will be fixed before Report, because this is not something that can wait. Kids and women are being hurt.

We all want to celebrate the digital world. I was an early adopter. I had one of those cameras on my computer before anyone else I knew did, so I could not speak to anyone; there was no one to talk to. We want this world to be good. We are not saying something different. On behalf of the noble Baroness, Lady Owen, who is nodding, let me just say that we will come back to this issue. I thank the Minister for her assurance on Amendment 203 and beg leave to withdraw.

Amendment 203 withdrawn.
Amendment 204
Moved by
204: After Clause 132, insert the following new Clause—
“Compliance with UK copyright law by operators of web crawlers and general-purpose AI models(1) The Secretary of State must by regulations make provisions clarifying the steps the operators of web crawlers and general-purpose artificial intelligence (AI) models must take to comply with United Kingdom copyright law, including the Copyright, Designs and Patents Act 1988.(2) The provisions made under subsection (1) must apply if the products and services of such operators are marketed in the United Kingdom.(3) The provisions made under subsection (1) must apply to the entire lifecycle of a general-purpose AI model, including but not limited to—(a) pre-training,(b) fine tuning, and(c) grounding and retrieval-augmented generation.(4) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”Member’s explanatory statement
This amendment would require operators of internet scrapers and general-purpose AI models to comply with UK copyright law, and to abide by a set of procedures.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I am beginning to feel like the noble Lord, Lord Clement-Jones, but I reassure everyone that this is the last day of Committee.

I shall speak to the amendments in this group in my name and that of the noble Lords, Lord Stevenson—he is very sorry not to be in his place today—and Lord Clement-Jones, and my noble friend Lord Freyberg. I thank the News Media Association for its briefing and support. I also thank, for their wonderful and unlikely support, Sir Paul McCartney, Kate Mosse, Margaret Drabble and Richard Osman, alongside the many creative artists who have spoken, written and tweeted and are among the 37,000 people who signed a petition calling for swift action to protect their livelihoods.

I have already declared my interests for the Committee but I add, to be clear, that my husband is a writer of film, theatre and opera; and that, before I came to your Lordships’ House, I spent 30 years as a movie director. As such, I come from and live alongside a community for whom the unlicensed and illegal use of copyrighted content by generative AI developers is an existential issue. I am therefore proud to move and speak to amendments that would protect one of our most financially significant economic sectors, which contributes £126 billion in gross value added to UK GDP; employs 2.4 million people; and brings so much joy and understanding to the world.

Text and data mining without licence or permission is illegal in the UK, unless it is done specifically for research. This means that what we have witnessed over the past few years is intellectual property theft on a vast scale. Like many of the issues we have discussed in Committee, this wrongdoing has happened in plain sight of regulators and successive Governments. I am afraid that yesterday’s announcement of a consultation did not bring the relief the industry needs. As Saturday’s Times said,

“senior figures in the creative sector are scathing about the government plans”,

suggesting that the Secretary of State has drunk Silicon Valley’s “Kool-Aid” and that rights reservation is nonsense. An official at the technical briefing for the consultation said that

“rights reservation is a synonym for opt out”.

Should shopkeepers have to opt out of shoplifters? Should victims of violence have to opt out of attacks? Should those who use the internet for banking have to opt out of fraud? I could go on. I struggle to think of another situation where someone protected by law must proactively wrap it around themselves on an individual basis.

The value of our creative industries is not in question; nor is the devastation that they are experiencing as a result of non-payment for their IP. A recent report from the International Confederation of Societies of Authors and Composers, which represents more than 5 million creators worldwide, said that AI developers and providers anticipate the market for GAI music and audiovisual content increasing from €3 billion to €64 billion by 2028—much of it derived from the unlicensed reproduction of creators’ works, representing a transfer of economic value from creators to AI companies. Let there be no misunderstanding of the scale of the theft: we already know that the entire internet has been downloaded several times without the consent or financial participation of millions of copyright holders.

This transfer of economic value from writers, visual artists and composers across all formats and all genres to AI companies is not theoretical. It is straightforward: if you cannot get properly paid for your work, you cannot pay the rent or build a career. Nor should we be taken in by the “manufactured uncertainty” that Silicon Valley-funded gen AI firms and think tanks have sought to create around UK copyright law. Lobbyists and their mouthpieces, such as TechUK, speak of a lack of clarity—a narrative that may have led to Minister Chris Bryant claiming that the Government’s consultation was a “win-win”. However, I would like the Minister to explain where the uncertainty on who owns these copyrighted works lies. Also, where is the win for the creative industries in the government proposal, which in one fell swoop deprives artists of control and payment for their work—unless they actively wrap the law around them and say “no”—leaving them at the mercy of pirates and scrapers?

Last week, at a meeting in this House attended by a wide range of people, from individual artists to companies representing some of the biggest creative brands in the world, a judge from the copyright court said categorically that copyright lies with the creator. AI does not create alone; it depends on data and material from which to create something else. A technological system that uses that material without permission is theft. The call for a new copyright law is a tactic that delays the application of existing law while continuing to steal. Unlike the physical world, where the pursuit of a stolen masterpiece may eventually result in something of value being returned to its owner, in the digital world, once your IP is stolen, the value is absorbed and fragmented, hidden amid an infinite number of other data points and onward uses. If we continue to delay, much of the value of the creative industries’ rich dataset will already have been absorbed.

The government consultation has been greeted with glee overnight by the CCIA, which lobbies for the biggest tech firms. After congratulating the Government at some length, it says that

“it will be critical to ensure that the transparency requirements are realistic and do not ask AI developers to compromise their work by giving away trade secrets and highly sensitive information that could jeopardise the safety and security of their models”.

In plain English, that means, “We have persuaded the Government to give up creatives’ copyright, and now the campaign begins to protect our own ‘sensitive business information’”. If that is not sufficiently clear to the Committee, it means they are claiming their own IP while stealing others’, while simultaneously pushing back against transparency, because they do not want an effective opt-out.

The government consultation does not even contain an option of retaining the current copyright framework and making it workable with transparency provisions—the provisions of the amendments in front of us. The Government have sold the creative industries down the river. Neither these amendments nor the creative community are anti-tech; on the contrary, they simply secure a path by which creatives participate in the world that they create. They ensure the continuous sustainable production of human-generated content into the future, for today’s artists and those of tomorrow. The amendments do not extend the fundamentals of the Copyright, Designs and Patents Act 1988, but they ensure that the law can be enforced on both AI developers and third parties that scrape on their behalf. They force transparency into the clandestine black box.

Amendment 204 requires the Secretary of State to set out the steps by which copyright law must be observed by web crawlers and others, making it clear that it applies during the entire lifecycle, from pretraining onwards, regardless of jurisdiction—and it must take place only with a licence or express permission.

Amendment 205 requires the Secretary of State to set out the steps by which web crawlers and general-purpose AI models are transparent. This includes, but is not limited to, providing a name for a crawler, identifying the legal entity responsible for it, listing the purposes for which it is engaged and disclosing what data it has passed on. It creates a transparent supply chain. Crucially, it requires operators of crawlers to disclose the businesses to which they sell the data they have scraped, making it more difficult for AI developers that purchase illegally scraped content to avoid compliance with UK copyright law. This overturns the current practice in which operators of crawlers can obscure their own identity or ownership, making it difficult and time-consuming—potentially impossible—to combat illegal scraping.

Amendment 206 requires the Secretary of State to set out by regulation what information web crawlers and general-purpose models must disclose regarding copyrighted works—information such as URL, time and type of data collected and a requirement to inform the copyright holder. This level of granularity, which the tech companies are already pushing against, provides a route by which IP holders can choose or contest the ways in which their work is used, as well as provide a route for payment.

In sum, the amendments create a clear and simple process for identifying which copyright works are scraped, by whom, for what purpose and from which datasets. They provide a process by which existing law can be implemented.

I shall just mention a few more points before I finish. First, there is widespread concern that mashing up huge downloads of the internet, including toxic material, falsehoods and an increasing proportion of artificially generated or synthetic data, will cause models to degenerate or collapse, putting a block on the innovation that the Government and all of us want to see, as well as raising serious safety concerns about the information ecosystem. A dynamic licensing market would provide a continuous flow of identified human-created content from which AI can learn.

Secondly, the concept of a voluntary opt-out regime—or, as the Government prefer, rights reservation—is already dead. In the DPDI Bill, I and others put forward an amendment to make robots.txt—part of the robots exclusion protocol—opt-in. In plain English, that would have meant that the voluntary scheme, in which any rights holder can put a note on their digital door saying “Don’t scrape”, would have been reversed, so that no one could scrape without permission. Over the last few months, we have seen scrapers ignoring the agreed protocol, even when activated. I hope the Minister will explain why he thinks that creators should bear the burden and the scrapers should reap the benefit, and whether the Government have done an impact assessment on how many rights holders will manage to opt out versus how many would opt in, given the choice.

18:30
Thirdly, the companies are not quite telling the whole truth. In August, news broke that Meta was indexing the web to enable its AI chatbot to provide responses to user questions. At the same time, a much less trumpeted new entry on Meta’s website stated that it will scrape everything written on the web by companies and individuals to improve products “by indexing content directly”. If indexing is equivalent to scraping and, as we debated earlier in Committee, “improving products” is scientific research, this Bill represents the end of both IP and data protection simultaneously.
Finally, it is simply not true that regulation will hold us back. Many of our most successful sectors are the most regulated and there are other factors that hold back investment and growth in the UK, including a very risk-averse investment ecosystem.
We have a rich and impactful creative sector. The reach of our artists, the soft power of our storytellers in all formats, the inventiveness of our designers and the skill of our musicians are legendary. The Government’s industrial strategy rightly recognises the creative industries and the tech sector as two of the UK’s priority growth-driving industries. The Government talk about balancing two competing sides, but they are neither the same nor equal. One is a creator and one is a distributor, regurgitator or, perhaps more generously, secondary user. As in all supply lines, you need to pay for your raw material to make something new. The Government will not achieve growth by simply allowing one growth area to cannibalise the other. Since the vast majority of benefit from AI scraping accrues to the US, it seems short-sighted, possibly criminal, to put the UK’s uniquely successful and profitable creative industries at the mercy of the predatory gen AI companies. I beg to move.
Lord Freyberg Portrait Lord Freyberg (CB)
- Hansard - - - Excerpts

My Lords, I support Amendments 204, 205 and 206, to which I have attached my name. In doing so, I declare my interest as someone with a long-standing background in the visual arts and as an artist member of the Design and Artists Copyright Society.

These amendments, tabled and superbly moved by my noble friend and supported by the noble Lords, Lord Stevenson and Lord Clement-Jones, seek to address a deep crisis in the creative sector whereby millions upon millions of creative works have been used to train general-purpose or generative AI models without permission or pay. While access to data is a fundamental aspect of this Bill, which in many cases has positive and legitimate aims, the unauthorised scraping of copyright-protected artworks, news stories, books and so forth for the use of generative AI models has significant downstream impacts. It affects the creative sectors’ ability to grow economically, to maximise their valuable assets and to retain the authenticity that the public rely on.

AI companies have used artists’ works in the training, development and deployment of AI systems without consent, despite this being a requirement under UK copyright law. As has been said, the narrow exception to copyright for text and data mining for specific research purposes does not extend to AI models, which have indiscriminately scraped creative content such as images without permission, simply to build commercial products that allow users to generate their own versions of a Picasso or a David Hockney work.

This amendment would clarify the steps that operators of web crawlers and general-purpose AI models must take to comply with UK copyright law. It represents a significant step forward in resolving the legal challenges brought by rights holders against AI companies over their training practices. Despite high-profile cases arising in the USA and the UK over unauthorised uses of content by AI companies, the reality is that individual artists simply cannot access judicial redress, given the prohibitive cost of litigation.

DACS, which represents artists’ copyright, surveyed its members and found that they were not technophobic or against AI in principle but that their concerns lay with the legality and ethics of current AI operators. In fact, 84% of respondents would sign up for a licensing mechanism to be paid when their work is used by an AI with their consent. This amendment would clarify that remuneration is owed for AI companies’ use of artists’ works across the entire development life cycle, including during the pre-training and fine-tuning stages.

Licensing would additionally create the legal certainty needed for AI companies to develop their products in the UK, as the unlawful use of works creates a litigation risk which deters investment, especially from SMEs that cannot afford litigation. DACS has also been informed by its members that commissioning clients have requested that artists not use AI products, in order to avoid liability issues around their inputs and outputs, demonstrating a lack of trust or uncertainty about using AI.

This amendment would additionally settle ongoing arguments around whether compliance with UK copyright law is required where AI training takes place in other jurisdictions. By affirming its applicability where AI products are marketed in the UK, the amendment would ensure that both UK-based artists and AI companies are not put at a competitive disadvantage due to international firms’ ability to conduct training in a different jurisdiction.

One of the barriers to licensing copyright is the lack of transparency over what works have been scraped by AI companies. The third amendment in this suite of proposals, Amendment 206, seeks to address this. It would require operators of web crawlers and general-purpose AI models to be transparent about the copyright works they have scraped.

Currently, artists and creators face significant challenges in protecting their intellectual property rights in the age of AI. While tools such as Spawning AI’s “Have I Been Trained?” attempt to help creators identify whether their work has been used in AI training datasets, these initiatives provide only surface-level information. Creators may learn that their work was included in training data, but they remain in the dark about crucial details—specifically, how their work was used and which companies used it. This deeper level of transparency is essential for artists to enforce their IP rights effectively. Unfortunately, the current documentation provided by AI companies, such as data cards and model cards, falls short of delivering this necessary transparency, leaving creators without the practical means to protect their work.

Amendment 206 addresses the well-known black box issue that currently plagues the AI market, by requiring the disclosure of information about the URLs accessed by internet scrapers, information that can be used to identify individual works, the timeframe of data collection and the type of data collected, among other things. The US Midjourney litigation is a prime example of why this is necessary for UK copyright enforcement. It was initiated only after a leak revealed the names of more than 16,000 non-consenting artists whose works were allegedly used to train the tool.

Creators, including artists, should not find themselves in a position where they must rely on leaks to defend their intellectual property rights. By requiring AI companies to regularly update their own records, detailing what works were used in the training process and providing this to rights holders on request, this amendment could also create a vital cultural shift towards accountability. This would represent an important step away from the “Move fast and break things” culture pervasive amongst the Silicon Valley-based AI companies at the forefront of AI development, and a step towards preserving the gold-standard British IP framework.

Lastly, I address Amendment 205, which requires operators of internet crawlers and general-purpose AI models to be transparent about the identity and purpose of their crawlers, and not to penalise copyright holders who choose to deny scraping for AI by downranking their content in, or removing their content from, a search engine. Operators of internet crawlers that scrape artistic works and other copyright-protected content can obscure their identity, making it difficult and time-consuming for individual artists and the entities that represent their copyright interests to identify these uses and seek redress for illegal scraping.

Inclusion in search-engine results is crucial for visual artists, who rely on the visibility these provide for their work to build their reputation and client base and generate sales. At present, web operators that choose to deny scraping by internet crawlers risk the downranking or even removal of their content from search engines, as the most commonly used tools cannot distinguish between do-not-train and do-not-index protocols added to a site. This amendment will ensure that artists who choose to deny scraping for AI training are not disadvantaged by current technical restrictions and do not lose out on the exposure generated by search engines.

Finally, I will say a few words about the Government’s consultation launched yesterday, because it exposes a deeply troubling approach to creators’ IP rights, as has already been said so eloquently by the noble Baroness. For months, we have been urged to trust the Government to find the right balance between creators’ rights and AI innovation, yet their concept of balance has now been revealed for what it truly is: an incredibly unfair trade-off that gives away the rights of hundreds of thousands of creators to AI firms in exchange for vague promises of transparency.

Their proposal is built on a fundamentally flawed premise—promoted by tech lobbyists—that there is a lack of clarity in existing copyright law. This is completely untrue: the use of copyrighted content by AI companies without a licence is theft on a mass scale, as has already been said, and there is no objective case for the new text and data-mining exception. What we find in this consultation is a cynical rebranding of the opt-out mechanism as a rights reservation system. While they are positioning this as beneficial for rights holders through potential licensing revenues, the reality is that this is not achievable, yet the Government intend to leave it to Ministers alone to determine what constitutes

“effective, accessible, and widely adopted”

protection measures.

This is deeply concerning, given that no truly feasible rights reservation system for AI has been implemented anywhere in the world. Rights holders have been unequivocal: opt-out mechanisms—whatever the name they are given—are fundamentally unworkable in practice. In today’s digital world, where content can be instantly shared by anyone, creators are left powerless to protect their work. This hits visual artists particularly hard, as they must make their work visible to earn a living.

The evidence from Europe serves as a stark warning: opt-out provisions have failed to protect creators’ rights, forcing the EU to introduce additional transparency requirements in the recent AI Act. Putting it bluntly, simply legalising unauthorised use of creative works cannot be the answer to mass-scale copyright infringement. This is precisely why our proposed measures are crucial: they will maintain the existing copyright framework whereby AI companies must seek licences, while providing meaningful transparency that enables copyright holders to track the use of their work and seek proper redress, rather than blindly repeating proven failures.

Earl of Clancarty Portrait The Earl of Clancarty (CB)
- Hansard - - - Excerpts

My Lords, I speak in support of my noble friend Lady Kidron’s amendments. I declare an interest as a visual artist, and of course visual creators, as my noble friend Lord Freyberg has very well described, are as much affected by this as musicians, journalists and novelists. I am particularly grateful to the Design and Artists Copyright Society and the Authors’ Licensing and Collecting Society for their briefings.

A particular sentence in the excellent briefing for this debate by the News Media Association, referred to by my noble friend Lady Kidron, caught my eye:

“There is no ‘balance’ to be struck between creators’ copyrights and GAI innovation: IP rights are central to GAI innovation”.


This is a crucial point. One might say that data does not grow on a magic data tree. All data originates from somewhere, and that will include data produced creatively. One might also say that such authorship should be seen to precede any interests in use and access. It certainly should not be something tagged on to the end, as an afterthought. I appreciate that the Government will be looking at these things separately, but concerns of copyright should really be part of any Bill where data access is being legislated for. As an example, we are going to be discussing the smart fund a bit later in an amendment proposed by the noble Lord, Lord Bassam, but I can attest to how tricky it was getting that amendment into a Bill that should inherently be accommodating these interests.

18:45
AI of course has huge benefits in other areas, as we have heard this afternoon, not least in the arts and creative industries. The famous example that comes to mind is the “Get Back” documentary on the Beatles, directed by Peter Jackson; but, as Paul McCartney pointed out this week, it is not just the famous and secure who are in danger of having their work scraped. It will also include those at the beginning of their careers, and those who have just enough work to survive, and that includes those fine artists and illustrators who have been engaged in lawsuits in America over precisely these concerns, whose work in film and animation is threatened. In art and design, we are talking about a huge range of work—everyone from fine artists to bespoke craftspeople and artisans is potentially in the firing line.
A recent survey on AI carried out by the Authors’ Licensing and Collecting Society found that 96% of writers would want remuneration if their work were used to train AI, which is as much of an argument for an opt-in system as any. That is quite apart from the highly respected permission-based copyright standard under current UK law. Moreover, 77% of writers do not know whether their work has been used to train AI. As the ALCS says:
“We need a workable, regulated approach to create systems and data to identify with sufficient specificity the works of individual authors that have been used within GAI systems”.
“Sufficient specificity” is underlined. True transparency, which the creative industries are calling for, must surely mean an opt-in system.
Finally, at the recent All-Party Parliamentary Group for Writers reception, we heard a moving speech by the author Joanne Harris, who made perhaps the most important point. She said that to a lot of the public, as soon as you utter the words “artificial intelligence”, people still think it is science fiction. It is not science fiction. As Joanne Harris and others have pointed out, it is happening now and happening in a big way. The Government need to deal with these concerns both urgently and effectively.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, I have been very impressed by the speeches of my noble friends Lady Kidron and Lord Freyberg, so I will be very brief. I declare an interest as a television producer who produces content. I hope that it has not been scraped up by AI machines, but who knows? I support the amendments in this group.

I know that AI is going to solve many problems in our economy and our society. However, I join other noble Lords in asking the Government, in their chase for the holy grail of AI, not to push our creative economy under the bus. It is largely made up of SMEs and single content producers, who do not have the money to pursue powerful AI companies to get paid for the use of their content in training AI models. It is up to noble Lords to help shape regulations that protect our data and copyright laws and can be fully deployed in the defence of the creative economy.

I too have read the Government’s Copyright and Artificial Intelligence consultation paper, published yesterday. The foreword says:

“The proposals include a mechanism for rights holders to reserve their rights”,


which I, like my noble friend Lady Kidron and others, interpret as meaning that creators’ works can be used by AI developers unless they opt out and require licensing for the use of their work. The Government are following the EU example and going for the opt-out model. I think that the European Union is beginning to realise that it is very difficult to make that work, and it brings an unfairness to content producers. Surely, the presumption should be that AI web crawlers should get agreement before using content. The real problem is that content producers do not even know when their content has been used. Even the AI companies sometimes do not know what content has been used. Surely, the opt-out measure is like having your house raided and then asking the burglar what he has taken.

I call on the Minister to work with us to create an opt-in regime. Creators’ works should be used only when already licensed by the AI companies. The companies say they usually do not use content, only data points. Surely that is like saying to a photographer, “We’ve used 99% of the pixels in a picture but not the whole picture”. If even one pixel is used, the photographer needs to know and be compensated.

The small companies and single content producers of our country are the backbone of our economy, as other noble Lords have said. They are threatened by this technology, in which we have placed so much faith. I ask the Minister to respond favourably to Amendments 204, 205 and 206 to ensure that we have fairness between some of the biggest AI players in the world and the hard-pressed people who create content.

Lord Hampton Portrait Lord Hampton (CB)
- Hansard - - - Excerpts

My Lords, I support Amendments 204, 205 and 206 in the names of my noble friends Lady Kidron and Lord Freyberg, and of the noble Lords, Lord Stevenson and Lord Clement-Jones, in what is rapidly becoming the Cross-Bench creative club.

I spent 25 years as a professional photographer in London from the late 1980s. When I started, retouchers would retouch negatives and slides by hand, charging £500 an hour. Photoshop stopped that. Professional film labs such as Joe’s Basement and Metro would work 24 hours a day. Snappy Snaps and similar catered for the amateur market. Digital cameras stopped that. Many companies provided art prints, laminating and sundry items for professional portfolios. PDFs and websites stopped that. Many different forms of photography, particularly travel photography, were taken away when picture libraries cornered the market and drove down commissions to unsustainable levels. There were hundreds if not thousands of professional photographers in the country. The smartphone has virtually stopped that.

All these changes were evolution and the result of a world becoming more digitised, but AI web crawlers are different, illegally scraping images without consent or payment then potentially killing the trade of the victim by setting up in competition. This is a parasite, but not in the true sense, because a parasite is careful to keep its victims alive.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I very much support these amendments. I declare an interest as an owner of written copyright in the Good Schools Guide and as a father of an illustrator. In both contexts, it is very important that we get intellectual property right, as I think the Government recognised in what they put out yesterday. However, I share the scepticism of those who have spoken as to whether the Government’s ideas can be made to work.

It is really important that we get this straight. For those of us operating at the small end of the scale, IP is under continual threat from established media. I write maybe 10 or a dozen letters a year to large media outfits reminding them of the boundaries of copyright, the latest to the Catholic Herald—it appears not even the 10 commandments have force on them. But what AI can do is a great deal more difficult to deal with. I can absolutely see, by talking to Copilot, that it has gone through my paywall and absorbed the contents of the Good Schools Guide, but who am I supposed to go after for this? Who has actually done the trespassing? Who is responsible for it? Where is the ownership? It is difficult to enforce copyright, even by writing a polite letter to someone saying, “Please don’t do this”. The Government appear to propose a system of polite letters saying, “Oh dear, it looks as if you might have borrowed my copyright. Please, can you give it back?”

This is not practically enforceable, and it will not result in people who care about IP locating their businesses here. Quite clearly, we do not have ownership of the big AI systems, and it is unlikely that we will have ownership of them—all that will be overseas. What we can do is create IP. If we produce a system where we do not defend the IP that we produce, then fairly rapidly, those IP creators who are capable of being mobile will go elsewhere to places that will defend their IP. It is something that a Government who are interested in growth really ought to be interested in defending. I hope that we will see some real progress in the course of the Bill going through the House.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I declare my AI interests as set out in the register. I will speak in support of Amendments 204, 205 and 206, which have been spoken to so inspiringly by the noble Baroness, Lady Kidron, and so well by the noble Lords, Lord Freyberg, Lord Lucas and Lord Hampton, the noble Earl, Lord Clancarty, and the noble Viscount, Lord Colville. Each demonstrated different facets of the issue.

I co-chair the All-Party Group on AI and chaired the AI Select Committee a few years ago. I wrote a book earlier this year on AI regulation, which had a namecheck from the noble Baroness, Lady Jones, at Question Time, for which I was very grateful. Before that, I had a career as an IP lawyer, defending copyright and creativity, and in this House I have been my party’s creative industries spokesperson. The question of IP and the training of generative AI models is a key issue for me.

This is the case not just in the UK but around the world. Getty and the New York Times are suing in the United States, as are many writers, artists and musicians. It was at the root of the Hollywood actors’ and writers’ strikes last year. It is one thing to use the tech—many of us are AI enthusiasts—but it is another to be at the mercy of it.

Close to home, the FT has pointed out that, using the index published by the creator of Books3, an unlicensed dataset published online, it is possible to identify over 85 books written by 33 Members of the House of Lords that have been pirated to train AI models by household names such as Meta, Microsoft and Bloomberg. Although it is absolutely clear that the use of copyrighted works to train AI models is contrary to UK copyright law, the law around the transparency of these activities has not caught up. As we have heard, as well as using pirated e-books in their training data, AI developers scrape the internet for valuable professional journalism and other media, in breach of both the terms of service of websites and copyright law, to train commercial AI models. At present, developers can do this without declaring their identity, or they may take IP scraped so that it appears in a search index and use it for the completely different commercial purpose of training AI models.

How can rights owners opt out of something that they do not know about? AI developers will often scrape websites or access other pirated material before they launch an LLM in public. This means that there is no way for IP owners to opt out of their material being taken before its inclusion in these models. Once used to train these models, the commercial value, as we have heard, has already been extracted from IP scraped without permission, with no way to delete data from these models.

The next wave of AI models responds to user queries by browsing the web to extract valuable news and information from professional news websites. This is known as retrieval-augmented generation—RAG. Without payment for extracting this commercial value, AI agents built by companies such as Perplexity, Google and Meta will, in effect, free-ride on the professional hard work of journalists, authors and creators. At present, such crawlers are hard to block. There is no market failure; there are well-established licensing solutions. There is no uncertainty around the existing law; the UK is absolutely clear that commercial organisations, including gen AI developers, must license the data that they use to train their large language models.
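To make the RAG mechanics just described concrete, here is a minimal sketch of such a loop; it does not represent any particular company’s system, and the function and variable names are invented for the example, with stubs standing in for the live web search and the language model:

    # Illustrative sketch of retrieval-augmented generation (RAG).
    # All names are hypothetical; a real agent would call a search or
    # crawling API and a hosted large language model.

    def retrieve(query: str) -> list[str]:
        # Stub: a live RAG agent would browse or search the web here,
        # pulling passages from news sites and other professional sources.
        return ["passage fetched from a publisher's website"]

    def generate(prompt: str) -> str:
        # Stub: a real system would call a large language model here.
        return f"answer composed from: {prompt!r}"

    def answer(query: str) -> str:
        passages = retrieve(query)          # value extracted from the web,
        context = "\n".join(passages)       # often without licence or payment
        return generate(f"{context}\n\nQuestion: {query}")

    print(answer("What did the FT report this week?"))

The point of the sketch is that the commercial value sits in the retrieve step: each answer is built on freshly extracted professional content, which is why blocking or licensing such crawlers matters.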

Here, as the Government’s intentions become clearer, the political, business and creative temperature is rising. Just this week, we have seen the creation of a new campaign, the Creative Rights in AI Coalition—CRAIC—across the creative and news industries and, recently, the statement organised by Ed Newton-Rex reached more than 30,000 signatories from among creators and creative organisations.

19:00
With the new government consultation, which came out yesterday, we are now faced with a proposal regarding the text and data mining exception that we thought was settled under the last Government. There will be a statement tomorrow and we will no doubt have a second bite at the cherry but, as echoed in the consultation, both Ministers—the noble Lord, Lord Vallance, and Feryal Clark MP—seem to think that we need a balance between the creative industries and the tech industries. But what kind of balance is this?
As the News Media Association says, the Government’s consultation is based on the mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue: the use of copyrighted content by gen AI firms without a licence is
“theft of copyright on a mass scale”,
and there is no objective case for a new text and data mining exception. Yet the Government are proposing to change the UK’s copyright framework by creating a text and data mining exception where rights holders have not expressly reserved their rights—in other words, an opt-out system, where content is free to use unless a rights holder proactively withholds consent.
To complement this, the Government are proposing transparency provisions and provisions to ensure that rights reservation mechanisms are effective. The Government have stated that they will move ahead with their preferred rights reservation option only if the transparency and rights reservation provisions are
“effective, accessible, and widely adopted”.
This is incredibly concerning, given that no effective rights reservations system for the use of content by gen AI has been proposed or implemented anywhere in the world, as the noble Lord, Lord Freyberg, said, making the Government’s proposals entirely speculative. As the NMA says, what the Government are proposing is an incredibly unfair trade-off, giving the creative industries a vague commitment to transparency, while giving the rights of hundreds of thousands of creators to gen AI firms. While creators are desperate for a solution after years of copyright theft by gen AI firms, making a crime legal cannot be the solution to mass theft.
We need transparency and a clear statement about copyright along the lines of these amendments. We absolutely should not expect artists to have to opt out. AI developers must be transparent about the identity and purposes of their crawlers and have separate crawlers for distinct purposes. Unless news publishers and the broader creative industries can retain control over their data, making UK copyright law enforceable, AI firms will be free to scrape the web without remunerating creators. This will not only reduce investment in trusted journalism but ultimately harm innovation in the AI sector.
Lord Faulks Portrait Lord Faulks (Non-Afl)
- Hansard - - - Excerpts

The noble Lord has enormous experience in these areas and will be particularly aware of the legal difficulties in enforcing rights. Given what he said, with which I entirely agree—indeed, I agree with all the speakers in supporting these amendments—and given the extraordinary expense of litigating to enforce rights, how does he envisage there being an adequate system to allow those who have had their data scraped in the way that he describes to obtain redress or, rather, suitable remedies?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I thank the noble Lord for that. He is anticipating a paragraph in my notes, which says that, although it is not set out in the amendments, robust enforcement of these provisions will be critical to their success. This includes oversight from an expert regulator that is empowered to issue significant penalties, including fines for non-compliance. There is a little extra work to do there, and I would very much like to see the Intellectual Property Office gain some teeth.

I am going to close. We are nearly at the witching hour, but it is clear that AI developers are seeking to use their lobbying clout—the noble Baroness, Lady Kidron, mentioned the Kool-Aid—to persuade the Government that new copyright law is required. Instead, these amendments would clarify that UK copyright law applies to gen AI developers. The creative industries, and noble Lords from across the House as their supporters, will rally around these amendments and vigorously oppose government plans for a new text and data-mining exception.

Lord Faulks Portrait Lord Faulks (Non-Afl)
- Hansard - - - Excerpts

My Lords, I have very little to add because I entirely support all these amendments. I am always concerned when I see the words “lack of clarity” in a context like this. The basic principle of copyright law, whereby one provides a licence and is paid for that licence by agreement, has been well established. There is no need for any further clarity in this context, as in earlier contexts of copyright law.

I should declare an interest as the chairman of IPSO, the regulator of 95% of the printed news media and its online versions. I have been impressed by the News Media Association’s briefings. It has identified important issues. I am extremely concerned about what appears to have been a considerable amount of lobbying by big tech in this area. It reminds me of what took place when your Lordships’ House considered the Digital Markets, Competition and Consumers Bill. A low point for me was when we were told that it would be very difficult to establish a proper system because otherwise Google’s human rights would somehow be infringed. It is extremely important that this so-called balance does not mean that those who create original material protected by the copyright Acts have their rights violated in order to satisfy the interests of big tech.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, my noble friend Lord Camrose apologises to the Committee but he has had to leave early for unavoidable family reasons. Needless to say, he will read Hansard carefully.

It is our belief that a society that fails to value products of the mind will never be an innovative society. We are fortunate to live in that innovative society now and we must fight to ensure it remains one. Data scraping and AI crawlers pose both novel and substantial challenges to copyright protection laws and mechanisms. His Majesty’s Official Opposition are pleased that these amendments have been brought forward to address those challenges, which differ from those posed by traditional search engine crawlers.

Generally speaking, in creating laws about data we have been able to follow a north star of replicating online the values and behaviours we take for granted offline. This was of real service to us in the Online Safety Act, for example. In many ways, however, that breaks down when we come to AI and copyright. Offline, we are happy to accept that an artist, author, musician or inventor has been influenced by existing works in their field. Indeed, we sometimes celebrate that fact, and we have a strong intuitive sense of when influence has crossed the line into copying. This means that we can form an intuitive assessment of whether a copyright has been breached offline based on what creators produce, not what content they have consumed, which we expect to be extensive. With an AI crawler, that intuition and model break down. There are simply too many variables and too much information. We have no choice but to go after the inputs.

With that in mind, it would be helpful to set out the differences between traditional search engine crawlers and AI crawlers. Indexing crawlers used by the search engines we are all familiar with store information in their indexes, which then determines the results of a search. AI crawlers, by contrast, generally fall into two categories: training crawlers scrape the web, collecting data used to train large language models, while live retrieval crawlers pull in live data from the web and incorporate it into chatbot responses.

Historically, the robots exclusion protocol—the plain text file identified as robots.txt—has been embedded into website domains, specifying to crawlers what data they can and cannot access in part or all of the domain. This has been used for the past 30 years to protect information or IP from indexing crawlers. Although the robots exclusion protocol has worked relatively well for many years, in some ways it is not fit for the web as it exists today—especially when dealing with AI crawlers.

To exclude crawlers from websites, we must be able to identify them. This was, for the most part, workable in the early days of the internet, when there were relatively few search engines and, correspondingly, few indexing crawlers. However, given the rapidly increasing number of AI services, with their corresponding crawlers trawling the web, it becomes impossible to exclude them all. To make matters worse, some AI crawlers operate in relative secrecy. Their names, which can be viewed through domain holders’ access logs, reveal little of their purpose.

Furthermore, the robots exclusion protocol is not an enforceable agreement; it is more like a polite request. A crawler can simply ignore a robots.txt file and scrape the data anyway. It is also worth noting that, even if a crawler acknowledges and obeys a robots.txt file, the data may be scraped from a third-party source that has lifted the data or intellectual property, either manually or using a crawler that does not obey robots.txt files, and made it available without the protection of the robots exclusion protocol. This raises an unsettling question: how do we protect intellectual property, and data more generally, from those AI crawlers whose developers decline the voluntary limitations placed on them?
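By way of illustration, the following is a minimal sketch, using only the Python standard library, of how a compliant crawler consults a robots.txt file before fetching a page; the domain and crawler name are hypothetical examples, and the key point is that nothing in the code, or in the protocol itself, prevents a crawler from skipping the check entirely.

```python
# Minimal sketch of a well-behaved crawler consulting robots.txt.
# The domain and user-agent name are hypothetical examples.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetch and parse the plain-text robots.txt file

# A compliant crawler asks permission before each request; a
# non-compliant one simply omits this check and fetches anyway.
if parser.can_fetch("ExampleTrainingBot", "https://example.com/articles/page1"):
    print("robots.txt permits fetching this page")
else:
    print("robots.txt disallows this page for this crawler")
```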

At this point, I turn to the amendments. Amendment 204 is a great initial step toward requiring crawler operators to respect UK copyright law. However, this provision would apply only to products and services of such operators that are marketed in the United Kingdom. What about those from outside the UK? Indeed, as my noble friend Lord Camrose has often argued, any AI lab that does not want to follow our laws can infringe the same copyright with impunity in another jurisdiction. Unless and until we address the offshoring problem, we continue to have real concerns as to the enforceability of any regulations we implement here.

I will address the individual subsections in Amendment 205. Proposed new subsection (1) would require crawlers to reveal their identity, including their name, who is responsible for them, their purpose, who receives their scraped data, and a point of contact. This is an excellent idea, although we are again concerned about enforceability due to offshoring. Proposed new subsection (2) requires this information to be easily accessible. We are sure this would be beneficial, but our concerns remain about infringements in other jurisdictions.

Requiring the deployment of crawlers with distinct purposes in proposed new subsection (3) is an excellent idea as it would allow data controllers to choose what data can be trawled and for what purpose, to the extent possible using the robots exclusion protocol. We do, however, have concerns about proposed new subsection (4). We are not sure how it would be possible for the exclusion of an AI crawler not to impact the findability of content. We assume this could be achieved only if we mandated the continued use of indexing crawlers.

As for Amendment 206, requiring crawler operators to regularly disclose the information scraped from copyrighted sources and make it accessible to copyright holders on their request is an interesting suggestion. We would be curious to hear how this would work in practice, particularly given the vast scale—some of those models crawl billions of documents, generating trillions of tokens. Where would that data be published? Given the scale of data-scraping, how would copyright holders know where to look for this information? If the operator was based outside the UK, how would disclosure be enforced? Our view is that watermarking technology can come to the rescue, dependent of course on an internationally accepted technical standard for machine-readable watermarks that contain licensing information.

19:15
Finally, on the Government’s proposed consultation, we applaud them for their clear effort to make progress on an issue of genuine difficulty. None of this is easy, and it is absolutely right that they should look to propose inventive solutions. His Majesty’s Official Opposition are troubled by the strength of concern already expressed by the creative sector, even before the consultation has started. Clearly, the opt-out model has not been welcomed. It may be that those worries will be addressed through consultation—it may, for instance, turn out that a lot of the labour-intensive processes behind opt-out can be automated—but so far it is not landing well. In the end, it will come down to enforceability, to which there are considerable technical and jurisdictional barriers. The offshoring problem is a particular case of the latter.
Ultimately, we need to know considerably more about this before Report, so I ask the Minister to write with a detailed technical description of the proposed solution, terms of reference for the consultation exercise and the Government’s plans to drive international adoption of their approach or to adapt their approach based on international proposals.
Lord Vallance of Balham Portrait The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)
- Hansard - - - Excerpts

As someone who has spent my life creating IP, protecting IP and sometimes giving IP away, I welcome this debate. I am extremely grateful to the noble Baroness, Lady Kidron, for a very thoughtful set of proposals. The fact that many noble Lords have spoken in this debate shows that the rapid development of AI has clearly raised concerns about how to protect the creative industries. The Government take this very seriously. As the noble Lord, Lord Lucas, pointed out, we need to get it right, which is why we have launched a very wide-ranging consultation on a package of interventions to address copyright and AI issues. It is an important first step in an area where the existing situation is clearly not working and we run the risk of many long-lasting court cases, which will not help the situation in which we find ourselves.

We are committed both to supporting human-centred creativity and to the potential of AI to unlock new horizons. Many in the creative industries use AI very widely already. Our goal is to support AI innovation in the UK while maintaining robust protection for creators and our vibrant creative industry. In response to a point that the noble Baroness, Lady Kidron, raised earlier, option 1 in the consultation refers to existing copyright law and asks for views about maintaining and increasing it. The consultation sets out the Government’s objectives for this area and proposes a range of measures on which we are seeking views. Specifically, it aims to support rights-holders to continue to exercise control over the use of their content and their ability to seek remuneration for this. As many noble Lords have pointed out, that has to be made easy and technically feasible. It also promotes greater trust and transparency and proposes mechanisms by which you can see who is looking at the data and what they are doing with it.

Finally, it aims to support the development of world-leading AI models in the UK by ensuring that access can be appropriately wide but, of course, lawful and with the approval of those from whom the material is obtained. This includes the subjects of the noble Baroness’s amendments. The consultation seeks views on technological measures that can provide greater control over access to and use of online material, as well as transparency measures that help copyright owners understand whether their work is being used by AI developers. Again, this needs to be made easy. Various technologies are coming along which can do that, including, as has been said, the watermarking approach.

Much of this needs to be wrapped into an approach to standards. It is important that this is done in a way that is reproducible and reliable. Through this consultation, we will address some of these issues and seek to continue to get input from stakeholders on all of them. We will also work towards internationally interoperable solutions, as raised by many noble Lords, including the noble Lord, Lord Freyberg, and the noble Earl, Lord Effingham.

I agree with the noble Baroness, Lady Kidron, that a vibrant and effective licensing approach—a system that works well and provides access and rights—is important. She asked about an impact assessment. I do not have the information with me now, but I will write. I look forward to updating her on this work in due course and, in the meantime, hope that she is content to withdraw her amendment.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Does the Minister recognise the characterisation of noble Lords who have said that this is theft? Currently, we have a law and copyright is being taken without consent or remuneration. Does he agree with them that this is what the creative industries and, I presume, some of his community are experiencing?

Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

At the moment we have a system where it is unclear what the rights are and how they are being protected, and therefore things are being done which people are unable to get compensation for. We can see that in the court cases going on at the moment. There is uncertainty which needs to be resolved.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank the Minister for his answer and welcome him very much to the Dispatch Box—I have not yet had the pleasure of speaking with him in a debate. I hope he saw the shaking heads when he answered my question about theft and this lack of clarity. If you say “Write me the opening chapter of a Stephen King novel”, and the AI can do it, you can bet your bottom dollar that it has absorbed a Stephen King novel. We know that a lot of this material is in there and that it is not being paid for. That goes for issues big and small.

I understand that it is late and we have more to do—I have more to say on other issues—but I want to reiterate three points. First, creative people are not anti-tech; they just want control over the things they create. AI is a creation on top of a creation, and creative people want to be paid for their efforts and to be in control of them. I am not sure whether I can mention it, because it was in a private meeting, but a brand that many people in most countries will have heard of said: “We need to protect our brand. We mean something. An approximation of us is not us. It is not just the money; it is also the control”.

I also make the point that, earlier this week, Canal+ had its IPO on the London Stock Exchange. I heard the CEO answer the question, “Why is it that Canal+ decided to come and do its IPO in the UK when everybody else is scarpering elsewhere?”, by saying a lot of very warm-hearted things about Paddington Bear, then, “Because you have very good copyright laws”. That is what they said. I just want to mention that.

Finally, I am grateful to the Minister for saying that there is the option of staying with the status quo; I will look at that and try to understand it clearly. However, when he writes about the issue that I raised in terms of opting in or opting out—I am grateful to him for doing so—I would also like an answer about where the Government think the money is going to go. What is the secondary value of the AI companies, which are largely headquartered in the US? Where will the IP, which those companies have already said they want to protect—they did so in their response to the Government’s consultation; I said that in my speech, for anyone who was not listening—go? I would like the Government to say what their plans are, if we lose the £1.6 billion and the 2.4 million jobs, to replace that money and those jobs, as well as their incredible soft power.

With that, I beg leave to withdraw the amendment.

Amendment 204 withdrawn.
Amendments 205 and 206 not moved.
Amendment 207
Moved by
207: After Clause 132, insert the following new Clause—
“Reliability of computer-based evidence(1) Electronic evidence produced by or derived from a computer, device or computer system (separately or together “system”) is admissible as evidence in any proceedings— (a) where that electronic evidence and the reliability of the system that produced it or from which it is derived are not challenged;(b) where the court is satisfied that the reliability of the system cannot reasonably be challenged;(c) where the court is satisfied that the electronic evidence is derived from a reliable system.(2) Rules of Court must provide that electronic evidence sought to be relied upon by a party in any proceedings may be challenged by another party as to its admissibility.(3) For the purposes of subsection (1)(b), Rules of Court must provide for the circumstances in which the Court may be satisfied that the admissibility of electronic evidence cannot reasonably be challenged.(4) When determining whether a system is reliable for the purposes of subsection (1)(c) the matters that may be taken into account include—(a) any instructions or rules of the system that apply to its operation;(b) any measures taken to secure the integrity of data held on the system;(c) any measures taken to prevent unauthorised access to and use of the system;(d) the security of the hardware and software used by the system;(e) any measures taken to monitor and assess the reliability of the system by the system controller or operator including steps taken to fix errors or address unexpected outcomes including the regularity of and extent of any audit of the system by an independent body;(f) any assessment of the reliability of the system made by a body with supervisory or regulatory functions;(g) the provisions of any scheme or industry standard that apply in relation to the system.(5) For the purposes of this section—“computer” means any device capable of performing mathematical or logical instructions;“device” means any apparatus or tool operating alone or connected to other apparatus or tools, that processes information or data in electronic form;“electronic evidence” means evidence derived from data contained in or produced by any device the functioning of which depends on a software program or from data stored on a computer, device or computer system or communicated over a networked computer system.”Member’s explanatory statement
This amendment overturns the current legal assumption that evidence from computers is always reliable which has contributed to miscarriages of justice including the Horizon Scandal. It enables courts to ask questions of those submitting computer evidence about its reliability.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, it is a privilege to introduce Amendment 207. I thank the noble Lords, Lord Arbuthnot and Lord Clement-Jones, and the right reverend Prelate the Bishop of St Albans, who is unfortunately unwell but wanted me to express his support.

I make it clear that, although I may use the Horizon scandal as an example, this amendment is neither focused on nor exclusive to the miscarriage of justice, malevolence and incompetence related to that scandal. It is far broader than that so, when the Minister replies, I really hope that he or she—I am not sure which yet—will not talk about the criminality of the Post Office, as previously, but rather concentrate on the law that contributed to allowing a miscarriage of justice at that scale. That is what this amendment seeks to address.

I explained during debates on the DPDI Bill that, since 1999, courts have applied

“a common law presumption, in both criminal and civil proceedings, of the proper functioning of machines—that is to say”,

the information from the computer can be presumed to be reliable. I went on to say:

“In principle, there is a low threshold for rebutting this presumption but, in practice … a person challenging evidence derived from a computer will typically have no”.—[Official Report, 24/4/24; col. GC 573.]


knowledge of the circumstances in which the system in question was operated, so cannot possibly demonstrate that it failed. As Paul Marshall, the barrister who represented some of the postmasters, explains, this puts the onus on the defendant to explain to the jury the problems they encountered when all they could actually do was point to the shortfalls they had experienced—in the Horizon case, that the cash received did not match the balancing figure on the computer screen. They did not have access to the system or the record of its past failures, and they had no knowledge of what the vulnerabilities were. They knew only that it did not work.

The reality is that anyone who knows the first thing about programming or computer science knows that there are bugs in the system. Indeed, any one of us who has agreed to an update for an app or computer software understands that bug fixing is a common aspect of program maintenance. When I discussed this amendment with a computer scientist of some standing, he offered the opinion that there are likely to be 50 bugs per 1,000 lines of code; many complex systems run to tens of millions of lines of code.

Perhaps the most convincing thing of all is looking at software contracts. For the past 20 years at least, a contract has been likely to contain words to this effect: “No warranty is provided that the operation of the software will be uninterrupted or error free, or that all software errors will be corrected”. The same clause applies when we say yes to a new Apple upgrade by signing a EULA—an end-user licence agreement. In plain English, for two decades at least, those who provide software have insisted that computer information is not to be considered reliable. That is written into their commercial agreements, so the fact that computer information is not reliable is agreed by those who know about computer information.

19:30
Similarly, the wrongness of the current legal presumption that computer information is reliable is also widely agreed. It was agreed by the previous Lord Chancellor, Alex Chalk, who promised me that he would look at it. It has been the subject of discussion for several years in the MoJ, which asked Paul Marshall to report on it in 2020 and again in 2021. It has also been pointed out by Lord Justice Fraser, now a judge of the Court of Appeal, that the presumption was not correct. Although my own promised ministerial meeting with the MoJ did not materialise before this Committee, I am sure that the current Lord Chancellor would agree that the existing presumption in law is wrong because it is a presumption that anyone with even the most basic knowledge of computers would consider absurd.
I laid an amendment to the DPDI Bill, based on Section 69 of the PACE Act. Officials and Ministers worried that unscrupulous lawyers would challenge every possible automated thing. They presented the spectre of murderers challenging body cam evidence and of a justice system brought to a standstill by drunk drivers’ smart lawyers querying whether the breathalyser was reliable. I am no longer sure that this assessment is correct since, in most cases, there would be other evidence to corroborate or contradict the charge, such as witnesses or other officers present, urine samples and blood tests. Some departments have a habit of making a problem so big that we can never solve it.
Given the costs of the Post Office debacle, which currently exceed £1 billion, the level of distress and hardship that it has inflicted, and the reality that it has led and will continue to lead to miscarriages of justice beyond those affected by Horizon, I find it astonishing that the Ministry of Justice has failed to tackle this issue. It is more than five years since Mr Justice Fraser, now Lord Justice Fraser, made it clear that the uncritical admission of evidence in the Horizon case was in itself an injustice, as the burden to say in what way the computer was unreliable fell on the party without access to the system while the party with access had no similar responsibility to reveal what might be unreliable.
Just as the failure to compensate the postmasters adds injury to insult and harm to hurt, so, too, the failure of the MoJ to address a known and continuing injustice adds to a picture in which the court and the Government repeatedly fail to serve the people who depend on them. If we all agree that we have a problem where the current law is not only blind to but actively asserts an untruth, from which great injustice flows, that should be a matter of urgent concern.
Amendment 207 is the result of expert advice from external counsel and computer scientists, including Professor Harold Thimbleby. Between them, they have scores of years’ experience looking specifically at the intersection of law and technology. I thank them for their time and dedication to this issue; I will shortly return to their comprehensive view.
Amendment 207 does not speak to the reliability of computers but concentrates entirely on the question of computer evidence put in front of the court, so that the presumption cannot be a cause for further injustice. Proposed new subsection (1) says that computer evidence should be “admissible”—that is, allowed to be relied on by a party in court proceedings—if, first, the other party does not object to the evidence being relied on; secondly, the court considers that no sensible or reasonable objection can be taken to the evidence being relied on, which is to say it being admitted; and, thirdly, there is evidence that the source of the evidence, such as the computer system that produced it, is reliable. Later subsections simply offer guidance for the courts in evaluating what a reliable computer system is.
The amendment provides protection against computer evidence being relied on where there is no assurance that the computer from which the material is derived is one that functions properly or reliably. Importantly, the provision does not determine that computer evidence should be accepted or given weight by the court; that remains the court’s function in civil trials and the jury’s function in criminal trials. Once admitted to a trial, evidence will be tested in the usual way, with expert witness if necessary.
If this amendment had been in place, the Post Office scandal would have been avoided in some part, possibly for decades—as would the horrific fate of nurses at the Princess of Wales Hospital in Wales who were, in 2012, wrongly accused of falsifying patient records because of discrepancies found with computer records. Some nurses were subjected to criminal prosecution, suffering years of legal action before the trial collapsed when it emerged that a visit by an engineer fixing a bug had erased data that the nurses were accused of failing to gather. If the bar for putting forward evidence as reliable, as set out in the guidance contained in this amendment, had been in place, it would have pointed even the least technical judge towards the fact that there should have been engineering and audit logs highlighting the unauthorised access to, and amendment of, data.
It is often the case that evidence from a computer is part of the evidential picture. Amendment 207 would allow for that. It would give structure to the questions that the court should ask but leave it to the court to weigh those considerations for itself. Once the court has determined the integrity of the evidence, it will be free to consider its contribution to the whole. It follows that the more important or central the data, the more important it is that reliability is assured.
Finally, let me make the observation—sadly, not for the first time in Committee—that, when issues involve the interests of commercial players rather than justice for individuals, it seems that the machinery of government is minded to act. Last year, the Electronic Trade Documents Act 2023 was introduced. The purpose of that legislation was to provide confidence in the integrity of electronic documents relied on in commerce. Of course, it makes absolute sense that you cannot trade if you have no confidence in the integrity of electronic documents, but why has that been given priority over justice in criminal and civil proceedings, even when we know that we are subject to bad law?
In the build-up to the passing of the ETDA, the Law Commission debated for some time the desirability of including guidance for the courts. At first, it decided against. However, following consultation with Lord Justice Fraser, the Law Commission changed its stance, since he urged the commission to include it and said that it would be useful for the courts. Amendment 207 encompasses that very same guidance. It is in our trade law and should be in our courts. This is the very same Lord Justice Fraser who finally broke open the sub-postmasters case and whom the MoJ has studiously ignored in finding a solution on the reliability of evidence. Amendment 207 takes his advice. It mirrors the provision in providing guidance to the court. It is not prescriptive and, given its excellent provenance, I trust that the Government will not find it wanting.
I return to what I said at the outset: this is not about the postmasters. In the last year alone we have seen bugs and problems in banking, air traffic control, supermarket deliveries, hospitals, trains and more. As we approach a world of AI and greater reliance on tech, we anticipate greater variations in reliability and more cases coming to the court. However, although the postmasters will not benefit from this change in the law—nor are they the sole example—they illustrate the human cost of failing to act.
The amendment is not the end of the matter. My legal advisers say we also need to overturn the presumption because it is wrong. Directing the court per this amendment is necessary and, in due course, we will also need certification or audit trails for computer information that is depended on for court matters. The amendment is put forward to reboot the conversation that was interrupted by the election.
I hope we will have a ministerial answer from the Dispatch Box that agrees to deal with this issue as a matter of urgency before Report, not one saying it is complicated. We know it is complicated, but for the postmasters, the nurses or anyone else whose life or livelihood has been taken or threatened by a bug, the status quo is unacceptable. Twenty-five years is too long for the law to assert something that is patently false. The MoJ has been looking at this issue in detail for more than five years and I have sought an urgent answer, along with the noble Lord, Lord Arbuthnot, for the past five months. If it is too complicated for the MoJ, I have a group of eminent lawyers and computer scientists who would happily do the task for it. I beg to move.
Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- Hansard - - - Excerpts

My Lords, I declare my interest as a member of the Horizon Compensation Advisory Board. When, on 24 April this year, the noble Baroness, Lady Kidron, proposed an amendment to remove the presumption about the reliability of computer evidence, the noble Baroness who is now the Minister added her name to it—oh the perils of moving from opposition to government.

My noble friend Lord Camrose—the Minister at the time—in a sympathetic speech, resisted that amendment on the basis, first, that there were shocking failures of professional duty in the Post Office case. This was quite true, but they were facilitated by the existence of the presumption. His second reason was that taking us back to the law of 1999, as the noble Baroness, Lady Kidron, eloquently set out just now, would risk undermining prosecutions because we would need to get certificates of accuracy in cases such as breathalysers and those involving emails. There may have been something in that, so the noble Baroness has proposed an amendment that is designed to get round that second point.

I suspect that the Minister will resist this amendment too, but for reasons that I hope she will set out clearly, because we may then decide to move a different amendment on Report. We are making all the running on this—or at least the noble Baroness, Lady Kidron, is, with my full support and, I know, that of the noble Lord, Lord Clement-Jones. I take a moment to pay tribute to their work ethic in this Committee, which has been quite phenomenal.

The Government do not seem to have the issue quite as close to the top of their priorities as we suggest. Without repeating all that I said on 24 April, I will summarise it as follows. Paul Marshall, the barrister, has pointed out that computer evidence is hearsay, with all the limitations that that implies. Modern computer programs are too large to be exhaustively tested. If computer programs are inherently unreliable, it is wrong to have a presumption that they are reliable. That issue will grow with the growth of artificial intelligence.

The presumption that computer evidence is reliable leads either to such things as we saw occur in the Post Office scandal, with the Post Office essentially taunting the sub-postmasters, saying, “If you can’t show us what is wrong with the computer evidence, we don’t have to show you that evidence”—a shocking case of Catch-22; or to lawyers and courts voluntarily abandoning the presumption and denigrating all computer evidence, whether or not it deserves to be denigrated. That might lead, for example, to some defendants being acquitted when the evidence would require that they be convicted. We are trying to help the Government find a way through a problem that they recognise and assert exists. Will they please give us some help in return? This is both serious and urgent. Just saying that it is very difficult does not begin the process of putting it right.

19:45
Lord Tarassenko Portrait Lord Tarassenko (CB)
- Hansard - - - Excerpts

My Lords, I will speak briefly in support of this amendment. Anyone who has written computer code, and I plead guilty, knows that large software systems are never bug-free. These bugs can arise because of software design errors, human errors in coding or unexpected software interactions for some input data. Every computer scientist or software engineer will readily acknowledge that computer systems have a latent propensity to function incorrectly.

As the noble Baroness, Lady Kidron, has already said, we all regularly experience the phenomenon of bug fixing when we download updates to software products in everyday use—for example, Office 365. These updates include not only new features but patches to fix bugs which have become apparent only in the current version of the software. The legal presumption of the proper functioning of “mechanical instruments” that courts in England and Wales have been applying to computers since 1999 has been shown by the Post Office Horizon IT inquiry to be deeply flawed. The more complex the program, the more likely the occurrences of incorrect functioning, even with modular design. The program at the heart of Fujitsu’s Horizon IT system had tens of millions of lines of code.

The unwillingness of the courts to accept that the Horizon IT system developed for the Post Office was unreliable and lacking in robustness—until the key judgment, which has already been mentioned, by Mr Justice Fraser in 2019—is one of the main reasons why more than 900 sub-postmasters were wrongly prosecuted. The error logs of any computer system make it possible to identify unexpected states in the computer software and hence erroneous system behaviour. Error logs for the Horizon IT system were disclosed only in response to a direction from the court in early 2019. At that point, the records from Fujitsu’s browser-based incident management system revealed 218,000 different error records for the Horizon system.
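As a purely illustrative sketch—every name, threshold and figure below is hypothetical—this is roughly how a system writes the kind of error record at issue here, using Python’s standard logging module; records of this sort are what the 2019 disclosure direction brought to light.

```python
# Illustrative sketch only: recording an unexpected state in an error log.
# All names, thresholds and figures are hypothetical.
import logging

logging.basicConfig(
    filename="system_error.log",
    level=logging.ERROR,
    format="%(asctime)s %(levelname)s %(message)s",
)

def reconcile(cash_received: float, computed_balance: float) -> None:
    # A discrepancy between cash taken and the computed balance is an
    # unexpected state; a well-run system logs it rather than hiding it.
    if abs(cash_received - computed_balance) > 0.005:
        logging.error(
            "balance mismatch: cash_received=%.2f computed_balance=%.2f",
            cash_received,
            computed_balance,
        )

reconcile(1000.00, 1025.50)  # writes one error record to system_error.log
```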

For 18 years prior to 2019, the Post Office did not disclose any error log data—documents that are routinely maintained and kept for any computer system of any size and complexity. Existing disclosure arrangements in legal proceedings do not work effectively for computer software, and this amendment concerning the electronic evidence produced by or derived from a computer system seeks to address that. The Post Office Horizon IT inquiry finished hearing evidence yesterday, having catalogued a human tragedy of unparalleled scale and one of the most widespread miscarriages of justice in the UK. Whether by means of this amendment or otherwise, wrongful prosecutions on the basis that computers always operate properly cannot continue any longer.

Earl of Erroll Portrait The Earl of Erroll (CB)
- Hansard - - - Excerpts

My Lords, if I may just interject, I have seen this happen not just in the Horizon scandal. Several years ago, the banks were saying that you could not possibly find out someone’s PIN and were therefore refusing to refund people who had had money stolen from them. It was not until the late Professor Ross Anderson, of the computer science department at Cambridge University, proved that the banks had been deliberately misidentifying to the courts which counter they should have been looking at and what was being read, and explained exactly how you could get the system to default back to a different set of counters, that the banks eventually had to give way. But they went on lying to the courts for a long time. I am afraid that this is something that keeps happening again and again, and an amendment like this is essential for future justice for innocent people.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, it is a pity that this debate is taking place so late. I thank the noble Lord, Lord Arbuthnot, for his kind remarks, but my work ethic feels under considerable pressure at this time of night.

All I will say is that this is a much better amendment than the one that the noble Baroness, Lady Kidron, put forward for the Data Protection and Digital Information Bill, and I very strongly support it. Not only is this horrifying in the context of the past Horizon cases, but I read a report about the Capture software, which is likely to have created shortfalls that led to sub-postmasters being prosecuted as well. This is an ongoing issue. The Criminal Cases Review Commission is reviewing five Post Office convictions in which the Capture IT system could be a factor, so we cannot say that this is about just Horizon, as there are the many other cases that the noble Baroness cited.

We need to change this common law presumption even more in the face of a world in which AI use, with all its flaws and hallucinations, is becoming ever present, and we need to do it urgently.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank the noble Baroness, Lady Kidron, for tabling her amendment. We understand its great intentions, which we believe are to prevent another scandal similar to that of Horizon and to protect innocent people from having to endure what thousands of postmasters have undergone and suffered.

However, while this amendment would make it easier to challenge evidence derived from, or produced by, a computer or computer system, we are concerned that, should it become law, this amendment could be misused by defendants to challenge good evidence. Our fear is that, in determining the reliability of such evidence, we may create a battle of the expert witnesses. This will not only substantially slow down trials but result in higher costs. Litigation is already expensive, and we would aim not to introduce additional costs to an already costly process unless absolutely necessary.

From our perspective, the underlying problem in the Horizon scandal was not that computer systems were critically wrong or that people were wrong, but that the two in combination drove the terrible outcomes that we have unfortunately seen. For many industries, regulations require firms to conduct formal systems validation, with serious repercussions and penalties should companies fail to do so. It seems to us that the disciplines of systems validation, if required for other industries, would be both a powerful protection and considerably less disruptive than potentially far-reaching changes to the law.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank the noble Baroness and the noble Lord, Lord Arbuthnot, for Amendment 207 and for raising this important topic. The noble Baroness and other noble Lords are right that this issue goes far wider than Horizon. We could debate what went wrong with Horizon, but the issues before us today are much wider than that.

The Government are agreed that we must prevent future miscarriages of justice. We fully understand the intention behind the amendment and the significance of the issue. We are actively considering this matter and will announce next steps in the new year. I reassure noble Lords that we are on the case with this issue.

In the meantime, as this amendment brings into scope evidence presented in every type of court proceeding and would have a detrimental effect on the courts and prosecution—potentially leading to unnecessary delays and, more importantly, further distress to victims—I must ask the noble Baroness whether she is content to withdraw it at this stage. I ask that on the basis that this is an ongoing discussion that we are happy to have with her.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank the Minister, in particular for understanding that this goes way beyond Horizon. I would be very interested to be involved in those conversations, not because I have the great truth but because I have access to people with the great truth on this issue. In the conversations I have had, there has been so much pushing back. A bit like with our previous group, it would have been better to have been in the conversation before the consultation was announced than after. On that basis, I beg leave to withdraw the amendment.

Amendment 207 withdrawn.
Amendments 208 to 210 not moved.
Amendment 211
Moved by
211: After Clause 132, insert the following new Clause—
“Sovereign data assets(1) The Secretary of State may by regulations define data sets held by public bodies and arm’s length institutions and other data sets that are held in the public interest as sovereign data assets (defined in subsection (6)).(2) In selecting data sets which may be designated as sovereign data assets, the Secretary of State must—(a) have regard to—(i) the security and privacy of United Kingdom data subjects;(ii) the ongoing value of the data assets;(iii) the rights of United Kingdom intellectual property holders;(iv) ongoing adherence to the values, laws and international obligations of the United Kingdom;(v) the requirement for public sector employees, researchers, companies and organisations headquartered in the United Kingdom to have preferential terms of access;(vi) the need for data to be stored in the United Kingdom, preferably in data centres in the United Kingdom;(vii) the need to design Application Programming Interfaces (APIs) as bridges between each sovereign data asset and the client software of the authorized licence holders;(b) consult with—(i) academics with expertise in the field;(ii) the AI Safety Institute;(iii) those with responsibility for large public data sets;(iv) data subjects;(v) the Information Commissioner.(3) The Secretary of State must establish a transparent licensing system, fully reflecting the security and privacy of data held on United Kingdom subjects, for use in providing access to sovereign data assets.(4) The Secretary of State must report annually to Parliament on the ongoing value of the sovereign data assets, in terms of—(a) their value to future users of the data;(b) the financial return expected when payment is made for the use of such data in such products and services as may be expected to be developed.(5) The National Audit Office must review the licensing system established by the Secretary of State under subsection (3) and report annually to Parliament as to its effectiveness in securing the ongoing security of the sovereign data assets.(6) In this section—“sovereign data asset” means—(a) data held by public bodies and arm’s length institutions of government;(b) data sets held by third parties that volunteer data to form, or contribute to, a public asset.(7) Regulations under this section are to be made by statutory instrument.(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by a resolution of each House of Parliament.” Member’s explanatory statement
The UK has a number of unique publicly held data assets, from NHS data to geospatial data and the BBC’s multimedia data. This amendment would create a special status for data held in the public interest and a licensing scheme for providing access to it which upholds UK laws and values and ensures a fair return of financial benefits to the UK.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, the good news is that this is the last time I shall speak this evening. Amendment 211 seeks to ensure that the value of our publicly held large datasets is realised for the benefit of UK citizens.

Proposed new subsection (1) gives the Secretary of State the power to designate datasets held by public bodies, arm’s-length institutions or other sets held in the public interest as sovereign data assets.

Proposed new subsection (2) lists a number of requirements that the Secretary of State must have regard to when making a designation. Factors include the security and privacy of UK citizens, the ongoing value of the data assets, the rights of IP holders, the values, laws and international obligations of the UK, the requirement to give preferential access to UK-headquartered companies, organisations and the public sector, the requirement for data to be stored in the UK and the design of application programming interfaces facilitating access to the assets by authorised licence holders. It also sets out stakeholders whom the Secretary of State must consult when considering what datasets to designate as sovereign data assets. We heard in a previous debate that education data might be a good candidate.

Proposed new subsection (3) requires the setting up of a transparent licensing system. Proposed new subsection (4) requires those managing sovereign data assets to report annually on their value and anticipated return to UK subjects. This would include, for example, licence payments, profit share agreements and “in kind” returns, such as access to products or services built using sovereign data assets. Proposed new subsection (5) gives an oversight role to the National Audit Office, proposed new subsection (6) provides a definition, and proposed new subsections (7) and (8) specify that regulations made under the clause are subject to parliamentary approval.

When I raised this issue at Second Reading, the Minister answered positively, in that she felt that what I was suggesting was embodied in the plans for a national data library:

“The national data library will unlock the value of public data assets. It will provide simple, secure and ethical access to our key public data assets for researchers, policymakers and businesses, including those at the frontier of AI development, and make it easier to find, discover and make connections across those … databases. It will sit at the heart of an ambitious programme of reform that delivers the incentives, investment and leadership needed to secure the full benefits for people and the economy”.—[Official Report, 19/11/24; col. 196.]


That is a very valid and positive picture. My comments build on it because, since Second Reading, I have sought details about the national data library. It seems that plans are nascent and that the level of funding, as I understand it, matches neither the ambition set out by the Minister nor what many experts think is necessary. One of my concerns—it will not surprise the Committee to hear this, as it has come up a couple of times on previous groups—is that it appears to be a mechanism for facilitating access, rather than for understanding, realising and protecting the value of these public data assets.

In the meantime, announcements of access to public data keep coming. We have worries about Palantir and the drip-feed of deals with OpenAI and Google, the latest of which was reported in the Health Service Journal, which said:

“The national Federated Data Platform will be used to train AI models for future use by the NHS, according to NHS England’s chief data and analytics officer”.


That sounds great, but the article went on to question the basis of the arrangement and the safeguards wrapped around it. That is the question.

We in this House already understand the implications of an “adopt now, ask questions later” approach. For example, as reported recently in Computer Weekly, Microsoft has now admitted to Scottish policing bodies that it cannot guarantee the sovereignty of UK policing data hosted on its hyperscale public cloud infrastructure. That is a huge problem for Police Scotland and one that is very likely to be mirrored across all sorts of government departments, in a technology that is central to so many of them. The proposed amendments offer a route to ask questions as you adopt technology, not after you have lost control.

20:00
The speed at which the Government are giving access to our data outstrips their plans to protect its financial or societal value. I think this is something of a theme of this Committee: the Bill does not deal with the needs of IP holders, UK citizens, children or NHS patients, nor does it confront the spectre of AI systems. There is often a conflation by Ministers of the need to access data for medicine, space, museums and other exciting matters with the prosperity it will bring and the savings it will make, but if we look at the deals made so far, the benefit has accrued disproportionately to a handful of US-headquartered companies.
We know that handing public assets to private companies in the hope that they will return a public benefit has some flaws. We are still paying for PFIs, while private water companies have consistently prioritised shareholder returns and executive pay over investment in critical infrastructure, at huge cost to the public, our rivers and seas. Thirty years after John Major privatised the railways and their operators, this Government have pledged to return both to public ownership, having seen billions of taxpayer pounds go into private hands. Yet at the same time as they are reclaiming those assets and infrastructure for and on behalf of the UK, they are doing deals that undervalue one of our most valuable national assets: our publicly held data. It is a resource that could, if managed appropriately, bring revenue to our struggling public sector and revolutionise the delivery of public services while reducing spending. Instead, I worry that we give unconditional access to companies that take that learning and turn it into products and services for which we will in future pay the market price and which will generate large profits.
I applaud the productive use of UK data, but for societal goods and as a contributor to national prosperity, not as another leak of control and value to a handful of dominant incumbents. Data is not separate from other modern infrastructure considerations but part of it. I recognise the complexity of making something from the data that we hold, but just like the previous arguments about protecting intellectual property, the new innovations cannot be made without the raw material of data—or, as the noble Lord, Lord Holmes, would have it, our data.
Beyond securing financial returns, the Government’s rush to give access and their failure to consider citizens’ needs is alarming. We need to make sure that the exploitation of our data is on terms that are consistent with our values and has the consent of the people. In a word cloud that was generated based on the latest government polling about AI, one word screamed out from the pack, and that was “scary”. The only words that I could read without glasses were “dangerous”, “concern”, “unsure”, “robot”, “worry”, “nervous”, “confused”, “cautious”, “wary” and “sceptical”, so I am not the only one who sees the cavalier statements of Ministers as a threat to the safety, security and prosperity of the UK. What the word cloud tells us is that there is a disconnect between the Government’s “lean in, move fast, hurt now, fix later” approach and the views of those on whose behalf they govern.
Underlying the Government’s rhetoric is the implication that those who disagree with their strategy and the pace at which they are opening up access have failed to understand the opportunity. It is possible to be excited by AI’s potential and to disagree with the Government’s strategy, because it reflects a failure to recognise that they are being played by the tech companies, whose lobbyists are experts in spreading uncertainty and making regulators and governments feel that they hold all the answers, when those answers are self-serving.
I hope that this is one of several positive suggestions made by noble Lords in Committee that will be treated positively and subject to serious discussion and consideration, rather than summarily dismissed with no thought as to how this will play out in the decades ahead. This is a Bill, an issue and a country that need a sense of purpose; I believe that sovereign data assets could play a part in that. I beg to move.
Lord Russell of Liverpool Portrait The Deputy Chairman of Committees (Lord Russell of Liverpool) (CB)
- Hansard - - - Excerpts

My Lords, before we proceed, I draw to the attention of the Committee that we have a hard stop at 8.45 pm and we have committed to try to finish the Bill this evening. Could noble Lords please speak quickly and, if possible, concisely?

Lord Tarassenko Portrait Lord Tarassenko (CB)
- Hansard - - - Excerpts

My Lords, I support my noble friend Lady Kidron’s Amendment 211, to which I have put my name. I speak not as a technophobe but as a card-carrying technophile. I declare an interest as, for the past 15 years, I have been involved in the development of algorithms to analyse NHS data, mostly from acute NHS trusts. This is possible under current regulations, because all the research projects have received medical research ethics approval, and I hold an honorary contract with the local NHS trust.

This amendment is, in effect, designed to scale up existing provisions and make sure that they are applied to public sector data sources such as NHS data. By classifying such data as sovereign data assets, it would be possible to make it available not only to individual researchers but to industry—UK-based SMEs and pharmaceutical and big tech companies—under controlled conditions. One of these conditions, as indicated by proposed new subsection (6), is to require a business model where income is generated for the relevant UK government department from access fees paid by authorised licence holders. Each government department should ensure that the public sector data it transfers to the national data library is classified as a sovereign data asset, which can then be accessed securely through APIs acting

“as bridges between each sovereign data asset and the client software of the authorized licence holders”.
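To make that mechanism concrete, here is a minimal sketch—in Python with Flask, with every endpoint, key and name hypothetical—of such an API bridge: a request succeeds only when it carries a key from a register of authorised licence holders, and the same gateway is the natural place to meter the access fees envisaged by the amendment.

```python
# Minimal sketch of a licence-gated API 'bridge' to a data asset.
# All endpoints, keys and names are hypothetical; a real scheme would
# need proper authentication, auditing and secure key management.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Hypothetical register of authorised licence holders and their keys.
AUTHORISED_LICENCES = {"key-123": "UK-headquartered SME Ltd"}

@app.route("/v1/assets/<name>")
def asset(name: str):
    key = request.headers.get("X-Licence-Key", "")
    if key not in AUTHORISED_LICENCES:
        abort(403)  # no valid licence, no access to the data asset
    # Access could be metered here to calculate the fees and returns
    # reported annually to Parliament under the proposed new clause.
    return jsonify({"asset": name, "licensee": AUTHORISED_LICENCES[key]})

if __name__ == "__main__":
    app.run(port=8080)
```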

In the time available, I will consider the Department of Health and Social Care. The report of the Sudlow review, Uniting the UK’s Health Data: A Huge Opportunity for Society, published last month, sets out what could be achieved through linking multiple NHS data sources. The Academy of Medical Sciences has fully endorsed the report:

“The Sudlow recommendations can make the UK’s health data a truly national asset, improving both patient care and driving economic development”.


There is little difference, if any, between health data being “a truly national asset” and “a sovereign asset”.

Generative AI has the potential to extract clinical value from linked datasets in the various secure data environments within the NHS and to deliver a step change in patient care. It also has the potential to deliver economic value, as the application of AI models to these rich, multimodal datasets will lead to innovative software products being developed for early diagnosis and personalised treatment.

However, it seems that the rush to generate economic value is preceding the establishment of a transparent licensing system, as in proposed new subsection (3), and the setting up of a coherent business model, as in proposed new subsection (6). As my noble friend Lady Kidron pointed out, the provisions in this amendment are urgently needed, especially as the chief data and analytics officer at NHS England is reported as having said, at a recent event organised by the Health Service Journal and IBM, that the national federated data platform will soon be used to train different types of AI model. The two models mentioned in the speech were OpenAI’s proprietary ChatGPT model and Google’s medical AI, which is based on its proprietary large language model, Gemini. So, the patient data in the national federated data platform being built by Palantir, which is a US company, is, in effect, being made available to fine-tune large language models pretrained by OpenAI and Google—two big US tech companies.

As a recent editorial in the British Medical Journal argued:

“This risks leaving the NHS vulnerable to exploitation by private technology companies whose offers to ‘assist’ with infrastructure development could result in loss of control over valuable public assets”.


It is vital for the health of the UK public sector that there is no loss of control resulting from premature agreements with big tech companies. These US companies seek privileged access to highly valuable assets which consist of personal data collected from UK citizens. The Government must, as a high priority, determine the rules for access to these sovereign data assets along the lines outlined in this amendment. I urge the Minister to take on board both the aims and the practicalities of this amendment before any damaging loss of control.

Lord Freyberg Portrait Lord Freyberg (CB)
- Hansard - - - Excerpts

My Lords, I support Amendment 211 moved by my noble friend Lady Kidron, which builds on earlier contributions in this place made by the noble Lords, Lord Mitchell, Lord Stevenson, Lord Clement-Jones, and myself, as long ago as 2018, about the need to maximise the social, economic and environmental value that may be derived from personal data of national significance and, in particular, data controlled by our NHS.

The proposed definition of “sovereign data assets” is, in some sense, broad. However, the intent to recognise, protect and maximise their value in the public interest is readily inferred. The call for a transparent licensing regime to provide access to such assets and the mention of preferential access for individuals and organisations headquartered in the UK also make good sense, as the overarching aim is to build and maintain public trust in third-party data usage.

Crucially, I fully support provisions that would require the Secretary of State to report on the value and anticipated financial return from sovereign data assets. Identifying a public body that considered itself able or willing to guarantee value for money proved challenging when this topic was last explored. For too long, past Governments have dithered and delayed over the introduction of provisions that explicitly recognise the need to account for and safeguard the investment made by taxpayers in data held by public and arm’s-length institutions and associated data infrastructure—something that we do as a matter of course where the tangible assets that the National Audit Office monitors and reports on are concerned.

In recent weeks, the Chancellor of the Exchequer has emphasised the importance of recovering public funds “lost” during the Covid-19 pandemic. Yet this focus raises important questions about other potential revenue streams that were overlooked, particularly regarding NHS data assets. In 2019, Ernst & Young estimated that a curated NHS dataset could generate up to £5 billion annually for the UK while also delivering £4.6 billion in yearly patient benefits through improved data infrastructure. This raises the question: who is tracking whether these substantial economic and healthcare opportunities are being realised? Who is ensuring that these projected benefits—both financial and clinical—are actually flowing back into our healthcare system?

As we enter the age of AI, public discourse often fixates on potential risks while overlooking a crucial opportunity—namely, the rapidly increasing value of publicly controlled data and its potential to drive innovation and insights. This raises two crucial questions. First, how might we capitalise on the upside of this technological revolution to maximise the benefits on behalf of the public? Secondly, and more specifically, how will Parliament effectively scrutinise any eventual trade deal entered into with, for example, the United States of America, which might focus on a more limited digital chapter, in the absence of either an accepted valuation methodology or a transparent licensing system for use in providing access to valuable UK data assets?

Will the public, faced with a significant tax burden to improve public services and repeated reminders of the potential for data and technology to transform our NHS, trust the Government if they enable valuable digital assets to be stripped today only to be turned tomorrow into cutting-edge treatments that we can ill afford to purchase and that benefit companies paying taxes overseas? To my mind, there remains a very real risk that the UK, as my noble friend Lady Kidron rightly stated, will inadvertently give away potentially valuable digital assets without appropriate safeguards in place. I therefore welcome the intent of Amendment 211 to put that right in the public interest.

20:15
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, having a system such as this would really focus the public sector on how we can generate more datasets. As I said earlier, education is an obvious one, but so is mobile phone data. All these companies have their licences. If a condition of the licence was that the data on how people move around the UK became a public asset, that would be hugely beneficial to policy formation. If we really understood how, why and when people move, we would make much better decisions. We could save ourselves huge amounts of money. We really ought to have this as a deep focus of government policy.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I have far too little time to do justice to this subject. We on these Benches welcome this amendment. It is entirely consistent with the sovereign health fund proposed by Future Care Capital and, indeed, with the proposals from the Tony Blair Institute for Global Change on a similar concept called the national data trust. Indeed, this concept formed part of our Liberal Democrat manifesto at the last general election, so of course I support the amendment.

It would be very useful to hear more about the national data library, including on its purpose and operation, as the noble Baroness, Lady Kidron, said. I entirely agree with her that there is a great need for a sovereign cloud service or services. Indeed, the inability to guarantee that data on the cloud is held in this country is a real issue that has not yet been properly addressed.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank the noble Baroness, Lady Kidron, for moving this amendment. As she rightly identified, the UK has a number of publicly held data assets, many of which contain extremely valuable information. This data—I flag, by way of an example, NHS data specifically—could be extremely valuable to certain organisations, such as pharmaceutical companies.

We are drawn to the idea of licensing such data—indeed, we believe that we could charge an extremely good price—but we have a number of concerns. Most notably, what additional safeguards would be required, given its sensitivity? What would be the limits and extent of the licensing agreement? Would this status close off other routes to monetising the data? Would other public sector bodies be able to use the data for free? Can this not already be done without the amendment?

Although His Majesty’s Official Opposition of course recognise the wish to ensure that the UK taxpayer gets a fair return on our information assets held by public bodies and arm’s-length organisations, and we certainly agree that we need to look at licensing, we are not yet sure that this amendment is either necessary or sufficient. We once again thank the noble Baroness, Lady Kidron, for moving it. We look forward to hearing both her and the Minister’s thoughts on the matter.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am grateful to the noble Baroness, Lady Kidron, for her amendment. I agree with her that the public sector has a wealth of data assets that could be used to help our society achieve our missions and contribute to economic growth.

Further to my previous comments on the national data library, the Government’s recent Green Paper, Invest 2035: The UK’s Modern Industrial Strategy, makes it clear that we consider data access part of the modern business environment, so improving data access is integral to the UK’s approach to growth. However, we also recognise the value of our data assets as part of this approach. At the same time, it is critical that we use our data assets in a trustworthy and ethical way, as the noble Baroness, Lady Kidron, and the noble Lord, Lord Tarassenko, said, so we must tackle these issues carefully.

This is an active area of policy development for the Government, and we need to get it right. I must therefore ask the noble Baroness to withdraw her amendment. However, she has provoked a debate that will, I hope, carry on; we would be happy to engage in that debate going forward.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank all speakers, in particular my noble friend Lord Tarassenko for his perspective. I am very happy to discuss this matter and let the Official Opposition know that this is a route to something more substantive to which they can agree. I beg leave to withdraw my amendment.

Amendment 211 withdrawn.
Amendment 211A not moved.
Lord Russell of Liverpool Portrait The Deputy Chairman of Committees (Lord Russell of Liverpool) (CB)
- Hansard - - - Excerpts

My Lords, before we move on to the next group, I again remind noble Lords that we have in fact only two groups to get through because Amendment 212 will not be moved. We have about 25 minutes to get through those two groups.

Amendment 211B

Moved by
211B: After Clause 132, insert the following new Clause—
“Consultation: data centre power usageOn the day on which this Act is passed, the Secretary of State must launch a consultation on the implications of the provisions in this Act for the power usage and energy efficiency of data centres.”
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The purpose of this group of amendments is to bring increased focus to the range of issues they touch on.

I turn to Amendment 211B first. It seems at least curious to have a data Bill that does not talk about data centres in terms of their power usage, their environmental impact and the Government’s view of the current PUE (power usage effectiveness) standard. Do the Government consider that it gives the right measure of confidence to consumers and citizens across the country about how data centres are being operated and about their impacts?
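
For context, PUE is the standard industry metric for data centre energy efficiency: the ratio of a facility’s total energy draw to the energy actually delivered to its IT equipment, so that 1.0 is the theoretical ideal. A minimal sketch of the calculation, using purely hypothetical figures, is:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by the
    energy delivered to IT equipment. 1.0 is the theoretical ideal;
    published figures for modern facilities typically fall between
    roughly 1.1 and 1.6."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a facility drawing 12 GWh a year whose IT
# equipment accounts for 10 GWh of that has a PUE of 1.2.
print(pue(12_000_000, 10_000_000))  # 1.2
```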

Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject, but you have to go only one or two steps back in any supply chain to run into considerable opacity. With this amendment, I am seeking to gain more clarity on data supply chains and the role of data across all supply chains. Through the combination of data and AI, we could potentially transform our supply chains in real time. That would give us so much more flexibility in pursuing both economic and environmental benefits. I look forward to the Minister’s response.

I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into standards and approaches in this area, which could go some way towards forming conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.

I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.

Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on them. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes the actions of the Government seem to run contrary to that. I support the noble Lord, Lord Holmes, in his call for essentially realising the benefits of AI while making sure that we maintain public trust.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank my noble friend Lord Holmes of Richmond for tabling this amendment. As we all appreciate, taking stock of the effects of legislation is critical, as it allows us to see what has worked and what has not. Amendment 211B would require the Secretary of State to launch a consultation into the implications of the provisions of the Bill on the power usage and energy efficiency of data centres. His Majesty’s Official Opposition have no objection to the amendment’s aims, but we wonder to what extent it is actually possible. By what means or benchmark can we identify whether a spike in energy usage is specifically due to a provision of this legislation, rather than the result of some other factor? I should be most grateful if my noble friend could provide further detail on this matter in his closing speech.

Regarding Amendment 211C, we understand that much could be learned from a review of all data regulations and standards pertaining to the supply chains for financial, trade, and legal documents and products, although we wonder if this needs to happen the moment this Bill passes. Could this review not happen at any stage? By all means, let us do it sooner rather than later, but is it necessary to set a date in statute?

Moving on to Amendment 211D, we should certainly look to regulate the AI large language model sector to ensure that there are standards for the input and output of data for LLMs. However, this must be done in a way that does not stifle growth in this emerging industry.

Finally, we have some concerns about Amendment 211E. A national consultation on the use of individuals’ data is perhaps just too broad.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am grateful to the noble Lord, Lord Holmes, for tabling Amendment 211B and his other amendments in this group, which are on a range of varied and important issues. Given the hour, I hope he will be content if I promise to write to him on each of these issues and, in the meantime, ask him to withdraw the amendment.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

I thank all noble Lords who participated: I will not go through them by name. I thank the Minister for her response and would very much welcome a letter. I am happy to meet her on all these subjects but, for now, I beg leave to withdraw the amendment.

Amendment 211B withdrawn.
Amendments 211C to 211E not moved.
Amendment 211F
Moved by
211F: After Clause 132, insert the following new Clause—
“Local Environmental Records Centres (“LERCs”)(1) Any planning application involving biodiversity net gain must include a data search report from the relevant Local Environmental Records Centre (LERC), and all data from biodiversity surveys conducted in connection with the application must be contributed free of charge to the LERC in record-centre-ready format.(2) All government departments and governmental organisations, local and national, that collect biodiversity data for whatever reason, must contribute it free of charge to the relevant LERCs in record-centre-ready format, and must include relevant LERC data in formulating policy and operational plans.”Member’s explanatory statement
This amendment ensures that all the biodiversity data collected by or in connection with government is collected in Local Environmental Records Centres, so records are as good as possible, and that that data is then used by or in connection with government so that data is put to the best possible use.
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, environmental data, specifically such things as biodiversity data, is a key component to getting policy in this area right. To do so, we need to make sure that all the good data we are generating around the UK gets into our storage system, and that the best possible and most complete data is used whenever we make decisions.

We currently run that through a system of local environmental records centres that are independent and not for profit. Since that is the system we have, it ought to be run right. At the moment, we are failing to capture a lot of quality data because the data is not coming in from the planning system, or from other similar functions, in the way that it should. We are not consistently using that data in planning as we should. Natural England, which ought to be intimately linked into this system, has stepped away from it for budgetary reasons. The environment is important to us. If the Government are serious about that, we have to get our data collection and use system right. I beg to move.

20:30
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, listening to the noble Lord, Lord Lucas, is often an education, and today is no exception. I had no idea what local environmental records centres were, so I shall be very interested to hear what the Minister has to say in response.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank my noble friend Lord Lucas for tabling Amendment 211F and all noble Lords for their brief contributions to this group.

Amendment 211F would ensure that all the biodiversity data collected by or in connection with government is collected in local environmental records centres, so that records are as good as possible. That data would then be used by or in connection with government, so that it is put to the best possible use.

The importance of sufficient and high-quality record collection cannot and must not be overstated. With this in mind, His Majesty’s Official Opposition support the sentiment of the amendment in my noble friend’s name. These Benches will always champion matters related to biodiversity and nature recovery. In fact, many of my noble friends have raised concerns about biodiversity in Committee debates in your Lordships’ House on the Crown Estate Bill, the Water (Special Measures) Bill and the Great British Energy Bill. Indeed, they have tabled amendments to ensure that matters related to biodiversity appear at the forefront of draft legislation.

With that in mind, I am grateful to my noble friend Lord Lucas for introducing provisions, via Amendment 211F, which would require any planning application involving biodiversity net gain to include a data search report from the relevant local environmental records centre. I trust that the Minister has listened to the concerns raised collectively in the debate on this brief group. We must recognise the importance of good data collection and ensure that such data is used in the best possible way.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank the noble Lord, Lord Lucas, for his Amendment 211F. I absolutely agree that local environmental records centres provide an important service. I reassure noble Lords that the Government’s digital planning programme is developing data standards and tools to increase the availability, accessibility and usability of planning data. This will transform people’s experience of planning and housing, including through local environmental records centres. On that basis, I must ask the noble Lord whether he is prepared to withdraw his amendment.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I am grateful for that extensive answer from the Minister. If there is anything further that I hope she might add, I will write to her afterwards.

My heart is always in the cause of making sure that the Government get their business done on time every time, and that we finish Committee stages when they ask, as doubtless they will discover with some of the other Bills they have in this Session. For now, I beg leave to withdraw my amendment.

Amendment 211F withdrawn.
Amendments 211G and 211H not moved.
Clause 133: Power to make consequential amendments
Amendment 212 not moved.
Clause 133 agreed.
Clause 134 agreed.
Clause 135: Extent
Amendments 213 and 214
Moved by
213: Clause 135, page 168, line 26, at end insert—
“(5A) The power conferred by section 63(3) of the Immigration, Asylum and Nationality Act 2006 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment made by section 55 of this Act of any part of that Act (with or without modification or adaptation).(5B) The power conferred by section 76(6) of the Immigration Act 2014 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment made by section 55 of this Act of any part of that Act (with or without modifications). (5C) The power conferred by section 95(5) of the Immigration Act 2016 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment made by section 55 of this Act of any part of that Act (with or without modifications).”Member's explanatory statement
The immigration legislation amended by Clause 55 may be extended to the Channel Islands or the Isle of Man. This amendment provides that the amendments made by Clause 55 may be extended to the Bailiwick of Guernsey or the Isle of Man.
214: Clause 135, page 168, line 26, at end insert—
“(5A) The power conferred by section 239(7) of the Online Safety Act 2023 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment or repeal made by this Act of any part of that Act (with or without modifications).”Member's explanatory statement
This amendment provides that amendments of the Online Safety Act 2023 made by the Bill (see Clauses 122 and 123) may, like the other provisions of that Act, be extended to the Bailiwick of Guernsey or the Isle of Man.
Amendments 213 and 214 agreed.
Clause 135, as amended, agreed.
Clauses 136 to 138 agreed.
Lord Russell of Liverpool Portrait The Deputy Chairman of Committees (Lord Russell of Liverpool) (CB)
- Hansard - - - Excerpts

That concludes the Committee’s proceedings on the Bill. I thank all noble Lords who have participated for being so co-operative.

Bill reported with amendments.
Committee adjourned at 8.35 pm.