All 8 contributions to the Data (Use and Access) Bill [HL] 2024-26


Wed 23rd Oct 2024: 1st reading (Lords Chamber)
Tue 19th Nov 2024: 2nd reading (Lords Chamber)
Tue 3rd Dec 2024: Committee stage (Grand Committee)
Tue 10th Dec 2024: Committee stage (Grand Committee)
Mon 16th Dec 2024: Committee stage (Grand Committee)
Wed 18th Dec 2024: Committee stage (Grand Committee)
Tue 21st Jan 2025: Report stage, Part 1 (Lords Chamber)
Tue 21st Jan 2025: Report stage (Lords Chamber)
Data (Use and Access) Bill [HL]

1st reading
Wednesday 23rd October 2024

Lords Chamber
First Reading
15:50
A Bill to make provision about access to customer data and business data; to make provision about services consisting of the use of information to ascertain and verify facts about individuals; to make provision about the recording and sharing, and keeping of registers, of information relating to apparatus in streets; to make provision about the keeping and maintenance of registers of births and deaths; to make provision for the regulation of the processing of information relating to identified or identifiable living individuals; to make provision about privacy and electronic communications; to establish the Information Commission; to make provision about information standards for health and social care; to make provision about the grant of smart meter communication licences; to make provision about the disclosure of information to improve public service delivery; to make provision about the retention of information by providers of internet services in connection with investigations into child deaths; to make provision about providing information for purposes related to the carrying out of independent research into online safety matters; to make provision about the retention of biometric data; to make provision about services for the provision of electronic signatures, electronic seals and other trust services; and for connected purposes.
The Bill was introduced by Baroness Jones of Whitchurch, read a first time and ordered to be printed.

Data (Use and Access) Bill [HL]

2nd reading
Tuesday 19th November 2024

Lords Chamber
Second Reading
17:02
Moved by
Baroness Jones of Whitchurch

That the Bill be now read a second time.

Relevant document: 3rd Report from the Constitution Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.

The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, data is the DNA of modern life. It is integral to almost every aspect of our society and economy, from NHS treatments and bank transactions to social interactions. An estimated 85% of UK businesses handle some form of digital data, and the UK data economy is estimated to represent 6.9% of UK GDP. Data-enabled UK service exports account for 85% of total service exports, worth an estimated £259 billion. Yet data use currently drives productivity gains of only around 0.12%, the equivalent of about one minute per worker per day.

We can do much more to drive productivity through data. That is why the Government are presenting the Data (Use and Access) Bill today, to harness the power of data to drive economic growth, support modern digital government and improve people’s lives. The Bill is forecast to generate £10 billion over 10 years, to underpin the Prime Minister’s missions and to fulfil several manifesto commitments; most importantly, it will help everyday processes for people, business and our public services.

The Bill has eight parts, which I will speak to in order. Before I start, I recognise that noble Lords have debated data legislation over a number of years, and many measures in the Bill will be familiar to them, as they are to me. I pay particular tribute to the noble Viscount, Lord Camrose, for his work on these measures in the past. That said, the Government and I have carefully considered the measures to be taken forward in this Bill, and noble Lords will notice several important changes that make the Bill more focused, more balanced and better able to achieve its objectives.

The first three parts are focused on growing the economy. First, we will create the right conditions to set up future smart data schemes. These models allow consumers and businesses to safely share information about themselves with authorised third parties, which can then in turn offer innovative uses, such as personalised market comparisons and financial advice. This measure, which is also a manifesto commitment, will cut costs, give people greater consumer choice and deliver economic benefit. In September this year, more than 11 million people—one in six of the UK population—were already making use of open banking services.

In Part 2, the Bill will legislate on digital verification services, meaning that organisations will be able to receive a trust mark if they are approved as meeting the stringent requirements in the trust framework and appear on the government register. As well as increasing trust in the market, these efficiency gains are expected to boost the UK economy by £4.3 billion over the next decade by doing things such as reducing the time spent completing checks to hire new workers from days to minutes.

Part 3, on the national underground asset register, or NUAR, will place this comprehensive digital map of the underground pipes and cables on a statutory footing. The measures mandate that owners of underground infrastructure, such as water companies or telecoms operators, register their assets on NUAR. This will deliver more than £400 million per year through more efficient data sharing, reduced accidents and delays, and improved worker safety. The proposed measures will also allow this data to be used for additional prescribed use cases, such as improved street work co-ordination, where commercial and national security considerations allow.

Part 4 relates to the format of the registers of births and deaths, allowing for the first time the possibility of digital registration.

Part 5 is specifically about data protection and privacy, although I stress that this Government are committed to the strongest data privacy protections throughout the Bill. This part of the Bill is the one that the Government and I have most thoroughly revisited. Our objective has been to address the current lack of clarity that impedes the safe development and responsible deployment of new technologies.

We have removed previous measures watering down the accountability framework, along with other measures that risked protections. Since the Bill’s introduction I have spoken to members of industry, civil society and the research community about this, as well as some noble Lords here today, and I am glad to note that these changes have been broadly welcomed. In this context, I would like to say something about AI, which will undoubtedly have a vital role to play in growing the UK’s economy and transforming its public services. This will include the responsible and safe use of solely automated decision-making. However, the rules in Article 22 of the UK GDPR are unclear, which holds us back. Organisations are not confident about when they can make solely automated decisions, nor about what safeguards apply and when. We suffer when this leads to hollow attempts at token human involvement to try to move the goalposts.

The Bill will fix these issues. It sets out the safeguards much more clearly: you will have the right to be told about a decision, the right to human intervention, and the right to make representations about it. It specifically provides that human involvement must be meaningful or else it does not count. This—alongside clearer safeguards, the restored accountability framework, and a modernised information commission—will help us strike the right balance between the benefits of this technology being available in more circumstances, and public trust and protection.
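To make those three safeguards concrete, here is a minimal illustrative sketch in Python of a decision record embodying them. All names are hypothetical; this is not the Bill's text nor any real system.

```python
# Hypothetical sketch of the three safeguards for solely automated
# decisions: the right to be told, the right to human intervention,
# and the right to make representations. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class AutomatedDecision:
    subject_id: str
    outcome: str
    notified: bool = False                                     # right to be told
    representations: list[str] = field(default_factory=list)  # right to contest
    reviewer: str | None = None                                # human intervention

    def notify_subject(self) -> None:
        # Safeguard 1: the data subject is told that a solely
        # automated decision has been made about them.
        self.notified = True

    def add_representation(self, text: str) -> None:
        # Safeguard 3: the data subject may make representations
        # about the decision.
        self.representations.append(text)

    def human_review(self, reviewer: str, new_outcome: str) -> None:
        # Safeguard 2: "meaningful" human involvement implies the
        # reviewer can actually change the outcome, not rubber-stamp it.
        self.reviewer = reviewer
        self.outcome = new_outcome
```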

Part 6 is on the regulator: the new information commission. This is a new-look regulator—modernised, with clear strategic direction and stronger powers, and still independent. We will bring the information commission in line with regulatory best practice, increase accountability, and enable greater transparency for organisations and the public. It will be empowered to engage effectively with the increasingly complex opportunities and challenges we see in the use of personal data, as well as to ensure high data protection standards and increased public trust.

The Government have worked closely with the ICO on these reforms, and the commissioner noted in his response to the Bill that these changes

“will significantly improve the ICO’s ability to function effectively”

and the

“refreshed governance arrangements will maintain our independence and enhance our accountability”.

Part 7 includes other provisions about the use of or access to data. Clauses on NHS information standards will create consistency across IT systems to enable data sharing. This is a positive step in driving up efficiency in our NHS and will save 140,000 hours of staff time a year. These measures will also improve patient safety; for example, by allowing authorised medical staff to access patient data to provide care in emergencies.

There is a new, fairly technical measure on smart meters, which will provide the Gas and Electricity Markets Authority with flexibility to determine the best process to follow in appointing the successor smart meter communication licensee. These clauses will ensure that the authority is able to appoint a successor in a timely and efficient way that is in the best interests of energy consumers.

Part 7 also includes measures on online safety research, laying the groundwork for crucial research into online harms to help us learn and adapt, to keep the internet safe. This is in addition to measures on data preservation notices to help coroners, or procurators fiscal in Scotland, investigate how online platform use may have contributed to the tragic death of a child. I thank the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, for their campaigning on these important issues, which we supported in opposition. I am pleased to be able to deliver these measures early in the new Parliament.

Finally, Part 8 includes standard final provisions.

As noble Lords can probably tell from the length of that list, this is quite a wide-ranging Bill. However, I hope they will agree that the focus—on growing the economy, supporting modern, digital government, and improving lives—is a lot clearer. In summary, I have three main points to encourage the swift passage of the Bill through the House.

First, I have worked very closely with noble Lords across the House on a number of these measures over the years. I am glad to have been able to make the necessary changes to the legislation in response to our shared concerns. Secondly, we are very keen to implement these changes as soon as possible for our stakeholders—the ICO, business, and the research community, to name but a few—which have all been waiting patiently to see the benefits these reforms will bring. Thirdly and most importantly, the measures in the Bill will make a material, positive difference to people’s lives.

I hope noble Lords will work with me to pass the Bill and ensure that these reforms can bring real benefits to our economy and public services and the UK public. I beg to move.

17:13
Lord Markham (Con)

My Lords, I welcome the opportunity to speak on this important matter. I am especially grateful to the noble Baroness, Lady Jones of Whitchurch, for her engagement so far with me and my noble friend Lord Camrose, which has been truly helpful with this technically complex Bill. I thank in advance the other speakers and keenly look forward to hearing their—I hope—lucid and insightful commentary on the content of the Bill.

This is a wide-ranging Bill affecting a broad sweep of policy issues. If well executed, it could bring substantial benefits to individuals and businesses, much like the previous Conservative Government’s Bill, which was so ably championed by my noble friend Lord Camrose. However, if poorly executed, the Bill may result in data vulnerabilities for both individuals and the country as a whole.

We on these Benches are delighted that the Government are taking forward the bulk of the provisions and concepts set out by the previous Conservative Government, in particular the introduction of a national underground asset register, which will make construction and repairs more efficient, reduce the cost and inconvenience of roadworks and, most importantly, make work safer for construction workers; giving Ofcom the ability, when notified by the coroner, to demand that online service providers retain data in the event of a child’s death; reforming and modernising the Information Commissioner’s Office; introducing a centralised digital ID verification framework; allowing law enforcement bodies to make greater use of biometric data for counterterrorism purposes; and—particularly close to my heart—setting data standards in areas such as health to allow the sharing of data and its use for research and AI. All these provisions are necessary and will provide tangible benefits to the people of the UK. Indeed, the data economy is crucial. As the Government have rightly said, it has the potential to add billions in value to the UK economy through opportunities and efficiencies.

Therefore, noble Lords will find in us a largely supportive Opposition on this Bill, working constructively to get it through Parliament. However, we on these Benches have some concerns about the Bill, and I am keen to hear the Minister’s response on a number of points.

Many small and medium-sized enterprises control low-risk personal data. Although they must of course take careful measures to manage customer data, many simply do not have the resources to hire or train a data protection officer, particularly after the Government’s recent decision to increase the burden of employers’ National Insurance. We have concerns that the Bill will disproportionately add to the weight of the requirements on those businesses, without greatly advancing the cause of personal privacy. We should free those SMEs—the bedrock of our economy—from following the same demanding data protection requirements as larger, better-resourced enterprises, which carry far greater risks to personal privacy. We need to allow them to concentrate on running a profitable business rather than jumping through myriad bureaucratic data protection hoops over data that, in many cases, presents little risk to privacy. In short, we need those businesses to be wisely careful but not necessarily hyper-careful.

Many of the Bill’s clauses allow or require mass digitisation of data by the public sector, such as the registers of births and deaths. These measures will improve efficiency and, therefore, save money—something that I think we can all agree is necessary. However, the more data is digitised, the more we present a tempting attack surface to hackers—thieves who steal data for profit by sale on the dark web, ransom or both. Do the Government intend to bring forward legislation that will set out improved cybersecurity requirements for public bodies that will, because of the Bill, rapidly digitise their datasets? Furthermore, if the Government intend to bring forward additional cybersecurity measures, when do they intend to do so? Any time lag leaves public bodies, and the public’s data that they control, vulnerable to those with malicious intent.

Building on this point, the Bill will also see a rapid increase in the digitisation and sharing of high-risk data across the public and private sectors. There will, for example, be an increase in high-risk data sharing between the NHS and adult social care providers in communities, and a range of private sector companies handling identification data to create a digital ID. Again, the more high-risk data is used, transferred or created, the greater the incentive for hackers to target organisations and bodies. Therefore, I must ask the Minister whether and when the Government intend to bring forward additional cybersecurity measures for public bodies, large businesses, and the minority of SMEs that will handle high-risk data.

Introducing a national underground asset register, or NUAR, will lead to significant benefits for people and developers alike. It will substantially reduce the risk of striking underground infrastructure during development or repairs. This will not only speed up developments and repairs but reduce costs and the risks posed to construction workers. However, having a centralised register of all underground assets, including critical infrastructure, may result in a heightened terror risk. I know this Government, like the previous one, will have devoted considerable thought to this grave risk, and I hope the Minister will set out some of the Government’s approach to mitigating it. In short, how do they intend to ensure the security of NUAR so that there can be no possibility of unauthorised access to our critical infrastructure?

We on this Bench support the Government’s position on automated decision-making, or ADM. It can rapidly increase the speed at which commercial decisions are taken, thus resulting in an increase in sales and profit, improvements to productivity and a better customer experience. AI will be the key underlying technology of almost all ADM. The vast quantity of data and the unfathomable complexity of the algorithms mean that we have to address the AI risks of bias, unfairness, inaccuracy and loss of human agency. Therefore, I think it is wise that we consider amending this Bill to put some of the use of AI in this context on a statutory footing. I hope that the Minister will share the Government’s thoughts on this matter, and I am confident that colleagues across the House will have strong views too.

I end by outlining the opportunities for setting standards for health data. As Health Minister, I would often wax lyrical on how we have the best data in the world, with our ability to link primary and secondary care data with genomic, optical and myriad other data sources going back decades. Add to this the large heterogeneous population and you have, without doubt, the best source of health data in the world. I firmly believe that by setting the data standards we can build in the UK the foundations for a Silicon Valley for the life sciences, which would be a massive benefit to patients, the NHS and the UK economy overall.

We on this Bench largely welcome the Bill, not least because it retains many of the concepts from the previous Conservative Government’s Bill. However, there are important matters that deserve our attention. I look forward to hearing today the views of noble Lords across the House to enable the productive passage of the Bill.

17:21
Baroness Kidron (CB)

My Lords, I declare my interests as chair of the 5Rights Foundation and as an adviser to the Institute for Ethics in AI at Oxford.

I start by expressing my support for the removal of some of the egregious aspects of the last Bill that we battled over, and by welcoming the inclusion of access to data for researchers—although I believe there are some details to discuss. I am extremely pleased finally to see provisions for the coroner’s access to data in cases where a child has died. On that alone, I wish the Bill swift passage.

However, a Bill being less egregious is not sufficient on a subject fundamental to the future of UK society and its prosperity. I want to use my time this afternoon to ask how the Government will not simply make available, but secure the full value of, the UK’s unique datasets; why they do not fully make the UK AI-ready; and why proposals that they did not support in opposition have been included and amendments that they did support have been left out.

We have a unique opportunity, as the noble Lord, Lord Markham, just described, with unique publicly held datasets, such as the NHS’s. At a moment at which the LLMs and LMMs that will power our global future are being built and trained, these datasets hold significant value. Just as Britain’s coal reserves fuelled global industrial transformation, our data reserves could have a significant role to play in powering the AI transformation.

However, we are already giving away access to national data assets, primarily to a handful of US-based tech companies that will make billions selling the products and services built upon them. That creates the spectre of having to buy back drugs and medical innovations that simply would not have been possible without the incalculably valuable data. Reimagining and reframing publicly held data as a sovereign asset accessed under licence, protected and managed by the Government acting as custodian on behalf of UK citizens, could provide direct financial participation for the UK in the products and services built and trained on its data. It could give UK-headquartered innovators and researchers privileged access to nationally held data sets, or to investing in small and medium-sized specialist LLMs, which we will debate later in the week. Importantly, it would not simply monetise UK data but give the UK a seat at the table when setting the conditions for use of that data. What plans do the Government have to protect and value publicly held data in a way that maximises its long-term value and the values of the UK?

Similarly, the smart data schemes in the Bill do not appear to extend the rights of individual data holders to use their data in productive and creative ways. The Minister will recall an amendment to the previous data Bill, based on the work of associate professor Reuben Binns, that sought to give individuals the ability to assign their data rights to a third party for agreed purposes. The power of data is fully realised only when it is combined. Creating communal rights for UK data subjects could create social and economic opportunities for communities and smaller challenger businesses. Again, this is a missed opportunity to support the Government’s growth agenda.

My second point is that the Bill fails to tackle present-day or anticipated uses of data by AI. My understanding is that the AI Bill is to be delayed until the Government understand the requirements of the new American Administration. That is concerning on many levels, so perhaps the Minister can say something about that when she winds up. Whatever the timing, since data is, as the Minister said, in the DNA of AI infrastructure, why does the Bill so spectacularly fail to ensure that our data laws are AI-ready? As the News Media Association says, the Bill is silent on the most pressing data policy issue of our time: namely, that the unlicensed use of data created by the media and broader creative industries by AI developers represents IP theft on a mass scale.

Meanwhile, a single-sentence petition that says,

“The unlicensed use of creative works for training generative AI is a major, unjust threat to the livelihoods of the people behind those works, and must not be permitted”,


has been signed by nearly 36,000 organisations and individuals from the creative community. This issue was the subject of a cross-party amendment to which Labour put its name, which would have put the voluntary web standards represented by the robots.txt protocol on a mandatory opt-in basis—likely only one of several amendments needed to ensure that web indexing does not become a proxy for theft. In 2022, it was estimated that the UK creative industries generated £126 billion in gross value added to the economy and employed 2.4 million people. Given their importance to our economy, our sense of identity and our soft power, why do we have a data Bill that is silent on data scraping?
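For readers unfamiliar with it, robots.txt is the plain-text file a website publishes to tell crawlers what they may fetch. The following is a minimal illustrative example of today's voluntary opt-out model, which the amendment would have inverted into an opt-in; GPTBot is one real AI-crawler token, and compliance with any of these directives is not legally required.

```
# Illustrative robots.txt: under the current voluntary protocol a
# publisher must opt out of each AI crawler individually.
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
```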

In my own area of particular concern, the Bill does not address the impact of generative AI on the lives and rights of children. For example, instead of continuing to allow tech companies to use pupil data to build unproven edtech products based on drill-and-practice learning models—which in any other form is a return to Victorian rote learning but with better graphics—the Bill could and should introduce a requirement for evidence-based, pedagogically sound paradigms that support teachers and pupils. In the recently announced scheme to give edtech companies access to pupil data, I could not see details about privacy, quality assurance or how the DfE intends to benefit from these commercial ventures which could, as in my previous NHS example, end with schools or the DfE having to buy back access to products built on UK pupil data. There is a quality issue, a safety issue and an ongoing privacy issue in our schools, and yet nothing in the Bill.

The noble Baroness and I met to discuss the need to tackle AI-generated sexual abuse, so I will say only that each day that it is legal to train AI models to create child sexual abuse material brings incalculable harm. On 22 May, specialist enforcement officers and I, along with the noble Viscount, Lord Camrose, were promised that the ink was almost dry on a new criminal offence. It cannot be that what was possible on that day now needs many months of further drafting. The Government must bring forward in this Bill the offence of possessing, sharing, creating or distributing an AI file that is trained on or trained to create CSAM, because this Bill is the first possible vehicle to do so. Getting this on the books is a question of conscience.

My third and final point is that the Bill retains some of the deregulatory aspects of its predecessor, while simultaneously missing the opportunity of updating data law to be fit for today. For example, the Bill extends research exemptions in the GDPR to

“any research that can reasonably be described as scientific”,

including commercial research. The Oxford English Dictionary says that “science” is

“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”.


Could the Minister tell the House what is excluded? If a company instructs its data scientists and computing engineers to develop a new AI system of any kind, whether a tracking app for sport or a bot for an airline, is that scientific research? If their behavioural scientists are testing children’s response to persuasive design strategies to extend the stickiness of their products, is that scientific research? If the answer to these questions is yes, then this is simply an invitation to tech companies to circumvent privacy protections at scale.

I hope the noble Baroness will forgive me for saying that it will be insufficient to suggest that this is just tidying up the recitals of the GDPR. Recital 159 was deemed so inadequate that the European Data Protection Supervisor formally published the following opinion:

“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.

I have yet to see that the Government’s proposal reflects this critical clarification, so I ask for some reassurance, and I query how the Government intend to account for the fact that putting a recital on the face of the Bill changes its status.

In the interests of time, I will put on the record that I have a similar set of issues about secondary processing, recognised legitimate interests, the weakening of purpose limitation, automated decision-making protections and the Secretary of State’s power to add to the list of special category data per Clause 74. These concerns are shared variously by the ODI, the Ada Lovelace Institute, the Law Society, Big Brother Watch, Defend Digital Me, 5Rights, Connected by Data and others. Collectively, these measures look like the Government are paving a runway for tech access to the private data of UK citizens or, as the Secretary of State for DSIT suggested in his interview in the Times last Tuesday, that the Government no longer think it is possible to regulate tech giants at all.

I note the inclusion of a general duty on the ICO to consider the needs of children, but it is a poor substitute for giving children wholesale protection from any downgrading of their existing data rights and protections, especially given the unprecedented obligations on the ICO to support innovation and stimulate growth. As the Ada Lovelace Institute said,

“writing additional pro-innovation duties into the face of the law … places them on an almost equivalent footing to protecting data subjects”.

I am not sure who thinks that tech needs protection from individual data rights holders, particularly children, but unlike my earlier suggestion that we protect our sovereign data assets for the benefit of UK plc, the potential riches of these deregulation measures disproportionately accrue to Silicon Valley. Why not use the Bill to identify and fix the barriers the ICO faces in enforcing the AADC, the age-appropriate design code? Why not use it to extend existing children’s privacy rights into educational settings, as many have campaigned for? Why not allow data subjects more freedom to share their data in creative ways? The Data (Use and Access) Bill has little in it for citizens and children.

Finally, but by no means least importantly, is the question of the reliability of computers. At col. GC 576 of Hansard on 24 April 2024, the full tragedy of the postmasters was set out by the noble Lord, Lord Arbuthnot, who is in his place and will say more. The notion that computers are reliable has devastated the lives of postmasters wrongly accused of fraud. The Minister yesterday, in answer to a question from the noble Lord, Lord Holmes, suggested that we should all be “more sceptical” in the face of computer evidence, but scepticism is not legally binding. The previous Government agreed to find a solution, albeit not a return to 1999. If the current Government fail to accept that challenge, they must shoulder responsibility for the further miscarriages of justice which will inevitably follow. I hope the noble Baroness will not simply say that the reliability of computers and the other issues raised are not for this Bill. If they are not, why not? Labour supported them in opposition. If not, then where and how will these urgent issues be addressed?

As I said at the outset, a better Bill is not a good Bill. I question why the Government did not wait a little longer to bring forward a Bill that made the UK AI ready, understood data as critical infrastructure and valued the UK’s sovereign data assets. It could have been a Bill that did more work in reaching out to the public to get their consent and understanding of positive use cases for publicly held data, while protecting their interests—whether as IP holders, communities that want to share data for their own public good or children who continue to suffer at the hands of corporate greed. My hope is that, as we go to Committee, the Government will come forward with the missing pieces. I believe there is a much more positive and productive piece of legislation to be had.

17:37
Lord Knight of Weymouth (Lab)

My Lords, I remind the House of my interests, particularly in chairing the board of Century-Tech, an AI edtech company—I will have to talk to the noble Baroness, Lady Kidron, about that. I am a director of Educate Ventures Research, which is also an AI-in-education business, and Goodnotes, an AI note-taking app. It is a pleasure to follow the noble Baroness, Lady Kidron. I agreed with most of what she said, and I look forward to working with her on the passage of the Bill.

I guess we are hoping it is third time lucky with a data Bill. I am sure we will hear from all speakers that this Bill is an improvement on the previous two attempts. It is a particular joy to see that terrible stuff around DWP data not appearing in this Bill. There is plenty that I welcome in terms of the improvements. Like most speakers, I imagine, I mostly want to talk about what might need further debate and what might be missing, rather than just congratulating my noble friend the Minister on the improvements she and her colleagues have been able to make.

I anticipate that this will not be the only Bill we have on data and AI. It would be really helpful for this Government to rediscover the joys of a White Paper. If we had a document that set out the whole story and the vision, so that we could more easily place this Bill in context, that would be really helpful. This could include where we are with the Matt Clifford action plan, and a very clear aim of data adequacy with the EU regime. I wonder whether, among all the people the Minister said she had been able to talk to about this Bill, she had also spoken to the EU to make sure we are moving in the right direction with adequacy, which has to be resolved by the summer.

Clearly, this is a Bill about data. The Minister said that data is the DNA of modern life. It has achieved a new prominence with the rollout of generative AI, which has captured everyone’s imagination—or entered their nightmares, depending on how you think about it. The Communications and Digital Committee, which I am privileged to serve on, has been thinking about that in respect of the future of news, which we will publish a report on shortly, and of scaling our businesses here in the UK. It is clear that the core ingredients you need are computing power, talent, finance and, of course, data, in order successfully to grow AI businesses here.

I agree with the noble Baroness, Lady Kidron, that we have a unique asset in our public sector datasets that the US does not have to anything like the same extent—in particular in health, but also in culture and education. It is really important that the Government have a regime, established by this legislation and any other legislation we may or may not know about, to protect and deploy that data to the public benefit and not just the private benefit, be it in large language models or other foundational models of whatever size.

It is then also important to ask, whose data is it? In my capacity as chair of a board of an AI company, I am struck by the fact that our current financial regulation does not allow us to list our data as an asset on our balance sheet. I wonder when we might be able to move in that direction, because it is clearly of some significance to these sorts of businesses. But it is also true that the data I share as a citizen, and have given consent to, should be my data. I should have the opportunity to get it back quite easily and to decide who to share it with, and it should empower me as a citizen. I should be able to hold my own data, and I definitely should not have to pay twice for it: I should not have to pay once through my taxes and then a second time by having to pay for a product that has been generated by the data that I paid for the first time. So I am also attracted to what the noble Baroness said about data as a sovereign asset.

In the same way that both Front-Bench speakers were excited about the national underground asset register, I am equally excited about the smart data provisions in the Bill, particularly in respect of the National Health Service. Unfortunately, my family have been intensive users of the National Health Service over the past year or so, and the extent to which the various elements of our NHS do not talk to each other in terms of data is a tragedy that costs lives and that we urgently need to resolve. If, as a result of this Bill, we can take the glorious way in which I can share my banking data with various platforms in order to benefit myself, and do the same with health data, that would be a really good win for us as a nation. Can the Minister reassure me that the same could be true for education? The opportunity to build digital credentials in education by using the same sort of technology that we use in open banking would also excite me.

I ask the Minister also to think about and deliver on a review of Tell Us Once, which, when I was a Minister in the DWP a long time ago, I was very happy to work on. By using Tell Us Once, on the bereavement of a relative, for example, you have to tell only one part of the public sector and that information then cascades across. That relieves you of an awful lot of difficult admin at a time of bereavement. We need a review to see how this is working and whether we can improve it, and to look at a universal service priority register for people going through bereavement in order to prioritise services that need to pass the message on.

I am concerned that we should have cross-sector open data standards and alignment with international interoperability standards. There is a danger in the Bill that the data-sharing provisions are protected within sectors, and I wonder whether we need some kind of authority to drive that.

It is important to clarify that the phrase used in the first part of the Bill, a

“person of a specified description”,

can include government departments and public bodies so that, for example, we can use those powers for smart data and net-zero initiatives. Incidentally, how will the Government ensure that the supply chains of transformers, processors, computing power and energy are in place to support AI development? How will we publish the environmental impact of that energy use for AI?

There is a lot more I could say, but time marches on. I could talk about digital verification services, direct marketing and a data consent regime, but those are all things to explore in Committee. However, there are two other things that I would briefly like to say before winding up. First, I have spoken before in this House about the number of people who are hired, managed and fired by AI automated decision-making. I fear that, under the Bill as drafted, those people may get a general explanation of how the automated decision-making algorithms are working, when in those circumstances they need a much more personalised explanation of why they have been impacted in this way. What is it about you, your socioeconomic status and the profile that has caused the decision to go the way it has?
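The distinction being drawn here might be illustrated with a hypothetical sketch (all names, fields and weights are invented): a general explanation describes the model as a whole, while a personalised one ranks the factors that drove this individual's outcome.

```python
# Hypothetical sketch contrasting a general explanation of an
# automated hiring model with a personalised, per-decision one.
def general_explanation() -> str:
    return ("Applications are ranked by a model trained on past hiring "
            "data, using experience, qualifications and assessment scores.")

def personalised_explanation(factors: dict[str, float]) -> str:
    # Rank this applicant's own attributes by how strongly each one
    # pushed the decision (sign shows direction, magnitude shows weight).
    ranked = sorted(factors.items(), key=lambda kv: abs(kv[1]), reverse=True)
    lines = [f"- {name}: {weight:+.2f}" for name, weight in ranked]
    return "Your outcome was most influenced by:\n" + "\n".join(lines)

print(personalised_explanation(
    {"years_experience": -0.40, "postcode_area": -0.25, "test_score": +0.10}
))
```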

Secondly, I am very interested in the role of the Digital Regulation Cooperation Forum in preventing abuse and finding regulatory gaps. I wonder whether, after the perennial calls in this Chamber when debating Bills such as this for a permanent Committee of both Houses to monitor digital regulation, the new Government have a view on that. I know that that is a matter for the usual channels and not Ministers, but it is a really important thing for this House to move on. I am fairly bored with making the case over the past two or three years.

In summary, this is a good Bill but it is a long Bill, and there is lots to do. I wish the Minister good luck with it.

17:46
Lord Thomas of Cwmgiedd (CB)

My Lords, I, too, welcome the Bill, but there is one matter we should have at the forefront of our minds as we work through it: that it must be implemented and carried through by SMEs and individuals. Regrettably—and I say this as a lawyer—lawyers have become far too expensive. We must appreciate the need to draft legislation and regulatory regimes that are as easy as possible to operate without the benefit of legal advice. If we cannot achieve that, it must be incumbent on the Government and the regulators to set out clearly what the position is, in a way that people can understand. We do not want our SMEs and individual traders to enter into operating under this new regime without being able to understand the law. I fear that this Bill, by its very length, is a good example of how we can overcomplicate things.

The second issue is the protection and transferability of data. The Minister, the noble Lord, Lord Markham, and the noble Baroness, Lady Kidron, have all spoken about the importance and value of data, its transferability and the need to balance correctly the protections and rights of the individual against the importance of being able to use it in research. I want to say a word about the contrasting positions we face in the transferability of data between us and the European Union, and the slightly more difficult and unpredictable situation that may arise between us and the United States. They are the same problem, but they may need addressing in different ways. On the first, I need to be slightly technical, but as the adequacy of our data regime is such an important issue, I hope that noble Lords will forgive me.

I am going to ask the Minister a question, but it is not for answer today; I think it will require a bit more than that. It takes us back to the battles and debates we have had over the last six years in relation to the manner of our withdrawal from the European Union. When we left the EU, we left in place retained EU law. We got rid of the charter, because it was said that all that mattered and was important was embodied in retained EU law. That was almost certainly right, but the problem that I believe has arisen—it is partly complicated by advice contained in the Government’s human rights memorandum attached to the Bill—arises from the effect of the Retained EU Law (Revocation and Reform) Act. I can hear, almost visibly, the sighs—“Are we back to that again?”—and I am so sorry to be dredging this up.

I have looked at various things—I am particularly grateful for the help I have had from Eleonor Duhs of Bates Wells—and I believe there is a problem we need to address. As data adequacy is so important, I will say a word about the detail. At the moment, I think we proceed on the assumption that the UK GDPR, with its numerous references to the data subject’s rights and freedoms, is adequate. The last Government, when dealing with the matter, passed the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations, which said that all the many references in the UK GDPR to these rights are to be read as referring to

“the Convention rights within the meaning of the Human Rights Act”.

The difficulty that has arisen is in paragraph 47 of the Government’s human rights memorandum:

“Where processing is conducted by a public authority and engages a right under the ECHR, that authority must, in accordance with section 6 of the Human Rights Act 1998, ensure that such processing is not incompatible with a convention right”.


Then comes the important sentence:

“Where processing is conducted by a private body, that processing will not usually engage convention rights”.


The important point is that it is generally understood that, save in specific circumstances, the Human Rights Act applies only to state entities and not to private companies. If and where data is being processed by private entities, as the Bill and the market largely envisage, how are we to be sure that our references in the UK GDPR refer to the human rights convention but not to the charter? Having lost retained EU law, how are data privacy and data protection safeguarded when data is processed by private companies?

I raise this point because it is important that we clarify it. If there is an issue, and I hope the Government will look at this carefully, we will need to amend the Bill to make sure that there can be no doubt that, where data is processed by private companies, the data rights are properly protected as they would have been if we had retained EU law, or if the charter applied. It is a very narrow point but one of fundamental importance as to the Human Rights Act being directed at state actors, by and large, and not private entities. I am sorry to take up a little time on this very general subject, but data protection is so important, and retaining our data adequacy status is, as I have learned over many years, essential to our industry.

We know that, provided we can get our law in order, there is no problem as regards the EU, I hope. We face a much more difficult problem with regard to data dealings with the United States. First, the law is much more complicated and developing at an enormous pace. It is partly federal and partly state. Of course, we have no idea—and I am not going to speculate, because speculation is pointless—what may happen under the new Administration in the United States. One thing we have learned from the EU, particularly the EU AI Act, is that legislating in terms that are hard can produce results that very quickly get out of date. It seems to me that we have to look constructively at finding a way to adapt our legislative framework to what happens in the United States as regards transferability and, more importantly, the protection of our data in respect of the very large American companies. How are we to do this? Do we give Ministers very broad statutory powers? There may, I regret to say, be a case for doing that. It is something that I do not favour. If Ministers are to have such broad statutory powers, how is that power to be made properly accountable to this House?

As the noble Baroness, Lady Kidron, demonstrated, there is no use delaying these decisions until we know what the US regime may be. Maybe the US regime, unlike the EU, will change very rapidly. Bureaucracy has some advantages when you are dealing with it from the outside, but someone who believes in constant change and turmoil is much more difficult to deal with from our legislative point of view. It is a very important aspect of this legislation that we look at how, in the transnational market in data, which is of immense value and importance to us, we protect the British public.

There are loads of other points that one could raise, but I will raise only one, to follow what has just been said. It is of fundamental importance that we examine automated decision-making with the greatest care. Some very good principles have been developed both in the United States, under the current regime, and in Europe. When a decision is made by a machine—that is a rather facile way of describing it; it is made as a result of an algorithmic process—how do we ensure that, first, there is some right to a human intervention and, secondly, and equally importantly, that the person affected understands why the decision has been made? The point that has just been made is very important, because when you get a decision from an individual, you normally have it accompanied by an understanding of the human, plus reasons. This is a very important part of the Bill; it is so important to give confidence about the way forward.

There are many other detailed points, but those are the three principal points I wanted to make. Let us keep it simple, look at the transnational aspects and look at automated decision-making.

17:58
Lord Arbuthnot of Edrom (Con)

My Lords, it is always rather daunting following the noble and learned Lord, Lord Thomas. I think the safest thing to say, which has the added benefit of being true, is that I agreed with him.

I declare my interests as set out in the register, in particular that I am a member of the Horizon Compensation Advisory Board and the chair of the advisory panel of Thales UK, which makes the British passport. As the noble Lord, Lord Knight, said, this Bill is a wonderful opportunity to talk about everything that is not in it and to discuss further measures that could be included. The noble Baroness, Lady Kidron, mentioned the amendment she moved on 24 April to the predecessor Bill, designed to deal with the presumption that computer evidence is reliable, despite the fact that we all know that it is not. We shall need to come to that presumption in Committee.

I supported the amendment from the noble Baroness in Committee earlier this year, although I accept—as I think she does—that simply returning to the position as it was in 1999, before the presumption existed, may not be the best solution. We need some method, for example, of accepting that breathalysers, on the whole, work as they are intended to do, and that emails, on the whole, work as they are intended to do, and we should not have to get a certificate of accuracy from Microsoft before every trial.

The need to find a solution to the problems so brutally exposed by the Post Office scandal is urgent. In the Post Office cases, in essence, the proper burden of proof was reversed and hearsay evidence that was false was accepted as gospel truth. As a result of Horizon and the appalling human behaviour that accompanied it, the lives of hundreds, perhaps thousands, of postmasters were ruined and the UK and Fujitsu are going to have to pay billions in compensation. So this matter is urgent.

The solution may be, as Alistair Kelman has recommended, a set of seven or so detailed statements setting out what has been done to ensure that the computer evidence is reliable. It may be, as a senior and highly respected judge has recommended, a division between complex cases, such as those involving a system such as Horizon, and simple cases, such as those involving breathalysers and emails, with more stringent evidentiary requirements for the complex cases. It may be, as Professor Steven Murdoch has suggested, that documents designed to test the reliability of computer systems should be made available to the other side and be subject to challenge. It may be something else, but this Bill is an opportunity to consider, examine and test those solutions, and another such opportunity may not come along quickly. I repeat: this matter is urgent.

On a different matter, Part 2 of the Bill establishes a regulatory framework for the provision of digital verification services in the UK. We need to be clear that having a clear and verifiable digital identity is a completely different matter from going down the route of identity cards. This is not an identity card Bill. It is an essential method of establishing, if you want or need to have a digital identity, that you are who you say you are and you have the attributes that you say you have. It is a way of establishing relevant facts about yourself without having to produce gas bills. I do not know about other noble Lords, but I find producing gas bills rather tricky now that they are almost all online.

Sometimes the fact you need to establish will be age: to establish that you are allowed to drink or to drive, or that you are still alive, or whatever. Sometimes it will be your address; sometimes it will be your sex. We do not want men going to women’s prisons, nor men who identify as women working in rape crisis centres. Sex is an issue on which it is necessary to have some degree of factual clarity in those circumstances where it matters. The Bill, again, is an opportunity to ensure that this factual clarity exists in the register of births. It will then be for the individual to decide whether to share the information about their sex, age, or whatever.

An organisation called Sex Matters—I am grateful for the briefing—issued a report yesterday pointing out that, at the moment, data verification services are not authoritative, in that they allow people to change their official records to show them as the opposite sex on request. One consequence is that, for example, transgender people risk being flagged up as a synthetic identity risk and excluded, for example, from banking or travel. Another is that illnesses may be misdiagnosed or that medical risks may fail to be identified.

So this Bill is a rare opportunity to put right some things that currently need to be addressed. Those of us speaking today have received a number of helpful briefings from organisations interested in various issues: I have mentioned only a couple. I hope we will take the opportunity given to us by the Bill to take on board several of those proposals.

18:05
Lord Stevenson of Balmacara (Lab)

My Lords, it is a feature of your Lordships’ House that certain topics and Bills within them tend to attract a small and very intense group of persons, who get together to talk a language that is not generally understood by the rest of the world—certainly not by the rest of this House—and get down to business with an enthusiasm and attitude which is very refreshing. I am seeing smiles from the other side of the House. This is not meant to be in any way a single-party point—just a very nice example of the way in which the House can operate.

I have already been struck today, as I am sure have others in the group that I am talking about—who know who they are—by the recognition that we have perhaps been a little narrow in our thinking. A couple of the speeches today have brought a new thought and a new sense of engagement with this particular subject and the others we deal with. We need to be aware of that, and I am very grateful to those noble Lords. In addition, I am grateful to the noble Lord, Lord Knight, for repeating the speeches he has had to make in 2018 and on subsequent dates, and also for the wonderfully grumpy speech from the noble Baroness, Lady Kidron. We have also got to take into account what we got wrong on joining the European market—a debate I certainly look forward to. It is a serious point.

I am also very grateful to my noble friend the Minister for setting out the new Government’s vision for data protection, for her letters—which have been very useful—and for her help in setting up the meeting I had with her officials, which I found very useful indeed. Our Minister has done a really good job in getting the Bill ready so quickly. It is good that some of the more egregious measures included in the previous Bill—particularly the changes on direct marketing during elections and the extensive access to bank account details—have gone. There are indeed some good extras as well.

We have already had some excellent speeches setting out some concerns. I have one major concern about the current Bill and three rather lesser issues which I suspect will need further debate and discussion in Committee. I will cover them quite briefly. My major concern is that, although the Bill has the intention to boost growth and productivity, and also makes a valiant attempt to provide a unified set of rules and regulations on data processing, it may in the process have weakened the protections that we want to see here in the exploitation of personal data. Data, as other noble Lords have said, is of course not just for growth and prosperity. There will be, as we have heard, clear, practical benefits in making data work for the wider social good and for the empowerment of working people. There is huge potential for data to revitalise the public services. Indeed, I liked the point made by the noble Lord, Lord Knight, that data is in some way an asset missing from the balance sheet on many operations, and we need to think carefully about how best we can configure that to make sure that the reality comes to life.

There has been, of course, a huge change. We have moved into the age of AI, but we do not have the Bill in front of us that will deal with that. The GDPR needs a top-to-toe revision so that we can properly regulate data capture, data storage, and how it may be best shared in the public interest. As an example of that, following the Online Safety Act we have a new regulator in Ofcom with the power to regulate technology providers and their algorithmic impacts. The Digital Markets, Competition and Consumers Act has given the Competition and Markets Authority new and innovative powers to regulate commercial interests, which we heard about yesterday at an all-party group. However, this Bill has missed the opportunity to strengthen the role of the ICO so we can provide a third leg capable of regulating the use of data in today’s AI-dominated world. This is a gap that we need to think very carefully about.

I hope my noble friend the Minister will acknowledge that there is a long way to go if this legislation is to earn public confidence and if our data protection regime is to work not just for the tech monopolies but for small businesses, consumers, workers and democracy. We must end the confusion, empower the regulators, and in turn empower Parliament.

There are three specific issues, and I will go through them relatively quickly. The first is on Clauses 67 and 68, already referred to, where the Bill brings in wording from Recital 159 of the GDPR—as we inherited it from the EU. This sets out how the processing of personal data for scientific research purposes should be interpreted. The recital is drafted in extraordinarily broad terms, including

“technological development and demonstration, fundamental research, applied research and privately funded research”.

It specifically mentions that:

“Scientific research purposes should also include studies conducted in the public interest in the area of public health”.

The latest ICO guidance, which contains a couple of references to commercial scientific research, says that such research

“can also include research carried out in commercial settings, and technological development, innovation and demonstration”.

However, we lack a definition, which is rather curious, because a definition of research does exist elsewhere in UK statute. It is needed, for example, in order to fund the research councils, and it forms part of the tax code in order to determine eligibility for research tax reliefs. So, we have a definition somewhere else, but somehow the Bill avoids it and instead goes down a clarification route, trying to bring into the current legislation that which—according to those who drafted it—is already the law, but which is of course so complicated that it cannot be understood. I think the Government’s thinking is to provide researchers with consistency, and they say very firmly that the Bill does not create any new permissions for using or reusing data for research purposes. In my meeting with officials, they were insistent that these clauses are about fine-tuning the data protection framework, making clarifications and small-scale changes and reducing uncertainties.

I agree that it is helpful to have the key provisions—currently buried, as they are, in the recitals—on the face of the Bill, and it may be that the new “reasonableness” test will give researchers greater clarity. Of course, we also retain the requirement that research must be in the public interest. But surely the issue that we need to address is whether the Bill, by incorporating new language and putting in this new “reasonableness” test, will permit changes to how data held by the NHS, including patients’ medical records, could be used and shared. It may be that the broad definition of “scientific research”, which can be “publicly or privately funded” and “commercial or non-commercial”, inadvertently waters down consent protections and removes purpose-limitation safeguards. Without wishing to be too alarmist, we need to be satisfied that these changes will not instigate a seismic change in the rules currently governing NHS data.

It is relevant to note that the Government have separately stated an intention to include in the next NHS 10-year plan significant changes as to how patients’ medical records are held and how NHS data is used. Launching a “national conversation” about the plans, the Secretary of State, my right honourable friend Wes Streeting MP, highlighted a desire to introduce electronic health records called “patient passports” and to work “hand in hand” with the private sector to use data to develop new treatments. He acknowledged that these plans would raise concerns about privacy and about how to get the

“best possible deal for the NHS in return”

for private sector access to NHS data. The details of this are opaque. As currently drafted, the Bill is designed to enable patient passports and sharing of data with private companies, but to my mind it does not address concerns about patient privacy or private sector access to health data. I hope we can explore that further in Committee and be reassured.

My second point concerns the unlicensed use of data created by the media and broader creative industries by developers of large language models—this has already been referred to. UK copyright law is absolutely clear that AI developers must obtain a licence when they are text or data mining—the technique used to train AI models. The media companies have suggested that the UK Government should introduce provisions to ensure that news publishers and others can retain control over their data; that there must be significant penalties for non-compliance; and that AI developers must be transparent about what data their crawlers have “scraped” from websites—a rather unpleasant term, but that is what they say. Why are the Government not doing much more to stop what seems clearly to be theft of intellectual property on a mass scale, and if not in this Bill, what are their plans? At a meeting yesterday of the APPG to which I have already referred, it was clear that the CMA does not believe that it is the right body to enforce IP law. But if it is not, who is, and if there is a gap in regulatory powers, should this Bill not be used to ensure that the situation is ameliorated?

My third and final point is about putting into statute the previous Government’s commitments about regulating AI, as outlined in the rather good Bletchley declaration. Does my noble friend not agree that it would be at least a major statement of intent if the Bill could begin to address

“the protection of human rights, transparency and explainability, fairness, accountability, regulation, safety, appropriate human oversight, ethics, bias mitigation, privacy and data protection”?

These are all points raised in the Bletchley declaration. We will need to address the governance of AI technologies in the very near future. It does not seem wise to delay, even if the detailed approach has yet to be worked through and consulted upon. At the very least, as has been referred to, we should be picking up the points made by the Ada Lovelace Institute about: the inconsistent powers across regulators; the absence of regulators to enforce the principles in areas such as recruitment and employment, or in diffusely regulated areas of public service such as policing; the absence of developer-focused obligations; and the absence and high variability of meaningful recourse mechanisms when things go wrong, as they will.

When my noble friend Lord Knight of Weymouth opened the Second Reading of the last Government’s data protection Bill, he referred to his speech on the Second Reading during the passage of the 2018 Act—so he has been around for a while. He said:

“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data”.—[Official Report, 19/12/23; col. 2164.]

For me, that remains a vision that we need to realise. It concerns me that the Bill will not achieve that.

18:17
Lord Vaux of Harrowden Portrait Lord Vaux of Harrowden (CB)
- View Speech - Hansard - - - Excerpts

My Lords, like others, I think I am experiencing the same sense of déjà vu that has been referred to. As others said, one of the more welcome aspects of this Bill is that it is not the same as its predecessor, which was introduced by the previous Government and which was mercifully a casualty of the election. Many of us lost far too many hours of our lives on that Bill, which was, frankly, a bad one—others have called it egregious.

So, I am pleased that this Government have clearly taken account of those debates—perhaps some of those hours were not wasted after all—and have produced a slightly slimmed-down version. That is, in part, because some of the old Bill has been removed from this one, but I am afraid it is expected to reappear; I hate to disappoint the noble Lords, Lord Knight and Lord Stevenson, but we are going to see those DWP bank account access clauses in a separate Bill. However, at least it will be a stand-alone Bill rather than tucked away in the background of a two-inch-thick data Bill.

I will start with a general concern, mentioned by the noble and learned Lord, Lord Thomas: that of EU data adequacy, which a number of us raised in the context of the last Bill. The helpful letter from the noble Lord, Lord Ricketts, the chair of the European Affairs Committee, dated 22 October to the Secretary of State for Science, Innovation and Technology, sets out very clearly the

“significant extra costs and administrative burdens on businesses and public-sector organisations which share data between the UK and the EU”

that would be incurred if we were to lose that data adequacy ruling, which is due to expire in June 2025—so very soon. I do not think I have seen a response from the Government to that letter, so I would be very interested to hear what the Minister has to say on that. Although this Bill is clearly less contentious than its predecessor and the risk is therefore lower, it is not zero risk, and we need to be careful to ensure that there is nothing in the Bill that significantly risks the loss of that ruling.

To that end, I would be grateful if the Minister could explain what assessment the Government have made of the risk of losing the EU data adequacy ruling and, perhaps more importantly, tell us the extent to which the Bill has been discussed with our European counterparts to ensure that there is nothing in it that is concerning them. Clearly, we do not need to and should not follow the letter of the EU data protection rules, but we should at least work with our EU counterparts to ensure that we are not risking the adequacy ruling.

Part 1 deals with so-called smart data. I welcome it but note that it consists mainly of a series of powers to regulate rather than any firm steps, which is a little disappointing. The only current live example of smart data that we have is open banking, which a number of noble Lords have referred to—maybe, one day, we will see a pensions dashboard; who knows? However, open banking has been rather slower to take off than had been hoped. It has been six or seven years since it was first mooted. I urge the Government to carry out a review of why that is, before they start to make the regulations that the Bill proposes around smart data. There are lessons to be learned from open banking, to ensure that what we do with smart data in the future is more successful. The claim that smart data will boost the UK economy by £10 billion over the next 10 years looks a little optimistic, especially as the impact assessment from the Department for Business and Trade accompanying the Bill fails to monetise any costs or benefits of the smart data elements. I think that the smart data concept is good but hope that we get it right.

Part 2 of the Bill deals with the digital verification services. Again, on the whole, I am supportive of this. The Bill should improve security of and trust in digital verification. As the noble Lord, Lord Arbuthnot, said, it is not about digital ID cards. However, a number of us raised a concern last time round. There is a danger that this could become a slippery slope towards a situation where people may find themselves compelled to use digital verification services and therefore excluded from accessing services or products if they are not able or willing to use digital verification. The “not willing” part of it is important. Some people are wary of putting detailed identity information online. I am increasingly wary, particularly as a resident of Dumfries and Galloway, where all medical records from NHS Dumfries & Galloway were recently hacked, held for ransom and probably published. Therefore, I have some sympathy with those who do not fully trust official systems. I am curious to hear what the Minister has to say in response to the comments from the noble Lord, Lord Markham, about increased cybersecurity in the public sector, as that is a good example of where it has gone wrong.

I know that there is no intention on the part of the Government at this time to make the use of DVS compulsory, but it is quite easy to see other parties, such as estate agents, financial institutions and, as one noble Lord mentioned, employers, making it a requirement. While supportive, I think we need some protections to ensure that people are not excluded from services by that. I would be interested to hear the Minister’s thoughts.

On Part 5, the House of Lords Select Committee on the Fraud Act 2006 and Digital Fraud heard a number of times that banks and other financial institutions were unwilling to share data for fraud prevention purposes because they felt constrained by data protection rules. I suspect that they were wrong but am very pleased that data processing for the purposes of detecting, investigating or preventing crime is to be expressly included as a legitimate interest. I hope that the Information Commissioner will ensure that it is widely pointed out and that we will start to see greater co-operation between payment providers and the tech and telecoms companies where the vast bulk of frauds originate.

However, on the subject of the legitimate interest changes, I am concerned that the Secretary of State will be able to make changes to matters considered to be legitimate interests by regulation. That is a significant power in terms of data processing and potentially a retrograde step. It could also raise concerns with respect to the EU data adequacy points that I raised earlier. While the EU might be happy with what is currently proposed, the ability to change key aspects could raise alarm bells.

Other noble Lords have talked about automated decision-making, where I am also concerned about the weakening of rights. Currently, automated decision-making is broadly prohibited, with specific exceptions. This Bill would permit it in a wider set of circumstances, with fewer safeguards. In her introduction, the Minister seemed to indicate that the same safeguards would apply. As I understand it, that is the case only where special category data is used. I would be grateful if the Minister could explain whether I have got that wrong. It seems to me to increase the risk of unfair or opaque decisions. The noble Lord, Lord Arbuthnot, talked about the Horizon/Post Office scandal. That should certainly give us pause for thought. The computer does not always get it right. There are myriad examples of AI inventing false information and giving fake answers. It is called “hallucination”. The right to challenge solely automated decisions should be sacrosanct. Why have the Government decided to weaken those safeguards?

Finally, I am pleased to get on to a point that no one else has raised so far, which is an achievement. I note with relief that the abolition of the Biometrics and Surveillance Camera Commissioner has been removed. However, issues remain in these areas. In particular, the previous commissioner has described a lack of an overarching accountability framework around surveillance camera and biometrics usage. Can the Minister explain what the Government’s plans are for the regulation of surveillance camera and biometric use, especially facial recognition and especially as the use of AI expands into that area?

In summary, it is a much better Bill, but there is a lot of work to do.

18:26
Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

My Lords, it is a privilege to follow the noble Lord, Lord Vaux. I agree with him completely that this is a better Bill. It is a real tribute to the Minister, and I thank her for how she introduced it.

I start with some good news. This Bill plugs a long-standing gap in our data provisions very handsomely—in providing data for researchers. It has been a real problem for civic society that we have had no reach into the affairs and behaviours of our tech companies, no true understanding of what their services do, how their algorithms are designed and how they behave, or even what kind of audiences they reach. The most basic questions cannot be answered because we do not have any legal reach or insight into what they are up to. That has been a long-standing inhibitor of accountability and, frankly, of good policy-making.

There were a number of attempts to bring this provision into legislation in the Online Safety Act and the previous data Bill. I am really pleased to see in Clause 123 the provisions on information for research into online safety matters, which meet all the requests of those who were pressing for them. I pay tribute to the Minister for ensuring that this is in the Bill, as promised. I pay tribute also to my noble friend Lord Camrose, who got these provisions into the previous Bill. Unfortunately, that Bill was pulled at the last minute, but I offer some respect to him too on that point.

This is such an important provision. It should not be overshadowed by the other important contents of this Bill. Can the Minister use the opportunity of the passing of this Bill to flesh out some of the provisions in this important clause? I would like to find a forum to understand how researchers can apply for data access; how the privacy protection measures will be applied; how the government consultation will be put together; and how the grounds for an application might work. There are opportunities for those who resist transparency to use any of these measures to throw a spanner in the works. We owe it to ourselves, having seen these provisions put into the Bill, to ensure that they work as well as intended.

That gap having been plugged, I would like to address two others that are still standing. The first is the importance of preventing AI-trained algorithms from creating CSAM, which the noble Baroness, Lady Kidron, spoke movingly about. I pull that out of the many things that she mentioned because it is such a graphic example of how we really must draw a red line under some of the most egregious potential behaviours of AI. I am fully aware that we do not, as a country, want to throw obstacles in the way of progress and the very many good things that AI might bring us. There is a tension between the European and American approaches, on which we seek a position of balance. But if we cannot stop AI from creating images and behaviours around CSAM, my goodness, what can we stop? Therefore, I ask the Minister to answer the question: why can we not bring such a provision on to the face of the Bill? I will strongly support any efforts to do so.

Lastly, following the comments from the noble and learned Lord, Lord Thomas, on data processing, I flag the very important issue of transfers of data to areas where there is no clear adequacy and, in fact, no legal system for implementing the rule of law necessary to stand up standard contractual clauses. Your Lordships will be aware that in countries like China and Russia the rule of law is very lightly applied to matters of data. Protecting British citizens’ data when it goes to such countries should be the responsibility of any Government, but that is a very difficult thing to provide for. Huge amounts of data are now travelling across borders to countries where we really do not have any legal reach. The explosion of BYD cars in the UK is an example of the sheer quantity of data that is going overseas. Genomic information processed on Chinese genomic machines is an example of where some of that data is now more sensitive. It is a big gap in our data protection laws that we do not have a mechanism for fully accounting for the legal handling of that data. I brought in an amendment to the previous Bill, Amendment 111, which I urge the Minister to look at if she would like to understand this issue more carefully. I give fair warning that I will seek to move a version of that amendment for this Bill.

18:32
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- View Speech - Hansard - - - Excerpts

I, too, thank the Minister for her introduction to this welcome Bill. I feel that most noble Lords have an encyclopaedic knowledge of this subject, having been around the course not just once but several times. As a newcomer to this Bill, I am aware that I have plenty to learn from their experience. I would like to thank the Ada Lovelace Institute, the Public Law Project, Connected by Data and the Open Data Institute, among others, which have helped me get to grips with this complicated Bill.

Data is the oil of the 21st century. It is the commodity which drives our great tech companies and the material on which the large language models of AI are trained. We are seeing an exponential growth in the training and deployment of AI models. As many noble Lords have said, it has never been more important than now to protect personal data from being ruthlessly exploited by these companies, often without the approval of either the data owners or the creators. It is also important that, as we roll out algorithmic use of data, we ensure adequate protections for people’s data. I, too, hope this Bill will soon be followed by another regulating the development of AI.

I would like to draw noble Lords’ attention to a few areas of the Bill which cause me concern. During the debates over the last data protection Bill, I know there were worries over the weakening of data subjects’ protection and the loosening of processing of their data. The Government must be praised for losing many of these clauses, but I am concerned, like some other noble Lords, to ensure adequate safeguards for the new “recognised legitimate interests” power given to data processors. I support the Government’s growth agenda and understand that this power will create less friction for companies when using data for their businesses, but I hope that we will have time later in the passage of the Bill to scrutinise the exemption from the three tests for processing data, particularly the balancing test, which are so important in forcing companies to consider the data rights of individuals. This is especially so when safeguarding children and vulnerable people. The test must not be dropped at the cost of the rights of people whose data is being used.

This concern is reinforced by the ICO stating in its guidance that this test is valuable in ensuring companies do not use data in a way that data subjects would not reasonably expect it to be used. It would be useful in the Explanatory Notes to the Bill to state explicitly that when a data processor uses “recognised legitimate interests”, their assessment includes the consideration of proportionality of the processing activity. Does the Minister agree with this suggestion?

The list of four areas for this exemption has been carefully thought through, and I am glad that the category of democratic engagement has been removed. However, the clause gives future Ministers a Henry VIII power to extend the list. I am worried about this, and I have heard some noble Lords say that they are as well; the clause’s inclusion in the previous Bill concerned other noble Lords too. It could allow future Ministers to succumb to commercial interests and add new categories, which might be to the cost of data subjects. The Minister, when debating this power in the previous data Bill, reminded the House that the Delegated Powers and Regulatory Reform Committee said of these changes:

“The grounds for lawful processing of personal data go to the heart of the data processing legislation and therefore in our view should not be capable of being changed by subordinate legislation”.

The Constitution Committee’s report called for the Secretary of State’s powers in this area to be subject to primary and not secondary legislation. Why do these concerns not apply to Clause 70 in this Bill?

I welcome the Government responding to the scientific community’s demand that they should be able to reuse data for scientific, historic or statistical research. There will be many occasions when data was collected for the study of a specific disease and the researchers want to reuse it years later for further study, but they have been restricted by the narrow distinctions between the original and the new purpose. The Government have incorporated recitals from the original GDPR in the Bill, but the changes in Clause 67 must be read against the developments taking place in AI and the way in which it is being deployed.

I understand that the Government have gone to great efforts to set out a clear definition of scientific research in this clause. One criterion is the

“processing for the purposes of technological development or demonstration … so far as those activities can reasonably be described as scientific”,

and another is the publication of scientific papers from the study. But my fear is that AI companies, in their urgent need to scrape datasets for training large language models, will go beyond the policy intention in this clause. They might posit that their endeavours are scientific, and may even be supported by academic papers, but when this is combined with the inclusion of commercial activities in the Bill, it opens the way for data to be reused to create AI data-driven products under the claim that they are for scientific research. The line between product development and scientific research is blurred because of how little is understood about these emerging technologies. Maybe it would help if the Bill set out what areas of commercial activity should not be considered scientific research. Can the Minister share with the House how the clause will stop attempts by AI developers to claim they are doing scientific research when they are reusing data to increase model efficiency and capabilities, or studying their risks? They might even be producing scientific papers in the process.

I have attended a forum with scientists and policymakers from tech companies that use training data for AI, who admitted that it is sometimes difficult to define the meaning of scientific research in this context. This concern is compounded by Clause 77, which provides an exemption from Article 13 of the UK GDPR so that researchers and archivists need not provide additional information to a data subject when reusing their data for different purposes, if obtaining the required information involves disproportionate effort. I understand these provisions are drawn to help the reuse of medical data, but they could also be used by AI developers to say that contacting people for the reuse of datasets from an already trained AI model requires disproportionate effort. I understand there are caveats around this exemption. However, in an era when AI companies are scraping millions of pieces of data to train their models, noble Lords need to bear in mind that it is often difficult for them to get permission from the data subjects before reusing the information for AI purposes.

I am impressed by the safeguards for the exemption for medical research set out in Clause 85. The clause says that medical research should be supervised by a research ethics committee to assess the ethical reuse of the data. Maybe the Government should think about using some kind of independent research committee with standards set by UKRI before commercial researchers are allowed to reuse data.

Like many other noble Lords, I am concerned about the changes to Article 22 of the UK GDPR put forward in Clause 80. I quite understand why the Government want to expand solely automated decision-making in order for decisions to be made quickly and efficiently. However, these changes need to be carefully scrutinised. The clause removes the burden on the data controller to overcome tests before implementing ADM, outside of the use of sensitive information. The new position requires the data subject to proactively ask if they would like a human to be involved in the decision made about them. Surely the original Article 22 was correct in making the processor think hard before making a decision to use ADM, rather than putting the burden on the data subject. That must be the right way round.

There are other examples, which do not include sensitive data, where ADM decisions have been problematic. Noble Lords will know that, during Covid, algorithms were used to predict A-level results which, in many cases, were flawed. None of that information would have been classified as sensitive, yet the decisions made were wrong in too many cases.

Once again, I am concerned about the Henry VIII powers which have been granted to the Secretary of State in new Article 22D(1) and (2). This clause is already extending the use of ADM, but it gives future Secretaries of State the power to change by regulation the definition of “meaningful human involvement”. This potentially allows for an expansion of the use of ADM; they could water down the level of human involvement needed to be considered meaningful.

Likewise, I am worried by the potential for regulations to be used to change the definition of a decision having a “significant adverse effect” on a data subject. The risk is that this could be used to exclude such decisions from the relevant protection, even though they could nevertheless still have a significant harmful effect on the individual. An example would be if the Secretary of State decided to exclude from the scope of a “significant decision” interim, rather than final, decisions. This could result in the exclusion of a decision taken entirely on the basis of a machine learning predictive tool, without human involvement, to suspend somebody’s universal credit pending an investigation and final decision on whether fraud had actually been committed. Surely some of the anxiety about this potential extension of ADM would be assuaged by increased transparency around how it is used. The Bill is a chance for the Government to give greater transparency to how ADM systems process our information. The result would be to greatly increase public trust.

The Algorithmic Transparency Recording Standard delivers greater understanding about the nature of tools being used in the public sector. However, of the 55 ADM tools in operation, only nine have so far been the subject of reports under the ATRS. In contrast, the Public Law Project’s Tracking Automated Government register has identified at least 55 additional tools, with many others still to be uncovered. I suggest that the Government make it mandatory for public bodies to publish information about the ADM systems that they are using on the ATRS hub.

Just as importantly, this is a chance for people to obtain personal information about how an automated decision is made. The result would be that, if somebody is subject to a decision made or supported by AI or an algorithmic tool, they should be notified at the time of the decision and provided with a personalised explanation of how and why it was reached.

Finally, I will look at the new digital verification services trust framework being set up in Part 2. The Government must be praised for setting up digital IDs, which will be so useful in the online world. My life, and I am sure that of many others, is plagued by the vagaries of getting access to the various websites we need to run our lives, and I include the secondary security on our phones, which so often does not work. The effectiveness of this ID will depend on the trust framework that is created and on who is involved in building it.

At the moment, in Clause 28, the Secretary of State must consult the Information Commissioner and such other persons as the Secretary of State sees appropriate. It seems to me that the DVS will be useful only if it can be used across national boundaries. Interoperability must be crucial in a digital world without frontiers. I suggest that an international standards body should be included in the Bill. The most obvious would be W3C, the World Wide Web Consortium, which is the standards body for web technology. It was founded by Sir Tim Berners-Lee and is already responsible for the development of a range of web standards, from HTML to CSS. More than that, it is used in the beta version of the UK digital identity and attributes trust framework and has played a role in both the EU and the Australian digital identity services frameworks. I know that the Government want the Secretary of State to have flexibility in drawing up this framework, but the inclusion of an international standards body in the Bill would ensure that the Minister has them in the forefront of their mind when drawing up this much-needed framework.

The Bill is a wonderful opportunity for our country to build public trust in data-driven businesses and their development. It is a huge improvement on its predecessor; it goes a long way to ensure that the law has protections for data subjects and sets out how companies can lawfully use and reuse data. It is just as crucial in the era of AI that, during the passage of the Bill through the House, we do not leave the door open for personal data to be ruthlessly exploited by the big tech companies. We would all be damaged if that was allowed to happen.

18:45
Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, when we considered the Data Protection and Digital Information Bill earlier in the year, I confessed to feeling somewhat out of my depth and at the edge of my comfort zone. It is with some trepidation that I enter this debate, in particular following the brilliant speeches by my noble friends Lord Knight and Lord Stevenson, and the noble Baroness, Lady Kidron, who lead in this field and understand the subject much better than I do.

Like others, I am delighted by the changes that the Bill brings forward, not least because, when we were in opposition, the noble Baroness, Lady Jones, and I attacked the DPDI Bill for being incoherent and lacking vision. It was simply a bundle of proposals that many people did not want. Although this data Bill is still a considerable piece of legislation, it is much narrower and the better for it. Firms and data subjects alike will more easily understand it. I also hope it keeps us much closer to the EU’s rules, as we approach its crucial review of the UK’s data adequacy decision.

The Minister correctly set this new Bill in the context of driving economic growth. The Bill rightly focuses on harnessing data for economic growth, supporting modern digital government, and improving or seeking to improve the lives of our citizens. The last Administration sought to dramatically water down the rights of data subjects, particularly around data subject access requests and high-risk processing. This Bill has dropped many of those proposals, leaving in place the requirement to have UK-based representatives and maintaining the duties of data protection officers, which include carrying out impact assessments. This should reassure individuals that their data will continue to be kept safe.

The last Bill contained some egregious measures—others have referred to the DWP measures that would have required banks and financial organisations to provide data about accounts linked to benefits claimants, including the state pension. That, thankfully, has gone, with Labour’s promise to introduce a separate Bill tackling fraud and error.

Gone too are the plans to give the Secretary of State a veto on codes of practice prepared by the Information Commissioner, which called into question the commissioner’s independence. Similarly, the Government have taken out plans to abolish the Biometrics and Surveillance Camera Commissioner, as the noble Lord, Lord Vaux, rightly mentioned. A number of colleagues expressed concern about the implications of this, when the use of such data and equipment was becoming more widespread, rather than less. Again, this Bill does not include these measures, leaving those officers and the requirements on them very much in place.

The last Government sought to change data rules on political parties, allowing them to use certain datasets for campaigning purposes. Former Ministers could not properly explain what this would mean in practice or where the request had come from, so removing these measures from the Bill is, again, welcome.

When introduced, the DPDI Bill did not contain measures on coroners’ access to data, despite that having been promised to the noble Baroness, Lady Kidron. As Opposition Front-Benchers, I and the noble Baroness, Lady Jones, made commitments to her that we would take this forward. This Bill delivers on that manifesto commitment. Similarly, we worked collaboratively with the noble Lord, Lord Bethell, during the passage of what became the Online Safety Act to promote access to data for researchers. The inclusion of such measures in this Bill again shows that the Labour Party is following through on its commitments, including those given at the time of the election.

Like other noble Lords, I have received a number of briefing papers, some of which raise highly interesting questions and points. The Law Society says it remains concerned about the UK’s ability to ensure that we meet EU data adequacy standards. I recall that we were concerned about this in opposition. It suggests that Clause 84 deregulates the transfer of data across borders and international organisations, so can the Minister reassure the House and me that this will not put the UK at risk during the 2025 assessment of data adequacy? In a similar vein, can she assure noble Lords that the newly formed information commission will maintain sufficient independence from government and entrench our EU-UK data adequacy in that respect?

Another issue raised during debates on the DPDI Bill was the need to ensure that there is meaningful human involvement in decisions where automated decision-making processes are in play. Other noble Lords have raised this again. Can we have an assurance that the ethical principles of safety, transparency, fairness and contestability will be properly in place?

One other area which I know the Minister is interested in and excited by is the potential of the legislation in relation to the national underground asset register. We probed this in opposition; are we satisfied in government that the move to an NUAR over the next year or so will take into account the existence of the private sector company LinesearchbeforeUdig and ensure that there is a smooth transition to a national network? Given the current impact on critical national infrastructure of £2.4 billion-worth of accidents in our various grids, we must make sure that we harvest the benefit of the new technology to protect that critical part of infrastructure and our national economy.

Finally, I raise the concerns expressed by the News Media Association about the unlicensed use of data created by the media and broader creative industries. It argues that this represents intellectual property theft by AI developers. The consequences of AI firms being free to scrape the web without remunerating creators can be profound. Its long-term fear is that this will reduce investment in trusted journalism. If less and less human-authored intellectual property is produced, tech developers may find ultimately that the high-quality data essential for generative AI is lacking. I realise that this Bill does not cover AI, but it is important that, if we are to drive growth and innovation using AI, we consider developing a dynamic licensing market by making the UK’s copyright regime enforceable. Can the Minister offer some insight into government thinking on that point?

This is a much more narrowly focused Bill, large though it is, and benefits from it. I think the hours we spent earlier in the year interrogating the DPDI Bill were well spent, because they paved the way for this more streamlined and pragmatic approach, which we welcome.

18:54
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- View Speech - Hansard - - - Excerpts

My Lords, it is a pleasure to take part in this Second Reading debate. I thank the Minister for the way she introduced the Bill. I declare my interests as set out in the register, particularly those in technology and financial services: as an adviser to Ecospend, an open banking technology, and to Socially Recruited, an AI business.

It is a pleasure to take part in a Second Reading for the third time on one Bill with three different names. We should all feel grateful that the only word to survive in all those titles is “data”, which must be a good thing. It is also a pleasure to follow so many excellent speeches, to which I find myself simply saying “Yes, agree, agree”, in particular the excellent speech of the noble Baroness, Lady Kidron, who pointed to some of the most extreme and urgent issues that we must address in data. I also support the concept from the noble Lord, Lord Knight, of the Government laying out their overall approach to all new technologies and issues around data, so that we have a road map, suite, menu or whatever of everything they intend in the coming months and years, giving us clarity and enabling consistency across all these Bills and statutory measures, which cover so much of our economy and society. As this is the third Second Reading for one Bill, I will cover three issues: smart data, automated decisions and the use of data in training AI.

On smart data, perhaps it would be better for the public if we called it “smart uses of data”. As has been mentioned, open banking is currently the only smart use of data. Perhaps one of the reasons why it has not been mainstreamed or got to a critical level in our society is the brand and rollout of open banking. We should all be rightly proud of the concept’s success—made in the UK and replicated in over 60 jurisdictions around the world, many of which have gone much further than us in a much shorter time. It demonstrates that we know how to do right-sized regulation and shows that we know how to regulate for innovation, consumer protection and citizens’ rights. Yet we are doing so little of this legislation and regulation.

It is one thing to pass that willed regulatory intervention; perhaps the Government and other entities did not do anywhere near enough promotion of the opportunities and possibilities of open banking. If you polled people on main street about open banking, I imagine they would say “I have no idea what you’re talking about; they’ve closed all the branches”. This goes to the heart of the point raised by the noble Lord, Lord Knight. Without a coherent narrative, explained, communicated and connected across our society, it is hardly surprising that we have not only this level of take-up of open banking but this level of connection to all the opportunities around these new technologies.

The opportunities are immense, as set out in this Bill. The extension of smart data into areas such as energy provision could be truly transformational for citizens and bill payers. What is the Government’s plan to communicate these opportunities on the passage of this Bill to make all bill payers, citizens and consumers aware of the opportunities that these smart data, smart energy and smart savings provisions may bring to them?

Secondly, as has rightly and understandably been mentioned by noble Lords, the Bill proposes a significant and material change to automated decision-making. It could be argued that one of the impacts of gen AI has been to cause a tidal wave of automated decisions, not least in recruitment and employment. Somebody may find themselves on the wrong end of a shortlisting decision for a role: an automated decision where the individual did not even know that AI was in the mix. I suggest that that makes as clear a case as any for the need to label all goods and products in which AI is involved.

The Bill seeks to take Article 22 and turn it into what we see in Clause 80. Would the Minister not agree that Clause 80 is largely saying, “It’s over to you, pal”? How can somebody effectively assert their right if they do not even know that AI and automated decision-making were in the mix at the time? Would the Minister not agree that, at the very least, there must be a right for an individual to have a personalised decision to understand what was at play, with some potential for redress if so sought?

Thirdly, on the use of data in training AI, where is the Bill on this most critical point? Our creatives add billions to the UK economy and they enrich our society. They lift our souls, making music where otherwise there may be silence, filling in the blank page with words that change our lives and pictures that elevate the human condition. Yet right now, we allow their works to be purloined without consent, respect or remuneration. What does the Bill do for our creative community, a section of the economy growing at twice the rate of the rest of it?

More broadly, why is the Bill silent when it comes to artificial intelligence, impacting as it does so many elements of our economy, society and individuals’ lives right now? If we are not doing AI in this Bill, when will we be? What are we waiting to know that we do not already know to make a decent effort at AI legislation and regulation?

The danger is that, with so much running through the Bill, if we do not engender a connection with the public then there will be no trust. No matter how much potential there is in these rich datasets and potential models to transform our health, education, mobility and so much more, none of it will come to anything if there is not public trust. I guess we should not be so surprised that, while we all enjoy “Wolf Hall: The Mirror and the Light” every Sunday evening, there is more than a degree of Henry VIII spattered through this Bill as a whole.

I move to some final questions. What is the Government’s position when it comes to the reversal of the burden of proof in computer evidence? We may need to modernise the pre-1999 situation, but it should certainly be the case that that evidence is put to proof. We cannot continue with the situation so shamefully and shockingly exposed in the Horizon scandal, as rightly set out by my noble friend Lord Arbuthnot, who has done more than any in that area.

Similarly, on the Bill in its entirety, has the “I” of GenAI been passed over in the Bill as currently constructed? So many of the clauses and so much of the wording were put together before the arrival of GenAI. Is there not a sense that there is a need for renewal throughout the Bill, with so many clauses at least creaking as a consequence of the arrival of GenAI?

Will the Government consider updating the Computer Misuse Act, legislation which came into being before we had any of this modern AI or modern computing? Will they at least look at a statutory defence for our cyber community, who do so much to keep us all safe but, for want of a statutory defence, have to do so much of that with at least one hand tied behind their back?

Does the Minister believe that this Bill presents the opportunity to move forward with data literacy? This will be required if citizens are to assert their data rights and be able to say of their data: “It is my data and I decide to whom it goes, for what and for what remuneration”.

Finally, what is the Government’s approach to data connected to AI legislation, and when may we see at least a White Paper in that respect?

Data may be, as the Minister said, the DNA of our time, or, as other noble Lords have said, the oil; perhaps more pertinently it may be the plastic of our time, for all that that entails. The critical point is this: it offers so much potential, but not inevitability, to drive economic, social and psychological growth. We need to enable and empower all our citizens to be able to say, full-throatedly, “Our data; our decisions; our human-led digital futures”.

19:05
Earl of Erroll Portrait The Earl of Erroll (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I want to get on to the digital verification service. First, I declare that I am very interested in digital twins; there are huge advantages in modelling—for instance, the underground and all the various things that can choke traffic. I went to a very interesting event at Connected Places Catapult, where they are modelling all the influences on traffic and flows, et cetera, to try to work out how you can alleviate congestion and get emergency services through when everything is choked up. There is huge advantage in being able to model that, and for that we need data sharing and all the other things.

The other thing I got very interested and involved in, with FIDO and Kaimai, is causal AI. As people say, we need to know how it got there: what sources was it relying on when it reached certain things that it put in the reports or decisions made? It is a very important aspect, because the “I” in AI is not really the right word to use. A computer is not innately intelligent. It is like a child; what you put into it and what it learns from that could well be not what you expected it to learn at all. We have to be very careful of believing too much in it, putting too much faith in it and thinking that it will run the future beautifully.

Here is the bit that I am more interested in. I was talking to my noble friend Lady Kidron just before the debate, and she pointed out something to me because of my previous involvement in chairing the British Standard PAS 1296 on age verification, which we did around the Digital Economy Act when we were worried about all the implications and the pornography issues. The trouble now is that the Government seem to think that the only age verification that matters is checking that someone is over 18, so that they can purchase alcohol and knives and view inappropriate pornography, which should not be there for youngsters. But when we wrote it, we were very careful to make sure that there was no age specified. For educational purposes, there is material that you want to go to particular age cohorts and do not want for children at other ages because it is wrong for their stage of development and knowledge. Also, you need to be able to check that older people are not trying to get into children’s social groups; they must be excludable from them. Age verification, whenever it is referred to, should work in any direction and at any age you want it. It should not be so inflexible.

I was sent a briefing by the Association of Document Validation Professionals and the Age Verification Providers Association. I was very much there when all that started off, when I was chairman of EURIM, which became the Digital Policy Alliance. They represent some 50 attribute and identity providers, and they warmly welcome the Bill and the priority that the new Government are giving to returning it to Parliament. I will highlight the sections of the Bill dealing with digital verification services that they feel would merit further discussion during later stages.

In Clause 27, “Introductory”, ideally there would be a clear purpose statement that the Bill makes certified digital ID legally valid as a proof of identity. This has always been a problem. To progress in the digital world, we will need to ensure that digital verification of ID is given equal standing with physical passports and driving licences. These documents are technically only tokens to enable you to cross borders or drive a vehicle, but they are frequently used as proof of ID. This can be done digitally, and the benefit of digital ID is that it is much harder to forge and therefore much more reliable. For some reason, we have always had a fear of that in the past.

In Clause 29, on “supplementary codes”, they are very worried that it could add time and cost to developing these codes if the processes are owned by the Office for Digital Identities and Attributes—OfDIA. There should be a stronger obligation to co-create with the industry, both in preparing the initial rules and in any revisions. The drafting is, apparently, currently ambiguous about any requirements for consultation. I know that that has been a problem in the past. There will be specialist requirements particular to specific sectors, and the OfDIA will not necessarily have the required expertise in-house. There are already organisations in place to do things around each of these codes.

In Clause 33, on registration on the digital verification services register, the changes to the previous Bill around registration processes are welcome and, most notably, the Government have recognised in the Bill the need for national security checks. The problem is that, short of judicial review—which is both time consuming and very expensive—there is no independent appeals mechanism if the Secretary of State refuses to register a DVS or removes it from the register. Most would not be able to survive long enough to bring a case to a conclusion, so we need to think of other remedies, such as some form of appeals tribunal.

In Clause 39, on the fees for registration et cetera, the fees are a new tax on the industry and may go beyond raising sufficient funds for the costs of administering the scheme. They welcome fees now being subject to parliamentary scrutiny, but would like to see a statutory limit on raising more than is required to fund DVS governance. There are figures on it which I could give you, but I will not bore you with them right now.

In Clause 50, on trust marks for use by registered persons, there may be a benefit from more direct linking of the requirements relating to marks of conformity to the Trade Marks Act.

In Clause 51, on the powers of a Secretary of State to require information, this wide-ranging power to demand information may inherently override the Data Protection Act. It extinguishes any obligation of confidentiality owed by a conformity assessment body to its clients, such as the contents of an audit report. The net effect could be to open up audit reports to freedom of information requests, because the exemption from FoI would rest on their being confidential, which the Bill appears to override; and, as structured, the Bill could also mean that the Secretary of State can override a court order imposing confidentiality. I do not think we should allow that.

Clause 52 is about arrangements for third parties to exercise functions. In its current form, the Office for Digital Identities and Attributes is an unusual regulator. It is not independent from the Government and does not share the features of other regulators. It may therefore not be able to participate in the Digital Regulation Cooperation Forum, for example, based on the powers relied upon by its members to collaborate with other regulators.

The OfDIA may not be within the scope of the regulatory duty on most regulators to promote growth. It is unclear whether the new regulatory innovation office will have jurisdiction over the OfDIA. It would be helpful to explore whether a more conventional status as an independent regulator would be preferable.

I think that is enough complication for the moment.

19:13
Lord Davies of Brixton Portrait Lord Davies of Brixton (Lab)
- View Speech - Hansard - - - Excerpts

I welcome the Bill and thank my noble friend Lady Jones of Whitchurch for her clear introduction. It represents a significant improvement on the Data Protection and Digital Information Bill that we had such fun discussing last year under the previous Government. I thank the noble Viscount, Lord Camrose, for his handling of the Bill at that stage and look forward to continuing these discussions.

However, there are some concerns on which it would be good to have some reassurance from the Government, so I welcome the opportunity to discuss potential improvements during the Bill’s passage through the House. It is also worth bearing in mind the remarks of the noble Lord, Lord Holmes of Richmond, that this is a fast-moving field and there is a continual danger of fighting the last war. I think that is certainly the case in relation to AI. So, time in Committee will have to be spent considering whether there is more that needs to be done because of the way the world has developed.

I am pleased that the Bill no longer covers information for social security purposes. I am not so pleased that it is going to reappear through the separate fraud, error and debt Bill. That is, of course, a discussion for another day; we have not seen it yet. My Government announced it two months ago and we have not yet seen it, so fingers crossed they are having second thoughts.

My prime concern with the Bill, and where I want to ensure that there are adequate safeguards, is individuals’ health data and what the provisions in the Bill mean for patients and for the public. It is notable that one of the key stated purposes of the Bill is to

“build an NHS fit for the future”,

which is of course one of the Government’s five missions.

My noble friend Lord Stevenson of Balmacara, who is not in his place, set out the issues very clearly. Nevertheless, I will repeat them, because I think that the point is so important. We have the problem that data regulation can slow down the pace of data sharing, increase risk aversion and make research and innovation more difficult. That is all true—it is true of all data, but particularly of health data. However, patients and the public rightly expect high standards for data protection, particularly when it comes to their health data, and I am worried that the effects of the Bill are not as strong as might be wished. This will need close examination during its passage through Committee. To get this wrong would damage public trust, negatively impact patient care, complicate the running of the health service and have a harmful effect on academic research and our life sciences industry. We must do our best to ensure that any concerns are misplaced—I hope that I am wrong.

Under current data protection laws there are transparency obligations, which mean that information explaining the use of their data must be provided to the data subject. Reusing data for a different purpose is currently possible, but only in limited circumstances—for example, by the UK Health Security Agency. The main point of concern with the Bill, however, is Clause 77, which, in the words of the BMA,

“will water down the transparency of information to patients”.

I suggest that we have to take most seriously the concerns of the BMA on this point, which I am highlighting, but also on the other points it has made. What we have is a situation where data collected for one purpose can be reused for scientific research. In those circumstances, there is not necessarily a requirement to tell the data subjects about it. The definition of “scientific research” is very wide. It can be commercial or non-commercial. It can be funded publicly or privately. It also covers technological development, which broadens the idea of scientific research.

Clearly, this is thought to be a good thing. It will remove barriers to valuable health research—timely availability of data is important when undertaking research—and it is always possible that, in the course of the research, you will identify things that were not in the original proposal. All that is right, but there is a risk of data being reused for activities that data subjects might not have supported, have no control over and do not even know are happening. This feels like it contradicts the “no surprises” Caldicott principle. It is unclear to me at this stage who exactly will have oversight of all these data reuses, to check that they are ethical and that the right standards are being applied.

The consequence is a real risk of the loss of patient and public trust in data use and sharing within the health sector and more widely. To reiterate, patients and the public rightly expect high standards of data processing to protect their confidential health data. I have serious concerns that the Bill, in its current state, runs the risk of diluting those standards and protections.

The underlying policy priority for the Bill, as I understand it, is to stimulate innovation through broadening the definition of “scientific research”. However, there is concern—for example, that expressed by the Ada Lovelace Institute—that, as currently written, the provisions in the Bill are susceptible to misuse. We must ensure that the Bill explicitly forbids the mass reuse of personal data scraped from the internet or acquired through social media for AI product development under the auspices of “scientific research”; such reuse carries the potential for considerable public backlash. Voluntary commitments from the tech industry to protect people from the potential harms of AI models are welcome, of course, but are not good enough. Only hard rules enshrined in law can incentivise the developers and deployers of AI to comply, and empower the regulators to act.

Another unknown at this stage—I hope my noble friend can guide us here—is how far the Bill diverges from EU standards and potentially puts at risk the free flow of personal data between the EU and the UK. This free flow is critical to medical research and innovation and must be maintained.

I am also concerned about the issue of making data anonymous. It is incredibly difficult to make medical data anonymous. It is valueless in most cases if you do not know how old the subject is or their pre-existing conditions, and as soon as you have that sort of data it is open to manipulation. I believe that to counter those problems we need to expand the use of so-called trusted research environments. This is a well-developed technique in which Britain is the leader. I believe it should be a legal requirement in this field. The Bill does not go that far. It is certainly something we should discuss in Committee.

This is a system where the information—the subject’s data—is kept within a locked box, and it stays within the box. The medical researchers, who are crucial, write their program in a sandbox, and the program is then applied to the locked-away data. The researchers do not get the data; they get only the results of their inquiry. They never go anywhere near the data. This level of protection is required to achieve public support. The outcome of the research is identical, but the subjects’ medical information—crucially, but not only, their genetic information—is kept away and kept secure.
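
By way of illustration only, the pattern the noble Lord describes can be sketched in a few lines of Python. Nothing below is prescribed by the Bill or drawn from any real system: every name is hypothetical, and real trusted research environments are far more elaborate.

```python
# A minimal, illustrative sketch of a trusted research environment in the
# pattern described above. All names are hypothetical.

MIN_COHORT = 10  # refuse to release results derived from too few subjects


class TrustedResearchEnvironment:
    """The 'locked box': the data never leaves; only vetted results do."""

    def __init__(self, records):
        self._records = records  # held privately, never returned to callers

    def run_query(self, query_fn):
        """Run researcher-supplied code against the data and release only
        the aggregate result, never the underlying records."""
        cohort_size, aggregate = query_fn(self._records)
        if cohort_size < MIN_COHORT:
            raise PermissionError("cohort too small to release safely")
        return aggregate


# The researcher writes a program in their sandbox (here, the mean age of
# diabetic patients), submits it, and receives only the answer.
def mean_age_of_diabetics(records):
    ages = [r["age"] for r in records if "diabetes" in r["conditions"]]
    return len(ages), (sum(ages) / len(ages) if ages else 0.0)


synthetic = [{"age": 40 + i, "conditions": {"diabetes"} if i % 2 else set()}
             for i in range(30)]
tre = TrustedResearchEnvironment(synthetic)
print(tre.run_query(mean_age_of_diabetics))  # the result leaves; the data does not
```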

Finally, another point of concern that has been mentioned by a number of speakers is automated decision-making. The Bill removes the general prohibition on automated decision-making, placing responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. Even with the new safeguards being introduced, people will struggle to get meaningful explanations about decisions that will deeply affect their lives and will have difficulty exercising their right to appeal against automated decisions when the basis on which the decisions have been made is kept from them.

With those concerns, which I am sure we will discuss in Committee, I support the Bill.

19:25
Lord Freyberg Portrait Lord Freyberg (CB)
- View Speech - Hansard - - - Excerpts

My Lords, it is a great pleasure to follow the noble Lord, Lord Davies, and what he had to say on health data, with much of which I entirely agree. The public demand that we get this right, and we really must endeavour to do all we can to reassure them in this area.

I speak as someone deeply rooted in the visual arts and as an artist member of DACS—the Design and Artists Copyright Society. In declaring my interests, I also express gratitude for the helpful briefing provided by DACS.

The former Data Protection and Digital Information Bill returns to this House after its journey was interrupted by July’s general election. While this renewed Bill’s core provisions remain largely unchanged, the context in which we examine them has shifted significantly. The rapid advancements in artificial intelligence compel us to scrutinise this legislation not just for its immediate impact but for its long-term consequences. Our choices today will shape how effectively we safeguard the rights and interests of our citizens in an increasingly digital society. For this reason, the Bill demands meticulous and thorough examination to ensure that it establishes a robust governance framework capable of meeting present and future challenges.

Over the past year, Members of this House have carefully considered the opportunities and risks of large language models which power artificial intelligence applications—work that is still ongoing. I note that even today, the Lords Communications and Digital Committee, chaired by the noble Baroness, Lady Stowell of Beeston, is holding an evidence session on the role of AI in creative tech.

The committee’s previous inquiry into large language models stressed a need for cautious action. Drawing on expert testimony, its recommendations highlighted critical gaps in our current approach, particularly in addressing immediate risks in areas such as cybersecurity, counterterrorism, child protection, and disinformation. The committee rightly stressed the need for stronger assessments and guardrails to mitigate these harms, including in the area of data protection.

Regrettably, however, this Bill moves in the opposite direction: it seeks to lighten the regulatory governance of data processing and to relax the rules around automated decision-making, as other noble Lords have noted. Such an approach risks leaving our legislative framework ill prepared to address the potential risks that our own committee has so carefully documented.

The creative industries, which contribute £126 billion annually to the UK economy, stand particularly exposed. Evidence submitted to the committee documented systematic unauthorised use of copyrighted works by large language models, which harvest content across the internet while circumventing established licensing frameworks and creator permissions.

This threat particularly impacts visual artists—photographers, illustrators, designers, et cetera—many of whom already earn far below the minimum wage, as others, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Bassam and Lord Holmes, have already highlighted. These creators now confront a stark reality: AI systems can instantaneously generate derivative works that mimic their distinctive styles and techniques, all without attribution or compensation. This is not merely a theoretical concern; this technological displacement is actively eroding creative professionals’ livelihoods, with documented impacts on commission rates and licensing revenues.

Furthermore, the unauthorised use of reliable, trusted data, whether from reputable news outlets or authoritative individuals, fuels the spread of disinformation. These challenges require a solution that enables individuals and entities, such as news publishers, to meaningfully authorise and license their works for a fair fee.

This Bill not only fails to address these fundamental challenges but actively weakens existing protections. Most alarmingly, it removes vital transparency requirements for personal data, including data relating to individual creators, when used for research, archival and statistical purposes. Simultaneously, it broadens the definition of research to encompass “commercial” activities, effectively creating a loophole ripe for exploitation by profit-driven entities at the expense of individual privacy and creative rights.

Finally, a particularly troubling aspect of the Bill is its proposal to dissolve the Information Commissioner’s Office in favour of an information commission—a change that goes far beyond mere restructuring. Although I heard what the Minister said on this, by vesting the Secretary of State with sweeping powers to appoint key commission members, the Bill threatens to compromise the fundamental independence that has long characterised our data protection oversight. Such centralised political influence could severely undermine the commission’s ability to make impartial, evidence-based decisions, particularly when regulating AI companies with close government ties or addressing sensitive matters of national interest. This erosion of regulatory independence should concern us all.

In summary, the cumulative effect of this Bill’s provisions exposes a profound mismatch between the protections our society urgently needs and those this legislation would actually deliver. At a time when artificial intelligence poses unprecedented challenges to personal privacy and creative rights, this legislation, although positive on many fronts, appears worryingly inadequate.

19:31
Lord Lucas Portrait Lord Lucas (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I declare an interest in that, through the Good Schools Guide, I am an extensive user of government schools data. With another hat on, I share my noble friend Lord Markham’s worries about how this affects little organisations with a bit of membership data.

I very much look forward to Committee, when we will get into the Bill’s substance. I supported almost everything that the noble Baroness, Lady Kidron, said and look forward to joining in on that. I also very much support what my noble friend Lord Holmes said, in particular about trust, so he will be glad to know that I have in advance consulted Copilot as to the changes they would like to see in the Bill. If I may summarise what they said—noble Lords will note that I have taken the trouble to ascertain their choice of pronouns—they would like to see enhanced privacy safeguards, better transparency and accountability, regular public consultation and reviews of the Act, impact assessments before implementation, support for smaller entities and clearer definition of key terms. I am delighted by how much I find myself in agreement with our future overlords.

To add to what the noble Earl, Lord Erroll, said about digital identity being better, there was a widespread demonstration of that during Covid, when right-to-work checks went digital. Fraud went down as a result.

On the substantial changes that I would like to see, like my noble friend Lord Arbuthnot of Edrom, I would like a clear focus on getting definitions of data right. It is really important that we have stability and precision in data. What has been going on in sex and gender in particular is ridiculous. Like many other noble Lords, I also want a focus on the use of artificial intelligence in hiring. It is so easy now to get AI support for making a job application that the number of job applications has risen hugely. In response to this, of course, AI has been used in assessing job applications, because you really cannot plough through 500 in order to make a shortlist. Like the Better Hiring Institute, which I am associated with, I would really like to see AI used to give people the reasons why they have not been successful. Give everybody a reply and engage everybody in this process, rather than just ignoring them—and I apologise to the many people who send me emails that I do not reply to, but perhaps I will do better with a bit of AI.

This is a very seasonal Christmas tree of a Bill and I shall not be shy of hanging baubles on it when we come to Committee, in the way that many other noble Lords have done. My choices include trying to make it possible for the Student Loans Company to be more adventurous in the use of its data. It ought to be a really good way of finding out how successful our university system is. It is in touch with university graduates in a way that no other organisation is, but it feels constrained in the sorts of questions it might ask. I would really like Action Fraud to record all attempts at fraud, not just the successful frauds. We need a better picture of what is going on there. I would like to see another attempt to persuade the DfE that schools admissions data should be centrally gathered. At the moment it is really hard for parents to use, which means there is a huge advantage for parents who are savvy and have the time. That is not the way it should be. Everybody should have good, intelligent access to understanding what schools are open to them. There will be plenty of opportunities in Committee, which, as I say, I look forward to.

In the context of data and House of Lords reform, when I did a snap census at 5.47 pm, the Cross-Bench Peers were in the majority in the House. That suggests that, in providing Peers who have a real interest in the core business of this House—revising legislation—the process of choosing Cross-Bench Peers does rather better than the process of choosing the rest of us. If we are to reform the House of Lords, getting that aspect into the political selection would be no bad thing. I would also like some data, in the sense of some clear research, on the value of Statement repeats. I cannot recall an occasion when a Statement repeat resulted in any change of government policy of any description. Perhaps other noble Lords can enlighten me.

19:38
Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the noble Lord, Lord Lucas, for injecting a bit of reality into this discussion. I declare my interest as a governor of Coram. I thank the noble Lord very much for his comments about the Cross Benches. Perhaps if we let HOLAC choose the Cross Benches and Copilot the political appointees, we might be slightly more successful than we have been in recent years.

I am conscious that I am the only person between your Lordships and the joys of the Front-Bench spokespeople, so I shall not take too long. I welcome the Bill, but I have some concerns. In particular, in reading through the introduction on page 1 line by line, I counted 12 instances of the Bill being “to make provision” and not a single specific mention of protection, which I feel is perhaps a slight imbalance.

I have six areas of concern that I suspect I and many others will want to explore in Committee. I am not going to dive into the detail at this stage, because I do not think it is appropriate.

Like many noble Lords, including the noble Lords, Lord Knight, Lord Vaux and Lord Bassam, and the noble and learned Lord, Lord Thomas, I have some concerns about the extension of the UK’s data adequacy status beyond June next year. Given that one of the key objectives of this Bill is to help economic growth, it is incredibly important that that happens smoothly. It is my hope and expectation that His Majesty’s new Government will find it slightly less painful and more straightforward to talk to some of our colleagues across the water in the EU, to try to understand what each side is thinking and to ease the way to making that happen.

Secondly, like many noble Lords, I have a lot of concern about what is rather inelegantly known as “web scraping”—a term that sounds to me rather like an unpleasant skin rash—of intellectual property, and its sheer disregard for IP rights, which undermines the core elements of copyright and the value of unique creative endeavour. It is deeply distasteful and very harmful.

One area that I hope we will move towards in Committee is the different departments of His Majesty’s Government that have an interest in different parts of this Bill consciously working together. In the case of web scraping, I think the engagement of Sir Chris Bryant in his dual role as Minister of State for Data Protection and Minister for Creative Industries will be very important. I hope and expect that the Minister and her colleagues will be able to communicate as directly as possible and have a sort of interplay between us in Committee and that department to make sure that we are getting the answers that we need. It is frankly unfair on the Minister, given that she is covering the ground of I-do-not-know-how-many Ministers down the other end, for her also to take on board the interests, concerns and views of other departments, so I hope that we can find a way of managing that in an integrated way.

Thirdly, like my noble friend Lady Kidron, I am appalled by AI-produced child sexual abuse material. To that extent, I make a direct request to the Minister and the Bill team that they read an article published on 18 October in the North American magazine the Atlantic by Caroline Mimbs Nyce, “The Age of AI Child Abuse Is Here”. She writes about what is happening in the USA, but it is unpleasantly prescient. She writes,

“child-safety advocates have warned repeatedly that generative AI is now being widely used to create sexually abusive imagery of real children”—

not AI avatars—

“a problem that has surfaced in schools across the country”.

It is certainly already happening here, and it will accelerate. Some of your Lordships may have read about a scandal that has emerged in South Korea. My daughter-in-law is South Korean. There, AI-created adult sexual material has caused major trauma and a major decline in female mental health. In addition, when it comes to schools, there are real concerns about the ability of companies to scoop up photographic information about children from photos that schools have on their own websites or Facebook pages. Those images can then potentially be used for a variety of very unpleasant purposes, so I think that is an area which we would want to look at very carefully.

Fourthly, there are real concerns about the pervasive spread of educational technology—edtech, as it is known informally—driven, understandably, by commercial rather than educational ambition in many cases. We need to ensure that the age-appropriate design code applies to edtech, and that is something we should explore. We need to prioritise the creation of a code of practice for edtech. We know of many instances where children’s data has been collected in situations where the educational establishments themselves, although they are charged with safeguarding, are wholly inadequate to the task, partly because they do not really understand it and partly because they do not necessarily have the expertise. It is unacceptable that children in school, which should be a place of safety, are inadvertently exposed to potential harm because schools do not have the power, resources and knowledge to protect the children for whom they are responsible. We need to think carefully about what we must do to enhance their ability to do so.

On my fifth concern, the noble Lords, Lord Stevenson and Lord Holmes, made a very good point, in part about the Bletchley declaration. It would be helpful for us as a country and certainly as Houses of Parliament to have some idea of where the Government think we are going. I understand that the new Government are relatively recently into their first term and are somewhat cautious about saying too much about areas that they might subsequently regret, but I think there is a real appetite for a declaratory vision with a bit of flesh on it. We all understand that it might need to change as AI, in particular, threatens to overtake it, but having a stab at saying where we are, what we are doing, why we are doing it, the direction of travel and what we are going to do to modify it as we go along, because we are going to have to because of AI, would be helpful and, frankly, reassuring.

Lastly, during the passage of the Online Safety Bill, many of us tried to make the case for a Joint Committee to oversee digital regulation and the regulators themselves. I think it would be fair to say that the experience of those of us who were particularly closely involved with what is now the Online Safety Act and the interactions that we have had formally or informally with the regulator since then, and the frustrations that have emerged from those interactions, have demonstrated the value of having a joint statutory committee with all the powers that it would have to oversee and, frankly, to call people to account. It would really concentrate minds and make the implementation of that Act, and potentially this Act, more streamlined, more rapid and more effective. It could be fine-tuned thereafter much more effectively, in particular if we are passing a Bill that I and my fellow members of the Secondary Legislation Scrutiny Committee will have the joy of looking at in the form of statutory instruments. Apart from anything else, having a Joint Committee keep a close watch on the flow of statutory instruments would be enormously helpful.

As we are dealing with areas which are in departments that are not immediately within the remit of the Minister, such as the Department for Education given what I was talking about with schools, anything we can do to make it clear that the left hand knows what the right hand is doing would be extraordinarily helpful. I think there have been precedents in particular cases in Committee when we are dealing with quite detailed amendments for colleagues from other departments to sit on the Bench alongside the Minister to provide real departmental input. That might be a course that we could fruitfully follow, and I would certainly support it.

19:49
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I draw attention to my AI interests in the register. I thank the Minister for her upbeat introduction to the Bill and all her engagement to date on its contents. It has been a real pleasure listening to so many expert speeches this afternoon. The noble Lord, Lord Bassam, did not quite use the phrase “practice makes perfect”, because, after all, this is the third shot at a data protection Bill over the past few years, but I was really taken by the vision and breadth of so many speeches today. I think we all agree that this Bill is definitely better than its two predecessors, but of course most noble Lords went on to say “but”, and that is exactly my position.

Throughout, we have been reminded of the growing importance of data in the context of AI adoption, particularly in the private and public sectors. I think many of us regret that “protection” is not included in the Bill title, but that should go hand in hand if not with actual AI regulation then at least with an understanding of where we are heading on AI regulation.

Like others, I welcome that the Bill omits many of the proposals from the unlamented Data Protection and Digital Information Bill, which in our view—I expect to see a vigorous shake of the head from the noble Viscount, Lord Camrose—watered down data subject rights. The noble Lord, Lord Bassam, did us a great favour by setting out the list of many of the items that were missing from that Bill.

I welcome the retention of some elements in this Bill, such as the digital registration of birth and deaths. As the noble Lord, Lord Knight, said, and as Marie Curie has asked, will the Government undertake a review of the Tell Us Once service to ensure that it covers all government departments across the UK and is extended to more service providers?

I also welcome some of the new elements, in particular amendments to the Online Safety Act—essentially unfinished business, as far back as our Joint Committee. It was notable that the noble Lord, Lord Bethell, welcomed the paving provisions regarding independent researchers’ access to social media and search services, but there are questions even around the width of that provision. Will this cover research regarding non-criminal misinformation on internet platforms? What protection will researchers conducting public interest research actually receive?

Then there is something that the noble Baroness, Lady Kidron, Ian Russell and many other campaigners have fought for: access for coroners to the data of young children who have passed away. I think that will be a milestone.

The Bill may need further amendment. On these Benches we may well put forward further changes for added child protection, given the current debate over the definition of category 1 services.

There are some regrettable omissions from the previous Bill, such as those extending the soft opt-in that has always existed for commercial organisations to non-commercial organisations, including charities. As we have heard, there are a considerable number of unwelcome retained provisions.

Many noble Lords referred to “recognised legitimate interests”. The Bill introduces to Article 6 of the GDPR a new ground of recognised legitimate interest, which counts as a lawful basis for processing if it meets any of the descriptions in the new Annex 1 to the GDPR in Schedule 4 of the Bill. The Bill essentially qualifies the public interest test under Article 6(1)(e) of the GDPR and, as the noble Lord, Lord Vaux, pointed out, gives the Secretary of State powers to define additional recognised legitimate interests beyond those in the annex. This was queried by the Constitution Committee, and we shall certainly be kicking the tyres on that during Committee. Crucially, there is no requirement for the controller to make any balancing test, as the noble Viscount, Lord Colville, mentioned, taking the data subject’s interests into account. It just needs to meet the grounds in the annex. These provisions diminish data protection and represent a threat to data adequacy, and should be dropped.

Almost every noble Lord raised the changes to Article 22 and automated decision-making. With the exception of sub-paragraph (d), to be inserted by Clause 80, the provisions are very similar to those of the old Clause 14 of the DPDI Bill in limiting the right not to be subject to automated decision-making processing or profiling to special category data. Where automated decision-making is currently broadly prohibited with specific exceptions, the Bill will permit it in all but a limited set of circumstances. The Secretary of State is given the power to redefine what ADM actually is. Again, the noble Viscount, Lord Colville, was right in how he described what the outcome of that will be. Given the Government’s digital transformation agenda in the public sector and the increasing use of AI in the private sector, this means increasing the risk of biased and discriminatory outcomes in ADM systems.

Systems such as HART, which predicted reoffending risk, PredPol, which was used to allocate policing resources based on postcodes, and the gangs matrix, which harvests intelligence, have all been shown to have had discriminatory effects. It was a pleasure to hear what the noble Lord, Lord Arbuthnot, had to say. Have the Government learned nothing from the Horizon scandal? As he said, we need to move urgently to change the burden of proof for computer evidence. What the noble Earl, Lord Erroll, said in reminding us of the childlike learning abilities of AI was extremely important in that respect. We should not place that kind of trust in the evidence given by these models.

ADM safeguards are critical to public trust in AI, and our citizens need greater not less protection. As the Ada Lovelace Institute says, the safeguards around automated decision-making, which exist only in data protection law, are more critical than ever in ensuring that people understand when a significant decision about them is being automated, why that decision has been made, and the routes to challenge it or ask for it to be decided by a human. The noble Viscount, Lord Colville, and the noble Lord, Lord Holmes, set out that prescription, and I entirely agree with them.
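
For illustration only, those three safeguards (notice, explanation and a route to challenge or seek human review) could be captured in a minimal decision record along the following lines. All names are hypothetical and nothing here is drawn from the Bill.

```python
# An illustrative sketch only: the safeguards listed above expressed as a
# minimal record an ADM system might be required to keep. All field names
# are hypothetical.
from dataclasses import dataclass


@dataclass
class AutomatedDecisionRecord:
    subject_id: str
    outcome: str
    subject_notified: bool    # the person was told the decision was automated
    explanation: str          # why the decision was made, in plain terms
    challenge_route: str      # how to contest it
    human_review_requested: bool = False

    def request_human_review(self):
        """Route the decision to a person, the fallback the safeguards demand."""
        self.human_review_requested = True


record = AutomatedDecisionRecord(
    subject_id="applicant-042",
    outcome="declined",
    subject_notified=True,
    explanation="income below affordability threshold",
    challenge_route="appeals@lender.example",
)
record.request_human_review()
```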

This is a crucial element of the Bill but I will not spend too much time on it because, noble Lords will be very pleased to hear, I have a Private Member’s Bill on this subject, providing much-needed additional safeguards for ADM in the public sector, coming up on 13 December. I hope noble Lords will be there and that the Government will see the sense of it in the meantime.

We have heard a great deal about research. Clause 68 widens research access to data. There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or because of very narrow distinctions between the original and new purpose. However, it is quite clear that the definition of scientific research introduced by the Bill is too broad and risks abuse by commercial interests. A number of noble Lords raised that, and I entirely agree with the noble Baroness, Lady Kidron, that the Bill opens the door to data reuse and mass data scraping for any data-driven product development under the auspices of scientific research. Subjects cannot make use of their data rights if they do not even know that their data is being processed.

On overseas transfers, I was very grateful to hear what the noble and learned Lord, Lord Thomas, had to say about data adequacy, and the noble Lords, Lord Bethell, Lord Vaux and Lord Russell, also raised this. All of us are concerned about the future of data adequacy, particularly the tensions that are going to be created with the new Administration in the US if there are very different bases for dealing with data transfer between countries.

We have concerns about the national security provisions. I will not go into those in great detail, but why do the Government believe that these clauses are necessary to safeguard national security?

Many noble Lords raised the question of digital verification services. It was very interesting to hear what the noble Earl, Lord Erroll, had to say, given his long-standing interest in this area. We broadly support the provisions, but the Constitution Committee followed the DPRRC in criticising the lack of parliamentary scrutiny of the framework to be set by the Secretary of State or managed by DSIT. How will they interoperate with the digital identity verification services being offered by DSIT within the Government’s One Login programme?

Will the new regulator be independent, ensure effective governance and accountability, monitor compliance, investigate malicious actors and take enforcement action regarding these services? For high levels of trust in digital ID services, we need high-quality governance. As the noble Lord, Lord Vaux, said, we need to be clear about the status of physical ID alongside that. Why is there still no digital identity offence? I entirely agreed with what the noble Lords, Lord Lucas and Lord Arbuthnot, said about the need for factual clarity underlying the documents that will be part of the wallet—so to speak—in terms of digital ID services. It is vital that we distinguish and make sure that both sex and gender are recorded in our key documents.

There are other areas about which we on these Benches have concerns, although I have no time to go through them in great detail. We support the provisions on open banking, which we want to see used and their opportunities properly exploited. However, as the noble Lord, Lord Holmes, said, we need a proper narrative that sells the virtues of open banking. We are concerned that the current design allows landlords to monitor the bank accounts of tenants for as long as an open banking approval lasts. Smart data legislation should mandate that the maximum and default access duration be no longer than 24 hours.
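
For illustration only, a default cap of that kind might be enforced along the following lines. The sketch assumes the proposed 24-hour ceiling; all names are hypothetical.

```python
# An illustrative sketch only: every open banking consent is capped at a
# maximum duration, however long the requesting party asked for.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

MAX_ACCESS_DURATION = timedelta(hours=24)  # the proposed default ceiling


@dataclass
class AccessConsent:
    granted_at: datetime
    duration: timedelta

    def __post_init__(self):
        # Cap every consent at the ceiling, whatever was requested.
        self.duration = min(self.duration, MAX_ACCESS_DURATION)

    def is_valid(self, now=None):
        now = now or datetime.now(timezone.utc)
        return now < self.granted_at + self.duration


# A landlord requesting a year of access is silently capped at 24 hours.
consent = AccessConsent(datetime.now(timezone.utc), timedelta(days=365))
assert consent.duration == MAX_ACCESS_DURATION
print(consent.is_valid())  # True now; False once the window lapses
```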

A formidable number of noble Lords spoke about web trawling by AI developers to train their models. It is vital that copyright owners have meaningful control over their content, and that there is a duty of transparency, with penalties for scraping news publishers’ and other copyrighted content.

The noble and learned Lord, Lord Thomas, very helpfully spoke about the Government’s ECHR memorandum. I do not need to repeat what he said, but clearly, this could lead to a significant gap, given that the Retained EU Law (Revocation and Reform) Act 2023 has not been altered and is not altered by this Bill.

There are many other aspects to this. The claims for this Bill and these provisions are as extravagant as for the old one; I think the noble Baroness mentioned the figure of £10 billion at the outset. We are in favour of growth and innovation, but how will this Bill also ensure that fundamental rights for the citizen will be enhanced in an increasingly AI-driven world?

We need to build public trust, as the noble Lord, Lord Holmes, and the noble Baroness, Lady Kidron, said, in data sharing and access. To achieve the ambitions of the Sudlow review, there are lessons that need to be learned by the Department of Health and the NHS. We need to deal with edtech, as has been described by a number of noble Lords. All in all, the Government are still not diverging enough from the approach of their predecessor in their enthusiasm for the sharing and use of data across the public and private sectors without the necessary safeguards. We still have major reservations, which I hope the Government will respond to. I look forward—I think—to Grand Committee.

20:04
Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

My Lords, let me start by repeating the thanks others have offered to the Minister for her ongoing engagement and openness, and to the Bill team for their—I hope ongoing—helpfulness.

Accessing and using data safely is a deeply technical legislative subject. It is, perhaps mysteriously, of interest to few but important to more or less everyone. Before I get started, I will review some of the themes we have been hearing about. Given the hour, I will not go into great detail about most of them, but I think it is worth playing some of them back.

The first thing that grabbed me, which a number of noble Lords brought up, was the concept of data as an asset. I believe the Minister used the phrase “data as DNA”, and that is exactly the right metaphor. Whether data is a sovereign asset or on the balance sheet of a private organisation, that is an incredibly important and helpful way to see it. A number of noble Lords brought this up, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Knight and Lord Stevenson of Balmacara.

I was pleased that my noble friend Lord Lucas brought up the use of AI in hiring, if only because I have a particular bee in my bonnet about this. I have taken to writing far too many grumpy letters to the Financial Times about it. I look forward to engaging with him and others on that.

I was pleased to hear a number of noble Lords raise the issue of the burdens on small business and making sure that those burdens, in support of the crucial goal of protecting privacy, do not become disproportionate relative to the ability of small businesses to execute against them. The noble and learned Lord, Lord Thomas, the noble Lords, Lord Stevenson of Balmacara and Lord Bassam, and my noble friend Lord Markham brought that up very powerfully.

I have cheated by making an enormous group of themes, including ADM, AI and text and data mining—and then I have added Horizon on at the end. It is thematically perhaps a little ambitious, but we are getting into incredibly important areas for the well-being and prosperity of so many people. A great many noble Lords got into this very persuasively and compellingly, and I look forward to a great deal of discussion of those items as we go into Committee.

Needless to say, the importance of adequacy came up, particularly from the noble Lords, Lord Vaux and Lord Bassam, and the noble and learned Lord, Lord Thomas. There is a key question here: have we reduced the risk of loss of adequacy to as close to zero as we can reasonably get, while recognising that it is a decision that is essentially out of our sovereign hands?

A number of noble Lords brought up the very tricky matter of the definition of scientific research—among them the noble Viscount, Lord Colville, my noble friend Lord Bethell and the noble Lords, Lord Davies of Brixton and Lord Freyberg. This is a significant challenge to the effectiveness of the legislation. We all know what we are trying to achieve, but the skill and the art of writing it down is a considerable challenge.

My final theme, just because I so enjoyed the way in which it was expressed by the noble Lord, Lord Knight, is the rediscovery of the joys of a White Paper. That is such an important point—to have the sense of an overall strategy around data and technology as well as around the various Bills that came through in the previous Parliament and will, of course, continue to come now, as these technologies develop so rapidly.

My noble friend Lord Markham started by saying that we on these Benches absolutely welcome the Government’s choice to move forward with so many of the provisions originally set out in the previous Government’s DPDI Bill. That Bill was built around substantial consultation and approved by a range of stakeholders. We are particularly pleased to see the following provisions carried forward. One is the introduction of a national underground asset register. As many others have said, it will not only make construction and repairs more efficient but make them safer for construction workers. Another is giving Ofcom the ability, when notified by the coroner, to demand that online service providers retain data in the event of any child death. I notice the noble Baroness, Lady Kidron, nodding at that—and I am delighted that it remains.

On reforming and modernising the ICO, I absolutely take the point raised by some that this is an area that will take quite considerable questioning and investigation, but overall the thrust of the purpose of modernising that function is critical to the success of the Bill. We absolutely welcome the introduction of a centralised digital ID verification framework, recognising noble Lords’ concerns about it, of course, and allowing law enforcement bodies to make greater use of biometric data for counterterrorism purposes.

That said, there are provisions that were in the old DPDI Bill whose removal we regret, many of which we felt would have improved data protection and productivity by offering SMEs in particular greater agency to deal with non-high-risk data in less cumbersome ways while still retaining the highest protections for high-risk data. I very much welcome the views so well expressed by the noble and learned Lord, Lord Thomas of Cwmgiedd, on this matter. As my noble friend Lord Markham put it, this is about being wisely careful but not necessarily hyper-careful in every case. That is at least a way of expressing the necessary balance.

I regret, for example—the noble Lord, Lord Clement-Jones, possibly regrets this less than I do—that the Government have chosen to drop the “vexatious and excessive” standard for subject access requests to refer to “manifestly unfounded or excessive”. The term “vexatious” emerged from extensive consultation and would, among other things, have prevented the use of SARs to circumvent courts’ discovery processes. I am concerned that, by dropping this definition, the Government have missed an opportunity to prevent misuse of the deeply important subject access rights. I hope very much to hear from the Minister how the Government propose to address such practices.

In principle, we do not approve of the Government giving themselves the power to gain greater knowledge of citizens’ activities. Indeed, the Constitution Committee has made it clear that any legislation dealing with data protection must carefully balance the use of personal data by the state for the provision of services and for national security purposes against the right to a private life and freedom of expression. We on these Benches feel that, on the whole, the DPDI Bill maintained the right balance between those two opposing legislative forces. However, we worry that the DUA Bill, if used in conjunction with other powers that have been promised in the fraud, error and debt Bill, would tip too far in favour of government overreach.

Part 1 of the Bill, on customer and business data, contains many regulation-making powers. The noble Viscount, Lord Colville, my noble friend Lord Holmes and the noble Lord, Lord Russell, spoke powerfully about this, and I would like to express three concerns. First, the actual regulations affecting vast quantities of business and personal data are not specified in the Bill; they will be implemented through secondary legislation. Will the Minister give us some more information, when she stands up, about what these regulations may contain? This concern also extends to Part 2, on digital verification services, where in Clause 28,

“The Secretary of State must prepare and publish … rules concerning the provision of digital verification services”.

The Select Committee on the Constitution has suggested that this power should be subject to parliamentary scrutiny. I must say that I am minded to agree.

Secondly, throughout Part 1, regulation-making powers are delegated to both the Secretary of State and the Treasury. This raises several questions. Can the Secretary of State and the Treasury make regulations independently of one another? In the event of a disagreement between these government departments, who has the final say, and what are the mechanisms should they disagree? We would welcome some commentary and explanation from the Minister.

Thirdly, as the Select Committee on the Constitution has rightly pointed out, Clause 133 contains a Henry VIII power. It allows the Secretary of State, by regulations, to make consequential amendments to the provisions made by this Bill. This allows amendments to any

“enactment passed or made before the end of the Session in which this Act is passed”.

Why is this necessary?

The Bill introduces some exciting new terminology, namely “data holder” and “data trader”. Will the Minister tell the House what these terms mean and why they need to coexist alongside the existing terminology of “data processor” and “data controller”? I certainly feel that data legislation is quite complex enough without adding overlapping new terminology if we do not really need it.

I stress once again the concerns rightly raised by my noble friend Lord Markham about NUAR security. Are the Government satisfied that the operational protection of NUAR is sufficient to protect this valuable information from terrorist and criminal threats? More generally, additional cybersecurity measures must be implemented to protect personal data during this mass digitisation push. Will the Minister tell the House how these necessary security measures will be brought forward?

Finally, as I am sure all noble Lords will recall, the previous Government published a White Paper that set out five principles for AI. As a reminder, those were: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress. I am minded to table an amendment to Clause 80, requiring those using AI in their automated decision-making process to have due regard for these five principles. I noted with interest that the noble Lord, Lord Stevenson of Balmacara, proposed something very similar but using the Bletchley principles. I am very keen to explore that further, on the grounds that it might be an interesting way of having principles-driven AI inserted into this critical Bill.

In conclusion, we on these Benches are broadly supportive of the Bill. We do, as I have set out, have a few concerns, which I hope the Minister will be willing to listen to.

20:18
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I thank all noble Lords for what has genuinely been a fascinating, very insightful debate. Even though I was part, I think, of my noble friend Lord Stevenson’s gang that has been working on this for some time, one learns new things, and I have learned new things again today about some of the issues that are challenging us. So I thank noble Lords for their contributions this evening, and I am very pleased to hear that a number of noble Lords have welcomed the Government’s main approach to the Bill, though of course beyond that there are areas where our concerns will diverge and, I am sure, be subject to further debate. I will try to clarify the Government’s thinking. I am sure noble Lords will understand, because we have had a very wide-ranging discussion, that if I am not able to cover all points, I will follow those up in writing.

I shall start with smart data. As was raised by my noble friend Lord Knight of Weymouth, and other noble Lords, the Government are keen to establish a smart data economy that brings benefits to consumers across all sectors.

Through the Smart Data Council, the Government are working closely to identify areas where smart data schemes might be able to bring more benefits. I think the point was made that we are perhaps not using it sufficiently at the moment. The Government intend to communicate where and in what ways smart data schemes can support innovation and growth and empower customers across a spectrum of markets—so there is more work to be done on that, for sure. These areas include providing the legislative basis for the fuel finder service announced by the Department for Energy Security and Net Zero, and supporting an upcoming call for evidence on the smart data scheme for the energy sector. Last week, the Government set out their priorities for the future of open banking in the national payments vision, which will pave the way for the UK to lead in open finance.

I turn now to digital identity, as raised by the noble Earl, Lord Erroll, and a number of other noble Lords. The measures in the Bill aim to help people and businesses across Britain to use innovative digital identity technologies and to realise their benefits with confidence. As the noble Lord, Lord Arbuthnot, said, the Bill does not make digital identities mandatory. The Bill will create a legislative structure of standards, governance and oversight for digital verification services that wish to appear on a government register, so that people will know what a good digital identity looks like. It is worth saying that a lot of these digital verification schemes already exist; we are trying to make sure that they are properly registered and have oversight. People need to know what a good digital identity looks like.

The noble Lord, Lord Arbuthnot, raised points about Sex Matters. Digital verification services can be used to prove sex or gender in the same way that individuals can already prove their sex using their passport, for example. Regarding the concerns of the noble Lord, Lord Vaux, about the inclusion of non-digital identity, the Government are clear that people who do not want to use digital identity or the digital verification services can continue to access services and live their daily lives referring to paper documents when they need to. Where people want to use more technology and feel left behind, DSIT is working hard to co-ordinate government work on digital inclusion. This is a high priority for the Government, and we hope to come back with further information on that very soon.

The Office for Digital Identities and Attributes has today published its first digital identity inclusion monitoring report. The results show a broadly positive picture of inclusion at this early stage of the markets, and its findings will inform future policy interventions.

I would like to reassure the noble Lord, Lord Markham, and the noble Viscount, Lord Camrose, that NUAR takes advantage of the latest technologies to ensure that data is accessed only for approved purposes, with all access audited. It also includes controls, developed in collaboration with the National Protective Security Authority, the National Cyber Security Centre and the security teams of asset owners themselves.

We had a very wide-ranging debate on data protection issues, and I thank noble Lords for their support for our changes to this legislation. The noble Viscount, Lord Camrose, and others mentioned delegated powers. The Government have carefully considered each delegated power and the associated parliamentary procedure and believe that each is proportionate. The detail of our rationale is set out in our delegated powers memorandum.

Regarding the concerns of the noble Lord, Lord Markham, and the noble Viscount, Lord Camrose, about the effect of the legislation on SMEs, we believe that small businesses would have struggled with the lack of clarity in the term “high-risk processing activities” in the previous Bill, which could have created more burdens for SMEs. We would prefer to focus on how small businesses can be supported to comply with the current legislation, including through user-friendly guidance on the ICO’s small business portal.

Many noble Lords, including the noble Viscount, Lord Camrose, the noble and learned Lord, Lord Thomas, and the noble Lord, Lord Vaux, raised EU adequacy. The UK Government recognise the importance of retaining our personal data adequacy decisions from the EU. I reassure the noble Lord, Lord Vaux, and my noble friend Lord Bassam that Ministers are already engaging with the European Commission, and officials will actively support the EU’s review process in advance of the renewal deadline next year. The free flow of personal data between the UK and the EU is one of the underpinning actions that enables research and innovation, supports the improvement of public services and keeps people safe. I join the noble Lord, Lord Vaux, in thanking the European Affairs Committee for its work on the matter. I can reassure him and the committee that the Secretary of State will respond within the required timeframe.

The noble Lord, Lord Bethell, and others raised international data transfers. Controllers and processors must take reasonable and proportionate steps to satisfy themselves that, after the international transfer, the level of protection for the data subject will be “not materially lower” than under UK data protection law. The Government take their responsibility seriously to ensure that data and its supporting infrastructure are secure and resilient.

On the question from the noble Viscount, Lord Colville, about the new recognised legitimate interest lawful ground, the entire point of the new lawful ground is to provide more legal certainty for data controllers that they are permitted to process personal data for the activities mentioned in new Annex 1 to the UK GDPR. However, the processing must still be necessary and proportionate and meet all other UK GDPR requirements. That includes the general data protection principles in Article 5 of the UK GDPR, and the safeguards in relation to the processing of special category data in Article 9.

The Bill has significantly tightened up on the regulation-making power associated with this clause. The only processing activities that can be added to the list of recognised legitimate interests are those that serve the objectives of public interest, as described in Article 23(1) of the UK GDPR. The Secretary of State would also have to have regard to people’s rights and the fact that children may be less aware of the risks and consequences of the processing of their data before adding new activities to the list.

My noble friends Lord Davies of Brixton and Lord Stevenson of Balama—do you know, I have never had to pronounce his full name—Balmacara, raised NHS data. These clauses are intended to ensure that IT providers comply with relevant information standards in relation to IT use for health and adult social care, so that, where data is shared, it can be done in an easier, faster and cheaper way. Information standards create binding rules to standardise the processing of data where it is otherwise lawful to process that data. They do not alter the legal obligations that apply in relation to decisions about whether to share data. Neither the Department of Health and Social Care nor the NHS sells data or provides it for purely commercial purposes such as insurance or marketing purposes.

With regard to data assets, as raised by the noble Baroness, Lady Kidron, and my noble friend Lord Knight of Weymouth, the Government recognise that data is indeed one of the most valuable assets. It has the potential to transform public services and drive cutting-edge innovation. The national data library will unlock the value of public data assets. It will provide simple, secure and ethical access to our key public data assets for researchers, policymakers and businesses, including those at the frontier of AI development, and make it easier to find, discover and make connections across those different databases. It will sit at the heart of an ambitious programme of reform that delivers the incentives, investment and leadership needed to secure the full benefits for people and the economy.

The Government are currently undertaking work to design the national data library. In its design, we want to explore the best models of access so that public sector data benefits our society, much in the way that the noble Baroness, Lady Kidron, outlined. So, decisions on its design and implementation will be taken in due course.

Regarding the concerns of the noble Lord, Lord Markham, about cybersecurity, as announced in the King’s Speech, the Government will bring forward a cybersecurity and resilience Bill this Session. The Bill will strengthen our defences and ensure that more essential digital services than ever before are protected.

The noble Baroness, Lady Kidron, the noble Viscount, Lord Colville, and my noble friend Lord Stevenson of Balmacara, asked about the Government’s plans to regulate AI and the timing of this legislation. As set out in the King’s Speech, the Government are committed to establishing appropriate legislation for companies developing the most powerful AI systems. The Government will work with industry, civil society and experts across the UK before legislation is drawn up. I look forward to updating the House on these proposals in due course. In addition, the AI opportunities action plan will set out a road map for government to capture the opportunities of AI to enhance growth and productivity and create tangible benefits for UK citizens.

Regarding data scraping, as raised by the noble Baroness, Lady Kidron, the noble Viscount, Lord Colville of Culross, and others, although it is not explicitly addressed in the data protection legislation, any such activity involving personal data would require compliance with the data protection framework, in particular the requirements that the use of data be fair, lawful and transparent.

A number of noble Lords talked about AI in the creative industries, particularly the noble Lords, Lord Holmes and Lord Freyberg—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I am sorry to interrupt what is a very fluent and comprehensive response. I do not want to break the thread, but can I press the Minister a little bit on those companies whose information which is their intellectual property is scraped? How will that be resolved? I did not pick up from what the Minister said that there was going to be any action by the Government. Are we left where we are? Is it up to those who feel that their rights are being taken away or that their data has been stolen to raise appropriate action in the courts?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I was going to come on to some of those issues. Noble Lords talked about AI in the creative industries, which I think my noble friend is particularly concerned about. The Government are working hard on this and are developing an effective approach that meets the needs of the UK. We will announce more details in due course. We are working closely with relevant stakeholders and international partners to understand views across the creative sector and AI sectors. Does that answer my noble friend’s point?

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

With respect, it is the narrow question that a number of us have raised. Training the new AI systems is entirely dependent on them being fed vast amounts of material which they can absorb, process and reshape in order to answer questions that are asked of them. That information is to all intents and purposes somebody else’s property. What will happen to resolve the barrier? At the moment, they are not paying for it but just taking it—scraping it.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Perhaps I may come in too. Specifically, how does the data protection framework change it? We have had the ICO suggesting that the current framework works perfectly well and that it is the responsibility of the scrapers to let the IP holders know, while the IP holders have not a clue that it is being scraped. It is already scraped and there is no mechanism. I think we are a little confused about what the plan is.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I can certainly write to noble Lords setting out more details on this. I said in response to an Oral Question a few days ago that my honourable friend Minister Clark in DSIT and Chris Bryant, whom the noble Lord, Lord Russell, mentioned, are working jointly on this. They are looking at a proposal that can come forward on intellectual property in more detail. I hope that I can write to noble Lords and set out more detail on that.

On the question of the Horizon scandal and the validity of computers, raised, quite rightly, by the noble Lords, Lord Arbuthnot and Lord Holmes, and the noble Baroness, Lady Kidron, I think we all understand that the Horizon scandal was a terrible miscarriage of justice, and the wrongful convictions of postmasters have rightly been overturned. Those Post Office prosecutions relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was not, therefore, purely about the reliability of the computer-generated evidence. Almost all criminal cases rely to some extent on computer evidence, so the implications of amending the law in this area are far-reaching, a point made by several noble Lords. The Government are aware that this is an issue, are considering this matter very carefully and will announce next steps in due course.

Many noble Lords, including the noble Lords, Lord Clement-Jones, Lord Vaux and Lord Holmes of Richmond, and the noble and learned Lord, Lord Thomas, raised automated decision-making. I noted in my opening speech how the restored accountability framework gives us greater confidence in ADM, so I will not go over that again in detail. But to explain the Bill’s drafting, I want to reassure and clarify for noble Lords that the Bill means that the organisation must first inform individuals if a legal or significant decision has been taken in relation to them based solely on automated processing, and then they must give individuals the opportunity to challenge such decisions, obtain human intervention for them and make representations about them to the controller.
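
To make that sequence concrete, here is a minimal sketch in Python of the safeguards as described: inform first, then allow challenge, representations and human intervention. All class and function names are illustrative assumptions, not anything prescribed by the Bill.

    from dataclasses import dataclass, field

    @dataclass
    class AutomatedDecision:
        # A legal or significant decision based solely on automated processing
        subject_id: str
        outcome: str
        notified: bool = False
        representations: list[str] = field(default_factory=list)
        human_reviewed: bool = False

    def inform(decision: AutomatedDecision) -> None:
        # Step 1: the organisation must first inform the individual
        print(f"Informing {decision.subject_id}: automated decision taken: {decision.outcome}")
        decision.notified = True

    def challenge(decision: AutomatedDecision, representation: str) -> None:
        # Steps 2 and 3: the individual may challenge the decision, make
        # representations to the controller and obtain human intervention
        if not decision.notified:
            raise RuntimeError("The individual must be informed before a challenge")
        decision.representations.append(representation)
        decision.human_reviewed = True  # escalated to a human reviewer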

The regulation-making powers will future-proof the ADM reforms in the Bill, ensuring that the Government will have the powers to bring greater legal certainty, where necessary and proportionate, in the light of constantly evolving technology. I reiterate that there will be the right to human intervention, and it will be on a personal basis.

The noble Baroness, Lady Kidron, and the noble Lords, Lord Russell of Liverpool and Lord Clement-Jones, raised concerns about edtech. The Government recognise that concerns have been raised about the amount of personal data collected by education technology used in schools, and whether this is fully transparent to children and parents. The Department for Education is committed to improving guidance and support for schools to help them better navigate this market. For example, its Get Help with Data Protection in Schools project has been established to develop guidance and tools that help schools both understand and comply with data protection legislation. Separately, the ICO has carried out a series of audits on edtech service providers, assessing privacy risks and potential non-compliance with data protection regulations in the development, deployment and use of edtech solutions in schools.

The creation of child sexual abuse material, CSAM, through any medium, including AI, offline or online, is and will continue to be illegal. This is a forefront priority for this Government and we are considering all levers that can be utilised to fight child sexual abuse. Responsibility for the law in this area rests with the Home Office; I know it is actively and sympathetically looking at this matter and I understand that my colleague the Safeguarding Minister will be in touch with the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, ahead of Committee.

I can see that I am running out of time so, rather than testing noble Lords’ patience, I will draw my comments to a close. I have not picked up all the comments that colleagues made, but I thank everybody for their excellent contributions. This is the beginning of a much longer conversation, which I am very much looking forward to, as I am to hearing all those who promised to participate in Committee. I am sure we will have a rich and interesting discussion then.

I hope I have persuaded some noble Lords that the Bill is not only wide-ranging but also has a clear and simple focus: growing the economy, creating a modern, digital government and, most importantly, improving people’s lives, underpinned by robust personal data protection. I will not say any more at this stage. We will follow up but, in the meantime, I beg to move.

Bill read a second time.
Commitment and Order of Consideration Motion
Moved by
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch
- Hansard - - - Excerpts

That the Bill be committed to a Grand Committee, and that it be an instruction to the Grand Committee that they consider the Bill in the following order: Clauses 1 to 56, Schedule 1, Clauses 57 and 58, Schedule 2, Clauses 59 to 65, Schedule 3, Clauses 66 to 70, Schedule 4, Clause 71, Schedule 5, Clauses 72 to 80, Schedule 6, Clauses 81 to 84, Schedules 7 to 9, Clauses 85 to 102, Schedule 10, Clauses 103 to 107, Schedule 11, Clauses 108 to 111, Schedule 12, Clauses 112 and 113, Schedule 13, Clauses 114 and 115, Schedule 14, Clauses 116 to 119, Schedule 15, Clause 120, Schedule 16, Clauses 121 to 138, Title.

Motion agreed.
Committee (1st Day)
15:45
Relevant documents: 3rd Report from the Constitution Committee and 9th Report from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
Baroness McIntosh of Hudnall Portrait The Deputy Chairman of Committees (Baroness McIntosh of Hudnall) (Lab)
- Hansard - - - Excerpts

My Lords, if there is a Division in the Chamber while we are sitting, this Committee will adjourn as soon as the Division bells are rung and resume after 10 minutes.

Clause 1: Customer data and business data

Amendment 1

Moved by
1: Clause 1, page 2, leave out lines 34 to 37
Member's explanatory statement
This is a probing amendment to ascertain why this new term is necessary.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I start by reflecting on the strangeness of the situation—to me, anyway. Here we all are again, in slightly different seats but with a largely similar Bill. As I said at Second Reading, we welcome this important Bill; it is absolutely crucial to get our data economy right. We have a number of amendments to the Bill, a great many of which are probing. The overall theme of our amendments is how to make the Bill maximally effective at the important job that it sets out to do.

The terminology of data law is well understood. Lawmakers, lawyers, businesses and data subjects are all to some extent familiar with it. A “controller” means

“the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data”.

A “processor” means

“a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”.

We are all familiar with those terms.

In this Bill, new terms are introduced, named “data holder” and “trader”. A data holder, in relation to customer data or business data of a trader, is the trader, or

“a person who, in the course of a business, processes the data”.

How is that materially different from a processor? A trader is described as a person who supplies or provides

“goods, services or digital content”

in the course of business, whether personally, through someone acting in the trader’s name, or on the trader’s behalf. Again, I ask how that is different from a controller.

While I grant that this may seem a very small point in a very large Bill, data regulations are already relatively poorly understood and difficult to follow. Therefore, surely there is no real need to make them more complex by introducing overlapping terms just for this one section of the Bill. As I explained in our explanatory note, this is a probing amendment, and I hope the Minister will be able to explain why these terms are materially different from the existing terms, why they are necessary and so on. If so, I would of course be happy to withdraw my amendment. I beg to move.

Lord Markham Portrait Lord Markham (Con)
- Hansard - - - Excerpts

Just to follow on from that, I very much support my noble friend’s words. The only reason I can see why you would introduce new definitions is that there are new responsibilities that are different, and you would want people to be aware of the new rules that have been placed on them. I will be interested to hear the Minister’s answer. If that is the case, we can set that out and understand whether the differences are so big that you need a whole new category, as my noble friend said.

Having run lots of small businesses myself, I am aware that, with every new definition that you add, you add a whole new set of rules and complications. As a business owner, how am I going to find out what applies to me and how I am to be responsible? The terms trader, controller, data holder and processor all sound fairly similar, so how will I understand what applies to me and what does not? To the other point that my noble friend made, the more confusing it gets, the less likelihood there is that people will understand the process.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I am not sure whether I should open by saying that it is a pleasure to take part in the passage of the third iteration of this Bill, but, as I said at Second Reading, this is an improvement. Nevertheless, there are aspects of the Bill that need close scrutiny.

The noble Viscount, Lord Camrose, explained his approach to this Bill. Our approach is that we very much support the use of data for public benefit but, at the same time, we want to make sure that this Bill does not water down individual data rights and that they are, where necessary, strengthened. In that spirit, I wish to ask the Minister about the general nature of Clause 1, rather than following up on the amendments tabled by the noble Viscount.

The definition of “business data” seems quite general. A report that came out yesterday, Data On Our Minds: Affective Computing At Work, highlighted the kinds of data that are now being collected in the workplace. It is a piece of work sponsored by the Joseph Rowntree Charitable Trust, the Trust for London and the Institute for the Future of Work. They are concerned about the definition of “business data”. The Minister probably will not have an answer on this matter at this stage but it would be useful if she could write in due course to say whether the definition of “business data” excludes emotional data and neurosurveillance data collected from employees.

This is very much a workplace question rather than a question about the customer; I could ask the same question about the customer, I suppose, except the report is about workplace data collection. I thought I would opportunistically take advantage of the rather heavy de-grouping that has taken place and ask the Minister a question.

Baroness Jones of Whitchurch Portrait The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)
- Hansard - - - Excerpts

First, let me say what a pleasure it is to be back on this old ground again, although with slightly different functions this time round. I very much support what the noble Viscount, Lord Camrose, said. We want to get the wording of this Bill right and to have a robust Bill; that is absolutely in our interests. We are on the same territory here. I thank the noble Viscount and other noble Lords for expressing their interest.

On Amendments 1 and 2, the Government consider the terms used in Part 1, as outlined in Clause 1, necessary to frame the persons and the data to which a scheme will apply. The noble Lord, Lord Clement-Jones, mentioned the powers. I assure him that the powers in Part 1 sit on top of the Data Protection Act. They are not there instead of it; they are another layer on top of it, and they provide additional rights over and above what already exists.

In relation to the specific questions from the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, smart data schemes require suppliers or providers of goods, services or digital content to provide data. They are referred to as “traders” in accordance with recent consumer legislation, including the Consumer Rights Act 2015. The term “data holder” ensures that the requirements may also be imposed on any third party that might hold the data on the trader’s behalf. That is why these additional terms have been included: they are based on existing good legislation. I hope noble Lords will recognise why this is necessary and that this explains the rationale for these terms. These terms are independent of terms in data protection legislation; they have a different scope and that is why separate terms are necessary. I hope that, on that basis, the noble Viscount will withdraw his amendment.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the Minister for that explanation. I see the point she makes that, in existing legislation, these terms are used. I wonder whether there is anything we can do better to explain the terms. There seems to be significant overlap between processors, holders, owners and traders. The more we can do to clarify absolutely, with great rigour, what those terms mean, the more we will bring clarity and simplicity to this necessarily complex body of law.

I thank the Minister for explaining the rationale. I am satisfied that, although it may not be the most elegant outcome, for the time being, in the absence of a change to the 2015 Act that she references, we will probably have to grin and bear it. I beg leave to withdraw the amendment.

Amendment 1 withdrawn.
Amendment 2 not moved.
Clause 1 agreed.
Clause 2: Power to make provision in connection with customer data
Amendment 3
Moved by
3: Clause 2, page 3, line 23, leave out “Secretary of State or the”
Member’s explanatory statement
This amendment seeks to probe the role of the Secretary of State and HM Treasury in these provisions.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, Amendments 3, 4 and 20 seek to probe the Government’s position on the roles of the Secretary of State and the Treasury. Amendment 6 seeks to probe whether the Treasury or the Secretary of State shall have precedence when making regulations under this Bill.

Clarity over decision-making powers is critical to good governance, in particular over who has final decision rights and in what circumstances. Throughout Part 1 of the Bill, the Secretary of State and the Treasury are both given regulation-making powers, often on the same matter. Our concern is that having two separate Ministers and two departments responsible for making the same regulations is likely to cause problems. What happens if and when the departments have a difference of opinion on what these regulations should contain or achieve? Who is the senior partner in the relationship? When it comes to putting statute on paper, who has the final say, the Secretary of State or the Treasury?

All the amendments are probing and, at this point, simply seek greater clarification from the Government. If the Minister can explain why two departments are jointly responsible for the same regulations, why this is necessary and a good idea, and what provisions will be in place to avoid legislative confusion, I will be happy not to press the amendments.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

The amendments in group 2 cover smart data and relate to the Secretary of State and the Treasury. Apart from the financial services sector clauses, most of the powers in Part 1, as well as the statutory spending authority in Clause 13, are conferred on the Secretary of State and the Treasury. That is the point that the noble Viscount made. These allow the relevant government departments to make smart data regulations. Powers are conferred on the Treasury as the department responsible for financial services, given the Government’s commitment to open banking and open finance. There is no order of precedence between the Secretary of State and the Treasury when using these powers, as regulations are likely to be made by the department responsible for the sector to which the smart data scheme applies, following, as with other regulations, the appropriate cross-government write-round and collective agreement procedures. I add that interdepartmental discussions are overseen by the Smart Data Council, which will give advice on this issue.

The noble Viscount raises concerns relating to Clause 13. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance, as a matter of regularity. It is for these reasons that I urge the noble Viscount not to press these amendments. These are standard procedures where the Treasury is involved and that is why more than one department is referenced.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the Minister for that explanation. I am pleased to hear that these are standard procedures. Will she put that in writing, in a letter to me, explaining and setting it out so that we have it on the record? It is really important to understand how the decisions break down and to have a single point of accountability for all such decisions; if that cannot be in the Bill, it could at least be explained elsewhere. Otherwise, I am happy to proceed with the explanation that she has kindly given.

16:00
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I confirm that I am happy to write.

Amendment 3 withdrawn.
Amendment 4 not moved.
Baroness McIntosh of Hudnall Portrait The Deputy Chairman of Committees (Baroness McIntosh of Hudnall) (Lab)
- Hansard - - - Excerpts

My Lords, Amendment 5 is in the name of the noble Lord, Lord Lucas, whom I do not see with us. Would the noble Lord, Lord Arbuthnot, like to move it on his behalf?

Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- Hansard - - - Excerpts

I am grateful. I do not know about the amendment in the name of the noble Lord, Lord Lucas, but I wonder whether I might speak to Amendments 34 and 48.

Baroness McIntosh of Hudnall Portrait The Deputy Chairman of Committees (Baroness McIntosh of Hudnall) (Lab)
- Hansard - - - Excerpts

Would the noble Lord be prepared to move Amendment 5 first? He need not necessarily speak to it at any length. That said, the noble Lord, Lord Lucas, is now with us, so the problem is solved.

Amendment 5

Moved by
5: Clause 2, page 3, line 28, at end insert—
“(1A) The Secretary of State may by regulations make provision requiring a data holder to communicate (to the extent that they have the data required to do this) in a specified manner with all or a subset of the customers for whom they hold data.”
Member's explanatory statement
This amendment is to enable communication with customers to ascertain, for instance, whether regulations have been complied with or, for example in the case of the Student Loans Company, to enable research into the outcomes of courses that they have funded.
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I apologise to the Committee for having not expected things to go quite as fast as they did. In moving Amendment 5, I will also speak to Amendments 200 and 202 in this group.

Amendment 5 is very much to do, in my mind, with the Office for Students and the Student Loans Company, but it is about a problem of more generality, in that public bodies that hold a great deal of customer data find that they are unable to use that access and understanding for the greater public good. In the particular instance of the Student Loans Company, it is in active touch with most British young people who have been through university and is in an excellent position to help us understand the quality of the university courses that they have been through and looked back on a few years later so that we can get data and information that will enable universities to improve those courses for future students. That is important feedback that we ought to have in our university system. Otherwise, universities just concentrate on students who are there now; the moment those students leave, the universities are not interested any more, until they are old enough to get donations out of.

We should have a much better and more self-improving system, which could be driven through the Student Loans Company. I have in the past asked the company whether it would feel able to participate in such a thing, and it said no, it would not be permitted by data protection regulations to communicate in this way with the students it looks after. We should give ourselves the power to consider that in this Bill, so that we can look at how we could use that data to make life better for future generations of students.

There are other examples of where the public realm has gathered data and contact information on people to do with a particular set of transactions but feels unable to communicate with them again to do something slightly wider than that, so I suggest to the Government that something along the lines of Amendment 5 would open some very interesting doors to improving the performance of the public realm.

Amendment 200 is on a completely different subject: how we properly define the data we are collecting so that across the public realm a particular dataset means the same thing. The instance I choose to illustrate this is sex. One would have thought that sex means male or female and, in fact, properly construed, there are only two sexes, and I hope the Supreme Court will agree in due course. Gender can be as wide as you like, but sex has two possible values, male or female. If we are collecting data on that in the National Health Service, the police service and other aspects of life to see whether we are treating men and women equally, it is very important that that data item should mean the same thing, but the police now routinely record rapes as being committed by women because the person convicted of rape chooses to identify as a woman because they think they will then get better treatment thereafter. If you are recording gender, it can be what you want, but if you are recording sex, it should be male or female.

It is really important within the National Health Service that we always mean male or female because male and female physiology differs, and if someone is a candidate for a particular treatment, it may well depend on their sex. For instance, in blood transfusions, it is important to know whether the donation came from a man or a woman, because people may react in different ways to the blood.

Having a data dictionary within government that defines particular terms for use in government statistics so that statistics collected across different departments are comparable and mean the same thing, so that you can work with them knowing exactly what they mean, ought to be part of the way we run government. Certainly, whenever I have been involved in collecting data within a largish business, data dictionaries have been common.
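
As an illustration only, a shared data dictionary of the kind the noble Lord describes can be sketched in a few lines of Python, with a fixed vocabulary for sex and free text for gender; all names and values here are assumptions for the example, not a proposed government standard.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass(frozen=True)
    class FieldDefinition:
        name: str
        description: str
        allowed_values: Optional[tuple[str, ...]]  # None means free text

    # Hypothetical entries in a cross-government data dictionary
    DATA_DICTIONARY = {
        "sex": FieldDefinition("sex", "Sex as recorded at birth", ("male", "female")),
        "gender": FieldDefinition("gender", "Self-declared gender identity", None),
    }

    def validate(field_name: str, value: str) -> bool:
        # Every department validates against the same definition,
        # so statistics collected in different places remain comparable
        definition = DATA_DICTIONARY[field_name]
        return definition.allowed_values is None or value in definition.allowed_values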

Lastly, I turn to a third entirely different subject, which is schools admissions data. There is provision in legislation for schools admissions authorities to publish admissions data. This, when it started, was quite useful. Local authorities would publish booklets and you could pick up a booklet for your local authority and see what the admissions rules were for all the schools in that local authority and what the outcomes of those rules had been in previous years. With a little work, you could understand which schools your child had a chance of getting into. That would then form the basis of the investigations you would do about which school you should be using. Over time, the quality of this data has degraded, mostly because the concept of an admissions authority has moved far beyond local authorities, which is where it used to be. Many individual schools and school groups are now their own admissions authority, and they do not share data with the local authority, which means that there is now—certainly in the local authorities I have looked at recently—no consolidated source of schools admissions information, either on the rules prospective pupils are subject to or on the outcomes in previous years.

That makes it a much longer and harder business to establish which schools your child has a right to go to, and the result is that it is only the socially advantaged who can find out what their options are. Anyone short of time or data literacy finds it difficult to know anything beyond which their nearest school is and to see all the other options that might be available to them.

That is something which we should turn around, and the way to do so is to make all admissions authorities drop their data into a common database. That is not difficult—it might take someone of medium talent about a day to design—and all schools have this data in a form that is easy to drop into a database, because that data is subject to a data dictionary. Terms are defined, and you know what they mean because they have to be interpreted in a consistent way by parents. It is a really easy thing to create.
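
For illustration, the kind of common database the noble Lord describes might be sketched as a pair of tables; every table and column name below is an assumption made for the example, not a design the amendment prescribes.

    import sqlite3

    conn = sqlite3.connect("admissions.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS admission_authority (
        authority_id INTEGER PRIMARY KEY,
        name         TEXT NOT NULL  -- local authority, academy trust or own-admission school
    );
    CREATE TABLE IF NOT EXISTS admissions_entry (
        school_urn     TEXT    NOT NULL,  -- the school's unique reference number
        authority_id   INTEGER NOT NULL REFERENCES admission_authority(authority_id),
        entry_year     INTEGER NOT NULL,
        criterion_rank INTEGER NOT NULL,  -- priority order of the oversubscription criterion
        criterion      TEXT    NOT NULL,  -- e.g. sibling, distance, faith
        places_offered INTEGER,           -- outcome under this criterion in that year
        PRIMARY KEY (school_urn, entry_year, criterion_rank)
    );
    """)
    conn.commit()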

Once the data is all in one place, it would be much easier for parents to establish which schools they could send their children to. It would be an opportunity for businesses of all sorts to help parents to make that easier. We ought to be putting ourselves in a position where we are making sure that we do not disadvantage people because they are disadvantaged. We should look after people who find it difficult to deal with differently arranged and differently stated sets of admissions criteria. We should not be disadvantaging people like that; we ought to—it is really quite simple—put them in a position where they are on a level footing with everyone else. I beg to move.

Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- Hansard - - - Excerpts

My Lords, I am pleased that my noble friend Lord Lucas managed to make it, because I found him extremely persuasive and I agree with what he said. I shall return to the issue that his second amendment dealt with—namely, the issue of sex. I thank the organisation Sex Matters for its briefing on Amendments 34 and 48. I am not sure why there are no explanatory notes to these but I referred to the point in my speech at Second Reading, and I hope I will be able to explain it adequately now.

The core aim of the digital verification system that we are legislating for is to enable people to prove who they are and to provide information about themselves. The reason this system of digital identity can be trusted with our data, our safety and our personal and economic lives is that the information it contains comes from authoritative sources. It draws on information in my passport or my driving licence, which itself comes from information on my birth certificate, which itself is a certified copy of the entry on the birth register, which before that came from the information recorded at the hospital where I was born. Actually, I was born at home; nevertheless, the point remains true.
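
That chain of integrity can be made concrete with a short sketch in Python; the document names follow the noble Lord's example, while the structure and field names are illustrative assumptions.

    # A hypothetical chain of attestations behind one verified attribute;
    # each link records the document the previous one drew its information from
    provenance_chain = [
        {"document": "hospital_record", "attests": "date and place of birth"},
        {"document": "birth_register_entry", "derived_from": "hospital_record"},
        {"document": "birth_certificate", "derived_from": "birth_register_entry"},
        {"document": "passport", "derived_from": "birth_certificate"},
        {"document": "digital_identity", "derived_from": "passport"},
    ]

    def chain_is_intact(chain: list[dict]) -> bool:
        # A digital identity is only as trustworthy as every earlier link
        return all(
            link["derived_from"] == chain[i]["document"]
            for i, link in enumerate(chain[1:])
        )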

However, if the chain of integrity of data is broken, the system of digital verification is no longer trustworthy. The Bill contains provision to secure the reliability of digital verification services by means of a carefully constructed framework, a register of providers, an information gateway and a trust mark, but there is a flaw, which my noble friend has referred to, and it has been pointed out by the human rights charity Sex Matters. The digital verification system that has recently been published in its gamma edition, after several years of development, assumes that government sources are reliable and accurate, but, when it comes to the attribute of sex—whether someone is male or female—we know that those records are not accurate or reliable.

16:15
Here I am going to descend into controversy. I believe that sex is an intrinsic feature of every person, which is determined at conception and observed at birth. I do not believe that it can change over a person’s lifetime. My digital record shows that I am male. This also appears on my passport and is encoded into my driver’s licence. But neither the Passport Office nor the driving licence authority can reliably attest that I really am male. This is because, over the past decades, these agencies have allowed people to change their recorded sex.

This practice goes back at least as far as the 1950s. We know, for example, from the case of Corbett v Corbett, that in 1961, at the request of Arthur Corbett, the 3rd Baron Rowallan, the Passport Office provided a female passport to a young man who had changed his name to April Ashley and was seeking to live as a woman. Changes to these official government records were done without legislation as a humanitarian accommodation to address the difficulty for someone who wanted to live and travel with documentation that did not appear to match their sex or way of life. It was done informally for what at the time was thought to be a tiny number of people.

Since then, thousands of records have been altered. Recent freedom of information requests showed that, over the past five years, at least 3,188 passport sex records have been changed to show a sex different from that on the birth certificate. Over the past six years, at least 15,481 driving licence sex records have been changed to show a different sex. This can be done on request—sometimes, but not always, with a doctor’s note stating that the person wishes to live permanently as if they were of the opposite sex. The organisation Sex Matters has told me that some people have one sex on their passport and another on their driver’s licence.

As I said at Second Reading, we do not want to have biological males working in women’s rape crisis centres. We do not want biological males being sent to female prisons. Yet if we do not have accurate records, how can we prevent that? Changing government records to enable people who identify as transgender to live, work and travel as a different gender was a humane solution to an issue, but it has rendered those records inaccurate.

A digital solution would keep everyone’s personal information accurate but allow anyone to keep any piece of personal information private in any situation or transaction, as I can do with my name and date of birth using an age-verification app. Amendment 34 would require the Secretary of State to ensure that the DVS system assesses whether key public authorities, such as the Passport Office, can reliably ascertain and verify the key facts of each individual’s date of birth, place of birth and sex.

Amendment 48 to Clause 45 would create a quality assurance requirement for public authorities to ensure that they do not provide information about an individual via the information gateway that does not meet the basic expectation of integrity—that is, that it is accurate, at least at the time that it was recorded, has not been tampered with, and is accompanied by clear metadata that describes the data so that its meaning cannot be misconstrued. These are the same requirements that private sector providers certified under the trust framework must meet, and it seems right that public bodies should give a similar level of assurance of the integrity of the data that they provide. As I have said, public bodies have been modifying data with the best of intentions but, none the less, causing a problem for data integrity.

This amendment requires not only that public bodies provide accurate data but that it is supported by clear metadata. Metadata is information that describes and explains what kind of information a particular data bucket holds. For example, the piece of data “Christian” might be attached to an individual, but is it their first name, their middle name, their surname, their religion or the name of the street where they live? The data itself is not self-explanatory; you have to know what data bucket it has been stored in, and you have to be confident that the buckets could not have been muddled up.

Similarly, clear metadata is needed for the attribute of sex. The importance of metadata means that it is not enough that my digital identity contains the information “male”. This data must be recorded in a field that makes clear that it is accurate and could not have been tampered with, modified or confused with the idea of gender identity—which, as my noble friend said, is a completely different issue. This is the heart of verification.
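
The point can be shown in a few lines of Python: the same value only becomes meaningful once its field, or bucket, is named. The field names below are illustrative assumptions.

    # Without metadata, the bare value is ambiguous
    value = "Christian"

    # With metadata, the same value becomes unambiguous
    record_a = {"field": "first_name", "value": "Christian"}
    record_b = {"field": "religion", "value": "Christian"}

    # Likewise for sex: the metadata must say what the field actually records,
    # and provenance supports verification
    attribute = {
        "field": "sex_at_birth",   # distinct from a "gender_identity" field
        "value": "male",
        "source": "birth_register",
    }
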
At Second Reading, the Minister gave me the impression that she did not recognise the problem of inaccurate and unreliable sex data provided by public authorities such as the Passport Office. Will she ask the Information Commissioner’s Office whether data controllers that do not accurately and reliably collect and store sex data are breaching data protection principles? Will she agree to meet me and the organisation Sex Matters to discuss these matters?
Earl of Erroll Portrait The Earl of Erroll (CB)
- Hansard - - - Excerpts

My Lords, I would like to say a few things about this. The first is that Amendment 5, in the name of the noble Lord, Lord Lucas, is very sensible; sometimes the GDPR has gone too far in trying to block what you can use things for. It was originally thought of when so much spamming was going on, with people gathering data from adverts and all sorts of other things and then misusing it for other purposes. People got fed up with the level of spam. This is not about that sort of thing; it is about having useful data that would help people in the future, and which they would not mind being used for other purposes. As long as it is done properly and seriously, and not for marketing, advertising and all those other things, and for something which is useful to people, I cannot see what the problem is. An overzealous use of GDPR, which has happened from time to time, has made it very difficult to use something perfectly sensible, which people would not mind having other people know about when it is being useful.

The next matter is sex, which is an interesting issue. The noble Lord is absolutely correct that biological or genetic sex is vital when applying medicines and various other things. You have to know that you are administering certain drugs properly. As we get more and more new drugs coming on, it will matter how a person’s body will react to them, which will depend on the genetic material, effectively. Therefore, it is essential to know what the biological sex is. The answer is that we need another category—probably “current gender”—alongside “sex at birth”. Someone can then decide to use “current gender” for certain purposes, including for such things as passports and driving licences, where people do not want to be asked questions—“Oh, do you mean you’re not?”—because they look completely different.

I remember meeting April Ashley in her restaurant. I would not, in my innocence—I was quite young—have guessed that she was not a woman, except that someone said that her hands were very big. It never worried us in those days. I am not worried about people using a different gender, but the basic underlying truth is essential. It comes into the issue of sport. If you have grown up and developed physically as a biological male, your bone structure and strength are likely to be different from that of a female. There are huge issues with that, and we need to know both; people can decide which to use at certain points. Having both would give you the flexibility to do that.

That also applies to Amendment 200, from the noble Lord, Lord Lucas, which is exactly the same concept. I thoroughly agree with those amendments and think we should push them forward.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I too am delighted that the noble Lord, Lord Lucas, came in to move his amendment. He is the expert in that whole area of education data; like the noble Lord, Lord Arbuthnot, I found what he said extremely persuasive.

I need to declare an interest as chair of the council of Queen Mary, University of London, in the context of Amendment 5 in the name of the noble Lord, Lord Lucas. I must say, if use were made of that data, it would benefit not only students but universities. I am sure that the Minister will take that seriously but, on the face of it, like the noble Earl, Lord Erroll, I cannot see any reason why this amendment should not be adopted.

I very much support Amendments 34 and 48 in the name of the noble Lord, Lord Arbuthnot. I too have read the briefing from Sex Matters. The noble Lord’s pursuit of accuracy for the records that will be part of the wallet, if you like, to be created for these digital verification services is a matter of considerable importance. In reading the Sex Matters briefing, I was quite surprised. I had not realised that it is possible to change your stated sex on your passport in the way that has taken place. The noble Lord referred to the more than 3,000 cases of this; for driving licences, there have been more than 15,000.

I agree with Sex Matters when it says that this could lead to a loss of trust in the system. However, I also agree with the noble Earl, Lord Erroll, that this is not an either/or. It could be both. It is perfectly feasible to have both on your passport, if you so choose. I do not see this as a great divide as long as the statement about sex is accurate because, for a great many reasons—not least in healthcare—it is of considerable importance that the statement about one’s sex is accurate.

I looked back at what the Minister said at Second Reading. I admit that I did not find it too clear but I hope that, even if she cannot accept these amendments, she will be able to give an assurance that, under this scheme—after all, it is pretty skeletal; we will come on to some amendments that try to flesh it out somewhat—the information on which it will be based is accurate. That must be a fundamental underlying principle. We should thank the noble Lord, Lord Arbuthnot, for tabling these two important amendments in that respect.

Lord Markham Portrait Lord Markham (Con)
- Hansard - - - Excerpts

My Lords, I want to come in on Amendment 5. Although I am very much in favour of the intent of what we are trying to do—making more use of the sharing of data—I have to remember my old Health Minister’s hat in talking about all the different terms and speaking to the different angles that we are all coming from.

Noble Lords have heard me speak many a time about the value of our health data and the tremendous possibilities that it offers for drug discovery and all the associated benefits. At the same time, I was very aware of loads of companies purporting to own it. There are GP data companies, which do the systems for GPs and, naturally, hold all the patient data in them. In terms of their business plans, some have been bought for vast sums of money because of the data that they hold. My concern is that, although it is well intended to say that the use of health data should be allowed for the general good, at the same time, I do not believe that GP companies own that data. We have been quite clear on that. I want to make it clear that it is actually the NHS that will benefit from the pulling together of all this, if that happens in those sorts of formats.

Similarly on student loans data—I shall not pretend that this is a subject I know a lot about—I can see a lot of good uses for that data, but I can also see that it would be very useful for financial services companies to understand customers’ creditworthiness. In all these cases, although the intent is right, we need to find a way to be clear about what they can and cannot use it for, and there lies a lot of complexity.

16:30
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank my noble friends Lord Lucas and Lord Arbuthnot for their Amendments 5, 34, 48, 200 and 202. They and other noble Lords who have spoken have powerfully raised some crucial issues in these amendments.

Amendment 5 addresses a key gap, and I take on board what my noble friend Lord Markham said, in how we manage and use customer data in specific contexts. At its heart, it seeks to enable effective communication between organisations holding customer data and customers themselves. The ability to communicate directly with individuals in a specified manner is vital for various practical reasons, from regulatory compliance to research purposes.

One clear example of where this amendment would be crucial is in the context of the Student Loans Company. Through this amendment, the Secretary of State could require the SLC to communicate with students for important purposes, such as conducting research into the outcomes of courses funded by loans. For instance, by reaching out to students who have completed their courses, the SLC could gather valuable insights into how those qualifications have impacted on their employment prospects, income levels or career trajectories. This is the kind of research that could help shape future educational policies, ensuring that loan schemes are working as intended and that the investments made in students’ education are yielding tangible benefits. This, in turn, would allow for better decision-making on future student loans funding and educational opportunities.

Amendment 34 from my noble friend Lord Arbuthnot proposes a welcome addition to the existing clause, specifically aiming to ensure that public authorities responsible for ascertaining key personal information about individuals are reliable in their verification processes and provide clear, accurate metadata on that information. This amendment addresses the essential issue of trust and reliability in the digital verification process. We increasingly rely on digital systems to confirm identity, and for these systems to be effective, we have to make sure that the core information they are verifying is accurate and consistent. If individuals’ key identifying details—date of birth, place of birth and, as we heard very powerfully, sex at birth—are not consistently or accurately recorded across various official databases, it undermines the integrity of the digital verification process. It is important that we have consistency across the public authorities listed in this amendment. By assessing whether these bodies are accurately verifying and maintaining this data, we can ensure uniformity in the information they provide. This consistency is essential for establishing a reliable foundation for digital verification.

When we consider the range of public services that rely on personal identification information, from the NHS and His Majesty’s Revenue and Customs to the Home Office, they are all responsible for verifying identity in some capacity. The amendment would ensure that the data they are using is robust, accurate and standardised, creating smoother interactions for individuals seeking public services. It reduces the likelihood of discrepancies that delay or prevent access to public services.

Amendment 48 would introduce important protections for the privacy and integrity of personal information disclosed by public authorities. In our increasingly digital world, data privacy has become one of the most pressing concerns for individuals and for society. By requiring public authorities to attest to the accuracy, integrity and clarity of the data they disclose, the amendment would help to protect the privacy of individuals and ensure that their personal information was handled with the proper care and respect.

My noble friend Lord Lucas’s Amendment 200 would introduce a data dictionary. It would allow the Secretary of State to establish regulations defining key terms used in digital verification services, birth and death registers, and public data more generally. I heard clearly the powerful arguments about sex and gender, but I come at the issue of data dictionaries from the angle of the efficiency, effectiveness and reusability of the data that these systems generate. The more that we have a data dictionary defining the metadata, the more we will benefit from the data used, whichever of these bodies generates the data itself. I am supportive of the requirement to use a data dictionary to provide standardised definitions in order to avoid confusion and ensure that data used in government services is accurate, reliable and consistent. The use of the negative resolution procedure would ensure that Parliament had oversight while allowing for the efficient implementation of these definitions.

Amendment 202 would create a national register for school admissions rules and outcomes in England. This would be a crucial step towards increasing transparency and ensuring fairness in the school admissions process, which affects the lives of millions of families every year. We want to ensure that navigating the school admissions system is not an overly opaque and complex process for many parents. With different schools following different rules, criteria and procedures, it can, as my noble friend Lord Lucas pointed out, be difficult for families to know what to expect or how best to make informed decisions. The uncertainty can be especially challenging for those who are new to the system, those who face language barriers or those in areas where schools’ rules are not readily accessible or clear.

For many parents, particularly those in areas with complex school systems or scarce school places, access to clear, consistent information can make all the difference. This amendment would allow parents to see exactly how the school admissions process works and whether they were likely to secure a place at their preferred school. By laying out the rules in advance, the system would ensure that parents could make better informed decisions about which schools to apply to, based on criteria such as proximity, siblings or academic performance.

We want to ensure that parents understand how decisions are made and whether schools are adhering to the rules fairly. By requiring all schools to publish their admissions rules and the outcomes of their admissions process, the amendment would introduce a level of accountability. I join other noble Lords in strongly supporting this amendment, as it would create a more effective and efficient school admissions system that works for everyone.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we have had a good and wide-ranging discussion on all this. I will try to deal with the issues as they were raised.

I thank the noble Lord, Lord Lucas, for the proposed Amendment 5 to Clause 2. I am pleased to confirm that the powers under Clauses 2 and 4 can already be used to provide customer data to customers or third parties authorised by them, and for the publication or disclosure of wider data about the goods or services that the supplier provides. The powers provide flexibility as to when and how the data may be provided or published, which was in part the point that the noble Viscount, Lord Camrose, was making. The powers may also be used to require the collection and retention of specific data, including to require new data to be gathered by data holders so that this data may be made available to customers and third parties specified by regulations.

I note in particular the noble Lord’s interest in the potential uses of these powers for the Student Loans Company. It would be for the Department for Education to consider whether the use of the smart data powers in Part 1 of the Bill may be beneficial in the context of providing information about student loans and to consult appropriately if so, rather than to specify it at this stage in the Bill. I hope the noble Lord will consider those points and how it can best be pursued with that department in mind.

On Amendments 34, 48 and 200, the Government believe that recording, storing and sharing accurate data is essential to deliver services that meet citizens’ needs. Public sector data about sex and gender is collected based on user needs for data and any applicable legislation. As noble Lords have said, definitions and concepts of sex and gender differ.

Amendment 48 would require that any information shared be accurate, trusted and accompanied by metadata. Depending on the noble Lord’s intentions here, this could either duplicate existing protections under data protection legislation or potentially conflict with them and with other legal obligations.

The measures in Part 2 of the Bill are intended to secure the reliability of the process by which citizens verify their data. It is not intended to create new ways to determine a person’s sex or gender but rather to allow people to digitally verify the facts about themselves based on documents that already exist. It worries me that, if noble Lords pursued their arguments, we could end up with a passport saying one thing and a digital record saying something different. We have to go back to the original source documents, such as passports and birth certificates, and rely on them for accuracy, which would then feed into the digital record—otherwise, as I say, we could end up pointing in two different directions.

I reassure the noble Lord, Lord Arbuthnot, that my colleague, Minister Clark, is due to meet Sex Matters this week to discuss digital verification services. Obviously, I am happy to encourage that discussion. However, to prescribe where public authorities can usefully verify “sex at birth”, as noble Lords now propose, extends well beyond the scope of the measures in the Bill, so I ask them to reflect on that and whether this is the right place to pursue those issues.

In addition, the Government recently received the final report of the Sullivan review of data, statistics and research on sex and gender, which explores some of these matters in detail. These matters are more appropriately considered holistically—for example, in the context of that report—rather than by a piecemeal approach, which is what is being proposed here. We are currently considering our response to that report. I hope noble Lords will consider that point as they consider their amendments; this is already being debated and considered elsewhere.

Amendment 202 seeks to create a national register of individual school admissions arrangements and outcomes, which can be used to provide information to parents to help them understand their chances of securing a place at their local school. I agree with the noble Lord that choosing a school for their child is one of the most important decisions that a parent can make. That is why admissions authorities are required to publish admission arrangements on their schools’ websites. They must also provide information to enable local authorities to publish an annual admissions prospectus for parents, including admissions arrangements and outcomes for all state schools in their area.

I refer the noble Lord, Lord Lucas, to the School Information (England) Regulations 2008, which require admission authorities and local authorities to publish prescribed information relating to admissions. Those protections are already built into the legislation, and if a local authority is not complying with that, there are ways of pursuing it. We believe that the existing approach is proportionate, reflects the diversity of admissions arrangements and local circumstances, and is not overly burdensome on schools or local authorities, while still enabling parents to have the information they need about their local schools.

I hope that, for all the reasons I have outlined, noble Lords will be prepared not to press their amendments.

16:45
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I am very grateful to the Minister for her reply and to my noble friends and others for their interventions before that. I am delighted that she considers that Clause 2(3)(a) covers my Amendment 5. If I have any further concerns about that when I have reread her reply in Hansard, I will write to her.

I am sure that we need to do something about data integrity across the piece. I will very much take into account what the Minister has said about the Sullivan review and how sex data is or might be recorded in the future. However, it is a considerable problem that there is no reliable source of it, particularly when it comes to deciding how to treat people medically but also in other circumstances, as my noble friend has said, such as prisons and sports. We have to think through how to have a reliable source of it, which is clearly not passports, while for those with a gender recognition certificate, birth certificates are not a reliable source of information. There are obviously other aspects of life, too, where one wants to know that the data being collected is accurate.

So far as schools’ admissions regulations are concerned, I am afraid the state of the matter is that local authorities are no longer publishing the data that they ought to. The previous Government, who had plenty of time to enforce it, did not, and this Government have not yet picked up on that. I will read what the Minister has said and pursue her colleagues in the Department for Education to see if we can get some improvement on the current state of affairs. With thanks to the Minister, I beg leave to withdraw my amendment.

Amendment 5 withdrawn.
Amendment 6 not moved.
Clause 2 agreed.
Clause 3: Customer data: supplementary
Amendment 7
Moved by
7: Clause 3, page 5, line 28, at end insert—
“(f) provision requiring that third party recipients of customer data publish regular statements on their cyber resilience against specified standards and outcomes.”
Member’s explanatory statement
This amendment would give the Secretary of State or the Treasury scope to introduce requirements on third party recipients of customer data to publish regular statements on their cyber resilience against specified standards and outcomes.
Lord Arbuthnot of Edrom (Con)

My Lords, Amendment 7, the first in this group, is a probing amendment, and I am extremely grateful to ISACA, an international professional association focused on IT governance, for drafting it. This amendment

“would give the Secretary of State or the Treasury scope to introduce requirements on third party recipients of customer data to publish regular statements on their cyber resilience against specified standards and outcomes”.

Third parties play a vital role in the modern digital ecosystem, providing businesses with advanced technology, specialised expertise and a wide range of services, but integrating third parties into business operations comes with cyber risks. Their access to critical networks and all the rest of it can create vulnerabilities that cybercriminals exploit. Third parties are often seen as easier targets, with weaker security measures or indirect connections serving as gateways to larger organisations.

Further consideration is to be given to the most effective means of driving the required improvements in cyber risk management, including, as I suggest, making certain guidance statutory. This is not about regulating and imposing additional cost burdens, but rather creating the environment for digital trust and growth in the UK economy, as well as creating the right conditions for the sustainable use of emerging technologies that will benefit us all. This is something that leading associations and groups such as ISACA have been arguing for.

The Cyber Governance Code of Practice, which the previous Administration introduced, marks an important step towards improving how organisations approach cybersecurity. Its primary goal is to ensure that boards of directors take proper responsibility for mitigating cyber risks.

While that code is a positive development, compliance is not legally required, which leaves organisations free to put their priorities elsewhere. As a result, the code’s effectiveness in driving widespread improvements in cyber resilience will largely depend on organisations’ willingness to recognise its importance. The amendment would require businesses regularly to review and update their cybersecurity strategies and controls, and to stay responsive to evolving threats and technologies, thereby fostering a culture of continuous improvement. In addition, by mandating ongoing assessments of internal controls and risk-management processes, organisations will be better able to anticipate emerging threats and enhance their ability to detect, prevent and respond to cyber incidents. I beg to move.

Lord Clement-Jones (LD)

My Lords, this is a fairly disparate group of amendments. I am speaking to Amendments 8, 9, 10, 24, 30, 31 and 32. In the first instance, Amendments 8, 9, 10 and 30 relate to the question that I asked at Second Reading: where is the ambition to use the Bill to encourage data sharing to support net zero?

The clean heat market mechanism, designed to create a market incentive to grow the number of heat pumps installed in existing premises each year, is set to be introduced after being delayed a year due to backlash from the boiler industry. If government departments and partners had access to sales data of heating appliances, there would be a more transparent and open process for setting effective and realistic targets.

I have been briefed by Ambient, a not-for-profit organisation in this field. It says that low visibility of high power-consuming assets makes it challenging to maintain grid stability in a clean-power world. Low visibility and influence over future installations of high power-consuming assets make it difficult to plan for grid updates. Inability to shift peak electricity demand leads to higher capacity requirements with associated time and cost implications. Giving the Government and associated bodies access to utility-flexible tariff data would enable the Government and utilities to work together to increase availability and uptake of tariffs, leading to lower peak electricity demand requirements.

Knowing which homes have the oldest and least efficient boilers, and giving public sector and partners access to the Gas Safe Register and CORGI data on boiler age at household level, would mean that they could identify and target households and regions, ensuring that available funds go to those most in need. Lack of clarity on future clean heating demand makes it challenging for the industry to scale and create jobs, and to assess workforce needs for growing electricity demand. Better demand forecasting through access to sales data on low-carbon heating appliances would signal when and where electrification was creating need for workforce expansion in grid management and upgrade, as well as identify regional demand for installers and technicians.

The provisions of Part 1 of the Bill contain powers for the Secretary of State to require the sharing of business data with customers and other persons of specified description. It does not indicate, however, that persons of specified description could include actors such as government departments, public bodies such as NESO and GB Energy, and Ministers. An expanded list of suggested recipients could overcome this issue, as stated in Amendment 9 in my name. The Bill also makes no provision for the format of information sharing—hence my Amendments 8 and 10.

In summary, my questions to the Minister are therefore: whether consideration has been given to how the powers outlined in the Bill could be exercised to accelerate progress towards clean power by 2030; whether climate missions such as clean power by 2030 or achieving net zero are purposes “of a public nature” in relation to the outline provisions for public bodies; and whether specifying the format of shared business data would enable more efficient and collaborative use of data for research and planning purposes.

Coming on to Amendments 24, 31 and 32, the Bill expands the potential use of smart data to additional public and private sector entities, but it lacks safeguards for sensitive information regularly used in court. The Bill makes specific provision for legal privilege earlier on, but this is not extended to the provisions relating to smart data. I very much hope that the Government will commit to consult the legal professions before extending smart data to courts.

Many of us support open banking, but open banking is being used, as designed, by landlords to keep watching tenants’ bank accounts for months after approving their tenancy. Open banking was set up to enhance interoperability between finance providers, with the most obvious example being the recent new ability of the iPhone wallet app to display balances and recent transactions from various bank accounts.

Open banking approval normally lasts six months. While individual landlords may not choose this access, if given a free choice, the service industry providing the tenant-checking service to landlords is strongly incentivised to maximise such access, otherwise their competitors have a selling point. If open banking is to be added to the statute book, the Bill should mandate that the default time be reduced to no more than 24 hours in the first instance, and reconfirmed much more often. For most one-off approval processes, these access times may be as short as minutes and the regulations should account for that.

Coming on to Amendment 31, consumers have mixed feelings about the potential benefits to them of smart data schemes, as shown in polling such as that carried out a couple of years ago by Deltapoll with the CDEI, now the Responsible Technology Adoption Unit, as regards the perceived potential risks versus the benefits. Approximately one-quarter of respondents in each case were unsure about this trade-off. Perhaps unsurprisingly, individuals who said that they trusted banks and financial institutions or telecommunications providers were more likely to support open finance and open communications, and customers who had previous experience of switching services more frequently reported believing that the benefits of smart data outweighed the risks.

Is it therefore the Government’s expectation that people should be compelled to use these services? Open banking and imitators can do a great deal of good but can also give easy access to highly sensitive data for long periods. The new clause introduced by Amendment 31 would make it the same criminal offence to compel unnecessary access under these new provisions as it already is to compel data provision via subject access requests under the existing Data Protection Act.

Amendment 32 is a probing amendment as to the Government’s intentions regarding these new smart data provisions. In the Minister’s letter of 27 November, she said:

“The Government is working closely to identify areas where smart data schemes might be able to bring benefits. We want to build on the lessons learned from open banking and establish smart data schemes in other markets for goods and services.”


I very much hope that the Minister will be able to give us a little taste of what she thinks these powers are going to be used for, and in what sectors the Government believe that business can take advantage of these provisions.

Baroness Neville-Jones (Con)

My Lords, I support Amendment 7 introduced by my noble friend Lord Arbuthnot, for the reasons that he gave. The amendment was designed to increase the reliability of the handling of information inside any system. If, as I would certainly support, we want to see information and data in digital form circulated more readily, more freely and more often, it is very important that people should trust the system within which it happens. That is where the need to assure the cybersecurity of the system becomes very important; it is a necessary companion to this Bill.

17:00
Earlier this year, DSIT conducted a survey which indicated that while 75% of senior managers said that cybersecurity is high-risk, very important and a high priority, the evidence showed that they had not really translated that perception into responsibility in the firm for taking on the risk and managing it. As my noble friend said, the time has come: we must have something which is more than just guidance and becomes a statutory obligation to make proper assessments of companies’ levels, and tolerance, of risk.
This amendment would give the Government the power—they would exercise it, I hope—to require third-party recipients of customer data to publish regular statements on their cyber resilience against some external specified standards. Of course, people will say that the Financial Reporting Council is due to issue a revised code of corporate governance. That will come into effect, but not for a whole year yet. The difficulty is that it is not compulsory and in no real sense are the standards specified.
We need to move to something more demanding, such as one sees happening in the United States; it is not known for being overly regulatory, but it certainly sees the importance of companies being able to assure their clients and customers in this area. It would be very helpful if the Government could give us some assurance that they will take the necessary measures to introduce a degree of compulsion into the disclosure of the state of companies’ cybersecurity in relation to specified standards.
Viscount Camrose (Con)

My Lords, I am delighted that the Government have chosen to take forward the smart data schemes from the DPDI Bill. The ability seamlessly to harness and use data is worth billions to the UK economy. However, data sharing and the profit that it generates must be balanced against proper oversight.

Let me start by offering strong support to my noble friend Lord Arbuthnot’s Amendment 7. Personally, I would greatly welcome a more sophisticated and widespread insurance market for cyber protections. Such a market would be based on openly shared data; the widespread publication of that data, as set out in the amendment, could help to bring this about.

I also support in principle Amendments 8 and 10 in the name of the noble Lord, Lord Clement-Jones, because, as I set out on the previous group, there is real and inherent value in interoperability. However, I wonder whether the noble Lord might reconsider the term “machine readable” and change it to something—I do not think that I have solved it—a bit more like “digitally interoperable”. I just worry that, in practice, everything is machine-readable today and the term might become obsolete. I am keen to hear the Minister’s response to his very interesting Amendment 31 on the compulsion of any person to provide data.

I turn to the amendments in my name. Amendment 16 would insert an appeals mechanism for a person charged a fee under subsection (1). It is quite reasonable that persons listed under subsection (2)—that is, data holders, decision-makers, interface bodies, enforcers and others with duties or powers under these regulations—may charge a fee for the purposes of meeting the expenses they incur in performing duties or exercising powers imposed by regulations made under this part. However, there should be an appeals mechanism so that, in the event that a person is charged an unreasonable fee, they have a means of recourse.

Amendment 17 is a probing amendment intended to explore the rate at which interest accrues on money owed to specific public authorities for unpaid levies. Given that this interest will be mandated by law, do the Government intend to monitor the levels and, if so, how?

Amendment 18 is a probing amendment designed to explore how the Government intend to deal with a situation when a person listed under subsection (2) of this clause believes they have been charged a levy wrongly. Again, it is reasonable that an appeals mechanism be created, and this would ensure that those who considered themselves to have been wrongly charged have a means of recourse.

Amendment 19 seeks clarification on how the Government envisage unpaid levies being recovered. I would be grateful if the Minister could set out some further detail on that matter.

Amendment 21 is a probing amendment. I am curious to know the maximum value of financial assistance that the Government would allow the Secretary of State or the Treasury to give to persons under Clause 13. I do not think it would be prudent for the Government to become a financial backstop for participants in smart data schemes, so on what basis is that maximum going to be calculated?

Amendment 22 follows on from those concerns and looks to ensure that there is parliamentary oversight of any assistance provided. I am most curious to hear the Minister’s comments on this matter.

Amendment 23 is a straightforward—I think—amendment to the wording. I feel that the phrase “reasonably possible” seems to open the door to almost limitless endeavours and therefore suggest replacing it with “reasonably practicable”.

On Amendment 25, easy access to the FCA’s policy regarding penalties and levies is important. That would allow oversight not only by Parliament but by those who are directly or indirectly affected by decisions taken under this policy. I therefore believe the amendment is necessary, as a website is the most accessible location for that information. Furthermore, regular review is necessary to ensure that the policy is functioning and serving its purpose.

Amendments 26 and 27 return to the matter of an appeals process. I will not repeat myself too much, but it is important to be able to appeal penalties and to create a route by which individuals understand how they can go about doing so.

Amendment 28 would ensure that, when the Secretary of State and the Treasury review the regulations made under Part 1 of the Bill, they do so concurrently. This amendment would prevent separate reviews being conducted that may contradict each other or be published at different times; it would force the relevant departments to produce one review and to produce it together. This would be prudent. It would prevent the Government doing the same work twice and unnecessarily spending public money, and would prevent contradictory reviews, which may cause confusion and financial costs in the smart data scheme industry.

Lastly, Amendment 29, which would ensure that Section 10 of this part was subject to the affirmative procedure, would allow for parliamentary oversight of regulations made under this clause.

We are pleased that the Government have chosen to bring smart data schemes forward, but I hope the Minister can take my concerns on board and share with us some of the detail in her response.

Baroness Jones of Whitchurch (Lab)

My Lords, we have had a detailed discussion, and it may be that I will not be able to pick up all the points that noble Lords have raised. If I do not, I guarantee to write to people.

First, I want to pick up the issues raised by the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, about cybersecurity and cyber resilience. This Government, like previous Governments, take this issue hugely seriously. It is built into all our thinking. The noble Lord, and the noble Baroness in particular, will know that the advice we get on all these issues is top class. The Government are already committed to producing a cybersecurity and resilience Bill within this Parliament. We have all these things in hand, and that will underpin a lot of the protections that we are going to have in this Bill and others. I agree with noble Lords that this is a hugely important issue.

I am pleased to confirm that Clause 3(7) allows the regulations to impose requirements on third-party recipients in relation to the processing of data, which will include security-related requirements. So it is already in the Bill, but I assure noble Lords that it will be underpinned, as I say, by other legislation that we are bringing forward.

In relation to Amendments 8 and 10, I am pleased to confirm that Clause 5(4) clarifies that regulations may make provision about the providing or publishing of business data and the format in which that must be provided. That may include relevant energy-related data. The noble Lord gave some very good examples about how useful those connections and that data could be; he was quite right to raise those issues.

Regarding Amendment 9, in the name of the noble Lord, Lord Clement-Jones, I am pleased to confirm that there is nothing to prevent regulations requiring the provision of business data to government departments, publicly owned bodies and local and regional authorities. This is possible through Clause 4(1)(b), which allows regulations to require provision of business data to a person of a specified description. I hope the noble Lord will look at those cross-references and be satisfied by them.

Noble Lords spoke about the importance of sensitive information in future smart data schemes. A smart data scheme about legal services is not currently under consideration. Having said that, the Government would have regard to the appropriateness of such a scheme and the nature of any data involved and would consult the sector and any other appropriate stakeholders if that was being considered. It is not at the top of our list of priorities, but the noble Lord might be able to persuade us that it would have some merit, and we could start a consultation based on that.

Amendments 16 to 22 consider fees and the safeguards applying to them, which were raised by the noble Viscount. Fees and levies, enabled by Clauses 11 and 12, are an essential mechanism to fund a smart data scheme. The Government consider that appropriate and proportionate statutory safeguards are already built in. For example, requirements in Clause 11(3) and Clause 12(2) circumscribe the expenses in relation to which fees or the levy may be charged, and the persons on whom they may be charged.

Capping the interest rate for unpaid money, which is one of the noble Viscount’s proposals, would leave a significant risk of circumstances in which it might be financially advantageous to pay the levy late. The Government anticipate that regulations would provide an appropriate mechanism to ensure payment of an amount that is reasonable in the context of a late payment. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance as a matter of regularity.

Amendments 23 to 27 deal with the clauses relating to the FCA. Clause 15(3) is drafted to be consistent with the wording of established legislation which confers powers on the FCA, most notably the Financial Services and Markets Act 2000. Section 1B of that Act uses the same formulation, using the phrase

“so far as is reasonably possible”

in relation to the FCA’s general duties. This wording is established and well understood by both the FCA and the financial services sector as it applies to the FCA’s strategic and operational objectives. Any deviation from it could create uncertainty and inconsistency.

Amendment 24 would cause significant disruption to current data-sharing arrangements and fintech businesses. Reauthenticating as frequently as this with every data holder would add considerable friction to open banking services and greatly worsen the user experience—which was the point raised by the noble Lord, Lord Clement-Jones. For example, it is in the customer’s interest to give ongoing consent to a fintech app to provide them with real-time financial advice that might adapt to daily changes in their finances.

Many SMEs provide ongoing access to their bank accounts in order to receive efficient cloud accounting services. If they had to re-register frequently, that would undermine the basis and operability of some of those services. It could inhibit the adoption and viability of open banking, which would defeat one of the main purposes of the Bill.

17:15
Clause 15(3) already provides that regulations must require the FCA to publish a policy statement on financial penalties, as well as setting out other matters that the regulations may cover. The current drafting of the Bill already ensures that financial penalties imposed by the FCA will be subject to an appeals process, as will be the case for financial penalties issued by other enforcers under the Bill. Appropriate safeguards are put in place on the FCA’s ability to introduce a levy in order to meet the costs incurred in implementing a smart data scheme.
I thank the noble Viscount, Lord Camrose, for his proposed Amendment 28. I am pleased to say that Clause 19 already allows for a review of regulations by both the Secretary of State and the Treasury, and government departments would, of course, liaise with each other as appropriate. I reassure noble Lords that the effect of Amendment 29 in requiring the affirmative resolution for regulations relating to financial penalties under Clause 10 is already achieved. These regulations are made under Clause 8 and are always subject to affirmative scrutiny.
On Amendment 30, the definition in Part 1 of “public authority” covers any person whose functions are or include those of a public nature. This definition reflects the definition of “public authority” in Section 6 of the Human Rights Act 1998. I thank the noble Lord, Lord Clement-Jones, for that amendment and for his Amendments 31 and 32.
Clause 20 confirms that regulations should not generally be read as authorising or requiring the processing of personal data that would contravene data protection legislation. As I said earlier, this Bill does not intend to displace the Data Protection Act 2018 but instead to build on existing data rights and protections, so data subjects will continue to be protected by the UK GDPR.
Smart data schemes require proper consultation and debate before implementation. By requiring publication within six months of Royal Assent to the Bill, Amendment 32 might lead Governments to commit to using powers in sectors where they may not ultimately be appropriate. It would also be difficult to predict at any specific point in time in what context smart data interventions might be required in the future.
Finally, I turn to whether Clause 13 should stand part of the Bill. Clause 13 enables the Secretary of State or the Treasury to give financial assistance to decision-makers or enforcers for the purpose of meeting any expenses in the exercise of their functions in smart data schemes. It is intended that smart data schemes will be self-financing through the fees and levies provided for by Clauses 11 and 12, but it is deemed appropriate for there to be a statutory spending authority as a backstop provision, if that is necessary. Any spending commitment of resources will be subject to the usual estimates process and to existing public sector spending controls and transparency requirements.
I hope that, by going through this large number of amendments in detail, I have provided reassurance to noble Lords, as well as explaining why we feel that the inclusion of Clause 13 is necessary. I therefore hope that noble Lords will not press their amendments.
Lord Clement-Jones (LD)

Does the Minister have any thoughts about where smart data schemes might be introduced? I am sure that they are being introduced for a purpose. Is there a plan to issue a policy document or is it purely about consulting different sectors? Perhaps the Minister can give us a glimpse of the future.

Baroness Jones of Whitchurch (Lab)

The noble Lord is tempting me. What I would say is that, once this legislation is passed, it will encourage departments to look in detail at where they think smart data schemes can be applied and provide a useful service for customers and businesses alike. I know that one issue that has been talked about is providing citizens with greater information about their energy supplies—the way that energy is being used and whether they can use their energy differently or find a different supplier—but that is only one example, and I do not want people to get fixated on it.

The potential is enormous; I feel that we need to encourage people to think creatively about how some of these provisions can be used when the Bill is finally agreed. There is a lot of cross-government thinking at the moment and a lot of considering how we can empower citizens more. I could say a lot off the top of my head but putting it on the record in Hansard would probably be a mistake, so I will not be tempted any more by the noble Lord. I am sure that he can write to me with some suggestions, if he has any.

Lord Arbuthnot of Edrom (Con)

My Lords, one problem with cybersecurity is that, if one company is spending money on it while worrying that its competitors are not, it might feel that an element of compulsion would be helpful. I just raise that with the Minister, who suggests that some of these things might be better in the cybersecurity and resilience Bill. My noble friend Lady Neville-Jones and I think she is right, so I beg leave to withdraw my amendment.

Amendment 7 withdrawn.
Clause 3 agreed.
Clause 4: Power to make provision in connection with business data
Amendments 8 to 10 not moved.
Clause 4 agreed.
Clause 5 agreed.
Clause 6: Decision-makers
Amendment 11
Moved by
11: Clause 6, page 10, line 9, after “guidance” insert “on their website”
Member's explanatory statement
This amendment would require decision-makers to publish their guidance on their website, to allow persons seeking authorisation to receive customer data, among other things.
Viscount Camrose (Con)

My Lords, this sequence of amendments is concerned with the publication and availability of guidance. Decision-makers are individuals responsible for deciding if a person has satisfied the conditions for authorisation to receive customer or business data. They may publish guidance on how they intend to exercise their functions. Given the nature of these responsibilities, these individuals are deciding who can receive information pertaining to individuals and businesses. The guidelines that set out how decisions are taken should be easily accessible, and the best place for this is on their websites.

Following on from this point, Amendment 12 would require this guidance to be reviewed annually and any changes to be published, again on decision-makers’ websites, at least 28 days before coming into effect. This would ensure that the guidelines are fit for purpose and provide ample time for people affected by these changes to review them and act accordingly.

Amendments 13 and 14 seek to create similar requirements for enforcers—that is, a public authority authorised to carry out monitoring or enforcement of regulations under this part. Again, given the nature of these responsibilities, the guidelines should be easily accessible on the enforcer’s website and reviewed annually, with any changes published, again on their website, at least 28 days before coming into effect. This will, once again, ensure that the guidelines are fit for purpose and provide ample time for people affected by these changes to review them and act accordingly.

Finally, Amendment 15 would require the Secretary of State or the Treasury to provide guidance on who may be charged a fee under Clause 6(1) and to review it annually. Ensuring the regular review of guidelines will ensure their effectiveness, and the ready availability of guidelines will ensure that they are used and observed. I therefore believe that these amendments will be of benefit to the functioning of the Bill and should be given consideration by the Minister.

Lord Leong (Lab)

My Lords, I thank the noble Viscount, Lord Camrose, for those amendments. I will cover the final group of amendments to Part 1, dealing with smart data guidance.

On Amendments 11, 12, 13 and 14, which relate to the publishing of the guidelines, I am pleased to confirm that Clause 5(4) clarifies that regulations may make provisions about the providing or publishing of business data. This includes the location where they should be published, including, as the noble Viscount suggests, the website of the responsible person.

Furthermore, Clause 21 clarifies that regulation may make provision about the form and manner in which things must be done. That provision can be used to establish appropriate processes around the sharing of information and guidance, including its regular update, publication and sharing with the relevant person.

Amendment 15 refers to the amount of fee charged and how it should be determined. The power is already broad enough to allow the information to be reviewed as and when necessary, but to mandate that the review must take place at least once a year may be a bit restrictive. For these reasons, I ask the noble Viscount not to press his amendments.

Viscount Camrose (Con)

I thank the noble Lord for his answers. I understand what he says, although I would be grateful if either he or the noble Baroness, Lady Jones, could summarise those points in writing because I did not quite capture them all. If I understand correctly, all the concerns that we have raised are dealt with in other areas of the Bill, but if they could write to me then that would be great. I beg leave to withdraw the amendment.

Amendment 11 withdrawn.
Amendment 12 not moved.
Clause 6 agreed.
Clause 7 agreed.
Clause 8: Enforcement of regulations under this Part
Amendments 13 to 14 not moved.
Clause 8 agreed.
Clauses 9 and 10 agreed.
Clause 11: Fees
Amendments 15 and 16 not moved.
Clause 11 agreed.
Clause 12: Levy
Amendments 17 to 19 not moved.
Clause 12 agreed.
Clause 13: Financial assistance
Amendments 20 to 22 not moved.
Clause 13 agreed.
Clause 14 agreed.
Clause 15: The FCA and financial services interfaces: supplementary
Amendments 23 and 24 not moved.
Clause 15 agreed.
Clause 16: The FCA and financial services interfaces: penalties and levies
Amendments 25 to 27 not moved.
Clause 16 agreed.
Clauses 17 and 18 agreed.
Clause 19: Duty to review regulations
Amendment 28 not moved.
Clause 19 agreed.
Clauses 20 and 21 agreed.
Clause 22: Regulations under this Part: Parliamentary procedure and consultation
Amendment 29 not moved.
Clause 22 agreed.
Clauses 23 and 24 agreed.
Clause 25: Other defined terms
Amendment 30 not moved.
Clause 25 agreed.
Clause 26 agreed.
Amendments 31 and 32 not moved.
Clause 27 agreed.
17:30
Clause 28: DVS trust framework
Amendment 33
Moved by
33: Clause 28, page 30, line 28, at end insert—
“(2A) Those rules must include processes for ongoing monitoring of compliance, including but not limited to processes and procedures for monitoring and investigating compliance.
(2B) The rules must contain mechanisms for redress for harms caused by compliance failures.
(2C) The Secretary of State must establish an independent process for hearing appeals against the findings of compliance investigations.”
Member’s explanatory statement
This amendment specifies additional rules for the trust framework.
Lord Clement-Jones (LD)

My Lords, I almost have a full house in this group, apart from Amendment 35, so I will not read out the numbers of all the amendments. I should just say that I very much support what the noble Viscount, Lord Colville, has put forward in his Amendment 35.

Many noble Lords will have read the ninth report of the Delegated Powers and Regulatory Reform Committee. I am sad to say that it holds exactly the same view about this Bill as it did about the previous Bill’s provisions regarding digital verification services. It said that

“we remain of the view that the power conferred by clause 28 should be subject to parliamentary scrutiny, with the affirmative procedure providing the appropriate level of scrutiny”.

It is against that backdrop that I put forward a number of these amendments. I am concerned that, although the Secretary of State is made responsible for this framework, in reality, they cannot be accountable for delivering effective governance in any meaningful way. I have tried, through these amendments, to introduce at least some form of appropriate governance.

Of course, these digital verification provisions are long-awaited—the Age Verification Providers Association is pleased to see them introduced—but we need much greater clarity. How is the Home Office compliant with Part 2 of the Bill as it is currently written? How will these digital verification services be managed by DSIT? How will they interoperate with the digital identity verification services being offered by DSIT in the UK Government’s One Login programme?

Governance, accountability and effective, independent regulation are also missing. There is no mechanism for monitoring compliance, investigating malicious actors or taking enforcement action regarding these services. The Government propose to rely on periodic certification being sufficient, but I understand that, when pressed, DSIT officials say that they are talking to certification bodies and regulators about how compliance could be monitored. This is not really sufficient. I very much share the intention of both this Government and the previous one to create a market in digital verification services, but the many good players in this marketplace believe that high levels of trust in the sector depend on a high level of assurance and focus from the governance point of view. That is missing in this part of the Bill.

Amendment 33 recognises the fact that the Bill has no mechanism for ongoing monitoring or the investigation of compliance failures. As we have seen from the Grenfell public inquiry, a failure of governance caused by not proactively monitoring, checking and challenging compliance has real, harmful consequences. Digital verification services rely on the trustworthiness of the governance model; what is proposed is not trustworthy but creates material risk for UK citizens and parties who rely on the system.

There are perfectly decent examples of regulatory frameworks. PhonepayPlus provides one such example, with a panel of three experts supported by a secretariat; the panel can meet once a quarter to give its opinion. That has been dismissed as being too expensive, but I do not believe that any costings have been produced or that it has been considered how such a cost would weigh against the consequences of a failure in governance of the kind identified in recent public inquiries.

Again, as regards Amendment 36, there is no mechanism in the Bill whereby accountability is clearly established in a meaningful way. Accountability is critical if relying parties and end-users are to have confidence that their interests are safeguarded.

Amendment 38 is linked to Amendment 36. The review under Clause 31 must be meaningful in improving accountability and effective governance. The amendment proposes that the review must include performance, specifically against the five-year strategy and of the compliance, monitoring and investigating mechanisms. We would also like to see the Secretary of State held accountable by the Science and Technology Select Committee for the performance captured in the review.

On Amendment 41, the Bill is silent on how the Secretary of State will determine that there is a compliance failure. It is critical to have some independence and professional rigour included here; the independent appeals process is really crucial.

As regards Amendments 42 and 43, recent public inquiries serve to illustrate the importance of effective governance. Good practice for effective governance would require the involvement of an independent body in the determination of compliance decisions. There does not appear to be an investigatory resource or expertise within DSIT, and the Bill currently fails to include requirements for investigatory processes or appeals. In effect, there is no check on the authority of the Secretary of State in that context, as well as no requirement for the Secretary of State proactively to monitor and challenge stakeholders on compliance.

As regards Amendment 44, there needs to be a process or procedure for that; fairness requires that there should be a due process of investigation, a review of evidence and a right of appeal to an independent body.

I turn to Amendment 45 on effective governance. A decision by the appeals body that a compliance failure is so severe that removal from the register is a proportionate measure must be binding on the Secretary of State; otherwise, there is a risk that investment in compliance and service improvement will be relegated below investment in lobbying. Malicious actors view weaknesses in enforcement as a green light and so adopt behaviours that both put at risk the safety and security of UK citizens and undermine the potential of trustworthy digital verification to drive economic growth.

Amendment 39 would exclude the powers in this part from being used by the Government as part of GOV.UK’s One Login.

I come on to something rather different in Amendment 46, which is very much supported by Big Brother Watch, the Digital Poverty Alliance and Age UK. Its theme was raised at Second Reading. A significant proportion of the UK’s population lacks internet access, with this issue disproportionately affecting older adults, children and those from low-income backgrounds. This form of digital exclusion presents challenges in an increasingly digital world, particularly concerning identity verification.

Although digital identity verification can be beneficial, it poses difficulties for individuals who cannot or choose not to engage digitally. Mandating online identity verification can create barriers for digitally excluded groups. For example, the National Audit Office found that only 20% of universal credit applicants could verify their identity online, highlighting concerns for those with limited digital skills. The Lords Communications and Digital Select Committee emphasised the need for accessible, offline alternatives to ensure inclusivity in a connected world. The proponents of this amendment advocate the availability of offline options for essential public and private services, particularly those requiring identity verification. This is crucial, as forcing digital engagement can negatively affect the well-being and societal participation of older people.

This is the first time that I have prayed in aid what the Minister said during the passage of the Data Protection and Digital Information Bill; this could be the first of a few such occasions. When we debated the DPDI Bill, she stressed the importance of a legal right to choose between digital and non-digital identity verification methods. I entirely agreed with her at the time. She said that this right is vital for individual liberty, equality and building trust in digital identity systems and that, ultimately, such systems should empower individuals with choices rather than enforce digital compliance. That is a fair summary of what she said at the time.

I turn to Amendment 50. In the context of Clause 45 and the power of public authorities to disclose information, some of which may be the most sensitive information, it is important for the Secretary of State to be able to require the public authority to provide information on what data is being disclosed, where the data is going and why it is going there. This amendment would ensure that data is disclosed for the right reasons, to the right places and in the right proportion. I beg to move.

Viscount Colville of Culross (CB)

My Lords, I tabled Amendment 35 because I want to make the DVS trust framework as useful as possible. I support Amendment 33 in the name of the noble Lord, Lord Clement-Jones, and Amendment 37 in the name of the noble Viscount, Lord Camrose.

The framework’s mandate is to define a set of rules and standards designed to establish trust in digital identity products in the UK. It is what I would hope for as a provision in this Bill. As the Minister told us at Second Reading, the establishment of digital ID services with a trust mark will increase faith in the digital market and reduce physical checks—not to mention reducing the time spent on a range of activities, from hiring new workers to moving house. I and many other noble Lords surely welcome the consequent reduction in red tape, which so often impedes the effectiveness of our public services.

Clause 28(3) asks the Secretary of State to consult the Information Commissioner and such persons as they consider appropriate. However, in order to ensure that these digital ID services are used and recognised as widely as possible—and, more importantly, that they can be used by organisations beyond our borders—I suggest Amendment 35, which would include putting consultation with an international digital standards body in the Bill. This amendment is supported by the Open Data Institute.

I am sure that the Minister will tell me that that amendment is unnecessary as we can leave it to the common sense of Ministers and civil servants in DSIT to consult such a body but, in my view, it is helpful to remind them that Parliament thinks the consultation of an international standards body is important. The international acceptance of DVS is crucial to its success. Just like an email, somebody’s digital identity should not be tied to a company or a sector. Imagine how frustrating it would be if we could only get Gmail in the UK and Outlook in the EU. Imagine if, in a world of national borders and jurisdictions, you could not send emails between the UK and the EU as a result. Although the DVS will work brilliantly to break down digital identity barriers in the UK, there is a risk that no international standards body might be consulted in the development of the DVS scheme. This amendment would be a reminder to the Secretary of State that there must be collaboration between this country, the EU and other nations, such as Commonwealth countries, that are in the process of developing similar schemes.

17:45
The benefits of adopting Amendment 35 are clear. Noble Lords can only imagine how frustrating it is at the moment for UK citizens who want to move to employment abroad. Take the example of a doctor who wants to work in the EU. At the moment, they need to take a traditional approach to getting access to their qualifications. Not only does this process have administrative costs but it opens the way for qualification fraud. A digital version of the doctor’s registration certificate would quickly confirm their UK credentials and speed them on the way to international employment.
I have suggested the World Wide Web Consortium, which is known to the industry as W3C, as the best body to consult, but in the fast-changing digital world, I have left the amendment open for another body to be nominated for consultation by the Minister. W3C was established in 1994 to create the architecture to accommodate the rapid pace of change in web standards. It is the body that set standards for HTML and CSS, which are used to build websites. As its founder, Sir Tim Berners-Lee, said recently, digital identity systems, like the DVS trust framework, have the potential to transform how we live and work, but only if they are built on interoperable, international standards. Just as the World Wide Web itself thrives on global collaboration and universal protocols, digital identity must not be constrained by borders or fragmented by conflicting systems. Without co-ordinated international effort, we risk creating isolated silos of identity that hinder mobility, innovation and trust across borders. W3C will ensure that trust frameworks like that in the UK align with global standards.
This Bill is looking to the future when our country is a digital leader on international standards. A DVS-trusted scheme will allow that to happen. I urge the Minister to accept my Amendment 35.
It is also clear that Amendment 37 is needed. Surely noble Lords agree that this complicated and novel process should at least be laid before Parliament before it is finally agreed.
Baroness Kidron (CB)

In an act that I hope he is going to repeat throughout, the noble Lord, Lord Clement-Jones, has fully explained all the amendments that I want to support, so I put on record that I agree fully with all the points he made. I want to add just one or two other points. They are mainly in the form of questions for the Minister.

Some users are more vulnerable to harms than others, so Amendment 33 would insert a new subsection (2B), which mentions redress. What do the Government envisage for those who may be more vulnerable, and how do they think they might use this system? Obviously, I am thinking about children, but there could be other categories of users, certainly the elderly.

That led me to wonder what consideration has been given to vulnerable users more generally and how that is being worked through. That led to me to question exactly how this system is going to interact with the age-assurance work that the IC is doing as a result of the Online Safety Act and make sure that children are not forced into a position where they have to show their identity in order to prove their age or, indeed, cannot prove their identity because they have been deemed to have been dealt with elsewhere in another piece of legislation. Because, actually, children do open bank accounts and do have to have certain sorts of ID.

That led me to ask what in the framework prevents service providers giving more information than is required. I have read the Bill; someone said earlier that it is skeletal. From what we know, you can separate pieces of information, attributes, from each other, but what is to stop a service provider failing to do so? This is absolutely crucial to the trust in and workings of this system, and it leads me to the inverse, Amendment 46, which asks how we can prevent this system being forced and thrust upon people. As the noble Lord, Lord Clement-Jones, set out, we need to make sure that people have the right not to use the system as well as the right to use it.

Finally, I absolutely agree with the noble Viscount, Lord Colville, and the amendment in the name of the noble Viscount, Lord Camrose: something this fundamental must come back to Parliament. With that, I strongly associate myself with the words of the noble Lord, Lord Clement-Jones, on all his amendments.

Viscount Camrose (Con)

I thank noble Lords for their comments and contributions in what has been an absolutely fascinating debate. I have a couple of points to make.

I agree with the noble Lord, Lord Clement-Jones, on his Amendment 33, on ongoing monitoring, and his Amendment 50. Where we part company, I think, is on his Amendment 36. I feel that we will never agree about the effectiveness or otherwise of five-year strategies, particularly in the digital space. I simply do not buy that his amendment will have the desirable effects that the noble Lord wants.

I do not necessarily agree with the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that we should put extra burdens around the right to use non-digital methods. In my opinion, and I very much look forward to hearing from the Minister on this matter, the Act preserves that right quite well as it is. I look forward to the Government’s comments on that.

I strongly support the noble Viscount, Lord Colville, on his very important point about international standards. I had intended to sign his amendment but I am afraid that, for some administrative reason, that did not happen. I apologise for that, but I will sign it because I think that it is so important. In my opinion, not much of the Bill works in the absence of effective international collaboration around these matters. We are particularly going to run up against this issue when we start talking about ADM, AI and copyright issues. It is international standards that will allow us to enforce any of the provisions that we put in here. I am more agnostic on whether this will happen via W3C, the ITU or other international standards bodies, but we really must go forward with the principle that international standards are what will get us over the line here. I look forward to hearing the Minister’s confirmation of the importance, in the Government’s view, of such standards.

Let me turn to the amendments listed in my name. Amendment 37 would ensure parliamentary oversight of the DVS trust framework. Given the volume of sensitive data that these services providers will be handling, it is so important that Parliament can keep an eye on how the framework operates. I thank noble Lords for supporting this amendment.

Amendment 40 is a probing amendment. To that end, I look forward to hearing the Minister’s response. Accredited conformity assessment bodies are charged with assessing whether a service complies with the DVS framework. As such, they are giving a stamp of approval from which customers will draw a sense of security. Therefore, the independence of these accreditation bodies must be guaranteed. Failing to do so would allow the industry to regulate itself. Can the Minister set out how the Government will guarantee the independence of these accreditation bodies?

Amendment 49 is also a probing amendment. It is designed to explore the cybersecurity measures that the Government expect of digital verification services. Given the large volume of data that these services will be handling, it is essential that the Government demand substantial cybersecurity measures. This is a theme that we are going to come back to again and again; we heard about it earlier, and I think that we will come on to more of this. As these services become more useful and more powerful, they present a bigger attack surface that we have to defend, and I look forward to hearing how we will do that.

Baroness Jones of Whitchurch (Lab)

I thank the noble Lords, Lord Clement-Jones and Lord Markham, the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for raising these topics around digital verification services. As I explained at Second Reading, these digital verification services already exist. They are already out there making all sorts of claims for themselves. With the new trust framework, we are trying to provide some more statutory regulation of the way that they operate. It is important that we have this debate and that we get it right, but some of the things we are doing are still work in progress, which is why we do not always have all the detailed answers that noble Lords are searching for here and why some powers have been left to the Secretary of State.

I shall go from the top through the points that have been raised. Amendments 33 and 43, tabled by the noble Lord, Lord Clement-Jones, and Amendment 40, tabled by the noble Viscount, Lord Camrose, would require the trust framework to include rules on monitoring compliance and redress mechanisms and would require the Secretary of State to ensure the independence of accredited conformity assessment bodies. The noble Baroness, Lady Kidron, asked questions akin to those regarding redress for the vulnerable, and I will write to her setting out a response in more detail.

On the issue of redress mechanisms in the round, the scope of the trust framework document is solely focused on the rules that providers of digital verification services are required to follow. It does not include matters of governance. Compliance is ensured via a robust certification process where services are assessed against the trust framework rules. They are assessed by independent conformity assessment bodies accredited by the United Kingdom Accreditation Service, so some oversight is already being built into this model.

The Bill contains powers for the Secretary of State to refuse applications to the DVS register or to remove providers where he is satisfied that the provider has failed to comply with the trust framework or if he considers it necessary in the interests of national security. These powers are intended as a safety net, for example, to account for situations where the Secretary of State might have access to intelligence sources that independent conformity assessment bodies cannot assess and therefore will not be able to react to, or it could be that a particular failure of the security of one of these trust marks comes to light very quickly, and we want to act very quickly against it. That is why the Secretary of State has those powers to be able to react quickly in what might be a national security situation or some other potential leak of important data and so on.

In addition, conformity assessment bodies carry out annual surveillance audits and can choose to conduct spot audits on certified providers, and they have the power to withdraw certification where non-conformities are found. Adding rules on compliance would cut across that independent certification process and would be outside the scope of the trust framework. Those independent certification processes already exist.

Amendments 33, 41, 42, 44 and 45, tabled by the noble Lord, Lord Clement-Jones, would in effect require the creation of an independent appeals body to adjudicate on the refusal of an application to the DVS register, and the implementation of an investigatory process applicable to refusal and removal from the register. The powers of the Secretary of State in this regard are not without safeguards. They may be exercised only in limited circumstances, after the completion of an investigatory process, and are subject to public law principles such as reasonableness. They may also be challenged by way of judicial review.

To go back to the point I was making, these may be situations in which we need to move quickly. Rather than the convoluted appeals process that the noble Lord described, I hope he understands the need, sometimes, for that flexibility. The creation and funding of an independent body to adjudicate such a limited power would therefore be inappropriate.

18:00
I thank the noble Viscount, Lord Colville, for Amendment 35, which would require an international standards body to be consulted in preparing the trust framework. The non-statutory published framework that already exists references the World Wide Web Consortium’s standards, along with other relevant international standards, in its body of rules. The Bill provides for the consultation of such bodies as the Secretary of State considers appropriate. The Government are keen to focus the consultation on those bodies that are most appropriate at a given time. Although it is perfectly likely that the World Wide Web Consortium would be consulted, committing to one specific body could undermine this flexibility, and we may want to take broader soundings on the issues being considered.
Viscount Camrose (Con)

It would be reassuring if the Minister could tell us about some of the meetings that the Secretary of State or Ministers are having with those bodies on the subject of these internationally shared technical standards.

Baroness Jones of Whitchurch (Lab)

I might need to write to the noble Viscount, but I am pretty sure that that is happening at an official level on a fairly regular basis. The noble Viscount raises an important point. I reassure him that those discussions are ongoing, and we have huge respect for those international organisations. I will put the detail of that in writing to him.

I turn to Amendment 37, tabled by the noble Viscount, Lord Camrose, which would require the DVS trust framework to be laid before Parliament. The trust framework contains auditable rules to be followed by registered providers of digital verification services. The rules, published in their third non-statutory iteration last week on GOV.UK, draw on and often signpost existing technical requirements, standards, best practice, guidance and legislation. It is a hugely technical document, and I am not sure that Parliament would make a great deal of sense of it if it were put forward in its current format. However, the Bill places consultation on a statutory footing, ensuring that it must take place when the trust framework is prepared and reviewed.

Amendments 36 and 38, tabled by the noble Lord, Lord Clement-Jones, would create an obligation for the Secretary of State to reconsult and to publish a five-year strategy on digital verification services. It is important that the Government have a coherent strategy for enabling the digital verification services market. That is why we have already consulted publicly on these measures, and we continue to work with experts. However, given the nascency of the digital identity market and the pace of technological development, as the noble Viscount, Lord Camrose, said, forecasting five years into the future is not practical at this stage. We would welcome scrutiny through the annual report that we are committed to publishing, as required by Clause 53. This report will support transparency through the provision of information, including performance data, regarding the operation of Part 2.

Amendment 39, also tabled by the noble Lord, Lord Clement-Jones, proposes to exclude certified public bodies from registering to provide digital verification services. We believe that such an exclusion could place unnecessary restrictions on the UK’s young digital verification market. The noble Lord mentioned the GOV.UK One Login programme, which is aligned with the standards of the trust framework but is a separate government programme giving people a single sign-on service to access public services. It operates its services under different legal powers from those proposed here. We do not accept that we need to exclude public bodies from the scrutiny that would otherwise take place.

Amendment 46 seeks to create a duty for organisations that require verification, and that use digital verification for that purpose, to offer, where reasonably practicable, a non-digital route and to ensure that individuals are made aware of both options for verification. I should stress that the provisions in the Bill relate to the provision of digital verification services, not to requirements on businesses in general about how they conduct verification checks.

Ensuring digital inclusion is a priority for this Government, which is why we have set up the digital inclusion and skills unit within DSIT. Furthermore, there are already legislative protections in the Equality Act 2010 in respect of protected groups, and the Government will take action in the future if evidence emerges that people are being excluded from essential products and services by being unable to use digital routes for proving their identity or eligibility.

The Government will publish a code of practice for the disclosure of information, subject to parliamentary review, highlighting best practice and the relevant information to be considered when sharing information. As for Amendment 49, the Government intend to update this code only when required, so an annual review process would not be necessary. I stress to the Committee that digital verification services are not going to be mandatory: it is entirely voluntary for businesses to use them, and it is up to individuals whether they use such a service. I think people feel that it is going to be imposed on them, and I would push back against that suggestion.

If the regulation-making power proposed in Amendment 50 by the noble Lord, Lord Clement-Jones, were used, it would place obligations on the Information Commissioner to monitor the volume of verification checks made using the permissive information-disclosure powers created by the clause. The role of the commissioner is to regulate data protection in the UK, which already includes monitoring and promoting responsible data-sharing by public authorities. For the reasons set out above, I hope that noble Lords will feel comfortable in not pressing their amendments.

Baroness Kidron (CB)

Can I double-check that nothing was said about the interaction between the Bill and the OSA in all of that? I understood the Minister to say that she would perhaps write to me about vulnerable people, but my question about how this interacts was not answered. Perhaps she will write to me on that issue as well.

Baroness Jones of Whitchurch (Lab)

Was that a question on age assurance?

Baroness Kidron (CB)

Yes, the ICO is undertaking work on age assurance under the OSA at the moment. My point was about how the two regimes intersect and how children are treated under each. Do they fall between the two?

Baroness Jones of Whitchurch (Lab)

I will, of course, write to the noble Baroness.

Lord Clement-Jones (LD)

Was the Minister saying that in view of the current duties of the ICO, Amendment 50 is not needed because public authorities will have the duty to inform the ICO of the information that they have been passing across to these identity services?

Baroness Jones of Whitchurch (Lab)

Again, I will have to write to the noble Lord on that. I think we were saying that it is outside the current obligations of the ICO, but we will clarify the responsibility.

Lord Clement-Jones (LD)

My Lords, I am not quite sure whether to be reassured or not, because this is terra incognita. I am really struggling with the Minister’s response, which amounts to saying: “Hands off, Parliament. We want the lightest touch on all of this, and the Secretary of State will decide”.

I should first thank the noble Baroness, Lady Kidron, for her support. I thought that the noble Viscount, Lord Colville, made an extremely good case for Amendment 35 because all of us want to make sure that we have that interoperability. One of the few areas where I was reassured by the Minister was on the consultations taking place.

I am sure that the noble Viscount, Lord Camrose, was right to ask what those consultations are. We need to be swimming in the right pool for our digital services to be interoperable. It is not as if we do not have contact with quite a number of these digital service providers; some of them are extremely good and want a level of mandation for these international services. There is a worrying lack of detail here. We are between the devil and the deep blue sea. On the one hand, we have these rules on GOV.UK, which are far too complicated for mere parliamentarians to comprehend; they are so detailed that we are going to get bogged down.

On the other hand, we do not know what the Secretary of State is doing. This is the detailed trust framework, but what is the governance around it? At the beginning of her speech, the Minister said that governance is different from certification and the conformity assessment service; I would have thought that governance was all part of the same warp and weft. I do not really understand the argument that, because the Secretary of State has the power to refuse accreditation, we do not need an independent appeals body. It would be much more straightforward if we knew that there was a regulator and that it would be transparent about how the system worked. I just feel that this is all rather half-baked at the moment. We need a lot more information than we are getting. To that extent, that is the case for all the amendments in this group.

The crucial amendment is Amendment 37, tabled by the noble Viscount, Lord Camrose, because we absolutely need to bring all this into the light of day through parliamentary approval, whether or not it is a complicated document. Perhaps we could put it through an AI model and simplify it somewhat before we debate it. We have to get to grips with this. I have a feeling that we are going to want to return to this aspect on Report, because no good reason has been given, to us or to the DPRRC, why Parliament is not debating the scheme itself. It is a bit sad to have to say this, because we all support progress on digital verification, if you like; yet we are all in a bit of a fog about how it is all going to work.

I very much hope that the Minister can come back to us, perhaps with a must-write letter that sets it all out to a much more satisfactory extent. I hope she understands why we have had this fairly protracted debate on this group of amendments because this is an important aspect that the Bill is skeletal about. I beg leave to withdraw the amendment.

Amendment 33 withdrawn.
Amendments 34 to 37 not moved.
Clause 28 agreed.
Clauses 29 to 30 agreed.
Clause 31: Review of DVS trust framework and supplementary codes
Amendment 38 not moved.
Clause 31 agreed.
Clause 32 agreed.
Clause 33: Registration in the DVS register
Amendments 39 and 40 not moved.
Clause 33 agreed.
Clause 34: Power to refuse registration in the DVS register
Amendments 41 to 43 not moved.
Clause 34 agreed.
Clauses 35 to 40 agreed.
Clause 41: Power to remove person from the DVS register
Amendments 44 and 45 not moved.
Clause 41 agreed.
Clauses 42 to 44 agreed.
Amendment 46 not moved.
18:15
Amendment 47
Moved by
47: After Clause 44, insert the following new Clause—
“Cyber-security rules for DVS providers
(1) The Secretary of State must prepare and publish a set of cyber-security rules for Digital Verification Service providers.
(2) Rules under subsection (1) must be reviewed at least annually, and any updates to the rules must be published at least 28 days before they come into force.”
Member’s explanatory statement
This is a probing amendment on the cyber-security measures expected of Digital Verification Services providers.
Viscount Camrose (Con)

My Lords, Amendment 47 is in another slightly peculiar group, but we will persevere. It aims to bolster the cybersecurity framework for digital verification services providers. Needless to say, as we continue to advance in the digital age, it is vital that our online systems, especially those handling sensitive information, are protected against ever-evolving cyberthreats. As DVSs gain currency and usage, the incentive for cyberattackers to target them and try to take advantage grows. They need to be protected.

The proposed amendment therefore mandates the creation and regular review of cybersecurity rules for all DVS providers. These rules are designed to ensure that services involved in verifying identities and other critical data maintain standards of protection, resilience and trustworthiness commensurate with their importance and with the sensitivity of the data that any breach would expose.

We could hardly be more aware that we live in an increasingly digital world where almost every aspect of our lives is connected online. Digital verification services play a key role in this landscape, and that role is going to increase. They are used by individuals and organisations to confirm identities, authenticate transactions and verify data. These services underpin critical areas, such as banking, healthcare and public services, where security is paramount. However, as the cyberthreat landscape becomes more sophisticated, so does the need for robust security measures to protect these services. Hackers and malicious actors are continuously developing new ways to exploit vulnerabilities in digital systems. This puts personal data, business operations and even national security at risk.

A security breach in a digital verification system could have devastating consequences not only for the immediate victims but for the reputation and integrity of the service providers. That is why we on these Benches feel that the proposed amendment is absolutely critical. It would ensure that all DVS providers are held to a high, standardised set of cybersecurity practices. This would not only reduce the risk of cyberthreats but build greater public trust in the safety and reliability of those services and, therefore, enhance their uptake.

One of the key aspects of the amendment is the requirement for the cybersecurity rules to be reviewed annually. This is especially important in the context of the rapid evolution of the cyberthreats that we face. Technologies, attack methods and vulnerabilities are constantly changing, and what is secure today may not be secure tomorrow. By reviewing the cyber rules every year, we will ensure that they remain current and effective in protecting against the latest threats. I beg to move.

Lord Markham (Con)

I support that. I completely agree with all the points that the noble Lord, Lord Clement-Jones, made on the previous groups, but the point that we all agree is absolutely vital is the one just raised by my noble friend. Coming from the private sector, I am all in favour of a market—I think that it is the right way to go—but standards within that market are equally vital.

I come at this issue having had the misfortune of managing the response to the cyberattack that we all recall against our diagnostic services in hospitals last summer. We found that the weakest link was the private sector supplier to that system, and it became clear that the health service—or cybersecurity, or whoever it was—had not done enough to make sure that those standards were set, published and adhered to effectively.

With that in mind, and trying to learn the lessons from it, I think that this clause is vital in its intent, but it will be valuable only if it is updated frequently. Of everything that we have spoken about today, and on this issue in particular, that point is probably the most important. Although everything that we are trying to do is a massive advance in getting the data economy to work even better, I cannot emphasise enough how worrying that attack on our hospitals last summer was at the time.

Baroness Jones of Whitchurch (Lab)

I thank both noble Lords for raising this; I absolutely concur with them on how important it is. In fact, I remember going to see the noble Viscount, Lord Camrose, when he was in his other role, to talk about exactly this issue: whether the digital verification services were going to be robust enough against cyberattacks.

I pray in aid the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, who both felt that the new Cyber Security and Resilience Bill will provide some underpinning for all of this. The Government take this issue very seriously. As the Committee can imagine, we receive regular advice from the security services about what is going on and what we need to do to head it off. Yes, it is a difficult issue, but we are doing everything we can to make sure that our data is safe; that is fundamental.

Amendment 47 would require the Secretary of State to prepare and publish rules on cybersecurity for providers to follow. The existing trust framework includes rules on cybersecurity, against which organisations will be certified. Specifically, providers will be able to prove either that they meet the internationally recognised information security standards or that they have a security management system that matches the criteria set out in the trust framework.

I assure noble Lords that the Information Commissioner’s Office, the National Cyber Security Centre and other privacy stakeholders have contributed to the development of the trust framework. This includes meeting international best practice around encryption and cryptographic techniques. I will happily write to noble Lords to reassure them further by detailing the range of protections already in place. Alternatively, if noble Lords here today would benefit from an official technical briefing on the trust framework, we would be delighted to set up such a meeting, because it is important that we all feel content that this will be a robust system, for exactly the reasons that the noble Lord, Lord Markham, explained. We are absolutely on your Lordships’ side and on the case on all this; if it would be helpful to have a meeting, we will certainly do that.

Viscount Camrose (Con)

I thank the Minister and my noble friend Lord Markham for those comprehensive and welcome comments. I would certainly like to take up the Minister’s offer of a technical briefing on the trust framework; that really is extremely important.

To go briefly off-piste, one sign that we are doing this properly will be the further development of an insurance marketplace for cybersecurity; it exists but is not yet well developed. As and when this information is regularly published and updated, we will see products becoming available that allow people to take out insurance based on known cybersecurity risks.

As I say, I take comfort from the Minister’s words and look forward to attending the tech briefing. When it comes, the cyber Bill will also play a serious role in this space and I look forward to seeing how, specifically, it will interact with DVS and the other services that we have been discussing and will continue to discuss. I beg leave to withdraw my amendment.

Amendment 47 withdrawn.
Clause 45: Power of public authority to disclose information to registered person
Amendment 48 not moved.
Clause 45 agreed.
Clauses 46 to 48 agreed.
Clause 49: Code of practice about the disclosure of information
Amendment 49 not moved.
Clause 49 agreed.
Amendment 50 not moved.
Clause 50: Trust mark for use by registered persons
Amendment 51
Moved by
51: Clause 50, page 46, line 19, at end insert—
“(3A) A person who acts in contravention of subsection (3) commits an offence.
(3B) A person who commits an offence under subsection (3A) is liable—
(a) on summary conviction to a fine; or
(b) on conviction on indictment to a term of imprisonment not exceeding 2 years or to a fine or both.”
Member’s explanatory statement
This amendment would make it an offence for someone to use a trust mark when they have no permission to do so, with the aim of weeding out fraud.
Lord Clement-Jones (LD)

My Lords, in moving Amendment 51, I will also speak to Amendments 52, 53, 54 and 209 in my name, which seek to create new criminal offences under the Bill. The first is the offence of using a trust mark without permission; the second is providing false information to the Secretary of State in response to an information notice; and the third is using a false digital identity document, which is really an alternative to digital identity theft.

Clause 50 currently contains no real consequence for a person using a trust mark without permission. A trust mark, which has no specific definition in the Bill, may be used only by those who are in the DVS register. Clause 50(3) says:

“A mark designated under this section may not be used by a person in the course of providing, or offering to provide, digital verification services unless the person is registered in the DVS register in respect of those digital verification services”.


Clause 50(4) then says:

“The Secretary of State may enforce subsection (3)”


by civil injunction or interdict. This has no real teeth in circumstances where there are persistent and flagrant offenders, regardless of whether it is on a personal or commercial scale.

Amendment 51 would provide appropriate penalties: a fine on summary conviction and, on conviction on indictment, up to two years’ imprisonment, a fine or both. Amendment 52 would provide that a prosecution may not be brought except by, or with the consent of, the appropriate chief prosecutor. Amendment 54 relates to providing false information to the Secretary of State and is advanced on a similar basis; Clause 51 contains a power for the Secretary of State to require information and, of course, many regulators have this power.

On the issue of false digital identities—identity theft—Amendment 53 is a refinement of Amendment 289, which I tabled in Committee on the late, unlamented DPDI Bill. That amendment was entitled “Digital identity theft”. I have also retabled the original amendment, but in many ways Amendment 53 is preferable because it is much more closely aligned with the Identity Documents Act 2010, which contains several offences relating to the use of a person’s identity document. Currently, an identity document includes an immigration document, a passport or similar document, or a driving licence.

18:30
All these identity documents are physical documents; that Act therefore does not extend to digital documents. The Bill establishes a framework that could include all or some of these documents in digital verification services, and DVS could also create new or similar identity documents. As the law currently stands, the Bill would not make it an offence for an identity document created or verified by DVS to be misused or falsified in the way that the Identity Documents Act 2010 makes the misuse or falsification of physical documents an offence.
Amendment 53 proposes to extend the offences of misusing and falsifying identity documents, so far as they apply offline, to the online world. I know that successive Governments have been keen to treat online the same as offline. Identity documents should not be falsified or misused, whether offline or online; indeed, whether the conduct is offline or online should not matter to whether it is an offence. The Identity Documents Act 2010 prescribes the penalties for the separate offences under that Act: 10 years for an offence under Sections 4 and 5 and, under Section 6, two years on indictment or a fine of the relevant statutory maximum on summary conviction.
In terms of digital identity, whichever amendment the Minister prefers, I hope that she will at least give a hearing to this new Amendment 53. It is the product of some thought about how best to protect those who use digital identity online and ensure that they have the same protection as they would otherwise have had in the physical world. I beg to move.
Baroness Kidron (CB)

My Lords, I support the amendments in the name of the noble Lord, Lord Clement-Jones. I perhaps did not say at the beginning of my remarks on this section that I fully support the Government’s efforts to create a trust framework; I started with criticism rather than with the fact that this is really important. Trust is in the name, and if we cannot trust it, it is not going to be a trust framework. It is important to anticipate and address the likelihood that some will seek to abuse it. If there are not sufficient consequences for abusing it, I do not understand how we can have the level of trust needed for wide adoption.

I particularly want to say that good systems cannot rely on good people; we know that, and we see it. We are going to discuss it later in Committee, but good systems need checks and balances. In relation to this set of amendments, we need a disincentive for bad actors who would mislead or give false information to government or the public. I am not going to rehearse each amendment, which the noble Lord, Lord Clement-Jones, explained so brilliantly. The briefing on the trust framework will be a very important one for us all. Given the amount of support there is for the idea, and the number of questions about what it means and how it will work, we will come back to this if we do not have a full enough explanation of the disincentives for a bad actor.

Lord Stevenson of Balmacara (Lab)

My Lords, I support these amendments and applaud the noble Lord, Lord Clement-Jones, for his temerity and for offering a variety of choices, making it even more difficult for my noble friend to resist them.

It has puzzled me for some time why the Government do not wish to see a firm line taken on digital identity theft. Identity theft in any form must be among the most heinous of crimes, particularly in today’s world. This question came up yesterday in an informal meeting about a Private Member’s Bill due up next Friday on the vexed question of the sharing of intimate images and how the Government are going to respond to it. We were sad to discover that there was no support among the Ministry of Justice officials who discussed the Bill with its promoter for seeing it progress any further.

At the heart of that Bill is the same question: what happens when one’s identity is taken and one’s whole career and personality are destroyed by those who take one’s private information and distort it, so that those who see it take it to be a different person, or someone involved in activities that the original person would never have been involved in? Yet we hear that the whole basis on which this digital network has been built is a voluntary one, the logic of which is that the sort of amendments before us now would not be necessary.

I urge the Government to think very hard about this. There must be a break point here. Maybe the meeting that has been promised will help us, but there is a fundamental point about whether in the digital world we can rely on the same protections that we have in the real world—and, if not, why not?

Viscount Camrose (Con)

My Lords, I will address the amendments proposed by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron. I have nothing but the deepest respect for their diligence, and indeed wisdom, in scrutinising all three flavours of the Bill as it has come out, and for their commitment to strengthening the legislative framework against fraud and other misuse of digital systems. However, I have serious reservations about the necessity and proportionality of the amendments under consideration, although I look forward to further debates and I am certainly open to being convinced.

Amendments 51 and 52 would introduce criminal sanctions, including imprisonment, for the misuse of trust marks. While the protection of trust marks is vital for maintaining public confidence in digital systems, I am concerned that introducing custodial sentences for these offences risks overcriminalisation. The misuse of trust marks can and should be addressed through robust civil enforcement mechanisms. Turning every such transgression into a criminal matter would place unnecessary burdens on, frankly, an already strained justice system and risks disproportionately punishing individuals or small businesses for inadvertent breaches.

Furthermore, the amendment’s stipulation that proceedings could be brought only by or with the consent of the Director of Public Prosecutions or the Secretary of State is an important safeguard, yet it underscores the high level of discretion required to enforce these provisions effectively, highlighting the unsuitability of broad criminalisation in this context.

Amendment 53 seeks to expand the definition of identity documents under the Identity Documents Act 2010 to include digital identity documents. While the noble Lord, Lord Clement-Jones, makes a persuasive case, the proposal raises two concerns. First, it risks pre-emptively criminalising actions before a clear and universally understood framework for digital identity verification is in place. The technology and its standards are still evolving, and it might be premature to embed such a framework into criminal law. Secondly, there is a risk that this could have unintended consequences for innovation in the digital identity sector. Businesses and individuals navigating this nascent space could face disproportionate legal risks, which may hinder progress in a field critical to the UK’s digital economy.

Amendment 54 would introduce an offence of knowingly or recklessly providing false information in response to notices under Clause 51. I fully support holding individuals accountable for deliberate deception, but the proposed measure’s scope could lead to serious ambiguities. What constitutes recklessness in this context? Are we inadvertently creating a chilling effect where individuals or businesses may refrain from engaging with the system for fear of misinterpretation or error? These are questions that need to be addressed before such provisions are enshrined in law.

We must ensure that our legislative framework is fit for purpose, upholds the principles of justice and balances enforcement with fairness. The amendments proposed, while they clearly have exactly the right intentions, risk, I fear, undermining these principles. They introduce unnecessary criminal sanctions, create uncertainty in the digital identity space and could discourage good-faith engagement with the regulatory system. I therefore urge noble Lords to carefully consider the potential consequences of these amendments and, while expressing gratitude to the noble Lords for their work, I resist their inclusion in the Bill.

Baroness Jones of Whitchurch (Lab)

My Lords, of course we want to take trust seriously. I could not agree more that the whole set of proposals is predicated on that. Noble Lords have all made the point, in different ways, that if there is not that level of trust then people simply will not use the services and we will not be able to make progress. We absolutely understand the vital importance of all that. I thank all noble Lords for their contributions on this and I recognise their desire to ensure that fraudulent use of the trust mark is taken seriously, as set out in Amendments 51 and 52.

The trust mark is in the process of being registered as a trademark in the UK. Once that registration is complete, the Secretary of State will be able to take appropriate legal action for misuse of it. Robust legal protections are also provided through Clause 50, through the trademark protections and through other existing legislative provisions, such as the Consumer Protection from Unfair Trading Regulations 2008. There is already legislation that underpins the use of the trust mark. Additionally, each trust mark will have a unique number that allows users to check that it is genuine. These amendments would duplicate those existing protections.

In seeking to make the misuse of a digital identity a criminal offence, which Amendments 53 and 209 attempt to do, the noble Lord offered me several different ways of approaching this, so I will offer him some back. The behaviour he is targeting is already addressed in the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018; we would argue that it is already covered by existing legislation.

On the noble Lord’s point about the Identity Documents Act 2010, defining every instance of verification as an identity document within the scope of offences in that Act could create an unclear, complicated and duplicative process for the prosecution of digital identity theft. The provision of digital verification services does not always create one single comprehensive identity proof—I think this is the point that the noble Viscount, Lord Camrose, was making. People use it in different ways: it might be a yes/no check to ensure that a person is over 18, or a digital verification services provider might provide several derived credentials that can be used in different combinations for different use cases. We have to be flexible enough to deal with that, rather than legislating as if there were one single form of fraudulent act. It would not be appropriate to add digital identity to the list of documents set out in the Identity Documents Act.

Amendment 54 would create an offence of supplying false information to the Secretary of State, but sanctions already exist in this situation, as the organisation can be removed from the DVS register via the power in Clause 41. Similarly, contractual arrangements between the Office for Digital Identities and Attributes and conformity assessment bodies require them to adhere to the principle of truthfulness and accuracy. To create a new offence would be disproportionate when safeguards already exist. I take on board the intent and aims of the noble Lord, Lord Clement-Jones, but argue that there are already sufficient protections in current law and in the way in which the Bill is drafted to provide the reassurance that he seeks. Therefore, I hope that he feels comfortable in not pressing his amendment.

18:45
Lord Clement-Jones (LD)

My Lords, I thank the Minister. I was quite amused in listening to the noble Viscount, Lord Camrose. I thought about the halcyon days of listening to the language that he used when he was a Minister, with words like “premature”, “unintended consequences”, “disproportionate” and “ambiguity”. I thought, “Come back, Viscount Camrose”—but I appreciate that he took the trouble at least to analyse, from his point of view, where he saw the problems with some of the amendments.

I go back to the physical identity verification aspect. I should have said that I very much hope that the Minister and I can discuss how the Equality Act 2010 has an impact. I am not entirely sure about the protected characteristics playing into this because, obviously, the Equality Act only references those. I think that there could be a greater cohort of people who may be disadvantaged by commercial operators insisting on digital verification, as opposed to physical verification, for instance; I may need to discuss that with the Minister.

I am grateful to the Minister for having gone through where she thinks there are safeguards and sanctions against false use of the trust mark; that was a very helpful run-through, so I shall not go back over what she said. The really important area is this whole offline/online criminal aspect. I understand that it may not be perfect because the scheme is not in place—it may not need to be on all fours exactly with the 2010 Act—but I think that the Minister’s brief was incorrect in this respect. If the Bill team look back at the report from the committee that the noble Baroness, Lady Morgan, chaired back in 2022, Fighting Fraud: Breaking the Chain, they will see that it clearly said:

“Identity theft is often a predicate action to the criminal offence of fraud, as well as other offences including organised crime and terrorism, but it is not a criminal offence”.


That is pretty clear. The committee went into this in considerable detail and said:

“The Government should consult on the introduction of legislation to create a specific criminal offence of identity theft. Alternatively, the Sentencing Council should consider including identity theft as a serious aggravating factor in cases of fraud”.


First, I am going to set the noble Baroness, Lady Morgan, on the noble Viscount, Lord Camrose, to persuade him of the wisdom of creating a new offence. I urge the Minister to think about the consequences of not having any criminal sanction for the misuse of digital identity and for identity theft. Whatever you might call it, there must be some way to protect people in these circumstances if we are going to have public trust in the digital verification framework that we are setting up under this Bill. This will be rolled out—if only I had read GOV.UK, I would be far wiser.

It was very interesting to hear the Minister start to unpack quite a lot of detail. We heard about the new regulator, the Office for Digital Identities and Attributes. That was the first reference to the new regulator, but what are its powers going to be? We need a parliamentary debate on this, clearly. Is this an office delegated by the Secretary of State? Presumably, it is non-statutory, in a sense, and will have powers that are at the will of the Secretary of State. It will be within DSIT, I assume—and so on.

I am afraid that we are going round in circles here. We need to know a great deal more. I hope that we get much more protection for those who have the benefit of the service; otherwise, we will find ourselves in the situation we are often in with the digital world, whereby there is a lack of trust and the public react against what they perceive as somebody taking something away from them. In the health service, for example, 3 million people have opted out of sharing their GP personal health data. I am only saying that we need to be careful in this area and make sure that we have all the right protections in place. In the meantime, I beg leave to withdraw my amendment.

Amendment 51 withdrawn.
Amendment 52 not moved.
Clause 50 agreed.
Amendment 53 not moved.
Clause 51: Power of Secretary of State to require information
Amendment 54 not moved.
Clause 51 agreed.
Clauses 52 to 55 agreed.
Clause 56: National Underground Asset Register: England and Wales
Amendment 55
Moved by
55: Clause 56, page 53, line 17, at end insert—
“(2A) The Secretary of State must make regulations providing for the security measures which must be complied with before persons may receive information from NUAR.”
Member’s explanatory statement
This is a probing amendment to question whether stringent security measures will be put in place to protect critical infrastructure from criminal and terrorist threats.
Viscount Camrose (Con)

My Lords, I am confident that, somewhere, there is a moral philosopher and legal scholar who can explain why this amendment is not part of the next group on NUAR but, in the meantime, my amendment addresses a fundamental issue. It would ensure that strict security measures are in place before any individual or organisation is allowed access to the sensitive information held on the National Underground Asset Register. The NUAR is a crucial tool for managing the UK’s underground infrastructure. It holds critical data about pipelines, cables and other assets that underpin vital services such as water, energy, telecommunications and transport.

This information, while essential for managing and maintaining infrastructure, is also a potential target for misuse. As such, ensuring the security of this data is not just important but vital for the safety and security of our nation. The information contained in the NUAR is sensitive. Its misuse could have disastrous consequences. If this data were to fall into the wrong hands, whether through criminal activities, cyberattacks or terrorism, it could be exploited to disrupt or damage critical infrastructure. I know that the Government take these risks seriously but this amendment seeks to address them further by ensuring that only those with a legitimate need, who have been properly vetted and who have met specific security requirements can access this data. We must ensure that the people accessing this register are trusted individuals or organisations that understand the gravity of handling this sensitive information and are fully aware of the risks involved.

The amendment would ensure that we have a framework for security—one that demands that the Secretary of State introduces clear, enforceable regulations specifying the security measures that must be in place before anyone can access the NUAR. These measures may include: background checks to ensure that those seeking access are trustworthy and legitimate; cybersecurity safeguards to prevent unauthorised digital access or breaches; physical security measures to protect the infrastructure where this information is stored; and clear guidelines on who should be allowed access and the conditions under which they can view this sensitive data.

The potential threats posed by unsecured access to the NUAR cannot be overstated. Criminals could exploit this information to target and disrupt key infrastructure systems. Terrorist organisations could use it to plan attacks on essential services, endangering lives and causing mass disruption. The stakes are incredibly high; I am sure that I do not need to convince noble Lords of that. In an era where digital and physical infrastructure are increasingly interconnected, the risks associated with unsecured access to information of the kind held in the NUAR are growing every day. This amendment would address this concern head on by requiring that we implement safeguards that are both thorough and resilient to these evolving threats. Of course, the cyber Bill is coming, but I wonder whether we need something NUAR-specific and, if so, whether we need it in this Bill. I beg to move.

Baroness Jones of Whitchurch (Lab)

I thank the noble Viscount for raising the issue of the National Underground Asset Register’s cybersecurity. As he said, Amendment 55 would require regulations setting out the security measures that must be complied with before persons may access NUAR data.

The noble Viscount is right: it is absolutely fundamental that NUAR data is protected, for all the reasons he outlined. It hosts extremely sensitive data. It is, of course, supported by a suite of sophisticated security measures, which ensure that access to data by a tightly prescribed set of users is proportionate. I hope that the noble Viscount understands that we do not want to spell out what all those security measures are at this point; he will know well enough the sorts of discussions and provisions that go on behind the scenes.

Security stakeholders, including the National Cyber Security Centre and the National Protective Security Authority, have been involved in NUAR’s development and are members of its security governance board, a dedicated board overseeing its protection. As I say, access to it occurs on a very tight basis. No one can just ask for access to the whole of the UK’s data on NUAR; it simply is not geared up to be operated in that way.

We are concerned that the blanket provision proposed in the amendment would lead to the publication of detailed security postures, exposing arrangements that are not public knowledge. It could also curtail the Government’s ability to adapt security measures when needed and, with support from security stakeholders, to accommodate changing circumstances—or, indeed, changing threats—that we become aware of. We absolutely understand why the noble Viscount wants that reassurance. I can assure him that it is absolutely the best security system we could possibly provide, and that it will be regularly scrutinised and updated; I really hope that the noble Viscount can take that assurance and withdraw his amendment.

Viscount Camrose (Con)

I thank the Minister for that answer. Of course, I take the point that to publish the security arrangements is somehow to advertise them, but I am not yet altogether reassured. I wonder whether there is something we can push further as part of a belt-and-braces approach to the NUAR security arrangements. We have talked about cybersecurity a lot this afternoon; all these things tend to create additional incentives for cyberattacks—if anything, NUAR does so the most.

Baroness Jones of Whitchurch (Lab)

If it helps a little, I would be very happy to write to the noble Viscount on this matter.

Viscount Camrose (Con)

Yes, that would be great. I thank the Minister. I beg leave to withdraw my amendment.

Amendment 55 withdrawn.
Clause 56 agreed.
Schedule 1 agreed.
Clauses 57 and 58 agreed.
Schedule 2 agreed.
Clauses 59 and 60 agreed.
Amendment 56
Moved by
56: After Clause 60, insert the following new Clause—
“Private sector consultation regarding NUAR
The Secretary of State must consult with relevant private sector organisations before implementing the provisions regarding the National Underground Asset Register.”
Member’s explanatory statement
This is a probing amendment to determine the level of Government consultation with the private sector regarding NUAR.
Lord Clement-Jones (LD)

My Lords, successive Governments have demonstrated their enthusiasm for NUAR. It was quite interesting to hear the Minister’s enthusiasm for the digitisation of the map of the underground, so to speak; she was no less enthusiastic than her predecessor. However, as the Minister knows, there are tensions between it—the new, bright, shiny NUAR—and LSBUD, or LinesearchbeforeUdig, which in some respects is the incumbent.

19:00
LSBUD believes that it has already achieved many of the aims of NUAR, and more cost-effectively. It has been in place for more than 20 years; it includes data from more than 150 asset owners and is used by 250,000 UK digging contractors and individuals. Its view is that the Government and NUAR are not currently working effectively with it and other industry actors and that, without aligning with proven industry best practice, NUAR poses a number of serious risks, including open-ended costs and unclear benefits. It believes that, without a plan to work more closely with LSBUD as the key industry representative, NUAR risks providing a poorer service than that currently on offer, potentially causing more accidental strikes on key network infrastructure.
LSBUD believes that its own system is working well and is popular with industry, but I do not think we can turn the clock back. Surely the Government must work in partnership with the safe digging organisations that have the experience and know-how, to ensure that the benefits currently enjoyed from existing best practice are not lost and to keep assets and digging workers safe. I very much hope that, even if the Government do not accept an amendment along these lines, they will give an assurance that LSBUD is being properly consulted and included in NUAR’s plans. Otherwise, it cannot be healthy to have two systems operating separately.
I come to my other amendments which, for some unknown reason, are grouped with the ones on NUAR. They are about NHS IT, so it must be infrastructure—I suppose that must be the common factor between them. I happen to have a speaking note on this. The whole intent of my Amendments 193 to 195 is to seek clarification on two critical aspects. First, these information standards should explicitly apply to IT providers involved in the processing of data within primary care as well as secondary care. Secondly, the standards must extend to existing contracts with IT providers, not just new agreements formed after the passage of this legislation.
The NHS operates as a complex ecosystem—I barely need to say that, standing next to the noble Lord, Lord Markham, given his previous occupation—and, in some cases, on legacy systems, which are there in spades. Any effort to improve data interoperability must account for these realities. Without retroactive application of these information standards to existing contracts, there is a risk that entrenched barriers to effective data sharing will persist. These barriers would undermine the goals of the legislation, delaying improvements in patient care and operational efficiency. In addition, if the application of these information standards is to extend beyond the interoperability of operational NHS data and provide secure access to the electronic health records that support the delivery of direct care, it is vital that this legislation makes explicit its application to IT providers within the primary care estate. I beg to move.
Viscount Camrose (Con)

I thank the noble Lord, Lord Clement-Jones, for these amendments. Amendment 56 is about NUAR and a requirement to consult first. I am not convinced that is necessary, because there is already a requirement to consult under Clause 60 and, perhaps more pertinently, NUAR is an industry-led initiative: it came out of an industry meeting and has been led by industry throughout. I am therefore not sure, even despite the requirement to consult, that much would come out of that consultation exercise.

In respect of other providers out there, LSBUD among them, when we were going through this exact debate in DPDI days, the offer I made—and I ask the Minister if she would consider doing the same—was to arrange a demonstration of NUAR to anyone who had not seen it. I have absolutely unshakeable confidence that anybody who sees NUAR in action will not want anything else. I am not a betting man, but—

Lord Clement-Jones (LD)

For the record, the noble Viscount is getting a vigorous nod from the Minister.

Viscount Camrose (Con)

We will see, but such a demonstration would certainly ease any perfectly reasonable concerns that might emerge. To put it in a more colourful way, this is Netflix in the age of Blockbuster Video.

The slightly different Amendments 193, 194 and 195 would clarify that these information standards explicitly apply to IT providers involved in the processing of data within primary as well as secondary care, and that the standards extend to existing contracts with providers, not just new agreements formed after this Act. I understand the point of these amendments, but I am slightly concerned about how the retroactivity would affect existing contractual agreements. I am also slightly concerned about the wish to hard-code certain conditions into rules that function best the more they are principles-based and the less they are tied to particular areas of technology. That said, I think I am persuadable, but I have not yet made that leap.

Lord Markham (Con)

I am not going to say much except to try to persuade my noble friend. I am absolutely with the intent of what the noble Lord, Lord Clement-Jones, is trying to do here and I understand the massive benefits that can be gained from it.

Baroness Jones of Whitchurch (Lab)

I am grateful to the noble Viscount for joining me in my enthusiasm for NUAR. He is right: having seen it in practice, I am a great enthusiast for it. If it is possible to demonstrate it to other people, I would be very happy to do so, because it is quite a compelling story when you see it working.

Amendment 56, in the name of the noble Lord, Lord Clement-Jones, would place a duty on the Secretary of State to consult relevant private sector organisations before implementing the NUAR provisions under the Bill. I want to make clear that the Geospatial Commission, which oversees NUAR, has been engaging with stakeholders on NUAR since 2018. Since then, there have been extensive reviews of existing processes and data exchange services, including a call for evidence, a pilot project, a public consultation and numerous workshops. A series of in-person focus groups was completed last week, and officials have visited commercial companies with specific concerns, including LinesearchbeforeUdig, so there has been extensive consultation with them.

I suppose one can understand why LSBUD feels slightly put out about NUAR appearing on the scene, but NUAR is a huge public asset. Once it is established, we can potentially use it in other ways for other services in the future, and we should celebrate the fact that we have managed to create it as a public asset. I say to the noble Lord, Lord Clement-Jones, that a further consultation on that basis would provide no additional benefit but would delay the realisation of the significant benefits that NUAR could deliver.

Moving on to the noble Lord’s other amendments, Amendments 193, 194 and 195, he is absolutely right about the need for data interoperability in the health service. We can all think of examples of where that would be of benefit to patients and citizens. It is also true that we absolutely need to ensure that our health and care system is supported by robust information standards. Again, we go back to the issue of trust: people need to know that those protections are there.

This is why we will ensure, through Clause 119 and Schedule 15, that suppliers of IT products and services used in the provision of health or adult social care in England are required to meet relevant information standards. In doing so, we can ensure that IT suppliers are held to account where information standards are not implemented. The application of information standards is independent of commercial organisations, and we will hold IT companies to them. Furthermore, the definition of healthcare set out in the Health and Social Care Act 2012, as amended by the Health and Care Act 2022, already ensures that all forms of healthcare are within the scope of information standards, which includes primary care. That was one of the other points that the noble Lord made.

As an add-on to this whole discussion, the noble Lord will know that the Government are developing the idea of a national data library, which would encourage further interoperability between government departments to make sure that we use data to improve services. Health and social care is the obvious one, but the members of the Committee can all think of all sorts of other ways in which government departments, if they collaborated on an interoperable basis, could drive up standards and make life easier for a whole lot of citizens in different ways. We are on the case and are absolutely determined to deliver it. I hope that, on that basis, the noble Lord will withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I am sorry to interrupt the Minister, but she has whetted our appetite about the national data library. It is not included in the Bill. We talked about it a little at Second Reading, but I wonder whether she can tell us a little more about what is planned. Is it to be set up on a statutory basis or is it a shadow thing? What substance will it actually have and how?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Well, details of it were in our manifesto, in as much as a manifesto is ever detailed. It is a commitment to deliver cross-departmental government services and create a means whereby some of the GDPR blockages that stop one department speaking to another can, where necessary, be freed up to make sure that people exchange data in a more positive way to improve services. There will be more details coming out. It is a work in progress at the moment and may well require some legislation to underpin it. There is an awful lot of work to be done in making sure that one dataset can talk to another before we can progress in any major way, but we are working at speed to try to get this new system up and running.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I thank the Minister for that, which was very interesting. We were talking about medical health IT and “GDPR blockages” almost has a medical quality to it. The embryonic national data library will obviously get some more mentions as we go through the Bill. It is a work in progress, so I hope that we will know more at the end of the Bill than we did at the beginning.

The Minister talked about datasets talking to each other. We will have to get the noble Viscount, Lord Camrose, to use other phrases, not just “Netflix in the age of Blockbuster” but something equally exciting about datasets talking to each other.

19:15
Clearly, there is a difference of perception around NUAR. I would be very happy to look at it, but that does not necessarily answer the question as to how we can settle the horses. I do not know whether it is purely commercial competition or whether it is a genuine feeling that there is a lack of safety if all the players are not involved, but there does seem to be something fundamental. LSBUD came back on the last Bill with very much the same points; several months have passed now, so there must be something which is concerning it. Will the Minister take that away and interrogate the system as to whether we can do better to ensure that there is consensus about how everybody goes forward?
I thank the Minister for the primary care point, which is extremely valuable and will be welcome to those who briefed me on these amendments. I do not think the question of retrospectivity was really dealt with. Some of the issues with legacy systems in the NHS are a real problem and the more pressure one can put to rectify some of those systems, the better. I suspect that the noble Lord, Lord Markham, was around when people were talking about the number of fax machines used in the NHS and things of that sort—I do not think any of that has gone away either. I suspect that a bit of retrospectivity would not go amiss in the circumstances. In the meantime, I beg leave to withdraw my amendment.
Amendment 56 withdrawn.
Clause 61: Form in which registers of births and deaths are to be kept
Amendment 57
Moved by
57: Clause 61, page 71, line 18, at end insert—
“(2A) The Registrar General must make provision to ensure the security of the registers of live-births, still-births, and deaths.”
Member’s explanatory statement
This is a probing amendment intended to ensure that suitable cyber-security measures are put in place to secure the larger attack surface of digital registers of live-births, still-births, and deaths.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, there is a great deal to be gained from digitising the registers of births, stillbirths and deaths. Not only does it reduce the number of physical documents that need to be maintained and kept secure but it means that people do not have to physically sign the register of births or deaths in the presence of a registrar. This will make people’s lives a great deal easier during those stressful periods of their lives.

However, digitising all this data—I am rather repeating arguments I made about NUAR and other things earlier—creates a much larger attack surface for people looking to steal personal data. This amendment explores how the Government will protect this data from malign actors. If the Minister could provide further detail on this, I would be most grateful.

This is a probing amendment and has been tabled in a constructive spirit. I know that we all want to harness the power of data and tech in this space and use it to benefit people’s lives but, particularly with this most personal of data, we have to take appropriate steps to keep it secure. Should there be a data breach, hackers would have access to an enormous quantity of personal data. Therefore, I suggest that, regardless of how much thought the Government have given this point up to now, the digitisation of these registers should not occur until substantial cybersecurity measures are in place. I look forward to the Minister’s comments.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

On Amendment 57, legislation is already in place to ensure the security of electronic registers. Articles 25 and 32 of the UK General Data Protection Regulation impose duties on controllers of personal data to implement appropriate technical and organisational measures, including security measures, so this already applies.

The electronic system has been in place for births and deaths since 2009, and all events have been registered electronically since that date, in parallel with the paper registers and with no loss of data. What is happening with this legislation is that people will not have to keep paper records any more; it is about the existing electronic system. The noble Lord will remember that, even so, this is up to registrars, but I think that the idea is that they will no longer have to keep the paper registers as well, which everybody felt was an unnecessary administrative burden.

Nevertheless, the system is subject to Home Office security regulations, and robust measures are in place to protect the data. There has been no loss of data or hacking of that data up to now. Obviously, we need to make sure that the security is kept up to date, but we think that it is a pretty robust system. It is the paper documents that are losing out here.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the Minister. I take the point that this has been ongoing for a while and that, in fact, the security is better because there is less reliance on the paper documents. That said, I am encouraged by her answer and encouraged that the Government continue to anticipate this growing risk and act accordingly. On that basis, I withdraw the amendment.

Amendment 57 withdrawn.
Clause 61 agreed.
Clauses 62 to 64 agreed.
Amendment 58
Moved by
58: After Clause 64, insert the following new Clause—
“Review of notification of changes of circumstances legislation
(1) The Secretary of State must commission a review of the operation of the Social Security (Notification of Changes of Circumstances) Regulations 2010.
(2) In conducting the review, the designated reviewer must—
(a) consider the current operation and effectiveness of the legislation;
(b) identify any gaps in its operations and provisions;
(c) consider and publish recommendations as to how the scope of the legislation could be expanded to include non-public sector, voluntary and private sector holders of personal data.
(3) In undertaking the review, the reviewer must consult—
(a) specialists in data sharing;
(b) people and organisations who campaign for the interests of people affected by, and use, the legislation;
(c) any other persons and organisations the reviewer considers appropriate.
(4) The Secretary of State must lay a report of the review before each House of Parliament within six months of this Act coming into force.”
Member’s explanatory statement
This amendment requires a review of the operation of the ‘Tell Us Once’ programme—which seeks to provide simpler mechanisms for citizens to pass information regarding births and deaths to government—and consider whether the pioneering progress of Tell Us Once could be extended to non-public sector holders of data.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, of course I welcome the fact that the Bill will enable people to register a death in person and online, which was a key recommendation from the UK Commission on Bereavement. I have been asked to table this amendment by Marie Curie; it is designed to achieve improvements to UK bereavement support services, highlighting the significant administrative burden faced by bereaved individuals.

Marie Curie points to the need for a review of the existing Tell Us Once service and the creation of a universal priority service register to streamline death-related notifications across the government and private sectors. It argues that the Bill presents an opportunity to address these issues through improved data-sharing and online death registration. Significant statistics illustrate the scale of the problem, showing a large percentage of bereaved people struggling with numerous administrative tasks. It urges the Government, as I do, to commit to implementing those changes to reduce the burden on bereaved families.

Bereaved people face many practical and administrative responsibilities and tasks after a death, which are often both complex and time sensitive. This Bill presents an opportunity to improve the way in which information is shared between different public and private service providers, reducing the burden of death administration.

When someone dies, the Tell Us Once service informs the various parts of national and local government that need to know. That means the local council stops charging council tax, the DVLA cancels the driving licence, the Passport Office cancels the passport, et cetera. Unfortunately, Tell Us Once is currently not working across all government departments and does not apply to Northern Ireland. No updated equality impact assessment has ever been undertaken. While there are death notification services in the private sector, they are severely limited by not being a public service programme—and, as a result, there are associated user costs, adding to bereaved people’s financial burden and penalising the families who are struggling most. There is low public awareness and take-up of all these services, as well as variable and inconsistent provision by the different companies. The fact that there is not one service for all public and private sector notifications means that dealing with the deceased’s affairs is still a long and painful process.

The Bill should be amended to require Ministers to carry out a review into the current operation and effectiveness of the Tell Us Once service, to identify any gaps in its operation and provisions and to make recommendations as to how the scope of the service could be expanded. Priority service registers are voluntary schemes which utility companies create to ensure that extra help is available to certain vulnerable customers. The previous Government recognised that the current PSRs are disjointed, resource intensive and duplicative for companies, carry risks of inconsistency and can be “burdensome for customers”.

That Government concluded that there is “significant opportunity to improve the efficiencies and delivery of these services”. The Bill is an opportunity for this Government to confirm their commitment to implementing a universal priority services register and delivering any legislative measures required to facilitate it. A universal PSR service must include the interests of bereaved people within its scope, and charitable voluntary organisations such as Marie Curie, which works to support bereaved people, should be consulted in its development.

I have some questions to the Minister. First, what measures does this Bill introduce that will reduce the administrative burden on bereaved people after the death of a loved one? Secondly, the Tell Us Once service was implemented in 2010 and the original equality impact assessment envisaged that its operation should be kept under review to reflect the changing nature of how people engage with public services, but no review has ever happened. Will the Minister therefore commit the Government to undertake a review of Tell Us Once? Thirdly, the previous Government’s Smarter Regulation White Paper committed to taking forward a plan to create a “shared once” support register, which would bring together priority service registers. Will the Minister commit this Government to taking that work forward? I beg to move.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, it occurred to me when the noble Lord was speaking that we had lost a valuable member of our Committee. This could not be the noble Lord, Lord Clement-Jones, who was speaking to us just then. It must have been some form of miasma or technical imposition. Maybe his identity has been stolen and not been replaced. Normally, the noble Lord would have arrived with a short but punchy speech that set out in full how the new scheme was to be run, by whom, at what price, what its extent would be and the changes that would result. The Liberal future it may have been, but it was always delightful to listen to. I am sad that all the noble Lord has asked for here is a modest request, which I am sure the noble Baroness will want to jump to and accept, to carry out a review—as if we did not have enough of those.

Seriously, I once used the service that we have been talking about when my father-in-law died, and I found it amazing. It was also one that I stumbled on and did not know about before it happened. Deaths did not happen often enough in my family to make me aware of it. But, like the noble Lord, Lord Clement-Jones, I felt that it should have done much more than what it did, although it was valuable for what it did. It also occurred to me, as life moved on and we produced children, that there would be a good service when introducing a new person—a service to tell you once about that, because the number of tough issues one has to deal with when children are born is also extraordinary and can be annoying, if you miss out on one—particularly with the schooling issues, which are more common these days than they were when my children were being born.

I endorse what was said, and regret that the amendment perhaps did not go further, but I hope that the Minister when she responds will have good news for us.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for raising this, and the noble Lord, Lord Stevenson, for raising the possibility that we are in the presence of a digital avatar of the noble Lord, Lord Clement-Jones. It is a scary thought, indeed.

The amendment requires a review of the operation of the Tell Us Once programme, which seeks to provide a simpler mechanism for citizens to pass information regarding births and deaths to the Government. It considers whether the pioneering progress of Tell Us Once could be extended to non-public sector holders of data. When I read the amendment, I was more cynical than I am now, having heard what the noble Lord, Lord Clement-Jones, had to say. I look forward to hearing the Minister’s answers. I take the point from the noble Lord, Lord Stevenson, that we do not necessarily need another review—but now that I have heard about it, it feels a better suggestion than I thought it was when reading about it.

I worry that expanding this programme to non-public sector holders of data would be a substantial undertaking; it would surely require the Government to hold records of all the non-public sector organisations that have retained and processed an individual’s personal data. First, I am not sure that this would even be possible—or practicable, anyway. Secondly, I am not sure that it would end up being an acceptable level of state surveillance. I look forward to hearing the Minister’s response but I am on the fence on this one.

19:30
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, the noble Lord, Lord Clement-Jones, explained very well how the Tell Us Once service works. It is a really important asset for bereaved citizens who would otherwise have to notify all sorts of different departments right across government of a death. It is a lifesaver for those who are struggling with providing all that information; we should put on record our thanks to Marie Curie and others for helping to create it and promote it so well.

I think I have gone on the record in the past—I thought that the noble Lord, Lord Clement-Jones, was going to dig out another one of my previous speeches—on this issue. I seem to remember making a very positive speech on the importance of the Tell Us Once service when we debated the previous Bill.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I only come up with the really positive ones.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

We support this service, of course—we can see the potential for expanding it further if we get this measure right—but I have to tell noble Lords that the current service is not in great shape in terms of its technology. It has suffered from insufficient investment over time and it needs to be improved before we can take it to the next stage of its potential. We consider that the best way to address this issue is, first, to upgrade its legacy technology, which is what we are operating at the moment. I realised that this is a problem only as I took over this brief; I had assumed that it would be more straightforward, but the problem seems to be that we are operating on ancient technology here.

Work is already under way to try to bring it all up to date. We are looking to improve the current service and at the opportunities to extend it to more of government. Our initial task is to try to extend it to some of the government departments that do not recognise it at the moment. Doing that will inform us of the potential limitations and the opportunities should we wish to extend it to the private sector in future. I say to the noble Lord that this will have to be a staged process because of the technological challenges that we currently have.

We are reluctant to commit to a review and further expansion of the service at this time but, once the service is updated, we would absolutely be happy to talk to noble Lords and revisit this issue, because we see the potential of it. The update is expected to be completed in the next two years; I hope that we will be able to come back and give a progress report to noble Lords at that time. However, I have to say, this is what we have inherited—bear with us, because we have a job to do in bringing it up to date. I hope that, on that basis, the noble Lord will withdraw his amendment, albeit reluctantly.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for that response, and I thank the noble Lord, Lord Stevenson—at least, I think I do—for his contribution.

I have clearly worked on far too many Bills in the past. I have to do better when I move amendments like this. I have to bring the full package, but we are allowed to speak for only a quarter of an hour, so we cannot bring everything to the table. All I can promise the noble Viscount is that my avatar will haunt him while he is sitting on the fence.

I thank the Minister for giving a sympathetic response to this, but clearly there are barriers to rolling out anything beyond where we have got to. I was rather disappointed by two years because I was formulating a plan to bring back an Oral Question in about six months’ time. I am afraid that she may find that we are trying to hurry her along a little on this. I recognise that there are technology issues, but convening people and getting broader engagement with various players is something that could be done without the technology in the first instance, so the Minister can expect follow-up on this front rather earlier than two years’ time. She does not have the luxury of waiting around before we come back to her on it, but I thank her because this is a fantastic service. It is limited, but, as far as it goes, it is a godsend for the bereaved. We need to make sure that it improves and fulfils its potential across the private sector as well as the public sector. In the meantime, I beg leave to withdraw my amendment.

Amendment 58 withdrawn.
Clause 65 agreed.
Schedule 3 agreed.
Clause 66 agreed.
Committee adjourned at 7.36 pm.
Committee (2nd Day)
15:45
Relevant documents: 3rd Report from the Constitution Committee and 9th Report from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
Baroness McIntosh of Hudnall Portrait The Deputy Chairman of Committees (Baroness McIntosh of Hudnall) (Lab)
- Hansard - - - Excerpts

My Lords, I remind the Committee that if there is a Division in the Chamber, the Committee will adjourn for 10 minutes from the sound of the Division Bells.

Clause 67: Meaning of research and statistical purposes

Amendment 59

Moved by
59: Clause 67, page 75, line 9, after “processing” insert “solely”.
Member’s explanatory statement
This amendment prevents misuse of the scientific research exceptions for data reuse by ensuring that the only purpose for which the reuse is permissible is for the scientific research—with no additional purposes.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

I have tabled Amendments 59, 62, 63 and 65, and I thank the noble Lord, Lord Clement-Jones, my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their names to them. I am sure that the Committee will agree that these amendments have some pretty heavyweight support. I also support Amendment 64, in the name of the noble Lord, Lord Clement-Jones, which is an alternative to my Amendment 63. Amendments 68 and 69 in this group also warrant attention.

I very much support the Government’s aim in Clause 67 to ensure that valuable research does not get discarded due to a lack of clarity around its use or because of an overly narrow distinction between the original and new purposes of the use of the data. The Government’s position is that this clause clarifies the law by incorporating into the Bill recitals to the original GDPR. However, while the effect is to encourage scientific research and development, it has to be seen in the context of the fast-evolving world of developments in AI and the way that AI developers, given the need for huge amounts of data to train their large language models, are reusing data.

My concern is that the scraping of vast amounts of data by these AI companies is often positioned as scientific research and in some cases is even supported by the production of academic papers. I ask the Minister to understand my concerns and those of many in the data community and beyond. The fact is that the lines between scientific research, as set out in Clause 67, and AI product development are blurred. This might not be the concern of the original recitals, but I beg to suggest to the Minister that, in the new world of AI, there should be concern about the definition presented in the Bill.

Like other noble Lords, I very much hope to make this country a centre of AI development, but I do not want this to happen at the expense of data subjects’ privacy and data protection. It costs at least £1 billion—even more, sometimes—to develop a large language model and, although the cost will soon go down, there is a huge financial incentive to scrape data that pushes the boundaries of what is legitimate. In this climate, it is important that the Bill closes any loopholes that allow AI developers to claim the protections offered by Clause 67. My Amendments 59, 62, 63 and 65 go some way to ensuring that this will not happen.

The definition of scientific research in proposed new paragraph 2, in Clause 67(1)(b), is drawn broadly. My concern is that many commercial developments of digital products, particularly those involving AI, could still claim to be, in the words of the clause, “reasonably … described as scientific”. AI model development usually involves a mix of purposes—not just developing a model’s capabilities but also commercialising services as it develops. The exemption allowed for “purposes of technological development” concerns me: this vague area creates a threat whereby AI developers will misuse the provisions of the Bill to reuse personal data for any AI developments, provided that one of their goals is technological advancement.

Amendments 59 and 62, by inserting the word “solely” into proposed new paragraphs 2 and 3 in Clause 67, would disaggregate reuse of data for scientific research purposes from other purposes, ensuring that the only goal of reuse is scientific research.

An example of the threat under the present definition is shown by Meta’s recently allowing the reuse of Instagram users’ data to train its new generation of Llama models. When the news got out, it created a huge backlash, with more than half a million people reposting a viral hoax image that claimed to deny Meta the right to reuse their data to train AI. This caused the ICO to say that it was pleased that Meta had paused its data processing in response to users’ concerns, adding:

“It is crucial that the public can trust that their privacy rights will be respected from the outset”.


However, Meta could well claim under this clause that it is creating technological advancement which would allow it to reuse any data collected by users under the legitimate interest grounds for training the model. The Bill as it stands would not require the company to conduct its research in accordance with any of the features of genuine scientific research. These amendments go some way to rectify that.

Amendment 63 strengthens the test for what is deemed to be in the scientific interest. At the moment, the public interest test is applied only to public health. I am pleased that NHS researchers will have to recognise this threshold, but why should all researchers doing scientific work not have to adhere to it? Why should that test not be applied to all data reuse for scientific research? Deleting the public health exception would apply the public interest test to all data reuse for scientific purposes.

The original intention of the research, archiving and statistics (RAS) purposes of the GDPR supports public health as a scientific interest. This is complemented by Amendment 65, which uses the tests for consent already laid out in Clause 68. The inclusion of ethical thresholds in the reuse of data should meet the highest levels of academic rigour and oversight envisaged in the original GDPR. It will demand not just ethical standards in research but also that the research be supervised by an independent research ethics committee that meets UKRI guidance. These requirements will ensure that the high standards of ethics that we expect from scientific research will be applied in evaluating the exemption in Clause 67.

I do not want noble Lords to think that these amendments are thwarting the development of AI. There is plenty of AI research that is clearly scientific. Look at DeepMind’s AlphaFold, which uses AI to analyse the shapes of proteins so that they can be incorporated into future drug treatments, moving pharmaceutical development forward. It is an AI model developed in accordance with the ethical standards expected of modern scientific research.

The Minister will argue that the definition has been taken straight from EU recitals. I therefore ask her to consider very seriously what has been said about this definition by the EU’s premier data body, the European Data Protection Supervisor, in its preliminary opinion on data protection and scientific research. In its executive summary, it states:

“The boundary between private sector research and traditional academic research is blurrier than ever, and it is ever harder to distinguish research with generalisable benefits for society from that which primarily serves private interests. Corporate secrecy, particularly in the tech sector, which controls the most valuable data for understanding the impact of digitisation and specific phenomena like the dissemination of misinformation, is a major barrier to social science research … there have been few guidelines or comprehensive studies on the application of data protection rules to research”.


It suggests that the rules should be interpreted in such a way that permits reuse only for genuine scientific research.

For the purpose of this preliminary opinion by the EDPS, the special data protection regime for scientific research is understood to apply if each of three criteria is met: first, personal data is processed; secondly, relevant sectorial standards of methodology and ethics apply, including the notion of informed consent, accountability and oversight; and, thirdly, the research is carried out with the aim of growing society’s collective knowledge and well-being, as opposed to serving primarily one or several private interests. I hope that noble Lords will recognise that these are features that the amendments before the Committee would incorporate into Clause 67.

In the circumstances, I hope that the Minister, who I know has thought deeply about these issues, will recognise that the EU’s institutions are worried about the definition of scientific research that has been incorporated into the Bill. If they are worried, I suggest that we should be worried. I hope that these amendments will allay those fears and ensure that true scientific research is encouraged by Clause 67 and that it is not abused by AI companies. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I support the amendments from the noble Viscount, Lord Colville, which I have signed, and will put forward my Amendments 64, 68, 69, 130 and 132 and my Clause 85 stand part debate.

This part of the GDPR is a core component of how data protection law functions. It makes sure that organisations use personal data only for the reason that it was collected. One of the exceptional circumstances is scientific research. Focus on the definitions and uses of data in research increased in the wake of the Covid-19 pandemic, when some came to the view that legal uncertainty and related risk aversion were a barrier to clinical research.

There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or very narrow distinctions between the original and new purpose. The Government’s position seems to be that the Bill will only clarify the law, incorporating recitals to the original GDPR in the legislation. While this may be the policy intention, the Bill must be read in the context of recent developments in artificial intelligence and the practice of AI developers.

The Government need to provide reassurance that the intention and impact of the research provisions are not to enable the reuse of personal data, as the noble Viscount said, scraped from the internet or collected by tech companies under legitimate interest for training AI. Large tech companies could abuse the provisions to legitimise the mass scraping of personal data from the internet, or of data collected under legitimate interest—for example, by a social media platform about its users. Such data could be legally reused for training AI systems under the new provisions if developers can claim that it constitutes scientific research. That is why we very much support what the noble Viscount said.

In our view, the definition of scientific research adopted in the Bill is too broad and will permit abuse by commercial interests outside the policy intention. The Bill must recognise the reality that companies will likely position any AI development as “reasonably described as scientific”. Combined with the inclusion of commercial activities in the Bill, that opens the door to data reuse for any data-driven product development under the guise that it represents scientific research, even where the relationship to real scientific progress is unclear or tenuous. That is not excluded in these provisions.

I turn to Amendments 64, 68, 69, 130 and 132 and the Clause 85 stand part debate. The definition of scientific research in proposed new paragraph 2 under Clause 67(1)(b) is drawn so broadly that most commercial development of digital products and services, particularly those involving machine learning, could ostensibly be claimed by controllers to be “reasonably described as scientific”. Amendment 64, taken together with those tabled by the noble Viscount that I have signed, would radically reduce the scope for misuse of data reuse provisions by ensuring that controllers cannot mix their commercial purposes with scientific research and that such research must be in the public interest and conducted in line with established academic practice for genuine scientific research, such as ethics approval.

Since the Data Protection Act was introduced in 2018, based on the 2016 GDPR, the education sector has seen an enormous expansion of state and commercial data collection, partly normalised during the pandemic, of increasing volume, sensitivity, intrusiveness and risk. Children need particular care in view of the special environment of educational settings, where pupils and families are disempowered and have no choice over the products procured, which they are obliged to use for school administrative purposes, for learning in the classroom, for homework and for digital behavioural monitoring.

The implications of broadening the definition of research activities conducted within the state education sector include questions about the appropriateness of applying the same rules where children are in a compulsory environment, without agency or routine research ethics oversight, particularly if the definition is expanded to commercial activity.

Parental and family personal data is often inextricably linked to the data of a child in education, such as home address, heritable health conditions or young carer status. The Responsible Technology Adoption Unit within DSIT commissioned research with the Department for Education to understand how parents and pupils feel about the use of AI tools in education and found that, while parents and pupils did not expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement.

16:00
Businesses already routinely conduct trials of, or profit from, children’s use of educational technology for product development, without children’s knowledge or parental permission. This is contrary to the UNCRC Article 32 principle of a right to protection from economic exploitation, and to the expectations of consultation that this public engagement work suggests parents want.
I turn to Amendments 68 and 69. There is a danger of what can be described as a clubcard culture of sharing data—however useful and without consideration of the data subject—permeating this Government’s approach to data. These amendments probe whether a researcher who is self-described as scientific would be able to use the data of those who have objected to their data being used in that way. They add safeguards to Clause 68 to ensure that confidence in research and government uses of data is maintained. They are designed to make it clear that, when the purpose limitations are changed, a choice must be offered to data subjects, and to ensure that existing data subject dissents are respected and cannot be ignored.
On the clause stand part notice, Clause 85, despite its title, actually removes safeguards on the use of data for research purposes, as the noble Viscount mentioned and as I explained. The powers in the clause, particularly in new Article 84D, provide wide discretion to the Secretary of State without meaningful parliamentary scrutiny. These powers, as the noble Viscount has mentioned, were identified by EU stakeholders as a main source of concern regarding the continuation of the UK adequacy decision, a review of which is due in 2025—as we have referred to throughout proceedings. The risks these powers pose to the UK adequacy decision are more than hypothetical. If the need to establish a delegated legislative power is justified, it needs to be subject to clear restraints and the Secretary of State should not be given unfettered discretion to override the rights and freedoms of individuals under the GDPR.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I will speak to Amendments 59, 62, 63 and 65 in the name of my noble friend Lord Colville, and Amendment 64 in the name of the noble Lord, Lord Clement-Jones, to which I added my name. I am also very much in sympathy with the other amendments in this group more broadly.

My noble friend Lord Colville set out how he is seeking to understand what the Government intend by “scientific research” and to make sure that the Bill does not offer a loophole so big that any commercial company can avoid data protections of UK citizens in the name of science.

At Second Reading, I read out a dictionary definition of science:

“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”—


i.e. everything. I also ask the Minister if the following scenarios could reasonably be considered scientific. Is updating or improving a new tracking app for fitness, or a bot for an airline, scientific? Is the behavioural science of testing children’s response to persuasive design strategies in order to extend the stickiness of commercial products scientific? These are practical scenarios, and I would be grateful for an answer in order to understand what is in and out of the scope of the Bill.

When I raised Clause 67 at a briefing meeting, it was said that it was, as my noble friend Lord Colville suggested, just housekeeping. The law firm Taylor Wessing suggests that what can

“‘reasonably be described as scientific’ is arguably very wide and fairly vague, so it will be interesting to see how this is interpreted, but the assumption is that it is intended to be a very broad definition”.

Each of the 14 law firm blogs and briefings that I read over the weekend described it variously as loosening, expanding or broadening. Not one suggested that it was a tightening and not one said that it was a no-change change. As we have heard, the European Data Protection Supervisor published an opinion stating that

“scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.

When the Minister responds, perhaps she could say whether the particular scenarios I have set out fall within the definition of scientific and why the Government have failed to reflect the critical clarification of the European Data Protection Supervisor in transferring the recital into the Bill.

I turn briefly to Amendment 64, which would limit the use of children’s personal data for the purposes of research and education by making it subject to a public interest requirement and opt-in from the child or a parent. I will speak in our debate on a later grouping to amendments that would enshrine children’s right to higher protection and propose a comprehensive code of practice on the use of children’s data in education, which is an issue of increasing scandal and concern. For now, it would be good to understand whether the Government agree that education is an area of research where a public interest requirement is necessary and appropriate and that children’s data should always be used to support their right to learn, rather than to commoditise them.

During debate on the DPDI Bill, a code of practice on children’s data and scientific research was proposed; the Minister added her name to it. It is by accident rather than by design that I have failed to lay it here, but I will listen carefully to the Minister’s reply to see whether children need additional protections from scientific research as the Government now define it.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, I have in subsequent groups a number of amendments that touch on many of the issues that are raised here, so I will not detain the Committee by going through them at this stage and repeating them later. However, I feel that, although the Government have had the best intentions in bringing forward a set of proposals in this area that were to update and to bring together rather conflicting and difficult pieces of legislation that have been left because of the Brexit arrangements, they have managed to open up a gap between where we want to be and where we will be if the Bill goes forward in its present form. I say that in relation to AI, which is a subject requiring a lot more attention and a lot more detail than we have before us. I doubt very much whether the Government will have the appetite for dealing with that in time for this Bill, but I hope that at the very least—it would be a minor concession at this stage—they will commit at the Dispatch Box to seeking to resolve these issues in the legislation within a very short period because, as we have heard from the arguments made today, it is desperately needed.

More importantly, if, by bringing together documentation that is thought to represent the current situation, either inadvertently or otherwise, the Government have managed to open up a loophole that will devalue the way in which we currently treat personal data—I will come on to this when I get to my groups in relation to the NHS in particular—that would be a grievous situation. I hope that, going forward, the points that have been made here can be accommodated in a statement that will resolve them, because they need to be resolved.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as adviser to Socially Recruited, an AI business.

I support the noble Viscount, Lord Colville, in his amendments and all the other amendments in this group. They were understandably popular, to the extent that when I got my pen out, there was no space left for me to co-sign them, so I was left with the oral tradition in which to reflect my support for them. Before going into the detail, I just say that we have had three data Bills in just over three years: DPDI, DISD and this Bill. Over that period, though the names have changed, much of the meat remains the same in the legislation. Yet, in that period, everything and nothing has changed—everything in terms of what has happened with generative AI.

Considering that seismic shift that has occurred over these three Bills, could the Minister say what in this Bill specifically has changed, not least in this part, to reflect that seismic change? Regarding “nothing has changed”, nothing has changed in terms of the incredibly powerful potential of AI for positive or negative outcomes, ably demonstrated with this set of amendments.

If you went on to Main Street and polled the public, I believe that you would get a pretty clear understanding of what they considered scientific research to be. You know it. You understand why we would want to have a specified definition of scientific research and what that would mean for the researchers and for the country.

However, if we are to draw that definition as broadly as it currently is in the Bill, why would we bother to have such a definition at all? If the Government’s intention is to enable so much to come within the perimeter, let us not have the definition at all and let us allow what is happening right now to continue, not least in the reuse of scraped data or in how data is being treated in these generative AI models.

We have seen what has happened in terms of the training, but when you look at what could be called development and improvement, as the noble Viscount has rightly pointed out, all this and more could easily fit within the scientific research definition. It could even more easily fit in when lawyers are deployed to ensure that that is so. I know we are going to come on to rehearsing a number of these subjects in the next group but, for this group, I support all the amendments as set out.

I ask the Minister these two questions. First, what has changed in all the provisions that have gone through all these three iterations of the data Bill? Secondly, what is the Government’s intention when it comes to scientific research, if it is not truly to mean scientific research, if it is not to have ethics committee involvement and if it is not to feel sound and be defined as what most people on Main Street would recognise as scientific research?

Lord Markham Portrait Lord Markham (Con)
- Hansard - - - Excerpts

I start by apologising because, due to a prior commitment, I am not able to stay for many of the proceedings today, but I see these groupings and others as critical. In the few words that I will say, I hope to bring to bear to this area some of my experience as a Health Minister, particularly in charge of technology and development of AI.

I can see a lot of good intent behind these clauses, to make sure that we do not stop a lot of the research that we need. I was recently very much involved in the negotiation of the pandemic accord regarding the next pandemic and how you make sure that any vaccines that you develop on a worldwide basis can be distributed on a worldwide basis as well. One of the main stumbling blocks was that the so-called poorer countries were trying to demand, as part of that, the intellectual property to be able to develop the vaccines in their own countries.

The point we were trying to make was that, although we could see the good intentions behind that, it would have a real chilling effect on pharmaceutical companies investing the hundreds of millions or even billions of pounds, which you often need with vaccines, to find a cure, because if they felt that they were going to lose their intellectual property and rights at the end, it would be much harder for them to justify the investment up front.

16:15
One thing that got me excited about the potential of AI in the health space was said by some Harvard professors. For years and years, we have not been able to make any inroads into dementia because we just do not know any of the causes and what we are trying to go after. The reason we were able to get a Covid vaccine so quickly was that we knew exactly what we were trying to attack. With dementia, we do not have those avenues of attack, but the professors said that if you take the data we have in the UK—yes, it would involve scraping—and look at the people who are suffering from dementia today, wind the clock back 10, 15 or 20 years and look at what they were seeing their GP about, you will start to see some of the early warning indicators. If you throw all that at AI, you might suddenly have whole new avenues of attack, because it identifies patterns that you did not realise existed.
There absolutely are scientific research reasons for doing that, and it is done from a very tricky position; but of course, the pharmaceutical companies would do it for the commercial benefit, because if you can find a cure or something to ameliorate the progression of dementia, that would obviously be incredibly valuable. It is about getting that balance right.
We have the best data in the world, and with that, we absolutely have the opportunity here to be the Silicon Valley of the life sciences world. My fear about that, which noble Lords have heard me say many times, is that, if we are too restrictive, all we will get is the offshoring of that research. As a result, we would lose out at a clinical level, because the clinical trials and everything that follows would not take place in our hospitals. As a result, we would not get the treatments as quickly, and we would lose out on the commercial value as well.
There is a balance, and we have done a lot of research because we realise that if you are going to do it in the health space, you need to bring the public with you. A lot of research was done earlier in the year, with a lot of public engagement sessions, asking people how they felt about their data being used for different benefits. The broad findings—it would be well worth digging them out, as I am doing this partially from memory—were that people were okay if there was a commercial benefit from the use of their data. They wanted there to be a scientific angle and a research benefit as well, but they were okay with the commercial benefit provided that the health service was benefiting from it, including financially. They wanted to make sure that the NHS would also get the benefit of the value of that data. As I said, I am doing this from memory, but there was quite widespread support for that, of around 60% to 70%.
That shows that it is possible to do this in the right way and to bring the public with us. I totally understand that we do not want people completely scraping the data and using it purely for commercial purposes, but there is a fear that if we swing the pendulum too far the other way, there will be a chilling effect on some of this research work, which will happen only if there is also a commercial benefit at the end of the road. Finding that balance is the key thing we need to do in this Bill.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I start by thanking all noble Lords who spoke for their comments and fascinating contributions. We on these Benches share the concern of many noble Lords about the Bill allowing the use of data for research purposes, especially scientific research purposes.

Amendment 59 has, to my mind, the entirely right and important intention of preventing misuse of the scientific research exemption for data reuse by ensuring that the only purpose for which the reuse is permissible is scientific research. Clearly, there is merit in this idea, and I look forward to hearing the Minister give it due consideration.

However, there are two problems with the concept and definition of scientific research in the Bill overall, and, again, I very much look forward to hearing the Government’s view. First, I echo the important points raised by my noble friend Lord Markham. Almost nothing in research or, frankly, life more broadly, is done with only one intention. Even the most high-minded, curiosity-driven researcher will have at the back of their mind the possibility of commercialisation. Alongside protecting ourselves from the cynical misuse of science as a cover story for commercial pursuit, we have to be equally wary of creating law that pushes for the complete absence of the profit motive in research, because to the extent that we succeed in doing that, we will see less research. Secondly—the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, made this point very powerfully—I am concerned that the broad definition of scientific research in the Bill might muddy the waters further. I worry that, if the terminology itself is not tightened, restricting the exemption might serve little purpose.

On Amendment 62, to which I have put my name, the same arguments very much apply. I accept that it is very challenging to find a form of words that both encourages research and innovation and does not do so at the expense of data protection. Again, I look forward to hearing the Government’s view. I am also pleased to have signed Amendment 63, which seeks to ensure that personal data can be reused only if doing so is in the public interest. Having listened carefully to some of the arguments, I feel that the public interest test may be more fertile ground than a kind of research motivation purity test to achieve that very difficult balance.

On Amendment 64, I share the curiosity to hear how the Minister defines research and statistical processes—again, not easy, but I look forward to her response.

Amendment 65 aims to ensure that research seeking to use the scientific research exemption to obtaining consent meets the minimum levels of scientific rigour. The aim of the amendment is, needless to say, excellent. We should seek to avoid creating opportunities which would allow companies—especially but not uniquely AI labs—to cloak their commercial research as scientific, thus reducing the hoops they must jump through to reuse data in their research without explicit consent. However, Amendment 66, tabled in my name, which inserts the words:

“Research considered scientific research that is carried out as a commercial activity must be subject to the approval of an independent ethics committee”,


may be a more adaptive solution.

Many of these amendments show that we are all quite aligned in what we want but that it is really challenging to codify that in writing. Therefore, the use of an ethics committee to make these judgments may be the more agile, adaptive solution.

I confess that I am not sure I have fully understood the mechanism behind Amendments 68 and 69, but I of course look forward to the Minister’s response. I understand that they would essentially mean consent by failing to opt out. If so, I am not sure I could get behind that.

Amendment 130 would prevent the processing of personal data for research, archiving and statistical purposes if it permits the identification of a living individual. This is a sensible precaution. It would prevent the sharing of unnecessary or irrelevant information and protect people’s privacy in the event of a data breach.

Amendment 132 appears to uphold existing patient consent for the use of their data for research, archiving and statistical purposes. I just wonder whether this is necessary. Is that not already the case?

Finally, I turn to the Clause 85 stand part notice. I listened carefully to the noble Lord, Lord Clement-Jones, but I am not, I am afraid, at a point where I can support this. There need to be safeguards on the use of data for this purpose; I feel that Clause 85 is our way of having them.

Baroness Jones of Whitchurch Portrait The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Information and Technology (Baroness Jones of Whitchurch) (Lab)
- Hansard - - - Excerpts

My Lords, it is a great pleasure to be here this afternoon. I look forward to what I am sure will be some excellent debates.

We have a number of debates on scientific research; it is just the way the groupings have fallen. This is just one of several groupings that will, in different ways and from different directions, probe some of these issues. I look forward to drilling down into all the implications of scientific research in the round. I should say at the beginning—the noble Lord, Lord Markham, is absolutely right about this—that we have a fantastic history of and reputation for doing R&D and scientific research in this country. We are hugely respected throughout the world. We must be careful that we do not somehow begin to demonise some of those people by casting aspersions on a lot of the very good research that is taking place.

A number of noble Lords said that they are struggling to know what the definition of “scientific research” is. A lot of scientific research is curiosity driven; it does not necessarily have an obvious outcome. People start a piece of research, either in a university or on a commercial basis, and they do not quite know where it will lead them. Then—it may be 10 or 20 years later—we begin to realise that the outcome of their research has more applications than we had ever considered in the past. That is the wonderful thing about human knowledge: as we build and we learn, we find new applications for it. So I hope that whatever we decide and agree on in this Bill does not put a dampener on that great aspect of human knowledge and the drive for further exploration, which we have seen in the UK in life sciences in particular but also in other areas such as space exploration and quantum. Noble Lords could probably identify many more areas where we are increasingly getting a reputation for being at the global forefront of this thinking. We have to take the public with us, of course, and get the balance right, but I hope we do not lose sight of the prize we could have if we get the regulations and legislation right.

Let me turn to the specifics that have been raised today. Amendments 59 and 62 to 65 relate to the scientific research provisions, and the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and others have commented on them. I should make it clear that this Bill is not expanding the meaning of “scientific research”. If anything, it is restricting it, because the reasonableness test that has been added to the legislation—along with clarification of the requirement for research to have a lawful basis—will constrain the misuse of the existing definition. The definition is tighter, and we have done this to make sure that new developments and technologies coming on stream will fall clearly within the constraints we are putting forward in the Bill today.

Amendments 59 and 62 seek to prevent misuse of the exceptions for data reuse. I assure the noble Viscount, Lord Colville, that the existing provisions for research purposes already prevent the controller taking advantage of them for any other purpose they may have in mind. That is controlled.

16:30
On Amendment 63, also from the noble Viscount, Lord Colville, scientific research has a privileged position in the UK GDPR because it is generally considered to be in the public interest. Requiring individual researchers to spend time formally assessing the value of their research may lead to uncertainty and discourage new research, particularly research of the kind I described earlier. The researchers may not know the route they want to take or the outcome; it is therefore quite difficult to say, at the beginning of that research, that it is definitely in the public interest. Conversely, all sorts of things that were carried out for completely different purposes turn out to be in the public interest, so it is not as clear cut as noble Lords have been suggesting.
Currently, the extra step is required only where there may be a higher risk, such as for those wishing to process sensitive data using the research condition under Schedule 1 to the Data Protection Act 2018. I can assure the Committee that even if research is scientific, that is not enough. Controllers would still need to find a lawful basis and perform the balancing test to see whether legitimate interests could apply.
Noble Lords have raised their concern about AI research and web crawling. The provisions in the legislation are technology-agnostic, so all of that would fall within the current restrictions of the legislation as set out.
On Amendment 65, approval from an ethics committee is currently required only for the carefully defined category of “approved medical research”, because only that category is exempt from the rule against using research to make decisions about particular data subjects. Requiring such approval for other kinds of research may impede valuable research in areas which follow other ethical procedures. For example, in space research there may be other ethical checks and balances that take place; we can imagine that in other areas as well.
We have to be careful not to specify the ethical basis of this in statute. If we step back and look at it, first, it would be hugely bureaucratic when a lot of research is moving very quickly. Secondly, passing that ethical test may not be a simple matter and might involve all sorts of complications, which would hold the research up even though it was generally considered to be in the public interest.
Turning to Amendments 68 and 69 from the noble Lord, Lord Clement-Jones, I agree that a data subject’s consent should be respected. I would like to provide reassurance that the data subject can revoke their consent at any point in the process.
On Clause 85 and Amendments 130 and 132: this clause consolidates the safeguards for research, making it easier for researchers to navigate their requirements. When the research purposes can be achieved with anonymous data, the clause requires that data to be anonymised. For some types of research, such as that using genetic or health data, it may not be possible to use only anonymous data. The Government believe that such potentially life-saving research should be permitted, considering the benefits it could bring to society, but I can reassure the Committee that it must meet the other conditions in this clause. Wider requirements elsewhere in the Bill and the data protection framework would also apply.
I hope also to provide reassurance with regard to patient data. All organisations providing services to patients in the UK, whether in the NHS or privately, must also follow the common law duty of confidentiality. In addition, patients can utilise the national data opt-out in health and social care.
I know that we have gone around this subject in a very wide sense and that we might equally revisit some of these issues on other amendments. But I hope that, for the moment, I have reassured noble Lords on the specific details of their amendments and persuaded them that those strong protections are in place.
Viscount Colville of Culross (CB)

I thank the Minister very much, but is she not concerned by the preliminary opinion from the EDPS, particularly its observation that the boundary around traditional academic research is blurrier than ever and that it is even harder to distinguish research which generally benefits society from that which primarily serves private interests? People in the street would be worried about that, and the Bill ought to respond to that concern.

Baroness Jones of Whitchurch (Lab)

I have not seen that observation, but we will look at it. It goes back to my point that the provisions in this Bill are designed to be future facing as well as for the current day. The strength of those provisions will apply regardless of the technology, which may well include AI. Noble Lords may know that we will bring forward a separate piece of legislation on AI, when we will be able to debate this in more detail.

Viscount Colville of Culross (CB)

My Lords, this has been a very important debate about one of the most controversial areas of this Bill. My amendments are supported across the House and by respected civic institutions such as the Ada Lovelace Institute. I understand that the Minister thinks they will stifle scientific research, particularly by nascent AI companies, but the rights of the data subject must be borne in mind. As it stands, under Clause 67, millions of data subjects could find their information mined by AI companies, to be reused without consent.

The concerns about this definition being too broad were illustrated very well across the Committee. The noble Lord, Lord Clement-Jones, said that it was too broad and would leave AI developers free to use data for any AI research purpose, and talked about his amendment on protecting children’s data, which is very important and worthy of consideration. This was supported by my noble friend Lady Kidron, who pointed out that the definition of scientific research could cover everything and warned that Clause 67 is not just housekeeping. She quoted the EDPS and noted that its critical clarification was not carried over when the definition of scientific research was transferred into the Bill. The noble Lord, Lord Holmes, asked what in the Bill has changed when you consider how much has changed in AI. I was very pleased to have the support of the noble Viscount, Lord Camrose, who warned against the abuse and misuse of data and the broad definition in this Bill, which could muddy the waters. He supported the public interest test, which would be fertile ground for helping to define scientific research.

Surely this Bill should walk the line between encouraging the rollout of AI to boost research and development in our science sector and protecting the rights of data subjects. I ask the Minister to meet me and other concerned noble Lords to tighten up Clauses 67 and 68. On that basis, I beg leave to withdraw my amendment.

Amendment 59 withdrawn.
Amendment 60
Moved by
60: Clause 67, page 75, line 10, leave out from “scientific” to end of line 12
Member's explanatory statement
This amendment seeks to ensure that the Bill does not extend the meaning of “research purposes” to include privately funded or commercial activity, to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining.
Lord Freyberg (CB)

My Lords, I have tabled Amendment 60 to add to our discussion and establish some further clarity from the Minister on the impact of widening the scope of the interpretation of scientific research to include commercial and private activities. I thank her for her letter of 27 November to all noble Lords who spoke at Second Reading, a copy of which was placed in the Lords Library; it provides some reassurance that scientific research activities must still pass a reasonableness test. However, I move this probing amendment out of concern that the change in definition may have unintended consequences for copyright law. It is vital that we do not just look at this Bill in isolation but consider the wider impact that changing definitions and interpretations will have on other aspects of legislation.

Research activities are identified under the Copyright, Designs and Patents Act 1988. Some researchers require access to and reproduction of data and copyright-protected material for research purposes. Under Section 29A, researchers can avail themselves of an exemption from copyright which allows data mining and analysis of copyright-protected works for non-commercial research only, without permission from the copyright holder. The UK copyright framework is popularly known as the “gold standard” internationally, as it carefully balances the rights of copyright holders with the need for certain uses to take place, such as non-commercial research, educational uses and those that protect free speech. That balance is fragile, and we must be very careful not to disrupt it unintentionally.

The previous Government sought to widen Section 29A of the Act by allowing text and data mining of copyright-protected works for commercial purposes, but this recommendation was quickly reversed when the Government considered that the decision was made without appropriate evidence. That was a sensible move. The current Government are still due to consult with stakeholders on the exemption to the law, against the backdrop of AI companies using copyright-protected works for training large language models without permission or fair pay. Given the global presence of AI, it is expected that this consultation will consider how the UK policy on copyright works within an international context. Therefore, while the Government are carefully considering this, we must ensure that we do not fast forward to a conclusion before that important work has taken place.

If the Minister can confirm that this definition has no impact on existing copyright law, I will happily withdraw this amendment. However, if there are potential implications for the Copyright, Designs and Patents Act 1988, I would urge the Minister to table her own amendment to explicitly preserve the current definition of “scientific research” within that Act. This would ensure that we maintain legal clarity while the broader international considerations are fully examined. I beg to move.

The Deputy Chairman of Committees (Baroness McIntosh of Hudnall) (Lab)

I advise the Committee that, if this amendment is agreed, I cannot call Amendment 61 by reason of pre-emption.

Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in the debate on these amendments. I very much support Amendment 60 as introduced. I was delighted to hear the Minister tell the Grand Committee that the Government are coming forward with an AI Bill. I wonder if I might tempt her into sharing a bit more detail with your Lordships on when we might see that Bill or indeed the consultation. Will it be before Santa or sometime after his welcome appearance later this month?

We touched on a number of areas related to Amendment 65A in the previous group. This demonstrates the importance of and concern about Clause 67, as so many amendments pertain to it. I ask the Minister whether a large language model that comes up with medically significant conclusions but, prior to that, gained a considerable amount of its data from scraping would be permitted under Clause 67 as drafted.

Similarly, there are overriding and broader reuse possibilities in the drafting as set out. Again, as has already been debated, scientific research has a clear meaning in many respects. That clarity very much comes when you add public interest and ethics. Could a model that has taken vast quantities of others’ data without consent and—nodding more towards Amendment 60—without remuneration still potentially fit within the definition of “scientific research”?

In many ways, we are debating these points around data in the context of scientific research, but we could go to the very nub or essence of the issue. All that noble Lords are asking, in their many eloquent and excellent ways, is: whose data is it, to what purpose is it being put, and have those data owners given consent, been respected and, where appropriate—particularly when it comes to IP and copyrighted data—been remunerated? This is an excellent opportunity to expand on the earlier debate on Clause 67. I look forward to the Minister’s response.

16:45
Lord Lucas (Con)

My Lords, I declare an interest in that I checked yesterday and Copilot has clearly scraped data from behind the paywall on the Good Schools Guide. It very kindly does not publish the whole of the review, but it publishes a summary of it. It concerns me how we police copyright and how we get things right in this Bill.

However, I do not think that trying to draw a boundary around “scientific” is the right way to do it. Looking at all the evidence on engineering biology that we have just taken for the Science and Technology Committee, those giving evidence are all doing science, but they all want to make money out of it at the end, if things go right. There is no sensible boundary between science and commerce. We should expect that, with anything that is done for science, even if it is done in the social sciences, someone at the end of the day will want to build a consultancy on it. There is no defensible boundary between the two.

As my noble friend Lord Camrose said, getting a working definition of public interest is key, as is, in the context of this amendment, recognising the importance of the concepts of intellectual property, copyright, trademarks, patents and so on. They are international concepts, and we should seek to hold the line in the face of technological challenges, because the concepts as they are have shown their worth. We may have to adapt them in one way or another, but this should be an international effort, and we should not support local infringement, because we would then make the UK a much less worthwhile place to hold intellectual property. My own intellectual property is not mobile, but a lot of intellectual property is, and it wants to be held in a place where it can be defended. If we do not offer that in our legal system, we will lose a great deal by it.

Baroness Kidron (CB)

My Lords, I have to admit that I am slightly confused by the groupings at this point. It is very easy to have this debate in the medical space—to talk about the future of disease, fixing diseases and longevity—but my rather mundane questions have now gone unanswered twice. Perhaps the Minister will write to me about where the Government see the line between scientific research and product development in some of these other spaces.

We will come back to the question of scraping and intellectual property, but I want to add my support to my noble friend Lord Freyberg’s amendment. I also want to add my voice to the question of the AI Bill that is coming. Data is fundamental to the AI infrastructure; data is infrastructure. I do not understand how we can have a data Bill that does not have one eye on AI, looking towards it, or how we are supposed to understand the intersection between the AI Bill and the data Bill if the Government are not more forthcoming about their intentions. At the moment, we are seeing a reduction in data protection that looks as though it is anticipating, or creating a runway for, certain sorts of companies.

Finally, I am sorry that the noble Lord is no longer in his place, but later amendments look at creating sovereign data assets around the NHS and so on. Those of us who are arguing to make sure that it is not a free-for-all are not unwilling to create, or uninterested in creating, ways in which the huge investment in the NHS and other datasets can be realised for UK plc. I do not want that to appear to be where we are starting just because we are unhappy about the runway that Clause 67 appears to create.

Viscount Camrose (Con)

Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.

Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order

“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.

This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.

Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:

“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.


Tightening up the definition of “scientific research” to exclude activities that are primarily commercial would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than by furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.

Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.

Lord Clement-Jones (LD)

My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.

With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. On the suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research”, this would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.

I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.

This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.



One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.

I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.

Baroness Jones of Whitchurch (Lab)

My Lords, I feel we are getting slightly repetitive, but before I, too, repeat myself, I should like to say something that I did not get the chance to say to the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, and others: I will write, we will meet—all the things that you have asked for, you can take it for granted that they will happen, because we want to get this right.

I say briefly to the noble Baroness: we are in danger of thinking that the only good research is health research. If you go to any university up and down the country, you find that the most fantastic research is taking place in the most obscure subjects, be it physics, mechanical engineering, fabrics or, as I mentioned earlier, quantum. A lot of great research is going on. We are in danger of thinking that life sciences are the only thing that we do well. We need to open our minds a bit to create the space for those original thinkers in other sectors.

Baroness Kidron (CB)

Perhaps I did not make myself clear. I was saying that the defence always turns to space or to medicine, while we are trying to ascertain the position on product development—not textiles and so on. I have two positions at two different universities; they are marvellous places; research is very important.

Baroness Jones of Whitchurch (Lab)

I am glad we are on the same page on all that.

I now turn to the specifics of the amendments. I thank the noble Lords, Lord Freyberg and Lord Holmes, and the noble Viscount, Lord Camrose, for their amendments, and the noble Lord, Lord Lucas, for his contribution. As I said in the previous debate, I can reassure all noble Lords that if an area of research does not count as scientific research at the moment, it will not under the Bill. These provisions do not expand the meaning of scientific research. If noble Lords still feel unsure about that, I am happy to offer a technical briefing to those who are interested in this issue to clarify that as far as possible.

Moreover, the Bill’s requirement for a reasonableness test will help limit the misuse of this definition more than the current UK GDPR, which says that scientific research should be interpreted broadly. We are tightening up the regulations. Whether an activity qualifies is best assessed on a case-by-case basis, alongside the ICO guidance, rather than by automatically disqualifying or approving entire sectors of activity.

Scientific research that is privately funded or conducted by commercial organisations can also have a life-changing impact. The noble Lord, Lord Markham, was talking earlier about health; the development of Covid vaccines is just one example of this. That was commercial research that was absolutely life-saving, at the end of the day.

17:00
The guidance from the ICO provides further helpful detail on whether an activity constitutes scientific research. This includes a list of features that are likely to be present if the research is indeed scientific. Researchers must also fulfil additional requirements, including ensuring that the processing does not cause substantial distress to individuals.
I note for noble Lords that merely falling under the definition of scientific research does not qualify as permission to process personal data for this purpose. A valid lawful basis is always required, including for reuse of personal data, as clarified by Clause 71. Further protections, including key data protection principles such as fairness and transparency, continue to apply.
Lastly, the noble Lord, Lord Freyberg, emphasised the issue of copyright law in Amendment 60. We debated this at Second Reading, and I fear that we may debate it again during these proceedings, but I reassure noble Lords that changes to the data protection framework do not change copyright law.
In response to the noble Lord, Lord Holmes, and to other points on AI legislation, as per the King’s Speech, the Government are seeking to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models. The next steps on that will be announced in the usual way—so maybe not this side of Santa, as I was asked.
Lord Clement-Jones (LD)

Can the Minister say whether this will be a Bill, a draft Bill or a consultation?

Baroness Jones of Whitchurch (Lab)

We will announce this in the usual way—in due course. I refer the noble Lord to the King’s Speech on that issue. I feel that noble Lords want more information, but they will just have to go with what I am able to say at the moment.

Lord Clement-Jones (LD)

Perhaps another aspect the Minister could speak to is whether this will be coming very shortly, shortly or imminently.

Baroness Jones of Whitchurch (Lab)

Let me put it this way: other things may be coming before it. I think I promised at the last debate that we would have something on copyright in the very, very, very near future. This may not be as very, very, very near future as that. We will tie ourselves in knots if we carry on pursuing this discussion.

On that basis, I hope that this provides noble Lords with sufficient reassurance not to press their amendments.

Lord Freyberg (CB)

I thank your Lordships for this interesting debate. I apologise to the Committee for degrouping the amendment on copyright, but I thought it was important to establish from the Minister that there really was no effect on the copyright Act. I am very reassured that she has said that. It is also reassuring to hear that there will be more of an opportunity to look at this issue in greater detail. On that basis, I beg leave to withdraw the amendment.

Amendment 60 withdrawn.
Amendments 61 to 65A not moved.
Amendment 66
Moved by
66: Clause 67, page 75, line 21, at end insert—
“3A. Research considered scientific research that is carried out as a commercial activity must be subject to the approval of an independent ethics committee.”
Member's explanatory statement
This amendment ensures personal data is not used for commercial research, which is subject to fewer ethical safeguards, preventing data being used in a manner data subjects may not consider appropriate, such as training large language models.
Viscount Camrose (Con)

My Lords, Amendments 66, 67 and 80 in this group are all tabled in my name. Amendment 66 requires scientific research carried out for commercial purposes to

“be subject to the approval of an independent ethics committee”.

Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.

We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects or even without them knowing. An independent ethics committee would be an excellent way to monitor scientific research that would be part of commercial activities, without capping data access for scientific research, which aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is also surely fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.

Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection. As the Bill stands, consent will be considered obtained for the purposes of scientific research if, at the time consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed. I fully understand that there needs to be some scope to take advantage of research opportunities that are not always foreseeable at the start of studies, particularly multi-year longitudinal studies, but which emerge as such studies continue. I am concerned, however, that the current provisions are a little too broad. In other words: is consent not actually being given at the start of the process for, effectively, any future purpose?

Amendment 80 would prevent the data reuse test being automatically passed if the reuse is for scientific purposes. Again, I have tabled this amendment due to my concerns that research which is part of commercial activities could be artificially classed as scientific, and that other clauses in the Bill would therefore allow too broad a scope for data harvesting. I beg to move.

Viscount Colville of Culross (CB)

My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.

Lord Clement-Jones (LD)

My Lords, I welcome the noble Viscount to the sceptics’ club because he has clearly had a damascene conversion. It may be that this goes too far. I am slightly concerned, like him, about the bureaucracy involved in this, which slightly gives the game away. It could be seen as a way of legitimising commercial research, whereas we want to make it absolutely certain that that research is for the public benefit, rather than imposing an ethical board on every single aspect of research which has any commercial content.

We keep coming back to this, but we seem to be degrouping all over the place. Even the Government Whips Office seems to have given up trying to give titles for each of the groups; they are just called “degrouped” nowadays, which I think is a sign of deep depression in that office. It does not tell us anything about what the different groups contain, for some reason. Anyway, it is good to see the noble Viscount, Lord Camrose, kicking the tyres on the definition of the research aspect.

Baroness Jones of Whitchurch (Lab)

I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.

As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring scientific researchers at a commercial company to submit their research to an ethics committee. As I said on the previous group, making this a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector, and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared by an ethics committee at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.

Amendment 80 relates to Clause 71 and the reuse of personal data. It would put at risk valuable research that relies on data originally generated in diverse contexts, since the new and original purposes may not always be compatible.

Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise purpose of the research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, where possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and that he feels content to withdraw his amendment on this basis.

Viscount Camrose (Con)

I thank the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, for their remarks and support, and the Minister for her helpful response. Just over 70% of scientific research in the UK is privately funded, 28% is taxpayer funded and around 1% comes through the charity sector. Perhaps the two most consequential scientific breakthroughs of the last five years, Covid vaccines and large language models, have come principally from private funding.

17:15
I make these points just to stress again how important it is to maintain the quality and quantity of public and private research, and to walk the line between data protection and safety on the one hand and the need for ever-growing, ongoing research on the other. Our challenge with the Bill is therefore twofold: to continue to support public and private research with the safe use of personal data, and to ensure that we do not allow commercial drivers to override ethical considerations.
The amendments we have discussed in this group are intended to drive these outcomes. We want to make it as easy and safe as possible for research to use data, but we must put safeguards in place to prevent companies using personal data in a way that would expose data subjects to harm in pursuit of their own ends. That said, I beg leave to withdraw the amendment.
Amendment 66 withdrawn.
Clause 67 agreed.
Clause 68: Consent to processing for the purposes of scientific research
Amendments 67 to 69 not moved.
Amendment 70
Moved by
70: Clause 68, page 76, leave out lines 17 and 18 and insert—
“7. For the avoidance of doubt, consent as defined here is not sufficient for the purposes of Article 6(1)(a) (lawful processing) and Article 9(2)(a) (processing of special categories of personal data).”
Member's explanatory statement
This amendment would mitigate the lowering of the threshold for a data subject to be deemed to have given consent.
Lord Stevenson of Balmacara (Lab)

My Lords, I rise to move the amendment standing in my name and to speak to my other amendments in this group. I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for signing a number of those amendments, and I am also very grateful to Foxglove Legal and other bodies that have briefed me in preparation for this.

My amendments are in a separate group, and I make no apology for that because although some of these points have indeed been covered in other amendments, my focus is entirely on NHS patient data, partly because it is the subject of a wider debate going on elsewhere about whether value can be obtained for it to help finance the National Health Service and our health in future years. This changes the nature of the relationship between research and the data it is using, and I think it is important that we focus hard on this and get some of the points that have already been made into a form where we can get reasonable answers to the questions that it leaves.

If my amendments are accepted or agreed—a faint hope—they would make it clear beyond peradventure that the consent protections in the Bill apply to the processing of data for scientific research, that a consistent definition of consent is applied and that that consistent definition is the one with which researchers and the public are already familiar and can trust going forward.

The Minister said at the end of Second Reading, in response to concerns I and others raised about research data in general and NHS data in particular, that the provisions in this Bill

“do not alter the legal obligations that apply in relation to decisions about whether to share data”.—[Official Report, 19/11/24; col. 196.]

I accept that that may be the intention, and I have discussed this with officials, who make the same point very strongly. However, Clause 68 introduces a novel and, I suggest, significantly watered-down definition of consent in the case of scientific research. Clause 71 deploys this watered-down definition of consent to winnow down the “purpose limitation” where the processing is for the purposes of scientific research in the public interest. Taken together, this means that there has been a change in the legal obligations that apply to the need to obtain consent before data is shared.

Clause 68 amends the pivotal definition of consent in Article 4(11). Instead of consent requiring something express—freely given, specific, informed, and unambiguous through clear affirmative action—consent can now be imputed. A data subject’s consent is deemed to meet these strict requirements even when it does not, as long as the consent is given to the processing of personal data for the purposes of an area of scientific research; at the time the consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed; seeking consent in relation to the area of scientific research is consistent with generally recognised ethical standards relevant to the area of research; and, so far as the intended purposes of the processing allow, the data subject is given the opportunity to consent to processing for only part of the research. These all sound very laudable, but I believe they cut down the very strict existing standards of consent.

Proposed new paragraph 7, in Clause 68, then extends the application of this definition across the regulation:

“References in this Regulation to consent given for a specific purpose (however expressed) include consent described in paragraph 6.”


Thus, wherever you read “consent” in the regulation, you can also have imputed consent as set out in proposed new paragraph 6 of Article 4. This means that “consent” within the meaning of Article 6(1)(a)—that is, the basis for lawful processing—can be imputed consent in the new way introduced by the Bill, so there is a new type of lawful basis for processing.

The Minister is entitled to disagree, of course; I expect him to say that when he comes to respond. I hope that, when he does, he will agree that we share a concern on the importance of giving researchers a clear framework, as it is this uncertainty about the legal framework that could inadvertently act as a barrier to the good research we all need. So my first argument today is that, as drafted, the Bill leaves too much room for different interpretations, which will lead to exactly the kind of uncertainty that the Minister—indeed, all of us—wish to avoid.

As we have heard already, as well as the risk of uncertainty among researchers, there is also the risk of distrust among the general public. The public rightly want and expect to have a say in what uses their data is put to. Past efforts to modernise how the NHS uses data, such as care.data, have been expensive failures, in part because they have failed to win the public’s trust. More than 3.3 million people have already opted out of NHS data sharing under the national data opt-out; that is nearly 8% of the adults who could have taken part. We have talked about the value of our data and about being the gold standard, or gold attractor, for researchers but, if we do not have all the people who could contribute, we are definitely devaluing and debasing that research. Although we want to respect people’s choice as to whether to participate, of course, this enormous vote against research reflects a pretty spectacular failure to win public trust—one that undermines the value and quality of the data, as I said.

So my second point is that watering down the rights of those whose data is held by the NHS will not put that data for research purposes on a sustainable, long-term footing. Surely, we want a different outcome this time. We cannot afford more opt-outs; we want people opting back in. I argue that this requires a different approach—one that wins the public’s trust and gains public consent. The Secretary of State for Health is correct to say that most of the public want to see the better use of health data to help the NHS and to improve the health of the nation. I agree, but he must accept that the figures show that the general public also have concerns about privacy and about private companies exploiting their data without them having a say in the matter. The way forward must be to build trust by genuinely addressing those concerns. There must not be even a whiff of watering down legal protections, so that those concerns can instead be turned into support.

This is also important because NHS healthcare includes some of the most intimate personal data. It cannot make sense for that data to have a lower standard of consent protection going forward if it is being used for research. Having a different definition of consent and a lower standard of consent will inevitably lead to confusion, uncertainty and mistrust. Taken together, these amendments seek to avoid uncertainty and distrust, as well as the risk of backlash, by making it abundantly clear that Article 4 GDPR consent protections apply despite the new wording introduced by this Bill. Further, these are the same protections that apply to other uses of data; they are identical to the protections already understood by researchers and by the public.

I turn now to a couple of the amendments in this group. Amendment 71 seeks to address the question of consent, but in a rather narrow way. I have argued that Clause 68 introduces a novel and significantly watered-down definition of consent in the case of scientific research; proposed new paragraph 7 deploys this watered-down definition to winnow down the purpose limitation. There are broader questions about the wisdom of this, which Amendments 70, 79 and 81 seek to address, but Amendment 71 focuses on the important case of NHS health data.

If the public are worried that their health data might be shared with private companies without their consent, we need an answer to that. We see from the large number of opt-outs that there is already a problem; we have also seen it recently in NHS England’s research on public attitudes to health data. This amendment would ensure that the Bill does not increase uncertainty or fuel patient distrust of plans for NHS data. It would help to build the trust that data-enabled transformation of the NHS requires.

The Government may well retort that they are not planning to share NHS patient data with commercial bodies without patient consent. That is fine, but it would be helpful if, when he comes to respond, the Minister could say that clearly and unambiguously at the Dispatch Box. However, I put it to him that, if he could accept these amendments, the law would in fact reflect that assurance and ensure that any future Government would need to come back to Parliament if they wanted to take a different approach.

It is becoming obvious that whether research is in the public interest will be the key issue that we need to resolve in this Bill, and Amendment 72 provides a proposal. The Bill makes welcome references to health research being in the public interest, but it does not explain how on earth we decide or how that requirement would actually bite. Who makes the assessment? Do we trust a rogue operator to make its own assessment of how its research is in the public interest? What would be examples of the kind of research that the Government expect this requirement to prevent? I look forward to hearing the answer to that, but perhaps it would be more helpful if the Minister responded in a letter. In the interim, this amendment seeks to introduce some procedural clarity about how research will be certified as being in the public interest. This would provide clarity and reassurance, and I commend it to the Minister.

Finally, Amendment 131 seeks to improve the appropriate safeguards that would apply to processing for research, archiving and scientific purposes, including a requirement that the data subject has given consent. This has already been touched on in another amendment, but it is a way of seeking to address the issues that Amendments 70, 79 and 81 are also trying to address. Perhaps the Government will continue to insist that this is addressing a non-existent problem because nothing in Clauses 69 or 71 waters down the consent or purpose limitation protections and therefore the safeguards themselves add nothing. However, as I have said, informed readers of the Bill are interpreting it differently, so spelling out this safeguard would add clarity and avoid uncertainty. Surely such clarity on such an important matter is worth a couple of lines of additional length in a 250-page Bill. If the Government are going to argue that our Amendment 131 adds something objectionable, let them explain what is objectionable about consent protections applying to data processing for these purposes. I beg to move.

Lord Clement-Jones (LD)

My Lords, I support Amendments 70 to 72, which I signed, in the name of the noble Lord, Lord Stevenson of Balmacara. I absolutely share his view about the impact of Clause 68 on the definition of consent and the potential and actual mistrust among the public about sharing of their data, particularly in the health service. It is highly significant that 3.3 million people have opted out of sharing their patient data.

I also very much share the noble Lord’s views about the need for public interest. In a sense, this takes us back to the discussion that we had on previous groups about whether we should add that in a broader sense so not purely for health data or whatever but for scientific research more broadly, as he specifies. I very much support what he had to say.

Broadly speaking, the common factor between my clause stand part notice and what he said is health data. Data subjects cannot make use of their data rights if they do not even know that their data is being processed. Clause 77 allows a controller reusing data under the auspices of scientific research not to notify a data subject in accordance with their rights under Articles 13 and 14 if doing so

“is impossible or would involve a disproportionate effort”.

We on these Benches believe that Clause 77 should be removed from the Bill. The safeguards are easily circumvented. The newly articulated compatibility test in new Article 8A, inserted by Clause 71, which specifies how closely related the new and existing purposes for data use need to be to permit reuse, is essentially automatically passed if the processing is conducted

“for the purposes of scientific research or historical research”.

This makes it even more necessary for the definition of scientific research to be tightened to prevent abuse.

Currently, data controllers must provide individuals with information about the collection and use of their personal data. These transparency obligations generally do not require the controller to contact each data subject. Such obligations can usually be satisfied by providing privacy information using different techniques that can reach large numbers of individuals, such as relevant websites, social media, local newspapers and so on.

17:30
The BMA is deeply concerned that this provision will water down the transparency of information to patients. Clause 77 would mean that personal data collected through mass scraping or ingested during AI training would not be subject to the normal notification requirements if notification involved disproportionate effort. Any reduction in transparency requirements is a backward step in promoting confidence in the use of health data, given the very close relationship between transparency and public trust that we have discussed. It contradicts the approach in the recently published ICO guidance on improving transparency in health and social care. Disapplying transparency requirements is contrary to societal expectations. More, not less, transparency is required to build and maintain public trust. Reducing transparency is also in direct contradiction of the National Data Guardian’s advice that there should be no surprises for patients about how and why their data is used.
Furthermore, one of the factors listed in the Bill as having a bearing on whether disproportionate effort is involved is the number of data subjects. The implication is that the more individuals whose personal data is being collected, the easier it will be for controllers to apply the exemption from providing information—more processing means less transparency. This is a deeply concerning direction of travel.
Given that existing transparency obligations generally do not require contact to be made with each data subject, it is hard to envisage how using methods that can reach large numbers of individuals at once would require disproportionate effort, such that it would impair the progression of research. Conversely, failure to be transparent may impair research if a loss of public trust occurs.
Baroness Kidron (CB)

My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.

In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.

The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.

I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.

So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.

We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, in as much as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.

Lord Kamall (Con)

My Lords, I support Amendment 71 and others in this group from the noble Lords, Lord Clement-Jones and Lord Stevenson. I apologise for not being able to speak at Second Reading. The noble Lord, Lord Clement-Jones, will remember that we took a deep interest in this issue when I was a Health Minister and the conversations that we had.

I had a concern at the time. We all know that the NHS needs to be digitised and that relevant health professionals need to be able to access relevant data when they need to, so that there is no need to be stuck with one doctor when you go to another part of the country. There are so many efficiencies that we could have in the system, as long as they are accessed by relevant and appropriate health professionals at the right time. But it is also important that patients have confidence in the system and that their personal data cannot be shared with commercial organisations without them knowing. As other noble Lords have said, this is an issue of trust.

For that reason, when I was in that position, I reached out to civil liberties organisations to understand their concerns. For example, medConfidential was very helpful and had conversations with DHSC and NHS officials. In fact, after those conversations, officials told me that its demands were reasonable and that some of the things being asked for were not that difficult to give and were, indeed, common sense.

I asked a Written Question of the noble Baroness’s ministerial colleague, the noble Baroness, Lady Merron, about whether patients will be informed of who has had access to their patient record, because that is important for confidence. The Answer I got back was that the Government were proposing a single unified health record. We all know that. She said that:

“Ensuring that patients’ confidential information remains protected and is seen only by those who need to see it will be a priority. Public engagement next month will help us understand what safeguards patients would want to see”.


Surely the fact that patients have opted out shows that they already have concerns and have raised them.

The NHS can build the best data system—or the federated data platform, as it is called—but without patient confidence it is simply a castle made of sand. As one of my heroes, Jimi Hendrix, once said, castles made of sand fall into the sea eventually. We do not want to see that with the federated data platform. We want to see a modernised system of healthcare digital records, allowing joined-up thinking on health and care right across a patient’s life. We should be able to use machine learning to analyse those valuable datasets to improve preventive care. But, for that to happen, the key has to be trust and patients being confident that their data is secure and used in the appropriate way. I look forward to the Minister’s response.

Lord Holmes of Richmond (Con)

My Lords, I support these amendments in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. It is a pleasure to follow the second ex-Health Minister this afternoon. In many ways, the arguments are just the same for health data as they are for all data. It is just that, understandably, it is at the sharpest end of this debate. Probably the most important point for everybody to realise, although it is espoused so often, is that there is no such thing as NHS data. It is a collection of the data of every citizen in this country, and it matters. Public trust matters significantly for all data but for health data in particular, because it goes so close to our identity—our very being.

Yet we know how to do public trust in this country. We know how to engage and had significant success with public engagement decades ago. What we could do now with human-led, technology-supported public engagement could be on such a positive and transformational scale. But, so far, there has been so little on this front. Let us not talk of NHS data; let us always come back to the fundamental principle encapsulated in this group of amendments and across so many of our discussions on the Bill. Does the Minister agree that it is about not NHS data but our data—our decisions—and, through that, if we get it right, our human-led digital futures?

Viscount Camrose (Con)

Many thanks to all noble Lords who have proposed and supported these amendments. I will speak to just a few of them.

Amendment 70 looks to mitigate the lowering of the consent threshold for scientific research. As I have set out on previous groups, I too have concerns about that consent threshold. However, for me the issue is more with the definition of scientific research than with the consent threshold, so I am not yet confident that the amendment is the right way to achieve those desirable aims.

Amendment 71 would require that no NHS personal data can be made available for scientific research without the explicit consent of the patient. I thank the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, for raising this because it is such an important matter. While we will discuss this in later groups, as the noble Baroness, Lady Kidron, points out, it is such an important thing and we need to get it right.

I regret to advise my noble friend Lord Holmes that I was going to start my next sentence with the words “Our NHS data”, but I will not. The data previously referred to is a very significant and globally unique national asset, comprising many decades of population-wide, cradle-to-grave medical data. No equivalent at anything like the same scale or richness exists anywhere, which makes it incredibly valuable. I thank my noble friend Lord Kamall for stressing this point with, as ever, the help of Jimi Hendrix.

However, that data is valuable only to the extent that it can be safely exploited for research and development purposes. The data can collectively help us develop new medicines or improve the administration and productivity of the NHS, but we need to allow it to do so properly. I am concerned that this amendment, if enacted, would create too high an operational and administrative barrier to the safe exploitation of this data. I have no interest in compromising on the safety, but we have to find a more efficient and effective way of doing it.

Amendments 79, 81 and 131 all look to clarify that the definition of consent to be used is in line with the definition in Article 4.11 of the UK GDPR:

“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.

These amendments would continue the use of a definition that is well understood. However, paragraph 3(a) of new Article 8A appears sufficient, in that the purpose for which a data subject consents is “specified, explicit and legitimate”.

Finally, with respect to Clause 77 stand part, I take the point and believe that we will be spending a lot of time on these matters going forward. But, on balance and for the time being, I feel that this clause needs to remain, as there must be clear rules on what information should be provided to data subjects. We should leave it in for now, although we will no doubt be looking to polish it considerably.

17:45
Lord Leong (Lab)

My Lords, I thank noble Lords for another thought-provoking debate on consent in scientific research. First, let me set out my staunch agreement with all noble Lords that a data subject’s consent should be respected.

Regarding Amendment 70, Clause 68 reproduces the text from the current UK GDPR recitals, enabling scientists to obtain “broad consent” for an area of research from the outset and to focus on potentially life-saving research. This has the same important limitations, including that it cannot be used if the researcher already knows its specific purpose and that consent can be revoked at any point.

I turn to Amendments 71 and 72, in the name of my noble friend Lord Stevenson, on assessments for research. Requiring all research projects to be submitted for assessments could discourage or delay researchers in their important work, as various noble Lords mentioned. However, I understand that my noble friend’s main concern is around NHS data. I assure him that, if NHS data is used for research, individual patients cannot be identified unless either a patient has specifically agreed for that data to be shared or the Health Research Authority has approved an application for this information to be used, informed by advice from the independent and expert Confidentiality Advisory Group. Research projects using confidential patient data are always subject to rigorous governance, including the approval of an ethics committee; the Minister, my noble friend Lady Jones, mentioned this earlier. There are also strict controls around who can see the data and how it is used and stored. Nothing in this clause will change that approach.

I turn to Amendments 81 and 131 on consent. I understand the motivations behind adding consent as a safeguard. However, organisations such as the Health Research Authority have advised researchers against relying on consent under the UK GDPR; for instance, an imbalance of power may mean that consent cannot truly be “freely given”.

On Amendment 79, I am happy to reassure my noble friend Lord Stevenson that references to “consent” in Clause 71 do indeed fall under the definition in Article 4.11.

Lastly, I turn to Clause 77, which covers the notification exemption; we will discuss this in our debates on upcoming groups. The Government have identified a gap in the UK GDPR that may disproportionately affect researchers. Where data is not collected from the data subject, there is an exemption from notifying them if getting in contact would mean a disproportionate amount of effort. This does not apply to data collected from the data subject. However, in certain studies, such as those of degenerative neurological conditions, it can be impossible or involve a disproportionate effort to recontact data subjects to inform them of any change in the study. The Bill will therefore provide a limited exemption with strong safeguards for data subjects.

Numerous noble Lords asked various questions. They touched on matters that we care about very much: trust in the organisation asking for data; the transparency rules; public interest; societal value; the various definitions of “consent”; and, obviously, whether we can have confidence in what is collected. I will not do noble Lords’ important questions justice if I stand here and try to give answers on the fly, so I will do more than just write a letter to them: I will also ask officials to organise a technical briefing and meeting so that we can go into everyone’s concerns in detail.

With that, I hope that I have reassured noble Lords that there are strong protections in place for data subjects, including patients; and that, as such, noble Lords will feel content to withdraw or not press their amendments.

Lord Stevenson of Balmacara (Lab)

My Lords, I thank those who participated in this debate very much indeed. It went a little further than I had intended in drafting these amendments, but it has raised really important issues which I think we will probably come back to, if not later in Committee, certainly at Report.

At the heart of what we discussed, we recognise, as the noble Baroness, Lady Kidron, put it, that our data held by the NHS—if that is a better way of saying it—is valuable both in financial terms and because it should and could bring better health in future. Therefore, we value it specifically among some of the other datasets that we are talking about, because it has a returning loop in it. It is of benefit not just to the individual but to the UK as a whole, and we must respect that.

However, the worry that underlies framing it in that way is that, at some point, a tempting offer will be made by a commercial body—perhaps one is already on the table—which would generate new funding for the NHS and our health more generally, but the price obtained for that will not reflect the value that we have put into it over the years and the individual data that is being collected. That lack of trust is at the heart of what we have been talking about. In a sense, these amendments are about trust, but they are also bigger. They are also about the whole question of what it is that the Government as a whole do on our behalf in holding our data and what value they will obtain for that—something which I think we will come back to on a later amendment.

I agree with much of what was said from all sides. I am very grateful to the noble Lords, Lord Kamall and Lord Holmes, from the Opposition for joining in the debate and discussion, and their points also need to be considered. The Minister replied in a very sensible and coherent way; I will read very carefully what he said in Hansard and we accept his kind offer of a technical briefing on the Bill—that would be most valuable. I beg leave to withdraw the amendment.

Amendment 70 withdrawn.
Amendment 71 not moved.
Clause 68 agreed.
Amendment 72 not moved.
Clause 69 agreed.
Clause 70: Lawfulness of processing
Amendment 73
Moved by
73: Clause 70, page 77, leave out lines 34 to 38
Member's explanatory statement
This amendment and another amendment in Lord Clement-Jones’s name to Clause 70 omit paragraphs 70(2)(b)-(c), (4), (5) and (6), which make amendments to the UK GDPR to define certain data processing activities as “recognised legitimate interests”.
Lord Clement-Jones

My Lords, I start with an apology, because almost every amendment in this group is one of mine and I am afraid I have quite a long speech to make about the different amendments, which include Amendments 73, 75, 76, 77, 78, 78A, 83, 84, 85, 86, 89 and 90, and stand part debates on Schedules 4, 5 and 7 and Clause 74. But I know that the Members of this Committee are made of strong stuff.

Clause 70 and Schedule 4 introduce a new ground of recognised legitimate interest, which in essence counts as a lawful basis for processing if it meets any of the descriptions in the new Annexe 1 to the UK GDPR, which is at Schedule 4 to the Bill—for example, processing necessary for the purposes of responding to an emergency or detecting crime. These have been taken from the previous Government’s Data Protection and Digital Information Bill. This is supposed to reduce the burden on data controllers and the cost of legal advice when they have to assess whether it is okay to use or share data or not. Crucially, while the new ground shares its name with “legitimate interest”, it does not require the controller to make any balancing test taking the data subject’s interests into account. It just needs to meet the grounds in the list. The Bill gives the Secretary of State powers to define additional recognised legitimate interests beyond those in Annexe 1—a power heavily criticised by the Delegated Powers and Regulatory Reform Committee’s report on the Bill.

Currently, where a private body shares personal data with a public body in reliance on Article 6(1)(e) of the GDPR, it can rely on the condition that the processing is

“necessary for the performance of a task carried out in the public interest”.

New conditions in Annexe 1, as inserted by Schedule 4, would enable data sharing between the private and public sectors to occur without any reference to a public interest test. In the list of recognised legitimate interests, the most important is the ability of any public body to ask another controller, usually in the private sector, for the disclosure of personal data it needs to deliver its functions. This applies to all public bodies. The new recognised legitimate interest legal basis in Clause 70 and Schedule 4 should be dropped.

Stephen Cragg KC, giving his legal opinion on the DPDI Bill, which, as I mentioned, has the same provision, stated that this list of recognised legitimate interests

“has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned”.

The ICO has also flagged concerns about recognised legitimate interests. In its technical drafting comments on the Bill, it said:

“We think it would be helpful if the explanatory notes could explicitly state that, in all the proposed new recognised legitimate interests, an assessment of necessity involves consideration of the proportionality of the processing activity”.


An assessment of proportionality is precisely what the balancing test is there to achieve. Recognised legitimate interests undermine the fundamental rights and interests of individuals, including children, in specific circumstances.

When companies are processing data without consent, it is essential that they do the work to balance the interests of the people who are affected by that processing against their own interests. Removing recognised legitimate interests from the Bill will not stop organisations from sharing data with the public sector or using data to advance national security, detect crime or safeguard children and vulnerable people. The existing legitimate interest lawful basis is more than flexible enough for these purposes. It just requires controllers to consider and respect people’s rights as they do so.

During the scrutiny of recognised legitimate interests in the DPDI Bill—I am afraid to have to mention this—the noble Baroness, Lady Jones of Whitchurch, who is now leading on this Bill as the Minister, raised concerns about the broad nature of the objectives. She rightly said:

“There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests”.—[Official Report, 25/3/24; col. GC 106.]


She never spoke a truer word.

However, this Government have reintroduced the same extra power with no new articulation of any strong reason for needing it. The constraints placed on the Secretary of State are slightly higher in this Bill than they were in the DPDI Bill, as new paragraph (9), inserted by Clause 70(4), means that they are able to add new recognised legitimate interests only if they consider processing in that case to be necessary to safeguard an objective listed in UK GDPR Article 23(1)(c) to (j). However, this list includes catch-alls, such as

“other important objectives of general public interest”.

To give an example of what this power would allow, the DPDI Bill included a recognised legitimate interest relating to the ability of political parties to use data about citizens during election campaigns on the basis that democratic participation is an objective of general public interest. I am glad to say that this is no longer included. Another example is that a future Secretary of State could designate workplace productivity as a recognised legitimate interest—which, without a balancing test, would open the floodgates to intrusive workplace surveillance and unsustainable data-driven work intensification. That does not seem to be in line with the Government’s objectives.

Amendment 74 is rather more limited. Alongside the BMA, we are unclear about the extent of the impact of Clause 70 on the processing of health data. It is noted that the recognised legitimate interest avenue appears to be available only to data controllers that are not public authorities. Therefore, NHS organisations appear to be excluded. We would welcome confirmation that health data held by an NHS data controller is excluded from the scope of Clause 70 now and in the future, regardless of the lawful basis that is being relied on to process health data.

18:00
Should health data fall within the scope of Clause 70, this would mean that processing is no longer subject to the ICO’s balancing test, which could significantly dilute the protection of health data. It is also unclear how this might affect the rights of data subjects, who are patients, including the right to object to processing. We would like the Government to clarify how health data would be affected and how controllers of health data will be able to reassure patients that they have a valid justification for processing.
We would also welcome clarity and reassurance that there is no scope for the new recognised legitimate interest avenue to apply to the processing of identifiable health data held by non-public bodies, such as research organisations or any controller of health data. In our view and that of the BMA, it would represent a dilution of the protections for health data should it be deemed that data controllers no longer need to justify why they are processing data.
As regards the various stand part notices and Amendments 83 and 90, the Bill also introduces several other clauses that would allow the Secretary of State to override primary legislation and modify key aspects of UK data protection law via statutory instrument, mostly inherited from the previous Government’s DPDI Bill. These include powers to introduce exemptions to the purpose-limitation principle, known as the list of compatible purposes, as in Clauses 70(4) and 71(5), which give broad powers to the Secretary of State to amend the UK GDPR lawfulness of processing provisions and purpose limitation provisions, respectively; to add or remove categories of data from the definition of what constitutes “special categories data”, also known as sensitive data, as in Clause 74; to add or remove safeguards for the use of data for research purposes, as in Clause 85, and for the use of data for solely automated decision-making, as in Clause 80; to designate automated decisions that are exempt from the safeguards provided by new Articles 22A, 22B and 22C in Clause 80; and to authorise transfers of personal data to third countries, as in Schedule 7.
There has been concern in recent years that
“more and more extensive powers to make law have been delegated to Ministers while parliamentary control over the exercise of those powers has eroded”,
to the extent that it
“compromises the UK’s system of parliamentary democracy”.
The Attorney-General recently highlighted that
“excessive reliance on delegated powers … upsets the proper balance between Parliament and the executive. This not only strikes at the rule of law values … but also at the cardinal principles of accessibility and legal certainty”.
He has emphasised the need to reconsider
“the balance between primary and secondary legislation, which in recent years has weighed too heavily in favour of delegated powers”.
The Bill is one of the first tests of whether this balance will be struck in practice.
It is positive that the Bill has implemented some of the recommendations of the Delegated Powers and Regulatory Reform Committee to address some of the problematic use of delegated powers in the DPDI Bill. But key powers remain, and we, alongside many others, are concerned that the Bill continues to include several widely drafted delegated powers that would permit the Secretary of State to make significant changes to the data protection regime without adequate parliamentary scrutiny.
We know from the DPDI Bill that the European Commission does not like the new recognised legitimate interest legal basis. With data adequacy up for renewal in 2025, it presents a significant risk to the free flow of data. All these Secretary of State powers were identified by European Union stakeholders as a main source of concern and constitute a major threat to the continuation of the UK adequacy decision and the functioning of the EU-UK Trade and Cooperation Agreement.
Amendment 77 is a short amendment that seeks to level the playing field between the big tech integrated platforms and smaller businesses that can often bring together different capabilities in groups or chains of companies. It is one of the benefits of the internet that small businesses can communicate and collaborate to compete with their bigger competitors in this way. The issue with the language of the Bill is that it would enable data to be more freely shared within a single integrated business than through contractual protections that would operate as safeguards to prevent misuse of data by smaller business affiliates. Indeed, it could be argued that such contractual protections are stronger safeguards than would often arise within a single integrated firm.
Amendment 78 is a probing amendment. If Clause 70 does stand part of the Bill, it requires the Secretary of State to explain why existing lawful bases for data processing are inadequate for the processing of personal data when additional recognised legitimate interests are introduced. Amendment 78A is in response to the fact that paragraph 1 of new Annexe 1 to the UK GDPR inserted by Schedule 4 states that,
“the processing is necessary for the purposes of making a disclosure of personal data to another person in response to a request from the other person, and (b) the request states that the other person needs the personal data for the purposes of carrying out processing”—
essentially for its public task or in the exercise of its official authority under UK law, as stated in Article 6(1)(e). This means that any public body can ask any other controller to disclose any personal data because the data is needed for the public body’s functions. If the whole of the annexe does not go, these words are probably the most dangerous in the annexe and should be omitted.
Amendments 84, 85 and 86 are probing amendments about Annexe 2 inserted by Schedule 5. Amendment 84 seeks to clarify whether the Government intend to allow personal data processing for purposes that are commercial under the conditions described in the provision. Amendment 85 seeks to ensure that transparency and accountability obligations are not removed from data controllers when processing personal data for the purposes of safeguarding vulnerable individuals based on undefined characteristics that may change and that may apply or not apply to any given individual at any point in time. Amendment 86 seeks to clarify whether and how the conditions for processing personal data based on the vulnerability of an individual should expire when the individual’s circumstances change. Amendment 89 is a probing amendment to ensure that personal data remains personal data. I beg to move.
Baroness Kidron (CB)

I cannot compete with that tour de force. I shall speak to Amendments 73 and 75 in the name of the noble Lord, Lord Clement-Jones, to which I have added my name; to Amendments 76, 83 and 90 on the Secretary of State’s powers; and to Amendments 85 and 86, to which I wish I had added my name, but it is hard to keep up with the noble Lord. I am in sympathy with the other amendments in the group.

The issue of recognised legitimate interest has made a frequent appearance in the many briefings I have received and, despite reading the Explanatory Notes for the Bill several times, I have struggled to understand in plain English the Government’s intent and purpose. I went to the ICO website to remind myself of the definition of legitimate interest, to try to understand why recognised legitimate interest was necessary. It states:

“Legitimate interests is the most flexible lawful basis for processing, but you cannot assume it will always be the most appropriate.”


and then goes on:

“If you choose to rely on legitimate interests, you are taking on extra responsibility for considering and protecting people’s rights and interests.”

That seems to strike a balance between compelling justifications for processing and the need to consider and protect individual data rights and interests. I would be very interested to hear from the Minister why the new category of “recognised legitimate interest” is necessary. Specifically, why do the Government believe that when processing may have far-reaching consequences, such as in national security, crime prevention and safeguarding, there is no need to undertake a legitimate interest assessment? What is the justification for the ability of any public body to demand data from private companies for any purpose? I ask those questions to be precise about the context and purpose.

18:10
Sitting suspended for a Division in the House.
18:21
Baroness Kidron (CB)

I am not suggesting that there is no legitimate interest for processing personal data without consent, but the legitimate interest assessment is a check and balance that ensures oversight and reduces the risk of overreach. It is a test, not a blocker, and does not in itself prevent processing if the balancing test determines that processing should go ahead. Amendment 85 illustrates this point in relation to vulnerable users. Given that a determination that a person is at risk would have far-reaching consequences for that person, the principles of fairness and accountability demand that those making the decision must follow a due process and that those subject to the decision are aware—if not in an emergency, certainly at some point in the proceedings.

In laying Amendment 86, the noble Lord, Lord Clement-Jones, raises an important question that I am keen to hear from Ministers on: namely, what is the Government’s plan for ensuring that a designation that an individual is vulnerable is monitored and removed when it is no longer appropriate? If a company or organisation has a legitimate interest in processing someone’s data, having considered the balancing interests of data subjects, it is free to do so. I ask the Minister again to give concrete examples of circumstances in which the current legitimate interest basis is insufficient, so that we understand the problem the Government are trying to solve.

At Second Reading, the Government’s curious defence of this new measure was the idea that organisations had concerns about whether they were doing the balancing test correctly, so the new measure is there to help, but perhaps the Minister can explain what benefits accrue from introducing the new measure that could not have been better achieved by the ICO providing more concrete guidance on the balancing test. Given that the measure is focused on the provision of public interest areas, such as national security and the detection of crime, how does the creation of the recognised legitimate interest help the majority of data controllers, rather than simply serving the interests of incumbents and/or government departments by removing an important check or balance?

Amendments 76, 83 and 90 seek to curb the power of the Secretary of State to override primary legislation and to modify key aspects of UK data protection law via statutory instrument. The proposed provisions in Clauses 70, 71 and 74 put one person in control, rather than Parliament. Elon Musk’s new role in the upcoming US Administration gives him legitimacy as an incoming officeholder in the Executive, but his new role is complicated by the fact that he is also CEO and majority shareholder of X. Tech executives at OpenAI, Google, Amazon, Palantir or any other tech behemoth are not elected or bound to fulfil social goods or commitments, other than making a profit for their shareholders. They also fund many of the think tanks, reports and events in the political ecosystem, and there is a well-worn path of employment between industry, government and regulators.

No single person should be the carrier of that incredible burden. For now, Parliament is the only barrier in the increasingly confused picture of regulatory and political capture by the tech sector. We should fight to keep it that way.

Lord Kamall (Con)

My Lords, I support Amendment 74 from the noble Lords, Lord Scriven and Lord Clement-Jones, on excluding personal health data from being a recognised legitimate interest. I also support Amendment 78, which would require a statement from the Secretary of State when a new recognised legitimate interest is introduced, and Amendments 83 and 90, which would remove the powers of the Secretary of State to override primary legislation and modify data protection law via an SI. There is not much to add to what I said on the previous group, so I will not repeat all the arguments made then. In simple terms, I repeat the necessity for trust—in health, particularly for patient trust. You do not gain trust simply by defining personal health data as a legitimate interest or by overriding primary legislation on the say-so of a Secretary of State, even if it is laid as a statutory instrument.

Lord Cameron of Lochiel (Con)

My Lords, I want to ask the Minister and the noble Lord, Lord Clement-Jones, in very general terms for their views on retrospectivity. Do they believe that the changes to data protection law in the Bill are intended to apply to data already held at this time, or will the new regime apply only to personal data collected from this point forward? I ask that particularly in respect of children, from whom sensitive data has already been collected. Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold, or only to data collected going forward?

Viscount Camrose (Con)

I thank in particular the noble Lord, Lord Clement-Jones, who has clearly had his Weetabix this morning. I will comment on some of the many amendments tabled.

On Amendments 73, 75, 76, 77, 83 and 90, I agree it is concerning that the Secretary of State can amend such important legislation via secondary legislation. However, regulations made under these powers are subject to the affirmative procedure and, therefore, to parliamentary scrutiny. Since the DPDI Bill proposed the same, I have not changed my views; I remain content that this is the right level of oversight and that these changes do not need to be made via primary legislation.

As for Amendment 74, preventing personal health data from being considered a legitimate interest seems wise. It is best to err on the side of caution when it comes to sharing personal health data.

Amendment 77 poses an interesting suggestion, allowing businesses affiliated by contract to be treated in the same way as large businesses that handle data from multiple companies in a group. This would certainly be beneficial for SMEs collaborating on a larger project. However, each such business may have different data protection structures and terms of use. Therefore, while this idea certainly has merit, I am a little concerned that it may benefit from some refining to ensure that the data flows between businesses in a way to which the data subject has consented.

On Amendment 78A and Schedule 4 standing part, there are many good, legitimate interest reasons why data must be quickly shared and processed, many of which are set out in Schedule 4: for example, national security, emergencies, crimes and safeguarding. This schedule should therefore be included in the Bill to set out the details on these important areas of legitimate interest processing. Amendment 84 feels rather like the central theme of all our deliberations thus far today, so I will listen with great interest, as ever, to the Minister’s response.

I have some concerns about Amendment 85, especially the use of the word “publicly”. The information that may be processed for the purposes of safeguarding vulnerable individuals is likely to be deeply sensitive and should not be publicly available. Following on from this point, I am curious to hear the Minister’s response to Amendment 86. It certainly seems logical that provisions should be in place so that individuals can regain control of their personal data should the reason for their vulnerability be resolved. As for the remaining stand part notices in this group, I do not feel that these schedules should be removed because they set out important detail on which we will come to rely.

18:30
Baroness Jones of Whitchurch (Lab)

My Lords, when the noble Lord, Lord Clement-Jones, opened his speech he said that he hoped that noble Lords would be made of strong stuff while he worked his way through it. I have a similar request regarding my response: please bear with me. I will address these amendments slightly out of order to ensure that related issues are grouped together.

The Schedule 4 stand part notice, and Amendments 73 and 75, tabled by the noble Lord, Lord Clement-Jones, and supported by the noble Baroness, Lady Kidron, would remove the new lawful ground of “recognised legitimate interests” created by Clause 70 and Schedule 4 to the Bill. The aim of these provisions is to give data controllers greater confidence about processing personal data for specified and limited public interest objectives. Processing that is necessary and proportionate to achieve one of these objectives can take place without a person’s consent and without undertaking the legitimate interests balancing test. However, they would still have to comply with the wider requirements of data protection legislation, where relevant, ensuring that the data is processed in compliance with the other data protection principles.

I say in response to the point raised by the noble Lord, Lord Cameron, that the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively.

The activities listed include processing of data where necessary to prevent crime, safeguarding national security, protecting children or responding to emergencies. They also include situations where a public body requests that a non-public body share personal data with it to help deliver a public task that is sanctioned by law. In these circumstances, it is very important that data is shared without delay, and removal of these provisions from the Bill, as proposed by the amendment, could make that harder.

Amendment 74, tabled by the noble Lord, Lord Scriven, would prevent health data being processed as part of this new lawful ground, but this could have some unwelcome effects. For example, the new lawful ground is designed to give controllers greater confidence about reporting safeguarding concerns, but if these concerns relate to a vulnerable person’s health, they would not be able to rely on the new lawful ground to process the data and would have to identify an alternative lawful ground.

On the point made by the noble Lord, Lord Clement-Jones, about which data controllers can rely on the new lawful ground, it would not be available to public bodies such as the NHS; it is aimed at non-public bodies.

I reassure noble Lords that there are still sufficient safeguards in the wider framework. Any processing that involves special category data, such as health data, would also need to comply with the conditions and safeguards in Article 9 of the UK GDPR and Schedule 1 to the Data Protection Act 2018.

Amendment 78A, tabled by the noble Lord, Lord Clement-Jones, would remove the new lawful ground for non-public bodies or individuals to disclose personal data at the request of public bodies, where necessary, to help those bodies deliver their public interest tasks without carrying out a legitimate interest balance test. We would argue that, without it, controllers may lack certainty about the correct lawful ground to rely on when responding to such requests.

Amendment 76, also tabled by the noble Lord, Lord Clement-Jones, would remove the powers of regulations in Clause 70 that would allow the Secretary of State to keep the list of recognised legitimate interests up to date. Alternatively, the noble Lord’s Amendment 78 would require the Secretary of State to publish a statement every time he added a new processing activity to the list, setting out its purpose, which controllers it was aimed at and for how long they can use it. I reassure the noble Lord that the Government have already taken steps to tighten up these powers since the previous Bill was considered by this House.

Any new processing activities added would now also have to serve

“important objectives of … public interest”

as described in Article 23.1 of the UK GDPR and, as before, new activities could be added to the list only following consultation with the ICO and other interested parties. The Secretary of State would also have to consider the impact of any changes on people’s rights and have regard to the specific needs of children. Although these powers are likely to be used sparingly, the Government think it important that they be retained. I reassure the Committee that we will be responding to the report from the Delegated Powers Committee within the usual timeframes and we welcome its scrutiny of the Bill.

The noble Lord’s Amendment 77 seeks to make it clear that organisations should also be able to rely on Article 6.1(f) to make transfers between separate businesses affiliated by contract. The list of activities mentioned in Clause 70 is intended to be illustrative only and is drawn from the recitals to the UK GDPR. This avoids providing a very lengthy list that might be viewed as prescriptive. Article 6.1(f) of the UK GDPR is flexible. The transmission of personal data between businesses affiliated by contract may constitute a legitimate interest, like many other commercial interests. It is for the controller to determine this on a case-by-case basis.

I will now address the group of amendments tabled by the noble Lord, Lord Clement-Jones, concerning the purpose limitation principle, specifically Amendments 83 to 86. This principle limits the ways that personal data collected for one purpose can be used for another, but Clause 71 aims to provide more clarity and certainty around how it operates, including how certain exemptions apply.

Amendment 84 seeks to clarify whether the first exemption in proposed new Annexe 2 to the UK GDPR would allow personal data to be reused for commercial purposes. The conditions for using this exemption are that the requesting controller has a public task or official authority laid down in law that meets a public interest objective in Article 23.1 of the UK GDPR. As a result, I and the Government are satisfied that these situations would be for limited public interest objectives only, as set out in law.

Amendments 85 and 86 seek to introduce greater transparency around the use of safeguarding exemptions in paragraph 8 of new Annexe 2. These conditions are drawn from the Care Act 2014 and replicated in the existing condition for sensitive data processing for safeguarding purposes in the Data Protection Act 2018. I can reassure the Committee that processing cannot occur if it does not meet these conditions, including if the vulnerability of the individual no longer exists. In addition, requiring that an assessment be made and given to the data subject before the processing begins could result in safeguarding delays and would defeat the purpose of this exemption.

Amendment 83 would remove the regulation-making powers associated with this clause so that new exceptions could not be added in future. I remind noble Lords that there is already a power to create exemptions from the purpose limitation principle in the DPA 2018. This Bill simply moves the existing exemptions to a new annexe to the UK GDPR. The power is strictly limited to the public objectives listed in Article 23.1 of the UK GDPR.

I now turn to the noble Lord’s Amendment 89, which seeks to set conditions under which pseudonymised data should be treated as personal data. This is not necessary as pseudonymised data already falls within the definition of personal data under Article 4.1 of the UK GDPR. This amendment also seeks to ensure that a determination by the ICO that data is personal data applies

“at all points in that processing”.

However, the moment at which data is or becomes personal should be a determination of fact based on its identifiability to a living individual.

I turn now to Clause 74 stand part, together with Amendment 90. Noble Lords are aware that special categories of data require additional protection. Article 9 of the UK GDPR sets out an exhaustive list of what is sensitive data and outlines processing conditions. Currently, this list cannot be amended without primary legislation, which may not always be available. This leaves the Government unable to respond swiftly when new types of sensitive data are identified, including as a result of emerging technologies. The powers in Clause 74 enable the Government to respond more quickly and add new special categories of data, tailor the conditions applicable to their use and add new definitions if necessary.

Finally, I turn to the amendment tabled by the noble Lord, Lord Clement-Jones, that would remove Schedule 7 from the Bill. This schedule contains measures to create a clearer and more outcomes-focused UK international data transfers regime. As part of these reforms, this schedule includes a power for the Secretary of State to recognise new transfer mechanisms for protecting international personal data transfers. Without this, the UK would be unable to respond swiftly to emerging developments and global trends in personal data transfers. In addition, the ICO will be consulted on any new mechanisms, and they will be subject to debate in Parliament under the affirmative resolution procedure.

I hope this helps explain the Government’s intention with these clauses and that the noble Lord will feel able to withdraw his amendment.

Lord Clement-Jones (LD)

My Lords, I thank the Minister. She covered quite a lot of ground and all of us will have to read Hansard quite carefully. However, it is somewhat horrifying that, for a Bill of this size, we had about 30 seconds from the Minister on Schedule 7, which could have such a huge influence on our data adequacy when that is assessed next year. I do not think anybody has talked about international transfers at this point, least of all me in introducing these amendments. Even though it may appear that we are taking our time over this Bill, we are not fundamentally covering all its points. The importance of this Bill, which obviously escapes most Members of this House—there are just a few aficionados—is considerable and could have a far-reaching impact.

I still get Viscount Camrose vibes coming from the Minister.

Noble Lords

Oh!

Lord Clement-Jones (LD)

Perhaps I should say that this kind of enthusiasm clearly conquers all. I should thank a former Minister, the noble Lord, Lord Kamall, and I thank the noble Baroness, Lady Kidron, for her thoughtful speech, particularly in questioning the whole recognised legitimate interest issue, especially in relation to vulnerable individuals.

It all seems to be about a need for speed, whether it is the Secretary of State who has to make snappy decisions or a data controller. We are going to conquer uncertainty. We have to keep bustling along. In a way, to hell with individual data rights; needs must. I feel somewhat Canute-like, trying to hold back the tide of data that will be flowing over us. I feel quite uncomfortable with that. I think the DPRRC is likewise going to feel pretty cheesed off.

18:45
The Minister was pretty unequivocal about how she wanted to keep the powers in Clauses 70 and 71—fully supported by the noble Viscount, Lord Camrose, of course, who has not been on the road to Damascus as far as that is concerned—but the fact is that the Government are doing exactly what the previous Government did: ignoring the DPRRC, which is not a good look. I thought that we were getting new brooms, a new culture and a new approach to all this, but I do not see a great deal of that.
However, there are some glimmerings here. I welcome the Minister’s assurances on Amendments 84 to 86; obviously, I will need to read Hansard in detail. Generally, I feel that there is a kind of overenthusiasm here, which the Government have adopted in line with their predecessor. This whole category of “recognised legitimate interest” is deeply unsound. We need the balancing test—the noble Baroness, Lady Kidron, explained why far better than I could, in terms of the desirability of sticking to “legitimate interest” as opposed to “recognised legitimate interest”—but, clearly, the Government are currently unpersuaded. Let us hope that, at least as far as the Secretary of State’s powers are concerned, the Government will think again and address the DPRRC’s concerns, which they have not done so far.
In the meantime, I beg leave to withdraw my Amendment 73.
Amendment 73 withdrawn.
Amendments 74 to 77 not moved.
Clause 70 agreed.
Amendment 78 not moved.
Schedule 4: Lawfulness of processing: recognised legitimate interests
Amendment 78A not moved.
Schedule 4 agreed.
Clause 71: The purpose limitation
Amendments 79 to 81 not moved.
Amendment 82
Moved by
82: Clause 71, page 81, line 14, at end insert—
“4A. Where the controller collected the personal data based on Article 6(1)(a) (data subject’s consent), processing for a new purpose is not compatible with the original purpose if—
(a) the data subject is a child,
(b) the processing is based on consent given or authorised by the holder of parental responsibility over the child,
(c) the data subject is an adult to whom either (a) or (b) applied at the time of the consent collection, or
(d) the data subject is a deceased child.”
Member’s explanatory statement
This amendment seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A.
Lord Clement-Jones (LD)

My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.

Additional safeguards are required for the protection of children’s data. This amendment

“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.

The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.

For most children’s data processing, adults give permission on their behalf. The extension of this for additional purposes may be incompatible with what a data subject later wishes as an adult. The only protection they may have is purpose limitation to ensure that they are reconsented or informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” at the time of collection, which this change does away with.

There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should data subjects’ current rights be undermined as a child and, through this change, never be able to be reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation that may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms more with the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that data rights have been given away by their parents on their behalf.

Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.

The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:

“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.


As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.


During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.

The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, concerns remain, despite the strategy statement and the amendment to the original Online Safety Bill, that smaller platforms with significant potential for harm might not be sufficiently covered under the category 1 designation. Overall, while the final approach allows some flexibility, there remains considerable debate about whether Ofcom will place enough emphasis in its categorisation on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be rigorously addressing these issues. I beg to move.

Baroness Kidron (CB)

My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.

Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.

This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.

First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling to both the Government and the regulator, arguing that the proposed age of adulthood in the AADC should be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cited the UNCRC that defines a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.

In contrast, on several other issues both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about the scope and form of the coroner provisions were initially diluted and had to be refought for a second time by bereaved parents. Promises of a mixed economy, age-assurance requirements, and a focus on contact harm and on features and functionalities as well as content are among the other ministerial commitments that reflected Parliament’s intention but do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.

Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s rights to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used, and narrow definitions of what must and may not be taken down.

Tech is various, its contexts infinite, its rate of change giddy and the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be, what the principles are, and not what the process is. My argument for this amendment is that we need to fix our intention that in the Bill children have an established set of needs according to their evolving capacity. Similarly, they have a right to a higher bar of privacy, so that both these principles become unavoidable.

19:00
Amendment 135 is similar but here the duty to consider children’s needs and rights applies to the ICO rather than the controllers and processors. Proposed paragraphs (e) to (g) in Amendment 135 mirror the provisions in paragraphs (a) to (c) of proposed subsection (1) in Amendment 88, while its proposed subsection (2) once again puts into the Bill the fact that a child is a person under 18. I suspect that the Minister may say that Amendment 135 is not needed because the Government have already proposed a duty relating to children, but the wording in Clause 90 is inadequate. It places a duty on the ICO to have regard to
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”.
This carries no requirement for the ICO to determine the contribution of product and service design, or how default settings contribute to children’s privacy, as opposed to, for example, merely providing information.
Clause 90 also does not address children’s rights or their different needs at different ages and stages of development. I anticipate that the argument from the Government is that this wording reflects the wording of recital 38, and they are simply absorbing it into the Bill. If that is the case, I should like to understand why the Government chose to transcribe only a part of the recital’s text and deliberately omitted the most critical part, namely, the first 10 words:
“Children merit specific protection with regard to their personal data.”
Adding Amendment 135 to the Bill would give both instruction and mandate to the ICO to fulfil its duties to children.
I turn to Amendment 94 in the name of the noble Lord, Lord Clement-Jones. My preference would be for the Government to drop the proposal altogether. However, at the very least, children should not be included in its provisions. It is well evidenced that data subjects rarely read notices about how their data is processed; I point out again that products and services should be private and safe by design and default, rather than relying on information. None the less, while children and parents may not read them, civil society organisations and regulators do.
Earlier this year, Steve Wood, previously deputy commissioner at the ICO, wrote a report on the impact of regulation on children’s digital lives. He was looking at the AADC, the OSA and the DSA, among others, and, as he commenced his work, he wrote to 50 companies that make the products most popular with children. Only eight responded, and not one answered his questions comprehensively. Ultimately, the greatest source of information was the written notices, which he and others trawled through to establish changes in terms across different products and features in multiple jurisdictions. Doing away with information notices makes it easier for online services to hide poor practice and harder for those of us working for safer digital products and services for children to scrutinise them. I ask the Minister: how will the information currently provided in such notices be made available if the Government choose to ignore the noble Lord’s amendments?
I will also speak briefly to Amendments 82 and 196 in the name of the noble Lord, Lord Clement-Jones. He set out in great detail in his speech that removing children from the new provisions that lessen the impact of purpose limitation is an excellent example of circumstances where the principle that children merit specific protection should, in fact, lead to a practical higher level of protection. Again, I would prefer that the Government had dropped purpose limitation provisions altogether but if they are determined to press ahead, we should at least maintain existing standards for children.
Finally, all I can say of Amendment 196, from over a decade of work on online safety, is that small is not safe.
Lord Russell of Liverpool (CB)

My Lords, I put my name to the amendments from the noble Baroness, Lady Kidron, and will briefly support them. I state my interest as a governor of Coram, the children’s charity. One gets a strong sense of déjà vu with this Bill. It takes me back to the Online Safety Bill and the Victims and Prisoners Bill, where we spent an inordinate amount of time trying to persuade the Government that children are children and need to be treated as children, not as adults. That was hard work. They have an absolute right to be protected and to be treated differently.

I ask the Minister to spend some time, particularly when her cold is better, with some of her colleagues whom we worked alongside during the passage of those Bills in trying to persuade the then Government of the importance of children being specifically recognised and having specific safeguards. If she has time to talk to the noble Lords, Lord Ponsonby, Lord Stevenson and Lord Knight, and the noble Baroness, Lady Thornton —when she comes out of hospital, which I hope will be soon—she will have chapter, book and verse about the arguments we used, which I hope we will not have to rehearse yet again in the passage of this Bill. I ask her please to take the time to learn from that.

As the noble Baroness said, what is fundamental is not what is hinted at or implied at the Dispatch Box, but what is actually in the Bill. When it is in the Bill, you cannot wriggle out of it—it is clearly there, stating what it is there for, and it is not open to clever legal interpretation. In a sense, we are trying to future-proof the Bill by, importantly, as she said, focusing on outcomes. If you do so, you are much nearer to future-proofing than if you focus on processes, which by their very nature will be out of date by the time you have managed to understand what they are there to do.

Amendment 135 is important because the current so-called safeguard for the Information Commissioner to look after the interests of children is woefully inadequate. One proposed new section in Clause 90 talks of

“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”.

It is not just children; most adults do not have a clue about any of that, so to expect children to have even the remotest idea is just a non-starter. To add insult to injury, that new section begins

“the Commissioner must have regard to such of the following”—

of which the part about children is one—

“as appear to the Commissioner to be relevant in the circumstances”.

That is about as vague and weaselly as it is possible to imagine. It is not adequate in any way, shape or form.

In all conscience, I hope that will be looked at very carefully. The idea that the commissioner might in certain circumstances deem that the status and importance of children is not relevant is staggering. I cannot imagine a circumstance in which that would be the case. Again, what is in the Bill really matters.

On Amendment 94, not exempting the provision of information regarding the processing of children’s data is self-evidently extremely important. On Amendment 82, ring-fencing children’s data from being used by a controller for a different purpose again seems a no-brainer.

Amendment 196, as the noble Lord, Lord Clement-Jones, says, is a probing amendment. It seems eminently sensible when creating Acts of Parliament that in some senses overlap, particularly in the digital and online world, that the left hand should know what the right hand is doing and how two Acts may be having an effect on one another, perhaps not in ways that had been understood or foreseen when the legislation was put forward. We are looking for consistency, clarity, future-proofing and a concentration on outcomes, not processes. First and foremost, we are looking for the recognition, which we fought for so hard and finally got, that children are children and need to be recognised and treated as children.

Lord Stevenson of Balmacara (Lab)

My Lords, I think we sometimes forget, because the results are often so spectacular, the hard work that has had to happen over the years to get us to where we are, particularly in relation to the Online Safety Act. It is well exemplified by the previous speaker. He put his finger on the right spot in saying that we all owe considerable respect to the noble Baroness, Lady Kidron, and others for their work. I helped a little along the way. It is extraordinary to feel that so much of this could be washed away if the Bill goes forward in its present form. I give notice that I intend to work with my colleagues on this issue because this Bill is in serious need of revision. These amendments are part of that and may need to be amplified in later stages.

I managed to sign only two of the amendments in this group. I am sorry that I did not sign the others, because they are also important. I apologise to the noble Lord, Lord Clement-Jones, for not spotting them early enough to be able to do so. I will speak to the ones I have signed, Amendments 88 and 135. I hope that the Minister will give us some hope that we will be able to see some movement on this.

The noble Lord, Lord Russell, mentioned the way in which the wording on page 113 seems not only to miss the point but to devalue the possibility of seeing protections for children well placed in the legislation. New Clause 120B(e), which talks of

“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”,

almost says it all for me. I do not understand how that could possibly have got through the process by which this came forward, but it seems to speak to a lack of communication between parts of government that I hoped this new Government, with their energy, would have been able to overcome. It speaks to the fact that we need to keep an eye on both sides of the equation: what is happening in the online safety world and how data that is under the control of others, not necessarily those same companies, will be processed in support or otherwise of those who might wish to behave in an improper or illegal way towards children.

At the very least, what is in these amendments needs to be brought into the Bill. In fact, other additions may need to be made. I shall certainly keep my eye on it.

Viscount Camrose (Con)

My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for bringing forward amendments in what is a profoundly important group. For all that data is a cornerstone of innovation and development, as we have often argued in this Committee, we cannot lose sight of our responsibility to safeguard the rights and welfare of our children.

19:15
On that basis, I absolutely welcome Amendments 82 and 94. Children’s unique vulnerabilities demand special consideration. Their personal data, whether collected through educational platforms, social media or health applications, requires the most stringent protections. It is clearly both our moral and legislative obligation to ensure that this data is used responsibly and ethically, without compromising their privacy or exposing them to harm. Moreover, by extending these protections beyond childhood, this amendment recognises that the consequences of data collection during childhood can stretch far into adulthood. This is an acknowledgment of the fact that privacy is a lifelong right; the data collected in our formative years should not be used in ways that could undermine our dignity or well-being later in life.
I also welcome Amendments 88 and 135, which underscore our collective responsibility to ensure that the personal data of children is treated with the highest level of care and respect. They would strengthen the existing frameworks of data protection. In today’s increasingly connected world, where personal data is crucial to the functioning of online services, we must recognise that, due to their vulnerability and developmental needs, children require special protection for their personal data.
This principle of prioritising the best interests of the child is enshrined in the UN Convention on the Rights of the Child, a treaty that I believe has been ratified by every nation apart from the United States and which underscores the importance of protecting children’s rights in all areas, including their privacy and personal data. The UNCRC emphasises that, in all matters affecting children, their best interests must be a primary consideration. This principle is essential in the digital environment, where children may be exposed to risks such as data exploitation, manipulation or even harm through targeted marketing.
Further, this age-appropriate and developmentally appropriate approach to data protection is crucial. Children at different ages have different needs and abilities in understanding the consequences of data collection. This higher standard of protection is not just a legal obligation; it is a moral imperative. It is a commitment to ensuring that, as children grow up in an increasingly connected world, their privacy, safety and rights are respected and upheld.
The noble Lord, Lord Clement-Jones, was absolutely right to bring Amendment 196 forward to explore the occasionally complex interaction between this Bill and the Online Safety Act. Today, he presents us with a probing amendment that seeks to bring attention to a crucial issue: how the provisions in Clause 122 of this Bill align and interact with the provisions concerning category 1 services under the Online Safety Act.
The provisions in Chapter 2 of the Online Safety Act have already made headlines, primarily for their stringent requirements on the largest and most influential online platforms, such as social media giants and search engines. These services, which cater to millions—even billions—of users globally, are being tasked with the profound responsibility of protecting vulnerable users, particularly children, from harmful online content. The rationale for these obligations is clear: with the reach and power that these platforms have, they also bear a substantial duty to mitigate risks, including online abuse, exploitation and exposure to harmful content. As a result of the Act, these platforms must carry out comprehensive risk assessments for users, especially children, and take proactive steps to protect them.
This Bill seeks to ensure that, when a child’s death is suspected to be linked to their activity on a regulated online service, service providers are required to retain relevant data; this would allow Ofcom to oversee the retention of this data, ensuring that it is preserved for future investigations. The Bill allows Ofcom to require the retention of information when an investigation into the tragic death of a child is under way. In such circumstances, the Government can compel service providers to retain data related to the deceased child’s online activity, ensuring that it is not deleted in the course of regular operations.
The OSA and this Bill’s provisions must work in harmony, ensuring that category 1 services can fulfil their safety duties without infringing on privacy rights. This is why it is crucial that platforms can securely store data needed for investigations while maintaining safeguards that prevent unnecessary or excessive data collection. Any data retention required under this Bill must be only for as long as is necessary for an investigation and in a manner that does not violate data protection laws.
In this regard, I stress that we need clarity. The amendment calls for an explicit report from the Secretary of State to Parliament to ensure that both the OSA’s and this Bill’s provisions are aligned and do not create legal conflicts, particularly in data retention and privacy. Our goal should always be to create an online environment where children are protected from harm, but we must do so in a way that does not compromise their right to privacy or the integrity of data protection laws. It is a fine balance, and one that we must continue to examine and refine. The Government providing a report to Parliament on the interaction between these provisions will be a crucial step in ensuring that this balance is struck effectively.
Baroness Jones of Whitchurch (Lab)

I thank all noble Lords who have raised this important topic. I say at the outset that I appreciate and pay tribute to those who have worked on this for many years—in particular the noble Baroness, Lady Kidron, who has been a fantastic champion of these issues.

I also reassure noble Lords that these provisions are intended to build upon, and certainly not to undermine, the rights of children as they have previously been defined. We share noble Lords’ commitment to ensuring high standards of protection for children. That is why I am glad that the Bill, together with existing data protection principles, already provides robust protections for children. I hope that my response to these amendments shows that we take these issues seriously. The ICO also recognises in its guidance, following the UN Committee on the Rights of the Child, that the duties and responsibilities to respect the rights of children extend in practice to private actors and business enterprises.

Amendment 82, moved by the noble Lord, Lord Clement-Jones, would exclude children’s personal data from the exemptions to the purpose limitation principles in Schedule 5 to the Bill. The new purposes are for important public interests only, such as safeguarding vulnerable individuals or children. Broader existing safeguards in the data protection framework, such as the fairness and lawfulness principles, also apply. Prohibiting a change of purpose in processing could impede important activities, such as the safeguarding issues to which I have referred.

Amendment 88, tabled by the noble Baroness, Lady Kidron, would introduce a new duty requiring all data controllers to consider that children are entitled to higher protection than adults. We understand the noble Baroness’s intentions and, in many ways, share her aims, but we would prefer to focus on improving compliance with the current legislation, including through the way the ICO discharges its regulatory functions.

In addition, the proposed duty could have some unwelcome and unintended effects. For example, it could lead to questions about why other vulnerable people are not entitled to enhanced protections. It would also apply to organisations of all sizes, including micro-businesses and voluntary sector organisations, even if they process children’s data on only a small scale. It could also cause confusion about what they would need to do to verify age to comply with the new duty.

Amendment 94, also tabled by the noble Baroness, would ensure that the new notification exemptions under Article 13 would not apply to children. However, removing children’s data from this exemption could mean that some important research—for example, on the causes of childhood diseases—could not be undertaken if the data controller were unable to contact the individuals about the intended processing activity.

Amendment 135 would place new duties on the ICO to uphold the rights of children. The ICO’s new strategic framework, introduced by the Bill, has been carefully structured to achieve a similar effect. Its principal objective requires the regulator to

“secure an appropriate level of protection for personal data”.

This gives flexibility and nuance: the appropriate level of protection is not always the same for all data subjects, all the time.

Going beyond this, though, the strategic framework includes the new duty relating to children. This acknowledges that, as the noble Baroness, Lady Kidron, said, children may be less aware of the risks and consequences associated with the processing of their data, as well as of their rights. As she pointed out, this is drawn from recital 38 to the UK GDPR, but the Government’s view is that the Bill’s language gives sufficient effect to the recital. We recognise the importance of clarity on this issue and hope that we have achieved it but, obviously, we are happy to talk further to the noble Baroness on this matter.

This duty will also be a consideration for the ICO and one to which the commissioner must have regard across all data protection activities, where relevant. It will inform the regulator’s thinking on everything from enforcement to guidance, including how work might need to be tailored to suit children at all stages of childhood in order to ensure that the levels of protection are appropriate.

Finally, regarding Amendment 196—

Baroness Kidron (CB)

I thank the Minister for giving way. I would like her to explain why only half of the recital is in the Bill and why the fact that children merit specific protection is not in the Bill. How can it possibly be that, in this Bill, we are giving children adequate protection? I can disagree with some of the other things that she said, but I would like her to answer that specific question.

Baroness Jones of Whitchurch (Lab)

To be on the safe side, I will write to the noble Baroness. We feel that other bits in the provisions of the Bill cover the other aspects but, just to be clear on it, I will write to her. On Amendment 196 and the Online Safety Act—

Lord Stevenson of Balmacara (Lab)

I am sorry to interrupt but I am slightly puzzled by the way in which that exchange just happened. I take it from what the Minister is saying that there is no dissent, in her and the Bill team’s thinking, about children’s rights having to be given the correct priority, but she feels that the current drafting is better than what is now proposed because it does not deflect from the broader issues that she has adhered to. She has fallen into the trap, which I thought she never would do, of blaming unintended consequences; I am sure that she will want to rethink that before she comes back to the Dispatch Box.

Surely the point being made here is about the absolute need to make sure that children’s rights never get taken down because of the consideration of other requirements. They are on their own, separate and not to be mixed up with those considerations that are truly right for the commissioner—and the ICO, in its new form—to take but which should never deflect from the way children are protected. If the Minister agrees with that, could she not see some way of reaching out to be a bit closer to where the noble Baroness, Lady Kidron, is?

Baroness Jones of Whitchurch (Lab)

I absolutely recognise the importance of the issues being raised here, which is why I think I really should write: I want to make sure that whatever I say is properly recorded and that we can all go on to debate it further. I am not trying to duck the issue; this issue is just too important for me to give an off-the-cuff response on it. I am sure that we will have further discussions on this. As I say, let me put it in writing, and we can pick that up. Certainly, as I said at the beginning, our intention was to enhance children’s protection rather than deflect from it.

Moving on to Amendment 196, I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising this important issue and seeking clarity on how the provision relates to the categorisation of services in the Online Safety Act. These categories are, however, not directly related to Clause 122 of this Bill, as a data preservation notice can be issued to any service provider regulated under the Online Safety Act, regardless of categorisation. A list of the relevant persons is provided in paragraphs (a) to (e) of Section 100(5) of the Act; it includes any user-to-user service, search service and ancillary service.

I absolutely understand noble Lords saying that these things should cross-reference in some way but, as far as we are concerned, they complement each other, and that protection is currently in the Online Safety Act. As I said, I will write to noble Lords and am happy to meet if that would be helpful. In the meantime, I hope that the explanations I have given are sufficient grounds for noble Lords not to press their amendments at this stage.

19:30
Lord Clement-Jones (LD)

I thank the Minister for her response. I should say at the outset that, although I may have led the group, it is clear that the noble Baroness, Lady Kidron, leads the pack as far as this is concerned. I know that she wants me to say that the noble Baroness, Lady Harding, wished to say that she was extremely sorry not to be able to attend as she wanted to associate herself wholeheartedly with these amendments. She said, “It’s so disappointing still to be fighting for children’s data to have higher protection but it seems that that’s our lot!” I think she anticipated the response, sadly. I very much thank the noble Baroness, Lady Kidron, the noble Lords, Lord Russell and Lord Stevenson, and the noble Viscount, Lord Camrose, in particular for his thoughtful response to Amendment 196.

I was very interested in the intervention from the noble Lord, Lord Stevenson, and wrote down “Not invented here” to sum up the Government’s response to some of these amendments, which has been consistently underwhelming throughout the debates on the DPDI Bill and this Bill. They have brought out such things as “the unintended effects” and said, “We don’t want to interfere with the ICO”, and so on. This campaign will continue; it is really important. Obviously, we will read carefully what the Minister said but, given the troops behind me, I think the campaign will only get stronger.

The Minister did not really deal with the substance of Amendment 196, which was not just a cunning ploy to connect the Bill with the Online Safety Act; it was about current intentions on categorisation. There is considerable concern that the current category 1 threshold is over-conservative and that we are not covering the smaller, unsafe social media platforms. When we discussed the Online Safety Bill, both in the Joint Committee and in the debates on subsequent stages of the Bill, it was clear that this was about risk, not just size, and we wanted to cover those risky, smaller platforms as well. While I appreciate the Government’s strategic statement, which made its position pretty clear, and without wishing to overly terrorise Ofcom, we should make our view on categorisation equally clear, and the Government should do likewise.

This argument and debate will no doubt continue. In the meantime, I beg leave to withdraw my amendment.

Amendment 82 withdrawn.
Amendment 83 not moved.
Clause 71 agreed.
Schedule 5: Purpose limitation: processing to be treated as compatible with original purpose
Amendments 84 to 86 not moved.
Schedule 5 agreed.
Clause 72 agreed.
Amendment 87
Moved by
87: After Clause 72, insert the following new Clause—
“Application of the European Convention on Human Rights to the processing of personal data by private bodies
(1) Where personal data is processed by any private body not subject to the obligations under the European Convention on Human Rights as enacted by the Human Rights Act 1998, that private body is to be treated as subject to the obligations under the Convention as if it were a public authority and must ensure that such processing is not incompatible with a Convention right.
(2) If a private body fails to ensure that the processing of personal data is in accordance with subsection (1), the private body is liable to any person whose rights under the Convention are infringed as if it were a public authority.”
Member's explanatory statement
This is a probing amendment to ensure for the purpose of equivalence that the processing of personal data by private bodies is subject to the ECHR on the same basis as public bodies.
Lord Thomas of Cwmgiedd (CB)

My Lords, although it is a late hour, I want to make two or three points. I hope that I will be able to finish what I wish to say relatively quickly. It is important that in looking at the whole of this Bill we keep in mind two things. One is equivalence, and the other is the importance of the rights in the Bill and its protections being anchored in something ordinary people can understand. Unfortunately, I could not be here on the first day but, having sat through most of today, I deeply worry about the unintelligibility of this whole legislative package. We are stuck with it for now, but I sincerely hope that this is the last Civil Service-produced Bill of this kind. We need radical new thinking, and I shall try to explore that when we look at automated decision-making—again, a bit that is far too complicated.

Amendment 87 specifically relates to equivalence, and I want to touch on Amendment 125. There is in what I intend to suggest a fix to the problem, if it really exists, that will also have the benefit of underpinning this legislation with rights that people understand and that are applicable not merely to the state but to private companies. The problem that seems to have arisen—there are byproducts of Brexit that from time to time surface—is the whole history of the way in which we left the European Community. We left initially under the withdrawal Act, leaving retained EU law. No doubt many of us remember the debates that took place. The then Government were wholly opposed to keeping the charter. In respect of the protection of people’s data being processed, that is probably acceptable on the basis that the rights of the charter had merged into ordinary retained EU law through the decisions of the Court of Justice of the European Union. All was relatively well until the Retained EU Law (Revocation and Reform) Act, which deleted most general retained EU law principles, including fundamental rights, from the UK statute book. What then happened, as I understand it, was that a fix to this problem was attempted by the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023, which tidied up the UK GDPR by making clear that any references to fundamental rights and freedoms were regarded as references to convention rights within the meaning of the Human Rights Act.

For good and understandable reasons, the Human Rights Act applies to public authorities and in very limited circumstances to private bodies, but not as a whole. That is accepted generally and certainly is accepted in the human rights memorandum in respect of this Bill. The difficulty with the Bill, therefore, is that the protections under the Human Rights Act apply only to public authorities and not to private bodies, whereas, generally speaking, the Charter of Fundamental Rights operated horizontally as well, protecting the processing or use of data by private companies.

This seems to cause two problems. First, it is critical that there is no doubt about this, and I look forward to hearing what the Minister has to say as to the view of the Government’s legal advisers as to whether there is a doubt. Secondly, the amendment goes to the second of the two objectives which we are trying to achieve, which is to instil an understanding of the principles so that the ordinary member of the public can have trust. I defy anyone, even the experts who drafted this, to think that this is intelligible to any ordinary human being. It is simply not. I am sorry to be so rude about it, but this is the epitome of legislation that is, because of its sheer complexity, impossible to understand.

Of course, it could be made a lot better by a short series of principles introduced in the Bill, the kind of thing we have been talking about at times today, with a short, introductory summary of what the rights are under the Bill. I hope consideration can be given to that, but that is not the purpose of my amendment. What I suggest as a fix to this—to both the point of dealing with rights in a way that people can understand and the point on equivalence—is a very simple application, for the purposes of data processing, of the rights and remedies under the Human Rights Act, extending them to private bodies. One could therefore properly point, in going through the way that the Bill operates, to fundamental rights that people understand which are applicable, not merely if a public authority is processing the data but to the processing of data by private bodies. That is what I wanted to say about Amendment 87.

I also wanted to add a word of support, because it is closely allied to this on the equivalence point, for the amendment in the name of the noble Lord, Lord Clement-Jones, to whom I am grateful for his support of Amendment 87. That relates to the need to have a thorough review of equivalence. Obviously, negotiations will take place, but it really is important that thorough attention is given to the adequacy of our legislation to ensure that there is no incompatibility with the EU regime, so that we do not lose adequacy. Those are the two amendments to which I wished to speak in this group. There are two reasons why I feel it would be wrong for me to go on and deal with the others. Some are very narrow and some very broad, and it is probably easiest to listen to those who are speaking to those amendments in due course. On that basis, therefore, I beg to move.

Baroness Kidron (CB)

My Lords, I will speak to Amendments 139, 140 and 109A—which was a bit of a late entry this morning—in my name. I express my thanks to those who have co-signed them.

19:45
I laid these amendments during the passage of the DPDI Bill; the very detailed argument for them can be found at column 89GC of volume 837 of Hansard. In summary, although Ministers of both this Government and the previous one argue for extending the rights of commercial companies and government to our data, I would like to interest them in extending the power of individuals and collectives to use data to enhance their lives or, indeed, the lives of their communities.
The last couple of decades have shown that the real power of tech lies in holding either vast swathes of general data, such as those used by LLMs, or large groups of specialist data, such as medical scans. In short, the value—I mean the value to society as well as the financial value—lies in bringing data together. These amendments would allow individuals voluntarily to create specialist or big datasets for specific purposes, whether as a group of sole traders working for the same company or as parents assessing exam boards; they would enable the elderly to negotiate cheaper house insurance because they are at home a lot, or enable gig workers to check that they are not being exploited. The possibilities of sharing data in communities are infinite.
The notion of data fiduciaries, data trusts or data unions—all of those are labels for this sort of thinking—where you can place your data for collective benefit is not new. There are many expert proponents of them, particularly in the US. Interestingly, though, the US does not have the regulatory structure to pull this off. We do, but our data law is complex. The complicated waterfall of concepts amply illustrated by the noble and learned Lord, Lord Thomas, eludes most non-experts, which is why the amendments in this group would give UK citizens access to data experts for matters that concern them deeply.
Amendment 139 would require the ICO to set out a code of conduct for data communities, including guidance on establishing, operating and joining a data community, as well as guidance for data controllers and data processors on responding to requests made by data communities. Amendment 140 would require the ICO to keep a register of data communities, to make it publicly available and to ensure proper oversight. Amendment 109A would simply create
“a mechanism for data subjects to assign their data rights to be … asserted collectively”.
Together, these amendments would provide a mechanism for non-experts—that is, any UK citizen—to assign their data rights to a community run by representatives, which would benefit the entire group.
When I introduced these amendments to the DPDI Bill, I explained that they were based on work done by a colleague at Oxford, Dr Reuben Binns, in association with Worker Info Exchange. Using the data of individual Uber drivers, they created a pool of data from several hundred drivers, allowing him to see how the algorithm reacts to those who refuse a poorly paid job; how it assigns lucrative airport runs; whether where you start impacts on your daily earnings; whether those who work short hours are given less lucrative jobs; and so on. This project continues—full explanation of it can be found in Hansard—but it was made possible by a bespoke arrangement between the ICO, Uber and the researchers that, if it were routine, would provide opportunities for challenger businesses, community groups and research projects.
What better example could be found of how empowering collective data control can be than its use here, in giving self-employed gig workers—those with the fewest protections in our economy and the greatest need—the power of knowledge? Indeed, it is instructive to look at the example of the Rodeo app, a British-built tech start-up that has taken the idea of giving gig workers control through data and applied it to delivery workers. However, in the current legal environment, it is very difficult to access the data that individual workers would like to hand over for the common and collective good. This is technically possible but, in practice, it is hard and often subject to well-resourced legal challenge. The cost of the regulatory response makes gathering data one person at a time onerous. If these amendments were in place, it would be routine, contractual, time limited and subject to a code of conduct. The opportunity for citizens’ empowerment is immense, but so too is the opportunity for economic growth.
Amendment 139 makes specific reference to the ICO setting out what constitutes “good practice” in a data community. If I were rewriting it, I would include a community operator’s fiduciary duty to members of the community and express provisions that prevent exploitation by entrenched platforms. However, the amendment is intended as a starter for 10 rather than the finished article.
This Bill is short on vision. I am arguing here for a more open and innovative approach that benefits the citizen rather than, again, simply transferring more power to the incumbents. I hope that the Government are feeling ambitious and do not want to get
“comfortable in the tepid bath”
of which they speak. I hope that they are willing to tackle the asymmetric and disempowering status quo for data subjects and will instead find a way to support these amendments, however drafted, to help communities make social and economic goods from their data.
Viscount Camrose (Con)

I start by speaking to two amendments tabled in my name.

Amendment 91 seeks to change

“the definition of request by data subjects to data controllers”

that can be declined or

“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.

I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.

Amendment 97 would ensure that

“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.

If a subject does not even know that their data is being held, they cannot enforce their data rights.

Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.

Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have

“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.

I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.

I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.

Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.

Lord Clement-Jones (LD)

My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.

Lord Leong (Lab)

That is a sensible suggestion.

Debate on Amendment 87 adjourned.
Committee adjourned at 7.56 pm.
Committee (3rd Day)
15:45
Relevant documents: 3rd Report from the Constitution Committee and 9th Report from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
The Deputy Chairman of Committees (Lord Haskel) (Lab)

My Lords, if there is a Division in the Chamber while we are sitting, the Committee will adjourn as soon as the Division Bells are rung and resume after 10 minutes.

Debate on Amendment 87 resumed.
Lord Clement-Jones (LD)

My Lords, in carrying on with this group, I will speak to the question whether Clause 78 should stand part, and to Amendments 107, 109, 125, 154, 155 and 156, but to start I support Amendment 87 in the name of the noble and learned Lord, Lord Thomas of Cwmgiedd. We had a masterclass from him last Tuesday and he made an extremely good case for that amendment, which is very elegant.

The previous Government deleted the EU Charter of Fundamental Rights from the statute book through the Retained EU Law (Revocation and Reform) Act 2023, and this Bill does nothing to restore it. Although references in the UK GDPR to fundamental rights and freedoms are now to be read as references to the ECHR as implemented through the Human Rights Act 1998, the Government’s ECHR memorandum states:

“Where processing is conducted by a private body, that processing will not usually engage convention rights”.


As the noble and learned Lord mentioned, this could leave a significant gap in protection for individuals whose data is processed by private organisations and will mean lower data protection rights in the UK compared with the EU, so these Benches strongly support his Amendment 87, which would apply the convention to private bodies where personal data is concerned. I am afraid we do not support Amendments 91 and 97 from the noble Viscount, Lord Camrose, which seem to hanker after the mercifully defunct DPDI.

We strongly support Amendments 139 and 140 from the noble Baroness, Lady Kidron. Data communities are one of the important omissions from the Bill. Where are the provisions that should be there to support data-sharing communities and initiatives such as Solid? We have been talking about data trusts and data communities since as long ago as the Hall-Pesenti review. Indeed, it is interesting that the Minister herself only this April said in Grand Committee:

“This seems to be an area in which the ICO could take a lead in clarifying rights and set standards”.


Indeed, she put forward an amendment:

“Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted. The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue”.—[Official Report, 17/4/24; col. GC 322.]


I very much hope that, now the tables are turned, so to speak, the Minister will take that forward herself in government.

Amendments 154, 155 and 156 deal with the removal of the principle of the supremacy of EU law. They are designed to undo the lowering of the standard of data protection rights in the UK brought about by the REUL Act 2023. The amendments would apply the protections required in Article 23.2 of the UK GDPR to all the relevant exceptions in Schedules 2 to 4 to the Data Protection Act 2018. This is important because data adequacy will be lost if the standard of protection of personal data in the UK is no longer essentially equivalent to that in the EU.

The EU’s adequacy decision stated that it did not apply in the area of immigration and referred to the case of Open Rights Group v the Secretary of State for the Home Department in the Court of Appeal. This case was brought after the UK left the EU, but before the REULA came into effect. The case is an example of how the preservation of the principle of the supremacy of EU law continued to guarantee high data protection standards in the UK, before this principle was deleted from the statute book by the REULA. In broad terms, the Court of Appeal found that the immigration exception in Schedule 2 to the Data Protection Act 2018 conflicted with the safeguards in Article 23 of the UK GDPR. This was because the immigration exemption was drafted too broadly and failed to incorporate the safeguards prescribed for exemptions under Article 23.2 of the UK GDPR. It was therefore held to be unlawful and was disapplied.

The Home Office redrafted the exemption to make it more protective, but it took several attempts to bring forward legislation which provided sufficient safeguards for data subjects. The extent of the safeguards now set out in the immigration exemption underscores both what is needed for compatibility with Article 23.2 of the UK GDPR and the deficiencies in the rest of the Schedule 2 exemptions. It is clear when reading the judgment in the Open Rights case that the majority of the exemptions from data subject rights under Schedule 2 to the Data Protection Act fail to meet the standards set out in Article 23.2 to the UK GDPR. The deletion of the principle of the supremacy of EU law has removed the possibility of another Open Rights-style challenge to the other exemptions in Schedule 2 to the Data Protection Act 2018. I hope that, ahead of the data adequacy discussions with the Commission, the Government’s lawyers have had a good look at the amendments that I have tabled, drafted by a former MoJ lawyer.

The new clause proposed after Clause 107 by Amendment 154 applies the protections added to the immigration exemption to the whole of Schedule 2 to the DPA 2018, with the exception of the exemptions that apply in the context of journalism or research, statistics and archiving. Unlike the other exemptions, they already contain detailed safeguards.

Amendment 155 is a new clause extending the protections which apply to the immigration exemption to Schedule 3 to the DPA 2018, and Amendment 156 is another new clause applying those protections to Schedule 4 to the DPA 2018.

As regards Amendment 107, the Government need to clarify how data processing under recognised legitimate interests is compatible with conditions for data processing under existing lawful bases, including the special categories of personal data under Articles 5 and 9 of the UK GDPR. The Bill lowers the standard of the protection of personal data where data controllers only have to provide personal data based on

“a reasonable and proportionate search”.

The lack of clarity on what reasonable and proportionate mean in the context of data subject requests creates legal uncertainty for data controllers and organisations, specifically regarding whether the data subject’s consideration on the matter needs to be accounted for when responding to requests. This is a probing amendment which requires the Secretary of State to explain why the existing lawful bases for data processing are inadequate for the processing of personal data when additional recognised legitimate interests are introduced. It requires the Secretary of State to publish guidance within six months of the Act’s passing to clarify what constitutes reasonable and proportionate protections of personal data.

Amendment 109 would insert a new clause, to ensure that data controllers assess the risk of collective and societal harms,

“including to equality and the environment”,

when carrying out data protection impact assessments. It requires them to consult affected people and communities while carrying out these assessments to improve their quality, and requires data controllers to publish their assessments to facilitate informed decision-making by data subjects and to enable data controllers to be held accountable.

Turning to whether Clause 78 should stand part, on top of Clause 77, Clause 78 would reduce the scope of transparency obligations and rights. Many AI systems are designed in a way that makes it difficult to retrieve personal data once ingested, or to understand how this data is being used. This is not principally due to technical limitations but to the decisions of AI developers who do not prioritise transparency and explainability.

As regards Amendment 125, it is clear that there are still further major changes proposed to the GDPR on police duties, automated decision-making and recognised legitimate interests which continue to make retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering those changes. During the passage of the Data Protection and Digital Information Bill, I tabled an amendment to require the Government to publish an assessment of the impact of the Bill on EU/UK data adequacy within six months of the Act passing; I have tabled a similar amendment, with one change, to this Bill. As the next reassessment of data adequacy is set for June 2025, a six-month timescale may come too late to influence the overall adequacy decision. We must therefore stipulate that this assessment takes place before that reassessment.

The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, I thank all noble Lords for their consideration of these clauses. First, I will address Amendment 87 tabled by the noble and learned Lord, Lord Thomas, and the noble and learned Lord—sorry, the noble Lord—Lord Clement-Jones.

Lord Clement-Jones (LD)

I will take any compliment.

Baroness Jones of Whitchurch (Lab)

We should take them while we can. Like the noble Lord, Lord Clement-Jones, I agree that the noble and learned Lord, Lord Thomas, made an excellent contribution. I appreciate this is a particularly technical area of legislation, but I hope I can reassure both noble Lords that the UK’s data protection law gives effect to convention rights and is designed to protect them. The Human Rights Act requires legislation to be interpreted compatibly with convention rights, whether processing is carried out by public or private bodies. ECHR rights are therefore a pervasive aspect of the rules that apply to public and private controllers alike. The noble and learned Lord is right that individuals generally cannot bring claims against private bodies for breaches of convention rights, but I reassure him that they can bring a claim for breaching the data protection laws giving effect to those rights.

I turn to Amendment 91, tabled by the noble Viscount, Lord Camrose, Amendment 107, tabled by the noble Lord, Lord Clement-Jones, and the question of whether Clause 78 should stand part, which all relate to data subject requests. The Government believe that transparency and the right of access are crucial. That is why they will not support a change to the language around the threshold for data subject requests, as this would undermine data subjects’ rights. Neither will the Bill change the current expectations placed on controllers. The Bill reflects the EU principle of proportionality, which has always underpinned this legislation, as well as existing domestic case law and current ICO guidance. I hope that reassures noble Lords.

Amendments 97 and 99, tabled by the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, relate to the notification exemption in Article 14 of the UK GDPR. I reassure noble Lords that the proportionality test provides an important safeguard for the existing exemption when data is collected from sources other than the data subject. The controller must always consider the impact on data subjects’ rights of not notifying. They cannot rely on the disproportionate effort exemption just because of how much data they are processing—even when there are many data subjects involved, such as there would be with web scraping. Moreover, a lawful basis is required to reuse personal data: a web scraper would still need to pass the balancing test to use the legitimate interest ground, as is usually the case.

The ICO’s recent outcomes report, published on 12 December, specifically referenced the process of web scraping. The report outlined:

“Web scraping for generative AI training is a high-risk, invisible processing activity. Where insufficient transparency measures contribute to people being unable to exercise their rights, generative AI developers are likely to struggle to pass the balancing test”.

16:00
Amendment 109 from the noble Lord, Lord Clement-Jones, would amend requirements for data protection impact assessments. The noble Lord will know that I and the Government share his concerns about the measures in the previous Government’s Data Protection and Digital Information Bill. I am therefore glad that this Bill does not include them. The existing provisions in the UK GDPR already require data controllers to carry out a data protection impact assessment when the processing is likely to result in high risks to the rights and freedoms of individuals. This would include, for example, a risk that a processing activity may give rise to discrimination. The assessment must contain, among other things, a description of safeguards to ensure protection of personal data. However, the Government would prefer to avoid requiring organisations to comply with even more rigorous requirements, such as the need to consider environmental impacts.

On EU data adequacy, I turn to Amendment 125, tabled by the noble Lord, Lord Clement-Jones. I agree with noble Lords on the need to maintain data adequacy, which is a priority for this Government. The free flow of personal data with our EU partners is vital in underpinning research and innovation and keeping people safe. For that reason, the Government are doing all that we can to support its swift renewal. I reassure noble Lords that the Bill has been designed with EU adequacy in mind. The Government have incorporated robust safeguards and changed proposals that did not serve our priorities and were of concern to the EU. It is, though, for the EU to undertake its review of the UK, which we are entering into now. On that basis, I suggest to noble Lords that we should respect that process, show discretion and not interfere while it is under way.

I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Stevenson, Lord Clement-Jones and Lord Knight, for Amendments 109A, 139 and 140, concerning data communities. The Government firmly believe that giving data subjects greater agency over their personal data is important for strengthening data subject rights and for innovation and economic growth. Smart data schemes and digital verification services are good examples of such action arising from this Bill.

I reassure noble Lords that we continue to believe that this area should be further explored. The Government are in dialogue with businesses and innovators to develop collaborative, evidence-based interventions in this area. The UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. I am happy to update noble Lords on this in due course and invite the noble Baroness to meet to discuss this area further, if she would like to do so.

I turn to Amendments 154, 155 and 156, tabled by the noble Lord, Lord Clement-Jones, relating to the exemptions in Schedules 2 to 4 to the Data Protection Act 2018. Most of those exemptions have been in use since the Data Protection Act 1998. The noble Lord refers to the immigration exemption, which was amended following a court ruling specifically about that exemption. I reassure him that there is a power in the Data Protection Act to amend the other exemptions if necessary.

Given the above reassurances, I hope noble Lords will agree not to press their amendments in this group.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The Minister said there is a power to amend, but she has not said whether she thinks that would be desirable. Is the power to be used only if we are found not to be data-adequate because the immigration exemption does not apply across the board? That is, will the power be used only if we are forced to use it?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy so that issue will not arise. I will write to him to set out in more detail when those powers would be used.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank the Minister for her offer of a meeting. I could tell from the nods of my co-signatories that that would indeed be very welcome and we would all like to come. I was interested in the quote from the ICO about scraping. I doubt the Minister has it to hand, but perhaps she could write to say what volume of enforcement action has been taken by the ICO on behalf of data rights holders against scraping on that basis.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes, it would be helpful if we could write and set that out in more detail. Obviously the ICO’s report is fairly recent, but I am sure the commissioner has considered how enforcement would follow on from it. I am sure we can write and give more details.

Lord Thomas of Cwmgiedd Portrait Lord Thomas of Cwmgiedd (CB)
- Hansard - - - Excerpts

My Lords, I thank the Minister for her response. I wish to make three points. First, the critical question is: are our laws adequate to pass the adequacy test? Normally, when you go in for a legal test, you check that your own house is in order. I am therefore slightly disappointed by the response to Amendment 125. Normally one has the full-scale medical first, rather than waiting until you are found to be ill afterwards.

Secondly, I listened to what the Minister said about my Amendment 87 and the difference between what rights are protected by the charter and the much greater limitation of the ECHR, normally simply to do with the extent to which they apply horizontally to private individuals. I will look at her answer, but at first sight it does not seem right to me that, where you have fundamental rights, you move to a second stage of rights—namely, the rights under the Data Protection Act.

Thirdly, I want to comment on the whole concept of data communities and data trusts. This is an important area, and it takes me back to what I said last time: we really need to try to reduce this legislation to principles. I am going to throw out a challenge to the very learned people behind the Minister, particularly the lawyers: can they come up with something intelligible to the people who are going to do this?

This legislation is ghastly; I am sorry to say that, but it is. It imposes huge costs on SMEs—not to say on others, but they can probably afford it—and if you are going to get trust from people, you have to explain things in simple principles. My challenge to those behind the Minister is: can they draft a Clause 1 of the Bill to say, “The principles that underpin the Bill are as follows, and the courts are to interpret it in accordance with those principles”? That is my challenge—a challenge, as the noble Baroness, Lady Kidron, points out, to be ambitious and not to sit in a tepid bath. I beg leave to withdraw the amendment.

Amendment 87 withdrawn.
Amendments 88 and 89 not moved.
Clause 73 agreed.
Clause 74: Processing of special categories of personal data
Amendment 90 not moved.
Clause 74 agreed.
Clause 75: Fees and reasons for responses to data subjects’ requests about law enforcement processing
Amendment 91 not moved.
Clause 75 agreed.
Clause 76 agreed.
Clause 77: Information to be provided to data subjects
Amendment 92
Moved by
92: Clause 77, page 91, line 5, leave out “the number of data subjects,”
Member’s explanatory statement
This amendment reduces the likelihood of misuse of Clause 77 by AI model developers, who may otherwise seek to claim they do not need to notify data subjects of reuse for scientific purposes under Clause 77 because of the way that personal data is typically collected and processed for AI development, for example by scraping large amounts of personal data from the internet.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, I have tabled Amendments 92, 93, 101 and 105, and I thank the noble Lord, Lord Clement-Jones for adding his name to them. I also support Amendment 137 in the name of my noble friend Lady Kidron.

Clause 77 grants an exemption to the Article 13 and 14 rights of data subjects to be told within a set timeframe that their data will be reused for scientific research, if it would be impossible or involve disproportionate effort to do so. These amendments complement those I proposed to Clause 67. They aim to ensure that “scientific research” is limited in its definition, so that large language AI developers cannot claim both that they are doing scientific research and that the GDPR requirements involve too much effort for them to contact data subjects before reusing their data.

It costs AI developers time and money to identify data subjects, so this exemption is obviously very valuable to them and they will use it if possible. They will claim that processing and notifying data subjects from such a huge collection of data is a disproportionate effort, as it is hard to extract the identity of data subjects from the original AI model.

The data of up to 5 million subjects could be involved in the reuse of data to train a large language model. However, the ICO requires data controllers to inform subjects that their data could be reused, even if that involves contacting 5 million data subjects. The criteria set out in proposed new subsection (6) in Clause 77 play straight into the hands of ruthless AI companies that want to take advantage of this exemption.

Amendments 92 and 101 would ensure that the disproportionate effort excuse is not used if the number of data subjects is mentioned as a reason for deploying the excuse. Amendments 93 and 105 would clarify the practices and facts that would not qualify for the disproportionate effort exemption—namely,

“the fact the personal data was not collected from the data subject, or any processing undertaken by the controller that makes the effort involved greater”.

Without this wording, the Bill will mean that the data controller, when wanting to reuse data for training another large language model, could process the personal data on the original model and then reuse it without asking permission from the original subjects. The AI developer could say, “I don’t have the original details of the data subject, as they were deleted when the original model was trained. There was no identification of the original data subjects; only the data weights”. I fear that many companies will use this excuse to get around GDPR notification expectations.

Noble Lords should recognise that these provisions affect only AI developers seeking to reuse data under the scientific research provisions. These will mainly be the very large AI developers, which tend to use scraped data to train their general purpose models. Controllers will still be able to use personal data to train AI systems when they have lawful grounds to do so—they either have the consent of the data subject or there is a legitimate interest—but I want to make it clear that these provisions will not inhibit the legitimate training of AI models.

These amendments would ensure that organisations, especially large language AI developers, are not able to reuse data at scale, in contradiction to the expectations and intentions of data subjects. Failure to get this right will risk setting off a public backlash against the use of personal data for AI use, which would impede this Government’s aims of making this country an AI superpower. I beg to move.

16:15
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, in speaking to Amendment 137 in my name I thank the noble Baroness, Lady Harding, the noble Lord, Lord Stevenson, and my noble friend Lord Russell for their support. I also add my enthusiastic support to the amendments in the name of my noble friend Lord Colville.

This is the same amendment that I laid to the DPDI Bill, which at the time had the support of the Labour Party. I will not labour that point, but it is consistently disappointing that these things have gone into the “too difficult” box.

Amendment 137 would introduce a code of practice on children and AI. AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, the people they follow and the products they buy—and, as reported last weekend, AI is even helping farmers pick the ripest tomatoes for baked beans. But it no longer concerns simply the elective parts of life where, arguably, a child or a parent on their behalf can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors, the first of which is compulsory for children and the second of which is necessary for all of us.

The amendment has three parts. The first requires the ICO to create a code and sets out the expectations of its scope; the second considers who and what should be consulted and considered, including experts, children, and the frameworks that codify children’s existing rights; and the third part defines elements of the process, including risk assessment definitions, and sets out the principles to which the code must adhere.

When we debated this before, I anticipated that the Minister would say that the ICO had already published guidance, that we do not want to exclude children from the benefits of AI, and that we must not get in the way of innovation. Given that the new Government have taken so many cues from the previous one, I am afraid I anticipate a similar response.

I first point out, therefore, that the ICO’s non-binding guidance on AI and data protection is insufficient. It has only a single mention of a child in its 140 pages, which is a case study about child benefits. In the hundreds of pages of guidance, toolkits and sector information, nowhere are the specific needs and rights, or development vulnerabilities, of children comprehensively addressed in relation to AI. This absence of children is also mirrored in government publications on AI. Of course, we all want children to enjoy the benefits of AI, but consideration of their needs would increase the likelihood of those benefits. Moreover, it seems reckless and unprincipled not to protect them from known harms. Surely the last three decades of tech development have shown us that the experiment of a “build first, worry about the kids later—or never” approach has cost our children dearly.

Innovation is welcome but not all innovation is equal. We have bots offering 13 year-olds advice on how to seduce grown men, or encouraging them to take their own lives, edtech products that profile children to unfair and biased outcomes that limit their education and life chances, and we have gen AI that perpetuates negative, racist, misogynist and homophobic stereotypes. Earlier this month, the Guardian reported a deep bias in the AI used by the Department for Work and Pensions. This “hurt first, fix later” approach creates a lack of trust, increases unfairness, and has real-world consequences. Is it too much to insist that we ask better questions of systems that may result in children going hungry?

Why children? I am saddened that I must explain this, but from our deeply upsetting debate last week on the child protection amendments, in which the Government asserted that children are already catered for while deliberately downgrading their protections, it seems that the Government or their advisers have forgotten.

Children are different for three reasons. First, as has been established over decades, children are on a development journey. There are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony and learn different social skills. There are equally ages and stages at which they cannot do those things. The long-established consensus is that families, social groups and society more broadly, including government, step in to support them on this journey. Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces they inhabit have to be designed to be fit for childhood. Thirdly, we have a responsibility towards children that extends even beyond our responsibility to each other. This means that we cannot legitimatise profit at their expense. Allowing systems to play in the wild in the name of growth and innovation, leaving kids to pay the price, is a low bar.

It is worth noting that since we debated it, a proposal for this AI code for children that follows the full life cycle of development, deployment, use and retirement of AI systems has been drafted and has the support of multiple expert organisations and individuals around the globe. I am sure that all nations and intergovernmental organisations will have additional inputs and requirements, but it is worth saying that the proposed code, which was written with input from academics, computer scientists, lawyers, engineers and children’s rights activists, is mindful of and compatible with the EU AI Act, the White House Blueprint for an AI Bill of Rights, the Executive Order on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence, the Council of Europe’s Framework Convention on Artificial Intelligence and, of course, the UNCRC general comment no. 25.

This proposal will be launched early next year as an indication of what could and should be done. Unless the Government find their compass vis-à-vis children and tech, I suspect that another jurisdiction will adopt it ahead of the UK, making that the go-to destination for trusted tech development for child-safe products. It is perhaps worth reminding the Committee that one in three connected people is under 18, which is roughly 1 billion children. As the demographics change, the proportion and number of children will rise. It is a huge financial market.

Before I sit down, I shall briefly talk about the AADC because sometimes Ministers say that we already have a children’s code. The age-appropriate design code covers only ISS, which automatically limits it, and even the ICO by now agrees that its enforcement record is neither extensive nor impressive. It does not clearly cover the urgent area of edtech, which is the subject of another amendment, and, most pertinently to this amendment, it addresses AI profiling only, which means that it is limited in how it can look at the new and emerging challenges of generative AI. A revamp of the AADC to tackle the barriers of enforcement, account for technological advances, cover all products and services likely to be accessed by children and make our data regime AI-sensitive would be welcome, but rather than calling for a strengthening of the AADC, the ICO agreed to the downgrading of children’s data protection in the DPDI Bill and, again, has agreed to the downgrading of protections in the current Bill on ADM, scientific research, onward processing and so on. A stand-alone code for AI development is required because in this way we could be sure that children are in the minds of developers at the outset.

It is disappointing that the UK is failing to claim its place as the centre of regulated and trusted innovation. Although we are promised an AI Bill, the Government repeatedly talk of large frontier companies. AI is in every part of a child’s life from the news they read to the prices they pay for travel and goods. It is clear from previous groups that many colleagues feel that a data Bill with no AI provisions is dangerous commercially and for the communities of the UK. An AI Bill with no consideration of the daily impact on children may be a very poor next choice. Will the Minister say why a Labour Government are willing to abandon children to technology rather than building technology that anticipates children’s rights and needs?

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to follow my friend the noble Baroness, Lady Kidron, and to give full-throated support to my friend the noble Viscount, Lord Colville, on all his amendments. Given that the noble Baroness mentioned it and that another week has passed since we asked the Minister the question, will we see an AI Bill or a consultation before Santa comes or at some stage in the new year? I support all the amendments in this group and in doing so, as it is the first time I have spoken today in Committee, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business.

I will speak particularly to my Amendment 211A. I have put down “image, likeness and personality” not because I believe that they stand as the most important rights that are being transgressed or that they are the most important rights which we should consider; I have put them down to give a specific focus on them because, right now, they are being largely cut across and ignored, so that all of our creatives find their works, and also their image, likeness and personality, disappearing into these large foundation AI models with no potential for redress.

Once parts of you such as your name, face or voice have been ingested, as the noble Lord, Lord Clement-Jones, said in the previous group, it is difficult then to have them extracted from the model. There is no sense, for example, of seeking an equitable remedy to put one back in the situation had the breach not occurred. It is almost “once in, forever in”, then works start to be created based on those factors, features and likenesses, which compete directly with the creatives. This is already particularly prevalent in the music industry.

What plans do the Government have in terms of personality rights, image and likeness? Are they content with the current situation where there is no protection for our great creatives, not least in the music industry? What does the Bill do for our creatives? I go back to the point made by the noble Baroness, Lady Kidron. How can we have all these debates on a data Bill which is silent when it comes to AI, and a product regulation Bill where AI is specifically excluded, and yet have no AI Bill on the near horizon—unless the Minister can give us some up-to-date information this afternoon? I look forward to hearing from her.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, I should first apologise for not being able to attend Second Reading or, arguably more importantly, to be in Committee last week to support the many amendments of the noble Baroness, Lady Kidron, on child protection. I read Hansard carefully and was deeply depressed to see that we were once again needing to rehearse, as she has done again today, the importance of protecting children in the digital era. It seems to be our lot that there is a group of us who keep coming back. We play the merry-go-round and sit in different places; it is a privilege to sit next to the noble Baroness, Lady Kidron, for the first time in the decade that I have been in the House. I support her Amendment 137. She has given a good exposé as to why we should think really carefully about how we protect children in this AI world. I would just like to add one point about AI itself.

We keep being told—in a good way—that AI is an underlying and general-purpose technology. That means we need properly to establish the principles by which we should protect children within it. We know that technology is morally neutral; it is the human beings who do the damage. In every other underlying, breakthrough technology, we have learned that we have needed to protect the most vulnerable, whether it was electricity when it first went into factories, toys when they were first distributed on the mass market, or social media, with the age-appropriate design code. I feel that it would be a huge mistake, on the third Bill where many of us have debated this subject matter, for us not to address the fact that, as of today, this is the biggest breakthrough technology of our lifetime. We should recognise that children will need protecting, as well as having the opportunity to benefit from it.

16:30
Lord Kirkhope of Harrogate Portrait Lord Kirkhope of Harrogate (Con)
- Hansard - - - Excerpts

My Lords, I was not going to rise at all for the moment because there are other amendments coming later that are of interest. I declare my rather unusual interest: I was one of the architects of the GDPR in Brussels.

I rise to support Amendment 211A in the name of my noble friend Lord Holmes because here we are referring to AI. I know that other remarks have now been passed on this matter, which we will come to later, but it seems to me—this has come straight into my mind—that, when the preparation of the data legislation and the GDPR was being undertaken, we really did fail at that stage to accommodate the vast and important areas that AI brings to the party, as it were. We will fail again, I suspect, if we are not careful, in this piece of legislation. AI is with us now and moving at an enormous pace—faster than any legislator can ever manage to keep up with in order to control it and to make sure that there are sufficient protections in place against both the misuse of this technology and the ways in which it may develop. So I support this amendment, particularly in relation to the trading or use of likenesses and the algorithmic effects that come about.

We will deal with that matter later, but I hope that the Minister will touch on this, particularly having heard the remarks of my noble friend Lord Holmes—and, indeed, the remarks of my noble friend Lady Harding a moment ago—because AI is missing. It was missing in the GDPR to a large extent. It is in the European Union’s new approach and its regulations on AI, but the EU has already shown that it has enormous difficulties in trying to offer, at one stage, control as well as redress and the proper involvement of human beings and individual citizens.

Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- Hansard - - - Excerpts

My Lords, I rise briefly to support my noble friend Lady Kidron on Amendment 137. The final comments from the noble and learned Lord, Lord Thomas, in our debate on the previous group were very apposite. We are dealing with a rapidly evolving and complex landscape, which AI is driving at warp speed. Given the panoply of different responsibilities and the level of detail that the different regulators are being asked to cover, it seems absolutely fundamental that there should be absolute clarity, on the face of what they are required to do with regard to children, in terms of a code of practice, a code of conduct, a description of the types of outcomes that will be acceptable and a description of the types of outcomes that will be not only unacceptable but illegal. The clearer that is in the Bill, the more it will do to future-proof the direction in which regulators will have to travel. If we are clear about what the outcomes need to be in terms of the welfare, well-being and mental health of children, that will give us some guidelines to work within as the world evolves so quickly.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, I have co-signed Amendment 137. I do not need to repeat the arguments that have already been made by those who have spoken before me on it; they were well made, as usual. Again, it seems to expose a gap in where the Government are coming from in this area of activity, which should be at the forefront of all that they do but does not appear to be so.

As has just been said, this may be as simple as putting in an initial clause right up at the front of the Bill. Of course, that reminds me of the battle royal we had with the then Online Safety Bill in trying to get up front anything that made more sense of the Bill. It was another beast that was difficult to ingest, let alone understand, when we came to make amendments and bring forward discussions about it.

My frustration is that we are again talking about stuff that should have been well inside the thinking of those responsible for drafting the Bill. I do not understand why a lot of what has been said today has not already appeared in the planning for the Bill, and I do not think we will get very far by sending amendments back and forward that say the same thing again and again: we will only get the response that this is all dealt with and we should not be so trivial about it. Could we please have a meeting where we get around the table and try and hammer out exactly what it is that we see as deficient in the Bill, to set out very clearly for Ministers where we have red lines—that will make it very easy for them to understand whether they are going to meet them or not—and do it quickly?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.

I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation on intellectual property at any moment, and there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.

In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.

We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.

This group, which looks ahead to further harms that could be caused by AI and to how we can mitigate them in a number of different ways, is extremely important, despite the fact that these amendments appear to deal with quite a disparate set of issues.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.

I have two observations before I start. First, we have to acknowledge that perhaps this area is among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead on AI legislation. I have an amendment I am very excited about later, when we come particularly to ADM, and there will be others as well, but I absolutely take on board that we need to get going on that.

Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may seek to claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendments 101 and 105 likewise seek to address the potential misuse of Clause 77 by developers. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provisions for the rights and protections of data subjects, and look forward very much to hearing the views of the Minister.

I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.

Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of

“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”

without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.

16:45
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for their amendments and consideration of this policy area. I hope noble Lords will bear with me if I save some of the points I shall make on web crawling and intellectual property for the later group, which is specifically on that topic.

Amendments 92 and 93 from the noble Viscount are about the new disproportionate effort exemption in Article 13. I can reassure noble Lords that this exemption applies only when data is collected directly from the data subject, so it cannot be used for web crawling, which is, if you like, a secondary activity. I think that answers that concern.

Amendments 101 and 105, also from the noble Viscount, are about the changes to the existing exemption in Article 14, where data is collected from other sources. Noble Lords debated this issue in the previous group, where Amendments 97 and 99 sought to remove this exemption. The reassurances I provided to noble Lords in that debate about the proportionality test being a case-by-case exercise also apply here. Disproportionate effort cannot be used as an excuse; developers must consider the rights of the data subject on each occasion.

I also draw noble Lords’ attention to another quote from the ICO itself, made when publishing its recent outcome reports. I know I have already said that I will share more information on this. It says:

“Generative AI developers, it’s time to tell people how you’re using their information”.

The ICO is on the case on this issue, and is pursuing it.

On Amendment 137 from the noble Baronesses, Lady Kidron and Lady Harding, and other noble Lords, I fully recognise the importance of organisations receiving clear guidance from regulators, especially on complex and technical issues. AI is one such issue. I know that noble Lords are particularly conscious of how it might affect children, and I am hearing the messages about that today.

As the noble Baroness will know, the Secretary of State already has the power to request statutory codes such as this from the regulator. The existing power will allow us to ensure the correct scope of any future codes, working closely with the ICO and stakeholders, including noble Lords here today, and I am happy to meet them to discuss this further. The Government are, naturally, open to evidence about whether new statutory codes should be provided for by regulations in future. Although I appreciate the signal this can send, at the moment I do not believe that a requirement for codes on this issue is needed in this legislation. I hope noble Lords are reassured that the Government are taking this issue seriously.

Amendment 211A from the noble Lord, Lord Holmes, is about prohibiting the processing of people’s names, facial images, voices or any physical characteristics for AI training without their consent. Facial images and other physical characteristics that can be used to identify a person are already protected by the data protection legislation. An AI developer processing such data would have to identify a lawful ground for this. Consent is not the only option available, but I can reassure the noble Lord that there are firm safeguards in place for all the lawful grounds. These include, among many other things, making sure that the processing is fair and transparent. Noble Lords will know that even more stringent conditions apply to special categories of data, such as data revealing race or sexual orientation and any biometric data that can be used to identify someone.

Noble Lords tried to tempt me once again on the timetable for the AI legislation. I said as much as I could on that when we debated this in the last session, so I cannot add any more at this stage.

I hope that reassures noble Lords that the Bill has strong protections in place to ensure responsible data use and reuse, and, as such, that they feel content not to press their amendments.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I understand the point that the Secretary of State has the power, but does he have the intention? We are seeking an instruction to the ICO to do exactly this thing. An indication of the Secretary of State’s intention to activate such a code would be an excellent compromise all round, and seeing that in the Bill is the point here.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Discussions with the ICO are taking place at the moment about the scope and intention of a number of issues around AI, and this issue would be included in that. However, I cannot say at the moment that that intention is specifically spelled out in the way that the noble Baroness is asking.

Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

This has been a wide-ranging debate, with important contributions from across the Committee. I take some comfort from the Minister’s declaration that the exemptions will not be used for web crawling, but I want to make sure that they are not used at the expense of the privacy and control of personal data belonging to the people of Britain.

That seems particularly important for Amendment 137 in the name of the noble Baroness, Lady Kidron. I was particularly taken by her pointing out that children’s data privacy had not been taken into account when it came to AI, reinforced by the noble Baroness, Lady Harding, telling us about the importance of the Bill. She said it was paramount to protect children in the digital age and reminded us that this is the biggest breakthrough of our lifetime and that children need protecting from it. I hope very much that there will be some successful meetings, and maybe a government amendment on Report, responding to these passionate and heartfelt demands. I sincerely hope the Minister will meet us all and other noble Lords to discuss these matters of data privacy further. On that basis, I beg leave to withdraw my amendment.

Amendment 92 withdrawn.
Amendments 93 and 94 not moved.
Amendment 95
Moved by
95: Clause 77, page 91, line 16, leave out “to the extent that” and insert “when any one or more of the following is true”
Member’s explanatory statement
This amendment would clarify that only one condition under paragraph 5 must be present for paragraphs 1 to 4 to not apply.
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, I was in such a hurry to apologise just now for missing Second Reading that I forgot to declare my interests and remind the Committee of my technology and, with regard to this group, charitable interests as set out in the register.

I shall speak to Amendments 95, 96, 98, 100, 102 and 104 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson of Balmacara, and my noble friend Lord Black of Brentwood, and Amendments 103 and 106 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson. I also support Amendment 162 in the name of the noble Lord, Lord Clement-Jones. I will speak only to the marketing amendments in my name and leave the noble Lord, Lord Clement-Jones, to do, I am sure, great justice to the charitable soft opt-in.

These amendments are nothing like as philosophical and emotive as the last amendment on children and AI. They aim to address a practical issue that we debated in the late spring on the Data Protection and Digital Information Bill. I will not rehearse the arguments that we made, not least because the Minister was the co-signatory of those amendments, so I know she is well versed in them.

Instead, I shall update the Committee on what has happened since then and draw noble Lords’ attention to a couple of the issues that are very real and present now. It is strange that all Governments seem reluctant to restrict the new technology companies’ use of our data but extremely keen to get into the micro detail of restricting older forms of data use that we have all got quite used to.

That is very much the case for the open electoral register. Some 63% of people opt out of being marketed to, by indicating as much on the electoral register. This is a well known and well understood use of personal data. Yet, because of the tribunal ruling, it is increasingly the case that companies cannot use the open electoral register to target the 37% of people who have said that they are quite happy to receive marketing unless the company lets every single one of those users know that it is about to market to them. The danger is that we create a new cookie problem—a physical cookie problem—where, if you want to use a data source that has been commonplace for 40 years, you have to send some marketing to tell people that you are about to use it. That of course means that you will not do so, which means that you reduce the data available to a lot of small and medium-sized businesses to market their products and hand the advantage straight to the very big tech companies, which are really happy to scrape our data all over the place.

This is a strange one, where I find myself arguing that we should just allow something that is not broken not to need to be fixed. I appreciate that the Minister will probably tell us that the wording in these amendments is not appropriate. As I said earlier in the year—in April, in the previous incarnation—I very much hope that if the wording is incorrect we could, between Committee and Report, have a discussion and agree on some wording that achieves what seems just practical common sense.

The tribunal ruling that created this problem recognised that it was causing a problem. It stated that it accepted that the loophole it created would allow one company, Experian, a sizeable competitive advantage. It is a slightly perverse one: it means that it has to let only 5 million people know that it might be about to use the open electoral register, while its competitors have to let 22 million people know. That just does not pass the common-sense test of practical use of data. Given the prior support that the Minister has shown for this issue, I very much hope that we can resolve it between Committee and Report. I beg to move.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I have a couple of amendments in this group, Amendments 158 and 161. Amendment 158 is largely self-evident; it tries to make sure that, where there is a legal requirement to communicate, that communication is not obstructed by the Bill. I would say much the same of Amendment 161: again, it is obvious that there ought to be easy communication where a person’s pension is concerned, and the Bill should not obstruct it. I am not saying that these are the only ways to achieve these things, but they should be achieved.

I declare an interest on Amendment 160, in that I control the website of the Good Schools Guide, which has advertising on it. The function of advertising on the web is to enable people to see things for free; it is why the web has not closed down to subscription-only services. If people put advertisements on the web, they want to know that the advertisements are effective and have been seen, and to have some information about who has seen them. I moved a similar amendment to the previous Government’s Bill and encountered some difficulty. If the Government are of the same mind—that this requires us to be careful—I would very much welcome the opportunity of a meeting between now and Report, and I imagine others would too, to try to understand how best to make sure that advertising can flourish on the internet.

17:00
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.

The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the current interpretation by the tribunal will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.

I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.

Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite to Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being accessed from or stored on their device. Some of these sums can be extortionate and exorbitant, so I was rather surprised by the noble Viscount’s counter amendment.

Earlier this year, the Information Commissioner launched a call for views which looked to obtain a range of views on its regulatory approach to consent or pay models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.

Cookie paywalls are a scam and reduce people’s power to control their data. I wonder why someone must pay if they do not consent to cookies being stored or accessed. The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111 and is supplemented by new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. The regulation, as substituted by Clause 111 and Schedule 12, does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently are and as they will be if the Bill remains as drafted. The regulation is drafted in terms that do not prevent a person signifying lack of consent to cookies, but a provider may add or set controls—namely, by imposing requirements—for how a person may signify that lack of consent. Cookie paywalls would therefore be completely legal, and they certainly have proliferated online.

This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.

Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations inserted by Schedule 12 if it would support measurement or verification of the performance of advertising services to allow website owners to charge for their advertising services more accurately. The Bill provides practical amendments to the PEC regulations through listing the types of cookies that no longer require consent.

This is important, as not all cookies should be treated the same and not all carry the same high-level risks to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, increasing further legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content, to be able to price advertising space for advertisers. Such metrics are crucial to assess the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—that is, confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information for invoicing an advertiser accurately for the number of ad impressions in a digital ad campaign.

However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?

Coming to Amendment 162 relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate to donors in the same way that businesses have been able to communicate to customers since 2003. The clause will help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter that was sent to Secretary of State Peter Kyle on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:

“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,

based on analysis of 13.1 million donors by the Salocin Group. The letter continues:

“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.

I hope that the Government will listen to the DMA and the charities involved.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank noble Lords for their comments and contributions. I shall jump to Amendments 159A and 159B, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I can have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.

Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.

Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.

Amendment 104 certainly seems a reasonable addition to the list of what might constitute “unreasonable effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.

On Amendment 102, again, when it comes to providing information to them,

“the damage and distress to the data subjects”

is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we have covered a range of issues in our debate on this grouping; nevertheless, I will try to address each of them in turn. I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, for their Amendments 95, 96, 98, 100, 102 to 104 and 106 regarding notification requirements.

First, with regard to the amendments in the name of the noble Baroness, Lady Harding, I say that although the Government support the use of public data sources, transparency is a key data protection principle. We do not agree that such use of personal data should remove or undermine the transparency requirements. The ICO considers that the use and sale of open electoral register data alone is likely not to require notification. However, when the data is combined with data from other sources, in order to build an extensive profile to be sold on for direct marketing, notification may be proportionate since the processing may go beyond the individual’s reasonable expectations. When individuals are not notified about processing, it makes it harder for them to exercise their data subject rights, such as the right to object.

Adding other factors to the list of what constitutes a “disproportionate effort” for notification is unnecessary given that the list is already non-exhaustive. The “disproportionate effort” exemption must be applied according to the safeguards of the wider data protection framework. According to the fairness principle, controllers should already account for whether the processing meets the reasonable expectations of a data subject. The data minimisation and purpose limitation principles also act as an important consideration for data controllers. Controllers should continue to assess on a case-by-case basis whether they meet the threshold for the existing exemptions to notify; if not, they should notify. I hope that this helps clarify our position on that.

17:15
Amendment 158 from the noble Lord, Lord Lucas, seeks to amend the definition of “direct marketing” to make it clear that it excludes communications necessary to avoid harm or improve consumer outcomes when complying with law or regulatory standards. I understand the sentiment behind the amendment, but financial services firms can already send regulatory communications to their customers without permission, provided that these messages are neutral in tone, factual and do not include promotional content. While such messages can support customers in making informed decisions about their financial investments, they would not be classed as advertising or marketing material. As such, they would not engage the direct marketing rules within the Privacy and Electronic Communications Regulations. I refer the noble Lord to paragraph 803 of the Explanatory Notes to the Bill, where we have taken steps to clarify that position.
Amendment 159A from the noble Viscount, Lord Camrose, is aimed at enabling cookie paywalls; conversely, as we have identified, Amendment 159 from the noble Lord, Lord Clement-Jones, seeks to ban their use. Generally, these paywalls work by giving web users the option to pay for a cookie-free browsing experience. Earlier this year, the Information Commissioner launched a call for views on “consent or pay” models for cookies, with the aim of providing the online advertising industry with clarity on how advertising cookies and paywalls can be used in compliance with data protection and privacy laws. We will consider the Information Commissioner’s findings when he publishes his response to that call for views. It would be premature to make legal changes without considering the findings or consulting interested parties. I hope noble Lords will bear that in mind.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

When does the Minister anticipate that the ICO will produce that report?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I do not have the detail of all that. Obviously, the call for views has only recently gone out and he will need time for consideration of the responses. I hope the noble Lord will accept that the ICO is on the case on this matter. If we can provide more information, we will.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

May I ask the Minister a hypothetical question? If the ICO believes that these are not desirable, what instruments are there for changing the law? Can the ICO, under its own steam, so to speak, ban them; do we need to do it in primary legislation; or can it be done in secondary legislation? If the Minister cannot answer now, perhaps she can write to me.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Of course I will write to the noble Lord. It will be within the ICO’s normal powers to make changes where he finds that they are necessary.

I move to Amendment 160, tabled by the noble Lord, Lord Lucas, which seeks to create a new exemption for advertising performance cookies. There is a balance to strike between driving growth in the advertising, news and publishing sectors and ensuring that people retain choice and control over how their data is used. To exempt advertising measurement cookies, we would need to assess how intrusive these cookies are, including what they track and where data is sent. We have taken a delegated power so that exemptions to the prohibition can be added in future once evidence supports it, and we can devise appropriate safeguards to minimise privacy risks. In the meantime, we have been actively engaging with the advertising and publishing sectors on this issue and will continue to work with them to consider the potential use of the regulation-making power. I hope that the noble Lord will accept that this is work in progress.

Amendment 161, also from the noble Lord, Lord Lucas, aims to extend the soft opt-in rule under the privacy and electronic communications regulations to providers of auto-enrolment pension schemes. The soft opt-in rule removes the need for some commercial organisations to seek consent for direct marketing messages where there is an existing relationship between the organisation and the customer, provided the recipient did not object to receiving direct marketing messages when their contact details were collected.

The Government recognise that people auto-enrolled by their employers in workplace pension schemes may not have an existing relationship with their pension provider, so I understand the noble Lord’s motivations for this amendment. However, pension providers have opportunities to ask people to express their direct mail preferences, such as when the customer logs on to their account online. We are taking steps to improve the support available for pension holders through the joint Government and FCA advice guidance boundary review. The FCA will be seeking feedback on any interactions of proposals with direct marketing rules through that consultation process. Again, I hope the noble Lord will accept that this issue is under active consideration.

Amendment 162, tabled by the noble Lord, Lord Clement-Jones, would create an equivalent provision to the soft opt-in but for charities. It would enable a person to send electronic marketing without permission to people who have previously expressed an interest in their charitable objectives. As the noble Lord will recall, and has reminded us, the DPDI Bill included a provision similar to his amendment. The Government removed it from that Bill due to concerns that it would increase direct marketing from political parties. I think we all accepted at the time that we did not want that to happen.

As the noble Lord said, his amendment is narrower because it focuses on communications for charitable purposes, but it could still increase the number of messages received by people who have previously expressed an interest in the work of charities. We are listening carefully to arguments for change in this area and will consider the points he raises, but I ask that he withdraws his amendment while we consider its potential impact further. We are happy to have further discussions on that.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

I apologise to the Minister for intervening on her when I have not spoken earlier in this debate, but I was reassured by what she just said on Amendment 162. Remarks made by other noble Lords in this debate suggest both that members of the public might not object to charities having the same access rights as businesses and that the public do not necessarily draw a distinction between businesses and charities. As a former chairman of the Charity Commission, I can say that that is not what is generally found. People have an expectation of charities that differs from what they would expect by way of marketing from businesses. In considering this amendment, therefore, I urge the Minister to think carefully before deciding what action the Government should take.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I thank the noble Baroness very much for that very helpful intervention. If she has any more information about the view of the Charity Commission, we would obviously like to engage with that because we need to get this right. We want to make sure that individuals welcome and appreciate the information given to them, rather than it being something that could have a negative impact.

I think I have covered all the issues. I hope those explanations have been of some reassurance to noble Lords and that, as such, they are content not to press their amendments.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

May I just follow up by asking one quick question? I may be clutching at straws here but, in responding to the amendments in my name, she stated what the ICO believes rather than what the Government believe. She also said that the ICO may think that further permission is required to ensure transparency. I understand from the Data & Marketing Association that users of this data have four different ways of ensuring transparency. Would the Minister agree to a follow-up meeting to see whether there is a meeting of minds with what the Government think, rather than the ICO?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I am very happy to talk to the noble Baroness about this issue. She asked what the Government’s view is; we are listening very carefully to the Information Commissioner and the advice that he is putting together on this issue.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I am very grateful for the answers the noble Baroness gave to my amendments. I will study carefully what she said in Hansard, and if I have anything further to ask, I will write to her.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, in response—and very briefly, given the technical nature of all these amendments—I think that we should just note that there are a number of different issues in this group, all of which I think noble Lords in this debate will want to follow up. I thank the many noble Lords who have contributed both this time round and in the previous iterations, and ask that we follow up on each of the different issues, probably separately rather than in one group, as we will get ourselves quite tangled in the web of data if we are not careful. With that, I beg leave to withdraw the amendment.

Amendment 95 withdrawn.
Amendments 96 to 106 not moved.
Clause 77 agreed.
Clause 78 agreed.
Amendment 107 not moved.
Clause 79: Data subjects’ rights to information: legal professional privilege exemption
Amendment 108
Moved by
108: Clause 79, page 93, line 18, leave out “court” and insert “tribunal”
Member’s explanatory statement
This amendment is consequential on the new Clause (Transfer of jurisdiction of courts to tribunals).
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, in moving Amendment 108, I will also speak to all the other amendments in this group. They are all designed to transfer all existing provisions from the courts to the tribunals and simplify the enforcement of data rights. Is that not something to be desired? This is not just a procedural change but a necessary reform to ensure that the rights granted on paper translate into enforceable rights in reality.

The motivation for these amendments stems from recurring issues highlighted in cases such as Killock and Veale v the Information Commissioner, and Delo v the Information Commissioner. These cases revealed a troubling scenario where the commissioner presented contradictory positions across different levels of the judiciary, exacerbating the confusion and undermining the credibility of the regulatory framework governing data protection. In these cases, the courts have consistently pointed out the confusing division of jurisdiction between different courts and tribunals, which not only complicates the legal process but wastes considerable public resources. As it stands, individuals often face the daunting task of determining the correct legal venue for their claims, a challenge that has proved insurmountable for many, leading to denied justice and unenforced rights.

By transferring all data protection provisions from the courts to more specialised tribunals, which are better equipped to handle such cases, and clarifying the right to appeal decisions made by the commissioner, these amendments seek to eliminate unnecessary legal barriers. Many individuals, often representing themselves and lacking legal expertise, face the daunting challenge of navigating complex legal landscapes, deterred by high legal costs and the intricate determination of appropriate venues for their claims. This shift will not only reduce the financial burden on individuals but enhance the efficiency and effectiveness of the judicial process concerning data protection. By simplifying the legal landscape, we can safeguard individual rights more effectively and foster a more trustworthy digital environment.

17:30
The proposed changes are a crucial step towards aligning our legal framework with the realities of modern data use and ensuring that everyone can genuinely protect their data rights. I previously introduced similar amendments during the debate on the now-defunct DPDI Bill. They addressed the persistent jurisdictional confusion embedded in the Data Protection Act 2018—a confusion that has significantly hindered individuals’ ability to enforce their data protection rights effectively.
Additionally, these amendments clarify the right to appeal decisions made by the commissioner, touching directly on the core issues raised in the Killock case. On any basis, given the insightful postscript by Mrs Justice Farbey in Killock, it is clear that a comprehensive review of the appeal mechanisms for rights under the DPA is long overdue. Such a review would streamline processes, conserve judicial resources and, most importantly, make it easier for individuals to enforce their data protection rights. I beg to move.
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, I rise briefly to support my friend, the noble Lord, Lord Clement-Jones, and his string of amendments. He made the case clearly: it is simply about access, the right to redress and a clear pathway to that redress, a more efficient process and clarity and consistency across this part of our data landscape. There is precious little point in having obscure remedies or rights—or even, in some cases, as we have discussed in our debates on previous groups, no right or obvious pathways to redress. I believe that this suite of amendments addresses that issue. Again, I full-throatedly support them.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I address the amendments tabled by the noble Lord, Lord Clement-Jones. These proposals aim to transfer jurisdiction from courts to tribunals; to establish a new right of appeal against decisions made by the Information Commissioner; and to grant the Lord Chancellor authority to implement tribunal procedure rules. I understand and recognise the noble Lord’s intent here, of course, but I have reservations about these amendments and urge caution in accepting them.

The suggestion to transfer jurisdiction from courts to tribunals raises substantial concerns. Courts have a long-standing authority and expertise in adjudicating complex legal matters, including data protection cases. By removing these disputes from the purview of the courts, the risk is that we undermine the depth and breadth of legal oversight required in such critical areas. Tribunals, while valuable for specialised and expedited decisions, may not provide the same level of rigorous legal analysis.

Cases such as those cited by the noble Lord, Lord Clement-Jones—Killock and another v the Information Commissioner and Delo v the Information Commissioner—demonstrate to me the intricate interplay between data protection, administrative discretion and broader legal principles. It is questionable whether tribunals, operating under less formal procedures, can consistently handle such complexities without diminishing the quality of justice. Further, I am not sure that the claim that this transfer will streamline the system and reduce burdens on the courts is fully persuasive. Shifting cases to tribunals does not eliminate complexity; it merely reallocates it, potentially at the expense of the detailed scrutiny that these cases demand.

I turn to the right of appeal against the commissioner’s decisions. Although the introduction of a right of appeal against these decisions may seem like a safeguard, it risks creating unnecessary layers of litigation. The ICO already operates within a robust framework of accountability, including judicial review for cases of legal error or improper exercise of discretion. Adding a formal right of appeal risks encouraging vexatious challenges, overwhelming the tribunal system and diverting resources from addressing genuine grievances.

I think we in my party understand the importance of regulatory accountability. However, creating additional mechanisms should not come at the expense of efficiency and proportionality. The existing legal remedies are designed to strike an appropriate balance, and further appeals risk creating a chilling effect on the ICO’s ability to act decisively in protecting data rights.

On tribunal procedure rules and centralised authority, the proposed amendment granting the Lord Chancellor authority to set tribunal procedure rules bypasses the Tribunal Procedure Committee, an independent body designed to ensure that procedural changes are developed with judicial oversight. This move raises concerns about the concentration of power and the erosion of established checks and balances. I am concerned that this is a case of expediency overriding the principles of good governance. While I acknowledge that consultation with the judiciary is included in the amendment, it is not a sufficient substitute for the independent deliberative processes currently in place. The amendment risks undermining the independence of our legal institutions and therefore I have concerns about it.

Overall, while these amendments are presented as technical fixes, and I certainly recognise the problem and the intent, they would have far-reaching consequences for our data protection framework. The vision of my party for governance is one that prioritises stability, legal certainty and the preservation of integrity. We must avoid reforms that, whatever their intent, introduce confusion or inefficiency or undermine public trust in our system. Data protection is, needless to say, a cornerstone of our modern economy and individual rights. As such, any changes to its governance must be approached with the utmost care.

Lord Vallance of Balham Portrait The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.

The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.

As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation for both tribunals and courts according to the nature of the proceedings in question. This is because certain types of cases are appropriate to fall under tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in conformity with its strict procedural and evidential rules. Indeed, in the Killock and Delo examples it was noted that there could be additional confusion in the ability to move between those two routes if jurisdiction went solely to one of the tribunals.

On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee is comprised of legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.

Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review in a way that would preserve the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well-balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for his response to my amendments and welcome him to the Dispatch Box and a whole world of pain on the Data (Use and Access) Bill, as he has, no doubt, noted already after just two hours’ worth of this Committee.

I found his response disappointing; I think both he and the noble Viscount, Lord Camrose, have misunderstood the nature of this situation. This is not a blend that is all beautifully logical depending on the nature of the case. It is an absolute mishmash where the ordinary litigant is faced with great confusion, quite often not knowing whether to go to the court or a tribunal, where the judges themselves have criticised the confusion and where there appears to be no appetite in government, for some reason, for a review of the jurisdictions.

I felt that the noble Viscount was probably reading from his previous ministerial brief. Perhaps he looked back at Hansard for what he said on the DPDI Bill. It certainly sounded like that. The idea that the courts are peerless in their legal interpretation and the poor old tribunals really just do not know what they are doing is wrong. They are expert tribunals, you can appear before them in person and there are no fees. It is far easier to access a tribunal than a court and certainly, as far as appeals are concerned, the idea that the ordinary punter is going to take judicial review proceedings, which seems to be the implication of staying with the current system on appeals if the merits of the ICO’s decisions are to be examined, seems quite breathtaking. I know from legal practice that JR is not cheap. Appearing before a tribunal and using that as an appeal mechanism would seem far preferable.

I will keep pressing this because it seems to me that, at the very least, the Government need to examine the situation and look at what the real objections are to the jurisdictional confusion and its impact on data subjects who wish to challenge decisions. In the meantime, I beg leave to withdraw the amendment.

Amendment 108 withdrawn.
Clause 79 agreed.
Amendments 109 and 109A not moved.
Clause 80: Automated decision-making
Amendment 110
Moved by
110: Clause 80, page 94, line 24, at end insert—
“3. To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
Member’s explanatory statement
This amendment would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I beg to move Amendment 110 and will speak to Amendments 112, 114, 120, 121, 122 and 123, and to Clause 80 stand part. As we have heard, artificial intelligence and algorithmic and automated decision-making tools are increasingly being used across the public sector to make and support many of the highest-impact decisions affecting individuals, families and communities across healthcare, welfare, education, policing, immigration and many other sensitive areas of an individual’s life.

The Committee will be pleased to hear that I will not repeat the contents of my speech on my Private Member’s Bill on this subject last Friday. But the fact remains that the rapid adoption of AI in the public sector presents significant risks and challenges, including: the potential for unfairness, discrimination and misuse, as demonstrated by scandals such as the UK’s Horizon and Australia’s Robodebt cases; automated decisions that are prone to serious error; lack of transparency and accountability in automated decision-making processes; privacy and data protection concerns; algorithmic bias; and the need for human oversight.

17:45
To counter this, on Friday the Government prayed in aid the algorithmic transparency standard and the GDPR, but it appears that they are intent on watering down the GDPR Article 22 provisions with this Bill. As I said then, as Governments continue to adopt AI technologies, it is crucial to balance the potential benefits with the need for responsible and ethical implementation to ensure fairness, transparency and public trust. By contrast, many of us putting forward amendments today are very much on the same page in wanting to improve those safeguards.
I hope that the Minister will recognise Amendments 110 and 112 as amendments that she tabled to the DPDI Bill. I am glad to see that they are now supported by the noble Lord, Lord Knight. Amendment 110 would make it clear that, in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person. Amendment 112, which would amend new Article 22B of the UK GDPR, aims to make it clear that data processing that contravenes any part of the Equality Act 2010 is prohibited.
Amendment 114 would expand the scope of Article 22 to predominantly automated decision-making in line with, while not as radical as, the recommendation formulated by the Information Commissioner’s Office in its response to the Data: A New Direction consultation in 2021. I support and have signed Amendment 119 in the name of the noble Viscount, Lord Colville, which would mandate the ATRS in government, as would my Amendment 121.
To address these challenges, several measures are also contained in Amendments 120 and 122. Amendment 120 would require public authorities to be responsible for completing an algorithmic impact assessment, the format of which would be prescribed in regulations, prior to the deployment of an algorithmic or automated decision-making system. Amendment 122, which is similar to Amendment 123A in the name of the noble Lord, Lord Holmes, would require public authorities to set up a comprehensive, publicly accessible register of all ADM systems used by public authorities, enabling scrutiny and providing transparency on the rationale and functionality of ADM systems, including information on human oversight. Individuals affected by decisions made by ADM systems would have the right to receive a meaningful and personalised explanation of how a decision was reached, including information about the decision-making process.
We very much support Amendment 123B in the name of the noble Lord, Lord Holmes. It would require employees involved in using ADM systems to have the capabilities to challenge system outputs, understand potential risks and enable oversight in line with OECD principles.
Clause 80 introduces a provision inherited from the previous DPDI Bill for the Secretary of State to use regulations to define what constitutes “meaningful human involvement” for the purposes of paragraph 1(a) of new Article 22A and whether a decision is or is not to be taken to have
“a similarly significant effect for the data subject”.
Both these terms are critical in defining the scope of Article 22 protections. They have also been the subject of significant uncertainty and debate, due to limited existing case law. What constitutes meaningful human involvement raises important questions about automation bias and about the opacity, competence and authority of the human involved. What constitutes a similarly significant effect engages important questions, for example, about how the law applies to decision processes with multiple stages.
The only mechanism for clarifying these terms in the Bill is the power vested in the Secretary of State to define them in the context of data protection and automated decision-making. These are not merely technical changes: they represent significant policy decisions that go to the heart of the Bill and therefore require sufficient parliamentary oversight. Amendment 123 would require the Secretary of State, in conjunction with the ICO, to develop guidance on the interpretation of the safeguards in Article 22C and on important terms, such as “similarly significant effect” and “meaningful human involvement”. As the dedicated regulator, the ICO is best placed and equipped to advise and ensure consistency of application. The required timeline for publishing the guidance is six months after Royal Assent.
I very much support Amendment 115 in the name of the noble Lord, Lord Lucas. It is notable that the noble Lords, Lord Knight and Lord Holmes, both highlighted the risk of AI in employment decisions at Second Reading. At the end of the day, however, Clause 80 and the changes to Article 22 will not wash: they remove important protections for automated decision-making and AI. That position is supported by a great number of civil society organisations, such as Big Brother Watch, the Ada Lovelace Institute, Connected by Data, Defend Digital Me, Liberty, the Open Rights Group, Privacy International, the Public Law Project and Worker Info Exchange.
Article 22 of the GDPR enshrines the right not to be subject to a decision based on solely automated processing that has legal or otherwise significant effects on the individual concerned. This has proven to be a highly effective right that protects individuals from harmful decisions and discrimination. However, Clause 80 of this Bill would deprive individuals of this important right in most circumstances and would exacerbate power imbalances by requiring individuals to scrutinise, contest and assert their rights against decisions taken by systems outside their control.
I have not even talked yet about the impact of Clauses 82 and 83. In the context of law enforcement processing, the potential for people’s rights and liberties to be infringed by automated processing is extremely serious, yet under this Bill ADM involving sensitive personal data could be used in UK policing. Further diluted safeguards apply under proposed new Section 50C(3), to be inserted by Clause 80(3), whereby, rather than explicitly requiring the data controller to notify an affected individual—as is currently the case under Section 50(2)(a) of the Data Protection Act 2018—they must merely create measures to provide information about the ADM and enable the subject to contest the decision.
There are no provisions for any course of action after such secret ADM decisions are made—not even if, for example, the human review finds that an automated decision was wrong. It is extremely concerning that any ADM about a person can take place without their right to know, but for it to be conducted by police in secret and in a way that detrimentally impacts their life is an affront to justice and likely to interfere with any number of an individual’s rights.
Clause 84 would amend Sections 96 and 97 of the Data Protection Act 2018 to change the definition of ADM in the context of intelligence services processing. I very much hope that the Government will reconsider. I hope that, if they will not listen to me, they will listen to what civil society organisations have to say. In their letter of 6 September to the Secretary of State, co-ordinated by the Open Rights Group, they said:
“We recognise that there are benefits to be gained from Artificial Intelligence … Yet there are concerns. Data can be biased. Models can be wrong. The potential for discrimination and for deepening inequalities is known and significant. Important machine decisions can be wrong and unjust, and frequently Artificial Intelligence providers are unwilling or unable to address shortcomings … We respectfully ask that these clauses be re-examined to ensure that people are not simply subjected to life changing decisions made solely by machines, and forced to prove their innocence when machines get it wrong. The government should extend AI accountability, rather than reduce it, at this critical moment”.
Finally, on Amendment 123C, research by the Institute for the Future of Work suggests that the utility and effectiveness of data protection impact assessments are limited by the absence of basic disclosure provisions and by strict limitations on their application to data subjects and data rights. In particular, significant social and economic impacts on workers, workplace and labour rights are likely to fall between protection in data and employment legislation as they stand. Areas of concern include hiring and access to work; pay and work allocation; impacts on the conditions and quality of work; monitoring and surveillance, including neuro and emotional surveillance; and discipline or termination of work. Research shows that AI and other data-driven technologies have already had significant impacts on the nature of work and jobs, on the conditions and quality of people’s work, and on the accessibility and enforceability of rights.
Amendment 123C adopts the language of the Institute for the Future of Work about automation archetypes. These have been developed as part of the Nuffield Foundation supported Pissarides Review into the Future of Work and Wellbeing, which will be published in January 2025, and they challenge our understanding and narratives about automation, its potential and the choices that we make now to shape our futures. It is not enough to rely on the enforcement of individual rights in discrete domains, after the event. Pre-emptive assessment of significant impacts and establishing a process for ongoing monitoring and intervention are necessary. This is in line with the Council of Europe’s framework convention on AI, signed in September and to which the UK is a signatory. The Council of Europe’s committee on AI has just officially adopted the HUDERIA human rights algorithmic impact assessment.
The amendment could lead to the introduction of measures to ensure private sector assessment and monitoring of impacts on work, people and fundamental rights, which would conform to the framework convention. If this does not achieve what the Government intend as regards the UK’s adoption of that framework convention, I very much hope that they can give us more information at this stage. I beg to move.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, Amendment 119 is in my name, and I thank the noble Lord, Lord Knight, for adding his name to it. I am pleased to add my name to Amendment 115A in the name of the noble Viscount, Lord Camrose.

Transparency is key to ensuring that the rollout of ADM brings the public and, most importantly, public trust with it. I give the Committee an example of how a lack of transparency can erode that trust. The DWP is using a machine learning model to analyse all applications for a loan, as an advance on a benefit to pay bills and other costs, while a recipient waits for their first universal credit payment. The DWP’s own analysis of the model concluded that for all of the protected characteristics that were analysed, including age, marital status and disability, it found disparities in who was most likely to be incorrectly referred by the model.

It is difficult to assess whether the model is discriminatory, effective or even lawful. When the DWP rolled it out, it was unable to reassure the Comptroller and Auditor General that its anti-fraud models treated all customer groups fairly. The rollout continues despite these concerns. The DWP maintains that the analysis does not present

“any immediate concerns of discrimination, unfair treatment or detrimental impact on customers”.

However, because so little information is available about the model, this claim cannot be independently verified to provide the public with confidence. Civil rights organisations, including the Public Law Project, are currently working on a potential claim against the DWP, including in relation to this model, on the basis that they consider it may be unlawful.

The Government’s commitment to rolling out ADM was accompanied by a statement in the other place in November by the AI Minister, Feryal Clark, that making the use of the ATRS mandatory was seen as a significant acceleration towards adopting the standard. In response to a Written Question, the Secretary of State confirmed that, as part of phase 1 of the rollout to the 16 largest ministerial departments plus HMRC, there was a deadline for them to publish their first ATRS records by the end of July 2024. Despite the Government’s statement, only eight ATRS reports have been published on the hub. The Public Law Project’s TAG project has discovered at least 74 areas in which ADM is being used, and those are only the ones that it has been able to uncover through freedom of information requests and tip-offs from affected people. There is clearly a shortfall in the implementation and rollout of the ATRS across government departments.

18:00
Amendment 119 does not demand that these standards should be put in the Bill but gives the Government the option to introduce regulations should the rollout of the ATRS continue to be so slow. The need for these measures to be required by the Bill is clear from the experience of other countries, such as Canada, where the standard was introduced but its implementation has been very slow. Canada introduced a non-statutory requirement for disclosing ADM models, which was then enforced by an internal government review. However, this process has not been effective: as of May 2024, only 21 ADM models had been disclosed, whereas more than 300 models are in use by the Canadian Government, according to the Starling Centre, a non-governmental research organisation.
This amendment would give flexibility in other ways. Subsection (2) recognises that the standard for ATRS creation and publication might well change and that, as its use becomes more common, the Government might want to tweak it. Likewise, subsection (3) allows flexibility in regulations about the manner of publication.
I recognise that transparency cannot apply to all collection of public data. For instance, nobody would want to influence a fraud inquiry by publishing data that might affect the outcome of that inquiry. However, I remind the Minister that when this was discussed on the Data Protection and Digital Information Bill, she tabled Amendment 74 in Committee, calling for the insertion of the mandatory use of the ATRS, almost exactly along the lines of my amendment. In the debate that followed, the noble Lord, Lord Bassam, said that putting the ATRS on a statutory footing would be,
“key to securing trust in what will be something of a revolution in how public services are delivered and procured in the future”.—[Official Report, 27/3/24; col. GC 214.]
Stephanie Peacock, then the Labour spokesman for the Bill in the other place, tabled a similar amendment in Committee in which she said,
“Relying on self-regulation in the early stages of the scheme is understandable, but having conducted successful pilots, from the Cabinet Office to West Midlands police, it is unclear why the Government now choose not to commit to the very standard they created”.—[Official Report, Commons, Data Protection and Digital Information (No. 2) Bill Committee, 23/5/23; col. 284.]
If the Minister’s party had those concerns then, why would they not be relevant now, especially as it is becoming clear that transparency is not accompanying the rollout of ADM across the public sector?
I have also added my name to Amendment 115A, which aims to delete the regulations in new Article 22D. As the noble Viscount, Lord Camrose, will no doubt explain, there is a danger of mission creep with the rollout of ADM. The key concern is that the Secretary of State could, through secondary legislation, water down what counts as meaningful human involvement. My fear is that this would allow decision-makers to bypass the need to comply with safeguards in new Articles 22A to 22C by having a nominal human in the loop, even if that human was not in a position to be an effective safeguard. This could be because they were not sufficiently competent; they were not allowed enough time properly to revise a decision; they were influenced by automation bias, in which human decision-makers are unduly influenced by the recommendations of the machine; or, because of the black box nature of the algorithm, they were not in a position to understand it.
There is a good example of the concern about mission creep in a recent case in the Netherlands. There was a successful challenge by Uber drivers against the firm’s “robo-firing” system, where drivers faced allegations of fraudulent activity determined by a machine and were dismissed without appeal. Although there was some human involvement in the process, the Dutch court found it was
“not … much more than a purely symbolic act”,
noting that Uber had failed to make clear
“what the qualifications and level of knowledge”
of the people involved were. It therefore concluded that there was not sufficient evidence of “meaningful human intervention”, so the system was caught by Article 22. The concern would be that, if the Secretary of State were to legislate to declare that this kind of human intervention was in fact sufficient, it would deny British people the protections that their European counterparts have under the EU GDPR. As for the definition of “similarly significant” in regulations, this too allows for slippage that would harm individuals.
As useful as ADM is for promoting efficient government, people are afraid of it. They do not necessarily trust the Government, and many are worried by the Government using algorithms to make important decisions that affect their lives. If the Government intend to roll out ADM across the public sector, as they promise, then it is essential to do everything possible along the way to nurture trust with the public. These amendments would go some way to doing that.
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, my Amendment 115 would similarly act in that way by making automated decision-making processes explain themselves to the people affected by them. This would be a much better way of controlling the quality of what is going on with automated decision-making than restricting that sort of information to professionals—to people who are anyway overworked and have a lot of other things to do. There is no one more interested in the decision of an automated process than the person about whom it is being made. If we are to trust these systems, then their ability, far beyond any human’s, to take the time to explain why they took the decision they did—which, if the machine is any good, it knows and can easily set out—is surely the way to generate trust: you can absolutely see what decision has been made and why, and you can respond to it.

This would, beyond anything else, produce a much better system for our young people when they apply for their first job. My daughter’s friends in that position are receiving hundreds of unexplained rejections. This is not a good way to treat young people. It does not help them to improve and understand what is going on. I completely understand why firms do not explain; they have so many applications that they just do not have the time or the personnel to sit down and write a response—but that does not apply to an automated decision-making machine. It could produce a much better situation when it comes to hiring.

As I said, my principal concern, to echo that of the noble Viscount, is that it would give us sight of the decisions that have been taken and why. If it becomes evident that they are taken well and for good reasons, we shall learn to trust them. If it becomes evident that they really are not fair or understandable, we shall be in a position to demand changes.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.

Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is good, because all the amendments she put down in the past were of an excellent quality, well thought through, equally considered and even-handed.

As has been mentioned many times, we have had three versions of a data Bill in just over three years. One wonders whether all the elements of this current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. This goes to the heart of the amendments in this group on automated decision-making.

When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharpest end of the economy and the sharpest end of these automated decisions, often without even knowing that ADM was present. More than that, even on the discovery that ADM was in the mix, depending on which sector of the economy or society they find that decision being made in, they may find themselves with no or precious little redress—employment and recruitment, to name but one sector.

It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.

That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, the point that if someone is subject to an automated decision then they have the right to a personalised explanation of that decision. That explanation should be accessible: in plain language of their choice, without any cost attached and not in any sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.

Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.

In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

My Lords, I shall speak to the series of amendments on automated decision-making to which I have added my name but are mostly in the name of the noble Lord, Lord Clement-Jones. As he said, we had a rehearsal for this debate last Friday when we debated his Private Member’s Bill so I will not delay the Committee by saying much about the generalities of ADMs in the public sector.

Suffice it to say that human involvement in overseeing AIs must be meaningful—for example, without those humans themselves being managed by algorithms. We must ensure that ADMs comply by design with the Equality Act and safeguard data subjects’ other rights and freedoms. As discussed in earlier groups, we must pay particular attention to children’s rights with regard to ADMs, and we must reinforce the obligation on public bodies to use the algorithmic transparency recording standards. I also counsel my noble friend the Minister that, as we have heard, there are many voices from civil society advising me and others that the new Article 22 of the GDPR takes us backwards in terms of protection.

That said, I want to focus on Amendment 123C, relating to ADMs in the workplace, to which I was too late to add my name but would have done. This amendment follows a series of probing amendments tabled by me to the former DPDI Bill. In this, I am informed by my work as the co-chair of the All-Party Parliamentary Group on the Future of Work, assisted by the Institute for the Future of Work. These amendments were also mirrored during the passage of the Procurement Act and competition Act to signal the importance of the workplace, and in particular good work, as a cross-cutting objective and lens for policy orientation.

18:15
I am pleased to note that in each case progress was made. In particular, the CMA, following my probing amendment in January, has initiated some potentially world-leading new labour market investigations, extending to the secondary, often hidden, impacts of concentration associated with the digital giants. I am not saying that there is necessarily a causal link between my amendments and those investigations, but we need more of this sort of work—and I am hoping to continue my winning streak.
At a time of sluggish economic recovery, the UK will benefit from a more cohesive, future-oriented approach to policy-making aimed at supporting transitions and building the capabilities of people and institutions to support pro-human and pro-innovation automation: that is, making the most of human as well as technological capabilities.
There is a significant risk that some significant impacts on people or groups may get lost if the employment, AI and data Bills are not triangulated. Good and effective employment protection must cover transitions as well as hire and fire. I am grateful to techUK for its briefing on these amendments. It broadly supports the Bill and the use of ADMs for things such as faster logging into systems and personalisation, but it refers to a risk-based approach and references employment decisions as being higher risk and needing special attention.
As this amendment argues, we need the introduction of additional principles, thresholds and requirements for the high-risk environment of work. By giving these basic protections, we free people to innovate, including around the use of ADMs. This is increasingly important to recognise in the world of large language models, when new types of automation, new builds and new risks emerge. Automation is not just about displacement but about different types of work, different skills, and different ways of people interacting with technology and imagining different possibilities for the future. There is increasing evidence on this, and if the Minister is willing to meet with me and other members of the All-Party Group on the Future of Work, we can help the UK potentially to develop a gold-standard, world-leading, evidence-driven model for reflexive, context-sensitive, pre-emptive regulation in the workplace and beyond it.
I would want to see algorithmic impact assessments that cover significant impacts on work and workers, such as any impact on equal opportunities or outcomes at work, access to employment, pay, contractual status, terms and conditions of employment, health, lawful association, rights and training. Assessments should also be on an ongoing rather than a snapshot basis, involve those affected, including official representatives, in a proportionate way, and should disclose metrics and methods and be developed by regulators at both a domain and a sector level. I could go on, but I look forward to the Minister’s response.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I speak to Amendment 114, to which I have added my name. It is a very simple amendment that prevents controllers from circumventing the duties for automated decision-making by adding trivial human elements to avoid the designation. As such, it is a very straightforward—and, I would have thought, uncontroversial—amendment. I really hope that the Government will find something in all our amendments to accept, and perhaps that is one such thing.

I am struck that previous speeches have referred to questions that I raised last week: what is the Bill for, who is it for, and why is it not dealing with a host of overlapping issues that cannot really be separated one from another? In general, a bit like the noble Lord, Lord Holmes, I am very much with the spirit of all these amendments. They reflect the view of the Committee and the strong feeling of civil society—and many lawyers—that this sort of attack on Article 22 by Clause 80 downgrades UK data rights at a time when we do not understand the Government’s future plans and hear very little about protections. We hear about the excitement of AI, which I feel bound to say we all share, but it should not come at the expense of individuals.

I raise one last point in this group. I had hoped that the Minister would have indicated the Government’s openness to Amendment 88 last week, which proposed an overarching duty on controllers and processors to provide children with heightened protections. That seemed to me the most straightforward mechanism for ensuring that current standards were maintained and then threaded through new situations and technologies as they emerged. I put those two overarching amendments down on the understanding that Labour, when in opposition, was very much for this approach to children. We may need to bring back specific amendments, as we did throughout the Data Protection and Digital Information Bill, including Amendment 46 to that Bill, which sought to ensure

“that significant decisions that impact children cannot be made using automated processes unless they are in a child’s best interest”.

If the Minister does not support an overarching provision, can she indicate whether the Government would be more open to clause-specific carve-outs to protect children and uphold their rights?

Lord Thomas of Cwmgiedd Portrait Lord Thomas of Cwmgiedd (CB)
- Hansard - - - Excerpts

My Lords, I rise briefly, first, to thank everyone who has spoken so eloquently about the importance of automated decision-making, in particular its importance to public trust and the importance of human intervention. The retrograde step of watering down Article 22 is to be deplored. I am therefore grateful to the noble Lord, Lord Clement-Jones, for putting forward that this part of the Bill should not stand part. Secondly, the specific amendment that I have laid seeks to retain the broader application of human intervention for automated decision-making where it is important. I can see no justification for that watering down, particularly when there is such uncertainty about the scope that AI may bring to what can be done by automated decision-making.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

My Lords, in speaking to this group of amendments I must apologise to the Committee that, when I spoke last week, I forgot to mention my interests in the register, specifically as an unpaid adviser to the Startup Coalition. Noble Lords will realise that, for Committee, I have confined myself to amendments that may be relevant to healthcare and to improving it.

I will speak to Amendments 111 and 116 in the names of my noble friends Lord Camrose and Lord Markham, and Amendment 115 from my noble friend Lord Lucas and the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth, as well as other amendments, including from my noble friend Lord Holmes—I will probably touch on most amendments in this group. To illustrate my concerns, I return to two personal experiences that I shared during debate on the Data Protection and Digital Information Bill. I apologise to noble Lords who have heard these examples previously, but they illustrate the points being made in discussing this group of amendments.

A few years ago, when I was supposed to be travelling to Strasbourg, my train to the airport got delayed. My staff picked me up, booked me a new flight and drove me to the airport. I got to the airport with my new boarding pass and scanned it to get into the gate area but, as I was about to get on the flight, I scanned my pass again and was not allowed on the flight. No one there could explain why, having been allowed through security, I was not allowed on the flight. To cut a long story short, after two hours of being gaslighted by four or five staff, none of whom would even admit that they could not explain what had happened, I eventually had to return to the check-in desk—the very thing all this automation was supposed to avoid—to ask what had happened. The airline claimed that it had sent me an email that day. The next day, it admitted that it had not sent me an email. It then explained what had happened by saying that a flag had gone off in its system. That was the only explanation offered.

This illustrates the point about human intervention, but it is also about telling customers and others what happens when something goes wrong. The company clearly had not trained its staff in how to speak to customers or in transparency. Companies such as that airline get away with this sort of disgraceful behaviour all the time, but imagine if such technology were being used in the NHS. Imagine the same scenario: you turn up for an operation, and you scan your barcode to enter the hospital—possibly even the operating theatre—but you are denied access. There must be accountability, transparency and human intervention, and, in these instances, there has to be human intervention immediately. These things are critical.

I know that this Bill makes some sort of differentiation between more critical and less critical ADM, but let me illustrate my point with another example. A few years ago, I paid for an account with one of those whizzy fintech banks. Its slogan was: “We are here to make money work for everyone”. I downloaded the app and filled out the fields, then a message popped up telling me, “We will get back to you within 48 hours”. Two weeks later, I got a message on the app saying that I had been rejected and that, by law, the bank did not have to explain why. Once again, I ask noble Lords to imagine. Imagine Monzo’s technology being used on the NHS app, which many people currently use for repeat prescriptions or booking appointments. What would happen if you tried to book an appointment but you received a message saying, “Your appointment has been denied and, by law, we do not have to explain why”? I hope that we would have enough common sense to ensure that there is human intervention immediately.

I realise that the noble Lord, Lord Clement-Jones, has a Private Member’s Bill on this issue—I am sorry that I have not been able to take part in those debates—but, for this Bill, I hope that the two examples I have just shared illustrate the point that I know many noble Lords are trying to make in our debate on this group of amendments. I look forward to the response from the Minister.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank all noble Lords who have spoken. I must confess that, of all the groups we are looking at today, I have been particularly looking forward to this one. I find this area absolutely fascinating.

Let me begin in that spirit by addressing an amendment in my name and that of my noble friend Lord Markham, and I ask the Government and all noble Lords to give it considerable attention. Amendment 111 seeks to insert the five principles set out in the AI White Paper published by the previous Government and to require all those participating in ADM—indeed, all forms of AI—to have due regard for them. They are:

“safety, security and robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress”.

These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real safeguards against the risks of AI while continuing to foster innovation.

I will briefly make three points to commend their inclusion in the Bill, as I have described. First, the Bill team has argued throughout that these principles are already addressed by the principles of data protection and so are covered in the Bill. There is overlap, of course, but I do not agree that they are equivalent. Data protection is a significant concern in AI, but the risks and, indeed, the possibilities of AI go far further than data protection. We simply cannot entrust all our AI risks to data protection principles.

Secondly, I think the Government will point to their coming AI Bill and suggest that we should wait for that before we move significantly on AI. However, in practice, all we have to go on regarding that Bill—I recognise that Ministers cannot describe much of it now—is that it will focus on the largest AI labs and the largest models. I assume it will place existing voluntary agreements on a statutory footing. In other words, we do not know when the Bill is coming, and this approach will allow a great many smaller AI fish to slip through the net. If we want to enshrine principles into law that cover all use of AI here, this may not quite be the only game in town, but it is certainly the only all-encompassing, holistic game in town likely to be positively impactful. I look forward to the Minister’s comments on this point.

18:30
Thirdly, prescriptive regulation for AI is difficult because the technology moves so fast, but what will not change quickly, if ever, are the principles. That is why it will be so valuable to have them set out in the Bill. This is not a prescriptive approach; it is one that specifies the outcomes we want and gives agency to those best placed to bring them about. I strongly commend this approach to noble Lords and look forward to the Minister’s comments.
I turn to other amendments tabled in my name. Amendments 114A and 115A are both necessary to remove the Secretary of State’s regulation-making powers under Clause 80 and Article 22D, and I thank the noble Viscount, Lord Colville, for co-signing them. As the Bill stands, the Secretary of State can, by regulation, decide whether there has or has not been meaningful human involvement in ADM, whether a decision has had an adverse effect similar to that of an adverse legal effect and what safeguards should be in place around an ADM. Like the noble Viscount, Lord Colville, I am concerned here about mission creep and micromanagement. Each of these types of decision would, I feel, be best taken by the data controllers, or the courts, in the event of disputes. I suggest it would be better if the Secretary of State were to publish guidance setting out what should be considered meaningful human involvement and what level of adversity would equate to adverse legal consequences and making suggestions for what would constitute suitable safeguards. This would allow the Government to shape how ADM is deployed while also giving companies using AI-driven ADM flexibility and agency to make it work for their circumstances.
Amendment 116 would require the Secretary of State to provide guidance on how consent should be obtained for ADM. This amendment would provide guidance for data controllers who wish to use ADM, helping them to set clear processes for obtaining consent, thus avoiding complaints and potential litigation. Amendment 117 would prevent children giving consent for their special category data to be used in ADM. Special category data reveals some of the most personal details about people’s lives, details which should not be shared without good reason. Allowing children to disclose their special category data raises safeguarding concerns as this information may be, perhaps unwittingly, made available to people unsuited to receive it. In law, we take the view that children lack the life experience to see all ends and should not be allowed to make decisions that could put them in harm’s way. I do not see why we should depart from this wisdom in the context of ADM.
Finally, Amendment 118 would ensure that human intervention in ADM
“is carried out by a person with sufficient competency and authority and is, therefore, effective”.
My view is that this should remove the grounds for concern behind Amendment 114, which would introduce this concept of “predominantly” automated processing. To me, that weakens and obscures the binary elegance and clarity of the rule: either a decision is solely automated or it is not. In the former case, certain protections kick in. Once we introduce graduated degrees of automation, we muddy the waters with needless complexity.
All of this depends, though, on a genuinely robust and effective definition of what kind of human input is required. Without Amendment 118, data subjects may find themselves in a situation where they have requested intervention by a human being, only to realise that the person intervening has neither sufficient knowledge to understand the nature of the problem nor the power to rectify any problems should they be identified, rendering the whole process not very far from pointless. That said, I will not press my amendments.
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we have had a really profound and significant debate on these issues; it has been really helpful that they have been aired by a number of noble Lords in a compelling and articulate way. I thank everybody for their contributions.

I have to say at the outset that the Government want data protection rules fit for the age of emerging technologies. The noble Lord, Lord Holmes, asked whether we are addressing issues of the past or issues of the future. We believe that the balance we have in this Bill is exactly about addressing the issues of the future. Our reforms will reduce barriers to the responsible use of automation while clarifying that organisations must provide stringent safeguards for individuals.

I stress again how seriously we take these issues. A number of examples have been quoted as the debate has gone on. I say to those noble Lords that examples were given where there was no human involved. That is precisely what the new provisions in this Bill attempt to address, in order to make sure that there is meaningful human involvement and people’s futures are not being decided by an automated machine.

Amendment 110, tabled by the noble Lords, Lord Clement-Jones and Lord Knight, seeks to clarify that, for human involvement to be meaningful, it must be carried out by a competent person. Our reforms make clear that solely automated decisions are those lacking meaningful human involvement, and that such involvement must go beyond a tick-box exercise. The ICO guidance also clarifies that

“the human involvement has to be active and not just a token gesture”;

that right is absolutely underpinned by the wording of the regulations here.

I turn next to Amendment 111. I can assure—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I was listening very carefully. Does “underpinned by the regulations” mean that it will be underpinned?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes. The provisions in this Bill cover exactly that concern.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The issue of meaningful human involvement is absolutely crucial. Is the Minister saying that regulations issued by the Secretary of State will define “meaningful human involvement”, or is she saying that it is already in the primary legislation, which is not my impression?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Sorry—it is probably my choice of language. I am saying that it is already in the Bill; it is not intended to be separate. I was talking about whether solely automated decisions lack meaningful human involvement. This provision is already set out in the Bill; that is the whole purpose of it.

On Amendment 111, I assure the noble Viscount, Lord Camrose, that controllers using solely automated processing are required to comply with the data protection principles. I know that he was anticipating this answer, but we believe that it captures the principles he proposes and achieves the same intended effect as his amendment. I agree with the noble Viscount that data protection is not the only lens through which AI should be regulated, and that we cannot address all AI risks through the data protection legislation, but the data protection principles are the right ones for solely automated decision-making, given its place in the data protection framework. I hope that that answers his concerns.

On Amendment 112, which seeks to prohibit solely automated decisions that contravene the Equality Act 2010, I assure the noble Lords, Lord Clement-Jones and Lord Knight, that the data protection framework is clear that controllers must adhere to the Equality Act.

Amendments 113 and 114 would extend solely automated decision-making safeguards to predominantly automated decision-making. I assure the noble and learned Lord, Lord Thomas, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that the safeguards in Clause 80 are designed to protect individuals where meaningful human involvement is lacking. Predominantly automated decision-making will already include meaningful human involvement and therefore does not require these additional safeguards.

On Amendments 114A and 115A, tabled by the noble Viscount, Lord Camrose, many noble Lords have spoken in our debates about the importance of future-proofing the legislation. These powers are an example of that: without them, the Government will not have the ability to act quickly to update protections for individuals in the light of rapid technology developments.

I assure noble Lords that the regulation powers are subject to a number of safeguards. The Secretary of State must consult the Information Commissioner and have regard to other relevant factors, which can include the impact on individuals’ rights and freedoms as well as the specific needs and rights of children. As with all regulations, the exercise of these powers must be rational; they cannot be used irrationally or arbitrarily. Furthermore, the regulations will be subject to the affirmative procedure and so must be approved by both Houses of Parliament.

I assure the noble Lord, Lord Clement-Jones, that one of the powers means that his Amendment 123 is not necessary, as it can be used to describe specifically what is or is not meaningful human involvement.

Amendment 115A, tabled by the noble Viscount, Lord Camrose, would remove the reforms to Parts 3 and 4 of the Data Protection Act, thereby putting them out of alignment with the UK GDPR. That would cause confusion and ambiguity for data subjects.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I am sorry to interrupt again as we go along but, a sentence or so ago, the Minister said that the definition in Amendment 123 of meaningful human involvement in automated decision-making was unnecessary. The amendment is designed to change matters. It would not be the Secretary of State who determined the meaning of meaningful human involvement; in essence, it would be initiated by the Information Commissioner, in consultation with the Secretary of State. So I do not quite understand why the Minister used “unnecessary”. It may be an alternative that is undesirable, but I do not understand why she has come to the conclusion that it is unnecessary. I thought it was easier to challenge the points as we go along rather than at the very end.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we would say that a definition in the Bill is not necessary because it is dealt with case by case and is supplemented by these powers. The Secretary of State does not define meaningful human involvement; it is best done case by case, supported by the ICO guidance. I hope that that addresses the noble Lord’s point.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:

“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.


He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Will the Minister reflect on the issues around a case-by-case basis? If I were running an organisation of any sort and decided I wanted to use ADM, how would I make a judgment about what is meaningful human involvement on a case-by-case basis? It implies that I would have to hope that my judgment was okay, because I have not had clarity from anywhere else, and that, in retrospect, someone might come after me if I got that judgment wrong. I am not sure that works, so will she reflect on that at some point?

18:45
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

The Secretary of State can help describe specific cases in the future but, on the point made by my noble friend Lord Knight, the ICO guidance will clarify some of that. There will be prior consultation with the ICO before that guidance is finalised, but if noble Lords are in any doubt about this, I am happy to write and confirm that in more detail.

Amendment 115 in the names of the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Knight, and Amendment 123A in the name of the noble Lord, Lord Holmes, seek to ensure that individuals are provided with clear and accessible information about solely automated decision-making. The safeguards set out in Clause 80, alongside the wider data protection framework’s safeguards, such as the transparency principle, already achieve this purpose. The UK GDPR requires organisations to notify individuals about the existence of automated decision-making and provide meaningful information about the logic involved in a clear and accessible format. Individuals who have been subject to solely automated decisions must be provided with information about the decisions.

On Amendment 116 in the names of the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, I reassure noble Lords that Clause 69 already provides a definition of consent that applies to all processing under the law enforcement regime.

On Amendment 117 in the names of the noble Viscount, Lord Camrose, the noble Lord, Lord Markham, and my noble friend Lord Knight, I agree with them on the importance of protecting children’s sensitive personal data held by law enforcement agencies, and there is extensive guidance on this issue. However, consent is rarely used as the basis for processing law enforcement data; other law enforcement purposes, such as the prevention, detection and investigation of crime, are used instead.

I will address Amendment 118 in the name of the noble Viscount, Lord Camrose, and Amendment 123B in the name of the noble Lord, Lord Holmes, together, as they focus on obtaining human intervention for a solely automated decision. I agree that human intervention should be carried out competently and by a person with the authority to correct a wrongful outcome. However, the Government believe that there is currently no need to specify the qualifications of human reviewers as the ICO’s existing guidance explains how requests for human review should be managed.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

On the question of qualification, the Minister may wish to reflect on the broad discussions we have had in the past around certification and the role it may play. I gently take her back to what she said on Amendment 123A about notification. Does she see notification as the same as a personalised response to an individual?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Noble Lords have asked several questions. First, in response to the noble Viscount, Lord Camrose, I think I am on the same page as him about keeping this binary rather than muddying the water by having degrees of meaningful intervention. The ICO already has guidance on how human review should be provided, and this will be updated after the Bill passes to ensure that it reflects what is meant by “meaningful human involvement”. Those issues will be addressed in the ICO guidance but, if it helps, I can write further on that.

I have forgotten the question that the noble Lord, Lord Holmes, asked me. I do not know whether I have addressed it.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

In her response the Minister said “notification”. Does she see notification as the same as “personalised response”?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My understanding is that it would be. Every individual who was affected would receive their own notification rather than it just being on a website, for example.

Let me just make sure I have not missed anyone out. On Amendment 123B on addressing bias in automated decision-making, compliance with the data protection principles, including accuracy, transparency and fairness, will ensure that organisations take the necessary measures to address the risk of bias.

On Amendment 123C from the noble Lord, Lord Clement-Jones, I reassure him that the Government strongly agree that employment rights should be fit for a modern economy. The plan to make work pay will achieve this by addressing the challenges introduced by new trends and technologies. I agree very much with my noble friend Lord Knight that, although we have to get this right, there are opportunities for a different form of work, and we should not see this just as a potential negative impact on people’s lives. However, we want to get the balance right with regard to the impact on individuals, to make sure that we get the best out of it rather than the possible negative effects.

Employment rights law, rather than data protection law in isolation, is better suited to regulating the specific use of data and technology in the workplace, as data protection law sets out general rules and principles for processing that apply in all contexts. Noble Lords can rest assured that we take the impact on employment and work very seriously and, as part of our plan to make work pay and the Employment Rights Bill, we will return to these issues.

On Amendments 119, 120, 121 and 122, tabled by the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and my noble friend Lord Knight, the Government share the noble Lords’ belief in the importance of public sector algorithmic transparency, and, as the noble Lord, Lord Clement-Jones, reminded us, we had a very good debate on this last week. The algorithmic transparency recording standard is already mandatory for government departments and arm’s-length bodies. This is a cross-government policy mandate underpinned by digital spend controls, which means that when budget is requested for a relevant tool, the team in question must commit to publishing an ATRS record before receiving the funds.

As I said on Friday, we are implementing this policy accordingly, and I hope to publish further records imminently. I very much hope that when noble Lords see what I hope will be a significant number of new records on this, they will be reassured that the nature of the mandation and the obligation on public sector departments is working.

Policy routes also enable us to provide detailed guidance to the public sector on how to carry out its responsibilities and monitor compliance. Examples include the data ethics framework, the generative AI framework, and the guidelines for AI procurement. Additionally, the data protection framework already achieves some of the intended outcomes of these amendments. It requires organisations, including public authorities, to demonstrate how they have identified and mitigated risks when processing personal data. The ICO provides guidance on how organisations can audit their privacy management and ensure a high level of data protection compliance.

I know I have given a great deal of detail there. If I have not covered all the points that the noble Lords have raised, I will write. In the meantime, given the above assurances, I hope that the noble Lord will withdraw his amendment.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I would be very grateful if the Minister wrote to me about Amendment 115. I have done my best, before and after, to study Clause 80 to understand how it provides the safeguards she describes, and I have failed. If she or her officials could take the example of a job application, and take me through the clauses to show what sort of response would be expected and how that is set out in the legislation, I would be most grateful.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am happy to write.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for her very detailed and careful response to all the amendments. Clearly, from the number of speakers in this debate, this is one of the most important areas of the Bill and one that has given one of the greatest degrees of concern, both inside and outside the Committee. I think the general feeling is that there is still concern. The Minister is quite clear that the Government are taking these issues seriously, in terms of ADM itself and the impact in the workplace, but there are missing parts here. If you add all the amendments together—no doubt we will read Hansard and, in a sense, tick off the areas where we have been given an assurance about the interpretation of the Bill—there are still great gaps.

It was very interesting to hear what the noble Lord, Lord Kamall, had to say about how the computer said “no” as he reached the gate. A lot of this is about communications. I would be very interested if any letter to the noble Lord, Lord Lucas, was copied more broadly, because that is clearly one of the key issues. It was reassuring to hear that the ICO will be on top of this in terms of definitions, guidance, audit and so on, and that we are imminently to get the publication of the records of algorithmic systems in use under the terms of the algorithmic transparency recording standard.

We have had some extremely well-made points from the noble Viscounts, Lord Colville and Lord Camrose, the noble Lords, Lord Lucas, Lord Knight and Lord Holmes, and the noble Baroness, Lady Kidron. I am not going to unpack all of them, but we clearly need to take this further and chew it over before we get to Report. I very much hope that the Minister will regard a “will write” letter on stilts as required before we go very much further, because I do not think we will be entirely satisfied by this debate.

The one area where I would disagree is on treating solely automated decision-making as the pure subject of the Clause 80 rights. Looking at it in the converse, it is perfectly proper to regard something that does not have meaningful human involvement as predominantly automated decision-making. I do not think, in the words of the noble Viscount, Lord Camrose, that this does muddy the waters. We need to be clearer about what we regard as being automated decision-making for the purpose of this clause.

There is still quite a lot of work to do in chewing over the Minister’s words. In the meantime, I beg leave to withdraw my amendment.

Amendment 110 withdrawn.
Amendments 111 to 118 not moved.
Clause 80 agreed.
19:00
Amendments 119 to 123C not moved.
Schedule 6 agreed.
Clause 81: Logging of law enforcement processing
Debate on whether Clause 81 should stand part of the Bill.
Member’s explanatory statement
This seeks to retain the requirement for police forces to record the reason they are accessing data from a police database.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, a key aspect of data protection rests in how it restricts the use of personal data once it has been collected. The public need confidence that their data will be used for the purposes for which they shared it and not further used in ways that breach their legitimate expectations—otherwise they will become suspicious about providing their data at all. The underlying theme that we heard on the previous group was the danger of losing public trust, which very much applies in the area of law enforcement and national security.

However, Schedules 4 and 5 would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact that this would have on their rights, for the purposes of national security, crime detection and prevention, safeguarding or answering to a request by a public authority. Data used for the purposes listed in these schedules would not need to undergo either a balancing test under Article 6.1(f) or a compatibility test under Article 6.4 of the UK GDPR. The combined effect of these provisions would be to authorise almost unconditional data sharing for law enforcement and other public security purposes while, at the same time, reducing accountability and traceability over how the police use the information being shared with them.

As with the previous DPDI Bill, Clauses 87 to 89 of this Bill grant the Home Secretary and police powers to view and use people’s personal data through the use of national security certificates and designation notices, which are substantially the same as Clauses 28 to 30 of the previous DPDI Bill. This risks further eroding trust in law enforcement authorities. Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure that individuals’ rights and expectations are not disproportionately impacted. It is baffling why the Government are so slavishly following their predecessor in believing that these new and unaccountable powers are necessary.

By opposing that Clause 81 stand part, I seek to retain the requirement for police forces to record the reason they are accessing data from a police database. The public need more, not less, transparency and accountability over how, why and when police staff and officers access and use records about them. Just recently, the Met Police admitted that they investigated more than 100 staff over the inappropriate accessing of information in relation to Sarah Everard. This shows that the police can and do act to access information inappropriately, and there may well be less prominent cases where police abuse their power by accessing information without worry for the consequences.

Regarding Amendments 126, 128 and 129, Rights and Security International has repeatedly argued that the Bill would violate the UK’s obligations under the European Convention on Human Rights. On Amendment 126, the requirements in the EU law enforcement directive for logging are, principally, to capture in all cases the justification for personal data being examined, copied, amended or disclosed when it is processed for a law enforcement purpose—the objective is clearly to ensure that data is processed only for a legitimate purpose—and, secondarily, to identify when, how and by whom the data has been accessed or disclosed. This ensures that individual accountability is captured and recorded.

Law enforcement systems in use in the UK typically capture some of the latter information in logs, but very rarely do they capture the former. Nor, I am informed, do many commodity IT solutions on the market capture why data was accessed or amended by default. For this reason, a long period of time was allowed under the law enforcement directive to modify legacy systems installed before May 2016, which, in the UK, included services such as the police national computer and the police national database, along with many others at a force level. This transitional relief extended to 6 May 2023, but UK law enforcement did not, in general, make the required changes. Nor, it seems, did it ensure that all IT systems procured after 6 May 2016 included a strict requirement for LED-aligned logging. By adopting and using commodity and hyperscaler cloud services, it has exacerbated this problem.

In early April 2023, the Data Protection Act 2018 (Transitional Provision) Regulations 2023 were laid before Parliament. These regulations had the effect of unilaterally extending the transitional relief period under the law enforcement directive for the UK from May 2023 to May 2026. The Government now wish to remove entirely the requirement to capture the justification for any access to data, on the basis that this would free up as much as 1.5 million hours a year of valuable police time for our officers, so that they can focus on tackling crime on our streets rather than being bogged down by administration, and that it would save approximately £42.8 million per year in taxpayers’ money.

This is a serious legislative issue on two counts: it removes important evidence that may identify whether a person was acting with malicious intent when accessing data, as well as removing the deterrent effect of their having to record a justification; and it directly deviates from a core part of the law enforcement directive and will clearly have an impact on UK data adequacy. The application of effective control over access to data is very much a live issue in policing, and changing the logging requirement in this way does nothing to improve police data management. Rather, it excuses and perpetuates bad practice. Nor does it increase public confidence.

Clause 87(7) introduces new Section 78A into the Act. This lays down a number of exemptions and exclusions from Part 3 of that Act when the processing is deemed to be in the interests of national security. These exemptions are wide-ranging and include the ability to suspend or ignore principles 2 to 6 in Part 3, and thus run directly contrary to the provisions and expectations of the EU law enforcement directive. Ignoring those principles also negates many of the controls and clauses across Part 3 as a whole. As a result, these exemptions will almost certainly result in the immediate loss of EU law enforcement adequacy.

I welcome the ministerial letter from the noble Lord, Lord Hanson of Flint, to the noble Lord, Lord Anderson, of 6 November, but was he really saying that all the national security exemption clause does is bring the 2018 Act into conformity with the GDPR? I very much hope that the Minister will set out for the record whether that is really the case and whether it is really necessary to safeguard national security. Although it is, of course, appropriate and necessary for the UK to protect its national security interests, it is imperative that balance remains to protect the rights of a data subject. These proposals do not, as far as we can see, strike that balance.

Clause 88 introduces the ability of law enforcement, competent authorities and intelligence agencies to act as joint controllers in some circumstances. If Clause 88 and associated clauses go forward to become law, they will almost certainly again result in withdrawal of UK law enforcement adequacy and will quite likely impact on the TCA itself.

Amendment 127 is designed to draw attention to the fact that there are systemic issues with UK law enforcement’s new use of hyperscaler cloud service providers to process personal data. These issues stem from the fact that service providers’ standard contracts and terms of service fail to meet the requirements of Part 3 of the UK’s Data Protection Act 2018 and the EU law enforcement directive. UK law enforcement agencies are subject to stringent data protection laws, including Part 3 of the DPA and the GDPR. These laws dictate how personal data, including that of victims, witnesses, suspects and offenders, can be processed. Part 3 specifically addresses data transfers to third countries, with a presumption against such transfers unless strictly necessary. This contrasts with the UK GDPR, which allows routine overseas data transfer with appropriate safeguards.

Cloud service providers routinely process data outside the UK and lack the necessary contractual guarantees and legal undertakings required by Part 3 of the DPA. As a result, their use for law enforcement data processing is, on the face of it, not lawful. This non-compliance creates significant financial exposure for the UK, including potential compensation claims from data subjects for distress or loss. The sheer volume of data processed by law enforcement, particularly body-worn video footage, exacerbates the financial risk. If only a small percentage of cases result in claims, the compensation burden could reach hundreds of millions of pounds annually. The Government’s attempts to change the law highlight the issue and suggest that past processing on cloud service providers has not been in conformity with the UK GDPR and the DPA.

The current effect of Section 73(4)(b) of the Data Protection Act is to prevent competent authorities—which may have a legitimate operating need, and should possess the internal capability to assess that need—from making transfers to recipients, such as cloud service providers, that are not relevant authorities or international organisations. This amendment is designed to probe what impact removal of this restriction would have and whether it would enable such authorities to make transfers where they are justified and necessary. I beg to move.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Non-Afl)
- Hansard - - - Excerpts

My Lords, I will speak to Amendment 124. I am sorry that I was not able to speak on this issue at Second Reading. I am grateful to the noble and learned Lord, Lord Thomas of Cwmgiedd, for his support, and I am sorry that he has not been able to stay, due to a prior engagement.

Eagle-eyed Ministers and the Opposition Front Bench will recognise that this was originally tabled as an amendment to the Data Protection and Digital Information (No. 2) Bill. It is still supported by the Police Federation. I am grateful to the former Member of Parliament for Loughborough for originally raising this with me, and I thank the Police Federation for its assistance in briefing us and in preparing this draft clause. The Police Federation understands that the Home Secretary is supportive of the objective of this amendment, so I shall listen with great interest to what the Minister has to say.

This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise in relation to a situation where the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. Given that this issue was discussed during the previous Bill, I do not intend to go into huge amounts of detail, because we rehearsed the arguments there, but I hope very much that, with the new Government, there might be a willingness to entertain this as a change in the law.

19:15
The point of the amendment is that the existing data protection legislation requires our police forces to spend huge amounts of time and resources: first, going through information that has been gathered by investigating officers to identify every single item of personal data contained in that information; secondly, deciding whether it is necessary or, in many cases, strictly necessary for the CPS to consider each item of personal data when making its charging decision; and, thirdly, redacting every item of personal data that does not meet that test. I ask the Committee to imagine, with things such as body cameras being worn by the police today, just how much personal data is being collected by every officer. The Police Federation and the National Police Chiefs’ Council estimate that the national cost of this redaction exercise is approximately £5.6 million per annum and that, since 1 January 2021, 365,000 policing hours have been consumed by that exercise.
It is potentially unnecessary in the case of any given case file because the CPS decides to charge in only approximately 75% of cases; so, in the 25% of cases where the CPS decides not to charge, the unredacted file could simply be deleted by the CPS. Where the CPS decides to charge, the case file could be returned to the police force to carry out the redaction exercise before there is any risk of that file being disclosed to any person or body other than the CPS.
The simple and practical solution set out in the amendment is for the police to carry out the redaction exercise in relation to any given case file only after the CPS has taken the decision to charge. What is proposed would not remove any substantive protection of the personal data in question. It would not remove the obligation to review and redact the personal data contained in material in the case file. It would simply provide for that review and redaction to be conducted by the police after, rather than before, a charging decision has been made by the CPS.
The Police Federation has discussed this issue and is grateful to the Home Office for the meeting that happened on 25 April. There appear to be two main objections, which I shall touch on. The first was that, even if the redaction of case files before submission to the CPS for a charging decision were not required by the data protection legislation, it is required by Article 8 of the ECHR. The second objection was that it was appropriate for the police to carry out the redaction exercise on case files before submission to the CPS because that would mean the case files would not contain irrelevant personal data, which could give rise to potential pitfalls further down the line.
I will not set out the long explanations but, basically, in relation to the point about Article 8, the discussion and considerations have demonstrated clearly that it is the data protection legislation, not Article 8 of the ECHR, which requires the burdensome redaction exercise. Secondly, in relation to the “further down the line” objection, the short answer is that, under the proposal in the amendment, if a decision is made by the CPS not to charge then, as I said, the unredacted file can simply be deleted or placed in secure storage. It would not go any further down the line; it would do so only if a decision was made to charge, in which case the file could be redacted in the usual way.
This change would speed up the criminal justice process. It would considerably reduce the financial burden on the taxpayer and the massive number of police hours committed. Everything we hear from the current Government, with which I have huge amounts of sympathy, says that there is a need to reduce pressure on the public purse and to free up police time so that officers can get back on to the streets and do what I think all of us hope they will do: spending time on the streets, supporting victims and catching criminals, not spending hours redacting lots of images from body-worn cameras just in case the CPS happens to use them in a charging decision. I look forward to hearing from the Minister in due course.
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I have Amendment 201 in this group. At the moment, Action Fraud does not record attempted fraud; the fraud has to have been successful for the website to agree to record it. I think that results in the Government taking decisions based on distorted and incomplete data. Collecting full data must be the right thing to do.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look at this carefully and positively, to find a way of giving the police the legal right to delete such data in those circumstances.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.

Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.

As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.

Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.

I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.

Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.

Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.

I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.

Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, including the requirement to record the time and date of access and, where possible, who has accessed the data; these are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.

Lord Clement-Jones (LD)

There were some raised eyebrows.

Baroness Jones of Whitchurch (Lab)

Yes, we could not see the noble Lord’s raised eyebrows.

Turning to Amendment 124, I thank the noble Baroness, Lady Morgan, for raising this important issue. While I obviously understand and welcome the intent, I do not think that the legislative change is what is required here. The Information Commissioner’s Office agrees that the Data Protection Act is not a barrier to the sharing of personal data between the police and the CPS. What is needed is a change in the operational processes in place between the police and the CPS that are causing this redaction burden that the noble Baroness spelled out so coherently.

We are very much aware that this is an issue and, as I think the noble Baroness knows, the Government are committed to reducing the burden on the police and the Home Office and to exploring with partners across the criminal justice system how this can best be achieved. We absolutely understand the point that the noble Baroness has raised, but I hope that she will agree to give the Home Office and the CPS space to find a resolution, so that redaction is not required where it serves no purpose. It is an ongoing discussion—as I know the noble Baroness appreciates—and I hope that she will not pursue the amendment on that basis.

I will address Amendments 126 to 129 together. These amendments seek to remove parts of Schedule 8 to avoid divergence from EU legislation. The noble Lord, Lord Clement-Jones, proposes instead to remove existing parts of Section 73 of the Data Protection Act 2018. New Section 73(4)(aa), introduced by this Bill, with its bespoke path for personal data transfers from UK controllers to international processors, is crucial. In the modern age, where the use of such capabilities and the benefits they provide is increasing, we need to ensure that law enforcement can make effective use of them to tackle crime and keep citizens safe.

19:30
The aim of this reform is to provide legal clarity in the Bill to law enforcement agencies in the UK, so that they can embrace the technology they need and make use of international processors with confidence. Such transfers are already permissible under the legislation, but we know that there is some ambiguity in how the law can be applied in practice. This reform intends to remove those obstacles. The noble Lord would like us to refrain from diverging from EU law. I believe that we have drafted the provisions in this Bill, including this one, with retaining adequacy in mind. As the noble Lord is aware, the Government are committed to maintaining our EU adequacy status.
I turn to the Clause 87 stand part notice. This clause replaces the current national security exemption under the law enforcement regime with a revised version that mirrors the exemptions already available to organisations operating under the UK GDPR and intelligence services regimes. It is essential that competent authorities have access to the full range of exemptions so that they are properly able to safeguard national security. For instance, if a law enforcement agency is investigating a data subject who it suspects may be involved in an imminent terrorist attack, it is likely to need to share personal data with other agencies at very short notice.
Turning to the stand part notices on Clauses 88 and 89, these two clauses will enable qualifying competent authorities to jointly process data with an intelligence service under part 4 of the Data Protection Act 2018 in circumstances where it is required to safeguard national security. Part 4 of the 2018 Act regulates processing by the intelligence services, so to jointly process data in this manner the Secretary of State must approve the proposed processing by issuing a designation notice. That notice can be issued only following consultation with the ICO and if the Secretary of State is satisfied that processing is necessary to safeguard national security.
These joint partnerships were previously possible under the Data Protection Act 1998, while reports on the Manchester Arena and Fishmongers’ Hall attacks highlight the public interest in closer joint working between law enforcement bodies and the intelligence services in matters of national security. I think the noble Lord also referenced the noble Lord, Lord Anderson. My understanding is that he has given his support to our proposals in the Bill.
With regard to Amendment 201, tabled by the noble Lord, Lord Lucas, attempted fraud can currently be reported to Action Fraud, the national reporting service for fraud and cybercrime. However, I can reassure the noble Lord that an improved service is being worked on, making the best use of technology to ensure the best experience for victims, intelligence for law enforcement and public data on fraud issues. All reports are analysed for intelligence that could support law enforcement in pursuing criminals and keeping the public safe. Key data, including outcomes, are published online and summarised in an interactive dashboard. I understand that further work is taking place on that improved service to replace Action Fraud. I therefore hope that the noble Lord will give the space for those proposals to come forward.
Finally, I turn to Amendment 210, tabled by the noble Baroness, Lady Owen; she is not here to speak to it. I put on record that we share her desire that images used to commit offences under Sections 66A or 66B of the Sexual Offences Act 2003 be removed from convicted offenders. However, there is already a process for this to happen. Under Section 153 of the Sentencing Act 2020, the court has the power to deprive an offender convicted of these offences of any property, including images, used for the purpose of committing those offences. Although judges’ use of these powers is a matter of judicial independence, we will closely examine the position and will revisit it if changes prove necessary.
Considering all the explanations I have given, I hope that noble Lords will withdraw or not press their amendments.
Lord Clement-Jones (LD)

My Lords, I thank the Minister for her response on this group, which was, again, very detailed. There is a lot to consider in what she had to say, particularly about the clauses beyond Clause 81. I am rather surprised that the current Government are still going down the same track on Clause 81. It is as if, because the risk of abuse is so high, this Government, like the previous one, have decided that it is not necessary to have the safeguard of putting down the justification in the first place. Yet we have heard about the Sarah Everard police officers. It seems to me perverse not to require justification. I will read further what the Minister had to say but it seems quite extraordinary to be taking away a safeguard at this time, especially when the Minister says that, at the same time, they need to produce logs of the time of the data being shared and so on. I cannot see what is to be gained—I certainly cannot see £42 million being saved. It is a very precise figure: £42.8 million. I wonder where the £800,000 comes from. It seems almost too precise to be credible.

Baroness Jones of Whitchurch (Lab)

I emphasise that we believe the safeguards are there. This is not a watering down of provisions. We are just making sure that the safeguards are more appropriate for the sort of abuse that we think might happen in future from police misusing their records. I do not want it left on the record that we do not think that is important.

Lord Clement-Jones (LD)

No. As I was saying, it seems that the Minister is saying that there will still be the necessity to log the fact that data has been shared. However, it seems extraordinary that, at the same time, it is not possible to say what the justification is. The justification could be all kinds of things, but it makes somebody think before they simply share the data. It seems to me that, given the clear evidence of abuse of data by police officers—data of the deceased, for heaven’s sake—we need to keep all the safeguards we currently have. That is a clear bone of contention.

I will read what else the Minister had to say about the other clauses in the group, which are rather more sensitive from the point of view of national security, data sharing abroad and so on.

Clause 81 agreed.
Amendment 124 not moved.
Clauses 82 to 84 agreed.
Amendment 125 not moved.
Schedule 7 agreed.
Schedule 8: Transfers of personal data to third countries etc: law enforcement processing
Amendments 126 to 129 not moved.
Schedule 8 agreed.
Schedule 9 agreed.
Clause 85: Safeguards for processing for research etc purposes
Amendments 130 to 132 not moved.
Clause 85 agreed.
Clauses 86 to 88 agreed.
Clause 89: Joint processing: consequential amendments
Amendment 133
Moved by
133: Clause 89, page 112, line 24, at end insert—
“(10) In section 199(2)(a) of the Investigatory Powers Act 2016 (bulk personal datasets: meaning of “personal data”), after “section 82(1) of that Act” insert “by an intelligence service”.”
Member’s explanatory statement
Clause 88 of the Bill amends section 82 in Part 4 of the Data Protection Act 2018 (intelligence services processing). This amendment makes a consequential change to a definition in the Investigatory Powers Act 2016 which cross-refers to section 82.
Lord Vallance of Balham (Lab)

These four technical government amendments do not, we believe, have a material policy effect but will improve the clarity and operation of the Bill text.

Amendment 133 amends Section 199 of the Investigatory Powers Act 2016, which provides a definition of “personal data” for the purposes of bulk personal datasets. This definition cross-refers to Section 82(1) of the Data Protection Act 2018, which is amended by Clauses 88 and 89 of the Bill, providing for joint processing by the intelligence services and competent authorities. This amendment will retain the effect of that cross-reference to ensure that processing referred to in Section 199 of the IPA remains that done by an intelligence service.

Amendment 136 concerns Clause 92 and ICO codes of practice. Clause 92 establishes a new procedure for panels to consider ICO codes of practice before they are finalised. It includes a regulation-making power for the Secretary of State to disapply or modify that procedure for particular codes or amendments to them. Amendment 136 will enable the power to be used to disapply or modify the panel’s procedure for specific amendments or types of amendments to a code, rather than for all amendments to it.

Finally, Amendments 213 and 214 will allow for changes made to certain immigration legislation and the Online Safety Act 2023 by Clauses 55, 122 and 123 to be extended via existing powers in those Acts, exercisable by Orders in Council, to Guernsey and the Isle of Man, should they seek this.

I beg to move.

Viscount Camrose (Con)

My Lords, I will keep my comments brief as these are all technical amendments to the Bill. I understand that Amendments 133 and 136 are necessary for the functioning of the law and therefore have no objection. As for Amendment 213, extending immigration legislation amended by Clause 55 of this Bill to the Bailiwick of Guernsey or the Isle of Man, this is a sensible measure. The same can be said for Amendment 214, which extends the provision of the Online Safety Act 2023, amended by this Bill, to the Bailiwick of Guernsey or the Isle of Man.

Lord Vallance of Balham (Lab)

I thank the noble Viscount.

Amendment 133 agreed.
Clause 89, as amended, agreed.
Clause 90: Duties of the Commissioner in carrying out functions
Amendment 134
Moved by
134: Clause 90, page 113, leave out lines 1 to 5 and insert—
“(a) to monitor the application of GDPR, the applied GDPR and this Act, and ensure they are fully enforced with all due diligence;
(b) to act upon receiving a complaint, to investigate, to the extent appropriate, the subject matter of the complaint, and to take steps to clarify unsubstantiated issues before dismissing the complaint.”
Member’s explanatory statement
This amendment removes the secondary objectives introduced by the Data Use and Access Bill, which frame innovation, competition, crime prevention and national security as competing objectives against the enforcement of data protection law.
Lord Clement-Jones (LD)

My Lords, in moving Amendment 134—it is the lead amendment in this group—I shall speak to the others in my name and my Clause 92 stand part notice. Many of the amendments in this group stem from concerns that the new structure for the ICO will diminish its independence. The ICO is abolished in favour of the commission.

19:45
Part 6, which includes Clauses 115 to 118, establishes the information commission to replace the existing regulator. This provision abolishes the ICO and transfers all the duties and responsibilities of the existing commissioner to the new body corporate. Page 9 of the Bill’s Explanatory Notes explains that this change would give the regulator
“a more modern structure—while maintaining its independence”.
However, under Clause 91, the commissioner is required to consult the Secretary of State before preparing or amending codes of practice.
The problem remains, too, that the Secretary of State appoints the most important members of the commission. This ability to appoint has the potential to give the Secretary of State undue influence over the commission’s decision-making processes. What checks and balances, if any, will there be on the identity of commission members? The independence of the Information Commissioner’s Office was a key component in securing the designation of EU-UK data adequacy. We are concerned that the transition to the information commission could dilute that independence, thereby representing another threat to data adequacy.
We are concerned that requiring the commissioner to consult the Secretary of State may also present the possibility of political influence. The commissioner may be put under pressure to support the Government’s growth and innovation agenda, which may be in tension with the need to protect data subjects’ personal data and data protection rights. The stand part notices for Clauses 91 and 92 would limit the Secretary of State’s powers and leeway to interfere with the objective and impartial functioning of the new information commission.
We believe that the Government should remove the provisions compelling the commissioner to consult the Secretary of State, thus reaffirming the UK’s commitment to the regulator’s independence. If so amended, the Bill would ensure that the new Information Commissioner is at sufficient arm’s length from the Government to oversee public and private bodies’ use of personal data with impartiality and objectivity.
As regards the other amendments in my name, Clause 90 introduces competing and ambivalent objectives that the new information commission would have to pursue, such as
“the desirability of promoting innovation”,
competition,
“public security and national security”,
and preventing crime. Strong, effective and objective data protection enforcement is important to ensure that innovation results in products and services that benefit individuals and society; to ensure that important public programmes retain the public trust they need to operate; and to ensure that companies compete fairly and are required to improve safety standards. However, Clause 90 builds on the false assumption that objectives such as innovation, economic growth and public security are competing interests that need balancing against data protection. By requiring the new information commission to adopt a more permissive and lenient approach to data protection breaches, Clause 90 could undermine the very policies it aims to promote.
The objective of promoting public trust and confidence in the processing of personal data also represents a significant change in emphasis and tone from the UK GDPR, Article 57 of which articulates the ICO’s task to
“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.
Amendment 134 would amend Clause 90 to clarify the role and statutory objective of the Information Commissioner’s Office by removing unnecessary and potentially counterproductive objectives. This would clearly state in legislation that the ICO has a duty to investigate infringement and ensure the diligent application of data protection rules. If so amended, Clause 90 would promote clarity and consistency in the ICO’s regulatory function. As the Institute for Government points out:
“Clarity of roles and responsibilities is the most important factor for effectiveness”
of arm’s-length bodies such as the ICO.
I come to Amendment 144. The Information Commissioner’s Office has a poor track record on enforcement. In 2021-22, it did not serve a single GDPR enforcement notice, secured no criminal convictions and issued only four GDPR fines, totalling just £633,000, despite the fact that it received over 40,000 data subject complaints. Open Rights Group’s recently published ICO Alternative Annual Report shows that the ICO issued just one fine and two enforcement notices against public sector bodies, and that
“Only eight UK GDPR-related enforcement actions were taken against private sector organisations”.
In contrast, the ICO issued 28 reprimands to the public sector over the last financial year. Reprimands are written statements in which the ICO expresses regret over an organisation’s failure to comply with data protection law, but they do not provide any incentive for change. A reprimand lacks legal force, and organisations face no further consequences from one. Despite the fact that reprimands clearly lack deterrence, the ICO relies on them extensively, even for serious violations of data protection laws.
I shall give a few examples. Police, prosecutors or the NHS have exposed personal address details of victims of abuse, or witnesses to crime, to their abusers or those they were accusing, creating immediate personal and physical risks. In one example, the person affected had to move house. In another example, patients of the University Hospitals of Derby and Burton NHS Foundation Trust did not receive medical treatment for up to two years. Two police authorities, West Mercia Police and Warwickshire Police, lost the detailed records of investigations they had made, which could have impacted prosecutions or caused potential miscarriages of justice. Two police authorities, Sussex Police and Surrey Police, recorded the conversations of hundreds of thousands of individuals without their consent. There were also persistent failures by two police authorities and three local authorities to respond to subject access requests in a timely manner over periods of up to five years.
The ICO decided to drop Open Rights Group’s and several members of the public’s complaints against Meta’s reuse of personal data to train AI without carrying out any meaningful probe, despite substantiated evidence that Meta’s practices do not comply with data protection law. This includes the fact that pictures of children on parents’ Facebook profiles could end up in Meta’s AI model as it assumes consent, yet the ICO has not even launched an investigation.
Evidence shows that overreliance on reprimands provides no deterrent to lawbreakers. For instance, the Home Office was issued three consecutive reprimands in 2022 for a number of data protection breaches, recording and publishing conversations with Windrush victims without consent, and a systemic failure to answer subject access requests within statutory limits, with over 22,000 requests handled late. Against this background, the ICO issued yet another reprimand to the Home Office in 2024. The Home Office’s persistence in not complying with data protection law is a good example of how reprimands, if not supported by the threat of substantive enforcement action, fail to provide a deterrent and are thus ignored by the public sector.
The fact is that the ICO has consistently relied on non-binding and highly symbolic enforcement actions to react to serious infringements of the law. Indeed, the Information Commissioner has publicly stated his intention not to rely on ineffective enforcement against big private sector organisations because
“fines against big tech companies are ineffective”.
This opinion has, of course, been widely disputed by data protection experts and practitioners, including the former Information Commissioner, Elizabeth Denham.
Amendment 144 would impose a limit on the number of reprimands that the ICO can give to a given organisation without adopting any substantive regulatory action, such as an enforcement notice and a fine. This would ensure that the ICO could not evade its regulatory responsibilities by adopting enforcement actions that lack deterrence or the force of law.
Amendments 163 to 166 and 168 to 192 to Schedule 14 are designed to replace the involvement of the Secretary of State with the commissioner and transfer the responsibility to appoint the commissioner from the Government to Parliament. They would also modify Schedule 14 to transfer budget responsibility in the appointment process of the non-executive members of the Information Commission to the relevant Select Committee.
The Bill as drafted will provide significant powers for the Secretary of State to interfere with the objective and impartial functioning of the new Information Commissioner, such as by appointing non-executive members of the newly formed Information Commission, or by introducing a requirement for the new Information Commission to consult the Secretary of State before laying a code of practice before Parliament.
The monitoring and enforcement of data protection laws must be carried out objectively and free from partisan or extralegal considerations but there appears to be a lack of criticality—speaking truth to power—in the present ICO. The commissioner expressed views on the DPDI Bill that match those of the Government, despite widespread criticism coming from other arm’s-length bodies such as the National Data Guardian, the Biometrics and Surveillance Camera Commissioner, the Scottish Biometrics Commissioner and the Equality and Human Rights Commission. The Information Commissioner has once again welcomed this Bill, despite the fact that the new Bill dropped several provisions of the old DPDI Bill that the ICO was previously supportive of. Where is the objective and constructive feedback on government policies?
The other amendments in this group are designed to remove the involvement of the Secretary of State and transfer the responsibility to appoint the commissioner from the Government to Parliament. Amendment 167A would ensure that non-executive members of the commission have a sufficient balance of expertise to inform the commission outside purely data protection issues. There is concern that the ICO will simply draw its NEDs from the same narrow profile of data protection lawyers as has previously been the case. We know from the European Union that it is important that regulators understand the broader horizon and appropriately balance GDPR enforcement with other fundamental rights, such as civil liberties and the economic impact that rulings can have. Will the Minister agree that the ICO should be looking for a broad range of expertise that can aid its decision-making in the reformed structure? I beg to move.
Lord Lucas (Con)

I have Amendment 135A in this group. The Bill provides a new set of duties for the Information Commissioner but, unlike the DPDI Bill, no strategic framework. The Information Commissioner is a whole-economy regulator. To my mind, the Government’s strategic priorities should bear on it. This amendment would provide an enabling power, such as that which the Competition and Markets Authority, which is in an equivalent economic position, already has.

Baroness Kidron (CB)

My Lords, I have huge sympathy for, and experience of, many of the issues raised by the noble Lord, Lord Clement-Jones, but, given the hour, I will speak only to Amendment 145 in my name and those of the noble Baroness, Lady Harding, my noble friend Lord Russell and the noble Lord, Lord Stevenson. Given that I am so critical, I want to say how pleased I am to see the ICO reporting requirements included in the Bill.

Amendment 145 is very narrow. It would require the ICO to report specifically and separately on children. It is fair to say that one of the many frustrations for those of us who spend our time advocating for children’s privacy and safety is trying to extrapolate child-specific data from generalised reporting. Often such data is not reported, because generalised reporting usefully hides some of the inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snapchat provides a breakdown of the violation rate by age group, even though that would provide valuable information for academics, Governments, legislators, NGOs and, of course, regulators. It was a point of contention between many civil society organisations and Ofcom that there was no evidence that children of different ages react in different ways, which, for anyone who has had children, is clearly not the case.

Similarly, for many years we struggled to understand Ofcom’s reporting because older children were included in a group that went up to 24, and it took over 10 years for that to change. It seems to me—I hope the Government agree—that since children are entitled to specific data privacy benefits, it follows that the application and enforcement of those benefits should be reported separately. I hope that the Government can give a quick yes on this small but important amendment.

20:00
Viscount Camrose (Con)

My Lords, given the hour, I will try to be as brief as possible. I will start by speaking to the amendments tabled in my name.

Amendment 142 seeks to prevent the Information Commissioner’s Office sending official notices via email. Official notices from the ICO will not be trivial: they relate to serious matters of data protection, such as monetary penalty notices or enforcement notices. My concern is that it is all too easy for an email to be missed. An email may be filtered into a spam folder, where it sits for weeks before being picked up. It is also possible that an email may be sent to a compromised email address, meaning one that the holder has lost control of due to a hacker. These concerns led me also to table Amendment 143, which removes the assumption that a notice sent by email has been received within 48 hours of being sent.

Additionally, I suspect I am right in saying that a great many people expect official correspondence to arrive via the post. I wonder, therefore, whether there might be a risk that people ignore an unexpected email from the ICO, concerned that it might well be a scam or a hack of some description. I, for one, am certainly deeply suspicious of unexpected but official-looking messages that arrive. I believe that official correspondence which may have legal ramifications should really be sent by post.

On some of the other amendments tabled, Amendment 135A, which seeks to introduce a measure from the DPDI Bill, makes provision for the introduction of a statement of strategic priorities by the Secretary of State that sets out the Government’s data protection priorities, to which the commissioner must have regard, and the commissioner’s duties in relation to the statement. Although I absolutely accept that this measure would create more alignment and efficiency in the way that data protection is managed, I understand the concerns that it would undermine the independence of the Information Commissioner’s Office. That in itself, of course, would tend to bear on the adequacy risk.

I do not support the stand part notices on Clauses 91 and 92. Clause 91 requires the Information Commissioner to prepare codes of practice for the processing of data, which seems a positive measure. It provides guidance to controllers, helping them to follow best practice when processing data, and is good for data subjects, as it is more likely that their data will be processed in an appropriate manner. As for Clause 92, which would effectively increase expert oversight of codes of practice, surely that would lead to more effective codes, which will benefit both controllers and data subjects.

I have some concerns about Amendment 144, which limits the Information Commissioner to sending only one reprimand to a given controller during a fixed period. If a controller or processor conducts activities that infringe the provisions of the GDPR and does so repeatedly, why should the commissioner be prevented from issuing reprimands? Indeed, what incentives does that give for people to commit a minor sin and then a major one later?

I welcome Amendment 145, in the name of the noble Baroness, Lady Kidron, which would ensure that the ICO’s annual report records activities and action taken by the ICO in relation to children. This would clearly give the commissioner, parliamentarians and the data and tech industry as a whole a better understanding of how policies are affecting children and what changes may be necessary.

Finally, I turn my attention to many of the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove the involvement of the Secretary of State from the functions of the commissioner and transfer the responsibility from government to Parliament. I absolutely understand the arguments the noble Lord advances, as persuasively as ever, but I am concerned even so that the Secretary of State for the relevant department is the best person to work with the commissioner to ensure both clarity of purpose and rapidity of decision-making.

Lord Stevenson of Balmacara (Lab)

I wanted to rise to my feet in time to stop the noble Viscount leaping forward as he gets more and more excited as we reach—I hope—possibly the last few minutes of this debate. I am freezing to death here.

I wish only to add my support to the points of the noble Baroness, Lady Kidron, on Amendment 145. It is a much-overused saw, but if it is not measured, it will not get reported.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank noble Lords for their consideration of the issues before us in this group. I begin with Amendment 134 from the noble Lord, Lord Clement-Jones. I can confirm that the primary duty of the commissioner will be to uphold the principal objective: securing an appropriate level of data protection, carrying out the crucial balancing test between the interests of data subjects, controllers and wider public interests, and promoting public trust and confidence in the use of personal data.

The other duties sit below this objective and do not compete with it—they do not come at the expense of upholding data protection standards. The commissioner will have to consider these duties in his work but will have discretion as to their application. Moreover, the new objectives inserted by the amendment concerning monitoring, enforcement and complaints are already covered by legislation.

I thank the noble Lord, Lord Lucas, for Amendment 135A. The amendment was a feature of the DPDI Bill, but the Government have decided that a statement of strategic priorities for the ICO is not necessary in this Bill. The Government will of course continue to set out their priorities in relation to data protection and other related areas and discuss them with the Information Commissioner as appropriate.

Amendment 142 from the noble Viscount, Lord Camrose, would remove the ICO’s ability to serve notices by email. We would argue that email is a fast, accessible and inexpensive method for issuing notices. I can reassure noble Lords that the ICO can serve a notice via email only if it is sent to an email address published by the recipient or where the ICO has reasonable grounds to believe that the notice will come to the attention of the person, significantly reducing the risk that emails may be missed or sent to the wrong address.

Regarding the noble Viscount’s Amendment 143, the assumption that an email notice will be received in 48 hours is reasonable and equivalent to the respective legislation of other regulators, such as the CMA and Ofcom.

I thank the noble Lord, Lord Clement-Jones, for Amendment 144 concerning the ICO’s use of reprimands. The regulator does not commonly issue multiple reprimands to the same organisation, but it is important that the ICO, as an independent regulator, retains the discretion and flexibility to issue multiple reprimands within a particular period where there is a legitimate need, without arbitrary limits being placed on that.

Turning to Amendment 144A, the new requirements in Clause 101 will already lead to the publication of an annual report, which will include the regulator’s investigation and enforcement activity. Reporting will be categorised to ensure that where the detail of cases is not public, commercially sensitive investigations are not inadvertently shared. Splitting out reporting by country or locality would make it more difficult to protect sensitive data.

Turning to Amendment 145, with thanks to the noble Baroness, Lady Kidron, I agree on the importance of ensuring that the regulator can be held to account on this issue effectively. The new annual report in Clause 101 will cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. Clause 90 also requires the ICO to publish a strategy and report on how it has complied with its new statutory duties. Both of these will cover the new duty relating to children’s awareness and rights, and this should include the ICO’s activity to support and uphold its important age-appropriate design code.

I thank the noble Lord, Lord Clement-Jones, for Amendments 163 to 192 to Schedule 14, which establishes the governance structure of the information commission. The approach, including the responsibilities conferred on the Secretary of State, at the core of the amendments follows standard corporate governance best practice and reflects the Government’s commitment to safeguarding the independence of the regulator. This includes requiring the Secretary of State to consult the chair of the information commission before making appointments of non-executive members.

Amendments 165 and 167A would require members of the commission to be appointed to oversee specific tasks and to be from prescribed fields of expertise. Due to the commission’s broad regulatory remit, the Government consider that it would not be appropriate or helpful for the legislation to set out specific areas that should receive prominence over others. The Government are confident that the Bill will ensure that the commission has the right expertise on its board. Our approach safeguards the integrity and independence of the regulator, draws clearly on established precedent and provides appropriate oversight of its activities.

Finally, Clauses 91 and 92 were designed to ensure that the ICO’s statutory codes are consistent in their development, informed by relevant expertise and take account of their impact on those likely to be affected by them. They also ensure that codes required by the Secretary of State have the same legal effect as pre-existing codes published under the Data Protection Act.

Considering the explanations I have offered, I hope that the noble Lords, Lord Clement-Jones and Lord Lucas, the noble Viscount, Lord Camrose, and the noble Baroness, Lady Kidron, will agree not to press their amendments.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for that response. If I speak for four minutes, that will just about fill the gap, but I hope to speak for less than that.

The Minister’s response was very helpful, particularly the way in which she put the clarification of objectives. Of course, this is shared with other regulators, where this new growth duty needs to be set in the context of the key priorities of the regulator. My earlier amendment reflected a nervousness about adding innovation and growth duties to a regulator, which may be seen to unbalance the key objectives of the regulator in the first place, but I will read carefully what the Minister said. I welcome the fact that, unlike in the DPDI Bill, there is no requirement for a statement of strategic priorities. That is why I did not support Amendment 135A.

It is somewhat ironic that, in discussing a digital Bill, the noble Viscount, Lord Camrose, decided to go completely analogue, but that is life. Maybe that is what happens to you after four and a half hours of the Committee.

I do not think the Minister covered the ground on the reprimands front. I will read carefully what she said about the annual report and the need for the ICO—or the commission, as it will be—to report on its actions. I hope, just by putting down these kinds of amendments on reprimands, that the ICO will take notice. I have been in correspondence with the ICO myself, as have a number of organisations. There is some dissatisfaction, particularly with companies such as Clearview, where it is felt that the ICO has not taken adequate action on scraping and building databases from the internet. We will see whether the ICO becomes more proactive in that respect. I was reassured, however, by what the Minister said about NED qualifications and the general objective on the independence of the regulator.

There is much to chew on in what the Minister said. In the meantime, I beg leave to withdraw my amendment.

Amendment 134 withdrawn.
Committee adjourned at 8.14 pm.

Data (Use and Access) Bill [HL]

Committee (4th Day)
Relevant documents: 3rd Report from the Constitution Committee, 9th Report from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
15:45
The Deputy Chairman of Committees (Baroness Pitkeathley) (Lab)

My Lords, as usual, if there is a Division in the Chamber while we are sitting, the Committee will adjourn as soon as the Division Bells are rung and resume after 10 minutes.

Clause 90: Duties of the Commissioner in carrying out functions

Amendment 135 not moved.
Clause 90 agreed.
Amendment 135A not moved.
Clause 91: Codes of practice for the processing of personal data
Clause 91 agreed.
Clause 92: Codes of practice: panels and impact assessments
Amendment 136
Moved by
136: Clause 92, page 117, line 24, leave out from “of” to the end of line 27 and insert “—
(a) a code prepared under section 124A, or
(b) an amendment of such a code,
that is specified or described in the regulations.”
Member’s explanatory statement
New section 124B(11) of the Data Protection Act 2018 provides that the Information Commissioner’s duty to establish a panel to consider draft codes of practice may be disapplied or modified by regulations. This amendment ensures that regulations can make provision in relation to a particular code or amendment or a type of code or amendment.
Amendment 136 agreed.
Clause 92, as amended, agreed.
Amendment 137 not moved.
Amendment 138
Moved by
138: After Clause 92, insert the following new Clause—
“Code on processing personal data in education where it concerns a child or pupil
(1) The Information Commissioner must consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the United Kingdom, within the meaning of the Education Act 1996, the Education (Scotland) Act 1996, and the Education and Libraries (Northern Ireland) Order 1986; and on standards on the rights of those children as data subjects which are appropriate to children’s capacity and stage of education.
(2) For the purposes of subsection (1), the rights of data subjects must include—
(a) measures related to responsibilities of the controller, data protection by design and by default, and security of processing,
(b) safeguards and suitable measures with regard to automated decision-making, including profiling and restrictions,
(c) the rights of data subjects including to object to or restrict the processing of their personal data collected during their education, including any exemptions for research purposes, and
(d) matters related to the understanding and exercising of rights relating to personal data and the provision of education services.”
Member’s explanatory statement
This amendment requires the Commission to consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the UK.
Lord Clement-Jones (LD)

My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is far preferable in many ways to mine.

Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children up to the age of 18 for the purposes of UK law and Convention 108, and pupils as defined by the Education Act 1996, who may be up to the age of 19 or, with special educational needs, up to 25 in the education sector. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:

“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.


The duties of settings and data processors, and the rights appropriate to the stage of education and children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,

“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.

The educational setting is different from purely commercial interactions, and not only because the data subjects are children. It is more complex because of the disempowering environment and its imbalance of power between the authority, the parents and the child. A further complication is that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:

“Parents have a prior right to choose the kind of education that shall be given to their children.”


A code is needed because explicit safeguards that the GDPR requires in several places were left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the necessary safeguards of GDPR Article 23(1) for children. Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. Clauses on further processing or changes to purpose limitation are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of identifying personal data collected in educational settings only increase, while the protections are only ever reduced.

Obligations specific to children’s data, especially

“solely automated decision-making and profiling”

and exceptions, need to be consistent with clear safeguards by design where they restrict fundamental freedoms. What does that mean for children in practice, where teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, and health and safety, among other standards proportionate to the risks of data processing, and respect for the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended in the 5Rights Foundation’s Digital Futures Commission blueprint for educational data governance.

The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that

“children have the right to be heard and participate in decisions affecting them”.

They recognise that

“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”

Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.

Young people themselves were not invited to participate in the development of this Bill, and their views have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work together with the DfE in 2024. The findings back those of Defend Digital Me’s Survation poll in 2018 and show that parents are unaware that the DfE already holds named pupil records, collected without their knowledge or permission, and that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. It stated:

“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”


A code of practice is needed to explain the law and make it work as intended for everyone. The aim of a code of practice for educational settings would be that adherence creates a mechanism for controllers and processors to demonstrate compliance with the legislation or with approved certification methods. It would give providers confidence in consistent and clear standards and would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice so that their rights are freely met by design and default.

Further, schools give children’s personal data to many commercial companies during a child’s education—not on the basis of consent but on the assumed basis of the performance of a task carried out in the public interest. A code should clarify the boundaries of this lawful basis for commercial purposes, where it is an obligation on parents to provide the data, and what this means for the child on reaching maturity or after leaving the educational setting.

Again, a code should help companies understand “data protection by design and default” in practice, what amounts to a “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding the effect of the responsibilities of controllers and processors on the execution or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that offer adverts in use.

I hope that I have explained exactly why we believe that a code of practice is required in educational settings. I beg to move.

Baroness Kidron (CB)

My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.

Both these amendments propose a code of practice to address the use of children’s data in the context of education, and they have much in common. Having heard the noble Lord, Lord Clement-Jones, I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools, and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.

Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.

Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.

Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to preserve—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.

Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.

A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.

16:00
I have raised the issues of edtech—the lack of privacy, the lack of evidence for learning outcomes and, in particular, some very serious known problems of safeguarding tech—with the Department for Education several times and with several Ministers. Each meeting brings a level of shock at the evidence I produce and a determination to act, but then the department decides that providing schools with more guidance is the answer: guidance on data protection, guidance on AI, guidance on safeguarding for teachers and schools to understand and implement. There is nothing for the regulator, nothing for the companies and nothing that responds to the well-established fact that products need to be designed for privacy and safety by default. Given the known power imbalance between a company such as Microsoft or Google and a school DPO, or the skills and transparency gap between a product developer and a school safeguarding lead, heaping more burden and responsibility on teachers rather than using the tools of good government, law, regulation, certification and procurement power to foster ethical innovation is, I think, a failure of common sense if not leadership.
For example, many schools in East Anglia were recently persuaded to purchase costly visitor management software with high recurring annual subscription fees as a substitute for visitor registration books, which, the company suggested, did not comply with the GDPR. This expensive and unnecessary system includes biometric storage of visitors’ facial images, which raises questions of consent. I have described to the House before seeing a similar system, trained on white faces, that was unable to take a photograph because it did not recognise a black visitor as human. The waterfall of implications is extensive: to privacy, to fairness and to school budgets. A code of minimum standards for management products would avoid that.
Similarly, a code would bring clarity about how to handle and share student data. Between 2018 and 2020, the Education and Skills Funding Agency permitted access to the Learning Records Service database of some 28 million students. The data was used to build an age-verification system that was offered to online gambling companies. Research by UCL suggests it resulted in targeted gambling adverts:
“Early evidence from our study indicates evidence of participants creating gambling accounts while underage, with some spending up to £400 on these platforms”.
This is among the more egregious examples, but it is by no means the only one. A code could help us deal with that.
Similarly, a code could bring clarity to research. The Minister suggested last week that those objecting to the Government’s broadening of “scientific research” did not understand the role of research. I dispute that, and I look forward to her letter saying whether making a product more addictive to children could reasonably be said to be scientific, given that it involves A/B testing of children at scale. The code suggested by this amendment would clarify the distinction between research and product development in edtech by outlining when research ethics should apply and by delineating institutional responsibilities when engaging in collaborative projects.
Those of us who support the introduction of technology but want it to be mindful of rights holders are often cast in the role of tech detractors, but that is a mischaracterisation. We simply want to create a fairer and more equitable set of arrangements: to protect human vulnerability, in this case of children; to respond to institutional struggle, in this case of schools; and to protect against commercially predatory behaviour, in this case of more than a few edtech companies. An edtech code would stop the muddle of research, social provision and commercial exploitation that happens now in our schools with no rules attached.
I also believe that an edtech code would likely give birth to an industry of standards and certification schemes, as in so many areas from wifi protocols to the strength of our car windscreens. Perhaps most important of all, it would give a basis for DfE procurement. School communities are both furious about and overwhelmed by the number of duties foisted on them. Procurement standards would free their time and ease their anxiety. The misery of realising that the state-of-the-art filtering and monitoring system that a school bought at great expense last year is no longer fit for purpose is quite devastating, and I have seen it repeatedly.
In fact, almost 50% of school monitoring and filtering services now cannot recognise harmful content from gen AI, and some services make it possible to turn off the filters for illegal content when that should be prohibited, not a question of choice. I have raised this with Ministers, but in spite of my entreaties—and those of Judy and Andy Thomas, whose daughter Frankie took her own life after accessing pro-suicide content on a school iPad because the filter was not on—the department continues to stonewall. A code that covered all edtech would give safeguarding teams confidence in the products they buy and the protocols to use them.
I have run out of time, but I say finally that the edtech code must cover early learning. The early learning communities are
“dismayed that nobody is advocating for the needs of the youngest and most vulnerable children”.
A group of 55 early years professionals wrote to me to say that
“it is alarming to many early childhood development experts, who are left confused and frustrated that the DFE have opted not to include Online safety as a statutory reference [in the] Early Years Statutory Framework despite our repeated representations”.
A code would help everybody.
Lord Knight of Weymouth (Lab)

My Lords, I was unsure whether to support Amendment 141, let alone speak to it, simply because I have a number of interests in this area and I should be clear about those. I chair Century-Tech Ltd, which is an AI edtech company; I am on the board of Educate Ventures Research Ltd, which offers advice to educators and schools on the use of AI in education; and I am a trustee of the Good Future Foundation, which does something similar.

I start by reminding the Committee of some of the benefits of technology and AI for education, so that there is balance both in my speech and in the debate. Exciting practice is already taking place in flipped learning, for example, where—putting issues of the digital divide to one side—in classes and communities with good access to technology at home, the instructional element of learning can take place at home, and school becomes a much more profoundly human endeavour, with teachers able to use the time saved on instruction to bring that learning to life. I have some issues with AI in the world of tutoring in certain circumstances, but some of it can be very helpful in respect of flipped learning.

Project-based learning also becomes much more feasible. It is very hard to teach, but AI tools can help link what is being learned in projects through to the curriculum. Teacher time can be saved and, by taking care of a lot of administrative tasks through AI, we can in turn make a significant contribution to the teacher retention crisis that is currently bedevilling our schools. There are also novel assessment methods that can now be developed using AI, in particular making the traditional viva much more affordable and reliable. It is hard to use AI to cheat if you are being assessed orally.

Finally, an important element is preparation for work: if we want these young people to be able to leave school and thrive in a labour market where they must be able to collaborate effectively with machines, we need them to be able to experience that in a responsible and taught fashion in school.

However, dystopian issues can arise from an over-dependence on technology and from some of the potential impacts of using AI in education, too. I mentioned the digital divide—the 7.5 million families in this country who are not connected to, or confident using, the internet—and we discovered during Covid the device and data poverty that exists in this country. There is a possibility that poorer kids end up being taught by machines and not by human teachers at all. There is a danger that we do not shift our schools away from the slightly Victorian system that we have at the moment, which the noble Baroness, Lady Kidron, referenced at Second Reading. If we do not, we will end up with our children being outcompeted by machines. That overreliance on AI could also end up as privatisation by stealth because, if all the AI, technology and data are held by the private sector, and we are dependent on it, we will be beholden to the private sector however much we believe in the importance of the public good in our schools.

There are also problems of system design; I mentioned the Victorian system. I am hopeful that the curriculum and assessment review and the Children’s Wellbeing and Schools Bill that was published this week will help us. Whichever direction that review and those reforms take, we can be confident that edtech will respond. That is what it does; it responds to whatever regulation we pass, including in this Bill, over time and to whatever changes take place in the education system.

But tech needs data, and it needs diversity of data. There is a danger that, if we close off access to data in this country, we will all end up using AI that has been developed on Chinese data, where they do not have the same misgivings about privacy, about sharing each other’s data and about acquiring data. We have to find a regime that works.

I do a bunch of work in international schooling as chair of COBIS—the Council of British International Schools—and I know of one large international school group, which I do not advise, that has done a deal with Microsoft around sharing all its pupil data, so that it can be used for Copilot. Obviously, Microsoft has a considerable interest in OpenAI, and we do not know exactly where that data is going. That points to some of the concerns that the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, have talked about.

During Covid, schools were strongly encouraged by the then Government to use either Google Classroom or Microsoft 365. Essentially, everyone was given a binary choice, and lots of data was therefore captured by those two large American corporations, which assisted them to develop further products. Any British alternative was, in essence, cut out, so we have good reason to be concerned in this area. That is why in the end I added my name and support to Amendment 141 in the name of the noble Baroness, Lady Kidron.

Children need privacy and they need digital rights. At the moment, those are exercised through parental consent for the use of these platforms and the capture of data, but I think it would be helpful to put that in a codified form, so that all those concerns are met with some sense of security about the regime under which this works.

Ever since the abolition of Becta back in 2010, school leaders have been missing advice. Becta’s advice was used around the globe, as it was the authority on what works in technology and education. Sadly, the coalition got rid of it, and school leaders are now operating somewhat blindfolded. We have 25,000 different school leaders buying technology, and very few of them really know what they are doing when faced with slick salespeople. Giving them some protection with a code would help their procurement.

The proof of the pudding will of course be in the eating—in the detail of the code—but I urge my noble friend the Minister to reflect carefully on the need for this, to talk to the DfE about it and to try to get some agreement. The DfE itself does not have the greatest track record on data and data protection. It has got into trouble with the ICO on more than one occasion.

My final cautionary tale, thanks to Defend Digital Me, is on the national pupil database, which was agreed in 2002 on the basis that children’s data would be kept private, protected and used only for research purposes—all the things that we are hearing in the debates on this Bill. Ten years later, that was all changed, and 2,500 data-sharing arrangements that use that data followed, including for universal credit fraud detection. When parents allow their children’s data to be shared, they do not expect it to be used, down the line, to check universal credit entitlement. I do not think that was in the terms and conditions. There is an important issue here, and I hope that the Government are listening so that we make some progress.

16:15
Lord Russell of Liverpool (CB)

I shall speak very briefly, because the previous three speakers have covered the ground extremely well and made some extremely powerful arguments.

The noble Baroness, Lady Kidron, put her finger on it. The default position of departments such as the DfE, if they recognise there is a problem, is to issue guidance. Schools are drowning in guidance. If you talk to any headmaster or headmistress, or the staff in charge of technology who are trying to keep on top of it, they are drowning in guidance. They are basically flying blind when asked to take some quite major decisions, whether about purchasing, about the safeguards around usage or about measuring the effectiveness of some of the educational technology skills being acquired.

There is a significant difference between guidance and a clear and concrete code. We were talking the other day, on another group, about the need to have guardrails, boundaries and clarity. We need clarity for schools and for the educational technology companies themselves to know precisely what they can and cannot do. We come back again to the issue of the necessity of measuring outcomes, not just processes and inputs, because they are constantly changing. It is very important for the companies themselves to have clear guardrails.

The research to which the noble Baroness, Lady Kidron, referred, which is being done by a variety of organisations, found problems in the areas that we are talking about in this country, the United States, Iceland, Denmark, Sweden, the Netherlands, Germany and France—and that is just scratching the surface. Things are moving very quickly and AI is accelerating that even more. With a code you are drawing a line in the sand and declaring very clearly what you expect and do not expect, what is permissible and not permissible. Guidance is simply not sufficient.

Lord Kirkhope of Harrogate (Con)

My Lords, I make a brief intervention. I am not against these amendments—they are very useful in the context of the Bill. However, I am reflecting on the fact that, when we drafted GDPR, it took a six-year process and we failed in the course of it to really accommodate AI, which keeps popping up every so often in this Bill. Every part of every amendment seems to have a new subsection referring to automated decisions or to AI generally.

Obviously, we are moving on to legislate on AI in due course, and I am sure that a number of pieces of legislation, no doubt including this one, will be used as part of our overall package when we deal with the regulation of AI. However, although it is true that the UK GDPR gives, in theory, a higher standard of protection for children, it is important to consider that, in the context of AI, the protections we need are going to have to be much greater—we know that. But if there is going to be a code of practice for children and educational settings, we need also to consider vulnerable and disabled people and other categories of people who are equally entitled to help, particularly with regard to the AI elements. That is going to be very difficult. Most adults whom I know understand less about AI than children approaching the age of 18, who are much more knowledgeable. Those children are also more aware than adults of the restrictions that will have to be put in place; adults appear to be completely at sea, not even understanding what AI is about.

I make a precautionary point. While AI is dotted all the way through this Bill, we should be very careful that, when we specify protection for a particular group—in this case, children—we remain aware of the need to have protection in place for other groups, particularly in the context of this Bill and, indeed, future legislation.

Lord Lucas (Con)

My Lords, I very much support the thrust of these amendments and what the noble Lord, Lord Knight, said in support of and in addition to them. I declare an interest as a current user of the national pupil database.

The proper codification of safeguards would be a huge help. As the noble Baroness, Lady Kidron, said, it would give us a foundation on which to build. I hope that, if they are going to go in this direction, the Government will take an immediate opportunity to do so because what we have here, albeit much more disorganised, is a data resource equivalent to what we have for the National Health Service. If we used all the data on children that these systems generate, we would find it much easier to know what works and in what circumstances, as well as how to keep improving our education system.

The fact that this data is tucked away in little silos—it is not shared and is not something that can be used on a national basis—is a great pity. If we have a national code as to how this data is handled, we enable something like the use of educational data in the way that the NHS proposes to use health data. Safeguards are needed on that level but the Government have a huge opportunity; I very much hope that it is one they will take.

Viscount Camrose (Con)

I start by thanking all noble Lords who spoke; I enjoyed the vivid examples that were shared by so many of them. I particularly enjoyed the comment from the noble Lord, Lord Russell, about the huge gulf between guidance, of which there is far too much, and a code that actually drives matters forward.

I will speak much more briefly because this ground has been well covered already. Both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, seek to introduce codes of practice to protect the data of children in education services. Amendment 138 in the name of the noble Lord seeks to introduce a code on processing personal data in education. This includes consultation for the creation of such a code—a highly important element because the safety of this data, as well as its eventual usage, is of course paramount. Amendment 141 in the name of the noble Baroness, Lady Kidron, also seeks to set out a code of practice to provide heightened protections for children in education.

Those amendments are absolutely right to include consultation. This is a particularly important area of legislation, and it is important that it does not restrict what schools can do with their data in order to improve the quality and productivity of their work. I appreciated the words of the noble Lord, Lord Knight, when he sketched out some of what becomes educationally possible when these technologies are wisely and safely used. With individual schools often responsible for the selection of technologies and their procurement, the landscape is—at the risk of understatement—often more complex than we would wish.

Alongside that, the importance of the AI Safety Institute’s role in consultation cannot be overstated. The way in which tech and AI have developed in recent years means that its expertise on how safely to provide AI to this particularly vulnerable group is invaluable.

I very much welcome the emphasis that these amendments place on protecting children’s data, particularly in the realm of education services. Schools should be safe places. For that safety to be jeopardised by the rapid evolution of technology that the law cannot keep pace with would, I think we can all agree, be unthinkable. As such, I hope that the Government will give careful consideration to the points raised as we move on to Report.

The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, Amendment 138 tabled by the noble Lord, Lord Clement-Jones, and Amendment 141, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, would both require the ICO to publish a code of practice for controllers and processors on the processing of personal data by educational technologies in schools.

I say at the outset that I welcome this debate and the contributions of noble Lords on this important issue. As various noble Lords have indicated, civil society organisations have also been contacting the Department for Science, Innovation and Technology and the Department for Education directly to highlight their concerns about this issue. It is a live issue.

I am grateful to my noble friend Lord Knight, who talked about some of the important and valuable roles that technology can play in supporting children’s development and guiding teaching interventions. We have to get the balance right, but we understand and appreciate that schoolchildren, parents and schoolteachers must have the confidence to trust the way that services use children’s personal data. That is at the heart of this debate.

There is a lot of work going on on this issue, some of which noble Lords have referred to. The Department for Education is already exploring ways to engage with the edtech market to reinforce the importance of evidence-based, quality products and services in education. On my noble friend Lord Knight’s comments on AI, the Department for Education is developing a framework outlining safety expectations for AI products in education and creating resources for teachers and leaders on safe AI use.

I recognise why noble Lords consider that a dedicated ICO code of practice could help ensure that schools and edtech services are complying with data protection legislation. The Government are open-minded about exploring the merits of this further with the ICO, but it would be premature to include these requirements in the Bill. As I said, there is a great deal of work going on and the findings of the recent ICO audits of edtech service providers will help to inform whether a code of practice is necessary and what services should be in scope.

I hope that we will bear that in mind and engage on it. I would be happy to continue discussions with noble Lords, the ICO and colleagues at the Department for Education, outside of the Bill’s processes, about the possibility of future work on this, particularly as the Secretary of State has powers under the Data Protection Act 2018 to require the ICO to produce new statutory codes, as noble Lords know. Considering the explanation that I have given, I hope that the noble Lord, Lord Clement-Jones, will consider withdrawing his amendment at this stage.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for her response and all speakers in this debate. On the speech from the noble Lord, Lord Knight, I entirely agree with the Minister and the noble Viscount, Lord Camrose, that it is important to remind ourselves about the benefits that can be achieved by AI in schools. The noble Lord set out a number of those. The noble Lord, Lord Russell, also reminded us that this is not a purely domestic issue; it is international across the board.

However, all noble Lords reminded us of the disbenefits and risks. In fact, the noble Lord, Lord Knight, used the word “dystopian”, which was quite interesting, although he gets very close to science fiction sometimes. He said that

“we have good reason to be concerned”,

particularly because of issues such as the national pupil database, where the original purpose may not have been fulfilled and was, in many ways, changed. He gave an example of procurement during Covid, where the choice was either Google or Microsoft—Coke or Pepsi. That is an issue across the board in competition law, as well.

There are real issues here. The noble Lord, Lord Russell, put it very well when he said that there is any number of pieces of guidance for schools but it is important to have a code of conduct. We are all, I think, on the same page in trying to find—in the words of the noble Baroness, Lady Kidron—a fairer and more equitable set of arrangements for children in schools. We need to navigate our way through this issue; of course, organisations such as Defend Digital Me and 5rights are seriously working on it.

16:30
I welcome what the Minister had to say. She said that this is a welcome debate on a live issue and that there is a great deal of work happening in the DfE. She said that the department is working on a framework outlining expectations. Are we a gnat’s whisker away from a code of conduct? That was not entirely clear. She also said—this is always a bit of a red flag—that it is premature to start thinking about that in terms of this Bill, and that there is an ICO audit of edtech service providers.
I was a member of Sir Anthony Seldon’s Institute for Ethical AI in Education, whose advisory board I chaired. The noble Lord, Lord Knight, was an extremely valuable member of that advisory board but that was some years ago—back in 2019 or 2020, I think. We have not moved much further on the kinds of guidance that are needed in the world of AI and data in schools. The Minister may say that thinking about this is premature, but we need to ratchet up the speed if we are really going to grapple with this issue. Schools are already grappling with it: AI tools are now commonplace. We must seize this and we must make sure that there is a code on which schools can rely.
I turn to the words of the noble Baroness, Lady Kidron: products should be designed for privacy and safety by default, so here we are addressing not only schools but those who supply these products. We must get the procurement right in all of this. There is, to some degree, a sense of acceptance that work is going on, but I very much hope that, as we go forward, the Minister can persuade us that we are going to press our foot on the accelerator in this respect. In the meantime, I beg leave to withdraw my amendment.
Amendment 138 withdrawn.
Amendments 139 to 141 not moved.
Clauses 93 and 94 agreed.
Clause 95: Notices from the Commissioner
Amendments 142 and 143 not moved.
Clause 95 agreed.
Amendments 144 and 144A not moved.
Clauses 96 to 100 agreed.
Clause 101: Annual report on regulatory action
Amendment 145 not moved.
Clause 101 agreed.
Clause 102 agreed.
Schedule 10 agreed.
Clause 103: Court procedure in connection with subject access requests
Amendments 146 to 150 not moved.
Clause 103 agreed.
Amendments 151 and 152 not moved.
Clause 104 agreed.
Amendment 153 not moved.
Clauses 105 to 107 agreed.
Amendments 154 to 156 not moved.
Amendment 156A
Moved by
156A: After Clause 107, insert the following new Clause—
“Data use: definition of unauthorised access to computer programs or data
In section 17 of the Computer Misuse Act 1990, at the end of subsection (5) insert—
“(c) they do not reasonably believe that the person entitled to control access of the kind in question to the program or data would have consented to that access if they had known about the access and the circumstances of it, including the reasons for seeking it, and
(d) they are not empowered by an enactment, by a rule of law, or by order of a court or tribunal to access of the kind in question to the program or data.””
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.

We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.

The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the statute when it was passed, that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in the intervening years; it needs, desperately and urgently, to be updated because, for want of amendment, it is currently putting every citizen in this nation at risk. This is the purpose of Amendments 156A and 156B.

The Computer Misuse Act 1990 is not only out of date but is inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They often work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.

Let us take just two examples: vulnerability research, and threat intelligence assessment and analysis. Both could see a cybersecurity professional falling foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and the ability of the centre, and of the community of cybersecurity professionals, to meet those threats.

These amendments, in essence, perform one simple but critical task: they afford a legal defence for legitimate cybersecurity activities. That is all, but it would have a profound impact on those whom we have asked to keep us safe and on the safety they can thus deliver to every citizen in our society.

Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?

It is not merely time; it is well past time for these amendments to become part of our law. If not now, when? If not these amendments, which amendments? If they do not accept these amendments, what will the Government say to all those people who will continue to be put in harm’s way for want of these protective provisions being passed? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.

Lord Clement-Jones (LD)

My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.

As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.

I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,

“a catalytic effect on innovation”

within the UK’s cybersecurity sector, which possesses “considerable growth potential”.

16:45
The current situation puts the UK at a disadvantage compared to countries such as France, Israel and the United States, which have already updated their legislation to include similar defences, allowing their cybersecurity industries to thrive. The absence of such a defence in the UK creates an uneven playing field and hinders the growth of the domestic cybersecurity sector. The noble Lord, Lord Holmes, has rightly mentioned the CyberUp campaign, which advocates for reforming the Act and emphasises the need to update the definitions of key provisions in the legislation. This would provide much greater clarity for researchers and ensure that legitimate cybersecurity activities are not unduly hampered by the fear of legal repercussions.
Despite ongoing discussions and consultations, progress towards amending the Act has been slow. The long-awaited review of the Act—which started in 2021—reported last year, and we have had a consultation which concluded this April. When will we see the Act amended? This is glacial progress on an issue important for innovation and growth. What is the hold-up? This inaction inhibits innovation in a sector crucial to national security and economic growth.
The call for reform is not limited to industry groups; many others, including legal experts, academics and Members of both Houses, have expressed support for updating the Act. This consensus underscores the widespread recognition of the Act’s inadequacy in addressing the current cyber threat landscape. As the noble Lord, Lord Holmes, mentioned, the need for these amendments, and the support for them, was highlighted by the National Cyber Security Centre in its recent annual review.
I believe the noble Lord, Lord Holmes, and the CyberUp campaign have made an overwhelming case for amending the Computer Misuse Act 1990. By agreeing to these, the Government could provide much-needed clarity and legal protection for cybersecurity professionals, enabling them to contribute effectively to the UK’s security and economic prosperity.
Lord Kirkhope of Harrogate (Con)

My Lords, following on from what I said on earlier amendments, this situation is worse than the one the noble Lord, Lord Clement-Jones, has just described. Indeed, I fully support the amendments of my noble friend Lord Holmes. However, this just demonstrates, yet again, that unless we pull ourselves together, with better, smarter legislation that moves faster, we will never catch up with developments in technology and AI. That has been demonstrated dramatically by these amendments. My concern is that the Government move at the pace that government always moves at, but in this particular field that is not going to work. We are going to be disadvantaged and in serious trouble unless we can move a bit faster.

Lord Arbuthnot of Edrom (Con)

My Lords, I rise briefly but strongly to support my noble friend Lord Holmes. The CyberUp campaign has been banging this drum for a long time now. I remember taking part in the debates in another place on the Computer Misuse Act 34 years ago. It was the time of dial-up modems, fax machines and bulletin boards. This is the time to act, and it is the opportunity to do so.

Lord Clement-Jones (LD)

My Lords, we ought to be mindful of this and congratulate the noble Lord on having been named parliamentarian of the year as a result of his campaigning activities.

Lord Arbuthnot of Edrom (Con)

My Lords, it has taken 34 years.

Lord Bethell (Con)

My Lords, I rise to make a brief but emphatic comment from the health constituency. We in the NHS have been victims of appalling cyber-hacking. The pathology labs in south London were hacked, and that cost many lives. It is an example of where the world is going in the future unless we act promptly. The emphatic call for quick action so that government keeps up with world changes is really well made. I ask the Minister to reflect on that.

Viscount Camrose (Con)

My Lords, I, too, shall speak very briefly, which will save valuable minutes in which I can order my CyberUp Christmas mug.

Amendments 156A and 156B add to the definition of unauthorised access so that it covers instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for it, and where that person is not empowered to access it by an enactment. Amendment 156B introduces defences to this new charge. Given the amount of valuable personal data held by controllers as our lives have moved increasingly online—as many speakers in this debate have vividly brought out—there is absolutely clear merit not just in this idea but in the pace implied, which many noble Lords have called for. There is a need for real urgency here, and I look forward to hearing more detail from the Minister.

Baroness Jones of Whitchurch (Lab)

My Lords, I turn to Amendments 156A and 156B, tabled by the noble Lord, Lord Holmes. I understand the strength of feeling and the need to provide legal protections for legitimate cybersecurity activities. I agree with the noble Lord that the UK should have the right legislative framework to allow us to tackle the harms posed by cybercriminals. We have heard examples of some of those threats this afternoon.

I reassure the noble Lord that this Government are committed to ensuring that the Computer Misuse Act remains up to date and effective in tackling criminality. We will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies to consider whether there are workable proposals on this. The noble Lord will know that this is a complex and ongoing issue being considered as part of the review of the Computer Misuse Act being carried out by the Home Office. We are considering improved defences by engaging extensively with the cybersecurity industry, law enforcement agencies, prosecutors and system owners. However, engagement to date has not produced a consensus on the issue, even within the industry, and that is holding us back at this moment—but we are absolutely determined to move forward with this and to reach a consensus on the way forward.

I think the noble Lord, Lord Clement-Jones, said in the previous debate that the amendments were premature, and here that is certainly the case. The specific amendments that the noble Lord has tabled are premature, because we need a stronger consensus on the way forward, notwithstanding all the good reasons that noble Lords have given for why it is important that we have updated legislation. With these concerns and reasons in mind, I hope that the noble Lord will feel able to withdraw his amendment.

Lord Holmes of Richmond (Con)

Could the Minister say a few words on some of those points of disagreement and non-consensus, to give the Committee some flavour of the type of issues on which there is no consensus, as well as the extent of the gap between some of those perspectives?

Lord Clement-Jones (LD)

Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful as well, in unpacking what were clearly extremely well-informed recommendations. It should, no doubt, be taken extremely seriously.

Baroness Jones of Whitchurch (Lab)

I can tell the noble Lord, Lord Holmes, that we published our analysis of the consultation responses to the previous Home Office investigation in November 2023, so all those mixed responses are on the record. It was therefore concluded by the Government that further work needed to be done on this. On my noble friend’s report, was there a government response?

Lord Vallance of Balham (Lab)

Yes, the Government accepted the recommendations in full.

Lord Clement-Jones (LD)

Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.

Lord Holmes of Richmond (Con)

My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as Chief Scientific Adviser. It is not often in life that, in a short space of time, one is afforded the opportunity in government of bringing much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.

There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were to do this only for their much-spoken-of growth agenda, it would have an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see if anything further can be done but, from my perspective and others’, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.

Amendment 156A withdrawn.
Amendments 156B and 157 not moved.
Schedule 11 agreed.
Clause 108 agreed.
Clause 109: Interpretation of the PEC Regulations
Amendment 158 not moved.
Clause 109 agreed.
Clauses 110 and 111 agreed.
Schedule 12: Storing information in the terminal equipment of a subscriber or user
Amendment 159 had been withdrawn from the Marshalled List.
Amendments 159A to 160 not moved.
Schedule 12 agreed.
Clauses 112 and 113 agreed.
Schedule 13 agreed.
Clause 114 agreed.
Amendments 161 and 162 not moved.
Clause 115 agreed.
Schedule 14: The Information Commission
Amendments 163 to 192 not moved.
Schedule 14 agreed.
Clauses 116 to 119 agreed.
Schedule 15: Information standards for health and adult social care in England
Amendments 193 to 195 not moved.
Schedule 15 agreed.
Clause 120 agreed.
Schedule 16 agreed.
17:00
Clauses 121 and 122 agreed.
Amendment 196 not moved.
Clause 123: Information for research about online safety matters
Amendment 197
Moved by
197: Clause 123, page 153, line 6, leave out “may by regulations” and insert “must, as soon as reasonably practicable and no later than 12 months after the day on which this Act is passed, make and lay regulations to”
Member’s explanatory statement
This amendment removes the Secretary of State’s discretion on whether to lay regulations under Clause 123 and sets a time limit for laying them before Parliament.
Baroness Kidron (CB)

My Lords, I shall also speak to Amendment 198 in my name and register my support for the amendments in the name of the noble Lord, Lord Bethell, to which I have added my name. Independent research access is a very welcome addition to the Bill by the Government. It was a key recommendation of the pre-legislative scrutiny committee on the Online Safety Bill in 2021 and I know that I speak for many colleagues in the academic field, as well as many civil society organisations, who are delighted by its swift and definitive inclusion in the Bill.

The objective of these amendments is not to derail the Government’s plans, but rather to ensure that they happen and to make the regime work for children and the UK’s world-class academic institutions and stellar civil society organisations, ensuring that we can all do high-quality research about emergent threats to children and society more broadly.

Amendment 197 would ensure that the provisions in Clause 123 are acted on by removing the Government’s discretion as to whether or not they introduce regulations. It would also impose a deadline of 12 months for the Government to do so. I have said this before, but I have learnt the hard way that good intentions and warm words from the Dispatch Box are a poor substitute for clear provisions in law. A quick search of the Bill reveals that there are 119 uses of the word “must” and 262 uses of the word “may”. Clearly, they are being used to create different obligations or expectations. The Minister may say that this amendment is not needed and that, for all intents and purposes, we can take the word “may” as a “must” or a “will”, but I would prefer to see it in black and white. In fact, if the Government have reserved discretion on this point, I would like to understand exactly what that means for research.

Amendment 198 seeks to ensure that the regulations will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users including children. We have already discussed the fact that online harms are not experienced equally by users: those who are most vulnerable offline are often the most vulnerable online. In an earlier debate, I talked about the frustrations experienced when tech companies do not report data according to age groups. By failing to do so, they can hide the reality that children are disproportionately impacted by certain risks and harms. This amendment would ensure that children and other vulnerable groups can be studied in isolation, rather than leaving independent researchers to pick through generalised datasets to uncover where harm is amplified and for whom.

I will leave the noble Lord, Lord Bethell, to explain his amendments, but I will just say why it is so important that we have a clear path to researcher access. It is fundamental to the success of the online safety regime.

Many will remember Frances Haugen, the Facebook whistleblower, who revealed the extent to which Meta knew, through its own detailed internal research, how harmful its platforms actually are to young people. Meta’s own research showed that

“We make body image issues worse for one in three girls”.


Some 32% of teen girls said that, when they have felt bad about their bodies, Instagram has made them feel worse. Were it not for a whistleblower, this research would never have been made public.

After a series of evidence disclosures to US courts as a result of the legal action by attorneys-general at state level, we have heard whistleblowers suggest, in evidence given to the EU, that there will be a new culture in some Silicon Valley firms—no research and no emails. If you have something to say, you will have to say it in person so that it cannot be used against the company in court. The irony of that is palpable given the struggle that we are having over user privacy, but it points to the need for our research regime to be watertight. If the companies are not looking at the impact of their own services, we must. I hope that the Government continue their leadership on this issue and accept the amendments in the spirit in which they are being put forward.

I have another point that I want the Minister to clarify. I apologise, because I raised this in a private meeting but I have forgotten the answer. Given the number of regulatory investigations, proceedings and civil litigations in which tech companies are engaged, I would like some comfort about the legal exemption in these clauses. I want to understand whether it applies only to advice from and between lawyers or exempts data that may negatively impact companies’ defence or surface evidence of safety failures or deficiencies. The best way that I have of explaining my concern is: if it is habitual for tech companies to cc a lawyer in all their communications on product safety, trust and safety, and so on, would that give them legal privilege?

Finally, I support the noble Lord, Lord Clement-Jones, in his desire for a definition of independent researchers. I would be interested to hear what the Minister has to say on that. I beg to move.

Lord Bethell (Con)

My Lords, I will speak to my Amendments 198A and 198C to 198F. I also support Amendments 197, 198 and 198B, to which I have added my name, all of which address the issue of data for researchers.

As the noble Baroness, Lady Kidron, put very thoughtfully, platforms are not making decisions about their services with due regard to product safety or with independent oversight. Ofcom’s work enforcing the Online Safety Act will shift matters towards accountability in some part, but that Act makes no provision at the moment on researchers’ data access, despite civil society and academic researchers having been at the forefront of highlighting online harms for a decade. The anecdotes that the noble Baroness just gave were very powerful testimony to the importance of that. We are, in fact, flying completely blind, making policy and, in this Room, legislation without data, facts or insight into the performance of the algorithms that we seek to address. Were it not for the whistleblowers, we would not have anything to go on, and we cannot rely on whistleblowers to guide our hands.

The Bill rectifies this omission, and I am enormously grateful to the Minister, and to my noble friend Lord Camrose for his role, in putting it there. It is particularly important because the situation on data for researchers has deteriorated considerably, even in the last 18 months—with Meta shutting down CrowdTangle and X restricting researchers’ access to its API. The noble Baroness, Lady Kidron, spoke about what the whistleblowers think, and they think that this is going to get a lot worse in the future.

I welcome the inclusion of these provisions in the Bill. They will be totally transformational to this sector, bringing a level of access to serious analysts and academics, so we can better understand the impact of the digital world, for both good and bad. A good example of the importance of robust research to inform policy-making was the Secretary of State’s recent announcement that the Government were launching a

“research project to explore the impact of social media on young people’s wellbeing and mental health”.—[Official Report, Commons, 20/11/24; col. 250.]

That project will not be very effective if the researchers cannot access the data, so I very much hope that these provisions will be in force before the Government start spending money on it.

To be effective and to have the desired effect, we need to ensure that the data for researchers regime, as described in the Bill, is truly effective and cannot be easily brushed off. That is why the Government need to accept the amendments in this group: to bring some clarity and to close loopholes in the scheme as it is outlined in the Bill.

I will briefly summarise the provisions in the amendments in my name. First, we need to make researcher access regulations enforceable in the same way as other requirements in the Online Safety Act. The enforcement provisions in that Act were strengthened considerably as it passed through this House, and I believe that the measures on data for researchers need to be given the same rocket boosters. Amendment 198D will mean that regulated services are required to adhere to the regime and will give Ofcom the power to take proper remedial action if regulated services are obfuscating or non-compliant.

Secondly, we need to ensure that any contractual provision of use, such as a platform’s terms of service, is unenforceable if it would prevent

“research into online safety matters”,

as defined in the regulations. This closes an important loophole. It will protect UK researchers carrying out public interest research from nefarious litigation over terms of service violations as platforms seek to obfuscate access to data. We have seen this practice in other areas.

Thirdly, we need to clarify that researchers carrying out applicable research into online safety matters in the UK will be able to access information under the regime, regardless of where they are located. This is a basic point. Amendment 198E would bring the regime in line with the Digital Services Act of the EU and allow the world’s best researchers to study potential harm to UK users.

Ensuring robust researcher access to data contributes to a wider ecosystem of investigation and scrutiny that will help to ensure the effective application of the law, while also guarding against overreach in the moderation of speech. It is time to back UK civil society and academic researchers to ensure that policy-making and regulatory enforcement are as informed as possible. That is why I ask the Minister to support these measures.

Lord Russell of Liverpool (CB)

My Lords, I will speak briefly. I added my name in support of Amendments 197 and 198, tabled by the noble Baroness, Lady Kidron. We do not need to rehearse the arguments as to why children are a distinct group who need to be looked at in a distinctive way, so I will not repeat those arguments.

I turn to the excellent points made in the amendments in the name of the noble Lord, Lord Bethell. Data access for researchers is fundamental. The problem with statutory bodies, regulators and departments of state is that they are not designed and set up to be experts in researching some of the more arcane areas in which these algorithms are developed. This is leading-edge stuff. The employees in these platforms—the people who are designing and tweaking these very clever algorithms—are coming from precisely the academic and research institutions that are best placed to go into those companies and find out what they are doing. In many cases, it is their own graduates and PhDs who are doing it. They are the best qualified people to look at what is going on, because they will understand what is going on. If somebody tries to obfuscate, they will see through them immediately, because they can understand that highly sophisticated language.

If we do not allow this, we will be in the deeply uncomfortable position of relying on brave people such as Frances Haugen to run the huge reputational, employability and financial risks of becoming a whistleblower. A whistleblower who takes on one of those huge platforms that has been employing them is a very brave person indeed. I would feel distinctly uncomfortable if I thought that we were trying to guard our citizens, and particularly our children, against what some of these algorithms are doing by relying on the goodwill and chance of a whistleblower showing us what is going on. I support all these amendments very strongly.

17:15
Lord Arbuthnot of Edrom (Con)

My Lords, I shall speak very briefly. I have a great deal of agreement with what the noble Baroness, Lady Kidron, the noble Lord, Lord Russell, and my noble friend Lord Bethell have said. I am rising to nitpick; I apologise for that, but I suppose that is what Committee is for.

The final line of proposed new subsection (da), to be inserted by Amendment 198, refers to

“different characteristics including gender, race, ethnicity, disability, sexuality, gender”.

On our first day in Committee, I raised the importance of the issue of sex, which is different from gender or sexuality. We need to make sure that we get the wording of this amendment, if it were to be accepted by the Government, absolutely right.

Lord Knight of Weymouth (Lab)

My Lords, I shall also speak extremely briefly, as one of the three veterans of the Joint Committee present in Committee today, to reinforce my support for these amendments. The Government should be congratulated on Clause 123. It is welcome to see this movement but we want to see this done quickly. We want to ensure that it is properly enforceable, that terms of service cannot be used to obstruct access to researchers, as the noble Lord, Lord Bethell, said, and that there is proper global access by researchers, because, of course, these are global tech companies and UK users need to be protected through transparency. It is notable that, in the government consultation on copyright and AI published yesterday, transparency is a core principle of what the Government are arguing for. It is this transparency that we need in this context, through independent researchers. I strongly commend these amendments to the Minister.

The Earl of Erroll (CB)

My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.

I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. It was never done properly, in my opinion, in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out what Parliament decided in passing the legislation. We have a problem, but I think that we can address it by putting “must” in the Bill. Then, we can hold the Executive to account.

Lord Clement-Jones (LD)

My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.

Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to be effective and have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the breadth of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.

The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.

My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to gain access simply by describing themselves as “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase is ultimately used.

Viscount Camrose (Con)

My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.

Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To the same end, Amendment 198D, which would make researcher access enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can obtain valuable information and carry out their beneficial research.

I am still left with some curiosity about some of these amendments, so I will indicate where I have specific questions for those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and through what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.

I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.

I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?

Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?

Baroness Jones of Whitchurch (Lab)

My Lords, I thank noble Lords who have welcomed the provisions in the Bill. I very much appreciate that we have taken on board the concerns that were raised in the debates on the previous legislation. I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Clement-Jones, for their amendments.

I will speak first to Amendment 197, tabled by the noble Baroness, Lady Kidron, which would compel the Secretary of State to create a framework and to do so within 12 months of passage. I understand and share her desire to ensure that a framework allowing researchers access is put in place promptly. This is precisely why we brought forward this provision. I reassure her that the department will consult on the framework as soon as possible after the publication of Ofcom’s report.

Turning to Amendments 198 and 198B, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, respectively, Clause 123 provides the Secretary of State with the power to make regulations relating to researchers’ access to data. I can reassure noble Lords that it does not limit the regulations to the non-exhaustive list of examples provided. I agree that fair and proportionate criteria for who is considered a researcher are critical to the success of the future framework. I reassure noble Lords that in the provision as currently written the Secretary of State can include in the design of the framework the specific requirements that a person must meet to be considered a researcher.

Turning to Amendments 198A and 198D, tabled by the noble Lord, Lord Bethell, while I am sympathetic to his desire to provide a future framework with the robust enforcement powers of the OSA, I assure him that as the provision is written, the Secretary of State can already use the existing enforcement powers of the OSA to support a future framework. Furthermore, should the evidence suggest that additional or different measures would be more effective and appropriate, this provision allows the Secretary of State the flexibility to introduce them.

Turning next to Amendments 198C and 198E, tabled by the noble Lord, Lord Bethell, I understand the spirit of these amendments and note the importance of this issue, given the global nature of the online world. It is entirely reasonable to allow researchers who are not based in the UK to utilise our researcher access framework, as long as the subject of their research is the experience of UK users online. I reassure him that the provisions as drafted already allow the Secretary of State to make regulations permitting non-UK-based researchers to use the framework where appropriate. We plan to use the evidence gathered through our own means and through Ofcom’s report to set out who will be eligible to use the framework in the secondary legislation.

Finally, turning to Amendment 198F, I am aware of the concern that researchers have encountered blockages to conducting research and I am sympathetic to the intentions behind the amendment. We must ensure that researchers can use the future framework without fear of legal action or other consequences. I am conscious that the noble Baroness, Lady Kidron, asked me a specific question about legal exemptions and I will write to her to make that answer much clearer. I reassure noble Lords that the Government are considering the specific issues that the noble Lord raises. For these reasons, I ask that the amendments not be pressed while the Government consider these issues further and I am of course happy to engage with noble Lords in the meantime.

Baroness Kidron (CB)

My Lords, I thank the Minister and everyone who spoke. I do not think I heard an answer to the may/must issue, and I need to say that just relying on Ofcom’s report to set the framework for the regime is not adequate, for two reasons. First, it is no news to the Committee that there is a considerable amount of disquiet about how the Online Safety Act has been reinterpreted contrary to Parliament’s intention. During the passage of this Bill, we are trying to make really clear on the face of the Bill what Parliament’s intention is—we will win some and we will lose some—so that the regulator really does what we agree, because that subject is currently quite contentious.

This is a new area, and a lot of the issues that the Minister and, indeed, the noble Viscount, Lord Camrose, raised are still to be sorted out, so that we understand collectively what the regime will look like. Having said that, I would like the Government to have heard that we do not wish to rest on the actions of whistleblowers, but we will be increasingly forced to do so if we do not have a good regime. We must understand the capacity of this sector to go to court. The sector is in court everywhere, all over the world; it has deep pockets.

Finally, I welcome the nitpicking of the noble Lord, Lord Arbuthnot. Long may he nitpick. We will make sure that he is content before Report. With that, I beg leave to withdraw the amendment.

Amendment 197 withdrawn.
17:30
Amendments 198 to 198F not moved.
Clause 123 agreed.
Clauses 124 to 126 agreed.
Amendment 199
Moved by
199: After Clause 126, insert the following new Clause—
“Data risks from systemic competitors and hostile actors
(1) The Secretary of State, in consultation with the Information Commissioner, must conduct a risk assessment on the data privacy risks associated with genomics and DNA companies that are headquartered in countries the government determines to be systemic competitors and hostile actors.
(2) Within 12 months of the day on which this Act is passed, the Secretary of State must present a report on the risk assessment in subsection (1) to Parliament and consult the intelligence and security agencies on the findings, taking into account the need to not make public information critical to national defence or ongoing operations.
(3) This risk assessment must evaluate—
(a) the degree of access granted to foreign entities, particularly those linked to systemic competitors and hostile actors, to genomic and DNA data collected within the United Kingdom,
(b) the potential for genomic and DNA data to be exfiltrated outside of the United Kingdom,
(c) the potential misuse of United Kingdom genomic and DNA data for dual-use or nefarious purposes,
(d) the potential for such data to be used in a manner that could compromise the privacy or security of United Kingdom citizens or undermine national security and strategic advantage.
(4) The risk assessment must consider and include, but is not limited to—
(a) an analysis of the data handling and storage practices of genomics companies that are based in countries designated as systemic competitors and hostile actors,
(b) an independent audit, including digital and physical forensic examination, at any company site that could have access to United Kingdom genomics data, and
(c) evidence of clear disclosure statements to consumers of products and services from genomics companies subject to data sharing requirements in the countries where they are headquartered.
(5) This risk assessment must be conducted as frequently as deemed necessary by the Secretary of State or the Information Commissioner to address evolving threats and ensure continued protection of the genomics sector from entities controlled, directly or indirectly, by countries designated as systemic competitors and hostile actors.
(6) The Secretary of State may issue directives or guidelines based on the findings of the risk assessment to ensure compliance by companies or personnel operating within the genomics sector in the United Kingdom, safeguarding against identified risks and vulnerabilities to data privacy.”
Member’s explanatory statement
This amendment seeks to ensure sufficient scrutiny of emerging national security and data privacy risks related to advanced technology and areas of strategic interest for systemic competitors and hostile actors. It aims to inform the development of regulations or guidelines necessary to mitigate risks and protect the data privacy of UK citizens’ genomics data and the national interest. It seeks to ensure security experts can scrutinise malign entities and guide researchers, consumers, businesses, and public bodies.
Viscount Camrose (Con)

My Lords, the UK is a world leader in genomics research. This research will no doubt result in many benefits, particularly in the healthcare space. However, genomics data can be, and increasingly is, exploited for deeply concerning purposes, including geostrategic ones.

Western intelligence agencies are reportedly becoming increasingly concerned about China using genomic data and biotechnology for military purposes. The Chinese Government have made it clear that genomics plays a key part in their civil-military doctrine: the 13th five-year plan for military-civil fusion calls for the cross-pollination of military and civilian technology, such as biotechnology. That, taken in conjunction with reports that the Beijing Genomics Institute—the BGI—is, in collaboration with the People’s Liberation Army, looking to make ethnically Han Chinese soldiers less susceptible to altitude sickness, makes for worrying reading. Genetically engineered soldiers appear to be moving out of fiction and towards reality.

The global genomics industry has grown substantially as a result of the Covid-19 pandemic, and gene giant BGI Group and its affiliated MGI Tech have acquired large databases of DNA. Further, I note that BGI has widespread links to the Chinese state. It operates the Chinese Government’s key laboratories and national gene bank, itself a vast repository of DNA data drawn from all over the world. A Reuters investigation found that a prenatal test, NIFTY, sold by BGI to expectant mothers, gathered millions of women’s DNA data. This prenatal test was developed in collaboration with the Chinese military.

For these reasons, I think we must become far more protective of genomic data gathered from our population. While many researchers use genomic data to find cures for terrible diseases, many others, I am afraid, would use it to do us harm. To this end, I have tabled Amendment 199 to require the Secretary of State and the Information Commissioner to conduct frequent risk assessments of the data privacy risks associated with genomics and DNA companies headquartered in countries that are systemic competitors or hostile actors. I believe this will go some way towards preventing the transfer of genomic data out of the UK to countries such as China that may use it for military purposes. I beg to move.

Lord Bethell (Con)

My Lords, I strongly support this amendment. As a former Minister, I was at the front line of genomic data and know how powerful it is now and can be in the future. Having discussed this with the UK Biobank, I know that the question of who stores and processes genomic data in the UK is a subject of huge and grave concern. I emphasise that the American Government have already moved emphatically on this issue. There is the possibility that we will be left behind in global standards and will one day be an outlier if we do not close this important and strategically delicate loophole. For that reason, I strongly support this amendment.

The Earl of Erroll (CB)

My Lords, I was involved in an ethics committee that looked at genomics and cancer research some years ago, and this is very important. If research can be done on different genomic and racial types, it could be used against us at some point. So there is a lot of sense in this.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank the noble Viscount, Lord Camrose, for moving this amendment, which raises this important question about our genomics databases, and for the disturbing examples that he has drawn to our attention. He is right that the opportunities from harnessing genomic data come with very real risks. This is why the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data. We plan to brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year. Following that, I look forward to engaging with the noble Viscount on its outcome and on how we intend to take these issues forward. As he says, this is a vital issue, but in the meantime I hope he is prepared to withdraw his amendment.

Viscount Camrose (Con)

I thank the Minister for her answer, and I very much accept her offer of engagement. I will make a few further brief comments about the importance of this amendment, as we go forward. I hope that other noble Lords will consider it carefully before Report.

I will set out a few reasons why I believe this amendment can benefit both the Bill and this country. The first is its scope. The amendment will allow the Secretary of State and the Information Commissioner to assess data security risks across the entirety of the genomic sector, covering consumers, businesses, citizens and researchers who may be partnering with state-linked genomics companies.

The second reason is urgency. DNA is regularly described as the “new gold” and it represents our most permanent identifier, revealing physical and mental characteristics, family medical history and susceptibility to diseases. Once it has been accessed, the damage from potential misuse cannot be reversed, and this places a premium on proactively scrutinising the potential risks to this data.

Thirdly, there are opportunities for global leadership. This amendment offers the UK an opportunity to take a world-leading role and become the first European country to take authoritative action to scrutinise data vulnerabilities in this area of critical technology. Scrutinising risks to UK genomic data security also provides a foundation to foster domestic genomics companies and solutions.

Fourthly, this amendment would align the UK with key security partners, particularly, as my noble friend Lord Bethell mentioned, the United States, which has already blacklisted certain genomics companies linked to China and taken steps to protect American citizens’ DNA from potential misuse.

The fifth and final reason is protection of citizens and consumers. This amendment would provide greater guidance and transparency to citizens and consumers whose DNA data is exposed to entities linked to systemic competitors. With all of that said, I thank noble Lords for their consideration and beg leave to withdraw my amendment.

Amendment 199 withdrawn.
Clauses 127 to 132 agreed.
Amendments 200 to 202 not moved.
Amendment 203
Moved by
203: After Clause 132, insert the following new Clause—
“Offence to use personal data or digital information to create digital models or files that facilitate the creation of AI- or computer-generated child sexual abuse material
(1) A person commits an offence if they—
(a) collect, scrape, possess, distribute or otherwise process personal data or digital information with the intention of using it, or attempting to use it, to create or train a digital model which enables the creation of AI- or computer-generated child sexual abuse material or priority illegal content;
(b) use personal data or digital information to create, train or distribute or attempt to create, train or distribute a digital file or model that has been trained on child sexual abuse material or priority illegal content, or which enables the creation of AI- or computer-generated child sexual abuse material or priority illegal content;
(c) collate, or attempt to collate, digital files or models based on personal data or digital information that, when combined, enable the creation of AI- or computer-generated child sexual abuse material or priority illegal content;
(d) possess, or attempt to possess, a digital file or model based on personal data or digital information with the intention of using it to produce or gain access to AI- or computer-generated child sexual abuse material or priority illegal content.
(2) For the purposes of this section, “AI- or computer-generated child sexual abuse material or priority illegal content” includes images, videos, audio including voice, chatbots, material generated by large language models, written text, computer files and avatars.
(3) A person who commits an offence under subsection (1) is liable to the sentences set out in section 160 of the Criminal Justice Act 1988 (possession of indecent photograph of child) and section 6 of the Protection of Children Act 1978 (punishments) for the equivalent offences.
(4) For the purposes of this section, “priority illegal content” is content that meets the definition of “priority illegal content” set out in section 59 of the Online Safety Act 2023.”
Member’s explanatory statement
It is illegal in the UK to possess or distribute child sexual abuse material, including AI- or computer-generated child sexual abuse material. However, while the content is clearly covered by existing law, the mechanism that enables its creation – i.e. the files trained on or trained to create such material – is not. This amendment seeks to address that gap.
Baroness Kidron (CB)

My Lords, Amendment 203 is in my name and the names of the noble Lords, Lord Bethell, Lord Stevenson and Lord Clement-Jones. I thank noble Lords wholeheartedly for their support for this measure through two versions of this Bill. I believe that I speak for all signatories in recognising the support of a huge number of colleagues in both Houses and all parties who have expressed their support for this amendment.

It is my understanding that we are going to hear good news from the Dispatch Box. In the event that I am wrong, I shall have more to say once we have heard from the Minister. In the meantime, I want to explain what the problem is that the amendment seeks to solve.

It is illegal in the UK to possess or distribute child sexual abuse material, including AI-generated or computer-generated child sexual abuse material. The laws that the police use to enforce against CSAM are Section 1 of the Protection of Children Act 1978 and Section 160 of the Criminal Justice Act 1988, both of which create offences in respect of indecent photographs or pseudo-photographs of a child. AI content depicting child sexual abuse is illegal under these laws, but creating and distributing the software models needed to generate it is not, which means that those building and distributing software that allows paedophiles to generate bespoke child sexual abuse material have operated with impunity.

There are many services that allow anyone to take any public image and put it in a false situation. Although I have argued elsewhere that AI images should carry a mark of provenance, these services are not the subject of this amendment. This amendment is laser focused on criminalising AI models that are trained on or trained to create child sexual abuse material. They are specific, specialist and currently beyond the reach of the police. The models blend images of children—known children, stock photos, images scraped from social media and school websites, or synthetic, fabricated AI depictions of children—with existing CSAM or pornography, and they allow paedophiles to generate bespoke CSAM scenarios of unimaginable depravity, unconstrained by any of the limits that govern the real world. If someone can think it, type it or say it, they can make it so.

Many of the generative models are distributed for free, but more specialist models are provided on subscription for less than £50 per month. This payment provides any child sexual offender with the ability to generate limitless—and I do mean limitless—child sexual abuse images. But while the police can take action against those who possess those images, they are unable to take action against those who make it possible to do so: the means of production.

A surprising number of people think that AI abuse is a victimless crime, and I want to make it clear that it is not. First, who would be comfortable with the image of their child or grandchild or their neighbour’s child being used in this way? Anyone, adult or child, can appear in AI-generated CSAM. I am not going to say how it can be done, because I do not want my words to be a set of instructions on the public record—but the reality is, any one of us, woman or man, though 99% are women, boy or girl, though it is mostly girls, is a potential victim. If your image is used in this way, you are a victim; if you are forced to watch or copy such imagery, you are a victim; and if you are a child whose real-life abuse is not caught because you are lost in a sea of AI-generated material, you are a victim. Then there is the normalisation of sexual violence against children, which poisons relationships—intimate, familial, across generations, genders and sexes. This is not a victimless crime.

17:45
I have been aware of the industrial scale of this issue, in part because of the efforts of a specialist police unit that—day in and day out—occupies the synthetic worlds created to humiliate, objectify and abuse children. I have had the privilege of meeting many of the unit in person and a smaller group on many occasions. For obvious reasons, I do not want to name them, but I take this opportunity to thank them and recognise all on the front line of fighting against CSAM. It is an unbearably hard task.
Before I sit down, I have two brief points to make. First, although the proposed amendments are squarely focused on those who deliberately create child sexual abuse material, I put on notice those companies and services that do not take sufficient care to prevent it happening accidentally. I know some image generator companies have gone out of their way to create guardrails and others have taken a “hurt first, fix later” approach. We have data law, we have the OSA, and I anticipate that in the new year we will have further offences, each of which will be robustly used to stop the careless creation of abuse. That should be the number one concern of GenAI companies.
Secondly, I am of course delighted to win a battle for children. I am happy to recognise that the previous Government promised it and the efforts of the noble Viscount, Lord Camrose, in agreeing to this at an earlier date. I also recognise the efforts of the civil servants in the Home Office and the Safeguarding Minister, Jess Phillips, all of whom have made considerable efforts.
Last Friday, however, we had a completely unacceptable answer from the Lords MoJ Minister on the related issue of non-consensual sexually explicit images and videos during a debate on the PMB of the noble Baroness, Lady Owen. I had written that line before the noble Baroness decided to lay some amendments that we will discuss in only a moment. I will let her explain her intentions, but I want to put on record my full support for her campaign, her Private Member’s Bill and her amendments, and for including them in today’s debate.
It should not be possible for the Home Office to manage and for the MoJ to not manage. We need a Government where all departments work on behalf of all victims. I will wait to hear what the Minister says, and I very much hope I can congratulate her when I stand up again. I beg to move.
Baroness Owen of Alderley Edge (Con)

My Lords, I rise today in support of Amendment 203 in the name of the noble Baroness, Lady Kidron. I declare an interest as a recent guest of Google at its Future Forum policy conference. I apologise for not being able to make Second Reading and for not being present for my last amendment; as a newer Peer, I am still learning as I go. I am very grateful to the noble Baroness, Lady Kidron, for stepping in.

I commend the wording of the noble Baroness’s amendment, which tackles the full process of training these models, from the collection of data or images to use as training data, all the way through to possessing a model. With these apps easily downloadable on app stores, there is a lack of friction in the process. This means that we have seen horrific cases of children using these apps in schools across the world with devastating consequences. In summer, I met the father of a little girl who had been bullied in this way and sadly took her own life.

I am very grateful to the noble Baroness, Lady Kidron, for this thoughtful and comprehensive amendment, which seeks to future-proof with its inclusion of avatars. We have already seen these threats evolving in the metaverse. I encourage the Government to adopt this amendment so that we can begin to see an end to this abusive market.

I turn to my Amendment 211G. I am very grateful to the noble Lords, Lord Clement-Jones and Lord Browne of Ladyton, and the noble Baroness, Lady Kidron, for putting their names to it. Noble Lords may recognise it from my Private Member’s Bill on non-consensual sexually explicit images and videos. I will keep my remarks brief as many of your Lordships were present on Friday.

The amendment seeks to create offences for the non-consensual creation of sexually explicit content and to close the gaps in the Sexual Offences Act. It is, vitally, consent-based, meaning that victims do not have to suffer the trauma of proving the motivation of their perpetrator. It includes solicitation, to prevent any creation laws being circumvented by asking those in other jurisdictions to create such content for you through the uploading of clothed images to forums. Finally, it includes forced deletion, so that victims can clearly see their right to have the content destroyed on any devices or cloud-based programmes and do not have to live in fear that their perpetrator is still in possession of their content.

This amendment is inspired by the lived experience of victim survivors. The Government have repeatedly said that they are looking for the most suitable legislative vehicle to fulfil their commitment to criminalise the creation of sexually explicit deepfakes. It seems they did not think my Private Member’s Bill was the right vehicle, but it is my firm belief that the most appropriate legislative vehicle is the one that gets there quickest. I am hopeful that the Government will be more receptive to an amendment to their legislation, given the need urgently to tackle this rapidly proliferating form of abuse.

Amendment 211H addresses the problem of sexually explicit audio, which the noble Baroness, Lady Gohir, spoke about so movingly in Friday’s debate. We have seen satirical voice cloning, such as of Gareth Southgate at the 2024 Euros. However, the most state-of-the-art systems now require only around three seconds of voice audio data to create speech at parity with a human voice. This could be data from a short phone call or a TikTok video. As we reach the point where ever less data is required to create high-quality audio, this has the potential to be weaponised. There is a real risk that, if we do not future-proof against this while we have the opportunity, it could rapidly develop in the way that sexually explicit deepfake images have. We are already seeing signs of new sexually explicit audio online. Its ease of use combined with its accessibility could create a huge risk in future.

Henry Ajder, the researcher who pioneered the study of non-consensual deepfake image abuse, said:

“2024 has seen AI generated voice audio widely used in spreading political disinformation and new forms of fraud, but much less attention has been paid to its potential as a tool for digital sexual abuse”.


In his research in 2018, he observed several cases of online communities experimenting with voice-cloning capabilities, targeting celebrities to create non-consensual “synthetic phone sex” content. This Bill could be a key opportunity to future-proof against this problem before it becomes widespread.

Baroness Gohir (CB)

My Lords, I declare my interests as set out in the register, particularly as CEO of Muslim Women’s Network UK, which operates a national helpline. I also apologise for not being here at Second Reading, but I felt compelled to speak today after the noble Baroness, Lady Owen, put forward her amendments. Before I speak to them, I support all the amendments from the noble Baroness, Lady Kidron—everything she says is always very powerful.

The noble Baroness, Lady Owen, made her case powerfully today, as she did last week. I too spoke in that debate. We were disappointed across the House that the Government were not very supportive of the Bill, but they hinted that its amendments and recommendations could be integrated into another Bill. This Bill could be it.

I will focus my comments on audio recordings, which I raised last week. This element gets overlooked, because we tend to focus on sexually explicit images and video recordings. However, perpetrators will also record audio of sexual activities without consent and either share or threaten to share it. As the noble Baroness, Lady Owen, mentioned, people can create deepfakes very easily with new technologies. A person’s voice is recognisable to the people who know them, so this must be addressed and it can be in this Bill.

Perpetrators of intimate image and intimate audio abuse can instil fear, humiliate and make victims feel unsafe without even sharing, or threatening to share, the material. They can manipulate and control their victims simply by making them aware that they have recorded or created these images and recordings.

The Muslim Women’s Network’s helpline has had women call to say that, when relationships have broken down, husbands and boyfriends have made secret audio recordings and then threatened them with those recordings. Sometimes, they have shared them online or with family members and friends. Just knowing that they possess these recordings makes these women feel very unsafe and live in fear. In some communities and cultures where people will be worried about honour-based abuse, women will be even more fearful of the repercussions of these audio recordings being shared.

Whether it is original audio or digitally created deepfake audio, the law needs to be amended to prevent this type of abuse. If the Labour Party and the Government are serious about halving abuse against women and girls, they must shut down every avenue of abuse and accept these amendments.

Lord Bethell (Con)

My Lords, I will speak in support of Amendment 203, which I have signed, and Amendments 211G and 211H in my noble friend Lady Owen’s name.

At Second Reading, the mood of the House was to consider and support the enormous opportunity that comes from AI and to acknowledge the dangers of overregulation that might, somehow, smother this massive opportunity. I endorse that sentiment. However, Amendment 203 addresses computer-generated child sexual abuse material, which I regard as a red line that we should not cross. If we leave this amendment out of the Bill and cannot tackle this one massive issue of CSAM generated by AI, we will leave the whole question of the integrity and purpose of AI vulnerable to misuse by criminals and perverts.

The scale of the issue is already enormous. The Internet Watch Foundation found 275,000 webpages containing child sexual abuse content. On just one forum, 20,000 AI-generated images were posted in a single month, over 3,000 of which depicted criminal acts of child sexual abuse. This is not a hypothetical problem or some kind of visioneering or dystopian imagination; it is happening right now. There are offices filled with people generating this material for their pleasure and for commercial reasons. That is why it is urgent that we move immediately.

Any of us who has heard the testimony of the many victims of sexual abuse will realise that the experience creates lasting anxiety and gut-wrenching trauma. These are not just pictures or videos; they often represent real harm to real people. That is why urgency is so important and this amendment is so critical.

Shockingly, the explosion of this kind of material is enabled by publicly available tools, as the noble Baroness, Lady Kidron, pointed out. The case of Hugh Nelson is a very good example. He was sentenced to 18 years in prison for creating AI videos of children being physically and sexually abused. The tool he used was Daz 3D, AI software that any of us could access from this Room. It is inconceivable that this technology remains unregulated while being weaponised by people such as Hugh Nelson to inflict huge harm. Currently, our law focuses on the possession and distribution of CSAM but fails to address the mechanisms of its creation. That is a loophole and why I support these amendments. I do so for three key reasons.

First, Amendment 203 would criminalise the creation, training and distribution of AI models that can create CSAM. That would mean that Daz and other sites like it must introduce safety-by-design measures to stop their use for creating illegal content. That is not to smother the great and bountiful explosion of beneficial AI; it is to create the most basic guard-rail that should be embedded in any of these dangerous tools.

18:00
Secondly, under the amendment it would become an offence to train models using CSAM or illegal content to generate images. These systems are trained on massive quantities of tagged images, and the compilation of that data is generally outsourced; those training AI models are likely scraping data from the internet without authorisation or supervision. Protecting personal data is absolutely necessary to stop its misuse for creating deepfakes and other CSAM content, training AI models, or creating extreme content.
Thirdly, this amendment would make it an offence to possess digital files or AI models that are intended to produce CSAM. This measure will curb the spread of these tools and reduce the availability of such content.
Together, these measures reduce the ease with which people can currently abuse publicly available tools for their perverse sexual gratification or to destroy the reputation of others. It is no longer enough to focus solely on the content; we must also hold to account the platforms and the tools that enable this abuse. The amendment is meant to send a message to, and create legal jeopardy for, major corporations such as Microsoft, Google and AWS: they should not enable those who create this horrible content.
The recent debate on deepfakes, led by my noble friend Lady Owen, gave a very clear sense of where the mood of the House is. Urgency is imperative—the technology is moving more quickly than our legislative response. I hope the Minister will realise that this is an opportunity to set a new milestone for legislative responses to a new technological threat and seize it. The explosion of computer-generated CSAM is a pressing threat to our society, so supporting the amendment is a vital step towards safeguarding thousands more from online abuse.
Lord Knight of Weymouth (Lab)

My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.

I hope that the Minister has similar agility in being able to readjust the Government’s position on this. This was, rightly, an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.

Lord Clement-Jones (LD)

My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who have drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI- or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, under it, it would specifically become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.

This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.

The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, the foundation reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government on legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and to enhance preventive measures against its creation and distribution. It specifically recommended:

“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.


The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.

Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.

We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.

This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.

I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.

Viscount Camrose (Con)

My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.

Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.

Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.

That reservation concerns proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be done by innocent people using innocent materials, which are then corrupted or given more poisonous aspects through further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.

Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank the noble Baroness, Lady Kidron, for her Amendment 203. It goes without saying that the Government treat all child sexual abuse material with the utmost seriousness. I can therefore confirm to her and the Committee that the Government will bring forward legislative measures to address the issue in this Session and that the Home Office will make an announcement on this early in the new year.

On Amendments 211G and 211H, tabled by the noble Baroness, Lady Owen, the Government share the concern that more needs to be done to protect women from deepfake image abuse. This is why the Government committed in their manifesto to criminalise the creation of sexually explicit deepfake images of adults. I reassure the noble Baroness and the whole Committee that we will deliver on our manifesto commitment in this Session. The Government are fully committed to protecting the victims of tech-enabled sexual abuse. Tackling intimate audio would be a new area of law, but we continue to keep the law in this area under review.

I also say to the noble Baroness that there is already a process under Section 153 of the Sentencing Act 2020 for the court to deprive a convicted offender of property, including images that have been used for the purpose of committing or facilitating any criminal offence. As well as images, that includes computers and mobile phones that the offender either used to commit intimate image offences or intended to use for that purpose in future. For those reasons and the reassurances I have given today, I hope that noble Lords will feel able to withdraw or not press their amendments.

18:15
Baroness Kidron (CB)

My Lords, first, I thank the speakers for what were really powerful and largely unequivocal contributions.

I am grateful to the Minister. I was expecting something a tiny bit more expansive, but I will take on record that we are going to make it a new offence for a person to make, adapt, possess, supply or offer to supply a CSA image generator, including any service, program or information in electronic form that is made, or adapted for use, to create or facilitate the creation of CSA material. I am expecting something that covers all that, and I am expecting it shortly, as the Minister said. I again thank the Safeguarding Minister, Jess Phillips, for her tremendous efforts, as well as some of the civil servants who helped make this leap from one Government to the next. We can be content with that.

I feel less comfortable about the Minister’s answer to the noble Baroness, Lady Owen. We, women victims, experience the gaps in the law. If there are gaps in the law, it is our job, in this Committee and in the other place, to fix them. We all want the same thing; I know the Minister well enough to know that she wants the same thing. So I am going to push back and say that I will support the noble Baroness, Lady Owen, in trying to bring this measure back through this Bill. I believe that the mood of the Committee is with her, so whatever imperfections there are in her drafting will be fixed before Report, because this is not something that can wait. Kids and women are being hurt.

We all want to celebrate the digital world. I was an early adopter. I had one of those cameras on my computer before anyone else I knew did, so I could not speak to anyone; there was no one to talk to. We want this world to be good. We are not saying something different. On behalf of the noble Baroness, Lady Owen, who is nodding, let me just say that we will come back to this issue. I thank the Minister for her assurance on Amendment 203 and beg leave to withdraw.

Amendment 203 withdrawn.
Amendment 204
Moved by
204: After Clause 132, insert the following new Clause—
“Compliance with UK copyright law by operators of web crawlers and general-purpose AI models(1) The Secretary of State must by regulations make provisions clarifying the steps the operators of web crawlers and general-purpose artificial intelligence (AI) models must take to comply with United Kingdom copyright law, including the Copyright, Designs and Patents Act 1988.(2) The provisions made under subsection (1) must apply if the products and services of such operators are marketed in the United Kingdom.(3) The provisions made under subsection (1) must apply to the entire lifecycle of a general-purpose AI model, including but not limited to—(a) pre-training,(b) fine tuning, and(c) grounding and retrieval-augmented generation.(4) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”Member’s explanatory statement
This amendment would require operators of internet scrapers and general-purpose AI models to comply with UK copyright law, and to abide by a set of procedures.
Baroness Kidron (CB)

My Lords, I am beginning to feel like the noble Lord, Lord Clement-Jones, but I reassure everyone that this is the last day of Committee.

I shall speak to the amendments in this group in my name and those of the noble Lords, Lord Stevenson—he is very sorry not to be in his place today—and Lord Clement-Jones, and my noble friend Lord Freyberg. I thank the News Media Association for its briefing and support. I also thank, for their wonderful and unlikely support, Sir Paul McCartney, Kate Mosse, Margaret Drabble and Richard Osman, alongside the many creative artists who have spoken, written and tweeted and are among the 37,000 people who signed a petition calling for swift action to protect their livelihoods.

I have already declared my interests for the Committee but I add, to be clear, that my husband is a writer of film, theatre and opera; and that, before I came to your Lordships’ House, I spent 30 years as a movie director. As such, I come from and live alongside a community for whom the unlicensed and illegal use of copyrighted content by generative AI developers is an existential issue. I am therefore proud to move and speak to amendments that would protect one of our most financially significant economic sectors, which contributes £126 billion in gross value added to UK GDP; employs 2.4 million people; and brings so much joy and understanding to the world.

Text and data mining without licence or permission is illegal in the UK, unless it is done specifically for research. This means that what we have witnessed over the past few years is intellectual property theft on a vast scale. Like many of the issues we have discussed in Committee, this wrongdoing has happened in plain sight of regulators and successive Governments. I am afraid that yesterday’s announcement of a consultation did not bring the relief the industry needs. As Saturday’s Times said,

“senior figures in the creative sector are scathing about the government plans”,

suggesting that the Secretary of State has drunk Silicon Valley’s “Kool-Aid” and that rights reservation is nonsense. An official at the technical briefing for the consultation said that

“rights reservation is a synonym for opt out”.

Should shopkeepers have to opt out of shoplifters? Should victims of violence have to opt out of attacks? Should those who use the internet for banking have to opt out of fraud? I could go on. I struggle to think of another situation where someone protected by law must proactively wrap it around themselves on an individual basis.

The value of our creative industries is not in question; nor is the devastation that they are experiencing as a result of non-payment for their IP. A recent report from the International Confederation of Societies of Authors and Composers, which represents more than 5 million creators worldwide, said that AI developers and providers anticipate the market for GAI music and audiovisual content increasing from €3 billion to €64 billion by 2028—much of it derived from the unlicensed reproduction of creators’ works, representing a transfer of economic value from creators to AI companies. Let there be no misunderstanding of the scale of the theft: we already know that the entire internet has been downloaded several times without the consent or financial participation of millions of copyright holders.

This transfer of economic value from writers, visual artists and composers across all formats and all genres to AI companies is not theoretical. It is straightforward: if you cannot get properly paid for your work, you cannot pay the rent or build a career. Nor should we be taken in by the “manufactured uncertainty” that Silicon Valley-funded gen AI firms and think tanks have sought to create around UK copyright law. Lobbyists and their mouthpieces, such as TechUK, speak of a lack of clarity—a narrative that may have led to Minister Chris Bryant claiming that the Government’s consultation was a “win-win”. However, I would like the Minister to explain where the uncertainty on who owns these copyrighted works lies. Also, where is the win for the creative industries in the government proposal, which in one fell swoop deprives artists of control and payment for their work—unless they actively wrap the law around them and say “no”—leaving them at the mercy of pirates and scrapers?

Last week, at a meeting in this House attended by a wide range of people, from individual artists to companies representing some of the biggest creative brands in the world, a judge from the copyright court said categorically that copyright lies with the creator. AI does not create alone; it depends on data and material then to create something else. A technological system that uses it without permission is theft. The call for a new copyright law is a tactic that delays the application of existing law while continuing to steal. Unlike the physical world, where the pursuit of a stolen masterpiece may eventually result in something of value being returned to its owner, in the digital world, once your IP is stolen, the value is absorbed and fragmented, hidden amid an infinite number of other data points and onward uses. If we continue to delay, much of the value of the creative industries’ rich dataset will already have been absorbed.

The government consultation has been greeted with glee overnight by the CCIA, which lobbies for the biggest tech firms. After congratulating the Government at some length, it says that

“it will be critical to ensure that the transparency requirements are realistic and do not ask AI developers to compromise their work by giving away trade secrets and highly sensitive information that could jeopardise the safety and security of their models”.

In plain English, that means, “We have persuaded the Government to give up creatives’ copyright, and now the campaign begins to protect our own ‘sensitive business information’”. If that is not sufficiently clear to the Committee: they are claiming their own IP while stealing others’, and simultaneously pushing back against transparency, because they do not want an effective opt-out.

The government consultation does not even contain an option of retaining the current copyright framework and making it workable with transparency provisions—the provisions of the amendments in front of us. The Government have sold the creative industries down the river. Neither these amendments nor the creative community are anti-tech; on the contrary, they simply secure a path by which creatives participate in the world that they create. They ensure the continuous sustainable production of human-generated content into the future, for today’s artists and those of tomorrow. The amendments do not extend the fundamentals of the Copyright, Designs and Patents Act 1988, but they ensure that the law can be enforced on both AI developers and third parties that scrape on their behalf. They force transparency into the clandestine black box.

Amendment 204 requires the Secretary of State to set out the steps by which copyright law must be observed by web crawlers and others, making it clear that it applies during the entire lifecycle, from pretraining onwards, regardless of jurisdiction—and it must take place only with a licence or express permission.

Amendment 205 requires the Secretary of State to set out the steps by which web crawlers and general-purpose AI models must be transparent. This includes, but is not limited to, providing a name for the crawler, identifying the legal entity responsible for it, listing the purposes for which it is engaged and disclosing what data it has passed on. It creates a transparent supply chain. Crucially, it requires operators of crawlers to disclose the businesses to which they sell the data they have scraped, making it more difficult for AI developers that purchase illegally scraped content to avoid compliance with UK copyright law and overturning the current practice in which the operators of crawlers can obscure their own identity or ownership, making it difficult and time-consuming—potentially impossible—to combat illegal scraping.

Amendment 206 requires the Secretary of State to set out by regulation what information web crawlers and general-purpose models must disclose regarding copyrighted works—information such as URL, time and type of data collected and a requirement to inform the copyright holder. This level of granularity, which the tech companies are already pushing against, provides a route by which IP holders can choose or contest the ways in which their work is used, as well as providing a route for payment.

In sum, the amendments create a clear and simple process for identifying which copyright works are scraped, by whom, for what purpose and from which datasets. They provide a process by which existing law can be implemented.
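To make that concrete, a disclosure record of the kind Amendments 205 and 206 envisage might look something like the following sketch. Every field name and value here is hypothetical: the amendments prescribe the information to be disclosed, not its format.

```json
{
  "crawler": {
    "name": "ExampleCrawler",
    "operator": "Example AI Ltd",
    "purposes": ["training a general-purpose AI model"],
    "data_recipients": ["Example AI Ltd"],
    "contact": "crawler-enquiries@example.com"
  },
  "works_accessed": [
    {
      "url": "https://publisher.example/articles/review-2024",
      "retrieved_at": "2024-12-17T10:42:00Z",
      "data_type": "text/html",
      "copyright_holder_notified": true
    }
  ]
}
```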

I shall just mention a few more points before I finish. First, there is widespread concern that mashing up huge downloads of the internet, including toxic material, falsehoods and an increasing proportion of artificially generated or synthetic data, will cause models to degenerate or collapse, putting a block on the innovation that the Government and all of us want to see, as well as raising serious safety concerns about the information ecosystem. A dynamic licensing market would provide a continuous flow of identified human-created content from which AI can learn.

Secondly, the concept of a voluntary opt-out regime—or, as the Government prefer, rights reservation—is already dead. In the DPDI Bill, I and others put forward an amendment to make robots.txt, part of the robots exclusion protocol, opt-in. In plain English, that would have meant that the voluntary scheme in which any rights holder can put a note on their digital door saying “Don’t scrape” would have been reversed to be mandatory. Over the last few months, we have seen scrapers ignoring the agreed protocol, even when activated. I hope the Minister will explain why he thinks that creators should bear the burden and the scrapers should reap the benefit and whether the Government have done an impact assessment on how many rights holders will manage to opt out versus how many would opt in, given the choice.
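For those unfamiliar with it, that note on the digital door is a plain text file. A minimal sketch of one follows: the crawler tokens shown, GPTBot and Google-Extended, are ones their operators have publicly documented for opting out of AI training, though honouring the file is entirely voluntary.

```
# robots.txt placed at the root of a website.
# Asks two published AI-training crawlers to stay away,
# while leaving the rest of the site open to other crawlers.

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
```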

18:30
Thirdly, the companies are not quite telling the whole truth. In August, news broke that Meta was indexing the web to enable its AI chatbot to provide responses to user questions. At the same time, a much less trumpeted new entry on Meta’s website stated that it will scrape everything written on the web by companies and individuals to improve products “by indexing content directly”. If indexing is equivalent to scraping and, as we debated earlier in Committee, “improving products” is scientific research, this Bill represents the end of both IP and data protection simultaneously.
Finally, it is simply not true that regulation will hold us back. Many of our most successful sectors are the most regulated and there are other factors that hold back investment and growth in the UK, including a very risk-averse investment ecosystem.
We have a rich and impactful creative sector. The reach of our artists, the soft power of our storytellers in all formats, the inventiveness of our designers and the skill of our musicians are legendary. The Government’s industrial strategy rightly recognises the creative industries and the tech sector as two of the UK’s priority growth-driving industries. The Government talk about balancing two competing sides, but they are neither the same nor equal. One is a creator and one is a distributor, regurgitator or, perhaps more generously, secondary user. As in all supply lines, you need to pay for your raw material to make something new. The Government will not achieve growth by simply allowing one growth area to cannibalise the other. Since the vast majority of benefit from AI scraping accrues to the US, it seems short-sighted, possibly criminal, to put the UK’s uniquely successful and profitable creative industries at the mercy of the predatory gen AI companies. I beg to move.
Lord Freyberg (CB)

My Lords, I support Amendments 204, 205 and 206, to which I have attached my name. In doing so, I declare my interest as someone with a long-standing background in the visual arts and as an artist member of the Design and Artists Copyright Society.

These amendments, tabled and superbly moved by my noble friend and supported by the noble Lords, Lord Stevenson and Lord Clement-Jones, seek to address a deep crisis in the creative sector whereby millions upon millions of creative works have been used to train general-purpose or generative AI models without permission or pay. While access to data is a fundamental aspect of this Bill, which in many cases has positive and legitimate aims, the unauthorised scraping of copyright-protected artworks, news stories, books and so forth for the use of generative AI models has significant downstream impacts. It affects the creative sectors’ ability to grow economically, to maximise their valuable assets and to retain the authenticity that the public rely on.

AI companies have used artists’ works in the training, development and deployment of AI systems without consent, despite this being a requirement under UK copyright law. As has been said, the narrow exception to copyright for text and data mining for specific research purposes does not extend to AI models, which have indiscriminately scraped creative content such as images without permission, simply to build commercial products that allow users to generate their own versions of a Picasso or a David Hockney work.

This amendment would clarify the steps that operators of web crawlers and general-purpose AI models must take to comply with UK copyright law. It represents a significant step forward in resolving the legal challenges brought by rights holders against AI companies over their training practices. Despite high-profile cases arising in the USA and the UK over unauthorised uses of content by AI companies, the reality is that individual artists simply cannot access judicial redress, given the prohibitive cost of litigation.

DACS, which represents artists’ copyright, surveyed its members and found that they were not technophobic or against AI in principle but that their concerns lay with the legality and ethics of current AI operators. In fact, 84% of respondents would sign up for a licensing mechanism to be paid when their work is used by an AI with their consent. This amendment would clarify that remuneration is owed for AI companies’ use of artists’ works across the entire development life cycle, including during the pre-training and fine-tuning stages.

Licensing would additionally create the legal certainty needed for AI companies to develop their products in the UK, as the unlawful use of works creates a litigation risk which deters investment, especially from SMEs that cannot afford litigation. DACS has also been informed by its members that commissioning clients have requested artists not to use AI products in order to avoid liability issues around its input and output, demonstrating a lack of trust or uncertainty about using AI.

This amendment would additionally settle ongoing arguments around whether compliance with UK copyright law is required where AI training takes place in other jurisdictions. By affirming its applicability where AI products are marketed in the UK, the amendment would ensure that both UK-based artists and AI companies are not put at a competitive disadvantage due to international firms’ ability to conduct training in a different jurisdiction.

One of the barriers to licensing copyright is the lack of transparency over what works have been scraped by AI companies. The third amendment in this suite of proposals, Amendment 206, seeks to address this. It would require operators of web crawlers and general-purpose AI models to be transparent about the copyright works they have scraped.

Currently, artists and creators face significant challenges in protecting their intellectual property rights in the age of AI. While tools such as Spawning AI’s “Have I Been Trained?” attempt to help creators identify whether their work has been used in AI training datasets, these initiatives provide only surface-level information. Creators may learn that their work was included in training data, but they remain in the dark about crucial details—specifically, how their work was used and which companies used it. This deeper level of transparency is essential for artists to enforce their IP rights effectively. Unfortunately, the current documentation provided by AI companies, such as data cards and model cards, falls short of delivering this necessary transparency, leaving creators without the practical means to protect their work.

Amendment 206 addresses the well-known black box issue that currently plagues the AI market, by requiring the disclosure of information about the URLs accessed by internet scrapers, information that can be used to identify individual works, the timeframe of data collection and the type of data collected, among other things. The US Midjourney litigation is a prime example of why this is necessary for UK copyright enforcement. It was initiated only after a leak revealed the names of more than 16,000 non-consenting artists whose works were allegedly used to train the tool.

Creators, including artists, should not find themselves in a position where they must rely on leaks to defend their intellectual property rights. By requiring AI companies to regularly update their own records, detailing what works were used in the training process and providing this to rights holders on request, this amendment could also create a vital cultural shift towards accountability. This would represent an important step away from the “Move fast and break things” culture pervasive amongst the Silicon Valley-based AI companies at the forefront of AI development, and a step towards preserving the gold-standard British IP framework.

Lastly, I address Amendment 205, which requires operators of internet crawlers and general-purpose AI models to be transparent about the identity and purpose of their crawlers, and not penalise copyright holders who choose to deny scraping for AI by down ranking their content in, or removing their content from, a search engine. Operators of internet crawlers that scrape artistic works and other copyright-protected content can obscure their identity, making it difficult and time-consuming for individual artists and the entities that represent their copyright interests to identify these uses and seek redress for illegal scraping.

Inclusion in search-engine results is crucial for visual artists, who rely on the visibility these provide for their work to build their reputation and client base and generate sales. At present, web operators that choose to deny scraping by internet crawlers risk the downrating or even removal of their content from search engines, as the most commonly used tools cannot distinguish between the do-not-train and do-not-index protocols added to a site. This amendment would ensure that artists who choose to deny scraping for AI training are not disadvantaged by current technical restrictions and do not lose out on the exposure generated by search engines.

Finally, I will say a few words about the Government’s consultation launched yesterday, because it exposes a deeply troubling approach to creators’ IP rights, as has already been said so eloquently by the noble Baroness. For months, we have been urged to trust the Government to find the right balance between creators’ rights and AI innovation, yet their concept of balance has now been revealed for what it truly is: an incredibly unfair trade-off that gives away the rights of hundreds of thousands of creators to AI firms in exchange for vague promises of transparency.

Their proposal is built on a fundamentally flawed premise—promoted by tech lobbyists—that there is a lack of clarity in existing copyright law. This is completely untrue: the use of copyrighted content by AI companies without a licence is theft on a mass scale, as has already been said, and there is no objective case for the new text and data-mining exception. What we find in this consultation is a cynical rebranding of the opt-out mechanism as a rights reservation system. While they are positioning this as beneficial for rights holders through potential licensing revenues, the reality is that this is not achievable, yet the Government intend to leave it to Ministers alone to determine what constitutes

“effective, accessible, and widely adopted”

protection measures.

This is deeply concerning, given that no truly feasible rights reservation system for AI has been implemented anywhere in the world. Rights holders have been unequivocal: opt-out mechanisms—whatever the name they are given—are fundamentally unworkable in practice. In today’s digital world, where content can be instantly shared by anyone, creators are left powerless to protect their work. This hits visual artists particularly hard, as they must make their work visible to earn a living.

The evidence from Europe serves as a stark warning: opt-out provisions have failed to protect creators’ rights, forcing the EU to introduce additional transparency requirements in the recent AI Act. Putting it bluntly, simply legalising unauthorised use of creative works cannot be the answer to mass-scale copyright infringement. This is precisely why our proposed measures are crucial: they will maintain the existing copyright framework whereby AI companies must seek licences, while providing meaningful transparency that enables copyright holders to track the use of their work and seek proper redress, rather than blindly repeating proven failures.

The Earl of Clancarty (CB)

My Lords, I speak in support of my noble friend Lady Kidron’s amendments. I declare an interest as a visual artist, and of course visual creators, as my noble friend Lord Freyberg has very well described, are as much affected by this as musicians, journalists and novelists. I am particularly grateful to the Design and Artists Copyright Society and the Authors’ Licensing and Collecting Society for their briefings.

A particular sentence in the excellent briefing for this debate by the News Media Association, referred to by my noble friend Lady Kidron, caught my eye:

“There is no ‘balance’ to be struck between creators’ copyrights and GAI innovation: IP rights are central to GAI innovation”.


This is a crucial point. One might say that data does not grow on a magic data tree. All data originates from somewhere, and that will include data produced creatively. One might also say that such authorship should be seen to precede any interests in use and access. It certainly should not be something tagged on to the end, as an afterthought. I appreciate that the Government will be looking at these things separately, but concerns of copyright should really be part of any Bill where data access is being legislated for. As an example, we are going to be discussing the smart fund a bit later in an amendment proposed by the noble Lord, Lord Bassam, but I can attest to how tricky it was getting that amendment into a Bill that should inherently be accommodating these interests.

18:45
AI of course has huge benefits in other areas, as we have heard this afternoon, not least in the arts and creative industries. The famous example that comes to mind is the “Get Back” documentary on the Beatles, directed by Peter Jackson; but, as Paul McCartney pointed out this week, it is not just the famous and secure who are in danger of having their work scraped. It will also include those at the beginning of their careers, and those who have just enough work to survive, including those fine artists and illustrators who have been engaged in lawsuits in America over precisely these concerns, and whose work in film and animation is threatened. In art and design, we are talking about a huge range of work—everyone from fine artists to bespoke craftspeople and artisans is potentially in the firing line.
A recent survey on AI carried out by the Authors’ Licensing and Collecting Society found that 96% of writers would want remuneration if their work were used to train AI, which is as much of an argument for an opt-in system as any. That is quite apart from the highly respected permission-based copyright standard under current UK law. Moreover, 77% of writers do not know whether their work has been used to train AI. As the ALCS says:
“We need a workable, regulated approach to create systems and data to identify with sufficient specificity the works of individual authors that have been used within GAI systems”.
“Sufficient specificity” is underlined. True transparency, which the creative industries are calling for, must surely mean an opt-in system.
Finally, at the recent All-Party Parliamentary Group for Writers reception, we heard a moving speech by the author Joanne Harris, who made perhaps the most important point. She said that to a lot of the public, as soon as you utter the words “artificial intelligence”, people still think it is science fiction. It is not science fiction. As Joanne Harris and others have pointed out, it is happening now and happening in a big way. The Government need to deal with these concerns both urgently and effectively.
Viscount Colville of Culross (CB)

My Lords, I have been very impressed by the speeches of my noble friends Lady Kidron and Lord Freyberg, so I will be very brief. I declare an interest as a television producer who produces content. I hope that it has not been scraped up by AI machines, but who knows? I support the amendments in this group.

I know that AI is going to solve many problems in our economy and our society. However, in their chase for the holy grail of promoting AI, I join other noble Lords in asking the Government not to push our creative economy under the bus. It is largely made up of SMEs and single content producers, who do not have the money to pursue powerful AI companies to get paid for the use of their content in training their AI models. It is up to noble Lords to help shape regulations that protect our data and copyright laws and can be fully deployed in the defence of the creative economy.

I too have read the Government’s Copyright and Artificial Intelligence consultation paper, published yesterday. The foreword says:

“The proposals include a mechanism for rights holders to reserve their rights”,


which I, like my noble friend Lady Kidron and others, interpret as meaning that creators’ works can be used by AI developers unless they opt out and require licensing for the use of their work. The Government are following the EU example and going for the opt-out model. I think that the European Union is beginning to realise that it is very difficult to make that work, and it brings an unfairness to content producers. Surely, the presumption should be that AI web crawlers should get agreement before using content. The real problem is that content producers do not even know when their content has been used. Even the AI companies sometimes do not know what content has been used. Surely, the opt-out measure is like having your house raided and then asking the burglar what he has taken.

I call on the Minister to work with us to create an opt-in regime. Creators’ works should be used only when already licensed by the AI companies. The companies say they usually do not use content, only data points. Surely that is like saying to a photographer, “We’ve used 99% of the pixels in a picture but not the whole picture”. If even one pixel is used, the photographer needs to know and be compensated.

The small companies and single content producers of our country are the backbone of our economy, as other noble Lords have said. They are threatened by this technology, in which we have placed so much faith. I ask the Minister to respond favourably to Amendments 204, 205 and 206 to ensure that we have fairness between some of the biggest AI players in the world and the hard-pressed people who create content.

Lord Hampton (CB)

My Lords, I support Amendments 204, 205 and 206 in the names of my noble friends Lady Kidron and Lord Freyberg, and of the noble Lords, Lord Stevenson and Lord Clement-Jones, in what rapidly seems to be becoming the Cross-Bench creative club.

I spent 25 years as a professional photographer in London from the late 1980s. When I started, retouchers would retouch negatives and slides by hand, charging £500 an hour. Photoshop stopped that. Professional film labs such as Joe’s Basement and Metro would work 24 hours a day. Snappy Snaps and similar catered for the amateur market. Digital cameras stopped that. Many companies provided art prints, laminating and sundry items for professional portfolios. PDFs and websites stopped that. Many different forms of photography, particularly travel photography, were taken away when picture libraries cornered the market and drove down commissions to unsustainable levels. There were hundreds if not thousands of professional photographers in the country. The smartphone has virtually stopped that.

All these changes were evolution and the result of a world becoming more digitised, but AI web crawlers are different, illegally scraping images without consent or payment then potentially killing the trade of the victim by setting up in competition. This is a parasite, but not in the true sense, because a parasite is careful to keep its victims alive.

Lord Lucas (Con)

My Lords, I very much support these amendments. I declare an interest as an owner of written copyright in the Good Schools Guide and as a father of an illustrator. In both contexts, it is very important that we get intellectual property right, as I think the Government recognised in what they put out yesterday. However, I share the scepticism of those who have spoken as to whether the Government’s ideas can be made to work.

It is really important that we get this straight. For those of us operating at the small end of the scale, IP is under continual threat from established media. I write maybe 10 or a dozen letters a year to large media outfits reminding them where the borders of copyright lie, the latest to the Catholic Herald—it appears not even the 10 commandments have force on them. But what AI can do is a huge measure more difficult to deal with. I can absolutely see, by talking to Copilot, that it has gone through my paywall and absorbed the contents of the Good Schools Guide, but who am I supposed to go after for this? Who has actually done the trespassing? Who is responsible for it? Where is the ownership? It is difficult to enforce copyright, even by writing a polite letter to someone saying, “Please don’t do this”. The Government appear to propose a system of polite letters saying, “Oh dear, it looks as if you might have borrowed my copyright. Please, can you give it back?”

This is not practically enforceable, and it will not result in people who care about IP locating their businesses here. Quite clearly, we do not have ownership of the big AI systems, and it is unlikely that we will have ownership of them—all that will be overseas. What we can do is create IP. If we produce a system where we do not defend the IP that we produce, then fairly rapidly, those IP creators who are capable of being mobile will go elsewhere to places that will defend their IP. It is something that a Government who are interested in growth really ought to be interested in defending. I hope that we will see some real progress in the course of the Bill going through the House.

Lord Clement-Jones (LD)

My Lords, I declare my AI interests as set out in the register. I will speak in support of Amendments 204, 205 and 206, which have been spoken to so inspiringly by the noble Baroness, Lady Kidron, and so well by the noble Lords, Lord Freyberg, Lord Lucas and Lord Hampton, the noble Earl, Lord Clancarty, and the noble Viscount, Lord Colville. Each demonstrated different facets of the issue.

I co-chair the All-Party Group on AI and chaired the AI Select Committee a few years ago. I wrote a book earlier this year on AI regulation, which had a namecheck from the noble Baroness, Lady Jones, at Question Time, for which I was very grateful. Before that, I had a career as an IP lawyer, defending copyright and creativity, and in this House I have been my party’s creative industries spokesperson. The question of IP and the training of generative AI models is a key issue for me.

This is the case not just in the UK but around the world. Getty and the New York Times are suing in the United States, as are many writers, artists and musicians. It was at the root of the Hollywood actors’ and writers’ strikes last year. It is one thing to use the tech—many of us are AI enthusiasts—but it is another to be at the mercy of it.

Close to home, the FT has pointed out, using the index published online by the creator of an unlicensed dataset called Books3, that over 85 books written by 33 Members of the House of Lords have been pirated to train AI models from household names such as Meta, Microsoft and Bloomberg. Although it is absolutely clear that the use of copyrighted works to train AI models without a licence is contrary to UK copyright law, the laws around the transparency of these activities have not caught up. As we have heard, as well as using pirated e-books in their training data, AI developers scrape the internet for valuable professional journalism and other media, in breach of both the terms of service of websites and copyright law, to train commercial AI models. At present, developers can do this without declaring their identity, or they may use IP that was scraped so as to appear in a search index for the completely different commercial purpose of training AI models.

How can rights owners opt out of something that they do not know about? AI developers will often scrape websites or access other pirated material before they launch an LLM in public. This means that there is no way for IP owners to opt out of their material being taken before its inclusion in these models. Once used to train these models, the commercial value, as we have heard, has already been extracted from IP scraped without permission, with no way to delete data from these models.

The next wave of AI models responds to user queries by browsing the web to extract valuable news and information from professional news websites. This is known as retrieval-augmented generation—RAG. Without payment for extracting this commercial value, AI agents built by companies such as Perplexity, Google and Meta will, in effect, free-ride on the professional hard work of journalists, authors and creators. At present, such crawlers are hard to block. There is no market failure; there are well-established licensing solutions. There is no uncertainty around the existing law; the UK is absolutely clear that commercial organisations, including gen AI developers, must license the data that they use to train their large language models.

Here, as the Government’s intentions become clearer, the political, business and creative temperature is rising. Just this week, we have seen the creation of a new campaign across the creative and news industries, the Creative Rights in AI Coalition (CRAIC), and, recently, a statement organised by Ed Newton-Rex gathered more than 30,000 signatories from among creators and creative organisations.

19:00
With the new government consultation, which came out yesterday, we are now faced with a proposal regarding the text and data mining exception that we thought was settled under the last Government. There will be a statement tomorrow and we will no doubt have a second bite at the cherry but, as is echoed in the consultation, both Ministers—the noble Lord, Lord Vallance, and Feryal Clark MP—seem to think that we need a balance between the creative industries and the tech industries. But what kind of balance is this?
As the News Media Association says, the Government’s consultation is based on the mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue: the use of copyrighted content by gen AI firms without a licence is
“theft of copyright on a mass scale”,
and there is no objective case for a new text and data mining exception. Yet the Government are proposing to change the UK’s copyright framework by creating a text and data mining exception where rights holders have not expressly reserved their rights—in other words, an opt-out system, where content is free to use unless a rights holder proactively withholds consent.
To complement this, the Government are proposing transparency provisions and provisions to ensure that rights reservation mechanisms are effective. The Government have stated that they will move ahead with their preferred rights reservation option only if the transparency and rights reservation provisions are
“effective, accessible, and widely adopted”.
This is incredibly concerning, given that no effective rights reservation system for the use of content by gen AI has been proposed or implemented anywhere in the world, as the noble Lord, Lord Freyberg, said, making the Government’s proposals entirely speculative. As the NMA says, what the Government are proposing is an incredibly unfair trade-off, giving the creative industries a vague commitment to transparency while handing the rights of hundreds of thousands of creators to gen AI firms. While creators are desperate for a solution after years of copyright theft by gen AI firms, making a crime legal cannot be the solution to mass theft.
We need transparency and a clear statement about copyright along the lines of these amendments. We absolutely should not expect artists to have to opt out. AI developers must be transparent about the identity and purposes of their crawlers and have separate crawlers for distinct purposes. Unless news publishers and the broader creative industries can retain control over their data, making UK copyright law enforceable, AI firms will be free to scrape the web without remunerating creators. This will not only reduce investment in trusted journalism but ultimately harm innovation in the AI sector.
Lord Faulks (Non-Afl)

The noble Lord has enormous experience in these areas and will be particularly aware of the legal difficulties in enforcing rights. Given what he said, with which I entirely agree—indeed, I agree with all the speakers in supporting these amendments—and given the extraordinary expense of litigating to enforce rights, how does he envisage there being an adequate system to allow those who have had their data scraped in the way that he describes to obtain redress or, rather, suitable remedies?

Lord Clement-Jones (LD)

I thank the noble Lord for that. He is anticipating a paragraph in my notes, which says that, although it is not set out in the amendments, robust enforcement of these provisions will be critical to their success. This includes oversight from an expert regulator that is empowered to issue significant penalties, including fines for non-compliance. There is a little extra work to do there, and I would very much like to see the Intellectual Property Office gain some teeth.

I am going to close. We are nearly at the witching hour, but it is clear that AI developers are seeking to use their lobbying clout—the noble Baroness, Lady Kidron, mentioned the Kool-Aid—to persuade the Government that new copyright law is required. Instead, this amendment would clarify that UK copyright law applies to gen AI developers. The creative industries, and noble Lords from across the House as their supporters, will rally around these amendments and vigorously oppose government plans for a new text and data-mining exception.

Lord Faulks (Non-Afl)

My Lords, I have very little to add because I entirely support all these amendments. I am always concerned when I see the words “lack of clarity” in a context like this. The basic principle of copyright law, whereby one provides a licence and is paid for that licence by agreement, has been well established. There is no need for any further clarity in this context, as in earlier contexts of copyright law.

I should declare an interest as the chairman of IPSO, the regulator of 95% of the printed news media and its online versions. I have been impressed by the News Media Association’s briefings. It has identified important issues. I am extremely concerned about what appears to have been a considerable amount of lobbying by big tech in this area. It reminds me of what took place when your Lordships’ House considered the Digital Markets, Competition and Consumers Bill. A low point for me was when we were told that it would be very difficult to establish a proper system because otherwise Google’s human rights would somehow be infringed. It is extremely important that this so-called balance does not mean that those who create original material protected by the copyright Acts have their rights violated in order to satisfy the interests of big tech.

The Earl of Effingham (Con)

My Lords, my noble friend Lord Camrose apologises to the Committee but he has had to leave early for unavoidable family reasons. Needless to say, he will read Hansard carefully.

It is our belief that a society that fails to value products of the mind will never be an innovative society. We are fortunate to live in that innovative society now and we must fight to ensure it remains one. Data scraping and AI crawlers pose both novel and substantial challenges to copyright protection laws and mechanisms. His Majesty’s Official Opposition are pleased that these amendments have been brought forward to address those challenges, which differ from those posed by traditional search engine crawlers.

Generally speaking, in creating laws about data we have been able to follow a north star of replicating online the values and behaviours we take for granted offline. This was of real service to us in the Online Safety Act, for example. In many ways, however, that breaks down when we come to AI and copyright. Offline, we are happy to accept that an artist, author, musician or inventor has been influenced by existing works in their field. Indeed, we sometimes celebrate that fact, and we have a strong intuitive sense of when influence has crossed the line into copying. This means that we can form an intuitive assessment of whether a copyright has been breached offline based on what creators produce, not what content they have consumed, which we expect to be extensive. With an AI crawler, that intuition and model break down. There are simply too many variables and too much information. We have no choice but to go after the inputs.

With that in mind, it would be helpful to set out the differences between traditional search engine crawlers and AI crawlers. Indexing crawlers used by the search engines we are all familiar with store information in their indexes. This then determines the results of the search. However, AI crawlers generally fall into two categories. The training crawlers scrape the web, collecting data used to train large language models. Live retrieval crawlers pull in live data from the web and incorporate it into chatbot responses.

Historically, the robots exclusion protocol—the plain text file identified as robots.txt—has been embedded into website domains, specifying to crawlers what data they can and cannot access in part or all of the domain. This has been used for the past 30 years to protect information or IP from indexing crawlers. Although the robots exclusion protocol has worked relatively well for many years, in some ways it is not fit for the web as it exists today—especially when dealing with AI crawlers.

To exclude crawlers from websites, we must be able to identify them. This was, for the most part, workable in the early days of the internet when there were relatively few search engines and, correspondingly, few indexing crawlers. However, given the rapidly increasing number of AI services, with their corresponding crawlers trawling the web, it becomes impossible to exclude them all. To make matters worse, some AI crawlers operate in relative secrecy. Their names, which can be viewed through domain holder access logs, reveal little of their purpose.

Furthermore, the robots exclusion protocol is not an enforceable agreement; it is more like a polite request. Based on that, a crawler can simply ignore a robots.txt file and scrape the data anyway. It is also worth noting that, even if a crawler acknowledges and obeys a robots.txt file, the data may be inadvertently scraped from a third-party source who has lifted the data or intellectual property either manually or using a crawler that does not obey robots.txt files. That can then be made available without the protection of the robots exclusion protocol. This raises an unsettling question: how do we protect intellectual property and data more generally from these AI crawlers, whose developers decline the voluntary limitations placed on them?
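To illustrate quite how voluntary the protocol is, here is a minimal sketch using Python’s standard-library robots.txt parser; the site and crawler names are hypothetical. The point is that the check is self-administered: a scraper that skips it, or lies about its name, faces no technical barrier.

```python
from urllib import robotparser

# A well-behaved crawler fetches and parses the site's robots.txt
# before requesting any page.
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # hypothetical site
rp.read()

page = "https://example.com/articles/copyrighted-work"

# can_fetch() answers for whatever user-agent string the caller
# supplies; the crawler polices itself, and nothing in the protocol
# compels it to run this check or to identify itself truthfully.
if rp.can_fetch("ExampleTrainingBot", page):
    print("robots.txt permits fetching:", page)
else:
    print("robots.txt asks crawlers like us not to fetch:", page)
```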

At this point, I turn to the amendments. Amendment 204 is a great initial step toward requiring crawler operators to respect UK copyright law. However, this provision would apply only to products and services of such operators that are marketed in the United Kingdom. What about those from outside the UK? Indeed, as my noble friend Lord Camrose has often argued, any AI lab that does not want to follow our laws can infringe the same copyright with impunity in another jurisdiction. Unless and until we address the offshoring problem, we continue to have real concerns as to the enforceability of any regulations we implement here.

I will address the individual subsections in Amendment 205. Proposed new subsection (1) would require crawlers to reveal their identity, including their name, who is responsible for them, their purpose, who receives their scraped data, and a point of contact. This is an excellent idea, although we are again concerned about enforceability due to offshoring. Proposed new subsection (2) requires this information to be easily accessible. We are sure this would be beneficial, but our concerns remain about infringements in other jurisdictions.

Requiring the deployment of crawlers with distinct purposes in proposed new subsection (3) is an excellent idea as it would allow data controllers to choose what data can be trawled and for what purpose, to the extent possible using the robots exclusion protocol. We do, however, have concerns about proposed new subsection (4). We are not sure how it would be possible for the exclusion of an AI crawler not to impact the findability of content. We assume this could be achieved only if we mandated the continued use of indexing crawlers.

As for Amendment 206, requiring crawler operators to regularly disclose the information scraped from copyrighted sources and make it accessible to copyright holders on their request is an interesting suggestion. We would be curious to hear how this would work in practice, particularly given the vast scale—some of those models crawl billions of documents, generating trillions of tokens. Where would that data be published? Given the scale of data-scraping, how would copyright holders know where to look for this information? If the operator was based outside the UK, how would disclosure be enforced? Our view is that watermarking technology can come to the rescue, dependent of course on an internationally accepted technical standard for machine-readable watermarks that contain licensing information.

19:15
Finally, on the Government’s proposed consultation, we applaud them for their clear effort to make progress on an issue of genuine difficulty. None of this is easy, and it is absolutely right that they should look to propose inventive solutions. His Majesty’s Official Opposition are nevertheless struck by the strength of concern already expressed by the creative sector, even before the consultation has started. Clearly, the opt-out model has not been welcomed. It may be that those worries will be addressed through consultation—it may, for instance, turn out that a lot of the labour-intensive processes behind opt-out can be automated—but so far it is not landing well. In the end, it will come down to enforceability, to which there are considerable technical and jurisdictional barriers. The offshoring problem is a particular case of the latter.
Ultimately, we need to know considerably more about this before Report, so I ask the Minister to write with a detailed technical description of the proposed solution, terms of reference for the consultation exercise and the Government’s plans to drive international adoption of their approach or to adapt their approach based on international proposals.
The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)

As someone who has spent my life creating IP, protecting IP and sometimes giving IP away, I welcome this debate. I am extremely grateful to the noble Baroness, Lady Kidron, for a very thoughtful set of proposals. The fact that many noble Lords have spoken in this debate shows that the rapid development of AI has clearly raised concerns about how to protect the creative industries. The Government take this very seriously. As the noble Lord, Lord Lucas, pointed out, we need to get it right, which is why we have launched a very wide-ranging consultation on a package of interventions to address copyright and AI issues. It is an important first step in an area where the existing situation is clearly not working and we run the risk of many long-lasting court cases, which will not help the situation in which we find ourselves.

We are committed both to supporting human-centred creativity and to the potential of AI to unlock new horizons. Many in the creative industries use AI very widely already. Our goal is to support AI innovation in the UK while maintaining robust protection for creators and our vibrant creative industry. In response to a point that the noble Baroness, Lady Kidron, raised earlier, option 1 in the consultation refers to existing copyright law and asks for views about maintaining and increasing it. The consultation sets out the Government’s objectives for this area and proposes a range of measures on which we are seeking views. Specifically, it aims to support rights-holders to continue to exercise control over the use of their content and their ability to seek remuneration for this. As many noble Lords have pointed out, that has to be made easy and technically feasible. It also promotes greater trust and transparency and proposes mechanisms by which you can see who is looking at the data and what they are doing with it.

Finally, it aims to support the development of world-leading AI models in the UK by ensuring that access can be appropriately wide but, of course, lawful and with the approval of those from whom the material is obtained. This includes the subjects of the noble Baroness’s amendments. The consultation seeks views on technological measures that can provide greater control over access to and use of online material, as well as transparency measures that help copyright owners understand whether their work is being used by AI developers. Again, this needs to be made easy. Various technologies are coming along which can do that, including, as has been said, the watermarking approach.

Much of this needs to be wrapped into an approach to standards. It is important that this is done in a way that is reproducible and reliable. Through this consultation, we will address some of these issues and seek to continue to get input from stakeholders on all of them. We will also work towards internationally interoperable solutions, as raised by many noble Lords, including the noble Lord, Lord Freyberg, and the noble Earl, Lord Effingham.

I agree with the noble Baroness, Lady Kidron, that a vibrant and effective licensing approach—a system that works well and provides access and rights—is important. She asked about an impact assessment. I do not have the information with me now, but I will write. I look forward to updating her on this work in due course and, in the meantime, hope that she is content to withdraw her amendment.

Baroness Kidron (CB)

Does the Minister recognise the characterisation of noble Lords who have said that this is theft? Currently, we have a law and copyright is being taken without consent or remuneration. Does he agree with them that this is what the creative industries and, I presume, some of his community are experiencing?

Lord Vallance of Balham (Lab)

At the moment we have a system where it is unclear what the rights are and how they are being protected, and therefore things are being done which people are unable to get compensation for. We can see that in the court cases going on at the moment. There is uncertainty which needs to be resolved.

Baroness Kidron (CB)

I thank the Minister for his answer and welcome him very much to the Dispatch Box—I have not yet had the pleasure of speaking with him in a debate. I hope he saw the shaking heads when he answered my question about theft and this lack of clarity. If you say “Write me the opening chapter of a Stephen King novel”, and the AI can do it, you can bet your bottom dollar that it has absorbed a Stephen King novel. We know that a lot of this material is in there and that it is not being paid for. That goes for issues big and small.

I understand that it is late and we have more to do—I have more to say on other issues—but I want to reiterate three points. First, creative people are not anti-tech; they just want control over the things they create. AI is a creation on top of a creation, and creative people want to be paid for their efforts and to be in control of them. I am not sure whether I can mention it, because it was in a private meeting, but a brand that many people in most countries will have heard of said: “We need to protect our brand. We mean something. An approximation of us is not us. It is not just the money; it is also the control”.

I also make the point that, earlier this week, Canal+ had its IPO on the London Stock Exchange. I heard the CEO answer the question, “Why is it that Canal+ decided to come and do its IPO in the UK when everybody else is scarpering elsewhere?”, by saying a lot of very warm-hearted things about Paddington Bear, then, “Because you have very good copyright laws”. That is what they said. I just want to mention that.

Finally, I am grateful to the Minister for saying that there is the option of staying with the status quo; I will look at that and try to understand it clearly. However, when he writes about the issue that I raised in terms of opting in or opting out—I am grateful to him for doing so—I would also like an answer about where the Government think the money is going to go. What is the secondary value of the AI companies, which are largely headquartered in the US? Where will the IP, which those companies have already said they want to protect—they did so in their response to the Government’s consultation; I said it in my speech, for anyone who was not listening—go? I would like the Government to say what their plans are, if we lose the £126 billion and the 2.4 million jobs, to replace that money and those jobs, as well as their incredible soft power.

With that, I beg leave to withdraw the amendment.

Amendment 204 withdrawn.
Amendments 205 and 206 not moved.
Amendment 207
Moved by
207: After Clause 132, insert the following new Clause—
“Reliability of computer-based evidence
(1) Electronic evidence produced by or derived from a computer, device or computer system (separately or together “system”) is admissible as evidence in any proceedings—
(a) where that electronic evidence and the reliability of the system that produced it or from which it is derived are not challenged;
(b) where the court is satisfied that the reliability of the system cannot reasonably be challenged;
(c) where the court is satisfied that the electronic evidence is derived from a reliable system.
(2) Rules of Court must provide that electronic evidence sought to be relied upon by a party in any proceedings may be challenged by another party as to its admissibility.
(3) For the purposes of subsection (1)(b), Rules of Court must provide for the circumstances in which the Court may be satisfied that the admissibility of electronic evidence cannot reasonably be challenged.
(4) When determining whether a system is reliable for the purposes of subsection (1)(c), the matters that may be taken into account include—
(a) any instructions or rules of the system that apply to its operation;
(b) any measures taken to secure the integrity of data held on the system;
(c) any measures taken to prevent unauthorised access to and use of the system;
(d) the security of the hardware and software used by the system;
(e) any measures taken to monitor and assess the reliability of the system by the system controller or operator, including steps taken to fix errors or address unexpected outcomes and the regularity and extent of any audit of the system by an independent body;
(f) any assessment of the reliability of the system made by a body with supervisory or regulatory functions;
(g) the provisions of any scheme or industry standard that apply in relation to the system.
(5) For the purposes of this section—
“computer” means any device capable of performing mathematical or logical instructions;
“device” means any apparatus or tool, operating alone or connected to other apparatus or tools, that processes information or data in electronic form;
“electronic evidence” means evidence derived from data contained in or produced by any device the functioning of which depends on a software program, or from data stored on a computer, device or computer system or communicated over a networked computer system.”
Member’s explanatory statement
This amendment overturns the current legal assumption that evidence from computers is always reliable, which has contributed to miscarriages of justice including the Horizon scandal. It enables courts to ask questions of those submitting computer evidence about its reliability.
Baroness Kidron (CB)

My Lords, it is a privilege to introduce Amendment 207. I thank the noble Lords, Lord Arbuthnot and Lord Clement-Jones, and the right reverend Prelate the Bishop of St Albans, who is unfortunately unwell but wanted me to express his support.

I make it clear that, although I may use the Horizon scandal as an example, this amendment is neither focused on nor exclusive to the miscarriage of justice, malevolence and incompetence related to that scandal. It is far broader than that so, when the Minister replies, I really hope that he or she—I am not sure which yet—will not talk about the criminality of the Post Office, as previously, but rather concentrate on the law that contributed to allowing a miscarriage of justice at that scale. That is what this amendment seeks to address.

I explained during debates on the DPDI Bill that, since 1999, courts have applied

“a common law presumption, in both criminal and civil proceedings, of the proper functioning of machines—that is to say”,

the information from the computer can be presumed to be reliable. I went on to say:

“In principle, there is a low threshold for rebutting this presumption but, in practice … a person challenging evidence derived from a computer will typically have no”.—[Official Report, 24/4/24; col. GC 573.]


knowledge of the circumstances in which the system in question was operated, so cannot possibly demonstrate that it failed. As Paul Marshall, the barrister who represented some of the postmasters, explains, this puts the onus on the defendant to explain to the jury the problems they encountered when all they could actually do was point to the shortfalls they had experienced—in the Horizon case, that the cash received did not match the balancing figure on the computer screen. They did not have access to the system or the record of its past failures, and they had no knowledge of what the vulnerabilities were. They only knew that it did not work.

The reality is that anyone who knows the first thing about programming or computer science knows that there are bugs in the system. Indeed, any one of us who has agreed to an update for an app or computer software understands that bug fixing is a common aspect of program maintenance. When I discussed this amendment with a computer scientist of some standing, he offered the opinion that there are likely to be 50 bugs per 1,000 lines of code; many complex systems run to tens of millions of lines of code, which on that estimate implies hundreds of thousands of latent defects in a single system.

Perhaps the most convincing thing of all is looking at software contracts. For the past 20 years at least, a contract has been likely to contain words to this effect: “No warranty is provided that the operation of the software will be uninterrupted or error free, or that all software errors will be corrected”. The same clause applies when we say yes to a new Apple upgrade and sign a EULA—an end-user licence agreement. In plain English, for two decades at least, those who provide software have insisted that computer information is not to be considered reliable. That is written into their commercial agreements, so the fact that computer information is not reliable is agreed by those who know about computer information.

19:30
Similarly, the wrongness of the current legal presumption that computer information is reliable is also widely agreed. It was agreed by the previous Lord Chancellor, Alex Chalk, who promised me that he would look at it. It has been the subject of discussion for several years in the MoJ, which asked Paul Marshall to report on it in 2020 and again in 2021. It has also been pointed out by Lord Justice Fraser, now a judge of the Court of Appeal, that the presumption was not correct. Although my own promised ministerial meeting with the MoJ did not materialise before this Committee, I am sure that the current Lord Chancellor would agree that the existing presumption in law is wrong because it is a presumption that anyone with even the most basic knowledge of computers would consider absurd.
I laid an amendment to the DPDI Bill, based on Section 69 of the PACE Act. Officials and Ministers worried that unscrupulous lawyers would challenge every possible automated thing. They presented the spectre of murderers challenging body cam evidence and a justice system brought to a standstill by smart lawyers for drunk drivers querying whether the breathalyser was reliable. I am no longer sure that this assessment is correct since, in most cases, there would be other evidence that did or did not corroborate, such as witnesses or other officers present, urine samples and blood tests. Some departments have a habit of making a problem so big that we can never solve it.
Given the costs of the Post Office debacle, which currently exceed £1 billion, the level of distress and hardship that it has inflicted, and the reality that it has led and will continue to lead to miscarriages of justice beyond those affected by Horizon, I find it astonishing that the Ministry of Justice has failed to tackle this issue. It is more than five years since Mr Justice Fraser, now Lord Justice Fraser, made it clear that the uncritical admission of evidence in the Horizon case was in itself an injustice, as the burden to say in what way the computer was unreliable fell on the party without access to the system while the party with access had no similar responsibility to reveal what might be unreliable.
Just as the failure to compensate the postmasters adds injury to insult and harm to hurt, so, too, the failure of the MoJ to address a known and continuing injustice adds to a picture in which the court and the Government repeatedly fail to serve the people who depend on them. If we all agree that we have a problem—that the current law is not merely blind to the truth but actively asserts an untruth, from which great injustice flows—that should be a matter of urgent concern.
Amendment 207 is the result of expert advice from external counsel and computer scientists, including Professor Harold Thimbleby. Between them, they have scores of years’ experience looking specifically at the intersection of law and technology. I thank them for their time and dedication to this issue; I will shortly return to their comprehensive view.
Amendment 207 does not speak to the reliability of computers but concentrates entirely on the question of computer evidence put in front of the court, so that the presumption cannot be a cause for further injustice. Proposed new subsection (1) says that computer evidence should be “admissible”—that is, allowed to be relied on by a party in court proceedings—if, first, the other party does not object to the evidence being relied on; secondly, the court considers that no sensible or reasonable objection can be taken to the evidence being relied on, which is to say it being admitted; and, thirdly, there is evidence that the source of the evidence, such as the computer system that produced it, is reliable. Later subsections simply offer guidance for the courts in evaluating what a reliable computer system is.
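Read schematically, the three routes in proposed new subsection (1) are alternatives: any one suffices for admissibility. A minimal sketch of that logic follows; the function and parameter names are illustrative only and form no part of the amendment:

```python
def evidence_admissible(unchallenged: bool,
                        challenge_unreasonable: bool,
                        system_shown_reliable: bool) -> bool:
    """Schematic reading of proposed new subsection (1): admissibility
    follows if any of routes (a), (b) or (c) is made out."""
    return (
        unchallenged                # (a) evidence and system reliability not challenged
        or challenge_unreasonable   # (b) court satisfied reliability cannot reasonably be challenged
        or system_shown_reliable    # (c) court satisfied the evidence derives from a reliable system
    )

# A party challenges the evidence and no reliability has been demonstrated:
print(evidence_admissible(False, False, False))  # -> False: no route to admissibility
```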
The amendment provides protection against computer evidence being relied on where there is no assurance that the computer from which the material is derived is one that functions properly or reliably. Importantly, the provision does not determine that computer evidence should be accepted or given weight by the court; that remains the court’s function in civil trials and the jury’s function in criminal trials. Once admitted to a trial, evidence will be tested in the usual way, with expert witness if necessary.
If this amendment had been in place, the Post Office scandal would have been avoided in some part, possibly for decades—as would the horrific fate of nurses at the Princess of Wales Hospital in Wales who were, in 2012, wrongly accused of falsifying patient records because of discrepancies found with computer records. Some nurses were subjected to criminal prosecution, suffering years of legal action before the trial collapsed when it emerged that a visit by an engineer fixing a bug had erased data that the nurses were accused of failing to gather. If the bar for putting forward evidence as reliable, as set out by the guidance contained in this amendment, had been in place, it would have pointed even the least technical judge towards the fact that there should have been engineering and audit logs highlighting the unauthorised access to, and amendment of, data.
It is often the case that evidence from a computer is part of the evidential picture. Amendment 207 would allow for that. It would give structure to the questions that the court should ask but leave it to the court to weigh those considerations for itself. Once the court has determined the integrity of the evidence, it will be free to consider its contribution to the whole. It follows that the more important or central the data, the more important it is that reliability is assured.
Finally, let me make the observation—sadly, not for the first time in Committee—that, when issues involve the interests of commercial players rather than justice for individuals, it seems that the machinery of government is minded to act. Last year, the Electronic Trade Documents Act 2023 was introduced. The purpose of that legislation was to provide confidence in the integrity of electronic documents relied on in commerce. Of course, it makes absolute sense that you cannot trade if you have no confidence in the integrity of electronic documents, but why has that been given priority over justice in criminal and civil proceedings, even when we know that we are subject to bad law?
In the build-up to the passing of the ETDA, the Law Commission debated for some time the desirability of including guidance for the courts. At first, it decided against. However, following consultation with Lord Justice Fraser, the Law Commission changed its stance, since he urged the commission to include it and said that it would be useful for the courts. Amendment 207 encompasses that very same guidance. It is in our trade law and should be in our courts. This is the very same Lord Justice Fraser who finally broke open the sub-postmasters case and whom the MoJ has studiously ignored in finding a solution on the reliability of evidence. Amendment 207 takes his advice. It mirrors the provision in providing guidance to the court. It is not prescriptive and, given its excellent provenance, I trust that the Government will not find it wanting.
I return to what I said at the outset: this is not about the postmasters. In the last year alone we have seen bugs and problems in banking, air traffic control, supermarket delivery, hospitals, trains and more. As we approach a world of AI and greater reliance on tech, we anticipate greater variations in reliability and more cases coming to the court. However, although the postmasters will not benefit from this change in the law—nor is it the sole example—they illustrate the human cost of failing to act.
The amendment is not the end of the matter. My legal advisers say we also need to overturn the presumption because it is wrong. Directing the court per this amendment is necessary and, in due course, we will also need certification or audit trails for computer information that is depended on for court matters. The amendment is put forward to reboot the conversation that was interrupted by the election.
I hope we will have a ministerial answer from the Dispatch Box that agrees to deal with this issue as a matter of urgency before Report, not one saying it is complicated. We know it is complicated, but for the postmasters, the nurses or anyone else whose life or livelihood has been taken or threatened by a bug, the status quo is unacceptable. Twenty-five years is too long for the law to assert something that is patently false. The MoJ has been looking at this issue in detail for more than five years and I have sought an urgent answer, along with the noble Lord, Lord Arbuthnot, for the past five months. If it is too complicated for the MoJ, I have a group of eminent lawyers and computer scientists who would happily do the task for it. I beg to move.
Lord Arbuthnot of Edrom (Con)

My Lords, I declare my interest as a member of the Horizon Compensation Advisory Board. When, on 24 April this year, the noble Baroness, Lady Kidron, proposed an amendment to remove the presumption about the reliability of computer evidence, the noble Baroness who is now the Minister added her name to it—oh the perils of moving from opposition to government.

My noble friend Lord Camrose—the Minister at the time—in a sympathetic speech, resisted that amendment on the basis, first, that there were shocking failures of professional duty in the Post Office case. This was quite true, but they were facilitated by the existence of the presumption. His second reason was that taking us back to the law of 1999, as the noble Baroness, Lady Kidron, eloquently set out just now, would risk undermining prosecutions because we would need to get certificates of accuracy in cases such as breathalysers and those involving emails. There may have been something in that, so the noble Baroness has proposed an amendment that is designed to get round that second point.

I suspect that the Minister will resist this amendment too, but for reasons that I hope she will set out clearly, because we may then decide to move a different amendment on Report. We are making all the running on this—or at least the noble Baroness, Lady Kidron, is, with my full support and, I know, that of the noble Lord, Lord Clement-Jones. I take a moment to pay tribute to their work ethic in this Committee, which has been quite phenomenal.

The Government do not seem to have the issue quite as close to the top of their priorities as we suggest. Without repeating all that I said on 24 April, I will summarise it as follows. Paul Marshall, the barrister, has pointed out that computer evidence is hearsay, with all the limitations that that implies. Modern computer programs are too large to be exhaustively tested. If computer programs are inherently unreliable, it is wrong to have a presumption that they are reliable. That issue will grow with the growth of artificial intelligence.

The presumption that computer evidence is reliable leads either to such things as we saw occur in the Post Office scandal, with the Post Office essentially taunting the sub-postmasters, saying, “If you can’t show us what is wrong with the computer evidence, we don’t have to show you that evidence”—a shocking case of Catch-22; or to lawyers and courts voluntarily abandoning the presumption and denigrating all computer evidence, whether or not it deserves to be denigrated. That might lead, for example, to some defendants being acquitted when the evidence would require that they be convicted. We are trying to help the Government find a way through a problem that they recognise and assert exists. Will they please give us some help in return? This is both serious and urgent. Just saying that it is very difficult does not begin the process of putting it right.

19:45
Lord Tarassenko (CB)

My Lords, I will speak briefly in support of this amendment. Anyone who has written computer code, and I plead guilty, knows that large software systems are never bug-free. These bugs can arise because of software design errors, human errors in coding or unexpected software interactions for some input data. Every computer scientist or software engineer will readily acknowledge that computer systems have a latent propensity to function incorrectly.

As the noble Baroness, Lady Kidron, has already said, we all regularly experience the phenomenon of bug fixing when we download updates to software products in everyday use—for example, Office 365. These updates include not only new features but patches to fix bugs which have become apparent only in the current version of the software. The legal presumption of the proper functioning of “mechanical instruments” that courts in England and Wales have been applying to computers since 1999 has been shown by the Post Office Horizon IT inquiry to be deeply flawed. The more complex the program, the more likely the occurrences of incorrect functioning, even with modular design. The program at the heart of Fujitsu’s Horizon IT system had tens of millions of lines of code.

The unwillingness of the courts to accept that the Horizon IT system developed for the Post Office was unreliable and lacking in robustness—until the key judgment, which has already been mentioned, by Mr Justice Fraser in 2019—is one of the main reasons why more than 900 sub-postmasters were wrongly prosecuted. The error logs of any computer system make it possible to identify unexpected states in the computer software and hence erroneous system behaviour. Error logs for the Horizon IT system were disclosed only in response to a direction from the court in early 2019. At that point, the records from Fujitsu’s browser-based incident management system revealed 218,000 different error records for the Horizon system.
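As an illustration of the kind of triage that an error log makes possible, the sketch below counts distinct error codes in a log; the log format and code prefix here are invented for illustration, not taken from the Horizon system:

```python
# Illustrative only: counts distinct error codes in an invented log format.
from collections import Counter

log_lines = [
    "2019-01-15 ERROR KEL-0042 balancing discrepancy at counter 3",
    "2019-01-15 ERROR KEL-0042 balancing discrepancy at counter 1",
    "2019-01-16 ERROR KEL-0107 unexpected state after reversal",
]

# The third whitespace-separated field is the error code in this made-up format.
codes = Counter(line.split()[2] for line in log_lines if " ERROR " in line)
print(codes.most_common())  # -> [('KEL-0042', 2), ('KEL-0107', 1)]
```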

For 18 years prior to 2019, the Post Office did not disclose any error log data, documents which are routinely maintained and kept for any computer system of any size and complexity. Existing disclosure arrangements in legal proceedings do not work effectively for computer software, and this amendment concerning the electronic evidence produced by or derived from a computer system seeks to address this issue. The Post Office Horizon IT inquiry finished hearing evidence yesterday, having catalogued a human tragedy of unparalleled scale, one of the most widespread miscarriages of justice in the UK. Whether it is by means of this amendment or otherwise, wrongful prosecutions on the basis that computers always operate properly cannot continue any longer.

The Earl of Erroll (CB)

My Lords, if I may just interject, I have seen this happen not just in the Horizon scandal. Several years ago, the banks were saying that you could not possibly find out someone’s PIN and were therefore refusing to refund people who had had stuff stolen from them. It was not until the late Professor Ross Anderson, of the computer science department at Cambridge University, proved that the banks had been deliberately misidentifying to the courts which counter was actually being read, and explained exactly how the system could be made to default back to a different set of counters, that they eventually had to give way. But they went on lying to the courts for a long time. I am afraid that this is something that keeps happening again and again, and an amendment like this is essential for future justice for innocent people.

Lord Clement-Jones (LD)

My Lords, it is a pity that this debate is taking place so late. I thank the noble Lord, Lord Arbuthnot, for his kind remarks, but my work ethic feels under considerable pressure at this time of night.

All I will say is that this is a much better amendment than the one that the noble Baroness, Lady Kidron, put forward for the Data Protection and Digital Information Bill, and I very strongly support it. Not only is this horrifying in the context of the past Horizon cases, but I read a report about the Capture software, which is likely to have created shortfalls that led to sub-postmasters being prosecuted as well. This is an ongoing issue. The Criminal Cases Review Commission is reviewing five Post Office convictions in which the Capture IT system could be a factor, so we cannot say that this is about just Horizon, as there are the many other cases that the noble Baroness cited.

We need to change this common law presumption even more in the face of a world in which AI use, with all its flaws and hallucinations, is becoming ever present, and we need to do it urgently.

The Earl of Effingham (Con)

My Lords, I thank the noble Baroness, Lady Kidron, for tabling her amendment. We understand its great intentions, which we believe are to prevent another scandal similar to that of Horizon and to protect innocent people from having to endure what thousands of postmasters have undergone and suffered.

However, while this amendment would make it easier to challenge evidence derived from, or produced by, a computer or computer system, we are concerned that, should it become law, this amendment could be misused by defendants to challenge good evidence. Our fear is that, in determining the reliability of such evidence, we may create a battle of the expert witnesses. This will not only substantially slow down trials but result in higher costs. Litigation is already expensive, and we would aim not to introduce additional costs to an already costly process unless absolutely necessary.

From our perspective, the underlying problem in the Horizon scandal was not that computer systems were critically wrong or that people were wrong, but that the two in combination drove the terrible outcomes that we have unfortunately seen. For many industries, regulations require firms to conduct formal systems validation, with serious repercussions and penalties should companies fail to do so. It seems to us that the disciplines of systems validation, if required for other industries, would be both a powerful protection and considerably less disruptive than potentially far-reaching changes to the law.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank the noble Baroness and the noble Lord, Lord Arbuthnot, for Amendment 207 and for raising this important topic. The noble Baroness and other noble Lords are right that this issue goes far wider than Horizon. We could debate what went wrong with Horizon, but the issues before us today are much wider than that.

The Government are agreed that we must prevent future miscarriages of justice. We fully understand the intention behind the amendment and the significance of the issue. We are actively considering this matter and will announce next steps in the new year. I reassure noble Lords that we are on the case with this issue.

In the meantime, as this amendment brings into scope evidence presented in every type of court proceeding and would have a detrimental effect on the courts and prosecution—potentially leading to unnecessary delays and, more importantly, further distress to victims—I must ask the noble Baroness whether she is content to withdraw it at this stage. I ask that on the basis that this is an ongoing discussion that we are happy to have with her.

Baroness Kidron (CB)

I thank the Minister, in particular for understanding that this goes way beyond Horizon. I would be very interested to be involved in those conversations, not because I have the great truth but because I have access to people with the great truth on this issue. In the conversations I have had, there has been so much pushing back. A bit like with our previous group, it would have been better to have been in the conversation before the consultation was announced than after. On that basis, I beg leave to withdraw the amendment.

Amendment 207 withdrawn.
Amendments 208 to 210 not moved.
Amendment 211
Moved by
211: After Clause 132, insert the following new Clause—
“Sovereign data assets
(1) The Secretary of State may by regulations define data sets held by public bodies and arm’s length institutions and other data sets that are held in the public interest as sovereign data assets (defined in subsection (6)).
(2) In selecting data sets which may be designated as sovereign data assets, the Secretary of State must—
(a) have regard to—
(i) the security and privacy of United Kingdom data subjects;
(ii) the ongoing value of the data assets;
(iii) the rights of United Kingdom intellectual property holders;
(iv) ongoing adherence to the values, laws and international obligations of the United Kingdom;
(v) the requirement for public sector employees, researchers, companies and organisations headquartered in the United Kingdom to have preferential terms of access;
(vi) the need for data to be stored in the United Kingdom, preferably in data centres in the United Kingdom;
(vii) the need to design Application Programming Interfaces (APIs) as bridges between each sovereign data asset and the client software of the authorized licence holders;
(b) consult with—
(i) academics with expertise in the field;
(ii) the AI Safety Institute;
(iii) those with responsibility for large public data sets;
(iv) data subjects;
(v) the Information Commissioner.
(3) The Secretary of State must establish a transparent licensing system, fully reflecting the security and privacy of data held on United Kingdom subjects, for use in providing access to sovereign data assets.
(4) The Secretary of State must report annually to Parliament on the ongoing value of the sovereign data assets, in terms of—
(a) their value to future users of the data;
(b) the financial return expected when payment is made for the use of such data in such products and services as may be expected to be developed.
(5) The National Audit Office must review the licensing system established by the Secretary of State under subsection (3) and report annually to Parliament as to its effectiveness in securing the ongoing security of the sovereign data assets.
(6) In this section—
“sovereign data asset” means—
(a) data held by public bodies and arm’s length institutions of government;
(b) data sets held by third parties that volunteer data to form, or contribute to, a public asset.
(7) Regulations under this section are to be made by statutory instrument.
(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by a resolution of each House of Parliament.”
Member’s explanatory statement
The UK has a number of unique publicly held data assets, from NHS data to geospatial data and the BBC’s multimedia data. This amendment would create a special status for data held in the public interest, and a licensing scheme for providing access to them, which upholds UK laws and values and ensures a fair return of financial benefits to the UK.
Baroness Kidron (CB)

My Lords, the good news is that this is the last time I shall speak this evening. Amendment 211 seeks to ensure that the value of our publicly held large datasets is realised for the benefit of UK citizens.

Proposed new subsection (1) gives the Secretary of State the power to designate datasets held by public bodies, arm’s-length institutions or other sets held in the public interest as sovereign data assets.

Proposed new subsection (2) lists a number of requirements that the Secretary of State must have regard to when making a designation. Factors include the security and privacy of UK citizens, the ongoing value of the data assets, the rights of IP holders, the values, laws and international obligations of the UK, the requirement to give preferential access to UK-headquartered companies, organisations and the public sector, the requirement for data to be stored in the UK and the design of application programming interfaces facilitating access to the assets by authorised licence holders. It also sets out stakeholders whom the Secretary of State must consult when considering what datasets to designate as sovereign data assets. We heard in a previous debate that education data might be a good candidate.

Proposed new subsection (3) requires the setting up of a transparent licensing system. Proposed new subsection (4) requires those managing sovereign data assets to report annually on their value and anticipated return to UK subjects. This would include, for example, licence payments, profit share agreements and “in kind” returns, such as access to products or services built using sovereign data assets. Proposed new subsection (5) gives an oversight role to the National Audit Office, proposed new subsection (6) provides a definition, and proposed new subsections (7) and (8) specify that regulations made under the clause are subject to parliamentary approval.

When I raised this issue at Second Reading, the Minister answered positively, in that she felt that what I was suggesting was embodied in the plans for a national data library:

“The national data library will unlock the value of public data assets. It will provide simple, secure and ethical access to our key public data assets for researchers, policymakers and businesses, including those at the frontier of AI development, and make it easier to find, discover and make connections across those … databases. It will sit at the heart of an ambitious programme of reform that delivers the incentives, investment and leadership needed to secure the full benefits for people and the economy”.—[Official Report, 19/11/24; col. 196.]


That is a very valid and positive picture. My comments build on it because, since Second Reading, I have sought details about the national data library. It seems that plans are nascent and that the level of funding, as I understand it, matches neither the ambition set out by the Minister nor what many experts think is necessary. One of my concerns—it will not surprise the Committee to hear, as it has come up a couple of times on previous groups—is that it appears to be a mechanism for facilitating access, rather than for understanding, realising and protecting the value of these public data assets.

In the meantime, announcements of access to public data keep coming. We have worries about Palantir and the drip-feed of deals with OpenAI and Google, the latest of which was reported in the Health Service Journal, which said:

“The national Federated Data Platform will be used to train AI models for future use by the NHS, according to NHS England’s chief data and analytics officer”.


That sounds great, but the article went on to question the basis of the arrangement and the safeguards around it. That is the question.

We in this House already understand the implications of an “adopt now, ask questions later” approach. For example, as reported recently in Computer Weekly, Microsoft has now admitted to Scottish policing bodies that it cannot guarantee the sovereignty of UK policing data hosted on its hyperscale public cloud infrastructure. That is a huge problem for Police Scotland and one that is very likely to be mirrored across government, in a technology that is central to so many departments. The proposed amendment offers a route to ask questions as you adopt technology, not after you have lost control.

20:00
The speed at which the Government are giving access to our data is swifter than the plans to protect its financial or societal value. I think this is something of a theme of this Committee: the Bill does not deal with the needs of IP holders, UK citizens, children or NHS patients, or confront the spectre of AI systems. There is often a conflation by Ministers of the need to access data for medicine, space, museums and other exciting matters with the prosperity it will bring and the savings it will make but, if we look at the deals made so far, the benefit has accrued disproportionately to a handful of US-headquartered companies.
We know that handing public assets to private companies in the hope they will return a public benefit has some flaws. We are still paying for PFIs, while private water companies have consistently prioritised shareholder returns and executive pay over investment in critical infrastructure, at huge cost to the public, our rivers and seas. Thirty years after John Major privatised the railways and their operators, this Government have pledged to return both to public ownership, having seen billions of taxpayer pounds go into private hands. Yet at the same time as they are reclaiming these assets and infrastructure for and on behalf of the UK, they are doing deals that undervalue one of our most valuable national assets: our publicly held data. It is a resource that could, if managed appropriately, bring revenue to our struggling public sector and revolutionise the delivery of public services while reducing spending. Instead, I worry that we give unconditional access to companies that take that learning and turn it into products and services for which we will in the future pay market price and which will generate large profits.
I applaud the productive use of UK data, but for societal goods and as a contributor to national prosperity, not as another leak of control and value to a handful of dominant incumbents. Data is not separate from other modern infrastructure considerations but part of it. I recognise the complexity of making something from the data that we hold, but just like the previous arguments about protecting intellectual property, the new innovations cannot be made without the raw material of data—or, as the noble Lord, Lord Holmes, would have it, our data.
Beyond securing financial returns, the Government’s rush to give access and their failure to consider citizens’ needs is alarming. We need to make sure that the exploitation of our data is on terms that are consistent with our values and has the consent of the people. In a word cloud that was generated based on the latest government polling about AI, one word screamed out from the pack, and that was “scary”. The only words that I could read without glasses were “dangerous”, “concern”, “unsure”, “robot”, “worry”, “nervous”, “confused”, “cautious”, “wary” and “sceptical”, so I am not the only one who sees the cavalier statements of Ministers as a threat to the safety, security and prosperity of the UK. What the word cloud tells us is that there is a disconnect between the Government’s “lean in, move fast, hurt now, fix later” approach and the views of those on whose behalf they govern.
Underlying the Government’s rhetoric is the implication that those who disagree with their strategy and the pace at which they are opening up access have failed to understand the opportunity. It is possible to be excited by AI’s potential and to disagree with the Government’s strategy, because it reflects a failure to recognise that they are being played by the tech companies, whose lobbyists are experts in spreading uncertainty and making regulators and governments feel that they hold all the answers, when those answers are self-serving.
I hope that this is one of several positive suggestions made by noble Lords in Committee that will be treated positively and subject to serious discussion and consideration, rather than summarily dismissed with no thought as to how this will play out in the decades ahead. This is a Bill, an issue and a country that need a sense of purpose; I believe that sovereign data assets could play a part in that. I beg to move.
The Deputy Chairman of Committees (Lord Russell of Liverpool) (CB)

My Lords, before we proceed, I draw to the attention of the Committee that we have a hard stop at 8.45 pm and we have committed to try to finish the Bill this evening. Could noble Lords please speak quickly and, if possible, concisely?

Lord Tarassenko (CB)

My Lords, I support my noble friend Lady Kidron’s Amendment 211, to which I have put my name. I speak not as a technophobe but as a card-carrying technophile. I declare an interest as, for the past 15 years, I have been involved in the development of algorithms to analyse NHS data, mostly from acute NHS trusts. This is possible under current regulations, because all the research projects have received medical research ethics approval, and I hold an honorary contract with the local NHS trust.

This amendment is, in effect, designed to scale up existing provisions and make sure that they are applied to public sector data sources such as NHS data. By classifying such data as sovereign data assets, it would be possible to make it available not only to individual researchers but to industry—UK-based SMEs and pharmaceutical and big tech companies—under controlled conditions. One of these conditions, as indicated by proposed new subsection (6), is to require a business model where income is generated for the relevant UK government department from access fees paid by authorised licence holders. Each government department should ensure that the public sector data it transfers to the national data library is classified as a sovereign data asset, which can then be accessed securely through APIs acting

“as bridges between each sovereign data asset and the client software of the authorized licence holders”.
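As a sketch of what such an API bridge might look like in practice, the following minimal endpoint gates access on a licence key. The framework, endpoint path and licence register here are hypothetical assumptions for illustration, not anything specified in the amendment:

```python
# Hypothetical sketch of an API acting as a bridge between a sovereign data
# asset and the client software of an authorised licence holder.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Stand-in for the transparent licensing system the amendment would require.
LICENSED_KEYS = {"example-licence-key": "UK-headquartered SME Ltd"}

@app.route("/sovereign-data/<asset_id>")
def get_asset(asset_id):
    key = request.headers.get("X-Licence-Key")
    if key not in LICENSED_KEYS:
        abort(403)  # access is refused to anyone without an authorised licence
    # Access fees, audit logging and annual reporting would hang off this point.
    return jsonify({"asset": asset_id, "licensee": LICENSED_KEYS[key]})

if __name__ == "__main__":
    app.run()
```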

In the time available, I will consider the Department of Health and Social Care. The report of the Sudlow review, Uniting the UK’s Health Data: A Huge Opportunity for Society, published last month, sets out what could be achieved through linking multiple NHS data sources. The Academy of Medical Sciences has fully endorsed the report:

“The Sudlow recommendations can make the UK’s health data a truly national asset, improving both patient care and driving economic development”.


There is little difference, if any, between health data being “a truly national asset” and “a sovereign asset”.

Generative AI has the potential to extract clinical value from linked datasets in the various secure data environments within the NHS and to deliver a step change in patient care. It also has the potential to deliver economic value, as the application of AI models to these rich, multimodal datasets will lead to innovative software products being developed for early diagnosis and personalised treatment.

However, it seems that the rush to generate economic value is preceding the establishment of a transparent licensing system, as in proposed new subsection (3), and the setting up of a coherent business model, as in proposed new subsection (6). As my noble friend Lady Kidron pointed out, the provisions in this amendment are urgently needed, especially as the chief data and analytics officer at NHS England is reported as having said, at a recent event organised by the Health Service Journal and IBM, that the national federated data platform will soon be used to train different types of AI model. The two models mentioned in the speech were OpenAI’s proprietary ChatGPT model and Google’s medical AI, which is based on its proprietary large language model, Gemini. So, the patient data in the national federated data platform being built by Palantir, which is a US company, is, in effect, being made available to fine-tune large language models pretrained by OpenAI and Google—two big US tech companies.

As a recent editorial in the British Medical Journal argued:

“This risks leaving the NHS vulnerable to exploitation by private technology companies whose offers to ‘assist’ with infrastructure development could result in loss of control over valuable public assets”.


It is vital for the health of the UK public sector that there is no loss of control resulting from premature agreements with big tech companies. These US companies seek privileged access to highly valuable assets which consist of personal data collected from UK citizens. The Government must, as a high priority, determine the rules for access to these sovereign data assets along the lines outlined in this amendment. I urge the Minister to take on board both the aims and the practicalities of this amendment before any damaging loss of control.

Lord Freyberg (CB)

My Lords, I support Amendment 211 moved by my noble friend Lady Kidron, which builds on earlier contributions in this place made by the noble Lords, Lord Mitchell, Lord Stevenson, Lord Clement-Jones, and myself, as long ago as 2018, about the need to maximise the social, economic and environmental value that may be derived from personal data of national significance and, in particular, data controlled by our NHS.

The proposed definition of “sovereign data assets” is, in some sense, broad. However, the intent to recognise, protect and maximise their value in the public interest is readily inferred. The call for a transparent licensing regime to provide access to such assets and the mention of preferential access for individuals and organisations headquartered in the UK also make good sense, as the overarching aim is to build and maintain public trust in third-party data usage.

Crucially, I fully support provisions that would require the Secretary of State to report on the value and anticipated financial return from sovereign data assets. Identifying a public body that considered itself able or willing to guarantee value for money proved challenging when this topic was last explored. For too long, past Governments have dithered and delayed over the introduction of provisions that explicitly recognise the need to account for and safeguard the investment made by taxpayers in data held by public and arm’s-length institutions and associated data infrastructure—something that we do as a matter of course where the tangible assets that the National Audit Office monitors and reports on are concerned.

In recent weeks, the Chancellor of the Exchequer has emphasised the importance of recovering public funds “lost” during the Covid-19 pandemic. Yet this focus raises important questions about other potential revenue streams that were overlooked, particularly regarding NHS data assets. In 2019, Ernst & Young estimated that a curated NHS dataset could generate up to £5 billion annually for the UK while also delivering £4.6 billion in yearly patient benefits through improved data infrastructure. This raises the question: who is tracking whether these substantial economic and healthcare opportunities are being realised? Who is ensuring that these projected benefits—both financial and clinical—are actually flowing back into our healthcare system?

As we enter the age of AI, public discourse often fixates on potential risks while overlooking a crucial opportunity—namely, the rapidly increasing value of publicly controlled data and its potential to drive innovation and insights. This raises two crucial questions. First, how might we capitalise on the upside of this technological revolution to maximise the benefits on behalf of the public? Secondly, and more specifically, how will Parliament effectively scrutinise any eventual trade deal entered into with, for example, the United States of America, which might focus on a more limited digital chapter, in the absence of either an accepted valuation methodology or a transparent licensing system for use in providing access to valuable UK data assets?

Will the public, faced with a significant tax burden to improve public services and repeated reminders of the potential for data and technology to transform our NHS, trust the Government if they enable valuable digital assets to be stripped today only to be turned tomorrow into cutting-edge treatments that we can ill afford to purchase and that benefit companies paying taxes overseas? To my mind, there remains a very real risk that the UK, as my noble friend Lady Kidron, rightly stated, will inadvertently give away potentially valuable digital assets without there being appropriate safeguards in place. I therefore welcome the intent of Amendment 211 to put that right in the public interest.

20:15
Lord Lucas (Con)

My Lords, having a system such as this would really focus the public sector on how we can generate more datasets. As I said earlier, education is an obvious one, but so is mobile phone data. All these companies have their licences. If a condition of the licence were that the data on how people move around the UK became a public asset, that would be hugely beneficial to policy formation. If we really understood how, why and when people move, we would make much better decisions. We could save ourselves huge amounts of money. We really ought to have this as a deep focus of government policy.

Lord Clement-Jones (LD)

My Lords, I have far too little time to do justice to this subject. We on these Benches welcome this amendment. It is entirely consistent with the sovereign health fund proposed by Future Care Capital and, indeed, with the proposals from the Tony Blair Institute for Global Change on a similar concept called the national data trust. Indeed, this concept formed part of our Liberal Democrat manifesto at the last general election, so of course I support the amendment.

It would be very useful to hear more about the national data library, including on its purpose and operation, as the noble Baroness, Lady Kidron, said. I entirely agree with her that there is a great need for a sovereign cloud service or services. Indeed, the inability to guarantee that data on the cloud is held in this country is a real issue that has not yet been properly addressed.

The Earl of Effingham (Con)

My Lords, I thank the noble Baroness, Lady Kidron, for moving this amendment. As she rightly identified, the UK has a number of publicly held data assets, many of which contain extremely valuable information. This data—I flag, by way of an example, NHS data specifically—could be extremely valuable to certain organisations, such as pharmaceutical companies.

We are drawn to the idea of licensing such data—indeed, we believe that we could charge an extremely good price—but we have a number of concerns. Most notably, what additional safeguards would be required, given its sensitivity? What would be the limits and extent of the licensing agreement? Would this status close off other routes to monetising the data? Would other public sector bodies be able to use the data for free? Can this not already be done without the amendment?

Although His Majesty’s Official Opposition of course recognise the wish to ensure that the UK taxpayer gets a fair return on our information assets held by public bodies and arm’s-length organisations, and we certainly agree that we need to look at licensing, we are not yet sure that this amendment is either necessary or sufficient. We once again thank the noble Baroness, Lady Kidron, for moving it. We look forward to hearing both her and the Minister’s thoughts on the matter.

Baroness Jones of Whitchurch (Lab)

My Lords, I am grateful to the noble Baroness, Lady Kidron, for her amendment. I agree with her that the public sector has a wealth of data assets that could be used to help our society achieve our missions and contribute to economic growth.

As well as my previous comments on the national data library, the Government’s recent Green Paper, Invest 2035: The UK’s Modern Industrial Strategy, makes it clear that we consider data access part of the modern business environment, so improving data access is integral to the UK’s approach to growth. However, we also recognise the value of our data assets as part of this approach. At the same time, it is critical that we use our data assets in a trustworthy and ethical way, as the noble Baroness, Lady Kidron, and the noble Lord, Lord Tarassenko, said, so we must tackle these issues carefully.

This is an active area of policy development for the Government, and we need to get it right. I must therefore ask the noble Baroness to withdraw her amendment. However, she has started a debate that will, I hope, carry on; we would be happy to engage in it going forward.

Baroness Kidron (CB)

I thank all speakers, in particular my noble friend Lord Tarassenko for his perspective. I am very happy to discuss this matter and let the Official Opposition know that this is a route to something more substantive to which they can agree. I beg leave to withdraw my amendment.

Amendment 211 withdrawn.
Amendment 211A not moved.
The Deputy Chairman of Committees (Lord Russell of Liverpool) (CB)

My Lords, before we move on to the next group, I again remind noble Lords that we have in fact only two groups to get through because Amendment 212 will not be moved. We have about 25 minutes to get through those two groups.

Amendment 211B

Moved by
211B: After Clause 132, insert the following new Clause—
“Consultation: data centre power usage
On the day on which this Act is passed, the Secretary of State must launch a consultation on the implications of the provisions in this Act for the power usage and energy efficiency of data centres.”
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The reason for this group of amendments is to try to get an increased focus on the range of issues they touch on.

I turn to Amendment 211B first. It seems at least curious to have a data Bill without talking about data centres: their power usage, their environmental impact and the Government’s view of the current PUE—power usage effectiveness—standard. Do the Government think that standard gives the right measure of confidence to consumers and citizens across the country about how data centres are being operated and their impacts?
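For reference, PUE is simply the ratio of a data centre’s total facility energy to the energy consumed by its IT equipment alone, with 1.0 as the theoretical ideal; the figures in this sketch are invented for illustration:

```python
# PUE = total facility energy / IT equipment energy; 1.0 is the theoretical ideal.
# The kWh figures below are invented for illustration.
total_facility_kwh = 1_500_000
it_equipment_kwh = 1_000_000

pue = total_facility_kwh / it_equipment_kwh
print(f"PUE = {pue:.2f}")  # -> 1.50: for every unit of IT energy, half a unit
                           # more goes on cooling, power distribution and the like
```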

Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject, but you have to go only one or two steps back in any supply chain to reach real depths of opacity. With this amendment, I am seeking to gain more clarity on data supply chains and the role of data across all supply chains. Through the combination of data and AI, we could potentially enable a transformation of our supply chains in real time, which would give us so much more flexibility to secure economic and environmental benefits. I look forward to the Minister’s response.

I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into standards and approaches in this area, which would potentially go some way towards conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.

I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.

Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.

Lord Clement-Jones (LD)

My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on that. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes, the actions of the Government seem to go contrary to that. I support the noble Lord, Lord Holmes, in his call for essentially realising the benefits of AI while making sure that we maintain public trust.

The Earl of Effingham (Con)

My Lords, I thank my noble friend Lord Holmes of Richmond for tabling this amendment. As we all appreciate, taking stock of the effects of legislation is critical, as it allows us to see what has worked and what has not. Amendment 211B would require the Secretary of State to launch a consultation into the implications of the provisions of the Bill on the power usage and energy efficiency of data centres. His Majesty’s Official Opposition have no objection to the amendment’s aims, but we wonder to what extent it is actually possible. By what means or benchmark can we identify whether a spike in energy usage is specifically due to a provision of this legislation, rather than the result of some other factor? I should be most grateful if my noble friend could provide further detail on this matter in his closing speech.

Regarding Amendment 211C, we understand that much could be learned from a review of all data regulations and standards pertaining to the supply chains for financial, trade, and legal documents and products, although we wonder if this needs to happen the moment this Bill passes. Could this review not happen at any stage? By all means, let us do it sooner rather than later, but is it necessary to set a date in statute?

Moving on to Amendment 211D, we should certainly look to regulate the AI large language model sector to ensure that there are standards for the input and output of data for LLMs. However, this must be done in a way that does not stifle growth in this emerging industry.

Finally, we have some concerns about Amendment 211E. A national consultation on the use of individuals’ data is perhaps just too broad.

Baroness Jones of Whitchurch (Lab)

My Lords, I am grateful to the noble Lord, Lord Holmes, for tabling Amendment 211B and his other amendments in this group, which are on a range of varied and important issues. Given the hour, I hope he will be content if I promise to write to him on each of these issues and, in the meantime, I ask him to withdraw the amendment.

Lord Holmes of Richmond (Con)

I thank all noble Lords who participated: I will not go through them by name. I thank the Minister for her response and would very much welcome a letter. I am happy to meet her on all these subjects but, for now, I beg leave to withdraw the amendment.

Amendment 211B withdrawn.
Amendments 211C to 211E not moved.
Amendment 211F
Moved by
211F: After Clause 132, insert the following new Clause—
“Local Environmental Records Centres (“LERCs”)
(1) Any planning application involving biodiversity net gain must include a data search report from the relevant Local Environmental Records Centre (LERC), and all data from biodiversity surveys conducted in connection with the application must be contributed free of charge to the LERC in record-centre-ready format.
(2) All government departments and governmental organisations, local and national, that collect biodiversity data for whatever reason, must contribute it free of charge to the relevant LERCs in record-centre-ready format, and must include relevant LERC data in formulating policy and operational plans.”
Member’s explanatory statement
This amendment ensures that all the biodiversity data collected by or in connection with government is collected in Local Environmental Records Centres, so that records are as good as possible, and that the data is then used by or in connection with government, so that it is put to the best possible use.
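The amendment does not define “record-centre-ready format”, but biodiversity records conventionally carry at least the what, where, when and who of an observation. A hypothetical example of one such record follows; the field names are illustrative, not drawn from any actual LERC specification:

```python
# Hypothetical biodiversity record illustrating the conventional core fields;
# the field names are not taken from any actual LERC schema.
record = {
    "species": "Lutra lutra",        # what: Eurasian otter
    "grid_reference": "TQ 123 456",  # where: OS grid reference
    "date": "2024-06-01",            # when: ISO date of the observation
    "recorder": "J. Smith",          # who made the record
    "survey": "Biodiversity net gain baseline survey",
}
print(record)
```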
Lord Lucas (Con)

My Lords, environmental data, specifically such things as biodiversity data, is key to getting policy in this area right. To do so, we need to make sure that all the good data we are generating around the UK gets into our storage system, and that the best possible and most complete data is used whenever we make decisions.

We currently run that through a system of local environmental records centres that are independent and not for profit. Since that is the system we have, it ought to be run right. At the moment, we are failing to capture a lot of quality data because the data is not coming in from the planning system, or from other similar functions, in the way that it should. We are not consistently using that data in planning as we should. Natural England, which ought to be intimately linked into this system, has stepped away from it for budgetary reasons. The environment is important to us. If the Government are serious about that, we have to get our data collection and use system right. I beg to move.

20:30
Lord Clement-Jones (LD)

My Lords, listening to the noble Lord, Lord Lucas, is often an education, and today is no exception. I had no idea what local environmental records centres were, so I shall be very interested to hear what the Minister has to say in response.

The Earl of Effingham (Con)

My Lords, I thank my noble friend Lord Lucas for tabling Amendment 211F and all noble Lords for their brief contributions to this group.

Amendment 211F would ensure that all the biodiversity data collected by or in connection with government is held in local environmental records centres, so that records are as good as possible. That data would then be used by or in connection with government, so that it is put to the best possible use.

The importance of sufficient and high-quality record collection cannot be overstated. With this in mind, His Majesty’s Official Opposition support the sentiment of the amendment in my noble friend’s name. These Benches will always champion matters related to biodiversity and nature recovery. In fact, many of my noble friends have raised concerns about biodiversity in Committee debates in your Lordships’ House on the Crown Estate Bill, the Water (Special Measures) Bill and the Great British Energy Bill. Indeed, they have tabled amendments that ensure that matters related to biodiversity appear at the forefront of draft legislation.

With that in mind, I am grateful to my noble friend Lord Lucas for introducing provisions, via Amendment 211F, which would require any planning application involving biodiversity net gain to include a data search report from the relevant local environmental records centre. I trust that the Minister has listened to the concerns raised collaboratively in the debate on this brief group. We must recognise the importance of good data collection and ensure that such data is used in the best possible way.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank the noble Lord, Lord Lucas, for his Amendment 211F. I absolutely agree that local environmental records centres provide an important service. I reassure noble Lords that the Government’s digital planning programme is developing data standards and tools to increase the availability, accessibility and usability of planning data. This will transform people’s experience of planning and housing, including through local environmental records centres. On that basis, I must ask the noble Lord whether he is prepared to withdraw his amendment.

Lord Lucas (Con)

My Lords, I am grateful for that extensive answer from the Minister. If there is anything further that I hope she might add, I will write to her afterwards.

My heart is always in the cause of making sure that the Government get their business done on time every time, and that we finish Committee stages when they ask, as doubtless they will discover with some of the other Bills they have in this Session. For now, I beg leave to withdraw my amendment.

Amendment 211F withdrawn.
Amendments 211G and 211H not moved.
Clause 133: Power to make consequential amendments
Amendment 212 not moved.
Clause 133 agreed.
Clause 134 agreed.
Clause 135: Extent
Amendments 213 and 214
Moved by
213: Clause 135, page 168, line 26, at end insert—
“(5A) The power conferred by section 63(3) of the Immigration, Asylum and Nationality Act 2006 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment made by section 55 of this Act of any part of that Act (with or without modification or adaptation).
(5B) The power conferred by section 76(6) of the Immigration Act 2014 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment made by section 55 of this Act of any part of that Act (with or without modifications).
(5C) The power conferred by section 95(5) of the Immigration Act 2016 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment made by section 55 of this Act of any part of that Act (with or without modifications).”
Member's explanatory statement
The immigration legislation amended by Clause 55 may be extended to the Channel Islands or the Isle of Man. This amendment provides that the amendments made by Clause 55 may be extended to the Bailiwick of Guernsey or the Isle of Man.
214: Clause 135, page 168, line 26, at end insert—
“(5A) The power conferred by section 239(7) of the Online Safety Act 2023 may be exercised so as to extend to the Bailiwick of Guernsey or the Isle of Man any amendment or repeal made by this Act of any part of that Act (with or without modifications).”
Member's explanatory statement
This amendment provides that amendments of the Online Safety Act 2023 made by the Bill (see Clauses 122 and 123) may, like the other provisions of that Act, be extended to the Bailiwick of Guernsey or the Isle of Man.
Amendments 213 and 214 agreed.
Clause 135, as amended, agreed.
Clauses 136 to 138 agreed.
The Deputy Chairman of Committees (Lord Russell of Liverpool) (CB)

That concludes the Committee’s proceedings on the Bill. I thank all noble Lords who have participated for being so co-operative.

Bill reported with amendments.
Committee adjourned at 8.35 pm.

Data (Use and Access) Bill [HL]

Report (1st Day)
Relevant documents: 3rd Report from the Constitution Committee, 9th and 12th Reports from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
16:14
Report received.
Clause 1: Customer data and business data
Amendment 1
Moved by
1: Clause 1, page 3, line 11, at end insert—
“(5A) In subsection (2), references to information includes inferred data.”
Member's explanatory statement
This amendment ensures that, when traders are required to provide information relating to goods, services and digital content supplied or provided to the customer, that information includes information that has been created using AI to build a profile about them.
Baroness Kidron (CB)

My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.

I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes, over the coming months. I reiterate the question asked by many noble Lords in Committee: if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.

Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.
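
To illustrate the distinction in the simplest terms, the sketch below (in Python; the field names and the disclosable_data helper are hypothetical illustrations, not terms from the Bill or the amendment) treats a customer profile as two halves, one supplied by the customer and one inferred by AI, with both halves disclosable:

    # A minimal sketch, assuming hypothetical field names throughout.
    customer_record = {
        "provided": {                      # supplied directly by the customer
            "email": "customer@example.com",
            "purchases": ["kettle", "toaster"],
        },
        "inferred": {                      # produced by AI profiling: a guess
            "price_sensitivity": "high",
            "likely_household_size": 3,
        },
    }

    def disclosable_data(record: dict) -> dict:
        # Under the amendment, disclosure would cover both halves of the
        # profile, not only the data the customer knowingly handed over.
        return {**record["provided"], **record["inferred"]}

    print(disclosable_data(customer_record))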

The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.

The issue of how the data Bill fails to address AI—and how the AI Opportunities Action Plan, and the government response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.

Lord Clement-Jones (LD)

My Lords, this is clearly box-office material, as ever.

I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.

Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals, going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language; some of the Government’s statements perhaps could have been written by Boris Johnson. Nevertheless, the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.

Viscount Camrose (Con)

I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.

I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers that are public authorities and therefore branches of the state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals, bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?

I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?

I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.

The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)

I thank the noble Baroness, Lady Kidron, and the noble Viscount, Lord Camrose, for their proposed amendments and continued interest in Part 1 of this Bill. I hope I can reassure the noble Baroness that the definition of customer data is purposefully broad. It encompasses information relating to a customer or a trader and the Government consider that this would indeed include inferred data. The specific data to be disclosed under a smart data scheme will be determined in the context of that scheme and I reassure the noble Baroness that there will be appropriate consultation before a smart data scheme is introduced.

I turn to Amendment 5. Clause 13 provides statutory authority for the Secretary of State or the Treasury to give financial assistance to decision-makers, enforcers and others for the purpose of meeting any expense in the exercise of their functions in the smart data schemes. Existing and trusted bodies such as sector regulators will likely be in the lead of the delivery of new schemes. These bodies will act as decision-makers and enforcers. It is intended that smart data schemes will be self-financing through the fees and levies produced by Clauses 11 and 12. However, because of the nature of the bodies that are involved, it is deemed appropriate for there to be a statutory spending authority as a backstop provision if that is necessary. Any spending commitment of resources will, of course, be subject to the usual estimates process and to existing public sector spending controls and transparency requirements.

I hope that with this brief explanation of the types of bodies involved, and the other explanations, the noble Baroness will be content to withdraw Amendment 1 and that noble Lords will not press Amendment 5.

Baroness Kidron (CB)

I thank the Minister for his reassurance, particularly that we will have an opportunity for a consultation on exactly how the smart data scheme works. I look forward to such agreement throughout the afternoon. With that, I beg leave to withdraw my amendment.

Amendment 1 withdrawn.
Clause 2: Power to make provision in connection with customer data
Amendment 2
Moved by
2: Clause 2, page 4, line 1, after “to” insert “the customer's data rights or”
Member's explanatory statement
This amendment adds enacting data rights to the list of actions that the Secretary of State or the Treasury can enable an “authorised person” to take on behalf of customers. This would make it possible for customers to assign their data rights to a third party to activate on their behalf.
Baroness Kidron (CB)

My Lords, in moving Amendment 2 I will speak to Amendments 3, 4, 25, 42 and 43, all of which are in my name and that of the noble Lord, Lord Clement-Jones. The very detailed arguments for Amendments 25, 42 and 43 were made during the passage of the DPDI Bill and can be found at col. GC 89 of vol. 837 of Hansard, and the subsequent arguments for their inclusion in this Bill were made in Committee at col. GC 454. For that reason, I do not propose to make them in full again. I simply say that these amendments for data communities represent a more ambitious and optimistic view of the Bill, one that would empower citizens to use data law to benefit those with common interests. The example I gave last time was of gig workers assigning their data rights to an expert third party to see whether they were being fairly compensated. That is not something that any individual data subject can easily do alone.

The new Amendments 2, 3 and 4 demonstrate how the concept of data communities might work in relation to the Government’s smart data scheme. Amendment 2 would add enacting data rights to the list of actions that the Secretary of State or the Treasury can enable an authorised person to take on behalf of customers. Amendment 3 requires the Secretary of State or the Treasury to include data communities in the list of those who would be able to activate rights, including data rights on a customer’s behalf. Amendment 4 provides a definition of “data communities”.

Data communities are a process by which one data holder can assign their rights for a given purpose to a community of people who agree with that purpose. I share the Government’s desire to empower consumers and to promote innovation, and these amendments would do just that. Allowing the sharing of data rights of individuals, as opposed to specific categories of data, would strengthen the existing proposal and provide economic and social benefit to the UK and its citizens, rather than imagining that the third party is always a commercial entity.
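
A minimal sketch of the mechanism, assuming entirely hypothetical names (DataCommunity, assign_right and the example community are illustrations, not anything drafted in the amendments), shows an individual delegating a named data right, for a stated purpose, to a community that exercises it collectively:

    from dataclasses import dataclass, field

    @dataclass
    class DataCommunity:
        name: str
        purpose: str
        assignments: list = field(default_factory=list)

        def assign_right(self, member: str, right: str) -> dict:
            # Record that this member has delegated a specific data right to
            # the community, limited to the community's stated purpose.
            assignment = {"member": member, "right": right, "purpose": self.purpose}
            self.assignments.append(assignment)
            return assignment

    drivers = DataCommunity(
        name="Gig Drivers Collective",
        purpose="audit pay data for fair compensation",
    )
    drivers.assign_right("driver_001", "right of access")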

In response to these amendments in Committee, the then Minister said two things. The first was that the UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. She also warmly said that something of this kind was being planned by government and invited me and other noble Lords to discuss this area further. I made it clear that I would like such a meeting, but it has only just been scheduled and is planned for next week, which clearly does not meet the needs of the House, since we are discussing this today. I would be grateful if the current Minister could undertake to bring something on this subject back at Third Reading if we are not reassured by what we hear at the meeting.

While the UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf, in the example I gave the Minister in Committee it took many years and a bespoke agreement between the ICO and Uber for the 300-plus drivers to combine their data. Under equivalent GDPR provisions in European law, it required a Court of Appeal judgment in Norway before Uber conceded that the third party was entitled to the data on the drivers’ behalf. A right that cannot be activated without legal action and years of effort is not a right fully given; the UK GDPR is not sufficient in these circumstances.

I want to stress that these amendments are not simply about contesting wrongs. Introducing the concept of data communities would facilitate innovation and promote fairness, which is surely an aim of the legislation.

16:30
Before I sit down, I want to acknowledge that the AI action plan recommends in many places making it easier for organisations, including commercial companies, to access datasets, but it is silent on how citizens might be able to access and share their data collectively. Instead, it appears to assume that data mining is something that will happen to them, rather than by them or on their behalf. Matt Clifford, its author, is an AI tech investor. While there is much on which to agree with him when it comes to skills or investment in infrastructure, the relentless tech sector viewpoint, rather than that of worker, creator, citizen or child, is a weakness in itself and a problem in its timing. Those of us who would most like to be supportive of the UK as a tech-enabled nation find the needs of our communities and fellow citizens unserved by this unbridled tech utopianism, which both recent history and some of the sector’s greatest innovators would suggest is very unwise. I beg to move.
Lord Clement-Jones (LD)

My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.

A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.

There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; they can promote innovation by facilitating data-sharing; and they can promote innovation in the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.

It is high time we moved forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from them.

Viscount Camrose (Con)

I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party to act on the group’s behalf in their data use.

There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.

I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.

Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.

Lord in Waiting/Government Whip (Lord Leong) (Lab)

My Lords, I rise to speak to Amendments 2, 3, 4, 25, 42 and 43. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for these amendments on data communities, which were previously tabled in Committee, and for the new clauses linking these with the Bill’s clauses on smart data.

As my noble friend Lady Jones noted in Committee, the Government support giving individuals greater agency over their data. The Government are strongly supportive of a robust regime of data subject rights and believe strongly in the opportunity presented by data for innovation and economic growth. UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. Stakeholders have, however, said that there may be barriers to this in practice.

I reassure noble Lords that the Government are actively exploring how we can support data intermediaries while maintaining the highest data protection standards. It is our intention to publish a call for evidence in the coming weeks on the activities of data intermediaries and the exercise of data subject rights by third parties. This will enable us to ensure that the policy settings on this topic are right.

In the context of smart data specifically, Part 1 of the Bill does not limit who the regulations may allow customers to authorise. Bearing in mind the IT and security-related requirements inherent in smart data schemes, provisions on who a customer may authorise are best determined in the context of a specific scheme, when the regulations are made following appropriate consultation. I hope to provide some additional reassurance that exercise of the smart data powers is subject to data protection legislation and does not displace data rights under that legislation.

There will be appropriate consultation, including with the Information Commissioner’s Office, before smart data schemes are introduced. This year, the Department for Business and Trade will be publishing a strategy on future uses of these powers.

While the smart data schemes and digital verification services are initial examples of government action to facilitate data portability and innovative uses of data, my noble friend Lady Jones previously offered a meeting with officials and the noble Baroness, Lady Kidron, to discuss these proposals, which I know my officials have arranged for next week—as the noble Baroness indicated earlier. I hope she is therefore content to withdraw her amendment.

Baroness Kidron (CB)

Before the Minister sits down, may I ask whether there is a definition of “customer” and whether that includes a user in the broader sense, or means worker or any citizen? Is it a customer relationship?

Lord Leong (Lab)

My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.

Lord Clement-Jones (LD)

Again before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?

Lord Leong (Lab)

I thank the noble Lord for that request, and I am sure my officials would be willing to do that.

Baroness Kidron (CB)

My Lords, I do not intend to detain the House on this for very long, but I want to say that holding meetings after the discussion on Report is not adequate. “Certain rights” and “customer” are exactly the sort of terms that I am trying to address here. To the noble Viscount—and my noble friend—Lord Camrose, I say that it is not adequate, and we have an academic history going back a long way. I hope that the meeting next week is fruitful and that the Government’s enthusiasm for this benefits workers, citizens and customers. I beg leave to withdraw the amendment.

Amendment 2 withdrawn.
Clause 3: Customer data: supplementary
Amendments 3 and 4 not moved.
Clause 13: Financial assistance
Amendment 5 not moved.
Clause 28: DVS trust framework
Amendment 6
Moved by
6: Clause 28, page 30, line 28, at end insert—
“(2A) In preparing the DVS trust framework the Secretary of State must assess whether the public authorities listed in subsection (2B) reliably ascertain the personal data attributes that they collect, record and share.
(2B) The public authorities are—
(a) HM Passport Office;
(b) Driver and Vehicle Licensing Agency;
(c) General Register Office;
(d) National Records Office;
(e) General Register Office for Northern Ireland;
(f) NHS Personal Demographics Service;
(g) NHS Scotland;
(h) NI Health Service Executive;
(i) Home Office Online immigration status (eVisa);
(j) Disclosure and Barring Service;
(k) Disclosure Scotland;
(l) Nidirect (AccessNI);
(m) HM Revenue and Customs;
(n) Welsh Revenue Authority;
(o) Revenue Scotland.”
Member's explanatory statement
This amendment is to ensure that there is oversight of whether the public authorities that provide core identity information via the information gateway provide accurate and reliable information.
Lord Lucas (Con)

My Lords, in moving Amendment 6 in my name I will also speak to Amendment 8. This section of the Bill deals with digital verification services, and the root word there is verify/veritas—truth. Digital verification input must be truthful for the digital system to work. It is fundamental.

One can find all sorts of workarounds for old analogue systems. They are very flexible. With digital, one has to be precise. Noble Lords may remember the case in November of baby Lilah from Sutton-in-Ashfield, who was registered at birth as male by accident, even though she was clearly female. The family corrected this on the birth register by means of a marginal note. There is no provision in law to correct an error on a birth certificate other than a marginal note. That works in analogue—it is there on the certificate—but in digital these are separate fields. In the digital systems, her sex is recorded as male.

16:45
There will be another field called “marginal note”, which nobody will look at much and which you cannot rely on the AI systems to look at either. So it will become really difficult to handle systems where data is inexact or wrong. This is an area that I hope the Government might find the space to clear up in the course of the rest of the Bill’s passage. We really need to be able to correct errors when they are there.
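
The failure mode can be sketched in a few lines (Python; the record layout and field names are hypothetical, chosen only to illustrate the point): the analogue correction lives in a free-text marginal note, but a digital consumer of the record reads only the structured sex field and never consults the note.

    from dataclasses import dataclass

    @dataclass
    class BirthRegisterEntry:
        name: str
        sex: str            # structured field, recorded at registration
        marginal_note: str  # the only lawful route for correcting an error

    entry = BirthRegisterEntry(
        name="Lilah",
        sex="M",  # clerical error at registration
        marginal_note="Corrected: sex recorded in error, should read F",
    )

    def sex_for_verification(e: BirthRegisterEntry) -> str:
        # Typical downstream behaviour: trust the structured field and
        # never parse the free-text note.
        return e.sex

    print(sex_for_verification(entry))  # prints "M"; the correction is invisible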
Amendment 6 is about verifying that the sources of information for verification are good. Amendment 8 is about giving institutions a duty to be accurate. Those things matter if you are dealing with a verification system that will become, for instance, a source of digital identity in pubs and clubs—most obviously for age, but also for sex in terms of using particular facilities. The systems will have drawn their data from the sort of institutions mentioned in Amendment 6 and will regard that data as accurate. It will be extremely difficult for someone in a club to argue with something that appears on a digital verification system. It will be important that that information is accurate.

The Passport Office has allowed people to replace their sex with self-identified gender on passports since the 1960s and, until recently, kept no central records of this. I declare an interest in that my late wife altered her date of birth on her passport in the days when you could do such things, but sex is rather more serious. Doing that officially—having a passport that shows your sex as something that it is not and then using that as the basis for a digital verification system—starts to corrupt the whole system. The system is becoming unreliable; it is really important that what gets in there is correct. If you are dealing with the use of facilities in clubs and relying on the digital verification system to apply, quite reasonably, say, a rule that you have to be female to use the female changing rooms, which are communal, then you need a system that is accurate. That means we must take care, as we go into this AI world, to make sure that the data sources we feed in are accurate.

We also have an aspect of this in nursing and domiciliary care, where many people will want intimate care to be provided by people of the same sex. That has always seemed a reasonable request, and one at which offence should not be taken. It is quite properly part of many people’s upbringing that they are careful about the way they are exposed in front of the opposite sex. This can apply to males and females. The basis of this has to be that the data held on the sex of the workers involved is accurate. There have been several cases recently within the NHS where that has clearly not been the case. We are looking at a government transformation and moving to an AI world. That AI world will be intolerable if it is based on data that is not truthful.

Anyway, what are these organisations doing, knowingly recording untruthful data? How do they manage that under the GDPR? What right do they have to hold data that they know to be wrong? It seems astonishing to me that this has grown up. In any event, given where we are going and given where this Government are taking us, although I share other noble Lords’ concerns and fears, I am none the less behind the wagon, pushing. There could be some interesting outcomes from AI and what it might offer, but we have to get it right. If we allow it to become corrupted, it will not work; it will spread all sorts of inefficiencies and wrongs without us being able to correct them. Getting it right at the beginning is important. Tech-enabled should mean truth-enabled. I beg to move.
Lord Arbuthnot of Edrom (Con)

My Lords, I support my noble friend. I have a confession to make. Before this Bill came up, I foolishly thought that sex and gender were the same thing. I have discovered that they are not. Gender is not a characteristic defined in UK law. I believe that you are born with a biological sex, male or female, and that some people will choose, or need, to have a gender reassignment or to identify as a different gender. I thank the charity Sex Matters, which works to provide clarity on this issue of sex in law.

As my noble friend Lord Lucas said, the digital verification system currently operates on the basis of chosen gender, not of sex at birth. You can change your records on request without even having a gender recognition certificate. That means that, over the last five years, at least 3,000 people have changed their passports to show the wrong sex. Over the last six years, at least 15,000 people have changed their driving licences. The NHS has no records of how many people now have different sexes recorded from those they had at birth. It is thought that perhaps 100,000 people have one sex indicated in one record and a different sex in another. We cannot go on like that.

The consequences of this are really concerning. It means people with mismatched identities risk being flagged up as a synthetic identity risk. It means authorities with statutory safeguarding responsibilities will not be able to assess the risk that they are trying to deal with. It means that illnesses may be misdiagnosed and treatments misprescribed if the wrong sex is stated in someone’s medical records. The police will be unable to identify people if they are looking in the wrong records. Disclosure and Barring Service checks may fail to match individuals with the wrong sex. I hope that the Government will look again at correcting this. It is a really important issue.

Lord Clement-Jones (LD)

My Lords, I will speak to Amendments 7 and 9. Amendment 7 would require the Secretary of State to lay the DVS trust framework before Parliament. Given the volume of sensitive data that digital ID providers will be handling, it is crucial for Parliament to oversee the framework rules governing digital verification service providers.

The amendment is essentially one that was tabled in Committee by the noble Viscount, Lord Camrose. I thought that he expressed this well in Committee, emphasising that such a fundamental framework demands parliamentary approval for transparency and accountability, regardless of the document’s complexity. This is an important framework with implications for data privacy and security, and should not be left solely to the discretion of the Secretary of State.

The DPRRC in its ninth report and the Constitution Committee in its third report of the Session also believed the DVS trust framework should be subject to parliamentary scrutiny; the former because it has legislative effect, and it recommended using the affirmative procedure, which would require Parliament to actively approve the framework, as the Secretary of State has significant power without adequate parliamentary involvement. The latter committee, the Constitution Committee, said:

“We reiterate our statement from our report on the Data Protection and Digital Information Bill that ‘[d]ata protection is a matter of great importance in maintaining a relationship of trust between the state and the individual. Access to personal data is beneficial to the provision of services by the state and assists in protecting national security. However, the processing of personal data affects individual rights, including the right to respect for private life and the right to freedom of expression. It is important that the power to process personal data does not become so broad as to unduly limit those rights’”.


Those views are entirely consistent with the committee’s earlier stance on a similar provision in the previous Data Protection and Digital Information Bill. That was why it was so splendid that the noble Viscount tabled that amendment in Committee. It was like a Damascene conversion.

The noble Baroness, Lady Jones, argued in Committee and in correspondence that the trust framework is a highly technical document that Parliament might find difficult to understand. That is a bit of a red rag to a bull. However, this argument fails to address the core concerns about democratic oversight. The framework aims to establish a trusted digital identity marketplace by setting requirements for providers to gain certification as trusted providers.

I am extremely grateful to the Minister, the Bill team and the department for allowing officials to give the noble Viscount, Lord Camrose, and me a tutorial on the trust framework. It depends heavily on being voluntary in nature, with the UK Accreditation Service essentially overseeing the certifiers—such as BSI, Kantara and the Age Check Certification Scheme—which certify the providers, with ISO 17065 as the governing standard.

Compliance is assured through the certification process, where services are assessed against the framework rules by independent conformity assessment bodies accredited by the UK Accreditation Service. The trust framework establishes rules and standards for digital identity verification, but it does not directly contain specific provision for regulatory oversight, or for redress mechanisms such as a specific ombudsman service, industry-led dispute resolution, set contract terms for consumer redress or enforcement powers. The Government say, however, that they intend to monitor the types of complaints received. Ultimately, the scope of the framework is limited to the rules providers must follow in order to remain certificated, and it does not address governance matters.
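
The oversight chain just described can be pictured as data (a sketch only; ExampleID Ltd is a hypothetical provider, and the structure is illustrative rather than how UKAS or the framework actually record anything):

    # UKAS accredits conformity assessment bodies; those bodies certify
    # providers against the trust framework rules.
    accreditation = {
        "UKAS": {  # United Kingdom Accreditation Service
            "standard": "ISO 17065",
            "accredits": ["BSI", "Kantara", "Age Check Certification Scheme"],
        },
    }

    certifications = [
        {"cab": "BSI", "provider": "ExampleID Ltd",
         "framework": "DVS trust framework"},
    ]

    def is_certified(provider: str) -> bool:
        # A provider counts as certified only if the certificate was issued
        # by a UKAS-accredited conformity assessment body.
        cabs = accreditation["UKAS"]["accredits"]
        return any(c["provider"] == provider and c["cab"] in cabs
                   for c in certifications)

    print(is_certified("ExampleID Ltd"))  # True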

Periodic certification alone is not enough to ensure ongoing compliance, and this highlights the lack of an independent mechanism to hold the Secretary of State accountable. The noble Baroness, Lady Jones, stated in Committee that the Government preferred a light-touch approach to regulating digital verification services. She believed that excessive parliamentary scrutiny would hinder innovation and flexibility in this rapidly evolving sector.

The Government have consistently emphasised that they have no plans to introduce mandatory digital IDs or ID cards. The focus is on creating a secure and trusted system that gives citizens more choice and control over their data. The attributes trust framework is a crucial step towards achieving the goal of a secure, trusted and innovative digital identity market—all the more reason to get the process for approval right.

These services will inevitably be high-profile. Digital ID is a sensitive area which potentially also involves age verification. These services could have a major impact on data privacy and security. Public debate on such a critical issue is crucial to build trust and confidence in these systems. Laying the DVS trust framework before Parliament would allow for a wider range of voices and perspectives to be heard, ensuring a more robust and democratic approval process.

17:00
The lack in the framework of specific redress mechanisms and a dedicated regulator further underscores the need for parliamentary oversight to protect individuals’ rights and interests in this rapidly evolving digital landscape. In his letters to the chairs of those two committees, the Minister, the noble Lord, Lord Vallance, made similar arguments to those made by the noble Baroness, Lady Jones. By the way, we wish the noble Baroness well and appreciate the baton being picked up for this Bill by the noble Lord.
I hope that I have made the case and that the final paragraph the Minister put in his letter does not counteract what I have to say about the benefits of parliamentary approval. The Minister writes that he hopes his letter will
“provide some helpful context … the Government remains of the view that it does not require parliamentary scrutiny, because its primary role is in the conformity assessment space which sits outside of the Bill”.
If anything, the letter makes the arguments of the Constitution Committee and the DPRRC stronger.
I turn to Amendment 9. The rapid growth of digital services and the potential for misuse emphasise the need for a review of a digital identity offence, as proposed in this amendment. As I pointed out to both this Government and the last one, there is currently no specific crime of digital identity theft, despite various laws that address related offences such as fraud, using a false identity and unauthorised computer access. This gap in legislation leaves the public vulnerable to the harms of digital identity theft. Creating a new specific offence of digital identity theft would better protect those who use digital identity online, ensuring that they had the same protection as they do in the physical world. Existing laws do not adequately cover the nuances of digital identity theft, and a clear criminal offence would serve as a deterrent to malicious actors. As I argued in Committee, the Government should follow the recommendations of the committee chaired by the noble Baroness, Lady Morgan, whose 2022 report Fighting Fraud: Breaking the Chain concluded that a specific criminal offence for identity theft was necessary, or that identity theft should be considered a serious aggravating factor in cases of fraud.
In Committee, the noble Baroness, Lady Jones, said that existing legislation, such as the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018, already addressed the behaviour targeted by the amendment. She said that new offences were unnecessary and could lead to overcriminalisation. Another of her arguments was that defining every instance of verification as an identity document under the Identity Documents Act 2010 could create
“an unclear … and duplicative process for … prosecution”.—[Official Report, 3/12/24; col. GC 382.]
However, while existing legislation might touch on aspects of digital identity theft, the Fraud Act 2006 does not explicitly address the unique challenges it poses. The lack of a specific offence creates ambiguity and could allow perpetrators to exploit loopholes. Creating a specific offence would provide clarity and demonstrate a commitment to tackling this growing form of crime. By conducting a thorough review, the Government could ensure that the legal framework effectively combated digital identity theft while promoting a secure and trustworthy digital environment for individuals and businesses.
As for the approach of these Benches to the amendments tabled by the noble Lords, Lord Lucas and Lord Arbuthnot, I have some sympathy for the desire for accuracy in the records covered by digital identity services, and I hope the Government will be able to give assurances about that. However, we do not wish to turn this into a battle and a culture war opportunity, so we will not be supporting the noble Lords if they push them to a vote.
The Earl of Erroll (CB)

My Lords, I very much support the amendments from the noble Lords, Lord Lucas and Lord Arbuthnot, particularly Amendment 6, about accuracy. It has become apparent—and Committee stage was interesting—that there is a challenge in treating gender and sex as interchangeable. The problem becomes physical, because you cannot avoid the fact that you will react differently medically to certain things according to the sex you were born with and to your DNA.

That can be very dangerous in two cases. The first case is where drugs or cures are being administered by someone who thinks they are treating a patient of one sex when the patient is actually a different sex. That could quite easily kill someone. The second case is if you are doing medical research and relying on something, but then find that half the research is invalid because a person is not actually that sex but has decided to choose another gender. Therefore, all the research on that person could be invalid. That could lead to cures being missed, other things being diagnosed as being all right, and a lot of dangers.

As a society, we have decided that it will be all right for people to change gender—let us say that, as I think it is probably the easiest way to describe it. I do not see any problem with that, but we need critical things to be kept on records that are clearly separate. Maybe we can make decisions in Parliament, or wherever, about what you are allowed to declare on identity documents such as a passport. We need to have two things: one is sex, which is immutable, and therefore can help with all the other things behind the scenes, including research and treatments; the other is gender, which can be what you wish to declare, and society accepts that you can declare yourself as being of another gender. I cannot see any way round that. I have had discussions with people about this, and as one who would have said that this is quite wrong and unnecessary, I was convinced by the end of those discussions that it was right. Keeping the two separate in our minds would solve a lot of problems. These two amendments are vital for that.

I agree in many ways with the points from the noble Lord, Lord Clement-Jones. Just allowing some of these changes to be made by the stroke of a pen—a bit like someone is doing across the Atlantic—without coming to Parliament is perhaps unwise. The combined wisdom of Parliament, looking at things from a different point of view, and possibly with a more societal point of view than the people who are trying to make systems work on a governmental basis, can be sensible and would avoid other mistakes being made. I certainly support his amendments, but I disagree entirely with his last statement, where he did not support the noble Lords, Lord Lucas and Lord Arbuthnot.

Viscount Camrose (Con)

I thank my noble friend Lord Lucas for introducing this group and for bringing these important and sometimes very difficult matters to the attention of the House. I will address the amendments slightly out of order, if I may.

For digital verification services to work, the information they have access to and use to verify documents must be accurate; this is, needless to say, critical to the success of the entire scheme. Therefore, it is highly sensible for Amendment 8 to require public authorities, when they disclose information via the information gateway, to ensure that it is accurate and reliable and that they can prove it. By the same measure, Amendment 6, which requires the Secretary of State to assess whether the public authorities listed are collecting accurate information, is equally sensible. These amendments as a pair will ensure the reliability of DVS services and encourage the industry to flourish.

I would like to consider the nature of accurate information, especially regarding an individual’s biological sex. It is possible for an individual to change their recorded sex on their driving licence or passport, for example, without going through the process of obtaining a gender recognition certificate. Indeed, a person can change the sex on their birth certificate if they obtain a GRC, but many would argue that changing some words on a document does not change the reality of a person’s genome, physical presentation and, in some cases, medical needs, meaning that the information recorded does not accurately relate to their sex. I urge the Minister to consider how best to navigate this situation, and to acknowledge that it is crucially important, as we have heard so persuasively from the noble Earl, Lord Erroll, and my noble friends Lord Arbuthnot and Lord Lucas, that a person’s sex is recorded accurately to facilitate a fully functioning DVS system.

The DVS trust framework has the potential to rapidly transform the way identities and information are verified. It should standardise digital verification services, ensure reliability and build trust in the concept of a digital verification service. It could seriously improve existing, cumbersome methods of verifying information, saving companies, employers, employees, landlords and tenants time and money. Personally, I have high hopes of its potential to revolutionise the practices of recruitment. I certainly do not know many people who would say no to less admin. If noble Lords are minded to test the opinion of the House, we will certainly support them with respect to Amendments 6 and 8.

With the greatest respect to the noble Lord, Lord Clement-Jones, I think it is a mistake to regard this as part of some culture war struggle. As I understand it, this is about accuracy of data and the importance, for medical and other reasons, of maintaining accurate data.

All the benefits of DVS cannot be to the detriment of data privacy and data minimisation. Parliament is well-practised at balancing multiple competing concepts and doing so with due regard to public opinion. Therefore, Amendment 7 is indeed a sensible idea.

Finally, Amendment 9 would require the Secretary of State to review whether an offence of false use of identity documents created or verified by a DVS provider is needed. This is certainly worth consideration. I have no doubt that the Secretary of State will require DVS providers to take care that their services are not being used with criminal intent, and I am quite sure that DVS service providers do not want to facilitate crimes. However, the history of technology is surely one of high-minded purposes corrupted by cynical practices. Therefore, it seems prudent for the Secretary of State to conduct a review into whether creating this offence is necessary and, if it is, the best way that it can be laid out in law. I look forward to hearing the Minister’s comments on this and other matters.

Lord Vallance of Balham (Lab)

I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support and for recognising how important this is in making life easier for people.

I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.

However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.

Lord Clement-Jones (LD)

I am sorry to interrupt the Minister. He is equating digital identity theft to fraud, and that is not always the case. Is that the advice that he has received?

Lord Vallance of Balham (Lab)

The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything falls under that classification as fraud.

17:15
Amendment 7 would require the DVS trust framework to be laid before Parliament. The trust framework is a very technical document that sets the rules against which digital verification services can be certified; the Bill requires the Secretary of State to consult when preparing, publishing or revising the trust framework following an annual review. These rules, now in their fourth non-statutory version on GOV.UK, draw on existing technical requirements, standards, best practice, guidance and legislation. Compliance with the rules is ensured by third-party independent auditors—the conformity assessment bodies—which certify digital verification services when they are compliant with the trust framework. Similar certification schemes exist across numerous industries for providing quality assurance.
Although the Secretary of State has powers in the Bill relating to the trust framework, the primary role of the framework in practice is to provide baseline rules against which digital verification services can be assessed by the conformity assessment bodies. That process takes place outside the Bill and relies on tried and trusted accreditation processes, overseen by the United Kingdom Accreditation Service. For these reasons, and for the reason that this is indeed a process that exists and works, the Government remain of the view that the trust framework does not require parliamentary scrutiny.
The rules in the framework are likely to act as a robust baseline for the independent conformity assessment process. Schemes such as this exist in many sectors, as I have said, and draw heavily on existing standards. The Secretary of State will have to undertake an annual review and consult the Information Commissioner and other appropriate stakeholders as part of that process. The trust framework’s development will be informed by industry and regulatory knowledge as the market evolves.
Lord Clement-Jones (LD)

I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?

Lord Vallance of Balham (Lab)

For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity accreditation bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.

On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.

On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also commenced via the domain expert group on the person entity, which has representatives from the Home Office, HMRC, the Office for National Statistics—importantly—NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.

The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.

Lord Arbuthnot of Edrom (Con)

I would be grateful if the Minister could confirm whether he accepts that, on some occasions, passports and driving licences inaccurately reflect the sex of their holders.

Lord Vallance of Balham (Lab)

I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important. I know from my background in scientific research that getting the data right is the most important thing if you are to know what you are dealing with. Making sure that we have a system to get this clear will be part of what we are doing.

Amendment 6 would require the Secretary of State to assess which public authorities can reliably verify related facts about a person in the preparation of the trust framework. This exercise is out of scope of the trust framework, as the Good Practice Guide 45—a standard signposted in the trust framework—already provides guidance for assessing the reliability of authoritative information across a wide range of use cases covered by the trust framework. Furthermore, the public authorities mentioned are already subject to data protection legislation which requires personal data processed to be accurate and, where relevant, kept up to date.

Amendment 8 would require any information shared by public authorities to be clearly defined, accompanied by metadata and accurate. The Government already support and prioritise the accuracy of the data they store, and I indicated the ongoing work to make sure that this continues to be looked at and improved. This amendment could duplicate or potentially conflict with existing protections under data protection legislation and/or other legal obligations. I reassure noble Lords that the Government believe that ensuring the data they process is accurate is essential to deliver services that meet citizens’ needs and ensure accurate evaluation and research. The Central Digital and Data Office has already started work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent.

It is our belief that these matters are more appropriately considered together holistically, rather than by a piecemeal approach through diverse legislation such as this data Bill. As such, I would be grateful if noble Lords would consider withdrawing their amendments.

Lord Lucas (Con)

My Lords, I am very grateful to all noble Lords who have spoken on this. I actually rather liked the amendments of the noble Lord, Lord Clement-Jones—if I am allowed to reach across to him—but I think he is wrong to describe Amendments 6 and 8 as “culture war”. They are very much about AI and the fundamentals of digital. Self-ID is an attractive thought; I would very much like to self-identify as a life Peer at the moment.

Noble Lords

Oh!

Lord Lucas (Con)

However, the truth should come before personal feelings, particularly when looking at data and the fundamentals of society. I hope that the noble Lord will take parliamentary opportunities to bring the framework in front of Parliament when it appears. I agree with him that Parliament should take an interest in and look at this, and I hope we will be able to do that through a short debate at some stage—or that he will be able to, because I suspect that I shall not be here to do so. It is important that, where such fundamental rights and the need for understanding are involved, there is a high degree of openness. However expert the consideration the Government may give this through the mechanisms the Minister has described, I do not think they go far enough.

So far as my own amendments are concerned, I appreciate very much what the Minister has said. We are clearly coming from the same place, but we should not let the opportunity of this Bill drift. We should put down the marker here that this is an absolutely key part of getting data and government right. I therefore beg leave to test the opinion of the House.

17:25

Division 1

Ayes: 205

Noes: 159

Amendment 7
Moved by
7: Clause 28, page 31, line 22, at end insert—
“(11) The Secretary of State must lay the DVS trust framework before Parliament.”
Member's explanatory statement
This amendment will ensure Parliamentary oversight of the rules with which digital verification service providers must comply.
Lord Clement-Jones (LD)

My Lords, I support the conclusions of the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, and I beg leave to seek the opinion of the House.

17:40

Division 2

Ayes: 87

Noes: 157

17:50
Clause 45: Power of public authority to disclose information to registered person
Amendment 8
Moved by
8: Clause 45, page 42, line 23, at end insert—
“(5A) A public authority must not disclose information about an individual under this section unless the information—
(a) is clearly defined and accompanied by metadata, and
(b) the public authority is able to attest that it—
(i) was accurate at the time it was recorded, and
(ii) has not been changed or tampered, or
(c) the public authority is able to attest that it—
(i) has been corrected through a lawfully made correction, and
(ii) was accurate at the time of the correction.”
Member’s explanatory statement
This amendment is to ensure that public authorities that disclose information via the information gateway provide accurate and reliable information and that if the information has been corrected it is the correct information that is provided.
Amendment 8 agreed.
Amendment 9 not moved.
Clause 56: National Underground Asset Register: England and Wales
Amendment 10
Moved by
10: Clause 56, page 52, line 13, leave out “undertaker’s” and insert “contractor’s”
Member’s explanatory statement
New section 106B(6) of the New Roads and Street Works Act 1991 (defence where certain people have taken reasonable care) refers to “the undertaker’s employees” twice. This amendment corrects that by replacing one of those references with a reference to “the contractor’s employees”.
Lord Vallance of Balham (Lab)

My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.

Viscount Camrose (Con)

My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.

Amendment 10 agreed.
Amendment 11
Moved by
11: Clause 56, page 53, line 17, at end insert—
“(2A) The Secretary of State must provide guidance to relevant stakeholders on cyber-security measures before they may receive information from NUAR.”
Member's explanatory statement
This amendment will require the Secretary of State to provide guidance to relevant stakeholders on security measures before they receive information from NUAR.
Viscount Camrose (Con)

My Lords, I will speak to Amendments 11 and 13 in my name and that of my noble friend Lord Markham. The national underground asset register contains the details of all underground assets and apparatus in England, Wales and Northern Ireland, or at any rate it will do as it goes forward. This includes water pipes, electricity cables, internet cables and fibres—details of the critical infrastructure necessary to sustain the UK as we know it.

Needless to say, there are many hostile actors who, if they got their hands on this information, would or could use it to commit appalling acts of terror. I am mindful of and grateful for the Government’s assurances given in Committee that it is and will be subject to rigorous security measures. However, the weakest link in cyber defence is often third-party suppliers and other partners who do not recognise the same level of risk. We should take every possible measure to ensure that the vital data in NUAR is kept safe and shared only with stakeholders who have the necessary security provisions in place.

For this reason, I have tabled Amendment 11, which would require the Secretary of State to provide guidance to relevant stakeholders on the cybersecurity measures which should be in place before they receive information from NUAR. I do not believe this would place a great burden on government departments, as appropriate cybersecurity standards already exist. The key is to ensure that they are duly observed.

I cannot overstate the importance of keeping this information secure, but I doubt noble Lords need much convincing on that score. Given how frighteningly high the stakes are, I strongly urge the most proactive possible approach to cybersecurity, advising stakeholders and taking every possible step to keep us all safe.

Amendment 13, also tabled in my name, requires the Registrar-General to make provisions to ensure the cybersecurity of the newly digitised registers of births, stillbirths and deaths. There are a great many benefits in moving from a paper-based register of births and deaths to a digitised version. People no longer have to make the trip to sign the register in person, saving time and simplifying the necessary admin at very busy or very difficult points in people’s lives. It also reduces the number of physical documents that need to be maintained and kept secure. However, in digitising vast quantities of personal, valuable information, we are creating a larger attack surface which will appeal to malign actors looking to steal personal data.

I know we discussed this matter in Committee, when the noble Baroness the Minister made the point that this legislation is more about a digitisation drive, in that all records will now be digital rather than paper and digital. While I appreciate her summary, I am not sure it addresses my concerns about the security risks of shifting to a purely digital model. We present a large and tempting attack surface, and the absence of paper back-ups increases the value of digital information even more, as it is the only register. Of course, there are already security measures in place for the digital copies of these registers. I have no doubt we have back-ups and a range of other fallback opportunities. But the same argument applies.

Proactive cybersecurity provisions are required, taking into account the added value of these registers and the ever-evolving threat we face from cybercriminals. I will listen with great interest to the thoughts of other noble Lords and the Minister.

Lord Leong (Lab)

My Lords, I thank the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, for these amendments. Clause 56 forms part of the NUAR provisions. The security of NUAR remains of the utmost importance. Because of this, the Government have closely involved a wide range of security stakeholders in the development of NUAR, including the National Protective Security Authority and security teams from the asset owners themselves. Providing clear acceptable use and usage policies for any digital service is important. As such, we intend to establish clear guidance on the appropriate usage of NUAR, including what conditions end users must fulfil before gaining access to the service. This may include cybersecurity arrangements, as well as personal vetting. However, we do not feel it appropriate to include this in the Bill.

Care must be taken when disclosing platform-specific cybersecurity information, as this could provide bad actors with greater information to enable them to counter these measures, ultimately making NUAR less secure. Furthermore, regulations made in relation to access to information from NUAR would be subject to the affirmative procedure. As such, there will be future opportunities for relevant committees to consider in full these access arrangements, including, on an individual basis, any security impacts. I therefore reassure noble Lords that these measures will ensure that access to NUAR data is subject to appropriate safeguards.

18:00
Turning to Amendment 13, also tabled by the noble Viscount, the registration online system has been in place for births, stillbirths and deaths since 2009. The system is protected to Home Office security standards and employs a range of anti-cyberattack best practices through the deployment of advanced, fully managed firewalls and intrusion detection systems. The data is replicated to a secure cloud platform every 30 minutes and robust measures are in place to protect it. Articles 25 and 32 of the UK general data protection regulation impose duties on controllers of personal data to implement appropriate technical and organisational measures, including security measures. Therefore, legislation is already in place to ensure the security of the electronic registers. The robust security measures the Home Office has in place ensure that we are complying with these statutory obligations.
With those explanations, I hope that the noble Viscount will be content to withdraw Amendment 11.
Viscount Camrose (Con)

I thank the Minister for his considered reply. It is clear that the Government and the department are taking the issue of security with all due seriousness. However, I remain concerned, particularly about the move to NUAR as a highly tempting attack surface for malign actors. In light of this, I am minded to test the opinion of the House.

18:02

Division 3

Ayes: 186

Noes: 162

18:13
Clause 58: National Underground Asset Register: Northern Ireland
Amendment 12
Moved by
12: Clause 58, page 62, line 34, leave out “undertaker’s” and insert “contractor’s”
Member’s explanatory statement
New Article 45B(6) of the Street Works (Northern Ireland) Order 1995 (defence where certain people have taken reasonable care) refers to “the undertaker’s employees” twice. This amendment corrects that by replacing one of those references with a reference to “the contractor’s employees”.
Amendment 12 agreed.
Clause 61: Form in which registers of births and deaths are to be kept
Amendment 13 not moved.
Clause 67: Meaning of research and statistical purposes
Amendment 14
Moved by
14: Clause 67, page 75, line 10, after “scientific” insert “and that is conducted in the public interest”
Member’s explanatory statement
This amendment ensures that to qualify for the scientific research exception for data reuse, that research must be in the public interest. This requirement already exists for medical research, but this amendment would apply it to all scientific research wishing to take advantage of the exception.
Viscount Colville of Culross (CB)

My Lords, I thank my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their signatures to my Amendment 14. I withdrew this amendment in Committee, but I am now asking the Minister to consider once again the definition of “scientific research” in the Bill. If he cannot satisfy me in his speech this evening, I will seek the opinion of the House.

I have been worried about the safeguards for defining scientific research since the Bill was published. This amendment will require that the research should be in “the public interest”, which I am sure most noble Lords will agree is a laudable aim and an important safeguard. This amendment has been looked at in the context of the Government’s recent announcements on turning this country into an AI superpower. I am very much a supporter of this endeavour, but across the country there are many people who are worried about the need to set up safeguards for their data. They fear data safety is threatened by this explosion of AI and its inexorable development by the big tech companies. This amendment will go some way to building public trust in the AI revolution.

The vision of Donald Trump surrounded at his inauguration yesterday by tech billionaires, most of whom have until recently been Democrats, puts the fear of God into me. I fear their companies are coming for our data. We have some of the best data in the world, and it needs to be safeguarded. The AI companies are spending billions of dollars developing their foundation models, and they are beholden to their shareholders to minimise the cost of developing these models.

Clause 67 gives a huge fillip to the scientific research community. It exempts research which falls within the definition of scientific research as laid out in the Bill from having to gain new consent from data subjects to reuse millions of points of data.

It costs time and money for the tech companies to get renewed consent from data holders before reusing their data. This is an issue we will discuss further when we debate amendments on scraping data from creatives without copyright licensing. It is clear from our debates in Committee that many noble Lords fear that AI companies will do what they can to avoid either getting consent or licensing data for use in scraping data. Defining their research as scientific will allow them to escape these constraints. I could not be a greater supporter of the wonderful scientific research that is carried out in this country, but I want the Bill to ensure that it really is scientific research and not AI development camouflaged as scientific research.

The line between product development and scientific research is often blurred. Many developers posit efforts to increase model capabilities, efficiency, or indeed the study of their risks, as scientific research. The balance has to be struck between allowing this country to become an AI superpower and exploiting its data subjects. I contend that this amendment will go far to allay public fears of the abuse and use of their data to further the profits and goals of huge AI companies, most of which are based in the United States.

Noble Lords have only to look at the outrage last year at Meta’s use of Instagram users’ data without their consent to train the datasets for its new Llama AI model to understand the levels of concern. There were complaints to regulators, and the ICO posted that Meta

“responded to our request to pause and review plans to use Facebook and Instagram user data to train generative AI”.

However, so far, there has been no official change to Meta’s privacy policy that would legally bind it to stop processing data without consent for the development of its AI technologies, and the ICO has not issued a binding order to stop Meta’s plans to scrape users’ data to train its AI systems. Meanwhile, Meta has resumed reusing subjects’ data without their consent.

I thank the Minister for meeting me and talking through Amendment 14. I understand his concern that a public interest threshold in the definition of scientific research would create a heavy burden on researchers, but I think it is worth the risk in the name of safety. Some noble Lords are concerned about the difficulty of defining “public interest”. However, the ICO has very clear guidelines about what public interest consists of. It states that

“you should broadly interpret public interest in the research context to include any clear and positive public benefit likely to arise from that research”.

It continues:

“The public interest covers a wide range of values and principles about the public good, or what is in society’s best interests. In making the case that your research is in the public interest, it is not enough to point to your own private interests”.


The guidance even includes further examples of research in the public interest, such as

“the advancement of academic knowledge in a given field … the preservation of art, culture and knowledge for the enrichment of society … or … the provision of more efficient or more effective products and services for the public”.

This guidance is already being applied in the Bill to sensitive data and public health data. I contend that if these carefully thought-through guidelines are good enough for health data, they should be good enough for all scientific data.

This view is supported in the EU, where

“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests.”

The Minister will tell the House that the data exempted to be used for scientific research is well protected—that it has both the lawfulness test, as set out in the UK GDPR, and a reasonableness test. I am concerned that the reasonableness test in this Bill references

“processing for the purposes of any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.

Normally, a reasonableness test requires an expert in the context of that research to decide whether it is reasonable to consider it scientific. However, in this Bill, “reasonable” just means that an ordinary person in the street can decide whether it is reasonable to consider the research scientific. This must be a broadening of the threshold of the definition.

It seems “reasonable” in the current climate to ask the Government to include a public interest test before giving the AI companies extensive scope to reuse our data, without getting renewed consent, on the pretext that the work is for scientific research. In the light of possible deregulation of the sector by the new regime in America, it is incumbent on this country to ensure that our scientific research is dynamic, but safe. If the Government can bring this reassurance, they will increase trust in Britain’s AI revolution for millions of people in this country. I beg to move.

Baroness Kidron (CB)

My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.

In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.

There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.

Lord Clement-Jones (LD)

My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.

The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.

Lord Sentamu (CB)

My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by a wonderful person from Gray’s Inn, who was responsible for drafting the constitution of Uganda’s independence, Sir Dingle Foot, he said a phrase which struck me and which has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks that if there is to be scientific work, it must be conducted “in the public interest”. Law does not express itself simply for itself; it does so for the public, as public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.

Lord Lucas (Con)

My Lords, the regulator quite clearly needs a standard against which to judge. Public interest is the established one in FOI, medicine and elsewhere. It is the standard that is used when I apply for data under the national pupil database—and quite right too. It works well, it is flexible, it is well understood and it is a decent test to meet. We really ought to insist on it today.

The Earl of Erroll (CB)

My Lords, I want to add very quickly that we have a problem here. If someone took all this private data because we did not put this block on them, and they then had it, it would probably become their copyright and their property, which they could then sit on, blocking other people from getting at it. This amendment is fairly essential.

Lord Markham (Con)

Like the noble Lord, Lord Clement-Jones, I am not going to try to better the excellent speech made by the noble Viscount, Lord Colville.

We debated at length in Committee the definition of scientific research, as it will dictate the breadth of the consent exemption for data reuse. If it is too broad, it could allow data companies—I am thinking specifically of AI programs—to justify data scraping without obtaining consent, should they successfully argue that it constitutes scientific research. However, should we create too narrow a definition, we could stifle commercial research and innovation. This would be disastrous for economic growth and the UK science and technology sector, which is one of our most dynamic sectors and has the potential to become one of the most profitable. It is a sector we should be looking to support and grow, not hinder. Finding the happy medium here is no small feat, but the amendment tabled by the noble Viscount, Lord Colville of Culross, goes a long way towards achieving this by threading the needle.

By requiring the research to be in the public interest to qualify for the consent exemption for data reuse, we will prevent companies cloaking purely commercial activities for their own ends in the guise of scientific research, while allowing commercial research which will benefit the general public.

This particularly chimes with my time as Health Minister, when we tried to ensure that we could bring the public with us on the use of their health data. We did a lot of focus groups on this, and we found that we could achieve very widespread—70%-plus—public support if we could demonstrate that there really was a medical research benefit. This amendment is very much in keeping with that. As I say, it threads the needle. That is why we will be strongly supporting the amendment tabled by the noble Viscount, Lord Colville, and we hope he is minded to put the matter to a Division.

Lord Vallance of Balham (Lab)

I am grateful to the noble Viscount, Lord Colville, for his amendment and his engagement on this matter. I fully agree with the importance of ensuring that the term “scientific research” is not abused. Clause 67 will help avoid the misuse of the term by introducing a test of whether the research could reasonably be described as scientific. By explicitly requiring a reasonableness test, which is a well-known part of law, the provision is narrowing not broadening the current position.

18:30
The Government believe the test is sufficiently robust to limit misuse of the term “scientific research”. For example, many activities related to marketing or direct product development would not meet the test to be reasonably described as scientific. However, it is important not to disqualify entire fields of activity, because there may be a minority that constitutes genuine scientific research. This is often the case in the development of new medicines, for example, which is why the test needs to be case by case.
The test will not operate alone. There is currently extensive guidance by the ICO on the meaning of “scientific research”. This includes a list of indicators of genuine scientific research and outlines the globally accepted Frascati definition of research. This ICO guidance should be considered when assessing the reasonableness of describing an activity as scientific research; it has to be in the context of the Frascati definition and the ICO’s guidance.
However, the Government’s view is that a requirement for all scientific researchers to undergo an additional formal process to demonstrate that their specific research project is in the public interest would, at best, be a significant and unnecessary burden on our world-class research community. At worst, it would have a chilling effect on research that would ultimately damage public benefit. Much research is driven by curiosity and understanding; defining the precise public benefit at the outset may not be easy or even possible. The public benefit arises many years later. That is the case with the scientific research that we see coming to fruition now: nobody could have known what it would be useful for.
A public interest test is currently a requirement for scientific researchers only in limited circumstances, where there is extra risk that justifies the burden, such as when processing sensitive data under the research condition in the DPA 2018 or undertaking specific public health research. Not all health research is covered, but a specific aspect is. There will be further constraints on researchers through the specific safeguards set out in Clause 85 and the wider requirements of the UK GDPR, such as fairness.
Several people have spoken on this quite passionately and I completely understand why we need to get it right. It is important that companies cannot get hold of data and use it for things that we do not want them to use it for, including marketing and other approaches that will potentially cause harm. In looking after that, we must be mindful not to damage one of our great success stories in this country—scientific research—for which we have unique datasets that are important to improve all sorts of aspects of life.
The Bill will also clear up existing misunderstandings by clarifying, in Clause 71, that a lawful ground is required for all reuse of personal data. That includes scientific research, so it would not be possible to reuse things for a different purpose, in any sphere.
I hope the noble Viscount is content to withdraw this amendment, given these reassurances and the concerns about a significant unintended consequence from going down this route.
Viscount Colville of Culross (CB)

My Lords, I am grateful and impressed that the Minister has stepped into this controversial sphere of data management at such short notice. I wish his colleague, the noble Baroness, Lady Jones, a swift recovery.

I hope that noble Lords listened to the persuasive speeches that were given across the Benches, particularly from my noble friend Lady Kidron, with her warning about blurring the definition of scientific research. I am also grateful to the Opposition Benches for their support. I am glad that the noble Lord, Lord Markham, thinks that I am threading the needle between research and public trust.

I listened very carefully to the Minister’s response and understand that he is concerned by the heavy burden that this amendment would put on scientific research. I have listened to his explanation of the OECD Frascati principles, which define scientific research. I understand his concern that the rigorous task of demanding that new researchers have to pass a public interest test will stop many from going ahead with research. However, I repeat what I said in my opening speech: there has to be a balance between generating an AI revolution in this country and bringing the trust of the British people along with it. The public interest test is already available for restricted research in this field; I am simply asking for it to be extended to all scientific research.

I am glad that the reasonableness and lawfulness tests are built into Clause 67, but I ask for a test that I am sure most people would support—that the research should have a positive public benefit. On that note, I would like to seek the opinion of the House.

18:35

Division 4

Ayes: 258

Noes: 138

18:48
Clause 68: Consent to processing for the purposes of scientific research
Amendment 15
Moved by
15: Clause 68, page 76, line 16, at end insert—
“(e) the data subject is not a child.”
Member's explanatory statement
This amendment ensures the bill maintains the high level of legal protection for children’s data even when the protections offered to adults are lowered.
Baroness Kidron (CB)

My Lords, I rise to move Amendment 15 and to speak to Amendments 16, 20, 22, 27, 39, 45 and, briefly, government Amendment 40. Together, these amendments offer protections that children were afforded in the Data Protection Act 2018, which passed through this House, and they seek to fix some of the underperformance of the ICO in relation to children’s data.

Before we debate these amendments, it is perhaps worth the Government reflecting on the fact that survey after survey shows that the vast majority—indeed, almost all—of the UK population support stronger digital regulation in respect of children. In refusing to accept these amendments, or indeed to replace them with their own amendments to the same effect, the Government are throwing away one of the successes of the UK Parliament with their newfound enthusiasm for tech with fewer safeguards.

I repeat my belief that lowering data protections for adults is a regressive step for all of us, but for children it is a tragedy that puts them at greater risk of harm—a harm that we in this House have a proud record of seeking to mitigate. The amendments in my name and variously in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones, my noble friend Lord Russell and the noble Baroness, Lady Harding, are essential to preserving the UK’s commitment to child protection and privacy. As the House is well aware, there is cross-party support for child protection. While I will listen very carefully to the Minister, I too am prepared to test the opinion of the House if he has nothing to offer, and I will ask Labour colleagues to consider their responsibility to the nation’s children before they walk through the Lobby.

I will take the amendments out of numerical order, for the benefit of those who have not been following our proceedings. Amendment 22 creates a direct, unambiguous obligation on data processors and controllers to consider the central principles of the age-appropriate design code when processing children’s data. It acknowledges that children of different ages have different capacities and therefore may require different responses. Subsection (2) of the new clause it would insert addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults would experience under the Act when passed.

In the last few weeks, Meta has removed its moderators, and the once-lauded Twitter has become flooded with disinformation and abuse as a result of Elon Musk’s determined deregulation and support of untruth. We have seen the dial move in Romania’s presidential election via TikTok, a rise in scams and the horror of sexually explicit deepfakes, which we will discuss in a later group.

Public trust in both tech and politics is catastrophically low. While we may disagree on the extent to which adults deserve privacy and protection, there are few in this House or the other place who do not believe it is a duty of government to protect children. Amendment 22 simply makes it a requirement that those who control and process children’s data are directly accountable for considering and prioritising their needs. Amendment 39 does the same job in relation to the ICO, highlighting the need to consider that high bar of privacy to which children are entitled, which should be a focus of the commissioner when exercising its regulatory functions, with a particular emphasis on their age and development stage.

Despite Dame Elizabeth Denham’s early success in drafting the age-appropriate design code, the ICO’s track record on enforcement is poor, and the leadership has not championed children, whether by robustly enforcing the ADC or when faced with proposals that watered down child protections in this Bill and its predecessor. We will get to the question of the ICO next week, but I have been surprised by the amount of incoming mail expressing dissatisfaction with the regulator and calling on Parliament to demand more robust action. This amendment does exactly that in relation to children.

Government Amendment 40 would require the ICO, when exercising its functions, to consider the fact that children merit specific protections. I am grateful for and welcome this addition as far as it goes; but in light of the ICO’s disappointing track record, clearer and more robust guidance on its obligations is needed.

Moreover, the Government’s proposal is also insufficient because it creates a duty on the ICO only. It does nothing for the controllers and processors, as I have already set out in Amendment 22. It is essential that those who control and process children’s data are directly accountable for prioritising their needs. The consequences when they do not are visible in the anxiety, body dysmorphia and other developmental issues that children experience as a result of their time online.

The Government have usefully introduced an annual report of ICO activities and action. Amendment 45 simply requires the ICO to report the action it has taken specifically in relation to children, as a separate item. Creating better reporting is one of the advances the Government have made; making it possible to see what the ICO has done in regard to children is little more than housekeeping.

This group also includes clause-specific amendments, which are more targeted than Amendment 22. Amendment 15 excludes children from the impact of the proposal to widen the definition of scientific research in Clause 68. Given that we have just discussed this, I may reconsider that amendment. However, Amendment 16 excludes children from the “recognised legitimate interest” provisions in Clause 70. This means that data controllers would still be required to consider and protect children, as currently required under the legitimate interest basis for processing their data.

Amendment 20 excludes children from the new provisions in Clause 71 on purpose limitation. Purpose limitation is at the heart of GDPR. If you ask for a particular purpose and consent to it, extending that purpose is problematic. Amendment 21 ensures that, for children at least, the status quo of data protection law stays the same: that is to say, their personal data can be used only for the purpose for which it was originally collected. If the controller wants to use it in a different way, it must go back to the child—or, if they are under 13, their parent—to ask for further permission.

Finally, Amendment 27 ensures that significant decisions that impact children cannot be made during automated processes unless they are in a child’s best interest. This is a reasonable check and balance on the proposals in Clause 80.

In full, these amendments uphold our collective responsibility to support, protect and make allowances for children as they journey from infancy to adulthood. I met with the Minister and the Bill team, and I thank them for their time. They rightly made the point that children should be participants in the digital world, and I should not seek to exempt them. I suggest to the House that it is the other way round: I will not seek to exempt children if the Government do not seek to put them at risk.

Our responsibility to children is woven into the fabric of our laws, our culture and our behaviour. It has taken two decades to begin to weave childhood into the digital environment, and I am asking the House to make sure we do not take a single retrograde step. The Government have a decision to make. They can choose to please the CEOs of Silicon Valley in the hope that capitulation on regulatory standards will get us a data centre or two; or they can prioritise the best interests of UK children and agree to these amendments, which put children’s needs first. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, I rise to support all the amendments in this group. I have added my name to Amendments 15, 22, 27 and 45. The only reason my name is not on the other amendments is that others got there before me. As is always the case in our debates on this topic, I do not need to repeat the arguments of the noble Baroness, Lady Kidron. I would just like to make a very high-level point.

19:00
In her last paragraph, the noble Baroness referenced the Government’s concern that we should not seek to exempt children from the digital world. It is really not difficult to encourage children to use the digital world. Any of us who have young children, or young grandchildren, know that in the blink of an eye children pick up a device and access anything they want. We have not got a problem with children accessing digital.
What we do have a very big problem with is how to protect them in that world. Those of us who have worked and campaigned on child internet safety for the last 15 years know how very hard it is to protect our children. I respect the Minister enormously, and I send my good wishes to the noble Baroness, Lady Jones, and wish her a speedy recovery; I know they both care about the issue. However, those of us who have spent a lot of time working in this area have learned that you need to have the detail in the Bill. Many of us worked a decade ago on the age-appropriate design code, which, even though it was in a Bill, was incredibly hard to get implemented. We are already learning to our cost, in relation to the Online Safety Act, about issues where we were told in this Chamber that it did not matter whether they were on the face of the Bill because Ofcom would be able to sort them out. We are now told that Ofcom does not have the powers because they are not on the face of the Act.
I urge the Minister to take on board the concern that I know he will hear from all sides of the House that we need substantially to strengthen this Bill’s protection for children, otherwise I fear that, in a year or two, the same group of us will be saying the same thing about another Bill and millions of children will still be unprotected.
Lord Russell of Liverpool (CB)

My Lords, I have also put my name to most of the amendments. As with the noble Baroness, Lady Harding, the only reason some of them do not have my name on them is that I arrived too late. Between her and my noble friend Lady Kidron, everything that needs to be said has been said very powerfully. As one who has more recently become involved in a variety of Bills—the Policing and Crime Bill, the Online Safety Bill, and the Victims and Prisoners Bill—in every case trying to fight for and clarify children’s rights, I can say that it has been an uphill battle. But the reason we have been fighting is that we have lamentably failed to protect the interests of children for the past two decades as the world has changed around us. All of us who have children or grandchildren, nephews or nieces, or who, like me, take part in the Learn with the Lords programme and go into schools, or who deal with mental health charities, are aware of the failure of government and regulators to take account, as the world changed around us, of the effect it would have on children.

In our attempts to codify and clarify in law what the dangers are and what needs to be put in place to try to prevent them, we have had an uphill struggle, regardless of the colour of government. In principle, everyone agrees. In practice, there is always a reason why it is too difficult—or, the easy way out is to say, “We will tell the regulator what our intent is, but we will leave it up to the regulator to decide”.

Our experience to date is that what was very clearly the will of Parliament when a Bill became an Act has not been made flesh by the regulator when it comes to setting out the regulation. Unless it is in an Act, and it is made manifestly clear what the desired outcomes are in terms of the safety of children, the regulator—because it is difficult to do this well—will, not unreasonably, decide that if it is too difficult to do, it will settle for something that is not as good as it could be.

What we are trying to do with this set of amendments is to say to the Government up front, “We want this to be as effective as it possibly could be now”. We do not want to come back and rue the consequences of not being completely clear and of putting clear onus of responsibility on the regulators in two or three years’ time, because in another two or three years children will have important parts of their childhood deteriorating quite rapidly, with consequences that will stay with them for the rest of their lives.

Lord Stevenson of Balmacara (Lab)

My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.

This one caught my eye in Committee. I suddenly felt, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government’s DNA—that is, bringing forward the right solutions to the problems that we have been seeking to address in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials and people went away with it and came back with other versions, it got worse and not better. How children are dealt with, and how important it is to make sure that they are prioritised, appears to be one of those problems.

The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible way in which we drafted the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found the right words to bridge the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.

When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we all would say does not quite go far enough; it needs a bit more, but not that much more—to make it meet with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 18 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.

Lord Clement-Jones (LD)

My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.

All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.

Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.

Viscount Camrose (Con)

My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.

Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.

There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.

That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.

As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.

I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.

On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.

However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.

19:15
I support Amendment 27. It is an important amendment that would ensure that significant decisions impacting a child could not be made solely using automated decision-making unless those decisions were in the child’s best interests. This is a rather ingenious safeguard to ensure that children’s rights and welfare are fully considered in decisions that could affect them. The amendment would ensure that decisions made using automated processes could not be taken unless it was clear that they served the best interests of the child, taking into account their rights and development stage. The amendment would build on the principles already set out in the Data Protection Act 2018, reinforcing the need for extra protections for children.

I support the intent behind Amendment 39, which rightly recognises that children are entitled to a higher standard of protection regarding their personal data. We agree that children’s data requires special consideration at different stages of their development, as they may not fully understand the risks or consequences associated with the processing of their data. This principle is fundamental to safeguarding their rights. Again, though, while we support the overall intent of the amendment, we have concerns about the explicit reference to the UN Convention on the Rights of the Child and general comment 25, as per my comments on the previous amendment.

Amendment 40 rightly emphasises that children merit specific protection when it comes to their personal data, given their vulnerability and the fact that they may be less aware of the risks and consequences associated with such processing. I am reassured to see the Government taking steps to ensure the highest level of protection for children’s data, as that is essential to safeguarding their rights in an increasingly digital world. I support the spirit of the amendment but would characterise it as a minor technical adjustment to ensure clarity. It is certainly important that the Information Commissioner’s duties are clearly set out, and the amendment would help to reinforce the specific protections that children should receive in relation to their personal data.

We on these Benches support Amendment 45, which seeks to ensure that the Information Commissioner’s annual report clearly records activities and actions taken in relation to children’s data protection. That is an important step in enhancing transparency, accountability and understanding of how children’s data is being safeguarded under the regulatory framework. The inclusion of those specific details in the annual report would not only be beneficial for ensuring accountability but reinforce the commitment to prioritising children’s best interests in the regulatory framework. It would provide clarity on the actions taken by the ICO, fostering greater trust in the oversight and enforcement of data protection laws, particularly with respect to children.
Lord Vallance of Balham (Lab)

I will speak first to government Amendment 40, tabled in my name, concerning the ICO’s duty relating to children’s personal data. Before that, though, I thank the noble Lords, Lord Stevenson and Lord Russell, the noble Baroness, Lady Harding, and in particular the noble Baroness, Lady Kidron, for such considered debates on this incredibly important issue, both in today’s discussion in the House and in the meetings we have had together. Everyone here wants this to be effective and recognises that we must protect children.

The Government are firmly committed to maintaining high standards of protection for children, which is why they decided not to proceed with measures in the previous Data Protection and Digital Information Bill that would have reduced requirements for data protection impact assessments, prior consultation with the ICO and the designation of data protection officers. The ICO guidance is clear that organisations must complete an impact assessment in relation to any processing activity that uses children’s or other vulnerable people’s data for marketing purposes, profiling or other automated decision-making, or for offering online services directly to children.

The Government also expect organisations which provide online services likely to be accessed by children to continue to follow the standards on age-appropriate design set out in the children’s code. The noble Baroness, Lady Kidron, worked tirelessly to include those provisions in the Data Protection Act 2018 and the code continues to provide essential guidance for relevant online services on how to comply with the data protection principles in respect of children’s data. In addition to these existing provisions, Clause 90 already includes a requirement for the ICO to consider the rights and interests of children when carrying out its functions.

I appreciate the point that the noble Baroness made in Committee about the omission of the first 10 words of recital 38 from these provisions. I am therefore very happy to rectify this through government Amendment 40. The changes we are making to Clause 90 will require the Information Commissioner to consider, where relevant, when carrying out its regulatory functions the fact that children merit special protection with regard to their personal data. I hope noble Lords will support this government amendment.

Turning to Amendment 15 from the noble Baroness, Lady Kidron, which excludes children’s data from Clause 68, I reassure her that neither the protections for adults nor those for children are being lowered. Clause 68 faithfully transposes the existing concept of giving consent to processing for an area of scientific research from the current recital. This must be freely given and be fully revocable at any point. While the research purpose initially identified may become more specific as the research progresses, this clause does not permit researchers to use the data for research that lies outside the original consent. As has been highlighted by the noble Viscount, Lord Camrose, excluding children from Clause 68 could have a detrimental effect on health research in children and could unfairly disadvantage them. This is already an area of research that is difficult and underrepresented.

I know that the noble Baroness, Lady Kidron, cares deeply about this, but we should not make research in children more difficult. For example, if research on children with a particular type of cancer found something in those children that was relevant to another cancer, the amendment would preclude the use of that data. That cannot be right for children. It is a risk to exempt children from this part of the Bill.

Amendment 16 would prevent data controllers from processing children’s data under the new recognised legitimate interests lawful ground. However, one of the main reasons this ground was introduced was to encourage organisations to process personal data speedily when there is a pressing need to do so for important purposes. This could be where there is a need to report a safeguarding concern or to prevent a crime being committed against a child. Excluding children’s data from the scope of the provision could therefore delay action being taken to protect some children—a point also made in the debate.

Amendment 20 aims to prohibit further processing of children’s personal data when it was collected under the consent lawful basis. The Government believe an individual’s consent should not be undermined, whether they are an adult or a child. This is why the Bill sets out that personal data should be used only for the purpose a person has consented to, apart from situations that are in the public interest and authorised by law or to comply with the UK GDPR principles. Safeguarding children or vulnerable individuals is one of these situations. There may be cases where a child’s data is processed under consent by a social media company and information provided by the child raises serious safeguarding concerns. The social media company must be able to further process the child’s data to make safeguarding referrals when necessary. It is also important to note that these public interest exceptions apply only when the controller cannot reasonably be expected to obtain consent.

I know the noble Baroness, Lady Kidron, hoped that the Government might also introduce amendments to require data controllers to apply a higher standard of protection to children’s data than to adults’. The Government have considered Amendment 22 carefully, but requiring all data controllers to identify whether any of the personal data they hold relates to children, and to apply a higher standard to it, would place disproportionate burdens on small businesses and other organisations that currently have no way of differentiating age groups.

Although we cannot pursue this amendment as drafted, my understanding of the very helpful conversations that I have had with the noble Baroness, Lady Kidron, is that she intended for this amendment to be aimed at online services directed at or likely to be accessed by children, not to every public body, business or third sector organisation that might process children’s data from time to time.

I reassure noble Lords that the Government are open to exploring a more targeted approach that focuses on those services that the noble Baroness is most concerned about. The age-appropriate design code already applies to such services and we are very open to exploring what further measures could be beneficial to strengthen protection for children’s data. This point was eloquently raised by the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Stevenson, and it is a conversation that we would like to continue. Combined with the steps we are taking in relation to the new ICO duty, which will influence the support and guidance it provides for organisations, we believe this could drive better rates of compliance. I would be very pleased to work with all noble Lords who have spoken on this to try to get this into the right place.

I turn to Amendment 27, tabled by the noble Baroness, Lady Kidron. I agree with her on the importance of protecting children’s rights and interests when undertaking solely automated decision-making. However, we think this amendment, as currently drafted, would cause operational confusion as to when solely automated decision-making can be carried out. Compliance with the reformed Article 22 and the wider data protection legislation will ensure high standards of protection for adults and children alike, and that is what we should pursue.

I now turn to Amendment 39, which would replace the ICO’s children’s duty, and for which I again thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell. As a public body, the ICO must adhere to the UK’s commitment to the UN Convention on the Rights of the Child, and we respectfully submit that it is unnecessary to add further wording of this nature to the ICO’s duty. We believe that government Amendment 40, coupled with the ICO’s principal objective to secure an appropriate level of protection, takes account of the fact that the needs of children might not always look the same.

Finally, to address Amendment 45, the Government believe that the Bill already delivers on this aim. While the new annual regulatory action report in Clause 101 will not break down the activity that relates to children, it does cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. This will deliver greater transparency and accountability on the ICO’s actions. Furthermore, Clause 90 requires the ICO to set out in its annual report how it has complied with its statutory duties. This includes the new duty relating to children.

To conclude, I hope that the amendment we tabled today and the responses I have set out reassure noble Lords of our commitment to protect children’s data. I ask noble Lords to support the amendment tabled in my name, and hope that the noble Baroness, Lady Kidron, feels content to withdraw her own.

Baroness Kidron (CB)

Before the Minister sits down, I have some things to say about his words. I did not hear: “agree to bring forward a government amendment at Third Reading”. Those are the magic words that would help us get out of this situation. I have tried to suggest several times that the Government bring forward their own amendment at Third Reading, drafted in a manner that would satisfy the whole House, with the words of the noble Viscount, Lord Camrose, incorporated and the things that are fundamental.

I very much admire the Minister and enjoy seeing him in his place but I say to him that we have been round this a few times now and a lot of those amendments, while rather nerdy in their obsession, are based on lived experience of trying to hold the regulator and the companies to account for the law that we have already passed. I am seeking those magic words before the Minister sits down.

Lord Vallance of Balham (Lab)

I have likewise enjoyed working with the noble Baroness. As has been said several times, we are all working towards the same thing, which is to protect children. The age-appropriate design code has been a success in that regard. That is why we are open to exploring what further measures can be put in place in relation to the ICO duty, which can help influence and support the guidance to get that into the right place. That is what I would be more than happy to work on with the noble Baroness and others to make sure that we get it right.

19:30
Lord Stevenson of Balmacara (Lab)

I am presuming a little here that the Minister’s lack of experience in the procedures of the House is holding him back, but I know he is getting some advice from his left. The key thing is that we will not be able to discuss this again in this House unless he agrees that he will bring forward an amendment. We do not have to specify today what that amendment will be. It might not be satisfactory, and we might have to vote against it anyway. But the key is that he has to say this now, and the clerk has to nod in agreement that he has covered the ground properly.

We have done this before on a number of other Bills, so we know the rules. If the Minister can do that, we can have the conversations he is talking about. We have just heard the noble Baroness, Lady Kidron, explain in a very graceful way that this will be from a blank sheet of paper so that we can build something that will command the consensus of the House. We did it on the Online Safety Bill; we can do it here. Please will he say those words?

Lord Vallance of Balham (Lab)

I am advised that I should say that I am happy for the amendment to be brought forward, but not as a government amendment. We are happy to hear an amendment from the noble Baroness at Third Reading.

Lord Stevenson of Balmacara (Lab)

Let us be quite clear about this. It does not have to be a government amendment, but the Government Minister has to agree that it can be brought forward.

Lord Clement-Jones (LD)

We take that as a yes.

Lord Stevenson of Balmacara (Lab)

This is a self-governing House.

Baroness Kidron (CB)

I thank the Minister for that very generous offer. I also thank the noble Lord, Lord Stevenson, for his incredible support. I note that, coming from the Government Benches, that is a very difficult thing to do, and I really appreciate it. On the basis that we are to have an amendment at Third Reading, whether written by me with government and opposition help or by the Government, that will address these fundamental concerns set out by noble Lords, I will not press this amendment today.

These are not small matters. The implementation of the age-appropriate design code depends on some of the things being resolved in the Bill. There is no equality of arms here. A child, whether five or 15, is no match for the billions of dollars spent hijacking their attention, their self-esteem and their body. We have to, in these moments as a House, choose David over Goliath. I thank the Minister and all the supporters in this House, the “Lords tech team”, as we have been called in the press. With that, I beg leave to withdraw the amendment.

Amendment 15 withdrawn.
Clause 70: Lawfulness of processing
Amendment 16 not moved.
19:34
Consideration on Report adjourned until not before 8.14 pm.

Data (Use and Access) Bill [HL]

Report (1st Day) (Continued)
20:14
The Deputy Speaker (Baroness Morris of Bolton) (Con)

I call on the noble Lord, Lord Clement-Jones, to speak to Amendment 17.

Amendment 17

Moved by
17: Clause 70, page 78, leave out lines 9 to 30
Member’s explanatory statement
This amendment removes powers for the Secretary of State to override primary legislation and modify key aspects of UK data protection law via statutory instrument.
Lord Clement-Jones (LD)

I apologise for interrupting the Minister, in what sounded almost like full flow. I am sure he was eager to move his amendment.

In moving Amendment 17, I will speak also to Amendment 21. Both aim to remove the Secretary of State’s power to override primary legislation and modify key aspects of UK data protection law via statutory instrument; the powers concerned are those in Clauses 70(4) and 71(5). The amendments are similar to those I proposed to the previous Government’s Data Protection and Digital Information Bill, which the noble Baroness, Lady Jones of Whitchurch, then in opposition, supported.

There are a number of reasons to support accepting these amendments. The Delegated Powers and Regulatory Reform Committee has expressed concerns about the broad scope of the Secretary of State’s powers, as it did previously in relation to the DBS scheme. It recommended removing the power from the previous Bill, and in its ninth report it maintains this view for the current Bill. The Constitution Committee has said likewise; I will not read out what it said at the time, but I think all noble Lords know that both committees were pretty much on the same page.

The noble Baroness, Lady Jones, on the previous DPDI Bill, argued that there was no compelling reason for introducing recognised legitimate interests. On these Benches, we agree. The existing framework already allows for data sharing with the public sector and data use for national security, crime detection and safeguarding vulnerable individuals. However, the noble Baroness, in her ministerial capacity, argued that swift changes might be needed—hence the necessity for the Secretary of State’s power. Nevertheless, the DPRRC’s view is that the grounds for the lawful processing of personal data are fundamental and should not be subject to modification by subordinate legislation.

The letter from the Minister, the noble Lord, Lord Vallance, to the Constitution Committee and the DPRRC pretty much reiterates those arguments. I will not go through all of it again, but I note, in closing, that in his letter he said:

“I hope it will reassure the Committee that the power will be used only when necessary and in the public interest”.

He could have come forward with an amendment to that effect at any point in the passage of the Bill, but he has not. I hope that, on reflection—in the light of both committees’ repeated recommendations, the potential threats to individual privacy and data adequacy, and the lack of strong justification for these powers—the Minister will accept these two amendments. I beg to move.

The Deputy Speaker (Baroness Morris of Bolton) (Con)

My Lords, I must inform the House that if Amendment 17 is agreed to, I cannot call Amendment 18 for reasons of pre-emption.

Viscount Camrose (Con)

My Lords, I thank the noble Lord, Lord Clement-Jones, for raising these significant issues. While I share some of the concerns expressed, I find myself unable—at least for the moment—to offer support for the amendments in their current form.

Amendment 17 seeks to remove the powers granted to the Secretary of State to override primary legislation and to modify aspects of UK data protection law via statutory instrument. I agree with the principle underpinning this amendment: that any changes to data protection law must be subject to appropriate scrutiny. It is essential that parliamentary oversight remains robust and meaningful, particularly when it comes to matters as sensitive and far-reaching as data protection.

However, my hesitation lies in the practical implications of the amendment. While I sympathise with the call for greater transparency, I would welcome more detail on how this oversight mechanism might work in practice. Would it involve enhanced scrutiny procedures or a stronger role for relevant parliamentary committees? I fear that, without this clarity, we risk creating uncertainty in an area that requires, above all, precision and confidence.

The Minister’s Amendment 18 inserts specific protections for children’s personal data into the UK GDPR framework. The Government have rightly emphasised the importance of safeguarding children in the digital age. I commend the intention behind the amendment and agree wholeheartedly that children deserve special protections when it comes to the processing of their personal data.

It is worth noting that this is a government amendment to their own Bill. While Governments amending their own legislation is not unprecedented—the previous Government may have indulged in the practice from time to time—it is a practice that can give rise to questions. I will leave my comments there; obviously it is not ideal, but these things happen.

Finally, Amendment 21, also tabled by the noble Lord, Lord Clement-Jones, mirrors Amendment 17 in seeking to curtail the Secretary of State’s powers to amend primary legislation via statutory instrument. My earlier comments on the importance of parliamentary oversight apply here. As with Amendment 17, I am of course supportive of the principle. The delegation of such significant powers to the Executive should not proceed without robust scrutiny. However, I would appreciate greater clarity on how this proposed mechanism would function in practice. As it stands, I fear that the amendment raises too many questions. If these concerns could be addressed, I would be most grateful.

In conclusion, these amendments raise important points about the balance of power between the Executive and Parliament, as well as the protection of vulnerable individuals in the digital sphere. I look forward to hearing more detail and clarity, so that we can move forward with confidence.

Lord Vallance of Balham (Lab)

My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.

Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.

In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23.1 of the UK GDPR.

The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.

I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23.1 of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for that considered reply. It went into more detail than the letter he sent to the two committees, so I am grateful for that, and it illuminated the situation somewhat. But at the end of the day, the Minister is obviously intent on retaining the regulation-making power.

I thank the noble Viscount, Lord Camrose, for his support—sort of—in principle. I am not quite sure where that fitted; it was post-ministerial language. I think he needs to throw off the shackles of ministerial life and live a little. These habits die hard but in due course, he will come to realise that there are benefits in supporting amendments that do not give too much ministerial power.

Turning to one point of principle—I am not going to press either amendment—it is a worrying trend that both the previous Government and this Government seem intent on simply steamrollering through powers for Secretaries of State in the face of pretty considered comment by House of Lords committees. This trend has been noted, first for skeletal Bills and secondly for Bills that, despite not being skeletal, include a lot of regulation-making power for Secretaries of State, and Henry VIII powers. So I just issue a warning that we will keep returning to this theme and we will keep supporting and respecting committees of this House, which spend a great deal of time scrutinising secondary legislation and warning of overweening executive power. In the meantime, I beg leave to withdraw Amendment 17.

Amendment 17 withdrawn.
Amendment 18
Moved by
18: Clause 70, page 78, line 23, after “children” insert “merit specific protection with regard to their personal data because they”
Member's explanatory statement
This amendment adds an express reference to children meriting specific protection with regard to their personal data in new paragraph 8(b) of Article 6 of the UK GDPR (lawful processing: recognised legitimate interests). See also the amendment in my name to Clause 90, page 113, line 20.
Amendment 18 agreed.
Schedule 4: Lawfulness of processing: recognised legitimate interests
Amendment 19 not moved.
Clause 71: The purpose limitation
Amendments 20 and 21 not moved.
Amendment 22 not moved.
Clause 75: Fees and reasons for responses to data subjects’ requests about law enforcement processing
Amendment 23 not moved.
Clause 77: Information to be provided to data subjects
Amendment 24
Moved by
24: Clause 77, page 91, line 16, at end insert—
“(ia) after point (d), insert—
“(e) the personal data is from the Open Electoral Register. When personal data from the Open Electoral Register is combined with personal data from other sources to build a profile for direct marketing then transparency obligations must be fulfilled at the point the individual first provides the additional personal data to a data provider. Additional transparency must be provided by organisations using the data for direct marketing via their privacy policy and by including a data notification in a direct mail pack.””
Baroness Harding of Winscombe (Con)

My Lords, I will speak to Amendment 24 in my name and in the names of the noble Lords, Lord Clement-Jones and Lord Stevenson, and my noble friend Lord Black of Brentwood, all of whom I want to thank for their support. I also welcome government Amendment 49.

Amendment 24 concerns the use of the open electoral register, an issue we debated last year in considering the Data Protection and Digital Information Bill, and through the course of this Bill. Noble Lords may think this a small, technical and unimportant issue—certainly at this time of the evening. I have taken it on because it is emblematic of the challenge we face in this country in growing our economy.

Everyone wants strong economic growth. We know that the Government do. We know that the Chancellor has been challenging all regulators to come up with ideas to create growth. This is an example of a regulator hampering growth, and we in this House have an opportunity to do something about it. Those of us who have run businesses know that often, it is in the detail of the regulation that the dead hand of the state does its greatest damage. Because each change is very detailed and affects only a tiny part of the economy, the changes get through the bureaucracy unnoticed and quietly stifle growth. This is one of those examples.

20:30
Forgive me at this late hour, but I need to go into the detail. I thank the Minister, his officials and the Bill team for engaging with me between Committee and Report, and the Data & Marketing Association for the briefings I have had. Together, we have narrowed down the area of debate. However, as there is still quite substantial disagreement, I must set out what is quite a technical issue, for which I apologise. The sharing of publicly available data sources with the private sector to achieve economic growth and improve productivity and public services has been well established for decades. Data sources such as the open electoral register, the register of companies, the register of judgments, orders and fines, the land registry and the Food Standards Agency register have been successfully shared for decades, long before the digital era and long before the current Government’s priority of increasing the sharing of public sector data with the private sector to achieve benefits for the nation.

All these data sources are established by law and governed under strict regulations clearly identifying the uses to which they can be put and the notifications that must be made to individuals. The open electoral register is by far the most important and was always intended for use in direct marketing. Use of the OER for direct marketing is a well-known and well-understood use of personal data that has operated with limited harm and almost no complaint for over 40 years. That is why this amendment is strictly limited to the use of the OER. It is designed to enable direct marketing companies to continue to use the OER data to undertake direct marketing activity, as they have done for many decades.

Direct marketing firms believe that they will no longer be able to use this OER data to conduct direct marketing campaigns unless they contact all citizens on the OER to tell them in advance that they are about to use the OER data. This may seem crazy. I do not think that any consumer is asking for more unsolicited letters or emails that are not targeted at them but simply warn them that their data is about to be used. But do not worry—this will not happen. It makes using the OER unaffordable and means that instead, the direct marketing companies will be withdrawing their products from the market, which is the opposite of the economic growth we are looking for. I am afraid that this is not fanciful. One company has already told me that unless it receives reassurance via this Bill or directly from the ICO, it will cease using OER data in the next couple of months.

In Committee, the Minister, the noble Baroness, Lady Jones of Whitchurch, was clear that using OER data on its own would not require prior notification; so far, so good. However, she was less reassuring where OER data is combined with any other data; she implied that notification of intended use would then be required. Let me give your Lordships a practical example. I think this means that, if I want to send direct marketing to everyone in a particular postcode area who is on the open electoral register, that is fine. However, if I want to combine the OER data with additional data—for example, telling me who in that area likes jazz music, in order to market a jazz concert at the local music venue—I would in theory have to pre-notify that I intend to combine the OER data with the additional data to create a tailored campaign. That is before I actually send the direct marketing. It is really hard to see how that is good for anyone. It does not seem to be proportionate. While you could argue that it increases transparency, in practice it will mean that people receive more untargeted communication or, worse, a very strange communication about potential communication. I can see I am losing my audience already.

My amendment seeks to make explicit what good direct marketing best practice actually is in terms of transparency and proportionality. If a direct marketing provider combines open electoral register data with other data in order to create a piece of targeted direct marketing, they must fulfil their transparency obligations when they first acquire the non-OER personal data, so that consumers can exercise their data rights over that data as well as the OER data. They must be further transparent that they are combining data from the OER with other data sources, via their privacy policy and by including a data notification in their direct mail pack. This would enable the consumer to know what is being done with their data and who to contact should they wish to exercise their other data rights, but not to be bombarded with notifications without receiving any direct marketing. It would also allow direct marketing companies to continue to make use of the OER in producing targeted direct marketing campaigns for clients, as they have been doing for decades.

I know that the Government are concerned about ensuring transparency. The right to transparency in the processing of personal data is fundamental and foundational to GDPR. It is essential to affording data subject autonomy and to achieving the purpose of the GDPR: that a person should have control of their own personal data. Most importantly, it is critical in establishing trust between an organisation and its customers—the most critical component of brand loyalty and economic success. The importance of transparency is not in dispute. The point I am making is that transparency already exists in the system, in the way the system has operated for many decades: by companies adhering to the best practice I have just set out. Officials working on the Bill team have been very generous with their time, but I have not, to date, been able to discuss these issues with the ICO.

It is possible that the Government and the ICO believe that the current processes are not transparent enough—that perhaps it is not clear to the general public that when they consent to their data remaining on the open electoral register, it is likely to be combined with other data in order to create the targeted marketing campaigns we know consumers want. I am not aware of any evidence of that, but if there is, I would argue that the right approach to resolving this is to improve the privacy notices that local authorities place on the open electoral register, which are inconsistent and over which the industry does not actually have any control. This would be a substantially lower cost and more efficient way of meeting the principles of transparency.

If the Minister is unable to accept my simple amendment, will he ask the ICO to develop, in collaboration with the Data & Marketing Association, detailed guidance for local authorities to improve their privacy notices, and detailed guidance for direct marketing firms that clarifies that the processes set out in my amendment constitute an acceptable and proportionate means of complying with transparency obligations under GDPR?

In conclusion, as important as transparency is, another fundamental principle of data protection is proportionality. Proportionality is an overarching principle of GDPR—indeed, of all data protection legislation. My amendment is necessary because the regulator’s approach to transparency in the use of data from the open electoral register for direct marketing purposes is disproportionate, taking into account the high levels of existing transparency, the benefits of the processing and the individual’s desire, because they have opted in to the OER, to actually receive relevant and tailored direct marketing.

I apologise again for all the detail, but this is how we create economic growth: by preventing regulators stifling activity such as this. I beg to move.
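
To make the mechanics concrete, the profiling described above amounts to something like the following sketch. All names, fields and data are invented for illustration; nothing here is drawn from the Bill, the amendment or ICO guidance.

```python
# Open electoral register (OER) records: publicly available by design.
oer = [
    {"name": "A. Smith", "postcode": "SW1A 1AA"},
    {"name": "B. Jones", "postcode": "SW1A 2BB"},
]
# Interest data collected separately, with transparency given at collection.
interests = {"A. Smith": ["jazz"]}

def campaign_list(target_interest):
    # Under the amendment's best-practice model, transparency duties attach
    # when the interest data is first collected, and again via the privacy
    # policy and a notification enclosed in the mail pack, rather than via
    # a separate advance mailing to everyone on the OER.
    return [p for p in oer if target_interest in interests.get(p["name"], [])]

print(campaign_list("jazz"))  # only A. Smith receives the jazz mailing
```
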
Lord Clement-Jones (LD)

My Lords, I do not think the noble Baroness, Lady Harding, lost the audience at all; she made an excellent case. Before speaking in support of the noble Baroness, I should say, “Blink, and you lose a whole group of amendments”. We seem to have completely lost sight of the group starting with Amendment 19—I know the noble Lord, Lord Holmes, is not here—and including Amendments 23, 74 and government Amendment 76, which seems to have been overlooked. I suggest that we degroup next week and come back to Amendments 74 and 76. I do not know what will happen to Amendment 23; I am sure there is a cunning plan on the Opposition Front Bench to reinstate that in some shape or form. I just thought I would gently point that out, since we are speeding along and forgetting some of the very valuable amendments that have been tabled.

I very much support, as I did in Committee, what the noble Baroness, Lady Harding, said about Amendment 24, which aims to clarify the use of open electoral register data for direct marketing. The core issue is the interpretation of Article 14 of the GDPR, specifically regarding the disproportionate effort exemption. The current interpretation, influenced by recent tribunal rulings, suggests that companies using open electoral register—OER—data would need to notify every individual whose data is used, even if they have not opted out. As the noble Baroness, Lady Harding, implied, notifying millions of individuals who have not opted out is unnecessary and burdensome. Citizens are generally aware of the OER system, and those who do not opt out reasonably expect to receive direct marketing materials. The current interpretation leads to excessive, unhelpful notifications.

There are issues about financial viability. Requiring individual notifications for the entire OER would be financially prohibitive for companies, potentially leading them to cease using the register altogether. On respect for citizens’ choice, around 37% of voters choose not to opt out of OER use for direct marketing, indicating their consent to such use. The amendment upholds this choice by exempting companies from notifying those individuals, which aligns with the GDPR’s principle of respecting data subject consent.

On clarity and certainty, Amendment 24 provides clear exemptions for OER data use, offering legal certainty for companies while maintaining data privacy and adequacy. This addresses the concerns about those very important tribunal rulings creating ambiguity and potentially disrupting legitimate data use. In essence, Amendment 24 seeks to reconcile the use of OER data for direct marketing with the principles of transparency and data subject rights. On that basis, we on these Benches support it.

I turn to my amendment, which seeks a soft opt-in for charities. As we discussed in Committee, the soft opt-in in Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 allows organisations to send electronic mail marketing to existing customers without their consent, provided that the communication is for similar products and services and the messages include an “unsubscribe” link. The soft opt-in currently does not apply to non-commercial organisations such as charities and membership organisations. The Data & Marketing Association estimates that extending the soft opt-in to charities would

“increase … annual donations in the UK by £290 million”.

Extending the soft opt-in as proposed in both the Minister’s and my amendment would provide charities with a level playing field, as businesses have enjoyed this benefit since the introduction of the Privacy and Electronic Communications Regulations. Charities across the UK support this change. For example, the CEO of Mind stated:

“Mind’s ability to reach people who care about mental health is vital. We cannot deliver life changing mental health services without the financial support we receive from the public”.

Oxfam’s individual engagement director noted:

“It’s now time to finally level the playing field for charities too and to allow them to similarly engage their passionate and committed audiences”.

Topically, too, this amendment is crucial to help charities overcome the financial challenges they face due to the cost of living crisis and the recent increase in employer national insurance contributions. So I am delighted, as I know many other charities will be, that the Government have proposed Amendment 49, which achieves the same effect as my Amendment 50.

Lord Stevenson of Balmacara (Lab)

My Lords, I declare an interest: my younger daughter works for a charity that will rely heavily on the amendments just discussed by the noble Lord, Lord Clement-Jones.

I want to explain that my support for the amendment moved by the noble Baroness, Lady Harding, was not inspired by any quid pro quo for earlier support elsewhere; certainly not. Looking through the information she had provided, and thinking about the issue and what she said in her speech today, it seemed to me that there was an obvious injustice happening. It seemed wrong, in a period when we are trying to support growth, that we could not see our way through it. It was in that spirit that I suggested we should push on with it and bring it back on Report, and I am very happy to support it.

Lord Davies of Brixton (Lab)

I do not want to try the patience of the House at this late hour. I am unhappy about Clause 77 as a whole. Had I had the opportunity, we could have debated it in Committee; unfortunately, I was double-booked, so I was unable to do so. Now we are on Report, which does not really provide a platform for discussing the exclusion of the clause.

However, the noble Baroness has provided an opportunity for me to make the point that combining data is the weak point, the point at which we lose control. For that reason, I am unhappy about this amendment. We need to keep high levels of vigilance with regard to the ability to take data from one area and apply it in another, because that is when personal privacy disappears.

20:45
Viscount Camrose (Con)

My Lords, as we reach the end of this important group, I thank particularly my noble friend Lady Harding for her contribution and detailed account of some of the issues being faced, which I found both interesting and valuable. I thought the example about the jazz concert requiring the combination of those different types of data was very illuminating. These proposed changes provide us with the opportunity to balance economic growth carefully with the fundamental right to data privacy, ensuring that the Bill serves all stakeholders fairly.

Amendment 24 introduces a significant consideration regarding the use of the open electoral register for direct marketing purposes. The proposal to include data from the OER, combined with personal data from other sources, to build marketing profiles creates a range of issues that require careful consideration.

Amendment 24 stipulates that transparency obligations must be fulfilled when individuals provide additional data to a data provider, and that this transparency should be reflected both in the privacy policy and via a data notification in a direct mail pack. While there is certainly potential to use the OER to enhance marketing efforts and support economic activity, we have to remain vigilant to the privacy implications. We need to make sure that individuals are informed of how and where their OER data is being processed, especially when it is combined with other data sources to build profiles.

The requirement for transparency is a positive step, but it is essential that these obligations are fully enforced and that individuals are not left in the dark about how their personal information is being used. I hope the Minister will explain a little more about how these transparency obligations will be implemented in practice and whether additional safeguards are proposed.

Amendment 49 introduces a change to Regulation 22, creating an exception for charities to use electronic mail for direct marketing in specific circumstances. This amendment enables charities to send direct marketing emails when the sole purpose is to further one or more of their charitable purposes, provided that certain conditions are met. These conditions include that the charity obtained the recipient’s contact details when the individual expressed interest in the charity or previously offered support to the charity. This provision recognises the role of charities in fundraising and that communicating with volunteers, supporters and potential donors is vital to their work.

However, I understand the argument that we must ensure that the use of email marketing does not become intrusive or exploitative. The amendment requires that recipients are clearly informed about their right to refuse future marketing communications and that this option is available both when the data is first collected and with every subsequent communication. This helps strike the right balance between enabling charities to raise funds for their causes and protecting individuals from unwanted marketing.

I welcome the Government’s commitment to ensuring that charities continue to engage with their supporters while respecting individuals’ right to privacy. However, it is essential that these safeguards are robustly enforced to prevent exploitation. Again, I look forward to hearing from the Minister on how the Government plan to ensure that their provisions will be properly implemented and monitored.

Amendment 50 introduces the concept of soft opt-ins for email marketing by charities, allowing them to connect with individuals who have previously expressed interest in their charitable causes. This can help charities maintain and grow their supporter base but, again, we must strike the right balance with the broader impact this could have on people in receipt of this correspondence. It is crucial that any system put in place respects individuals’ right to privacy and their ability to opt out easily. We must ensure that charities provide a clear, simple and accessible way for individuals to refuse future communications, and that this option is consistently available.

Finally, we should also consider the rules governing the use of personal data by political parties. This is, of course, an area where we must ensure that transparency, accountability and privacy are paramount. Political parties, like any other organisation, must be held to the highest standards in their handling of personal data. I hope the Government can offer some clear guidance on improving and strengthening the rules surrounding data use by political parties to ensure that individuals’ rights are fully respected and protected.

Lord Vallance of Balham (Lab)

My Lords, I now turn to government Amendment 49. I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising the concerns of the charity sector during earlier debates. The Government have also heard from charities and trade associations directly.

This amendment will permit charities to send marketing material—for example, promoting campaigns or fundraising activities—to people who have previously expressed an interest in their charitable purposes, without seeking express consent. Charities will have to provide individuals with a simple means of opting out of receiving direct marketing when their contact details are collected and with every subsequent message sent. The current soft opt-in rule for marketing products and services has similar requirements.
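
The conditions summarised here reduce to a simple conjunction. The sketch below is illustrative only, with invented parameter names; it is not the legal text of the amendment.

```python
def may_email(charity_purpose_only, details_from_prior_interest,
              opt_out_offered_at_collection, opt_out_in_every_message,
              has_opted_out):
    # Conditions for the charity soft opt-in, as summarised in the debate:
    # marketing must further the charity's purposes, contact details must
    # come from prior interest or support, an opt-out must be offered at
    # collection and in every message, and no opt-out has been exercised.
    return (charity_purpose_only
            and details_from_prior_interest
            and opt_out_offered_at_collection
            and opt_out_in_every_message
            and not has_opted_out)
```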

Turning to Amendment 24, I am grateful to the noble Baroness, Lady Harding, for our discussions on this matter. As was said in the debate in Grand Committee, the Government are committed to upholding the principles of transparency. I will try to outline some of that.

I understand that this amendment is about data brokers buying data from the open electoral register and combining it with data they have collected from other sources to build profiles on individuals with the intention of selling them for marketing. Despite what was said in the last debate on this, I am not convinced that all individuals registering on the open electoral register would reasonably expect this kind of profiling or invisible processing using their personal data. If individuals are unaware of the processing, this undermines their ability to exercise their other rights, such as to object to the processing. That point was well made by the noble Lord, Lord Davies.

With regard to the open electoral register, the Government absolutely agree that there are potential benefits to society through its use—indeed, economic growth has been mentioned. Notification is not necessary in all cases. There is, for example, an exemption if notifying the data subject would involve a disproportionate effort and the data was not collected directly from them. The impact on the data subject must be considered when assessing whether the effort is disproportionate. If notification is proportionate, the controller must notify.

The ICO considers that the use and sale of open electoral register data alone is unlikely to require notification. As was set out in Committee, the Government believe that controllers should continue to assess on a case-by-case basis whether cases meet the conditions for the existing disproportionate effort exemption. Moreover, I hope I can reassure the noble Baroness that in the event that the data subject already has the information—from another controller, for example—another exemption from notification applies.
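
Taken together, the exemptions set out here have roughly the following shape. This is a much-simplified sketch: the real Article 14 assessment is case by case and weighs the impact on the data subject.

```python
def notification_required(collected_from_subject,
                          subject_already_has_info,
                          disproportionate_effort):
    if collected_from_subject:
        return False  # Article 13 transparency duties apply instead
    if subject_already_has_info:
        return False  # exemption: information already provided, e.g. by another controller
    if disproportionate_effort:
        return False  # exemption: assessed case by case, weighing impact on the subject
    return True
```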

The Government therefore do not see a case for a new exemption for this activity, but as requested by the noble Baroness, Lady Harding, I would be happy to facilitate further engagement between the industry and the ICO to improve a common understanding of how available exemptions are to be applied on a case-by-case basis. I understand that the ICO will use the Bill as an opportunity to take stock of how its guidance can address particular issues that organisations face.

Amendment 50, tabled by the noble Lord, Lord Clement-Jones, seeks to achieve a very similar thing to the government amendment and we studied it when designing our amendment. The key difference is that the government amendment defines which organisations can rely on the new measure and for what purposes, drawing on definitions of “charity” and “charitable purpose” in relevant charities legislation.

I trust that the noble Lord will be content with this government amendment and will feel content not to press his own.

Baroness Harding of Winscombe (Con)

Before the Minister sits down, can I follow up and ask a question about invisible processing? I wonder whether he considers that a better way of addressing potential concerns about invisible processing is improving the privacy notices when people originally sign up for the open electoral register. That would mean making it clear how your data could be used when you say you are happy to be on the open electoral register, rather than creating extra work and potentially confusing communication with people after that. Can the Minister confirm that that would be in scope of potential options and further discussions with the ICO?

Lord Vallance of Balham (Lab)

The further discussions with the ICO are exactly to try to get to these points about the right way to do it. It is important that people know what they are signing up for, and it is equally important that they are aware that they can withdraw at any point. Those points obviously need to be discussed with the industry to make sure that everyone is clear about the rules.

Baroness Harding of Winscombe (Con)

I thank noble Lords for having humoured me in the detail of this debate. I am very pleased to hear that response from the Minister and look forward to ongoing discussions with the ICO and the companies involved. As such, I beg leave to withdraw my amendment.

Amendment 24 withdrawn.
Amendment 25 not moved.
Clause 80: Automated decision-making
Amendment 26
Moved by
26: Clause 80, page 94, line 24, at end insert—
“3. When an automated decision-making process involves artificial intelligence (AI), the AI programme must have due regard for the following principles—
(a) safety, security, and robustness;
(b) appropriate transparency and explainability;
(c) fairness;
(d) accountability and governance;
(e) contestability and redress.”
Member's explanatory statement
This amendment inserts the five principles from the “A pro-innovation approach to AI regulation” White Paper, ensuring AI programmes used in automated decision-making have due regard for safety, security, robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress.
Viscount Camrose (Con)

My Lords, I rise to speak to Amendments 26, 31 and 32 tabled in my name and that of my noble friend Lord Markham. I will address the amendments in reverse order.

Amendment 32 would ensure that, where a significant decision is taken by ADM, the data subject was able to request intervention by a human with sufficient competency and authority. While that is clearly the existing intent of the ADM provisions in the Bill, this amendment brings further clarity. I am concerned that, where data processors update their ADM procedures in the light of this Bill, it should be abundantly clear to them at every stage what the requirements are and that, as currently written, there may be a risk of misunderstanding. Given the significance of decisions that may be made by ADM, we should make sure this does not happen. Data subjects must have recourse to a person who both understands their problem and is able to do something about it. I look forward to hearing the Minister’s views on this.

Amendment 31 would require the Secretary of State to provide guidance on how consent should be obtained for ADM involving special category data. It would also ensure that this guidance was readily available and reviewed frequently. The amendment would provide guidance for data controllers who wish to use ADM, helping them to set clear processes for obtaining consent, thus avoiding complaints and potential litigation.

We all know that litigation can be slow, disruptive and sometimes prohibitively expensive. If we want to encourage the use of ADM so that customers and businesses can save both time and money, we should seek to ensure that the sector does not become a hotbed of litigation. The risk can be mitigated by providing ample guidance for the sector. For relatively minimal effort on the part of the Secretary of State, we may be able to facilitate substantial growth in the use and benefits of ADM. I would be most curious to hear the Minister’s opinions on this matter and, indeed, the opinions of noble Lords more broadly.

Amendment 26 would insert the five principles set out in the AI White Paper published by the previous Government, requiring all data controllers and processors who partake in AI-driven ADM to have due regard for them. In the event that noble Lords are not familiar with these principles, they are: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.

These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real, and popular, safeguards against the risks of AI while continuing to foster innovation.

21:00
The risks posed by AI go far beyond data protection. Although the data protection principles covered in the Bill have significant overlap with the AI principles, it is possible, as the Bill stands, for ADM activities to adhere to data protection principles but violate the AI principles. Since, at this point, we know little of the Government’s planned AI legislation—I understand it to be focused on the activities of the main AI labs—for a great proportion of AI activity before that Bill is passed, and possibly after, the ADM provisions in this Bill may represent the entirety of our AI-specific laws in this country.
I should add that prescriptive regulation for AI is extremely challenging, as the technology is evolving at an astonishing rate; what may regulate the specifics of LLMs today may come to look like regulations for Betamax videocassettes tomorrow. Principles have a timeless and adaptive quality. That is why it will be so valuable to have them set out in the Bill. This is not a prescriptive approach; it is one that specifies the outcomes we want and gives agency to those best placed to bring them to fruition.
I strongly commend this amendment to the House, and I am minded to test the opinion of the House.
Lord Clement-Jones (LD)

My Lords, I will speak to Amendments 28, 29, 33, 34 and 36. I give notice that I will only speak formally to Amendment 33. For some reason, it seems to have escaped this group and jumped into the next one.

As we discussed in Committee, and indeed on its previous versions, the Bill removes the general prohibition on solely automated decisions and places the responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. The Bill also amends Article 22 of the GDPR so that protection against solely automated decision-making applies only to decisions made using sensitive data such as race, religion and health data. This means that decisions based on other personal data, such as postcode, nationality, sex or gender, would be subject to weaker safeguards, increasing the risk of unfair or discriminatory outcomes. This will allow more decisions with potentially significant impacts to be made without human oversight, even if they do not involve sensitive data. This represents a significant weakening of existing protection against unsafe automated decision-making. That is why I tabled Amendment 33 to leave out the whole clause.

However, the Bill replaces the existing Article 22 with Articles 22A to 22D, which redefine automated decisions and allow for solely automated decision-making in a broader range of circumstances. This change raises concerns about transparency and the ability of individuals to challenge automated decisions. Individuals may not be notified about the use of ADM, making it difficult to exercise their rights. Moreover, the Bill’s safeguards for automated decisions, particularly in the context of law enforcement, are weaker compared with the protections offered by the existing Article 22. This raises serious concerns about the potential for infringement of people’s rights and liberties in areas such as policing, where the use of sensitive data in ADM could become more prevalent. Additionally, the lack of clear requirements for personalised explanations about how ADM systems reach decisions further limits individuals’ understanding of and ability to challenge outcomes.

In the view of these Benches, the Bill significantly weakens safeguards around ADM, creates legal uncertainty due to vague definitions, increases the risk of discrimination, and limits transparency and redress for individuals—ultimately undermining public trust in the use of these technologies. I retabled Amendments 28, 29, 33 and 34 from Committee to address continuing concerns regarding these systems. The Bill lacks clear definitions of crucial terms such as “meaningful human involvement” and, similarly, “significant effect”, which are essential for determining the scope of protection. That lack of clarity could lead to varying interpretations and inconsistencies in application, creating legal uncertainty for individuals and organisations.

In Committee, the noble Baroness, Lady Jones, emphasised the Government’s commitment to responsible ADM and argued against defining meaningful human involvement in the Bill, favouring instead allowing the Secretary of State to define those terms through delegated legislation. However, that raises concerns about transparency and parliamentary oversight, as these are significant policy decisions. Predominantly automated decision-making should be included in Clause 80, as proposed in Amendment 28, because such a decision may lack meaningful human involvement and significantly impact individuals’ rights. The assertion by the noble Baroness, Lady Jones, that predominantly automated decisions inherently involve meaningful human oversight can be contested, particularly given the lack of a clear definition of such involvement in the Bill.

There are concerns that changes in the Bill will increase the risk of discrimination, especially for marginalised groups. The noble Baroness, Lady Jones, asserted in Committee that the data protection framework already requires adherence to the Equality Act. However, that is not enough to prevent algorithmic bias and discrimination in ADM systems. There is a need for mandatory bias assessments of all ADM systems, particularly those used in the public sector, as well as for greater transparency in how those systems are developed and deployed.

We have not returned to the fray on the ATRS, but it is clear that a statutory framework for the ATRS is necessary to ensure its effectiveness and build trust in public sector AI. Despite the assurance by the noble Baroness, Lady Jones, that the ATRS is mandatory for government departments, its implementation relies on a cross-government policy mandate that lacks statutory backing and may prove insufficient to ensure the consistent and transparent use of algorithmic tools.

My Amendment 34 seeks to establish requirements for public sector organisations using ADM systems. Its aim is to ensure transparency and accountability in the use of these systems by requiring public authorities to publish details of the systems they use, including the purpose of the system, the data used and any mitigating measures to address risks. I very much welcome Amendment 35 from the noble Baroness, Lady Freeman, which would improve it considerably and which I have also signed. Will the ATRS do as good a job as that amendment?

Concerns persist about the accessibility and effectiveness of this mechanism for individuals seeking redress against potentially harmful automated decisions. A more streamlined and user-friendly process for challenging automated decisions is needed in the age of increasing ADM. The lack of clarity and specific provisions in the Bill raises concerns about its effectiveness in mitigating the risks posed by automated systems, particularly in safeguarding vulnerable groups such as children.

My Amendment 36 would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the Information Commissioner’s Office, or to clearly set out their reasoning as to why that is not required within six months of the Act passing. The amendment is aimed at addressing the ambiguity surrounding “meaningful human involvement” and ensuring that there is a clear understanding of what constitutes appropriate human oversight in ADM processes.

I am pleased that the Minister has promised a code of practice, but what assurance can he give regarding the forthcoming ICO code of practice about automated decision-making? How will it provide clear guidance on how to implement and interpret the safeguards for ADM, and will it address the definition of meaningful human involvement? What forms of redress will it require to be established? What level of transparency will be required? A code of conduct offered by the Minister would be acceptable, provided that the Secretary of State did not have the sole right to determine the definition of meaningful human involvement. I therefore hope that my Amendment 29 will be accepted alongside Amendment 36, because it is important that the definition of such a crucial term should be developed independently, and with the appropriate expertise, to ensure that ADM systems are used fairly and responsibly, and that individual rights are adequately protected.

Amendments 31 and 32 from the Opposition Front Bench seem to me to have considerable merit, particularly Amendment 32, in terms of the nature of the human intervention. However, I confess to some bafflement as to the reasons for Amendment 26, which seeks to insert the OECD principles set out in the AI White Paper. Indeed, they were the G20 principles as well, and they are fully supportable in the context of an AI Bill; I very much hope that they will form Clause 1 of a new AI Bill in due course. I am not going to go into great detail, but I wonder whether those principles are already effectively addressed in data protection legislation. If we are not careful, we are going to find a very confused regulator in these circumstances. So, although there is much to commend the principles as such, whether they are a practical proposition in a Bill of this nature is rather moot.

Baroness Freeman of Steventon (CB)

My Lords, I support Amendment 34 from the noble Lord, Lord Clement-Jones, and will speak to my own Amendment 35, which amends it. When an algorithm is being used to make important decisions about our lives, it is vital that everyone is aware of what it is doing and what data it is based on. On Amendment 34, I know from having had responsibility for algorithmic decision support tools that users are very interested in how recent the data it is based on is, and how relevant it is to them. Was the algorithm derived from a population that included people who share their characteristics? Subsection (1)(c)(ii) of the new clause proposed in Amendment 34 refers to regular assessment of the data used by the system. I would hope that this would be part of the meaningful explanation to individuals to be prescribed by the Secretary of State in subsection (1)(b).

Amendment 35 would add to this that it is vital that all users and procurers of such a system understand its real-world efficacy. I use the word “efficacy” rather than “accuracy” because it might be difficult to define accuracy with regard to some of these systems. The procurer of any ADM system should want to know how accurate it is using realistic testing, and users should also be aware of those findings. Does the system give the same outcome as a human assessor 95% or 60% of the time? Is that the same for all kinds of queries, or is it more accurate for some groups of people than others? The efficacy is really one of the most important aspects and should be public. I have added an extra line that ensures that this declaration of efficacy would be kept updated. One would hope that the performance of any such system would be monitored anyway, but this ensures that the outcomes of such monitoring are in the public domain.

In Committee, the Minister advised us to wait for publication of the algorithmic transparency records that were released in December. Looking at them, I think they make clear the need for much greater guidance and stringency in what should be mandated. I will give two short examples from those records. For the DBT: Find Exporters algorithm, under “Model performance” it merely says that it uses Brier scoring and other methods, without giving any actual results of that testing to indicate how well it performs. It suggests looking at the GitHub pages. I followed that link, and it did not allow me in. The public have no access to those pages. This is why these performance declarations need to be mandated and forced into the public domain.

In the second example, the Cambridgeshire trial of an externally supplied object detection system just cites the company’s test data, claiming average precision in a “testing environment” of 43.5%. This does not give the user a lot of information. Again, it links to GitHub pages produced by the supplier. Admittedly, this is a trial, so perhaps the Cambridgeshire Partnership will update it with its real-world trial data. But that is why we need to ensure annual updates of performance data and ensure that that data is not just a report of the supplier’s claims in a test environment.

The current model of algorithmic transparency records is demonstrably not fit for purpose, and these provisions would help put them on a much firmer footing. These systems, after all, are making life-changing decisions for all of us and we all need to be sure how well they are doing and put appropriate levels of trust in them accordingly.

Viscount Colville of Culross (CB)

My Lords, I have added my name to Amendment 36 tabled by the noble Lord, Lord Clement-Jones. I also support Amendments 26, 27, 28, 31, 32 and 35. The Government, in their AI Statement last week, said that ADM will be rolled out across the public sector in the coming months and years. It will increase productivity and provide better public services to the people of this country.

However, there are many people who are fearful of their details being taken by an advanced computer, and a decision which could affect their lives being made by that computer. Surely the days of “computer says no” must be over. People need to know that there is a possibility of a human being involved in the process, particularly when dealing with the public sector. I am afraid that my own interactions with public sector software in various government departments have not always been happy ones, and I have been grateful to be able to appeal to a human.

21:15
When it comes to decisions being made by public bodies, safety, transparency and trust must be supreme. Without these assurances, the rollout of ADM across the public sector will flounder. Amendment 36 would allow the Secretary of State to define what meaningful human involvement in these decisions means. I ask the Minister to go further and lay out for your Lordships’ House what might be the defining aspects of human involvement, and put on the record for the House the positive aspects of human involvement.
My slight concern with this amendment is that allowing the Government to define human involvement could allow the definition to be watered down. My fear, and that of many civic institutions involved in this new era of AI, is that it could become a tick-box exercise. The questions which need answering are whether the definition of human involvement will require a human who has the competence to understand how the decision has been made and the authority to review and respond to it, and whether the data subject will have the transparency needed to appeal a decision.
If the House can be given reassurance along these lines, the Minister will go a long way towards allaying the fears of many people in this country. In the process, it would help the rollout of AI across the public sector.
Lord Lucas (Con)

My Lords, I support what the noble Baroness, Lady Freeman, said. Her maiden speech was a forewarning of how good her subsequent speeches would be and how dedicated she is to openness, which is absolutely crucial in this area. We are going to have to get used to a lot of automatic processes and come to consider that they are by and large fair. Unless we are able to challenge them, understand them and see that they have been properly looked after, we are not going to develop that degree of trust in them.

Anyone who has used current AI programs will know about the capacity of AI for hallucination. The noble Lord, Lord Clement-Jones, uses them a lot. I have been looking, with the noble Lord, Lord Saatchi, at how we could use them in this House to deal with the huge information flows we have and to help us understand the depths of some of the bigger problems and challenges we are asked to get a grip on. But AI can just invent things, leaping at an answer that is easier to find, ignoring two-thirds of the evidence and not understanding the difference between reliable and unreliable witnesses.

There is so much potential, but there is so much that needs to be done to make AI something we can comfortably rely on. The only way to get there is to be absolutely open and allow and encourage challenge. The direction pointed out by the noble Lord, Lord Clement-Jones, and, most particularly by the noble Baroness, Lady Freeman, is one that I very much think we should follow.

Baroness Kidron (CB)

My Lords, I will very briefly speak to Amendment 30 in my name. Curiously, it was in the name of the noble Viscount, Lord Camrose, in Committee, but somehow it has jumped.

On the whole, I have always advocated for age-appropriate solutions. The amendment refers to preventing children from consenting to special category data being used in automated decision-making, simply because there are some things to which children should not be able to consent.

I am not sure that this exact amendment is the answer. I hope that the previous conversation that we had before the dinner break will produce some thought about this issue—about how automatic decision-making affects children specifically—and we can deal with it in a slightly different way.

While I am on my feet, I want to say that I was very struck by the words of my noble friend Lady Freeman, particularly about efficacy. I have seen so many things that have purported to work in clinical conditions that have failed to work in the complexity of real life, and I want to associate myself with her words and, indeed, the amendments in her name and that of the noble Lord, Lord Clement-Jones.

Lord Vallance of Balham (Lab)

I start with Amendment 26, tabled by the noble Viscount, Lord Camrose. As he said in Committee, a principles-based approach ensures that our rules remain fit for purpose in the face of fast-evolving technologies by avoiding being overly prescriptive. The data protection framework achieves this by requiring organisations to apply data protection principles when personal data is processed, regardless of the technology used.

I agree with the principles that are present for AI, which are useful in the context in which they were put together, but introducing separate principles for AI could cause confusion around how data protection principles are interpreted when using other technologies. I note the comment that there is a significant overlap between the principles, and the comment from the noble Viscount that there are situations in which one would catch things and another would not. I am unable to see what those particular examples are, and I hope that the noble Viscount will agree with the Government’s rationale for seeking to protect the framework’s technology-neutral set of principles, rather than having two separate sets.

Amendment 28 from the noble Lord, Lord Clement-Jones, would extend the existing safeguards for decisions based on solely automated processing to decisions based on predominantly automated processing. These safeguards protect people when there is no meaningful human involvement in the decision-making. The introduction of predominantly automated decision-making, which already includes meaningful human involvement—and I shall say a bit more about that in a minute—could create uncertainty over when the safeguards are required. This may deter controllers from using automated systems that have significant benefits for individuals and society at large. However, the Government agree with the noble Viscount on strengthening the protections for individuals, which is why we have introduced a definition for solely automated decision-making as one which lacks “meaningful human involvement”.

I thank noble Lords for Amendments 29 and 36 and the important points raised in Committee on the definition of “meaningful human involvement”. This terminology, introduced in the Bill, goes beyond the current UK GDPR wording to prevent cursory human involvement being used to rubber-stamp decisions as not being solely automated. The point at which human involvement becomes meaningful is context specific, which is why we have not sought to be prescriptive in the Bill. The ICO sets out in its guidance its interpretation that meaningful human involvement must be active: someone must review the decision and have the discretion to alter it before the decision is applied. The Government’s introduction of “meaningful” into primary legislation does not change this definition, and we are supportive of the ICO’s guidance in this space.

As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do this, or to advise Parliament or the Government if it considers that the law needs clarification. The Government also acknowledge that there may be a need to provide further legal certainty in future. That is why there are a number of regulation-making powers in Article 22D, including the power to describe meaningful human involvement or to add additional safeguards. These could be used, for example, to impose a timeline on controllers to provide human intervention upon the request of the data subject, if evidence suggested that this was not happening in a timely manner following implementation of these reforms. Any regulations must follow consultation with the ICO.

Amendment 30 from the noble Baroness, Lady Kidron, would prevent law enforcement agencies seeking the consent of a young person to the processing of their special category or sensitive personal data when using automated decision-making. I thank her for this amendment and agree about the importance of protecting the sensitive personal data of children and young adults. We believe that automated decision-making will continue to be rarely deployed in the context of law enforcement decision-making as a whole.

Likewise, consent is rarely used as a lawful basis for processing by law enforcement agencies, which are far more likely to process personal data for the performance of a task, such as questioning a suspect or gathering evidence, as part of a law enforcement process. Where consent is needed—for example, when asking a victim for fingerprints or something else—noble Lords will be aware that Clause 69 clearly defines consent under the law enforcement regime as

“freely given, specific, informed and unambiguous”

and

“as easy … to withdraw … as to give”.

So the tight restrictions on its use will be crystal clear to law enforcement agencies. In summary, I believe the taking of an automated decision based on a young person’s sensitive personal data, processed with their consent, to be an extremely rare scenario. Even when it happens, the safeguards that apply to all sensitive processing will still apply.

I thank the noble Viscount, Lord Camrose, for Amendments 31 and 32. Amendment 31 would require the Secretary of State to publish guidance specifying how law enforcement agencies should go about obtaining the consent of the data subject to process their data. To reiterate a point made by my noble friend Lady Jones in Committee, Clause 69 already provides a definition of “consent” and sets out the conditions for its use; they apply to all processing under the law enforcement regime, not just automated decision-making, so the Government believe this amendment is unnecessary.

Amendment 32 would require the person reviewing an automated decision to have sufficient competence and authority to amend the decision if required. In Committee, the noble Viscount also expressed the view that a person should be “suitably qualified”. Of course, I agree with him on that. However, as my noble friend Lady Jones said in Committee, the Information Commissioner’s Office has already issued guidance which makes it clear that the individual who reconsiders an automated decision must have the “authority and competence” to change it. Consequently, the Government do not feel that it is necessary to add further restrictions in the Bill as to the type of person who can carry out such a review.

The noble Baroness, Lady Freeman, raised extremely important points about the performance of automated decision-making. The Government already provide a range of products, but A Blueprint for Modern Digital Government, laid this morning, makes it clear that part of the new digital centre’s role will be to offer specialist assurance support, including, importantly in relation to this debate,

“a service to rigorously test models and products before release”.

That function will be in place and available to departments.

On Amendments 34 and 35, my noble friend Lady Jones previously advised the noble Lord, Lord Clement-Jones, that the Government would publish new algorithmic transparency recording standard records imminently. I am pleased to say that 14 new records were published on 17 December, with more to follow. I accept that these are not yet in the state in which we would wish them to be. Where these amendments seek to ensure that the efficacy of such systems is evaluated, A Blueprint for Modern Digital Government, as I have said, makes it clear that part of the digital centre’s role will be to offer such support, including this service. I hope that this provides reassurance.

Lord Clement-Jones (LD)

My Lords, before the Minister sits down, I was given considerable assurance between Committee and Report that a code of practice, drawn up with the ICO, would be quite detailed in how it set out the requirements for those engaging in automated decision-making. The Minister seems to have given some kind of assurance that it is possible that the ICO will come forward with the appropriate provisions, but he has not really given any detail as to what that might consist of, or whether it might meet some of the considerations that have been raised in Committee and on Report—not least Amendments 34 and 35, which have just been discussed as if the ATRS were going to cover all of that. Of course, any code would no doubt cover both the public and private sectors. What more can the Minister say about the kind of code that would be expected? We seem to be in somewhat of a limbo in this respect.

Lord Vallance of Balham (Lab)

I apologise; I meant to deal with this at the end. I think I am dealing with the code in the next group.

Lord Clement-Jones (LD)

Well, that is something to look forward to.

21:30
Baroness Freeman of Steventon (CB)

Before the Minister sits down, he said that there will be evaluations of the efficacy of these systems but he did not mention whether those will have to be made public. Can he give me any assurance on that?

Lord Vallance of Balham (Lab)

There is a requirement. Going back to the issue of principles, which was discussed earlier on, one of the existing principles—which I am now trying to locate and cannot—is transparency. I expect that we would make as much of the information public as we can in order to ensure good decision-making and assure people as to how the decisions have been reached.

Viscount Camrose (Con)

I thank all noble Lords and the Minister for their comments and contributions to what has been a fascinating debate. I will start by commenting on the other amendments in this group before turning to those in my name.

First, on Amendments 28 and 29, I am rather more comfortable with the arrangements for meaningful human intervention set out in the Bill than the noble Lord, Lord Clement-Jones. For me, either a decision has meaningful human intervention or it does not. In the latter case, certain additional rights kick in. To me, that binary model is clear and straightforward, and could only be damaged by introducing some of the more analogue concepts such as “predominantly”, “principally”, “mainly” or “wholly”, so I am perfectly comfortable with that as it is.

However, I recognise that puts a lot of weight on to the precise meaning of “meaningful human involvement”. Amendment 36 in the name of the noble Lord, Lord Clement-Jones, which would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the ICO, seems to take on some value in those circumstances, so I am certainly more supportive of that one.

As for Amendments 34 and 35 in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Freeman, I absolutely recognise the value and potential of efficacy; I agree it is a very valuable term. I have more faith in the rollout and use of the ATRS but on a non-statutory basis, believing, as I do, that this would allow it to continue to develop in an agile and adaptive manner. I welcome the Minister’s words on this subject, and for now I remain comfortable that the ATRS is the direction forward for that.

I turn to the amendments in my name. I thank all noble Lords and, indeed, the Minister for their comments and contributions regarding Amendments 31 and 32. I very much take the Minister’s point that definitions of consent feature elsewhere in the Bill. That reduces my concern somewhat.

However, I continue to strongly commend Amendment 26 to the House. I believe it will foster innovation while protecting data rights. It is popular with the public and with private sector stakeholders. It will bring about outcomes that we all want to see in AI safety without stifling this new and exciting technology. In the absence of an AI Bill—and possibly even in the presence of one—it is the only AI-specific legislation that will be around. It is important somehow to get those AI principles in the Bill, at least until an AI Bill comes along. With this in mind, I wish to test the opinion of the House.

21:34

Division 5

Ayes: 79

Noes: 112

21:44
Amendments 27 to 32 not moved.
Amendment 33
Moved by
33: Leave out Clause 80
Member's explanatory statement
This is a probing amendment intended to elicit assurances from the Minister regarding the forthcoming ICO code of practice about automated decision-making.
Lord Clement-Jones (LD)

My Lords, we have waited with bated breath for the Minister to share his hand, and I very much hope that he will reveal the nature of his bountiful offer of a code of practice on the use of automated decision-making.

I will wear it as a badge of pride to be accused of introducing an analogue concept by the noble Viscount, Lord Camrose. I am still keen to see the word “predominantly” inserted into the Bill in reference to automated decision-making.

As the Minister can see, there is considerable unhappiness with the nature of Clause 80. There is a view that it does not sufficiently protect the citizen in the face of automated decision-making, so I hope that he will be able to elaborate further on the nature of those protections.

I will not steal any of the thunder of the noble Baroness, Lady Kidron. For some unaccountable reason, Amendment 33 is grouped with Amendment 41. The groupings on this Bill have been rather peculiar, and at this time of night I do not think any long speeches are in order, but it is important that we at least have some debate about a code of conduct for the use of AI in education, because it is something that a great many people in the education sector believe is necessary. I beg to move.

Baroness Kidron (CB)

My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen the rollout of AI in almost all products and services that children use. In one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.

Amendment 41 does not seek to solve what is a global issue of an industry arrogantly flying a little too close to the sun, nor does it grapple with how we could take this extraordinary technology and put it to use for humankind on a more equitable basis than the current extractive and winner-takes-all model; it is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is neither new nor confined to these uses. It is all around us and, in particular, it is all around children.

In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10 year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13 year-old girl tips on how to lose her virginity with a 31 year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14 year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that encouraged him to commit suicide.

In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves, take their own lives and so on before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risk and put procedures in place to deal with emerging risks.

One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three- to six-year-olds believed that smart speakers had thoughts, feelings and social abilities and were more reliable than human beings when it came to answering fact-based questions.

I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.

Baroness Harding of Winscombe (Con)

My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said, sadly many times, as wingman to the noble Baroness, Lady Kidron, in child safety debates in this Chamber. The age-appropriate design code that we worked on together, and which she championed a decade ago, has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.

Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should be applying to this foundational technology the same principle that was so hard fought for 10 years ago for non-AI digital services. We know that, however well meaning, technology companies’ development stacks are always contended. They always have more good things that they think they can do to improve their products for their consumers, and that will make them money, than they have the resources to do. However much money they have, their stacks are still contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago as electricity was rolling through the factories of the mill towns in the north of England: it required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.

Viscount Camrose (Con)

My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.

However, although we support the intent behind the amendment, we have concerns, which echo concerns on amendments in a previous group, about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is so important that we do not have these explicit links to international frameworks, important as they are, in UK legislation.

In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.

Lord Vallance of Balham (Lab)

I thank the noble Lord, Lord Clement-Jones, for Amendment 33, and the noble Baroness, Lady Kidron, for Amendment 41, and for their thoughtful comments on AI and automated decision-making throughout this Bill’s passage.

The Government have carefully considered these issues and agree that there is a need for greater guidance. I am pleased to say that we are committing to use our powers under the Data Protection Act to require the ICO to produce a code of practice on AI and solely automated decision-making through secondary legislation. This code will support controllers in complying with their data protection obligations through practical guidance. I reiterate that the Government are committed to this work as an early priority, following the Bill receiving Royal Assent. The secondary legislation will have to be approved by both Houses of Parliament, which means it will be scrutinised by Peers and parliamentarians.

I can also reassure the noble Baroness that the code of practice will include guidance about protecting data subjects, including children. The new ICO duties set out in the Bill will ensure that where children’s interests are relevant to any activity the ICO is carrying out, it should consider the specific protection of children. This includes when preparing codes of practice, such as the one the Government are committing to in this area.

I understand that noble Lords will be keen to discuss the specific contents of the code. The ICO, as the independent data protection regulator, will have views as to the scope of the code and the topics it should cover. We should allow it time to develop those thoughts. The Government are also committed to engaging with noble Lords and other stakeholders after Royal Assent to make sure that we get this right. I hope noble Lords will agree that working closely together to prepare the secondary legislation to request this code is the right approach instead of pre-empting the exact scope.

The noble Lord, Lord Clement-Jones, mentioned edtech. I should add—I am getting into a habit now—that it is discussed in a future group.

Baroness Kidron (CB)

Before the Minister sits down, I welcome his words, which are absolutely what we want to hear. I understand that the ICO is an independent regulator, but it is often the case that the scope and some of Parliament’s concerns are delivered to it from this House—or, indeed, from the other place. I wonder whether we could find an opportunity to make sure that the ICO hears Parliament’s wish on the scope of the children’s code, at least. I am sure the noble Lord, Lord Clement-Jones, will say something similar on his own behalf.

Lord Vallance of Balham (Lab)

It will be clear to the ICO from the amendments that have been tabled and my comments that there is an expectation that it should take into account the discussion we have had on this Bill.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his very considered response. In the same way as the noble Baroness, Lady Kidron, I take it that, effectively, the Minister is pledging to engage directly with us and others about the nature and contents of the code, and that the ICO will also engage on that. As the Minister knows, the definition of terms such as meaningful human involvement is something that we will wish to discuss and consider in the course of that engagement. I hope that the AI edtech code will also be part of that.

I thank the Minister. I know he has had to think about this quite carefully during the Bill’s passage. Currently, Clause 80 is probably the weakest link in the Bill, and this amendment would go some considerable way towards repairing it. My final question is not to the Minister, but to the Opposition: what on earth have they got against the UN? In the meantime, I beg leave to withdraw my amendment.

Amendment 33 withdrawn.
Amendment 34 not moved.
Amendment 36 not moved.
22:00
Amendment 37
Moved by
37: After Clause 84, insert the following new Clause—
“Impact of this Act and other developments at national and international level on EU data adequacy decision
Before the European Union’s reassessment of data adequacy in June 2025, the Secretary of State must carry out an assessment of the likely impact on the European Union data adequacy decisions relating to the United Kingdom of the following—
(a) this Act;
(b) other changes to the United Kingdom’s domestic frameworks which are relevant to the matters listed in Article 45(2) of the UK GDPR (transfers on the basis of an adequacy decision);
(c) relevant changes to the United Kingdom’s international commitments or other obligations arising from legally binding conventions or instruments, as well as from its participation in multilateral or regional systems, in particular in relation to the protection of personal data.”
Member's explanatory statement
This amendment requires the Secretary of State to carry out an assessment of the impact of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy.
Lord Clement-Jones (LD)

My Lords, Amendment 37 is on the subject of data adequacy, which has been a consistent issue throughout the passage of the Bill. The mechanism put forward in the amendment would provide for a review of the question of data adequacy.

Safeguarding data adequacy is crucial for the UK’s economy and international partnerships. Losing data adequacy status would impose significant costs and administrative burdens on businesses and public sector organisations that share data between the UK and the EU. It would also hinder international trade and economic co-operation and undermine trust in the UK’s digital economy, contradicting the Government’s objective of economic growth. I hope very much that the Government are proactively engaging with the European Commission to ensure a smooth adequacy renewal process this year.

Early engagement and reassurance about the retention of adequacy status are of crucial importance, given the looming deadline of June this year. This includes explaining and providing reassurance regarding any planned data protection reforms, particularly concerning the independence of the Information Commissioner’s Office, ministerial powers to add new grounds—for instance, recognised legitimate interest for data processing—and the new provisions relating to automated decision-making.

Despite assurances from the noble Baroness, Lady Jones, that the proposed changes will not dilute data subjects’ rights or threaten EU adequacy, proactive engagement with the EU and robust safeguards are necessary to ensure the continued free flow of data while maintaining high data protection standards. The emphasis on proportionality as a safeguard against the dilution of data subjects’ rights, as echoed by the noble Baroness, Lady Jones, and the ICO, is insufficient. The lack of a clear definition of proportionality within the context of data subjects’ rights could provide loopholes for controllers and undermine the essential equivalence required for data adequacy. The Bill’s reliance on the ICO’s interpretation of proportionality without explicit legislative clarity could be perceived as inadequate by the European Commission, particularly in areas such as web scraping for AI training.

The reassurance that the Government are taking data adequacy seriously and are committing to engaging with the EU needs to be substantiated by concrete actions. The Government do not, it appears, disclose assessments and reports relating to the compatibility of the UK’s domestic data protection framework with the Council of Europe’s Convention 108+, and that raises further concerns about transparency and accountability. Access to this information would enable scrutiny and informed debate, ultimately contributing to building trust and ensuring compatibility with international data protection standards.

In conclusion, while the Government maintain that this Bill would not jeopardise data adequacy, the concerns raised by myself and others during its passage mean that I continue to believe that a comprehensive review of EU data adequacy, as proposed in Amendment 37, is essential to ensure the continued free flow of data, while upholding high data protection standards and maintaining the UK’s position as a trusted partner in international data exchange. I beg to move.

Lord Thomas of Cwmgiedd (CB)

I have added my name to this amendment, about which the noble Lord, Lord Clement-Jones, has spoken so eloquently, because of the importance to our economic growth of maintaining data adequacy with the EU. I have two points to add to what he said.

First, as I said and observed on some occasions in Committee, this is legislation of unbelievable complexity. It is a bad read, except if you want a cure for insomnia. Secondly, it has the technique of amending and re-amending earlier legislation. Thirdly, this is not the time to go into the detail of the legal problems that arise, some of which we canvassed in Committee, as to whether this legislation is free of holes. I do not think I would be doing any favours either to the position of the United Kingdom or to those who have been patient enough to stay and listen to this part of the debate by going into any of those in any detail, particularly those involving the European Convention on Human Rights and the fundamental charter. That is my first point, on the inherent nature of the legislative structure that we have created. As I said earlier, I very much hope we will never have such legislation again.

Secondly, in my experience, there is a tendency among lawyers steeped in an area or department often to feel, “Well, we know it’s all right; we built it. The legislation’s fine”. Therefore, there is an additional and important safeguard that I think we should adopt, which is for a fresh pair of eyes, someone outside the department or outside those who have created the legislation, to look at it again to see whether there are any holes in it. We cannot afford to go into this most important assessment of data adequacy without ensuring that our tackle is in order. I appreciate what the Minister said on the last occasion in Committee—it is for the EU to pick holes in it—but the only prudent course when dealing with anything of this complexity in a legal dispute or potential dispute is to ensure that your own tackle is in order and not to go into a debate about something without being sure of that, allowing the other side to make all the running. We should be on top of this and that is why I very much support this amendment.

Viscount Camrose (Con)

My Lords, I thank the noble Lord, Lord Clement-Jones—as ever—and the noble and learned Lord, Lord Thomas, for tabling Amendment 37 in their names. It would introduce a new clause that would require the Secretary of State to carry out an impact assessment of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy before the European Union’s reassessment of data adequacy in June this year.

I completely understand the concerns behind tabling this amendment. In the very worst-case scenario, of a complete loss of data adequacy in the assessment by the EU, the effect on many businesses and industries in this country would be knocking at the door of catastrophic. It cannot be allowed to happen.

However, introducing a requirement to assess the impact of the Bill on the European Union data adequacy decision requires us to speculate on EU intentions in a public document, which runs the risk of prompting changes on its part or revealing our hand to it in ways that we would rather avoid. It is important that we do two things: understand our risk, without necessarily publishing it publicly; and continue to engage at ministerial and official level, as I know we are doing intensively. I think the approach set out in this amendment runs the risk of being counterproductive.

Lord Vallance of Balham (Lab)

I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance of maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.

The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point, and I would be happy to share the letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on the renewal of our EU adequacy decisions and the action we are taking to support that process.

It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State noted in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of these assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.

That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.

I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been preparing for this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met the European Commission four times since the introduction of the Bill, with further meetings already in the pipeline.

Lord Clement-Jones (LD)

My Lords, the noble and learned Lord, Lord Thomas, whose intervention I very much appreciated, particularly at this time of the evening, talked about a fresh pair of eyes. What kind of reassurance can the Minister give on that?

Lord Vallance of Balham (Lab)

It is worth remembering that the ultimate decision rests with the EU Commission, and we are quite keen to have its eyes on the Bill now, which is why we are engaging with it very carefully. It is looking at the Bill as we go through it: we are talking to it, and we have dedicated teams brought together specifically for this purpose. Several people from outside the direct construction of the Bill are looking at this to make sure that we secure adequacy, and we are having very direct conversations with the EU to ensure that the process proceeds as we would wish.

Lord Clement-Jones (LD)

I thank the Minister for his response. It would be very reassuring if it were our own fresh pair of eyes rather than a pair from across the North Sea; that is all I can say on that point. I appreciate what he said about the Government taking this seriously. It is a continuing concern, as the letter from the chair of the European Affairs Committee to the Government demonstrates. It remains a live issue for those of us observing the passage of the Bill, and we will keep our eyes on it as we go forward. I very much hope that June 2025 passes without incident and that the Minister’s predictions prove correct. In the meantime, I beg leave to withdraw the amendment.

Amendment 37 withdrawn.
Consideration on Report adjourned.
House adjourned at 10.14 pm.