All 9 Lord Clement-Jones contributions to the Data (Use and Access) Bill [HL] 2024-26

Contribution dates:

Tue 19th Nov 2024
Tue 3rd Dec 2024
Tue 10th Dec 2024
Mon 16th Dec 2024
Wed 18th Dec 2024
Tue 21st Jan 2025 (two contributions)
Tue 28th Jan 2025 (two contributions)

The Committee stage was taken in Grand Committee.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
2nd reading
Tuesday 19th November 2024


Lords Chamber
Lord Clement-Jones (LD)

My Lords, I draw attention to my AI interests in the register. I thank the Minister for her upbeat introduction to the Bill and all her engagement to date on its contents. It has been a real pleasure listening to so many expert speeches this afternoon. The noble Lord, Lord Bassam, did not quite use the phrase “practice makes perfect”, because, after all, this is the third shot at a data protection Bill over the past few years, but I was really taken by the vision and breadth of so many speeches today. I think we all agree that this Bill is definitely better than its two predecessors, but of course most noble Lords went on to say “but”, and that is exactly my position.

Throughout, we have been reminded of the growing importance of data in the context of AI adoption, particularly in the private and public sectors. I think many of us regret that “protection” is not included in the Bill title, but that should go hand in hand if not with actual AI regulation then at least with an understanding of where we are heading on AI regulation.

Like others, I welcome that the Bill omits many of the proposals from the unlamented Data Protection and Digital Information Bill, which in our view—I expect to see a vigorous shake of the head from the noble Viscount, Lord Camrose—watered down data subject rights. The noble Lord, Lord Bassam, did us a great favour by setting out the list of many of the items that were missing from that Bill.

I welcome the retention of some elements in this Bill, such as the digital registration of birth and deaths. As the noble Lord, Lord Knight, said, and as Marie Curie has asked, will the Government undertake a review of the Tell Us Once service to ensure that it covers all government departments across the UK and is extended to more service providers?

I also welcome some of the new elements, in particular amendments to the Online Safety Act—essentially unfinished business, as far back as our Joint Committee. It was notable that the noble Lord, Lord Bethell, welcomed the paving provisions regarding independent researchers’ access to social media and search services, but there are questions even around the width of that provision. Will this cover research regarding non-criminal misinformation on internet platforms? What protection will researchers conducting public interest research actually receive?

Then there is something that the noble Baroness, Lady Kidron, Ian Russell and many other campaigners have fought for: access for coroners to the data of young children who have passed away. I think that will be a milestone.

The Bill may need further amendment. On these Benches we may well put forward further changes for added child protection, given the current debate over the definition of category 1 services.

There are some regrettable omissions from the previous Bill, such as those extending the soft opt-in that has always existed for commercial organisations to non-commercial organisations, including charities. As we have heard, there are a considerable number of unwelcome retained provisions.

Many noble Lords referred to “recognised legitimate interests”. The Bill introduces to Article 6 of the GDPR a new ground of recognised legitimate interest, which counts as a lawful basis for processing if it meets any of the descriptions in the new Annex 1 to the GDPR in Schedule 4 of the Bill. The Bill essentially qualifies the public interest test under Article 6(1)(e) of the GDPR and, as the noble Lord, Lord Vaux, pointed out, gives the Secretary of State powers to define additional recognised legitimate interests beyond those in the annex. This was queried by the Constitution Committee, and we shall certainly be kicking the tyres on that during Committee. Crucially, there is no requirement for the controller to make any balancing test, as the noble Viscount, Lord Colville, mentioned, taking the data subject’s interests into account. It just needs to meet the grounds in the annex. These provisions diminish data protection and represent a threat to data adequacy, and should be dropped.

Almost every noble Lord raised the changes to Article 22 and automated decision-making. With the exception of sub-paragraph (d), to be inserted by Clause 80, the provisions are very similar to those of the old Clause 14 of the DPDI Bill in limiting the right not to be subject to automated decision-making processing or profiling to special category data. Where automated decision-making is currently broadly prohibited with specific exceptions, the Bill will permit it in all but a limited set of circumstances. The Secretary of State is given the power to redefine what ADM actually is. Again, the noble Viscount, Lord Colville, was right in how he described what the outcome of that will be. Given the Government’s digital transformation agenda in the public sector and the increasing use of AI in the private sector, this means increasing the risk of biased and discriminatory outcomes in ADM systems.

Systems such as HART, which predicted reoffending risk, PredPol, which was used to allocate policing resources based on postcodes, and the gangs matrix, which harvests intelligence, have all been shown to have had discriminatory effects. It was a pleasure to hear what the noble Lord, Lord Arbuthnot, had to say. Have the Government learned nothing from the Horizon scandal? As he said, we need to move urgently to change the burden of proof for computer evidence. What the noble Earl, Lord Erroll, said, in reminding us of the childlike learning abilities of AI, was extremely important in that respect. We should not place that kind of trust in the evidence given by these models.

ADM safeguards are critical to public trust in AI, and our citizens need greater not less protection. As the Ada Lovelace Institute says, the safeguards around automated decision-making, which exist only in data protection law, are more critical than ever in ensuring that people understand when a significant decision about them is being automated, why that decision has been made, and the routes to challenge it or ask for it to be decided by a human. The noble Viscount, Lord Colville, and the noble Lord, Lord Holmes, set out that prescription, and I entirely agree with them.

This is a crucial element of the Bill but I will not spend too much time on it because, noble Lords will be very pleased to hear, I have a Private Member’s Bill on this subject, providing much-needed additional safeguards for ADM in the public sector, coming up on 13 December. I hope noble Lords will be there and that the Government will see the sense of it in the meantime.

We have heard a great deal about research. Clause 68 widens research access to data. There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or because of very narrow distinctions between the original and new purpose. However, it is quite clear that the definition of scientific research introduced by the Bill is too broad and risks abuse by commercial interests. A number of noble Lords raised that, and I entirely agree with the noble Baroness, Lady Kidron, that the Bill opens the door to data reuse and mass data scraping by any data-driven product development under the auspices of scientific research. Subjects cannot make use of their data rights if they do not even know that their data is being processed.

On overseas transfers, I was very grateful to hear what the noble and learned Lord, Lord Thomas, had to say about data adequacy, and the noble Lords, Lord Bethell, Lord Vaux and Lord Russell, also raised this. All of us are concerned about the future of data adequacy, particularly the tensions that are going to be created with the new Administration in the US if there are very different bases for dealing with data transfer between countries.

We have concerns about the national security provisions. I will not go into those in great detail, but why do the Government believe that these clauses are necessary to safeguard national security?

Many noble Lords raised the question of digital verification services. It was very interesting to hear what the noble Earl, Lord Erroll, had to say, given his long-standing interest in this area. We broadly support the provisions, but the Constitution Committee followed the DPRRC in criticising the lack of parliamentary scrutiny of the framework to be set by the Secretary of State or managed by DSIT. How will they interoperate with the digital identity verification services being offered by DSIT within the Government’s One Login programme?

Will the new regulator be independent, ensure effective governance and accountability, monitor compliance, investigate malicious actors and take enforcement action regarding these services? For high levels of trust in digital ID services, we need high-quality governance. As the noble Lord, Lord Vaux, said, we need to be clear about the status of physical ID alongside that. Why is there still no digital identity offence? I entirely agreed with what the noble Lords, Lord Lucas and Lord Arbuthnot, said about the need for factual clarity underlying the documents that will be part of the wallet—so to speak—in terms of digital ID services. It is vital that we distinguish and make sure that both sex and gender are recorded in our key documents.

There are other areas about which we on these Benches have concerns, although I have no time to go through them in great detail. We support the provisions on open banking, which we want to see used and the opportunities properly exploited. However, as the noble Lord, Lord Holmes, said, we need a proper narrative that sells the virtues of open banking. We are concerned that the current design allows landlords to be given access to monitoring the bank accounts of tenants for as long as an open banking approval lasts. Smart data legislation should mandate that the maximum and default access duration be no longer than 24 hours.

A formidable number of noble Lords spoke about web trawling by AI developers to train their models. It is vital that copyright owners have meaningful control over their content, and that there is a duty of transparency and penalties for scraping news publisher and other copyrighted content.

The noble and learned Lord, Lord Thomas, very helpfully spoke about the Government’s ECHR memorandum. I do not need to repeat what he said, but clearly, this could lead to a significant gap, given that the Retained EU Law (Revocation and Reform) Act 2023 has not been altered and is not altered by this Bill.

There are many other aspects to this. The claims for this Bill and these provisions are as extravagant as for the old one; I think the noble Baroness mentioned the figure of £10 billion at the outset. We are in favour of growth and innovation, but how will this Bill also ensure that fundamental rights for the citizen will be enhanced in an increasingly AI-driven world?

We need to build public trust, as the noble Lord, Lord Holmes, and the noble Baroness, Lady Kidron, said, in data sharing and access. To achieve the ambitions of the Sudlow review, there are lessons that need to be learned by the Department of Health and the NHS. We need to deal with edtech, as has been described by a number of noble Lords. All in all, the Government are still not diverging enough from the approach of their predecessor in their enthusiasm for the sharing and use of data across the public and private sectors without the necessary safeguards. We still have major reservations, which I hope the Government will respond to. I look forward—I think—to Grand Committee.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
Lord Markham (Con)

Just to follow on from that, I very much support my noble friend’s words. The only reason I can see why you would introduce new definitions is that there are new responsibilities that are different, and you would want people to be aware of the new rules that have been placed on them. I will be interested to hear the Minister’s answer. If that is the case, we can set that out and understand whether the differences are so big that you need a whole new category, as my noble friend said.

Having run lots of small businesses myself, I am aware that, with every new definition that you add, you add a whole new set of rules and complications. As a business owner, how am I going to find out what applies to me and how I am to be responsible? The terms trader, controller, data holder and processor all sound fairly similar, so how will I understand what applies to me and what does not? To the other point that my noble friend made, the more confusing it gets, the less likelihood there is that people will understand the process.

Lord Clement-Jones (LD)

My Lords, I am not sure whether I should open by saying that it is a pleasure to take part in the passage of the third iteration of this Bill, but, as I said at Second Reading, this is an improvement. Nevertheless, there are aspects of the Bill that need close scrutiny.

The noble Viscount, Lord Camrose, explained his approach to this Bill. Our approach is that we very much support the use of data for public benefit but, at the same time, we want to make sure that this Bill does not water down individual data rights and that they are, where necessary, strengthened. In that spirit, I wish to ask the Minister about the general nature of Clause 1, rather than following up on the amendments tabled by the noble Viscount.

The definition of “business data” seems quite general. A report that came out yesterday, Data On Our Minds: Affective Computing At Work, highlighted the kinds of data that are now being collected in the workplace. It is a piece of work sponsored by the Joseph Rowntree Charitable Trust, the Trust for London and the Institute for the Future of Work. They are concerned about the definition of “business data”. The Minister probably will not have an answer on this matter at this stage, but it would be useful if she could write in due course to say whether the definition of “business data” excludes emotional data and neurosurveillance data collected from employees.

This is very much a workplace question rather than a question about the customer; I could ask the same question about the customer, I suppose, except the report is about workplace data collection. I thought I would opportunistically take advantage of the rather heavy de-grouping that has taken place and ask the Minister a question.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

First, let me say what a pleasure it is to be back on this old ground again, although with slightly different functions this time round. I very much support what the noble Viscount, Lord Camrose, said. We want to get the wording of this Bill right and to have a robust Bill; that is absolutely in our interests. We are on the same territory here. I thank the noble Viscount and other noble Lords for expressing their interest.

On Amendments 1 and 2, the Government consider the terms used in Part 1, as outlined in Clause 1, necessary to frame the persons and the data to which a scheme will apply. The noble Lord, Lord Clement-Jones, mentioned the powers. I assure him that the powers in Part 1 sit on top of the Data Protection Act. They are not there instead of it; they are another layer on top of it, and they provide additional rights over and above what already exists.

In relation to the specific questions from the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, smart data schemes require suppliers or providers of goods, services or digital content to provide data. They are referred to as “traders” in accordance with recent consumer legislation, including the Consumer Rights Act 2015. The term “data holder” ensures that the requirements may also be imposed on any third party that might hold the data on the trader’s behalf. That is why these additional terminologies have been included: it is based on existing good legislation. I hope noble Lords will recognise why this is necessary and that this explains the rationale for these terms. These terms are independent of terms in data protection legislation; they have a different scope and that is why separate terms are necessary. I hope that, on that basis, the noble Viscount will withdraw his amendment.

--- Later in debate ---
The Earl of Erroll (CB)

My Lords, I would like to say a few things about this. The first is that Amendment 5, in the name of the noble Lord, Lord Lucas, is very sensible; sometimes the GDPR has gone too far in trying to block what you can use things for. It was originally thought of when so much spamming was going on, with people gathering data from adverts and all sorts of other things and then misusing it for other purposes. People got fed up with the level of spam. This is not about that sort of thing; it is about having useful data that would help people in the future, and which they would not mind being used for other purposes. As long as it is done properly and seriously, and not for marketing, advertising and all those other things, and for something which is useful to people, I cannot see what the problem is. An overzealous use of GDPR, which has happened from time to time, has made it very difficult to use something perfectly sensible, which people would not mind having other people know about when it is being useful.

The next matter is sex, which is an interesting issue. The noble Lord is absolutely correct that biological or genetic sex is vital when applying medicines and various other things. You have to know that you are administering certain drugs properly. As we get more and more new drugs coming on, it will matter how a person’s body will react to them, which will depend on the genetic material, effectively. Therefore, it is essential to know what the biological sex is. The answer is that we need another category—probably “current gender”—alongside “sex at birth”. Someone can then decide to use “current gender” for certain purposes, including for such things as passports and driving licences, where people do not want to be asked questions—“Oh, do you mean you’re not?”—because they look completely different.

I remember meeting April Ashley in her restaurant. I would not, in my innocence—I was quite young—have guessed that she was not a woman, except that someone said that her hands were very big. It never worried us in those days. I am not worried about people using a different gender, but the basic underlying truth is essential. It comes into the issue of sport. If you have grown up and developed physically as a biological male, your bone structure and strength are likely to be different from that of a female. There are huge issues with that, and we need to know both; people can decide which to use at certain points. Having both would give you the flexibility to do that.

That also applies to Amendment 200, from the noble Lord, Lord Lucas, which is exactly the same concept. I thoroughly agree with those amendments and think we should push them forward.

Lord Clement-Jones (LD)

My Lords, I too am delighted that the noble Lord, Lord Lucas, came in to move his amendment. He is the expert in that whole area of education data; like the noble Lord, Lord Arbuthnot, I found what he said extremely persuasive.

I need to declare an interest as chair of the council of Queen Mary, University of London, in the context of Amendment 5 in the name of the noble Lord, Lord Lucas. I must say, if use were made of that data, it would benefit not only students but universities. I am sure that the Minister will take that seriously but, on the face of it, like the noble Earl, Lord Erroll, I cannot see any reason why this amendment should not be adopted.

I very much support Amendments 34 and 48 in the name of the noble Lord, Lord Arbuthnot. I too have read the briefing from Sex Matters. The noble Lord’s pursuit of accuracy for the records that will be part of the wallet, if you like, to be created for these digital verification services is a matter of considerable importance. In reading the Sex Matters briefing, I was quite surprised. I had not realised that it is possible to change your stated sex on your passport in the way that has taken place. The noble Lord referred to the more than 3,000 cases of this; for driving licences, there have been more than 15,000.

I agree with Sex Matters when it says that this could lead to a loss of trust in the system. However, I also agree with the noble Earl, Lord Erroll, that this is not an either/or. It could be both. It is perfectly feasible to have both on your passport, if you so choose. I do not see this as a great divide as long as the statement about sex is accurate because, for a great many reasons—not least in healthcare—it is of considerable importance that the statement about one’s sex is accurate.

I looked back at what the Minister said at Second Reading. I admit that I did not find it too clear but I hope that, even if she cannot accept these amendments, she will be able to give an assurance that, under this scheme—after all, it is pretty skeletal; we will come on to some amendments that try to flesh it out somewhat—the information on which it will be based is accurate. That must be a fundamental underlying principle. We should thank the noble Lord, Lord Arbuthnot, for tabling these two important amendments in that respect.

Lord Markham (Con)

My Lords, I want to come in on Amendment 5. Although I am very much in favour of the intent of what we are trying to do—making more use of the sharing of data—I have to remember my old Health Minister’s hat in talking about all the different terms and speaking to the different angles that we are all coming from.

Noble Lords have heard me speak many a time about the value of our health data and the tremendous possibilities that it offers for drug discovery and all the associated benefits. At the same time, I was very aware of loads of companies purporting to own it. There are GP data companies, which do the systems for GPs and, naturally, hold all the patient data in them. In terms of their business plans, some have been bought for vast sums of money because of the data that they hold. My concern is that, although it is well intended to say that the use of health data should be allowed for the general good, at the same time, I do not believe that GP companies own that data. We have been quite clear on that. I want to make it clear that it is actually the NHS that will benefit from the pulling together of all this, if that happens in those sorts of formats.

Similarly on student loans data—I shall not pretend that this is a subject I know a lot about—I can see a lot of good causes for the student loans, but I can also see that it would be very useful for financial services companies to understand customers’ creditworthiness. In all these cases, although the intent is right, we need to find a way to be clear about what they can and cannot use it for, and there lies a lot of complexity.

--- Later in debate ---
Lord Arbuthnot of Edrom (Con)

My Lords, Amendment 7, the first in this group, is a probing amendment, and I am extremely grateful to ISACA, an international professional association focused on IT governance, for drafting it. This amendment

“would give the Secretary of State or the Treasury scope to introduce requirements on third party recipients of customer data to publish regular statements on their cyber resilience against specified standards and outcomes”.

Third parties play a vital role in the modern digital ecosystem, providing businesses with advanced technology, specialised expertise and a wide range of services, but integrating third parties into business operations comes with cyber risks. Their access to critical networks and all the rest of it can create vulnerabilities that cybercriminals exploit. Third parties are often seen as easier targets, with weaker security measures or indirect connections serving as gateways to larger organisations.

Further consideration is to be given to the most effective means of driving the required improvements in cyber risk management, including, in my suggestion, making certain guidance statutory. This is not about regulating and imposing additional cost burdens, but rather creating the environment for digital trust and growth in the UK economy, as well as creating the right conditions for the sustainable use of emerging technologies that will benefit us all. This is something that leading associations and groups such as ISACA have been arguing for.

The Cyber Governance Code of Practice, which the previous Administration introduced, marks an important step towards improving how organisations approach cybersecurity. Its primary goal is to ensure that boards of directors should take their proper responsibility in mitigating cyber risks.

While that code is a positive development, compliance is not legally required, which leaves organisations to decide whether to put their priorities elsewhere. As a result, the code’s effectiveness in driving widespread improvements in cyber resilience will largely depend on their organisation’s willingness to recognise its importance. The amendment would require businesses regularly to review and update their cybersecurity strategies and controls, and to stay responsive to evolving threats and technologies, thereby fostering a culture of continuous improvement. In addition, by mandating ongoing assessments of internal controls and risk-management processes, organisations will be better able to anticipate emerging threats and enhance their ability to detect, prevent and respond to cyber incidents. I beg to move.

Lord Clement-Jones (LD)

My Lords, this is a fairly disparate group of amendments. I am speaking to Amendments 8, 9, 10, 24, 30, 31 and 32. In the first instance, Amendments 8, 9, 10 and 30 relate to the question that I asked at Second Reading: where is the ambition to use the Bill to encourage data sharing to support net zero?

The clean heat market mechanism, designed to create a market incentive to grow the number of heat pumps installed in existing premises each year, is set to be introduced after being delayed a year due to backlash from the boiler industry. If government departments and partners had access to sales data of heating appliances, there would be a more transparent and open process for setting effective and realistic targets.

I have been briefed by Ambient, a not-for-profit organisation in this field. It says that low visibility of high power-consuming assets makes it challenging to maintain grid stability in a clean-power world. Low visibility and influence over future installations of high power-consuming assets make it difficult to plan for grid updates. Inability to shift peak electricity demand leads to higher capacity requirements with associated time and cost implications. Giving the Government and associated bodies access to utility-flexible tariff data would enable the Government and utilities to work together to increase availability and uptake of tariffs, leading to lower peak electricity demand requirements.

Knowing which homes have the oldest and least efficient boilers, and giving public sector and partners access to the Gas Safe Register and CORGI data on boiler age at household level, would mean that they could identify and target households and regions, ensuring that available funds go to those most in need. Lack of clarity on future clean heating demand makes it challenging for the industry to scale and create jobs, and to assess workforce needs for growing electricity demand. Better demand forecasting through access to sales data on low-carbon heating appliances would signal when and where electrification was creating need for workforce expansion in grid management and upgrade, as well as identify regional demand for installers and technicians.

Part 1 of the Bill contains powers for the Secretary of State to require the sharing of business data with customers and other persons of a specified description. It does not indicate, however, that persons of a specified description could include actors such as government departments, public bodies such as NISO and GB Energy, and Ministers. An expanded list of suggested recipients could overcome this issue, as stated in Amendment 9 in my name. The Bill also makes no provision for the format of information sharing—hence my Amendments 8 and 10.

In summary, my questions to the Minister are therefore on: whether it has been considered how the primary legislation outlined in the Bill could be exercised to accelerate progress towards clean power by 2030; whether climate missions such as clean power by 2030 or achieving net zero are purposes “of a public nature” in relation to the outline provisions for public bodies; and whether specifying the format of shared business data would enable more efficient and collaborative use of data for research and planning purposes.

Coming on to Amendments 24, 31 and 32, the Bill expands the potential use of smart data to additional public and private sector entities, but it lacks safeguards for sensitive information regularly used in court. It makes specific provision for legal privilege earlier in the Bill, but this is not extended in provisions relating to smart data. I very much hope that the Government will commit to consult with legal professions before extending smart data to courts.

Many of us support open banking, but open banking is being used, as designed, by landlords to keep watching tenant bank accounts for months after approving their tenancy. Open banking was set up to enhance interoperability between finance providers, with the most obvious example being the recent new ability of the iPhone wallet app to display balances and recent transactions from various bank accounts.

Open banking approval normally lasts six months. While individual landlords may not choose this access, if given a free choice, the service industry providing the tenant-checking service to landlords is strongly incentivised to maximise such access, otherwise their competitors have a selling point. If open banking is to be added to the statute book, the Bill should mandate that the default time be reduced to no more than 24 hours in the first instance, and reconfirmed much more often. For most one-off approval processes, these access times may be as short as minutes and the regulations should account for that.

Coming on to Amendment 31, consumers have mixed feelings about the potential benefits to them of smart data schemes, as shown in polling such as that carried out a couple of years ago by Deltapoll with the CDEI, now the Responsible Technology Adoption Unit, as regards the perceived potential risks versus the benefits. Approximately one-quarter of respondents in each case were unsure about this trade-off. Perhaps unsurprisingly, individuals who said that they trusted banks and financial institutions or telecommunications providers were more likely to support open finance and open communications, and customers who had previous experience of switching services more frequently reported believing that the benefits of smart data outweighed the risks.

Is it therefore the Government’s expectation that people should be compelled to use these services? Open banking and imitators can do a great deal of good but can also give easy access to highly sensitive data for long periods. The new clause introduced by Amendment 31 would make it the same criminal offence to compel unnecessary access under these new provisions as it already is to compel data provision via subject access requests under the existing Data Protection Act.

Amendment 32 is a probing amendment as to the Government’s intentions regarding these new smart data provisions. In the Minister’s letter of 27 November, she said:

“The Government is working closely to identify areas where smart data schemes might be able to bring benefits. We want to build on the lessons learned from open banking and establish smart data schemes in other markets for goods and services.”


I very much hope that the Minister will be able to give us a little taste of what she thinks these powers are going to be used for, and in what sectors the Government believe that business can take advantage of these provisions.

Baroness Neville-Jones (Con)

My Lords, I support Amendment 7, introduced by my noble friend Lord Arbuthnot, for the reasons that he gave. The amendment was designed to increase the reliability of the handling of information inside any system. If, as I would certainly support, we want to see information and data in digital form circulated more readily, more freely and more often, it is very important that people should trust the system within which that happens. That is where the need to assure the cybersecurity of the system becomes very important; it is a natural companion to this Bill.

--- Later in debate ---
I hope that, by going through the detail of the large number of amendments, I have provided reassurance to noble Lords, as well as explaining why we feel that the inclusion of Clause 13 is necessary. I therefore hope that noble Lords will not press their amendments.
Lord Clement-Jones (LD)

Does the Minister have any thoughts about where smart data schemes might be introduced? I am sure that they are being introduced for a purpose. Is there a plan to issue a policy document or is it purely about consulting different sectors? Perhaps the Minister can give us a glimpse of the future.

Baroness Jones of Whitchurch (Lab)

The noble Lord is tempting me. What I would say is that, once this legislation is passed, it will encourage departments to look in detail at where they think smart data schemes can be applied and provide a useful service for customers and businesses alike. I know that one issue that has been talked about is providing citizens with greater information about their energy supplies—the way that is being used and whether they can use their energy differently or find a different supplier—but that is only one example, and I do not want people to get fixated on it.

The potential is enormous; I feel that we need to encourage people to think creatively about how some of these provisions can be used when the Bill is finally agreed. There is a lot of cross-government thinking at the moment and a lot of considering how we can empower citizens more. I could say a lot off the top of my head but putting it on the record in Hansard would probably be a mistake, so I will not be tempted any more by the noble Lord. I am sure that he can write to me with some suggestions, if he has any.

--- Later in debate ---
Moved by
33: Clause 28, page 30, line 28, at end insert—
“(2A) Those rules must include processes for ongoing monitoring of compliance, including but not limited to processes and procedures for monitoring and investigating compliance.
(2B) The rules must contain mechanisms for redress for harms caused by compliance failures.
(2C) The Secretary of State must establish an independent process for hearing appeals against the findings of compliance investigations.”
Member’s explanatory statement
This amendment specifies additional rules for the trust framework.
Lord Clement-Jones (LD)

My Lords, I almost have a full house in this group, apart from Amendment 35, so I will not read out the numbers of all the amendments in this group. I should just say that I very much support what the noble Viscount, Lord Colville, has put forward in his Amendment 35.

Many noble Lords will have read the ninth report of the Delegated Powers and Regulatory Reform Committee. I am sad to say that it holds exactly the same view about this Bill as it did about the previous Bill’s provisions regarding digital verification services. It said that

“we remain of the view that the power conferred by clause 28 should be subject to parliamentary scrutiny, with the affirmative procedure providing the appropriate level of scrutiny”.

It is against that backdrop that I put forward a number of these amendments. I am concerned that, although the Secretary of State is made responsible for this framework, in reality, they cannot be accountable for delivering effective governance in any meaningful way. I have tried, through these amendments, to introduce at least some form of appropriate governance.

Of course, these digital verification provisions are long-awaited—the Age Verification Providers Association is pleased to see them introduced—but we need much greater clarity. How is the Home Office compliant with Part 2 of the Bill as it is currently written? How will these digital verification services be managed by DSIT? How will they interoperate with the digital identity verification services being offered by DSIT in the UK Government’s One Login programme?

Governance, accountability and effective, independent regulation are also missing. There is no mechanism for monitoring compliance, investigating malicious actors or taking enforcement action regarding these services. The Bill has no mechanism for ongoing monitoring or the investigation of compliance failures. The Government propose to rely on periodic certification being sufficient but I understand that, when pressed, DSIT officials say that they are talking to certification bodies and regulators about how they can do so. This is not really sufficient. I very much share the intention of both this Government and the previous one to create a market in digital verification services, but the many good players in this marketplace believe that high levels of trust in the sector depend on a high level of assurance and focus from the governance point of view. That is missing in this part of the Bill.

Amendment 33 recognises the fact that the Bill has no mechanism for ongoing monitoring or the investigation of compliance failures. As we have seen from the Grenfell public inquiry, a failure of governance caused by not proactively monitoring, checking and challenging compliance has real, harmful consequences. Digital verification services rely on the trustworthiness of the governance model; what is proposed is not trustworthy but creates material risk for UK citizens and parties who rely on the system.

There are perfectly decent examples of regulatory frameworks. PhonepayPlus provides one such example, with a panel of three experts supported by a secretariat; the panel can meet once a quarter to give its opinion. That has been dismissed as being too expensive, but I do not believe that any costings have been produced or that it has been considered how such a cost would weigh against the consequences of a failure in governance of the kind identified in recent public inquiries.

Again, as regards Amendment 36, there is no mechanism in the Bill whereby accountability is clearly established in a meaningful way. Accountability is critical if relying parties and end-users are to have confidence that their interests are safeguarded.

Amendment 38 is linked to Amendment 36. The review under Clause 31 must be meaningful in improving accountability and effective governance. The amendment proposes that the review must include performance, specifically against the five-year strategy and of the compliance, monitoring and investigating mechanisms. We would also like to see the Secretary of State held accountable by the Science and Technology Select Committee for the performance captured in the review.

On Amendment 41, the Bill is silent on how the Secretary of State will determine that there is a compliance failure. It is critical to have some independence and professional rigour included here; the independent appeals process is really crucial.

As regards Amendments 42 and 43, recent public inquiries serve to illustrate the importance of effective governance. Good practice for effective governance would require the involvement of an independent body in the determination of compliance decisions. There does not appear to be an investigatory resource or expertise within DSIT, and the Bill currently fails to include requirements for investigatory processes or appeals. In effect, there is no check on the authority of the Secretary of State in that context, as well as no requirement for the Secretary of State proactively to monitor and challenge stakeholders on compliance.

As regards Amendment 44, there needs to be a process or procedure for that; fairness requires that there should be a due process of investigation, a review of evidence and a right of appeal to an independent body.

I turn to Amendment 45 on effective governance. A decision by the appeals body that a compliance failure is so severe that removal from the register is a proportionate measure must be binding on the Secretary of State; otherwise, there is a risk of investment in compliance and service improvement being relegated below investment in lobbying. Malicious actors view weaknesses in enforcement as a green light and so adopt behaviours that both put at risk the safety and security of UK citizens and undermine the potential of trustworthy digital verification to drive economic growth.

Amendment 39 would exclude powers in this part being used by government as part of GOV.UK’s One Login.

I come on to something rather different in Amendment 46, which is very much supported by Big Brother Watch, the Digital Poverty Alliance and Age UK. Its theme was raised at Second Reading. A significant proportion of the UK’s population lacks internet access, with this issue disproportionately affecting older adults, children and those from low-income backgrounds. This form of digital exclusion presents challenges in an increasingly digital world, particularly concerning identity verification.

Although digital identity verification can be beneficial, it poses difficulty for individuals who cannot or choose not to engage digitally. Mandating online identity verification can create barriers for digitally excluded groups. For example, the National Audit Office found that only 20% of universal credit applicants could verify their identity online, highlighting concerns for those with limited digital skills. The Lords Communications and Digital Select Committee emphasised the need for accessible, offline alternatives to ensure inclusivity in a connected world. The proponents of this amendment advocate the availability of offline options for essential public and private services, particularly those requiring identity verification. This is crucial as forcing digital engagement can negatively impact the well-being and societal participation of older people.

This is the first time that I have prayed in aid what the Minister said during the passage of the Data Protection and Digital Information Bill; this could be the first of a few such occasions. When we debated the DPDI Bill, she stressed the importance of a legal right to choose between digital and non-digital identity verification methods. I entirely agreed with her at the time. She said that this right is vital for individual liberty, equality and building trust in digital identity systems and that, ultimately, such systems should empower individuals with choices rather than enforce digital compliance. That is a fair summary of what she said at the time.

I turn to Amendment 50. In the context of Clause 45 and the power of public authorities to disclose information, some of which may be the most sensitive information, it is important for the Secretary of State to be able to require the public authority to provide information on what data is being disclosed and where the data is going, as well as why the data is going there. This amendment will ensure that data is being disclosed for the right reasons, to the right places and in the right proportion. I beg to move.

Viscount Colville of Culross (CB)

My Lords, I tabled Amendment 35 because I want to make the DVS trust framework as useful as possible. I support Amendment 33 in the name of the noble Lord, Lord Clement-Jones, and Amendment 37 in the name of the noble Viscount, Lord Camrose.

The framework’s mandate is to define a set of rules and standards designed to establish trust in digital identity products in the UK. It is what I would hope for as a provision in this Bill. As the Minister told us at Second Reading, the establishment of digital ID services with a trust mark will increase faith in the digital market and reduce physical checks—not to mention reducing the time spent on a range of activities, from hiring new workers to moving house. I and many other noble Lords surely welcome the consequent reduction in red tape, which so often impedes the effectiveness of our public services.

Clause 28(3) asks the Secretary of State to consult the Information Commissioner and such persons as they consider appropriate. However, in order to ensure that these digital ID services are used and recognised as widely as possible—and, more importantly, that they can be used by organisations beyond our borders— I suggest Amendment 35, which would include putting consultation with an international digital standards body in the Bill. This amendment is supported by the Open Data Institute.

I am sure that the Minister will tell me that that amendment is unnecessary as we can leave it to the common sense of Ministers and civil servants in DSIT to consult such a body but, in my view, it is helpful to remind them that Parliament thinks the consultation of an international standards body is important. The international acceptance of DVS is crucial to its success. Just like an email, somebody’s digital identity should not be tied to a company or a sector. Imagine how frustrating it would be if we could only get Gmail in the UK and Outlook in the EU. Imagine if, in a world of national borders and jurisdictions, you could not send emails between the UK and the EU as a result. Although the DVS will work brilliantly to break down digital identity barriers in the UK, there is a risk that no international standards body might be consulted in the development of the DVS scheme. This amendment would be a reminder to the Secretary of State that there must be collaboration between this country, the EU and other nations, such as Commonwealth countries, that are in the process of developing similar schemes.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

I will, of course, write to the noble Baroness.

Lord Clement-Jones (LD)

Was the Minister saying that in view of the current duties of the ICO, Amendment 50 is not needed because public authorities will have the duty to inform the ICO of the information that they have been passing across to these identity services?

Baroness Jones of Whitchurch (Lab)

Again, I will have to write to the noble Lord on that. I think we were saying that it is outside the current obligations of the ICO, but we will clarify the responsibility.

Lord Clement-Jones (LD)

My Lords, I am not quite sure whether to be reassured or not because this is terra incognita. I am really struggling, given the Minister’s response. This is kind of saying, “Hands off, Parliament, we want the lightest touch on all of this, and the Secretary of State will decide”.

I should first thank the noble Baroness, Lady Kidron, for her support. I thought that the noble Viscount, Lord Colville, made an extremely good case for Amendment 35 because all of us want to make sure that we have that interoperability. One of the few areas where I was reassured by the Minister was on the consultations taking place.

I am sure that the noble Viscount, Lord Camrose, was right to ask what the consultations are. We need to be swimming in the right pool for our digital services to be interoperable. It is not as if we do not have contact with quite a number of these digital service providers. Some of them are extremely good and want a level of mandation for these international standards. There is a worrying lack of detail here. We are caught between the devil and the deep blue sea. On the one hand, we have these rules on GOV.UK, which are far too complicated for mere parliamentarians to comprehend. They are so detailed that we are going to get bogged down.

On the other hand, we do not know what the Secretary of State is doing. This is the detailed trust framework, but what is the governance around it? At the beginning of her speech, the Minister said that governance is different from certification and the conformity assessment service. I would have thought that governance was all part of the same warp and weft. I do not really understand. The Secretary of State has the power to refuse accreditation, so we do not need an independent appeals body. It would be much more straightforward if we knew that there was a regulator and that it was going to be transparent in terms of how the system worked. I just feel that this is all rather half baked at the moment. We need a lot more information than we are getting. To that extent, that is the case for all the amendments in this group.

The crucial amendment is Amendment 37 tabled by the noble Viscount, Lord Camrose, because we absolutely need to bring all this into the light of day by parliamentary approval, whether or not it is a complicated document. Perhaps we could put it through an AI model and simplify it somewhat before we debate it. We have to get to grips with this. I have a feeling that we are going to want to return to this aspect on Report because no good reason has been given, not to the DPRRC either, about why we are not debating this in Parliament in terms of the scheme itself. It is a bit sad to have to say this because we all support the digital verification progress, if you like. Yet, we are all in a bit of a fog about how it is all going to work.

I very much hope that the Minister can come back to us, perhaps with a must-write letter that sets it all out to a much more satisfactory extent. I hope she understands why we have had this fairly protracted debate on this group of amendments because this is an important aspect that the Bill is skeletal about. I beg leave to withdraw the amendment.

Amendment 33 withdrawn.
--- Later in debate ---
Moved by
51: Clause 50, page 46, line 19, at end insert—
“(3A) A person who acts in contravention of subsection (3) commits an offence.
(3B) A person who commits an offence under subsection (3A) is liable—
(a) on summary conviction to a fine; or
(b) on conviction on indictment to a term of imprisonment not exceeding 2 years or to a fine or both.”
Member’s explanatory statement
This amendment makes it an offence for someone to use a trust mark when they have no permission to do so, aimed at weeding out fraud.
Lord Clement-Jones (LD)

My Lords, in moving Amendment 51, I will also speak to Amendments 52, 53, 54 and 209 in my name, which seek to create new criminal offences under the Bill. The first is the offence of using a trust mark without permission; the second is providing false information to the Secretary of State in response to an information notice; and the third is using a false digital identity document, which is really an alternative to an offence of digital identity theft.

Clause 50 currently contains no real consequence for a person using a trust mark without permission. A trust mark, which has no specific definition in the Bill, may be used only by those who are in the DVS register. Clause 50(3) says:

“A mark designated under this section may not be used by a person in the course of providing, or offering to provide, digital verification services unless the person is registered in the DVS register in respect of those digital verification services”.


Clause 50(4) then says:

“The Secretary of State may enforce subsection (3)”


by civil injunction or interdict. This has no real teeth in circumstances where there are persistent and flagrant offenders, regardless of whether it is on a personal or commercial scale.

Amendment 51 would give appropriate penalties, with a fine on summary conviction and two years’ imprisonment, or a fine on indictment. Amendment 52 would make provision so that a prosecution may not be brought unless by or with the consent of the appropriate chief prosecutor. Amendment 54 relates to providing false information to the Secretary of State. That is advanced on a similar basis, containing a power for the Secretary of State to require information. Of course, many regulators have this power.

On the issue of false digital identities—identity theft—Amendment 53 is a refinement of Amendment 289, which I tabled to the late, unlamented DPDI Bill in Committee. That amendment was entitled “Digital identity theft”. I have also retabled the original amendment, but in many ways Amendment 53 is preferable because it is much more closely aligned with the Identity Documents Act, which contains several offences relating to the use of a person’s identity document. Currently, an identity document includes an immigration document—a passport or similar document—or a driving licence.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I thank the Minister. I was quite amused in listening to the noble Viscount, Lord Camrose. I thought about the halcyon days of listening to the language that he used when he was a Minister, with words like “premature”, “unintended consequences”, “disproportionate” and “ambiguity”. I thought, “Come back, Viscount Camrose”—but I appreciate that he took the trouble at least to analyse, from his point of view, where he saw the problems with some of the amendments.

I go back to the physical identity verification aspect. I should have said that I very much hope that the Minister and I can discuss how the Equality Act 2010 has an impact. I am not entirely sure about the protected characteristics playing into this because, obviously, the Equality Act only references those. I think that there could be a greater cohort of people who may be disadvantaged by commercial operators insisting on digital verification, as opposed to physical verification, for instance; I may need to discuss that with the Minister.

I am grateful to the Minister for having gone through where she thinks that there are safeguards and sanctions against using trust identity falsely; that was a very helpful run-through so I shall not go back to what she said. The really important area is this whole offline/online criminal aspect. I understand that it may not be perfect because the scheme is not in place—it may not need to be on all fours exactly with the 2010 Act—but I think that the Minister’s brief was incorrect in this respect. If the Bill team look back at the report from the committee that the noble Baroness, Lady Morgan, chaired back in 2022, Fighting Fraud: Breaking the Chain, they will see that it clearly said:

“Identity theft is often a predicate action to the criminal offence of fraud, as well as other offences including organised crime and terrorism, but it is not a criminal offence”.


That is pretty clear. The committee went into this in considerable detail and said:

“The Government should consult on the introduction of legislation to create a specific criminal offence of identity theft. Alternatively, the Sentencing Council should consider including identity theft as a serious aggravating factor in cases of fraud”.


First, I am going to set the noble Baroness, Lady Morgan, on the noble Viscount, Lord Camrose, to persuade him of the wisdom of creating a new offence. I urge the Minister to think about the consequences of not having any criminal sanction for the misuse of digital identity and identity theft. Whatever you might call it, there must be some way to protect people in these circumstances, if we are going to have public trust in the digital verification framework that we are setting up under this Bill. This will be rolled out—if only I had read GOV.UK, I would be far wiser.

It was very interesting to hear the Minister start to unpack quite a lot of detail. We heard about the new regulator, the Office for Digital Identities and Attributes. That was the first reference to the new regulator, but what are its powers going to be? We need a parliamentary debate on this, clearly. Is this an office delegated by the Secretary of State? Presumably, it is non-statutory, in a sense, and will have powers that are at the will of the Secretary of State. It will be within DSIT, I assume—and so on.

I am afraid that we are going round in circles here. We need to know a great deal more. I hope that we get much more protection for those who have the benefit of the service; otherwise, we will find ourselves in a situation that we are often in as regards the digital world, whereby there is a lack of trust and the public react against what they perceive as somebody taking something away from them. In the health service, for example, 3 million people have opted out from sharing their GP personal health data. I am only saying that we need to be careful in this area and to make sure that we have all the right protections in place. In the meantime, I beg leave to withdraw my amendment.

Amendment 51 withdrawn.
--- Later in debate ---
Moved by
56: After Clause 60, insert the following new Clause—
“Private sector consultation regarding NUAR
The Secretary of State must consult with relevant private sector organisations before implementing the provisions regarding the National Underground Asset Register.”
Member’s explanatory statement
This is a probing amendment to determine the level of Government consultation with the private sector regarding NUAR.
Lord Clement-Jones (LD)

My Lords, successive Governments have demonstrated their enthusiasm for NUAR. It was quite interesting to hear the Minister’s enthusiasm for the digitisation of the map of the underground, so to speak; she was no less enthusiastic than her predecessor. However, as the Minister knows, there are tensions between the new, bright, shiny NUAR and LSBUD, or LinesearchbeforeUdig, which is in some respects the incumbent.

--- Later in debate ---
Viscount Camrose (Con)

I thank the noble Lord, Lord Clement-Jones, for these amendments. Amendment 56 is about NUAR and the requirement to consult first. I am not convinced that is necessary because there is already a requirement to consult under Clause 60 and, perhaps more pertinently, NUAR is an industry-led initiative. It came out of an industry meeting and has been led by industry throughout. I am therefore not sure, even in spite of the requirement to consult, that much is going to come out of that consultation exercise.

In respect of other providers out there, LSBUD among them, when we were going through this exact debate in DPDI days, the offer I made—and I ask the Minister if she would consider doing the same—was to arrange a demonstration of NUAR to anyone who had not seen it. I have absolutely unshakeable confidence that anybody who sees NUAR in action will not want anything else. I am not a betting man, but—

Lord Clement-Jones (LD)

For the record, the noble Viscount is getting a vigorous nod from the Minister.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

I am grateful to the noble Viscount for joining me in my enthusiasm for NUAR. He is right: having seen it in practice, I am a great enthusiast for it. If it is possible to demonstrate it to other people, I would be very happy to do so, because it is quite a compelling story when you see it in practice.

Amendment 56, in the name of the noble Lord, Lord Clement-Jones, would place a duty on the Secretary of State to consult relevant private sector organisations before implementing the NUAR provisions under the Bill. I want to make clear then that the Geospatial Commission, which oversees NUAR, has been engaging with stakeholders on NUAR since 2018. Since then, there have been extensive reviews of existing processes and data exchange services. That includes a call for evidence, a pilot project, public consultation and numerous workshops. A series of in-person focus groups were completed last week and officials have visited commercial companies with specific concerns, including LinesearchbeforeUdig, so there has been extensive consultation with them.

I suppose one can understand why they feel slightly put out about NUAR appearing on the scene, but NUAR is a huge public asset that we should celebrate. We can potentially use it in other ways for other services in the future, once it is established, and we should celebrate the fact that we have managed to create it as a public asset. I say to the noble Lord, Lord Clement-Jones, that a further consultation on that basis would provide no additional benefit but would delay the realisation of the significant benefits that NUAR could deliver.

Moving on to the noble Lord’s other amendments, Amendments 193, 194, and 195, he is absolutely right about the need for data interoperability in the health service. We can all think of examples of where that would be of benefit to patients and citizens. It is also true that we absolutely need to ensure that our health and care system is supported by robust information standards. Again, we go back to the issue of trust: people need to know that those protections are there.

This is why we would ensure, through Clause 119 and Schedule 15, that suppliers of IT products and services used in the provision of health or adult social care in England are required to meet relevant information standards. In doing so, we can ensure that IT suppliers are held to account where information standards are not implemented. The application of information standards is independent of commercial organisations, and we would hold IT companies to them. Furthermore, the definition of healthcare as set out in the Health and Social Care Act 2012, as amended by the Health and Care Act 2022, already ensures that all forms of healthcare are within scope of information standards, which would include primary care. That was one of the other points that the noble Lord made.

As an add-on to this whole discussion, the noble Lord will know that the Government are preparing the idea of a national data library, which would encourage further interoperability between government departments to make sure that we use it to improve services. Health and social care is the obvious one, but the members of the Committee can all think of all sorts of other ways where government departments, if they collaborated on an interoperable basis, could drive up standards and make life easier for a whole lot of citizens in different ways. We are on the case and are absolutely determined to deliver it. I hope that, on that basis, the noble Lord will withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I am sorry to interrupt the Minister, but she has whetted our appetite about the national data library. It is not included in the Bill. We talked about it a little at Second Reading, but I wonder whether she can tell us a little more about what is planned. Is it to be set up on a statutory basis or is it a shadow thing? What substance will it actually have and how?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Well, details of it were in our manifesto, in as much as a manifesto is ever detailed. It is a commitment to deliver cross-departmental government services and create a means whereby some of the GDPR blockages that stop one department speaking to another can, where necessary, be freed up to make sure that people exchange data in a more positive way to improve services. There will be more details coming out. It is a work in progress at the moment and may well require some legislation to underpin it. There is an awful lot of work to be done in making sure that one dataset can talk to another before we can progress in any major way, but we are working at speed to try to get this new system up and running.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I thank the Minister for that, which was very interesting. We were talking about medical health IT and “GDPR blockages” almost has a medical quality to it. The embryonic national data library will obviously get some more mentions as we go through the Bill. It is a work in progress, so I hope that we will know more at the end of the Bill than we did at the beginning.

The Minister talked about datasets talking to each other. We will have to get the noble Viscount, Lord Camrose, to use other phrases, not just “Netflix in the age of Blockbuster” but something equally exciting about datasets talking to each other.

--- Later in debate ---
Moved by
58: After Clause 64, insert the following new Clause—
“Review of notification of changes of circumstances legislation(1) The Secretary of State must commission a review of the operation of the Social Security (Notification of Changes of Circumstances) Regulations 2010.(2) In conducting the review, the designated reviewer must -(a) consider the current operation and effectiveness of the legislation (b) identify any gaps in its operations and provisions(c) consider and publish recommendations as to how the scope of the legislation could be expanded to include non-public sector, voluntary and private sector holders of personal data.(3) In undertaking the review, the reviewer must consult -(a) specialists in data sharing(b) people and organisations who campaign for the interests of people affected by, and use the legislation(c) any other persons and organisations the review considers appropriate.(4) The Secretary of State must lay a report of the review before each House of Parliament within six months of this Act coming into force.”Member's explanatory statement
This amendment requires a review of the operation of the ‘Tell Us Once’ programme—which seeks to provide simpler mechanisms for citizens to pass information regarding births and deaths to government—and consider whether the pioneering progress of Tell Us Once could be extended to non-public sector holders of data.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, of course I welcome the fact that the Bill will enable people to register a death in person and online, which was a key recommendation from the UK Commission on Bereavement. I have been asked to table this amendment by Marie Curie; it is designed to achieve improvements to UK bereavement support services, highlighting the significant administrative burden faced by bereaved individuals.

Marie Curie points to the need for a review of the existing Tell Us Once service and the creation of a universal priority service register to streamline death-related notifications across government and private sectors. It argued that the Bill presents an opportunity to address these issues through improved data-sharing and online death registration. Significant statistics illustrate the scale of the problem, showing a large percentage of bereaved people struggling with numerous administrative tasks. It urges the Government, as I do, to commit to implementing those changes to reduce the burden on bereaved families.

Bereaved people face many practical and administrative responsibilities and tasks after a death, which are often both complex and time sensitive. This Bill presents an opportunity to improve the way in which information is shared between different public and private service providers, reducing the burden of death administration.

When someone dies, the Tell Us Once service informs the various parts of national and local government that need to know. That means the local council stops charging council tax, the DVLA cancels the driving licence, the Passport Office cancels the passport, et cetera. Unfortunately, Tell Us Once is currently not working across all government departments and does not apply to Northern Ireland. No updated equality impact assessment has ever been undertaken. While there are death notification services in the private sector, they are severely limited by not being a public service programme—and, as a result, there are associated user costs, adding to bereaved people’s financial burden and penalising the families who are struggling most. There is low public awareness and take-up among all these services, as well as variable and inconsistent provision by the different companies. The fact that there is not one service for all public and private sector notifications means that dealing with the deceased’s affairs is still a long and painful process.

The Bill should be amended to require Ministers to carry out a review into the current operation and effectiveness of the Tell Us Once service, to identify any gaps in its operation and provisions and to make recommendations as to how the scope of the service could be expanded. Priority service registers are voluntary schemes which utility companies create to ensure that extra help is available to certain vulnerable customers. The previous Government recognised that the current PSRs are disjointed, resource-intensive and duplicative for companies, carry risks of inconsistency and can be “burdensome for customers”.

That Government concluded that there is “significant opportunity to improve the efficiencies and delivery of these services”. The Bill is an opportunity for this Government to confirm their commitment to implementing a universal priority services register and delivering any legislative measures required to facilitate it. A universal PSR service must include the interests of bereaved people within its scope, and charitable voluntary organisations such as Marie Curie, which works to support bereaved people, should be consulted in its development.

I have some questions to the Minister. First, what measures does this Bill introduce that will reduce the administrative burden on bereaved people after the death of a loved one? Secondly, the Tell Us Once service was implemented in 2010 and the original equality impact assessment envisaged that its operation should be kept under review to reflect the changing nature of how people engage with public services, but no review has ever happened. Will the Minister therefore commit the Government to undertake a review of Tell Us Once? Thirdly, the previous Government’s Smarter Regulation White Paper committed to taking forward a plan to create a “shared once” support register, which would bring together priority service registers. Will the Minister commit this Government to taking that work forward? I beg to move.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, it occurred to me when the noble Lord was speaking that we had lost a valuable member of our Committee. This could not be the noble Lord, Lord Clement-Jones, who was speaking to us just then. It must have been some form of miasma or technical imposition. Maybe his identity has been stolen and not been replaced. Normally, the noble Lord would have arrived with a short but punchy speech that set out in full how the new scheme was to be run, by whom, at what price, what its extent would be and the changes that would result. The Liberal future it may have been, but it was always delightful to listen to. I am sad that all the noble Lord has asked for here is a modest request, which I am sure the noble Baroness will want to jump to and accept, to carry out a review—as if we did not have enough of those.

Seriously, I once used the service that we have been talking about when my father-in-law died, and I found it amazing. It was also one that I stumbled on and did not know about before it happened. Deaths did not happen often enough in my family to make me aware of it. But, like the noble Lord, Lord Clement-Jones, I felt that it should have done much more than what it did, although it was valuable for what it did. It also occurred to me, as life moved on and we produced children, that there would be a good service when introducing a new person—a service to tell you once about that, because the number of tough issues one has to deal with when children are born is also extraordinary and can be annoying, if you miss out on one—particularly with the schooling issues, which are more common these days than they were when my children were being born.

I endorse what was said, and regret that the amendment perhaps did not go further, but I hope that the Minister when she responds will have good news for us.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I only come up with the really positive ones.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

We support this service, of course—we can see the potential for expanding it further if we get this measure right—but I have to tell noble Lords that the current service is not in great shape in terms of its technology. It has suffered from insufficient investment over time and it needs to be improved before we can take it to the next stage of its potential. We consider that the best way to address this issue is, first, to upgrade its legacy technology, which is what we are operating at the moment. I realised that this is a problem only as I took over this brief; I had assumed that it would be more straightforward, but the problem seems to be that we are operating on ancient technology here.

Work is already under way to try to bring it all up to date. We are looking to improve the current service and at the opportunities to extend it to more of government. Our initial task is to try to extend it to some of the government departments that do not recognise it at the moment. Doing that will inform us of the potential limitations and the opportunities should we wish to extend it to the private sector in future. I say to the noble Lord that this will have to be a staged process because of the technological challenges that we currently have.

We are reluctant to commit to a review and further expansion of the service at this time but, once the service is updated, we would absolutely be happy to talk to noble Lords and revisit this issue, because we see the potential of it. The update is expected to be completed in the next two years; I hope that we will be able to come back and give a progress report to noble Lords at that time. However, I have to say, this is what we have inherited—bear with us, because we have a job to do in bringing it up to date. I hope that, on that basis, the noble Lord will withdraw his amendment, albeit reluctantly.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for that response, and I thank the noble Lord, Lord Stevenson—at least, I think I do—for his contribution.

I have clearly worked on far too many Bills in the past. I have to do better when I move amendments like this. I have to bring the full package, but we are allowed to speak for only a quarter of an hour, so we cannot bring everything to the table. All I can promise the noble Viscount is that my avatar will haunt him while he is sitting on the fence.

I thank the Minister for giving a sympathetic response to this, but clearly there are barriers to rolling out anything beyond where we have got to. I was rather disappointed by two years because I was formulating a plan to bring back an Oral Question in about six months’ time. I am afraid that she may find that we are trying to hurry her along a little on this. I recognise that there are technology issues, but convening people and getting broader engagement with various players is something that could be done without the technology in the first instance, so the Minister can expect follow-up on this front rather earlier than two years’ time. She does not have the luxury of waiting around before we come back to her on it, but I thank her because this is a fantastic service. It is limited, but, as far as it goes, it is a godsend for the bereaved. We need to make sure that it improves and fulfils its potential across the private sector as well as the public sector. In the meantime, I beg leave to withdraw my amendment.

Amendment 58 withdrawn.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I support the amendments from the noble Viscount, Lord Colville, which I have signed, and will put forward my Amendments 64, 68, 69, 130 and 132 and my Clause 85 stand part debate.

This part of the GDPR is a core component of how data protection law functions. It makes sure that organisations use personal data only for the reason that it was collected. One of the exceptional circumstances is scientific research. Focus on the definitions and uses of data in research increased in the wake of the Covid-19 pandemic, when some came to the view that legal uncertainty and related risk aversion were a barrier to clinical research.

There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or very narrow distinctions between the original and new purpose. The Government’s position seems to be that the Bill will only clarify the law, incorporating recitals to the original GDPR in the legislation. While this may be the policy intention, the Bill must be read in the context of recent developments in artificial intelligence and the practice of AI developers.

The Government need to provide reassurance that the intention and impact of the research provisions are not to enable the reuse of personal data, as the noble Viscount said, scraped from the internet or collected by tech companies under legitimate interest for training AI. Large tech companies could abuse the provisions to legitimise the mass scraping of personal data from the internet, or the use of data collected under legitimate interest—for example, by a social media platform about its users. Such data could be legally reused for training AI systems under the new provisions if developers can claim that it constitutes scientific research. That is why we very much support what the noble Viscount said.

In our view, the definition of scientific research adopted in the Bill is too broad and will permit abuse by commercial interests outside the policy intention. The Bill must recognise the reality that companies will likely position any AI development as “reasonably described as scientific”. Combined with the inclusion of commercial activities in the Bill, that opens the door to data reuse for any data-driven product development under the auspices that it represents scientific research, even where the relationship to real scientific progress is unclear or tenuous. That is not excluded in these provisions.

I turn to Amendments 64, 68, 69, 130 and 132 and the Clause 85 stand part debate. The definition of scientific research in proposed new paragraph 2 under Clause 67(1)(b) is drawn so broadly that most commercial development of digital products and services, particularly those involving machine learning, could ostensibly be claimed by controllers to be “reasonably described as scientific”. Amendment 64, taken together with those tabled by the noble Viscount that I have signed, would radically reduce the scope for misuse of data reuse provisions by ensuring that controllers cannot mix their commercial purposes with scientific research and that such research must be in the public interest and conducted in line with established academic practice for genuine scientific research, such as ethics approval.

Since the Data Protection Act was introduced in 2018, based on the 2016 GDPR, the education sector has seen enormous expansion of state and commercial data collection, partly normalised in the pandemic, of increased volume, sensitivity, intrusiveness and high risk. Children need particular care in view of the special environment of educational settings, where pupils and families are disempowered and have no choice over the products procured, which they are obliged to use for school administrative purposes, for learning in the classroom, for homework and for digital behavioural monitoring.

The implications of broadening the definition of research activities conducted within the state education sector include questions of the appropriateness of applying the same rules where children are in a compulsory environment without agency or routine practice for research ethics oversight, particularly if the definition is expanded to commercial activity.

Parental and family personal data is often inextricably linked to the data of a child in education, such as home address, heritable health conditions or young carer status. The Responsible Technology Adoption Unit within DSIT commissioned research with the Department for Education to understand how parents and pupils feel about the use of AI tools in education and found that, while parents and pupils did not expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.

Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order

“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.

This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.

Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:

“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.


Tightening up the definition of “scientific research” to exclude activities that are primarily commercial would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.

Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.

With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. The suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research” would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.

I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.

This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.



One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.

I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I feel we are getting slightly repetitive, but before I, too, repeat myself, I should like to say something that I did not get the chance to say to the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, and others: I will write, we will meet—all the things that you have asked for, you can take it for granted that they will happen, because we want to get this right.

I say briefly to the noble Baroness: we are in danger of thinking that the only good research is health research. If you go to any university up and down the country, you find that the most fantastic research is taking place in the most obscure subjects, be it physics, mechanical engineering, fabrics or, as I mentioned earlier, quantum. A lot of great research is going on. We are in danger of thinking that life sciences are the only thing that we do well. We need to open our minds a bit to create the space for those original thinkers in other sectors.

--- Later in debate ---
In response to the noble Lord, Lord Holmes, and to other points on AI legislation, as per the King’s Speech, the Government are seeking to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models. The next steps on that will be announced in the usual way—so maybe not this side of Santa, as I was asked.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Can the Minister say whether this will be a Bill, a draft Bill or a consultation?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

We will announce this in the usual way—in due course. I refer the noble Lord to the King’s Speech on that issue. I feel that noble Lords want more information, but they will just have to go with what I am able to say at the moment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Perhaps another aspect the Minister could speak to is whether this will be coming very shortly, shortly or imminently.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Let me put it this way: other things may be coming before it. I think I promised at the last debate that we would have something on copyright in the very, very, very near future. This may not be as very, very, very near future as that. We will tie ourselves in knots if we carry on pursuing this discussion.

On that basis, I hope that this provides noble Lords with sufficient reassurance not to press their amendments.

--- Later in debate ---
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I welcome the noble Viscount to the sceptics’ club because he has clearly had a damascene conversion. It may be that this goes too far. I am slightly concerned, like him, about the bureaucracy involved in this, which slightly gives the game away. It could be seen as a way of legitimising commercial research, whereas we want to make it absolutely certain that that research is for the public benefit, rather than imposing an ethical board on every single aspect of research which has any commercial content.

We keep coming back to this, but we seem to be degrouping all over the place. Even the Government Whips Office seems to have given up trying to give titles for each of the groups; they are just called “degrouped” nowadays, which I think is a sign of deep depression in that office. It does not tell us anything about what the different groups contain, for some reason. Anyway, it is good to see the noble Viscount, Lord Camrose, kicking the tyres on the definition of the research aspect.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.

As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring scientific researchers working for a commercial company to submit their research to an ethics committee. As I said on the previous group, making this a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector, and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared by an ethics committee at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.

Amendment 80 relates to Clause 71 and the reuse of personal data. This would put at risk valuable research that relies on data originally generated from diverse contexts, since the difference between the purposes may not always be compatible.

Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise definition of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and he feels content to withdraw his amendment on this basis.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, I rise to move the amendment standing in my name and to speak to my other amendments in this group. I am grateful to the noble Baroness, Lady Kidron and the noble Lord, Lord Clement-Jones, for signing a number of those amendments, and I am also very grateful to Foxglove Legal and other bodies that have briefed me in preparation for this.

My amendments are in a separate group, and I make no apology for that because although some of these points have indeed been covered in other amendments, my focus is entirely on NHS patient data, partly because it is the subject of a wider debate going on elsewhere about whether value can be obtained for it to help finance the National Health Service and our health in future years. This changes the nature of the relationship between research and the data it is using, and I think it is important that we focus hard on this and get some of the points that have already been made into a form where we can get reasonable answers to the questions that it leaves.

If my amendments are accepted or agreed—a faint hope—they would make it clear beyond peradventure that the consent protections in the Bill apply to the processing of data for scientific research, that a consistent definition of consent is applied and that that consistent definition is the one with which researchers and the public are already familiar and can trust going forward.

The Minister said at the end of Second Reading, in response to concerns I and others raised about research data in general and NHS data in particular, that the provisions in this Bill

“do not alter the legal obligations that apply in relation to decisions about whether to share data”.—[Official Report, 19/11/24; col. 196.]

I accept that that may be the intention, and I have discussed this with officials, who make the same point very strongly. However, Clause 68 introduces a novel and, I suggest, significantly watered-down definition of consent in the case of scientific research. Clause 71 deploys this watered-down definition of consent to winnow down the “purpose limitation” where the processing is for the purposes of scientific research in the public interest. Taken together, this means that there has been a change in the legal obligations that apply to the need to obtain consent before data is shared.

Clause 68 amends the pivotal definition of consent in Article 4(11). Instead of consent requiring something express—freely given, specific, informed, and unambiguous through clear affirmative action—consent can now be imputed. A data subject’s consent is deemed to meet these strict requirements even when it does not, as long as the consent is given to the processing of personal data for the purposes of an area of scientific research; at the time the consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed; seeking consent in relation to the area of scientific research is consistent with generally recognised ethical standards relevant to the area of research; and, so far as the intended purposes of the processing allow, the data subject is given the opportunity to consent to processing for only part of the research. These all sound very laudable, but I believe they cut down the very strict existing standards of consent.

Proposed new paragraph 7, in Clause 68, then extends the application of this definition across the regulation:

“References in this Regulation to consent given for a specific purpose (however expressed) include consent described in paragraph 6.”


Thus, wherever you read “consent” in the regulation you can also have imputed consent as set out in proposed new paragraph 6 of Article 4. This means that “consent” within the meaning of Article 6(1)(a)—that is, the basis for lawful processing—can be imputed consent in the new way introduced by the Bill, so there is a new type of lawful basis for processing.

The Minister is entitled to disagree, of course; I expect him to say that when he comes to respond. I hope that, when he does, he will agree that we share a concern on the importance of giving researchers a clear framework, as it is this uncertainty about the legal framework that could inadvertently act as a barrier to the good research we all need. So my first argument today is that, as drafted, the Bill leaves too much room for different interpretations, which will lead to exactly the kind of uncertainty that the Minister—indeed, all of us—wish to avoid.

As we have heard already, as well as the risk of uncertainty among researchers, there is also the risk of distrust among the general public. The public rightly want and expect to have a say in what uses their data is put to. Past efforts to modernise how the NHS uses data, such as care.data, have been expensive failures, in part because they have failed to win the public’s trust. More than 3.3 million people have already opted out of NHS data sharing under the national data opt-out; that is nearly 8% of the adults whose data could have been included. We have talked about the value of our data and being the gold standard or gold attractor for researchers but, if we do not have all the people who could contribute, we are definitely devaluing and debasing that research. Although we want to respect people’s choice as to whether to participate, of course, this enormous vote against research reflects a pretty spectacular failure to win public trust—one that undermines the value and quality of the data, as I said.

So my second point is that watering down the rights of those whose data is held by the NHS will not put that data for research purposes on a sustainable, long-term footing. Surely, we want a different outcome this time. We cannot afford more opt-outs; we want people opting back in. I argue that this requires a different approach—one that wins the public’s trust and gains public consent. The Secretary of State for Health is correct to say that most of the public want to see the better use of health data to help the NHS and to improve the health of the nation. I agree, but he must accept that the figures show that the general public also have concerns about privacy and about private companies exploiting their data without them having a say in the matter. The way forward must be to build trust by genuinely addressing those concerns. There must not be even a whiff of watering down legal protections, so that those concerns can instead be turned into support.

This is also important because NHS healthcare includes some of the most intimate personal data. It cannot make sense for that data to have a lower standard of consent protection going forward if it is being used for research. Having a different definition of consent and a lower standard of consent will inevitably lead to confusion, uncertainty and mistrust. Taken together, these amendments seek to avoid uncertainty and distrust, as well as the risk of backlash, by making it abundantly clear that Article 4 GDPR consent protections apply despite the new wording introduced by this Bill. Further, these are the same protections that apply to other uses of data; they are identical to the protections already understood by researchers and by the public.

I turn now to a couple of the amendments in this group. Amendment 71 seeks to address the question of consent, but in a rather narrow way. I have argued that Clause 68 introduces a novel and significantly watered-down definition of consent in the case of scientific research; proposed new paragraph 7 deploys this watered-down definition to winnow down the purpose limitation. There are broader questions about the wisdom of this, which Amendments 70, 79 and 81 seek to address, but Amendment 71 focuses on the important case of NHS health data.

If the public are worried that their health data might be shared with private companies without their consent, we need an answer to that. We see from the large number of opt-outs that there is already a problem; we have also seen it recently in NHS England’s research on public attitudes to health data. This amendment would ensure that the Bill does not increase uncertainty or fuel patient distrust of plans for NHS data. It would help to build the trust that data-enabled transformation of the NHS requires.

The Government may well retort that they are not planning to share NHS patient data with commercial bodies without patient consent. That is fine, but it would be helpful if, when he comes to respond, the Minister could say that clearly and unambiguously at the Dispatch Box. However, I put it to him that, if he could accept these amendments, the law would in fact reflect that assurance and ensure that any future Government would need to come back to Parliament if they wanted to take a different approach.

It is becoming obvious that whether research is in the public interest will be the key issue that we need to resolve in this Bill, and Amendment 72 provides a proposal. The Bill makes welcome references to health research being in the public interest, but it does not explain how on earth we decide or how that requirement would actually bite. Who makes the assessment? Do we trust a rogue operator to make its own assessment of whether its research is in the public interest? What would be examples of the kind of research that the Government expect this requirement to prevent? I look forward to hearing the answer to that, but perhaps it would be more helpful if the Minister responded in a letter. In the interim, this amendment seeks to introduce some procedural clarity about how research will be certified as being in the public interest. This would provide clarity and reassurance, and I commend it to the Minister.

Finally, Amendment 131 seeks to improve the appropriate safeguards that would apply to processing for research, archiving and scientific purposes, including a requirement that the data subject has given consent. This has already been touched on in another amendment, but it is a way of seeking to address the issues that Amendments 70, 79 and 81 are also trying to address. Perhaps the Government will continue to insist that this is addressing a non-existent problem because nothing in Clauses 69 or 71 waters down the consent or purpose limitation protections and therefore the safeguards themselves add nothing. However, as I have said, informed readers of the Bill are interpreting it differently, so spelling out this safeguard would add clarity and avoid uncertainty. Surely such clarity on such an important matter is worth a couple of lines of additional length in a 250-page Bill. If the Government are going to argue that our Amendment 131 adds something objectionable, let them explain what is objectionable about consent protections applying to data processing for these purposes. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I support Amendments 70 to 72, which I signed, in the name of the noble Lord, Lord Stevenson of Balmacara. I absolutely share his view about the impact of Clause 68 on the definition of consent and the potential and actual mistrust among the public about sharing of their data, particularly in the health service. It is highly significant that 3.3 million people have opted out of sharing their patient data.

I also very much share the noble Lord’s views about the need for public interest. In a sense, this takes us back to the discussion that we had on previous groups about whether we should add that in a broader sense so not purely for health data or whatever but for scientific research more broadly, as he specifies. I very much support what he had to say.

Broadly speaking, the common factor between my clause stand part notice and what he said is health data. Data subjects cannot make use of their data rights if they do not even know that their data is being processed. Clause 77 allows a controller reusing data under the auspices of scientific research not to notify a data subject in accordance with their rights under Articles 13 and 14 if doing so

“is impossible or would involve a disproportionate effort”.

We on these Benches believe that Clause 77 should be removed from the Bill. The safeguards are easily circumvented. The newly articulated compatibility test in new Article 8A, inserted by Clause 71, which specifies how closely related the new and existing purposes for data use need to be to permit reuse, is essentially automatically passed if the processing is conducted

“for the purposes of scientific research or historical research”.

This makes it even more necessary for the definition of scientific research to be tightened to prevent abuse.

Currently, data controllers must provide individuals with information about the collection and use of their personal data. These transparency obligations generally do not require the controller to contact each data subject. Such obligations can usually be satisfied by providing privacy information using different techniques that can reach large numbers of individuals, such as relevant websites, social media, local newspapers and so on.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.

In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.

The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.

I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.

So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.

We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, in as much as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.

--- Later in debate ---
Moved by
73: Clause 70, page 77, leave out lines 34 to 38
Member's explanatory statement
This amendment and another amendment in Lord Clement-Jones’s name to clause 70 omit paragraphs 70(2)(b)-(c), (4), (5) and (6), which make amendments to UK GDPR to define certain data processing activities as “recognised legitimate interests”.
--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones
- Hansard - -

My Lords, I start with an apology, because almost every amendment in this group is one of mine and I am afraid I have quite a long speech to make about the different amendments, which include Amendments 73, 75, 76, 77, 78, 78A, 83, 84, 85, 86, 89 and 90, and stand part debates on Schedules 4, 5 and 7 and Clause 74. But I know that the Members of this Committee are made of strong stuff.

Clause 70 and Schedule 4 introduce a new ground of recognised legitimate interest, which in essence counts as a lawful basis for processing if it meets any of the descriptions in the new Annexe 1 to the UK GDPR, which is at Schedule 4 to the Bill—for example, processing necessary for the purposes of responding to an emergency or detecting crime. These have been taken from the previous Government’s Data Protection and Digital Information Bill. This is supposed to reduce the burden on data controllers and the cost of legal advice when they have to assess whether it is okay to use or share data or not. Crucially, while the new ground shares its name with “legitimate interest”, it does not require the controller to make any balancing test taking the data subject’s interests into account. It just needs to meet the grounds in the list. The Bill gives the Secretary of State powers to define additional recognised legitimate interests beyond those in Annexe 1—a power heavily criticised by the Delegated Powers and Regulatory Reform Committee’s report on the Bill.

Currently where a private body shares personal data with a public body in reliance on Article 6(1)(e) of the GDPR, it can rely on the condition that the processing is

“necessary for the performance of a task carried out in the public interest”.

New conditions in Annexe 1, as inserted by Schedule 4, would enable data sharing between the private and public sectors to occur without any reference to a public interest test. In the list of recognised legitimate interests, the most important is the ability of any public body to ask another controller, usually in the private sector, for the disclosure of personal data it needs to deliver its functions. This applies to all public bodies. The new recognised legitimate interest legal basis in Clause 70 and Schedule 4 should be dropped.

Stephen Cragg KC, giving his legal opinion on the DPDI Bill, which, as I mentioned, has the same provision, stated that this list of recognised legitimate interests

“has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned”.

The ICO has also flagged concerns about recognised legitimate interests. In its technical drafting comments on the Bill, it said:

“We think it would be helpful if the explanatory notes could explicitly state that, in all the proposed new recognised legitimate interests, an assessment of necessity involves consideration of the proportionality of the processing activity”.


An assessment of proportionality is precisely what the balancing test is there to achieve. Recognised legitimate interests undermine the fundamental rights and interests of individuals, including children, in specific circumstances.

When companies are processing data without consent, it is essential that they do the work to balance the interests of the people who are affected by that processing against their own interests. Removing recognised legitimate interests from the Bill will not stop organisations from sharing data with the public sector or using data to advance national security, detect crime or safeguard children and vulnerable people. The existing legitimate interest lawful basis is more than flexible enough for these purposes. It just requires controllers to consider and respect people’s rights as they do so.

During the scrutiny of recognised legitimate interests in the DPDI Bill—I am afraid to have to mention this—the noble Baroness, Lady Jones of Whitchurch, who is now leading on this Bill as the Minister, raised concerns about the broad nature of the objectives. She rightly said:

“There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests”.—[Official Report, 25/3/24; col. GC 106.]


She never spoke a truer word.

However, this Government have reintroduced the same extra power with no new articulation of any strong reason for needing it. The constraints placed on the Secretary of State are slightly higher in this Bill than they were in the DPDI Bill, as new paragraph (9), inserted by Clause 70(4), means that they are able to add new recognised legitimate interests only if they consider the processing to be necessary to safeguard an objective listed in UK GDPR Article 23(1)(c) to (j). However, this list includes catch-alls, such as

“other important objectives of general public interest”.

To give an example of what this power would allow, the DPDI Bill included a recognised legitimate interest relating to the ability of political parties to use data about citizens during election campaigns on the basis that democratic participation is an objective of general public interest. I am glad to say that this is no longer included. Another example is that a future Secretary of State could designate workplace productivity as a recognised legitimate interest—which, without a balancing test, would open the floodgates to intrusive workplace surveillance and unsustainable data-driven work intensification. That does not seem to be in line with the Government’s objectives.

Amendment 74 is rather more limited. Alongside the BMA, we are unclear about the extent of the impact of Clause 70 on the processing of health data. It is noted that the recognised legitimate interest avenue appears to be available only to data controllers that are not public authorities. Therefore, NHS organisations appear to be excluded. We would welcome confirmation that health data held by an NHS data controller is excluded from the scope of Clause 70 now and in the future, regardless of the lawful basis that is being relied on to process health data.

--- Later in debate ---
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, when the noble Lord, Lord Clement-Jones, opened his speech he said that he hoped that noble Lords would be made of strong stuff while he worked his way through it. I have a similar request regarding my response: please bear with me. I will address these amendments slightly out of order to ensure that related issues are grouped together.

The Schedule 4 stand part notice, and Amendments 73 and 75, tabled by the noble Lord, Lord Clement-Jones, and supported by the noble Baroness, Lady Kidron, would remove the new lawful ground of “recognised legitimate interests” created by Clause 70 and Schedule 4 to the Bill. The aim of these provisions is to give data controllers greater confidence about processing personal data for specified and limited public interest objectives. Processing that is necessary and proportionate to achieve one of these objectives can take place without a person’s consent and without undertaking the legitimate interests balancing test. However, they would still have to comply with the wider requirements of data protection legislation, where relevant, ensuring that the data is processed in compliance with the other data protection principles.

I say in response to the point raised by the noble Lord, Lord Cameron, that the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively.

The activities listed include processing of data where necessary to prevent crime, safeguarding national security, protecting children or responding to emergencies. They also include situations where a public body requests that a non-public body share personal data with it to help deliver a public task that is sanctioned by law. In these circumstances, it is very important that data is shared without delay, and removal of these provisions from the Bill, as proposed by the amendment, could make that harder.

Amendment 74, tabled by the noble Lord, Lord Scriven, would prevent health data being processed as part of this new lawful ground, but this could have some unwelcome effects. For example, the new lawful ground is designed to give controllers greater confidence about reporting safeguarding concerns, but if these concerns relate to a vulnerable person’s health, they would not be able to rely on the new lawful ground to process the data and would have to identify an alternative lawful ground.

On the point made by the noble Lord, Lord Clement-Jones, about which data controllers can rely on the new lawful ground, it would not be available to public bodies such as the NHS; it is aimed at non-public bodies.

I reassure noble Lords that there are still sufficient safeguards in the wider framework. Any processing that involves special category data, such as health data, would also need to comply with the conditions and safeguards in Article 9 of the UK GDPR and Schedule 1 to the Data Protection Act 2018.

Amendment 78A, tabled by the noble Lord, Lord Clement-Jones, would remove the new lawful ground for non-public bodies or individuals to disclose personal data at the request of public bodies, where necessary, to help those bodies deliver their public interest tasks without carrying out a legitimate interest balancing test. We would argue that, without it, controllers may lack certainty about the correct lawful ground to rely on when responding to such requests.

Amendment 76, also tabled by the noble Lord, Lord Clement-Jones, would remove the powers of regulations in Clause 70 that would allow the Secretary of State to keep the list of recognised legitimate interests up to date. Alternatively, the noble Lord’s Amendment 78 would require the Secretary of State to publish a statement every time he added a new processing activity to the list, setting out its purpose, which controllers it was aimed at and for how long they can use it. I reassure the noble Lord that the Government have already taken steps to tighten up these powers since the previous Bill was considered by this House.

Any new processing activities added would now also have to serve

“important objectives of … public interest”

as described in Article 23(1) of the UK GDPR and, as before, new activities could be added to the list only following consultation with the ICO and other interested parties. The Secretary of State would also have to consider the impact of any changes on people’s rights and have regard to the specific needs of children. Although these powers are likely to be used sparingly, the Government think it important that they be retained. I reassure the Committee that we will be responding to the report from the Delegated Powers Committee within the usual timeframes and we welcome its scrutiny of the Bill.

The noble Lord’s Amendment 77 seeks to make it clear that organisations should also be able to rely on Article 6(1)(f) to make transfers between separate businesses affiliated by contract. The list of activities mentioned in Clause 70 is intended to be illustrative only and is drawn from the recitals to the UK GDPR. This avoids providing a very lengthy list that might be viewed as prescriptive. Article 6(1)(f) of the UK GDPR is flexible. The transmission of personal data between businesses affiliated by contract may constitute a legitimate interest, like many other commercial interests. It is for the controller to determine this on a case-by-case basis.

I will now address the group of amendments tabled by the noble Lord, Lord Clement-Jones, concerning the purpose limitation principle, specifically Amendments 83 to 86. This principle limits the ways that personal data collected for one purpose can be used for another, but Clause 71 aims to provide more clarity and certainty around how it operates, including how certain exemptions apply.

Amendment 84 seeks to clarify whether the first exemption in proposed new Annexe 2 to the UK GDPR would allow personal data to be reused for commercial purposes. The conditions for using this exemption are that the requesting controller has a public task or official authority laid down in law that meets a public interest objective in Article 23(1) of the UK GDPR. As a result, I and the Government are satisfied that these situations would be for limited public interest objectives only, as set out in law.

Amendments 85 and 86 seek to introduce greater transparency around the use of safeguarding exemptions in paragraph 8 of new Annexe 2. These conditions are drawn from the Care Act 2014 and replicated in the existing condition for sensitive data processing for safeguarding purposes in the Data Protection Act 2018. I can reassure the Committee that processing cannot occur if it does not meet these conditions, including if the vulnerability of the individual no longer exists. In addition, requiring that an assessment be made and given to the data subject before the processing begins could result in safeguarding delays and would defeat the purpose of this exemption.

Amendment 83 would remove the regulation-making powers associated with this clause so that new exceptions could not be added in future. I remind noble Lords that there is already a power to create exemptions from the purpose limitation principle in the DPA 2018. This Bill simply moves the existing exemptions to a new annexe to the UK GDPR. The power is strictly limited to the public objectives listed in Article 23(1) of the UK GDPR.

I now turn to the noble Lord’s Amendment 89, which seeks to set conditions under which pseudonymised data should be treated as personal data. This is not necessary as pseudonymised data already falls within the definition of personal data under Article 4(1) of the UK GDPR. This amendment also seeks to ensure that a determination by the ICO that data is personal data applies

“at all points in that processing”.

However, the moment at which data is or becomes personal should be a determination of fact based on its identifiability to a living individual.

I turn now to Clause 74 stand part, together with Amendment 90. Noble Lords are aware that special categories of data require additional protection. Article 9 of the UK GDPR sets out an exhaustive list of what is sensitive data and outlines processing conditions. Currently, this list cannot be amended without primary legislation, which may not always be available. This leaves the Government unable to respond swiftly when new types of sensitive data are identified, including as a result of emerging technologies. The powers in Clause 74 enable the Government to respond more quickly and add new special categories of data, tailor the conditions applicable to their use and add new definitions if necessary.

Finally, I turn to the amendment tabled by the noble Lord, Lord Clement-Jones, that would remove Schedule 7 from the Bill. This schedule contains measures to create a clearer and more outcomes-focused UK international data transfers regime. As part of these reforms, this schedule includes a power for the Secretary of State to recognise new transfer mechanisms for protecting international personal data transfers. Without this, the UK would be unable to respond swiftly to emerging developments and global trends in personal data transfers. In addition, the ICO will be consulted on any new mechanisms, and they will be subject to debate in Parliament under the affirmative resolution procedure.

I hope this helps explain the Government’s intention with these clauses and that the noble Lord will feel able to withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister. She covered quite a lot of ground and all of us will have to read Hansard quite carefully. However, it is somewhat horrifying that, for a Bill of this size, we had about 30 seconds from the Minister on Schedule 7, which could have such a huge influence on our data adequacy when that is assessed next year. I do not think anybody has talked about international transfers at this point, least of all me in introducing these amendments. Even though it may appear that we are taking our time over this Bill, we are not fundamentally covering all its points. The importance of this Bill, which obviously escapes most Members of this House—there are just a few aficionados—is considerable and could have a far-reaching impact.

I still get Viscount Camrose vibes coming from the Minister.

None Portrait Noble Lords
- Hansard -

Oh!

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Perhaps I should say that this kind of enthusiasm clearly conquers all. I should thank a former Minister, the noble Lord, Lord Kamall, and I thank the noble Baroness, Lady Kidron, for her thoughtful speech, particularly in questioning the whole recognised legitimate interest issue, especially in relation to vulnerable individuals.

It all seems to be a need for speed, whether it is the Secretary of State who has to make snappy decisions or a data controller. We are going to conquer uncertainty. We have to keep bustling along. In a way, to hell with individual data rights; needs must. I feel somewhat Canute-like holding up the barrier of data that will be flowing across us. I feel quite uncomfortable with that. I think the DPRRC is likewise going to feel pretty cheesed off.

--- Later in debate ---
Moved by
82: Clause 71, page 81, line 14, at end insert—
“4A. Where the controller collected the personal data based on Article 6(1)(a) (data subject’s consent), processing for a new purpose is not compatible with the original purpose if—
(a) the data subject is a child,
(b) the processing is based on consent given or authorised by the holder of parental responsibility over the child,
(c) the data subject is an adult to whom either (a) or (b) applied at the time of the consent collection, or
(d) the data subject is a deceased child.”
Member’s explanatory statement
This amendment seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.

Additional safeguards are required for the protection of children’s data. This amendment

“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.

The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.

For most children’s data processing, adults give permission on their behalf. The extension of this for additional purposes may be incompatible with what a data subject later wishes as an adult. The only protection they may have is purpose limitation to ensure that they are reconsented or informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” at the time of collection, which this change does away with.

There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should data subjects’ current rights be undermined as a child and, through this change, never be able to be reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation that may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms more with the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that data rights have been given away by their parents on their behalf.

Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.

The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:

“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.


As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.


During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.

The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, concerns remain, despite the strategy statement and the amendment to the original Online Safety Bill, that smaller platforms with significant potential for harm might not be sufficiently covered under the category 1 designation. Overall, while the final approach allows some flexibility, there is considerable debate about whether Ofcom will place enough emphasis in its categorisation on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be rigorously addressing these issues. I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.

Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.

This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.

First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling to both the Government and the regulator arguing that the proposed age of adulthood in the AADC be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cited the UNCRC that defines a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.

In contrast, on several other issues both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about the scope and form of the coroner provisions were initially diluted and had to be refought for a second time by bereaved parents. Other promises, such as those of a mixed economy, age-assurance requirements and a focus on contact harm, features and functionalities as well as content, reflected Parliament’s intention but do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.

Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s rights to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used, and narrow definitions of what must and may not be taken down.

Tech is various, its contexts infinite, its rate of change giddy and the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be, what the principles are, and not what the process is. My argument for this amendment is that we need to fix our intention that in the Bill children have an established set of needs according to their evolving capacity. Similarly, they have a right to a higher bar of privacy, so that both these principles become unavoidable.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I thank the Minister for her response. I should say at the outset that, although I may have led the group, it is clear that the noble Baroness, Lady Kidron, leads the pack as far as this is concerned. I know that she wants me to say that the noble Baroness, Lady Harding, wished to say that she was extremely sorry not to be able to attend as she wanted to associate herself wholeheartedly with these amendments. She said, “It’s so disappointing still to be fighting for children’s data to have higher protection but it seems that that’s our lot!” I think she anticipated the response, sadly. I very much thank the noble Baroness, Lady Kidron, the noble Lords, Lord Russell and Lord Stevenson, and the noble Viscount, Lord Camrose, in particular for his thoughtful response to Amendment 196.

I was very interested in the intervention from the noble Lord, Lord Stevenson, and wrote down “Not invented here” to sum up the Government’s response to some of these amendments, which has been consistently underwhelming throughout the debates on the DPDI Bill and this Bill. They have brought out such things as “the unintended effects” and said, “We don’t want to interfere with the ICO”, and so on. This campaign will continue; it is really important. Obviously, we will read carefully what the Minister said but, given the troops behind me, I think the campaign will only get stronger.

The Minister did not really deal with the substance of Amendment 196, which was not just a cunning ploy to connect the Bill with the Online Safety Act; it was about current intentions on categorisation. There is considerable concern that the current category 1 is overconservative and that we are not covering the smaller, unsafe social media platforms. When we discussed the Online Safety Bill, both in the Joint Committee and in the debates on subsequent stages of the Bill, it was clear that this was about risk, not just size, and we wanted to cover those risky, smaller platforms as well. While I appreciate the Government’s strategic statement, which made it pretty clear, and without wishing to overly terrorise Ofcom, we should make our view on categorisation pretty clear, and the Government should do likewise.

This argument and debate will no doubt continue. In the meantime, I beg leave to withdraw my amendment.

Amendment 82 withdrawn.
--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I start by speaking to two amendments tabled in my name.

Amendment 91 seeks to change

“the definition of request by data subjects to data controllers”

that can be declined or

“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.

I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.

Amendment 97 would ensure that

“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.

If a subject does not even know that their data is being held, they cannot enforce their data rights.

Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.

Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have

“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.

I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.

I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.

Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.

Lord Leong Portrait Lord Leong (Lab)
- Hansard - - - Excerpts

That is a sensible suggestion.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
Debate on Amendment 87 resumed.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, in carrying on on this group, I will speak to the question that Clause 78 stands part, and to Amendments 107, 109, 125, 154, 155 and 156, but to start I support Amendment 87 in the name of the noble and learned Lord, Lord Thomas of Cwmgiedd. We had a masterclass from him last Tuesday and he made an extremely good case for that amendment, which is very elegant.

The previous Government deleted the EU Charter of Fundamental Rights from the statute book through the Retained EU Law (Revocation and Reform) Act 2023, and this Bill does nothing to restore it. Although references in the UK GDPR to fundamental rights and freedoms are now to be read as references to the ECHR as implemented through the Human Rights Act 1998, the Government’s ECHR memorandum states:

“Where processing is conducted by a private body, that processing will not usually engage convention rights”.


As the noble and learned Lord mentioned, this could leave a significant gap in protection for individuals whose data is processed by private organisations and will mean lower data protection rights in the UK compared with the EU, so these Benches strongly support his Amendment 87, which would apply the convention to private bodies where personal data is concerned. I am afraid we do not support Amendments 91 and 97 from the noble Viscount, Lord Camrose, which seem to hanker after the mercifully defunct DPDI.

We strongly support Amendments 139 and 140 from the noble Baroness, Lady Kidron. Data communities are one of the important omissions from the Bill. Where are the provisions that should be there to support data-sharing communities and initiatives such as Solid? We have been talking about data trusts and data communities since as long ago as the Hall-Pesenti review. Indeed, it is interesting that the Minister herself only this April said in Grand Committee:

“This seems to be an area in which the ICO could take a lead in clarifying rights and set standards”.


Indeed, she put forward an amendment:

“Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted. The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue”.—[Official Report, 17/4/24; col. GC 322.]


I very much hope that, now the tables are turned, so to speak, the Minister will take that forward herself in government.

Amendments 154, 155 and 156 deal with the removal of the principle of the supremacy of EU law. They are designed to undo the lowering of the standard of data protection rights in the UK brought about by the REUL Act 2023. The amendments would apply the protections required in Article 23.2 of the UK GDPR to all the relevant exceptions in Schedules 2 to 4 to the Data Protection Act 2018. This is important because data adequacy will be lost if the standard of protection of personal data in the UK is no longer essentially equivalent to that in the EU.

The EU’s adequacy decision stated that it did not apply in the area of immigration and referred to the case of Open Rights Group v the Secretary of State for the Home Department in the Court of Appeal. This case was brought after the UK left the EU, but before the REULA came into effect. The case is an example of how the preservation of the principle of the supremacy of EU law continued to guarantee high data protection standards in the UK, before this principle was deleted from the statute book by the REULA. In broad terms, the Court of Appeal found that the immigration exception in Schedule 2 to the Data Protection Act 2018 conflicted with the safeguards in Article 23 of the UK GDPR. This was because the immigration exemption was drafted too broadly and failed to incorporate the safeguards prescribed for exemptions under Article 23.2 of the UK GDPR. It was therefore held to be unlawful and was disapplied.

The Home Office redrafted the exemption to make it more protective, but it took several attempts to bring forward legislation which provided sufficient safeguards for data subjects. The extent of the safeguards now set out in the immigration exemption underscores both what is needed for compatibility with Article 23.2 of the UK GDPR and the deficiencies in the rest of the Schedule 2 exemptions. It is clear when reading the judgment in the Open Rights case that the majority of the exemptions from data subject rights under Schedule 2 to the Data Protection Act fail to meet the standards set out in Article 23.2 of the UK GDPR. The deletion of the principle of the supremacy of EU law has removed the possibility of another Open Rights-style challenge to the other exemptions in Schedule 2 to the Data Protection Act 2018. I hope that, ahead of the data adequacy discussions with the Commission, the Government’s lawyers have had a good look at the amendments that I have tabled, drafted by a former MoJ lawyer.

The new clause after Clause 107 in Amendment 154 applies new protections to the immigration exemption to the whole of Schedule 2 to the DPA 2018, with the exception of the exemptions that apply in the context of journalism or research, statistics and archiving. Unlike the other exemptions, they already contain detailed safeguards.

Amendment 155 is a new clause extending new protections which apply to the immigration exemption to Schedule 3 to the DPA 2018, and Amendment 156 is another new clause applying new protections which apply to the immigration exemption to Schedule 2 to the DPA 2018.

As regards Amendment 107, the Government need to clarify how data processing under recognised legitimate interests is compatible with conditions for data processing under existing lawful bases, including the special categories of personal data under Articles 5 and 9 of the UK GDPR. The Bill lowers the standard of the protection of personal data where data controllers only have to provide personal data based on

“a reasonable and proportionate search”.

The lack of clarity on what reasonable and proportionate mean in the context of data subject requests creates legal uncertainty for data controllers and organisations, specifically regarding whether the data subject’s consideration on the matter needs to be accounted for when responding to requests. This is a probing amendment which requires the Secretary of State to explain why the existing lawful bases for data processing are inadequate for the processing of personal data when additional recognised legitimate interests are introduced. It requires the Secretary of State to publish guidance within six months of the Act’s passing to clarify what constitutes reasonable and proportionate protections of personal data.

Amendment 109 would insert a new clause, to ensure that data controllers assess the risk of collective and societal harms,

“including to equality and the environment”,

when carrying out data protection impact assessments. It requires them to consult affected people and communities while carrying out these assessments to improve their quality, and requires data controllers to publish their assessments to facilitate informed decision-making by data subjects and to enable data controllers to be held accountable.

Turning to whether Clause 78 should stand part, on top of Clause 77, Clause 78 would reduce the scope of transparency obligations and rights. Many AI systems are designed in a way that makes it difficult to retrieve personal data once ingested, or to understand how this data is being used. This is due not principally to technical limitations but to the decisions of AI developers, who do not prioritise transparency and explainability.

As regards Amendment 125, it is clear that there are still further major changes proposed to the GDPR on police duties, automated decision-making and recognised legitimate interests which continue to make retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering those changes. During the passage of the Data Protection and Digital Information Bill, I tabled an amendment to require the Government to publish an assessment of the impact of the Bill on EU/UK data adequacy within six months of the Act passing; I have tabled a similar amendment, with one change, to this Bill. As the next reassessment of data adequacy is set for June 2025, a six-month timescale may prove inconsequential to the overall adequacy decision. We must therefore recommend stipulating that this assessment takes place before this reassessment.

Baroness Jones of Whitchurch Portrait The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Information and Technology (Baroness Jones of Whitchurch) (Lab)
- Hansard - - - Excerpts

My Lords, I thank all noble Lords for their consideration of these clauses. First, I will address Amendment 87 tabled by the noble and learned Lord, Lord Thomas, and the noble and learned Lord—sorry, the noble Lord—Lord Clement-Jones.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I will take any compliment.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

We should take them while we can. Like the noble Lord, Lord Clement-Jones, I agree that the noble and learned Lord, Lord Thomas, made an excellent contribution. I appreciate this is a particularly technical area of legislation, but I hope I can reassure both noble Lords that the UK’s data protection law gives effect to convention rights and is designed to protect them. The Human Rights Act requires legislation to be interpreted compatibly with convention rights, whether processing is carried out by public or private bodies. ECHR rights are therefore a pervasive aspect of the rules that apply to public and private controllers alike. The noble and learned Lord is right that individuals generally cannot bring claims against private bodies for breaches of convention rights, but I reassure him that they can bring a claim for breaching the data protection laws giving effect to those rights.

I turn to Amendment 91, tabled by the noble Viscount, Lord Camrose, Amendment 107, tabled by the noble Lord, Lord Clement-Jones, and the question of whether Clause 78 should stand part, which all relate to data subject requests. The Government believe that transparency and the right of access is crucial. That is why they will not support a change to the language around the threshold for data subject requests, as this will undermine data subjects’ rights. Neither will the Bill change the current expectations placed on controllers. The Bill reflects the EU principle of proportionality, which has always underpinned this legislation, as well as existing domestic case law and current ICO guidance. I hope that reassures noble Lords.

Amendments 97 and 99, tabled by the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, relate to the notification exemption in Article 14 of the UK GDPR. I reassure noble Lords that the proportionality test provides an important safeguard for the existing exemption when data is collected from sources other than the data subject. The controller must always consider the impact on data subjects’ rights of not notifying. They cannot rely on the disproportionate effort exemption just because of how much data they are processing—even when there are many data subjects involved, such as there would be with web scraping. Moreover, a lawful basis is required to reuse personal data: a web scraper would still need to pass the balancing test to use the legitimate interest ground, as is usually the case.

The ICO’s recent outcomes report, published on 12 December, specifically referenced the process of web scraping. The report outlined:

“Web scraping for generative AI training is a high-risk, invisible processing activity. Where insufficient transparency measures contribute to people being unable to exercise their rights, generative AI developers are likely to struggle to pass the balancing test”.

--- Later in debate ---
Given the above reassurances, I hope noble Lords will agree not to press their amendments in this group.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

The Minister said there is a power to amend, but she has not said whether she thinks that would be desirable. Is the power to be used only if we are found not to be data-adequate because the immigration exemption does not apply across the board? That is, will the power be used only if we are forced to use it?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy so that issue will not arise. I will write to him to set out in more detail when those powers would be used.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, I have co-signed Amendment 137. I do not need to repeat the arguments that have already been made by those who have spoken before me on it; they were well made, as usual. Again, it seems to expose a gap in where the Government are coming from in this area of activity, which should be at the forefront of all that they do but does not appear to be so.

As has just been said, this may be as simple as putting in an initial clause right up at the front of the Bill. Of course, that reminds me of the battle royal we had with the then Online Safety Bill in trying to get up front anything that made more sense of the Bill. It was another beast that was difficult to ingest, let alone understand, when we came to make amendments and bring forward discussions about it.

My frustration is that we are again talking about stuff that should have been well inside the thinking of those responsible for drafting the Bill. I do not understand why a lot of what has been said today has not already appeared in the planning for the Bill, and I do not think we will get very far by sending amendments back and forward that say the same thing again and again: we will only get the response that this is all dealt with and we should not be so trivial about it. Could we please have a meeting where we get around the table and try and hammer out exactly what it is that we see as deficient in the Bill, to set out very clearly for Ministers where we have red lines—that will make it very easy for them to understand whether they are going to meet them or not—and do it quickly?

Lord Clement-Jones (LD)

My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.

I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not bring our attention to it: we are clearly due the consultation at any moment on intellectual property, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.

In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.

We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.

This group, looking forward to further harms that could be caused by AI and to how we can mitigate them in a number of different ways, is extremely important, despite the fact that these amendments appear to deal with quite a disparate group of issues.

Viscount Camrose (Con)

My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.

I have two observations before I start. First, we have to acknowledge that perhaps this area is among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead of something around AI legislation. I have an amendment I am very excited about later when we come particularly to ADM, and there will be others as well, but I absolutely take on board that we need to get going on that.

Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may seek to claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendments 101 and 105 likewise seek to address the potential misuse of Clause 77 by those developers. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provision for the rights and protections of data subjects, and look forward very much to hearing the views of the Minister.

I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.

Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of

“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”

without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.

The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the current interpretation by the tribunal will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library from achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.

I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.

Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite to Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being accessed from or stored on their device. Some of these sums can be extortionate and exorbitant, so I was rather surprised by the noble Viscount’s counter amendment.

Earlier this year, the Information Commissioner launched a call for views which looked to obtain a range of views on its regulatory approach to consent or pay models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.

Cookie paywalls are a scam and reduce people’s power to control their data. I wonder why someone must pay if they do not consent to cookies being stored or accessed. The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111, and is supplemented by new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. The regulation, as substituted by Clause 111 and Schedule 12, does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently are and as they will be if the Bill remains as drafted. Regulation 6 is drafted in terms that do not prevent a person signifying lack of consent to cookies, and a provider may add or set controls—namely, by imposing requirements—for how a person may signify that lack of consent. Cookie paywalls would therefore be completely legal, and they certainly have proliferated online.

This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.

Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations inserted by Schedule 12 if it would support measurement or verification of the performance of advertising services to allow website owners to charge for their advertising services more accurately. The Bill provides practical amendments to the PEC regulations through listing the types of cookies that no longer require consent.

This is important, as not all cookies should be treated the same and not all carry the same high-level risks to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, increasing further legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content, to be able to price advertising space for advertisers. Such metrics are crucial to assess the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—i.e., confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information to invoice an advertiser accurately for the number of ad impressions in a digital ad campaign.

However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?

Coming to Amendment 162 relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate to donors in the same way that businesses have been able to communicate to customers since 2003. The clause will help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter that was sent to Secretary of State Peter Kyle on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:

“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,


based on analysis of 13.1 million donors by the Salocin Group. The letter continues:

“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.


I hope that the Government will listen to the DMA and the charities involved.

Viscount Camrose (Con)

I thank noble Lords for their comments and contributions. I shall jump to Amendments 159 and 159A, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I can have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.

Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.

Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.

Amendment 104 certainly seems a reasonable addition to the list of what might constitute “unreasonable effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.

On Amendment 102, again, when it comes to providing information to them,

“the damage and distress to the data subjects”

is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.

--- Later in debate ---
Amendment 159A from the noble Viscount, Lord Camrose, is aimed at enabling cookie paywalls. As we have identified, conversely, Amendment 159 from the noble Lord, Lord Clement-Jones, seeks to ban their use. Generally, these paywalls work by giving web users the option to pay for a cookie-free browsing experience. Earlier this year the Information Commissioner launched a call for views on “consent or pay” models for cookies. The aim of the Information Commissioner’s call for views is to provide the online advertising industry with clarity on how advertising cookies and paywalls can be used in compliance with data protection and privacy laws. We will consider the Information Commissioner’s findings when he publishes his response to this call for views. It would be premature to make legal changes without considering the findings or consulting interested parties. I hope noble Lords will bear that in mind.
Lord Clement-Jones (LD)

When does the Minister anticipate that the ICO will produce that report?

Baroness Jones of Whitchurch (Lab)

I do not have the detail of all that. Obviously, the call for views has only recently gone out and he will need time for consideration of the responses. I hope the noble Lord will accept that the ICO is on the case on this matter. If we can provide more information, we will.

Lord Clement-Jones (LD)

May I ask the Minister a hypothetical question? If the ICO believes that these are not desirable, what instruments are there for changing the law? Can the ICO, under its own steam, so to speak, ban them; do we need to do it in primary legislation; or can it be done in secondary legislation? If the Minister cannot answer now, perhaps she can write to me.

Baroness Jones of Whitchurch (Lab)

Of course I will write to the noble Lord. It will be within the ICO’s normal powers to make changes where he finds that they are necessary.

I move to Amendment 160, tabled by the noble Lord, Lord Lucas, which seeks to create a new exemption for advertising performance cookies. There is a balance to strike between driving growth in the advertising, news and publishing sectors while ensuring that people retain choice and control over how their data is used. To exempt advertising measurement cookies, we would need to assess how intrusive these cookies are, including what they track and where data is sent. We have taken a delegated power so that exemptions to the prohibition can be added in future once evidence supports it, and we can devise appropriate safeguards to minimise privacy risks. In the meantime, we have been actively engaging with the advertising and publishing sectors on this issue and will continue to work with them to consider the potential use of the regulation-making power. I hope that the noble Lord will accept that this is work in progress.

Amendment 161, also from the noble Lord, Lord Lucas, aims to extend the soft opt-in rule under the privacy and electronic communications regulations to providers of auto-enrolment pension schemes. The soft opt-in rule removes the need for some commercial organisations to seek consent for direct marketing messages where there is an existing relationship between the organisation and the customer, provided the recipient did not object to receiving direct marketing messages when their contact details were collected.

The Government recognise that people auto-enrolled by their employers in workplace pension schemes may not have an existing relationship with their pension provider, so I understand the noble Lord’s motivations for this amendment. However, pension providers have opportunities to ask people to express their direct mail preferences, such as when the customer logs on to their account online. We are taking steps to improve the support available for pension holders through the joint Government and FCA advice guidance boundary review. The FCA will be seeking feedback on any interactions of proposals with direct marketing rules through that consultation process. Again, I hope the noble Lord will accept that this issue is under active consideration.

Amendment 162, tabled by the noble Lord, Lord Clement-Jones, would create an equivalent provision to the soft opt-in but for charities. It would enable a person to send electronic marketing without permission to people who have previously expressed an interest in their charitable objectives. The noble Lord will recall, and has done so, that the DPDI Bill included a provision similar to his amendment. The Government removed it from that Bill due to the concerns that it would increase direct marketing from political parties. I think we all accepted at the time that we did not want that to happen.

As the noble Lord said, his amendment is narrower because it focuses on communications for charitable purposes, but it could still increase the number of messages received by people who have previously expressed an interest in the work of charities. We are listening carefully to arguments for change in this area and will consider the points he raises, but I ask that he withdraws his amendment while we consider its potential impact further. We are happy to have further discussions on that.

--- Later in debate ---
Moved by
108: Clause 79, page 93, line 18, leave out “court” and insert “tribunal”
Member’s explanatory statement
This amendment is consequential on the new Clause (Transfer of jurisdiction of courts to tribunals).
Lord Clement-Jones (LD)

My Lords, in moving Amendment 108, I will also speak to all the other amendments in this group. They are all designed to transfer all existing provisions from the courts to the tribunals and simplify the enforcement of data rights. Is that not something to be desired? This is not just a procedural change but a necessary reform to ensure that the rights granted on paper translate into enforceable rights in reality.

The motivation for these amendments stems from recurring issues highlighted in cases such as Killock and Veale v the Information Commissioner, and Delo v the Information Commissioner. These cases revealed a troubling scenario where the commissioner presented contradictory positions across different levels of the judiciary, exacerbating the confusion and undermining the credibility of the regulatory framework governing data protection. In these cases, the courts have consistently pointed out the confusing division of jurisdiction between different courts and tribunals, which not only complicates the legal process but wastes considerable public resources. As it stands, individuals often face the daunting task of determining the correct legal venue for their claims, a challenge that has proved insurmountable for many, leading to denied justice and unenforced rights.

By transferring all data protection provisions from the courts to more specialised tribunals, which are better equipped to handle such cases, and clarifying the right-to-appeal decisions made by the commissioner, these amendments seek to eliminate unnecessary legal barriers. Many individuals, often representing themselves and lacking legal expertise, face the daunting challenge of navigating complex legal landscapes, deterred by high legal costs and the intricate determination of appropriate venues for their claims. This shift will not only reduce the financial burden on individuals but enhance the efficiency and effectiveness of the judicial process concerning data protection. By simplifying the legal landscape, we can safeguard individual rights more effectively and foster a more trustworthy digital environment.

--- Later in debate ---
The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)

I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.

The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.

As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation for both tribunals and courts according to the nature of the proceedings in question. This is because certain types of cases are appropriate to fall under tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in conformance with their strict procedural and evidential rules. Indeed, under the Killock and Delo examples, it was noted that there could be additional confusion in that ability to go between those two possibilities if you went solely to one of the tribunals.

On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee is comprised of legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.

Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review in a way that would preserve the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well-balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his response to my amendments and welcome him to the Dispatch Box and a whole world of pain on the Data (Use and Access) Bill, as he has, no doubt, noted already after just two hours’ worth of this Committee.

I found his response disappointing, and I think both he and the noble Viscount, Lord Camrose, have misunderstood the nature of this situation. This is not a blend, which is all beautifully logical depending on the nature of the case. This is an absolute mishmash where the ordinary litigant is faced with great confusion, not knowing quite often whether to go to the court or a tribunal, where the judges themselves have criticised the confusion and where there appears to be no appetite, for some reason, in government for a review of the jurisdictions.

I felt that the noble Viscount was probably reading from his previous ministerial brief. Perhaps he looked back at Hansard for what he said on the DPDI Bill. It certainly sounded like that. The idea that the courts are peerless in their legal interpretation and the poor old tribunals really just do not know what they are doing is wrong. They are expert tribunals, you can appear before them in person and there are no fees. It is far easier to access a tribunal than a court. Certainly, as far as appeals are concerned, the idea that the ordinary punter is going to take judicial review proceedings, which seems to be the implication of staying with the current system on appeals if the merits of the ICO’s decisions are to be examined, is quite breathtaking. I know from legal practice that JR is not cheap. Appearing before a tribunal and using that as an appeal mechanism would seem far preferable.

I will keep on pressing this because it seems to me that at the very least the Government need to examine the situation to have a look at what the real objections are to the jurisdictional confusion and the impact on data subjects who wish to challenge decisions. In the meantime, I beg leave to withdraw the amendment.

Amendment 108 withdrawn.
--- Later in debate ---
Moved by
110: Clause 80, page 94, line 24, at end insert—
“3. To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
Member’s explanatory statement
This amendment would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person.
Lord Clement-Jones (LD)

My Lords, I beg to move Amendment 110 and will speak to Amendments 112, 114, 120, 121, 122 and 123, and to Clause 80 stand part. As we have heard, artificial intelligence and algorithmic and automated decision-making tools are increasingly being used across the public sector to make and support many of the highest-impact decisions affecting individuals, families and communities across healthcare, welfare, education, policing, immigration and many other sensitive areas of an individual’s life.

The Committee will be pleased to hear that I will not repeat the contents of my speech on my Private Member’s Bill on this subject last Friday. But the fact remains that the rapid adoption of AI in the public sector presents significant risks and challenges, including: the potential for unfairness, discrimination and misuse, as demonstrated by scandals such as the UK’s Horizon and Australia’s Robodebt cases; automated decisions that are prone to serious error; lack of transparency and accountability in automated decision-making processes; privacy and data protection concerns; algorithmic bias; and the need for human oversight.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

My Lords, we have had a really profound and significant debate on these issues; it has been really helpful that they have been aired by a number of noble Lords in a compelling and articulate way. I thank everybody for their contributions.

I have to say at the outset that the Government want data protection rules fit for the age of emerging technologies. The noble Lord, Lord Holmes, asked whether we are addressing issues of the past or issues of the future. We believe that the balance we have in this Bill is exactly about addressing the issues of the future. Our reforms will reduce barriers to the responsible use of automation while clarifying that organisations must provide stringent safeguards for individuals.

I stress again how seriously we take these issues. A number of examples have been quoted as the debate has gone on. I say to those noble Lords that examples were given where there was no human involved. That is precisely what the new provisions in this Bill attempt to address, in order to make sure that there is meaningful human involvement and people’s futures are not being decided by an automated machine.

Amendment 110 tabled by the noble Lords, Lord Clement-Jones and Lord Knight, seeks to clarify that, for human involvement to be meaningful, it must be carried out by a competent person. Our reforms make clear that solely automated decisions lack meaningful human involvement. That goes beyond a tick-box exercise. The ICO guidance also clarifies that

“the human involvement has to be active and not just a token gesture”;

that right is absolutely underpinned by the wording of the regulations here.

I turn next to Amendment 111. I can assure—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I was listening very carefully. Does “underpinned by the regulations” mean that it will be underpinned?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes. The provisions in this Bill cover exactly that concern.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

The issue of meaningful human involvement is absolutely crucial. Is the Minister saying that regulations issued by the Secretary of State will define “meaningful human involvement”, or is she saying that it is already in the primary legislation, which is not my impression?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Sorry—it is probably my choice of language. I am saying that it is already in the Bill; it is not intended to be separate. I was talking about whether solely automated decisions lack meaningful human involvement. This provision is already set out in the Bill; that is the whole purpose of it.

On Amendment 111, I assure the noble Viscount, Lord Camrose, that controllers using solely automated processing are required to comply with the data protection principles. I know that he was anticipating this answer, but we believe that it captures the principles he proposes and achieves the same intended effect as his amendment. I agree with the noble Viscount that data protection is not the only lens through which AI should be regulated, and that we cannot address all AI risks through the data protection legislation, but the data protection principles are the right ones for solely automated decision-making, given its place in the data protection framework. I hope that that answers his concerns.

On Amendment 112, which seeks to prohibit solely automated decisions that contravene the Equality Act 2010, I assure the noble Lords, Lord Clement-Jones and Lord Knight, that the data protection framework is clear that controllers must adhere to the Equality Act.

Amendments 113 and 114 would extend solely automated decision-making safeguards to predominantly automated decision-making. I assure the noble and learned Lord, Lord Thomas, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that the safeguards in Clause 80 are designed to protect individuals where meaningful human involvement is lacking. Predominantly automated decision-making will already include meaningful human involvement and therefore does not require these additional safeguards.

On Amendments 114A and 115A, tabled by the noble Viscount, Lord Camrose, many noble Lords have spoken in our debates about the importance of future-proofing the legislation. These powers are an example of that: without them, the Government will not have the ability to act quickly to update protections for individuals in the light of rapid technology developments.

I assure noble Lords that the regulation powers are subject to a number of safeguards. The Secretary of State must consult the Information Commissioner and have regard to other relevant factors, which can include the impact on individuals’ rights and freedoms as well as the specific needs and rights of children. As with all regulations, the exercise of these powers must be rational; they cannot be used irrationally or arbitrarily. Furthermore, the regulations will be subject to the affirmative procedure and so must be approved by both Houses of Parliament.

I assure the noble Lord, Lord Clement-Jones, that one of the powers means that his Amendment 123 is not necessary, as it can be used to describe specifically what is or is not meaningful human involvement.

Amendment 115A, tabled by the noble Viscount, Lord Camrose, would remove the reforms to Parts 3 and 4 of the Data Protection Act, thereby putting them out of alignment with the UK GDPR. That would cause confusion and ambiguity for data subjects.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I am sorry to interrupt again as we go along but, a sentence or so ago, the Minister said that the definition in Amendment 123 of meaningful human involvement in automated decision-making was unnecessary. The amendment is designed to change matters. It would not be the Secretary of State who determined the meaning of meaningful human involvement; in essence, it would be initiated by the Information Commissioner, in consultation with the Secretary of State. So I do not quite understand why the Minister used “unnecessary”. It may be an alternative that is undesirable, but I do not understand why she has come to the conclusion that it is unnecessary. I thought it was easier to challenge the points as we go along rather than at the very end.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we would say that a definition in the Bill is not necessary because it is dealt with case by case and is supplemented by these powers. The Secretary of State does not define meaningful human involvement; it is best done case by case, supported by the ICO guidance. I hope that that addresses the noble Lord’s point.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:

“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.

He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Will the Minister reflect on the issues around a case-by-case basis? If I were running an organisation of any sort and decided I wanted to use ADM, how would I make a judgment about what is meaningful human involvement on a case-by-case basis? It implies that I would have to hope that my judgment was okay because I have not had clarity from anywhere else and in retrospect, someone might come after me if I got that judgment wrong. I am not sure that works, so will she reflect on that at some point?

--- Later in debate ---
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am happy to write.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for her very detailed and careful response to all the amendments. Clearly, from the number of speakers in this debate, this is one of the most important areas of the Bill and one that has given one of the greatest degrees of concern, both inside and outside the Committee. I think the general feeling is that there is still concern. The Minister is quite clear that the Government are taking these issues seriously, in terms of ADM itself and the impact in the workplace, but there are missing parts here. If you add all the amendments together—no doubt we will read Hansard and, in a sense, tick off the areas where we have been given an assurance about the interpretation of the Bill—there are still great gaps.

It was very interesting to hear what the noble Lord, Lord Kamall, had to say about how the computer said “no” as he reached the gate. A lot of this is about communications. I would be very interested if any letter to the noble Lord, Lord Lucas, was copied more broadly, because that is clearly one of the key issues. It was reassuring to hear that the ICO will be on top of this in terms of definitions, guidance, audit and so on, and that we are imminently to get the publication of the records of algorithmic systems in use under the terms of the algorithmic transparency recording standard.

We have had some extremely well-made points from the noble Viscounts, Lord Colville and Lord Camrose, the noble Lords, Lord Lucas, Lord Knight and Lord Holmes, and the noble Baroness, Lady Kidron. I am not going to unpack all of them, but we clearly need to take this further and chew it over before we get to Report. I very much hope that the Minister will regard a “will write” letter on stilts as required before we go very much further, because I do not think we will be purely satisfied by this debate.

The one area where I would disagree is on treating solely automated decision-making as the pure subject of the Clause 80 rights. Looking at it in the converse, it is perfectly proper to regard something that does not have meaningful human involvement as predominantly automated decision-making. I do not think, in the words of the noble Viscount, Lord Camrose, that this does muddy the waters. We need to be clearer about what we regard as being automated decision-making for the purpose of this clause.

There is still quite a lot of work to do in chewing over the Minister’s words. In the meantime, I beg leave to withdraw my amendment.

Amendment 110 withdrawn.
--- Later in debate ---
This seeks to retain the requirement for police forces to record the reason they are accessing data from a police database.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, a key aspect of data protection rests in how it restricts the use of personal data once it has been collected. The public need confidence that their data will be used for the reasons for which they shared it and not further used in ways that breach their legitimate expectations, or they will become suspicious about providing their data. The underlying theme that we heard on the previous group was the danger of losing public trust, which very much applies in the area of law enforcement and national security.

However, Schedules 4 and 5 would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact that this would have on their rights, for the purposes of national security, crime detection and prevention, safeguarding or answering to a request by a public authority. Data used for the purposes listed in these schedules would not need to undergo either a balancing test under Article 6.1(f) or a compatibility test under Article 6.4 of the UK GDPR. The combined effect of these provisions would be to authorise almost unconditional data sharing for law enforcement and other public security purposes while, at the same time, reducing accountability and traceability over how the police use the information being shared with them.

Clauses 87 to 89 of this Bill grant the Home Secretary and police powers to view and use people’s personal data through the use of national security certificates and designation notices, which are substantially the same as Clauses 28 to 30 of the previous DPDI Bill. This risks further eroding trust in law enforcement authorities. Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure that individuals’ rights and expectations are not disproportionately impacted. It is baffling why the Government are so slavishly following their predecessor and believe that these new and unaccountable powers are necessary.

By opposing that Clause 81 stand part, I seek to retain the requirement for police forces to record the reason they are accessing data from a police database. The public need more, not less, transparency and accountability over how, why and when police staff and officers access and use records about them. Just recently, the Met Police admitted that they investigated more than 100 staff over the inappropriate accessing of information in relation to Sarah Everard. This shows that the police can and do act to access information inappropriately, and there may well be less prominent cases where police abuse their power by accessing information without worry for the consequences.

Regarding Amendments 126, 128 and 129, Rights and Security International has repeatedly argued that the Bill would violate the UK’s obligations under the European Convention on Human Rights. On Amendment 126, the requirements in the EU law enforcement directive for logging are, principally, to capture in all cases the justification for personal data being examined, copied, amended or disclosed when it is processed for a law enforcement process—the objective is clearly to ensure that data is processed only for a legitimate purpose—and, secondarily, to identify when, how and by whom the data has been accessed or disclosed. This ensures that individual accountability is captured and recorded.

Law enforcement systems in use in the UK typically capture some of the latter information in logs, but very rarely do they capture the former. Nor, I am informed, do many commodity IT solutions on the market capture why data was accessed or amended by default. For this reason, a long period of time was allowed under the law enforcement directive to modify legacy systems installed before May 2016, which, in the UK, included services such as the police national computer and the police national database, along with many others at a force level. This transitional relief extended to 6 May 2023, but UK law enforcement did not, in general, make the required changes. Nor, it seems, did it ensure that all IT systems procured after 6 May 2016 included a strict requirement for LED-aligned logging. By adopting and using commodity and hyperscaler cloud services, it has exacerbated this problem.

In early April 2023, the Data Protection Act 2018 (Transitional Provision) Regulations 2023 were laid before Parliament. These regulations had the effect of unilaterally extending the transitional relief period under the law enforcement directive for the UK from May 2023 to May 2026. The Government now wish to strike the requirement to capture the justification for any access to data completely, on the basis that this would free up to 1.5 million hours a year of valuable police time for our officers so that they can focus on tackling crime on our streets, rather than being bogged down by administration, and that this would save approximately £42.8 million per year in taxpayers’ money.

This is a serious legislative issue on two counts: it removes important evidence that may identify whether a person was acting with malicious intent when accessing data, as well as removing any deterrent effect of them having to do so; and it directly deviates from a core part of the law enforcement directive and will clearly have an impact on UK data adequacy. The application of effective control over access to data is very much a live issue in policing, and changing the logging requirement in this way does nothing to improve police data management. Rather, it excuses and perpetuates bad practice. Nor does it increase public confidence.

Clause 87(7) introduces new Section 78A into the Act. This lays down a number of exemptions and exclusions from Part 3 of that Act when the processing is deemed to be in the interests of national security. These exemptions are wide ranging, and include the ability to suspend or ignore principles 2 through 6 in Part 3, and thus run directly contrary to the provisions and expectations of the EU law enforcement directive. Ignoring those principles in itself also negates many of the controls and clauses in Part 3 in its entirety. As a result, they will almost certainly result in the immediate loss of EU law-enforcement adequacy.

I welcome the ministerial letter from the noble Lord, Lord Hanson of Flint, to the noble Lord, Lord Anderson, of 6 November, but was he really saying that all the national security exemption clause does is bring the 2018 Act into conformity with the GDPR? I very much hope that the Minister will set out for the record whether that is really the case and whether it is really necessary to safeguard national security. Although it is, of course, appropriate and necessary for the UK to protect its national security interests, it is imperative that balance remains to protect the rights of a data subject. These proposals do not, as far as we can see, strike that balance.

Clause 88 introduces the ability of law enforcement, competent authorities and intelligence agencies to act as joint controllers in some circumstances. If Clause 88 and associated clauses go forward to become law, they will almost certainly again result in withdrawal of UK law enforcement adequacy and will quite likely impact on the TCA itself.

Amendment 127 is designed to bring attention to the fact that there are systemic issues with UK law enforcement’s new use of hyperscaler cloud service providers to process personal data. These issues stem from the fact that service providers’ standard contracts and terms of service fail to meet the requirements of Part 3 of the UK’s Data Protection Act 2018 and the EU law enforcement directive. UK law enforcement agencies are subject to stringent data protection laws, including Part 3 of the DPA and the GDPR. These laws dictate how personal data, including that of victims, witnesses, suspects and offenders, can be processed. Part 3 specifically addresses data transfers to third countries, with a presumption against such transfers unless strictly necessary. This contrasts with UK GDPR, which allows routine overseas data transfer with appropriate safeguards.

Cloud service providers routinely process data outside the UK and lack the necessary contractual guarantees and legal undertakings required by Part 3 of the DPA. As a result, their use for law enforcement data processing is, on the face of it, not lawful. This non-compliance creates significant financial exposure for the UK, including potential compensation claims from data subjects for distress or loss. The sheer volume of data processed by law enforcement, particularly body-worn video footage, exacerbates the financial risk. If only a small percentage of cases result in claims, the compensation burden could reach hundreds of millions of pounds annually. The Government’s attempts to change the law highlight the issue and suggest that past processing on cloud service providers has not been in conformity with the UK GDPR and the DPA.

The current effect of Section 73(4)(b) of the Data Protection Act is to prevent competent authorities, which may have a legitimate operating need and should possess the internal capability to assess that need, from making transfers to recipients that are not relevant authorities or international organisations, such as cloud service providers. This amendment is designed to probe what impact removal of this restriction would have and whether it would enable such transfers where they are justified and necessary. I beg to move.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Non-Afl)
- Hansard - - - Excerpts

My Lords, I will speak to Amendment 124. I am sorry that I was not able to speak on this issue at Second Reading. I am grateful to the noble and learned Lord, Lord Thomas of Cwmgiedd, for his support, and I am sorry that he has not been able to stay, due to a prior engagement.

Eagle-eyed Ministers and the Opposition Front Bench will recognise that this was originally tabled as an amendment to the Data Protection and Digital Information (No. 2) Bill. It is still supported by the Police Federation. I am grateful to the former Member of Parliament for Loughborough for originally raising this with me, and I thank the Police Federation for its assistance in briefing us in preparing this draft clause. The Police Federation understands that the Home Secretary is supportive of the objective of this amendment, so I shall listen with great interest to what the Minister has to say.

This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise, in relation to a situation where the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. Given that this issue was talked about in the prior Bill, I do not intend to go into huge amounts of detail because we rehearsed the arguments there, but I hope very much that with the new Government there might be a willingness to entertain this as a change in the law.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.

Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.

As for Amendment 124, which allows for greater collaboration between the police and the CPS when deciding charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.

Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.

I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.

Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.

Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.

I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.

Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement that include the requirement to record time, date, and where possible, who has accessed the data, which are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

There were some raised eyebrows.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes, we could not see the noble Lord’s raised eyebrows.

Turning to Amendment 124, I thank the noble Baroness, Lady Morgan, for raising this important issue. While I obviously understand and welcome the intent, I do not think that the legislative change is what is required here. The Information Commissioner’s Office agrees that the Data Protection Act is not a barrier to the sharing of personal data between the police and the CPS. What is needed is a change in the operational processes in place between the police and the CPS that are causing this redaction burden that the noble Baroness spelled out so coherently.

We are very much aware that this is an issue and, as I think the noble Baroness knows, the Government are committed to reducing the burden on the police and the Home Office and to exploring with partners across the criminal justice system how this can best be achieved. We absolutely understand the point that the noble Baroness has raised, but I hope that she could agree to give space to the Home Office and the CPS to try to find a resolution so that we do not have the unnecessary burden of redaction when it is not necessary. It is an ongoing discussion—which I know the noble Baroness knows really—and I hope that she will not pursue it on that basis.

I will address Amendments 126 to 129 together. These amendments seek to remove parts of Schedule 8 to avoid divergence from EU legislation. The noble Lord, Lord Clement-Jones, proposes instead to remove existing parts of Section 73 of the Data Protection Act 2018. New Section 73(4)(aa), introduced by this Bill, with its bespoke path for personal data transfers from UK controllers to international processors, is crucial. In the modern age, where the use of such capabilities and the benefits they provide is increasing, we need to ensure that law enforcement can make effective use of them to tackle crime and keep citizens safe.

--- Later in debate ---
Considering all the explanations I have given, I hope that noble Lords will withdraw or not press their amendments.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for her response on this group, which was, again, very detailed. There is a lot to consider in what she had to say, particularly about the clauses beyond Clause 81. I am rather surprised that the current Government are still going down the same track on Clause 81. It is as if, because the risk of abuse is so high, this Government, like the previous one, have decided that it is not necessary to have the safeguard of putting down the justification in the first place. Yet we have heard about the Sarah Everard police officers. It seems to me perverse not to require justification. I will read further what the Minister had to say but it seems quite extraordinary to be taking away a safeguard at this time, especially when the Minister says that, at the same time, they need to produce logs of the time of the data being shared and so on. I cannot see what is to be gained—I certainly cannot see £42 million being saved. It is a very precise figure: £42.8 million. I wonder where the £800,000 comes from. It seems almost too precise to be credible.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I emphasise that we believe the safeguards are there. This is not a watering down of provisions. We are just making sure that the safeguards are more appropriate for the sort of abuse that we think might happen in future from police misusing their records. I do not want it left on the record that we do not think that is important.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

No. As I was saying, it seems that the Minister is saying that there will still be the necessity to log the fact that data has been shared. However, it seems extraordinary that, at the same time, it is not possible to say what the justification is. The justification could be all kinds of things, but it makes somebody think before they simply share the data. It seems to me that, given the clear evidence of abuse of data by police officers—data of the deceased, for heaven’s sake—we need to keep all the safeguards we currently have. That is a clear bone of contention.

I will read what else the Minister had to say about the other clauses in the group, which are rather more sensitive from the point of view of national security, data sharing abroad and so on.

Clause 81 agreed.
--- Later in debate ---
Moved by
134: Clause 90, page 113, leave out lines 1 to 5 and insert—
“(a) to monitor the application of GDPR, the applied GDPR and this Act, and ensure they are fully enforced with all due diligence;
(b) to act upon receiving a complaint, to investigate, to the extent appropriate, the subject matter of the complaint, and to take steps to clarify unsubstantiated issues before dismissing the complaint.”
Member’s explanatory statement
This amendment removes the secondary objectives introduced by the Data Use and Access Bill, which frame innovation, competition, crime prevention and national security as competing objectives against the enforcement of data protection law.
Lord Clement-Jones (LD)

My Lords, in moving Amendment 134—it is the lead amendment in this group—I shall speak to the others in my name and my Clause 92 stand part notice. Many of the amendments in this group stem from concerns that the new structure for the ICO will diminish its independence. The ICO is abolished in favour of the commission.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

My Lords, I thank noble Lords for their consideration of the issues before us in this group. I begin with Amendment 134 from the noble Lord, Lord Clement-Jones. I can confirm that the primary duty of the commissioner will be to uphold the principal objective: securing an appropriate level of data protection, carrying out the crucial balancing test between the interests of data subjects, controllers and wider public interests, and promoting public trust and confidence in the use of personal data.

The other duties sit below this objective and do not compete with it—they do not come at the expense of upholding data protection standards. The commissioner will have to consider these duties in his work but will have discretion as to their application. Moreover, the new objectives inserted by the amendment concerning monitoring, enforcement and complaints are already covered by legislation.

I thank the noble Lord, Lord Lucas, for Amendment 135A. The amendment was a feature of the previous DPDI Bill, but the Government have decided that a statement of strategic priorities for the ICO is not necessary in this Bill. The Government will of course continue to set out their priorities in relation to data protection and other related areas and discuss them with the Information Commissioner as appropriate.

Amendment 142 from the noble Viscount, Lord Camrose, would remove the ICO’s ability to serve notices by email. We would argue that email is a fast, accessible and inexpensive method for issuing notices. I can reassure noble Lords that the ICO can serve a notice via email only if it is sent to an email address published by the recipient or where the ICO has reasonable grounds to believe that the notice will come to the attention of the person, significantly reducing the risk that emails may be missed or sent to the wrong address.

Regarding the noble Viscount’s Amendment 143, the assumption that an email notice will be received in 48 hours is reasonable and equivalent to the respective legislation of other regulators, such as the CMA and Ofcom.

I thank the noble Lord, Lord Clement-Jones, for Amendment 144 concerning the ICO’s use of reprimands. The regulator does not commonly issue multiple reprimands to the same organisation. But it is important that the ICO, as an independent regulator, has the discretion and flexibility in instances where there may be a legitimate need to issue multiple reprimands within a particular period without placing arbitrary limits on that.

Turning to Amendment 144A, the new requirements in Clause 101 will already lead to the publication of an annual report, which will include the regulator’s investigation and enforcement activity. Reporting will be categorised to ensure that where the detail of cases is not public, commercially sensitive investigations are not inadvertently shared. Splitting out reporting by country or locality would make it more difficult to protect sensitive data.

Turning to Amendment 145, with thanks to the noble Baroness, Lady Kidron, I agree with the importance of ensuring that the regulator can be held to account on this issue effectively. The new annual report in Clause 101 will cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. Clause 90 also requires the ICO to publish a strategy and report on how it has complied with its new statutory duties. Both of these will cover the new duty relating to children’s awareness and rights, and this should include the ICO’s activity to support and uphold its important age-appropriate design code.

I thank the noble Lord, Lord Clement-Jones, for Amendments 163 to 192 to Schedule 14, which establishes the governance structure of the information commission. The approach, including the responsibilities conferred on the Secretary of State, at the core of the amendments follows standard corporate governance best practice and reflects the Government’s commitment to safeguarding the independence of the regulator. This includes requiring the Secretary of State to consult the chair of the information commission before making appointments of non-executive members.

Amendments 165 and 167A would require members of the commission to be appointed to oversee specific tasks and to be from prescribed fields of expertise. Due to the commission’s broad regulatory remit, the Government consider that it would not be appropriate or helpful for the legislation to set out specific areas that should receive prominence over others. The Government are confident that the Bill will ensure that the commission has the right expertise on its board. Our approach safeguards the integrity and independence of the regulator, draws clearly on established precedent and provides appropriate oversight of its activities.

Finally, Clauses 91 and 92 were designed to ensure that the ICO’s statutory codes are consistent in their development, informed by relevant expertise and take account of their impact on those likely to be affected by them. They also ensure that codes required by the Secretary of State have the same legal effect as pre-existing codes published under the Data Protection Act.

Considering the explanations I have offered, I hope that the noble Lords, Lord Clement-Jones and Lord Lucas, the noble Viscount, Lord Camrose, and the noble Baroness, Lady Kidron, will agree not to press their amendments.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for that response. If I speak for four minutes, that will just about fill the gap, but I hope to speak for less than that.

The Minister’s response was very helpful, particularly the way in which she put the clarification of objectives. Of course, this is shared with other regulators, where this new growth duty needs to be set in the context of the key priorities of the regulator. My earlier amendment reflected a nervousness about adding innovation and growth duties to a regulator, which may be seen to unbalance the key objectives of the regulator in the first place, but I will read carefully what the Minister said. I welcome the fact that, unlike in the DPDI Bill, there is no requirement for a statement of strategic priorities. That is why I did not support Amendment 135A.

It is somewhat ironic that, in discussing a digital Bill, the noble Viscount, Lord Camrose, decided to go completely analogue, but that is life. Maybe that is what happens to you after four and a half hours of the Committee.

I do not think the Minister covered the ground on the reprimands front. I will read carefully what she said about the annual report and the need for the ICO—or the commission, as it will be—to report on its actions. I hope, just by putting down these kinds of amendments on reprimands, that the ICO will take notice. I have been in correspondence with the ICO myself, as have a number of organisations. There is some dissatisfaction, particularly with companies such as Clearview, where it is felt that the ICO has not taken adequate action on scraping and building databases from the internet. We will see whether the ICO becomes more proactive in that respect. I was reassured, however, by what the Minister said about NED qualifications and the general objective on the independence of the regulator.

There is much to chew on in what the Minister said. In the meantime, I beg leave to withdraw my amendment.

Amendment 134 withdrawn.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
Moved by
138: After Clause 92, insert the following new Clause—
“Code on processing personal data in education where it concerns a child or pupil
(1) The Information Commissioner must consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the United Kingdom, within the meaning of the Education Act 1996, the Education (Scotland) Act 1996, and the Education and Libraries (Northern Ireland) Order 1986; and on standards on the rights of those children as data subjects which are appropriate to children’s capacity and stage of education.
(2) For the purposes of subsection (1), the rights of data subjects must include—
(a) measures related to responsibilities of the controller, data protection by design and by default, and security of processing,
(b) safeguards and suitable measures with regard to automated decision-making, including profiling and restrictions,
(c) the rights of data subjects including to object to or restrict the processing of their personal data collected during their education, including any exemptions for research purposes, and
(d) matters related to the understanding and exercising of rights relating to personal data and the provision of education services.”
Member’s explanatory statement
This amendment requires the Commission to consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the UK.
Lord Clement-Jones (LD)

My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is far preferable in many ways to mine.

Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children up to the age of 18 for the purposes of UK law and Convention 108, and pupils as defined by the Education Act 1996, who may be up to the age of 19 or, with special educational needs, up to 25 in the education sector. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:

“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.


The duties of settings and data processors, and the rights appropriate to the stage of education and to children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,

“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.

The educational setting differs from a purely commercial interaction, and not only because the data subjects are children. It is more complex because of the disempowered environment and the imbalance of power between the authority, the parents and the child. An additional factor is that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:

“Parents have a prior right to choose the kind of education that shall be given to their children.”


A code is needed because explicit safeguards that the GDPR requires in several places are missing, having been left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the necessary safeguards of GDPR Article 23(1) for children. Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. Clauses on additional further processing or changes to purpose limitation are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of identifying personal data collection in educational settings only increase, while the protections are only ever reduced.

Obligations specific to children’s data, especially

“solely automated decision-making and profiling”

and exceptions, need to be consistent with clear safeguards by design where they restrict fundamental freedoms. What does that mean for children in practice, where teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, and health and safety standards, among others proportionate to the risks of data processing, together with respect for the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended in the 5Rights Foundation’s Digital Futures Commission blueprint for educational data governance.

The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that

“children have the right to be heard and participate in decisions affecting them”.

They recognise that

“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”

Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.

Young people themselves were not invited to participate in the development of this Bill, and their views have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work with the DfE in 2024. The findings back those of Defend Digital Me’s Survation poll in 2018 and show that parents do not know that the DfE already holds named pupil records without their knowledge or permission, and that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. The report stated:

“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”


A code of practice is needed to explain the law and make it work as intended for everyone. Adherence to a code for educational settings would create a mechanism for controllers and processors to demonstrate compliance with the legislation or with approved certification methods. It would give providers confidence in consistent and clear standards and would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice so that their rights are freely met by design and default.

Further, schools give children’s personal data to many commercial companies during a child’s education—not based on consent but assumed for the performance of a task carried out in the public interest. A code should clarify any boundaries of this lawful basis for commercial purposes, where it is an obligation on parents to provide the data and what this means for the child on reaching maturity or after leaving the educational setting.

Again, a code should help companies understand “data protection by design and default” in practice, what amounts to a “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding the effect of the responsibilities of controllers and processors on the exercise or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that serve adverts in use.

I hope that I have explained exactly why we believe that a code of conduct is required in educational settings. I beg to move.

Baroness Kidron (CB)

My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.

Both these amendments propose a code of practice to address the use of children’s data in the context of education, and they have much in common. Having heard the noble Lord, Lord Clement-Jones, I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.

Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.

Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.

Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to protect—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.

Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.

A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, Amendment 138 tabled by the noble Lord, Lord Clement-Jones, and Amendment 141, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, would both require the ICO to publish a code of practice for controllers and processors on the processing of personal data by educational technologies in schools.

I say at the outset that I welcome this debate and the contributions of noble Lords on this important issue. As various noble Lords have indicated, civil society organisations have also been contacting the Department for Science, Innovation and Technology and the Department for Education directly to highlight their concerns about this issue. It is a live issue.

I am grateful to my noble friend Lord Knight, who talked about some of the important and valuable contributions that technology can play in supporting children’s development and guiding teaching interventions. We have to get the balance right, but we understand and appreciate that schoolchildren, parents and schoolteachers must have the confidence to trust the way that services use children’s personal data. That is at the heart of this debate.

There is a lot of work going on in this area, some of which noble Lords have referred to. The Department for Education is already exploring ways to engage with the edtech market to reinforce the importance of evidence-based, quality products and services in education. On my noble friend Lord Knight’s comments on AI, the Department for Education is developing a framework outlining safety expectations for AI products in education and creating resources for teachers and leaders on safe AI use.

I recognise why noble Lords consider that a dedicated ICO code of practice could help ensure that schools and edtech services are complying with data protection legislation. The Government are open-minded about exploring the merits of this further with the ICO, but it would be premature to include these requirements in the Bill. As I said, there is a great deal of work going on and the findings of the recent ICO audits of edtech service providers will help to inform whether a code of practice is necessary and what services should be in scope.

I hope that we will bear that in mind and engage on it. I would be happy to continue discussions with noble Lords, the ICO and colleagues at the Department for Education, outside of the Bill’s processes, about the possibility of future work on this, particularly as the Secretary of State has powers under the Data Protection Act 2018 to require the ICO to produce new statutory codes, as noble Lords know. Considering the explanation that I have given, I hope that the noble Lord, Lord Clement-Jones, will consider withdrawing his amendment at this stage.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for her response and all speakers in this debate. On the speech from the noble Lord, Lord Knight, I entirely agree with the Minister and the noble Viscount, Lord Camrose, that it is important to remind ourselves about the benefits that can be achieved by AI in schools. The noble Lord set out a number of those. The noble Lord, Lord Russell, also reminded us that this is not a purely domestic issue; it is international across the board.

However, all noble Lords reminded us of the disbenefits and risks. In fact, the noble Lord, Lord Knight, used the word “dystopian”, which was quite interesting, although he gets very close to science fiction sometimes. He said that

“we have good reason to be concerned”,

particularly because of issues such as the national pupil database, where the original purpose may not have been fulfilled and was, in many ways, changed. He gave an example of procurement during Covid, where the choice was either Google or Microsoft—Coke or Pepsi. That is an issue across the board in competition law, as well.

There are real issues here. The noble Lord, Lord Russell, put it very well when he said that there is any number of pieces of guidance for schools but it is important to have a code of conduct. We are all, I think, on the same page in trying to find—in the words of the noble Baroness, Lady Kidron—a fairer and more equitable set of arrangements for children in schools. We need to navigate our way through this issue; of course, organisations such as Defend Digital Me and 5rights are seriously working on it.

--- Later in debate ---
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.

We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.

The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the Act—the statute when passed—that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in those intervening years; it needs, desperately and urgently, to be updated because it is currently putting every citizen in this nation at risk for want of being amended. This is the purpose of Amendments 156A and 156B.

The Computer Misuse Act 1990 is not only out of date but inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They oftentimes work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.

Let us take just two examples: vulnerability research, and threat intelligence assessment and analysis. Both could find a cybersecurity professional falling foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and its ability, and the ability of the cybersecurity professional community, to meet those threats.

These amendments, in essence, perform one simple but critical task: to afford a legal defence for legitimate cybersecurity activities. That is all, but it would have such a profound impact for those whom we have asked to keep us safe and for the safety they can thus deliver to every citizen in our society.

Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?

It is not time; it is well over time that these amendments become part of our law. If not now, when? If not these amendments, which amendments? If they do not accept these amendments, what will the Government say to all those people who will continue to be put in harm’s way for want of these protective provisions being passed? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.

Lord Clement-Jones (LD)

My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.

As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.

I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,

“a catalytic effect on innovation”

within the UK’s cybersecurity sector, which possesses “considerable growth potential”.

--- Later in debate ---
Lord Arbuthnot of Edrom (Con)

My Lords, I rise briefly but strongly to support my noble friend Lord Holmes. The CyberUp campaign has been banging this drum for a long time now. I remember taking part in the debates in another place on the Computer Misuse Act 34 years ago. It was the time of dial-up modems, fax machines and bulletin boards. This is the time to act, and it is the opportunity to do so.

Lord Clement-Jones (LD)

My Lords, we ought to be mindful of that and congratulate the noble Lord on having been named parliamentarian of the year as a result of his campaigning activities.

Lord Arbuthnot of Edrom (Con)

My Lords, it has taken 34 years.

--- Later in debate ---
Lord Holmes of Richmond (Con)

Could the Minister say a few words on some of those points of discourse and non-consensus, to give the Committee some flavour of the type of issues where there is no consensus as well as the extent of the gap between some of those perspectives?

Lord Clement-Jones (LD)

Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful in unpacking what were clearly extremely well-informed recommendations, which should, no doubt, be taken extremely seriously.

--- Later in debate ---
Lord Vallance of Balham (Lab)

Yes, the Government accepted the recommendations in full.

Lord Clement-Jones (LD)

Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.

Lord Holmes of Richmond (Con)

My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as CSA. It is not often in life that in a short space of time one is afforded the opportunity in government of bringing much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.

There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were to act only in pursuit of their much-spoken-of growth agenda, this would bring an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see whether anything further can be done but, from my perspective and that of others, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.

--- Later in debate ---
The Earl of Erroll (CB)

My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.

I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. It was never done properly, in my opinion, in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out the will of Parliament as expressed in the legislation it has passed. We have a problem, but I think that we can address it by putting “must” in the Bill. Then, we can hold the Executive to account.

Lord Clement-Jones (LD)

My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.

Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the breadth of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.

The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.

My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to shield themselves simply by describing themselves as “independent researchers”. Could Tommy Robinson, for instance, seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.

Viscount Camrose (Con)

My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.

Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would ensure that researcher access is enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.

I am still left with some curiosity on some of these amendments, so I will indicate where I have specific questions to those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and in what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.

I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.

I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?

Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.

I hope that the Minister has similar agility in being able to readjust the Government’s position on this. It is right that this was an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should accept and grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.

Lord Clement-Jones (LD)

My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it specifically would become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.

This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.

The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government for legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and enhance preventive measures against its creation and distribution. It specifically recommended:

“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.

The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.

Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.

We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.

This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.

I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.

Viscount Camrose (Con)

My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.

Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.

Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.

That reservation comes in proposed new subsection 1(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.

Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.

--- Later in debate ---
Lord Lucas (Con)

My Lords, I very much support these amendments. I declare an interest as an owner of written copyright in the Good Schools Guide and as a father of an illustrator. In both contexts, it is very important that we get intellectual property right, as I think the Government recognised in what they put out yesterday. However, I share the scepticism of those who have spoken as to whether the Government’s ideas can be made to work.

It is really important that we get this straight. For those of us operating at the small end of the scale, IP is under continual threat from established media. I write maybe 10 or a dozen letters a year to large media outfits reminding them of the borders, the latest to the Catholic Herald—it appears not even the 10 commandments have force on them. But what AI can do is a huge measure more difficult to deal with. I can absolutely see, by talking to Copilot, that it has gone through my paywall and absorbed the contents of the Good Schools Guide, but who am I supposed to go at for this? Who has actually done the trespassing? Who is responsible for it? Where is the ownership? It is difficult to enforce copyright, even by writing a polite letter to someone saying, “Please don’t do this”. The Government appear to propose a system of polite letters saying, “Oh dear, it looks as if you might have borrowed my copyright. Please, can you give it back?”

This is not practically enforceable, and it will not result in people who care about IP locating their businesses here. Quite clearly, we do not have ownership of the big AI systems, and it is unlikely that we will have ownership of them—all that will be overseas. What we can do is create IP. If we produce a system where we do not defend the IP that we produce, then fairly rapidly, those IP creators who are capable of being mobile will go elsewhere to places that will defend their IP. It is something that a Government who are interested in growth really ought to be interested in defending. I hope that we will see some real progress in the course of the Bill going through the House.

Lord Clement-Jones (LD)

My Lords, I declare my AI interests as set out in the register. I will speak in support of Amendments 204, 205 and 206, which have been spoken to so inspiringly by the noble Baroness, Lady Kidron, and so well by the noble Lords, Lord Freyberg, Lord Lucas and Lord Hampton, the noble Earl, Lord Clancarty, and the noble Viscount, Lord Colville. Each demonstrated different facets of the issue.

I co-chair the All-Party Group on AI and chaired the AI Select Committee a few years ago. I wrote a book earlier this year on AI regulation, which received a namecheck from the noble Baroness, Lady Jones, at Question Time, for which I was very grateful. Before that, I had a career as an IP lawyer, defending copyright and creativity, and in this House I have been my party’s creative industries spokesperson. The question of IP and the training of generative AI models is a key issue for me.

This is the case not just in the UK but around the world. Getty and the New York Times are suing in the United States, as are many writers, artists and musicians. It was at the root of the Hollywood actors’ and writers’ strikes last year. It is one thing to use the tech—many of us are AI enthusiasts—but it is another to be at the mercy of it.

Close to home, the FT has pointed out, using the index published online by the creator of an unlicensed dataset called Books3, that over 85 books written by 33 Members of the House of Lords have been pirated to train AI models by household names such as Meta, Microsoft and Bloomberg. Although it is absolutely clear that the use of copyrighted works to train AI models is contrary to UK copyright law, the laws around the transparency of these activities have not caught up. As we have heard, as well as using pirated e-books in their training data, AI developers scrape the internet for valuable professional journalism and other media, in breach of both the terms of service of websites and copyright law, to train commercial AI models. At present, developers can do this without declaring their identity, or they may take IP that was scraped so that it could appear in a search index and use it for the completely different commercial purpose of training AI models.

How can rights owners opt out of something that they do not know about? AI developers will often scrape websites or access other pirated material before they launch an LLM in public. This means that there is no way for IP owners to opt out of their material being taken before its inclusion in these models. Once used to train these models, the commercial value, as we have heard, has already been extracted from IP scraped without permission, with no way to delete data from these models.

The next wave of AI models responds to user queries by browsing the web to extract valuable news and information from professional news websites. This is known as retrieval-augmented generation—RAG. Without payment for extracting this commercial value, AI agents built by companies such as Perplexity, Google and Meta will, in effect, free-ride on the professional hard work of journalists, authors and creators. At present, such crawlers are hard to block. There is no market failure; there are well-established licensing solutions. There is no uncertainty around the existing law; the UK is absolutely clear that commercial organisations, including gen AI developers, must license the data that they use to train their large language models.

Here, as the Government’s intentions become clearer, the political, business and creative temperature is rising. Just this week, we have seen the creation of a new campaign, the Creative Rights in AI Coalition—CRAIC—across the creative and news industries and, recently, a statement organised by Ed Newton-Rex gathered more than 30,000 signatories from among creators and creative organisations.

--- Later in debate ---
Lord Faulks (Non-Afl)

The noble Lord has enormous experience in these areas and will be particularly aware of the legal difficulties in enforcing rights. Given what he said, with which I entirely agree—indeed, I agree with all the speakers in supporting these amendments—and given the extraordinary expense of litigating to enforce rights, how does he envisage there being an adequate system to allow those who have had their data scraped in the way that he describes to obtain redress or, rather, suitable remedies?

Lord Clement-Jones (LD)

I thank the noble Lord for that. He is anticipating a paragraph in my notes, which says that, although it is not set out in the amendments, robust enforcement of these provisions will be critical to their success. This includes oversight from an expert regulator that is empowered to issue significant penalties, including fines for non-compliance. There is a little extra work to do there, and I would very much like to see the Intellectual Property Office gain some teeth.

I am going to close. We are nearly at the witching hour, but it is clear that AI developers are seeking to use their lobbying clout—the noble Baroness, Lady Kidron, mentioned the Kool-Aid—to persuade the Government that new copyright law is required. Instead, this amendment would clarify that UK copyright law applies to gen AI developers. The creative industries, and noble Lords from across the House as their supporters, will rally around these amendments and vigorously oppose government plans for a new text and data-mining exception.

--- Later in debate ---
The Earl of Erroll (CB)

My Lords, if I may just interject, I have seen this happen not just in the Horizon scandal. Several years ago, the banks were saying that you could not possibly find out someone’s PIN and were therefore refusing to refund people who had had stuff stolen from them. It was not until the late Professor Ross Anderson, of the computer science department at Cambridge University, proved that the banks had been deliberately misidentifying to the courts which counters should have been examined, and explained exactly how the system could be made to default back to a different set of counters, that they eventually had to give way. But they went on lying to the courts for a long time. I am afraid that this is something that keeps happening again and again, and an amendment like this is essential for future justice for innocent people.

Lord Clement-Jones (LD)

My Lords, it is a pity that this debate is taking place so late. I thank the noble Lord, Lord Arbuthnot, for his kind remarks, but my work ethic feels under considerable pressure at this time of night.

All I will say is that this is a much better amendment than the one that the noble Baroness, Lady Kidron, put forward for the Data Protection and Digital Information Bill, and I very strongly support it. Not only is this horrifying in the context of the past Horizon cases, but I read a report about the Capture software, which is likely to have created shortfalls that led to sub-postmasters being prosecuted as well. This is an ongoing issue. The Criminal Cases Review Commission is reviewing five Post Office convictions in which the Capture IT system could be a factor, so we cannot say that this is just about Horizon, as there are the many other cases that the noble Baroness cited.

We need to change this common law presumption even more in the face of a world in which AI use, with all its flaws and hallucinations, is becoming ever present, and we need to do it urgently.

The Earl of Effingham (Con)

My Lords, I thank the noble Baroness, Lady Kidron, for tabling her amendment. We understand its great intentions, which we believe are to prevent another scandal similar to that of Horizon and to protect innocent people from having to endure what thousands of postmasters have undergone and suffered.

However, while this amendment would make it easier to challenge evidence derived from, or produced by, a computer or computer system, we are concerned that, should it become law, this amendment could be misused by defendants to challenge good evidence. Our fear is that, in determining the reliability of such evidence, we may create a battle of the expert witnesses. This will not only substantially slow down trials but result in higher costs. Litigation is already expensive, and we would aim not to introduce additional costs to an already costly process unless absolutely necessary.

From our perspective, the underlying problem in the Horizon scandal was not that computer systems were critically wrong or that people were wrong, but that the two in combination drove the terrible outcomes that we have unfortunately seen. For many industries, regulations require firms to conduct formal systems validation, with serious repercussions and penalties should companies fail to do so. It seems to us that the disciplines of systems validation, if required for other industries, would be both a powerful protection and considerably less disruptive than potentially far-reaching changes to the law.

--- Later in debate ---
Lord Lucas (Con)

My Lords, having a system such as this would really focus the public sector on how we can generate more datasets. As I said earlier, education is an obvious one, but so is mobile phone data. All these companies have their licences. If a condition of the licence was that the data on how people move around the UK became a public asset, that would be hugely beneficial to policy formation. If we really understood how, why and when people move, we would make much better decisions. We could save ourselves huge amounts of money. We really ought to have this as a deep focus of government policy.

Lord Clement-Jones (LD)

My Lords, I have far too little time to do justice to this subject. We on these Benches welcome this amendment. It is entirely consistent with the sovereign health fund proposed by Future Care Capital and, indeed, with the proposals from the Tony Blair Institute for Global Change on a similar concept called the national data trust. Indeed, this concept formed part of our Liberal Democrat manifesto at the last general election, so of course I support the amendment.

It would be very useful to hear more about the national data library, including on its purpose and operation, as the noble Baroness, Lady Kidron, said. I entirely agree with her that there is a great need for a sovereign cloud service or services. Indeed, the inability to guarantee that data on the cloud is held in this country is a real issue that has not yet been properly addressed.

The Earl of Effingham (Con)

My Lords, I thank the noble Baroness, Lady Kidron, for moving this amendment. As she rightly identified, the UK has a number of publicly held data assets, many of which contain extremely valuable information. This data—I flag, by way of an example, NHS data specifically—could be extremely valuable to certain organisations, such as pharmaceutical companies.

We are drawn to the idea of licensing such data—indeed, we believe that we could charge an extremely good price—but we have a number of concerns. Most notably, what additional safeguards would be required, given its sensitivity? What would be the limits and extent of the licensing agreement? Would this status close off other routes to monetising the data? Would other public sector bodies be able to use the data for free? Can this not already be done without the amendment?

Although His Majesty’s Official Opposition of course recognise the wish to ensure that the UK taxpayer gets a fair return on our information assets held by public bodies and arm’s-length organisations, and we certainly agree that we need to look at licensing, we are not yet sure that this amendment is either necessary or sufficient. We once again thank the noble Baroness, Lady Kidron, for moving it. We look forward to hearing both her and the Minister’s thoughts on the matter.

--- Later in debate ---
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The reason for this group of amendments is to try to get an increased focus on the range of issues they touch on.

I turn to Amendment 211B first. It seems at least curious to have a data Bill without talking about data centres in terms of their power usage, their environmental impact and the Government’s view of the current PUE standard. Is it of a standard that they think gives the right measure of confidence to consumers and citizens across the country, in terms of how data centres are being operated and their impacts?

Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject but you have to go only one or two steps back in any supply chain to get into deep depths of opacity. With this amendment, I am seeking to gain more clarity on data supply chains and the role of data across all supply chains. Through the combination of data and AI, we could potentially enable a transformation of our supply chain in real time. That would give us so much more flexibility to try for economic benefits and environmental benefits. I look forward to the Minister’s response.

I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into some standards and approaches in this area, which would potentially go some way towards conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.

I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.

Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.

Lord Clement-Jones (LD)

My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on that. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes, the actions of the Government seem to go contrary to that. I support the noble Lord, Lord Holmes, in his call for essentially realising the benefits of AI while making sure that we maintain public trust.

The Earl of Effingham (Con)

My Lords, I thank my noble friend Lord Holmes of Richmond for tabling this amendment. As we all appreciate, taking stock of the effects of legislation is critical, as it allows us to see what has worked and what has not. Amendment 211B would require the Secretary of State to launch a consultation into the implications of the provisions of the Bill on the power usage and energy efficiency of data centres. His Majesty’s Official Opposition have no objection to the amendment’s aims but we wonder to what extent it is actually possible. By what means or benchmark can we identify whether a spike in energy usage is specifically due to a provision from this legislation, rather than as a result of some other factor? I should be most grateful if my noble friend could provide further detail on this matter in his closing speech.

Regarding Amendment 211C, we understand that much could be learned from a review of all data regulations and standards pertaining to the supply chains for financial, trade, and legal documents and products, although we wonder if this needs to happen the moment this Bill passes. Could this review not happen at any stage? By all means, let us do it sooner rather than later, but is it necessary to set a date in statute?

Moving on to Amendment 211D, we should certainly look to regulate the AI large language model sector to ensure that there are standards for the input and output of data for LLMs. However, this must be done in a way that does not stifle growth in this emerging industry.

Finally, we have some concerns about Amendment 211E. A national consultation on the use of individuals’ data is perhaps just too broad.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, listening to the noble Lord, Lord Lucas, is often an education, and today is no exception. I had no idea what local environmental records centres were, so I shall be very interested to hear what the Minister has to say in response.

The Earl of Effingham (Con)

My Lords, I thank my noble friend Lord Lucas for tabling Amendment 211F and all noble Lords for their brief contributions to this group.

Amendment 211F would ensure that all the biodiversity data collected by or in connection with government is gathered in local environmental records centres, so that records are as good as possible, and that the data is then used by or in connection with government, so that it is put to the best possible use.

The importance of sufficient and high-quality record collection cannot and must not be overstated. With this in mind, His Majesty’s Official Opposition support the sentiment of the amendment in my noble friend’s name. These Benches will always champion matters related to biodiversity and nature recovery. In fact, many of my noble friends have raised concerns about biodiversity in Committee debates in your Lordships’ House on the Crown Estate Bill, the Water (Special Measures) Bill and the Great British Energy Bill. Indeed, they have tabled amendments that ensure that matters related to biodiversity appear at the forefront of draft legislation.

With that in mind, I am grateful to my noble friend Lord Lucas for introducing provisions, via Amendment 211F, which would require any planning application involving biodiversity net gain to include a data search report from the relevant local environmental records centre. I trust that the Minister has listened to the concerns raised collaboratively in the debate on this brief group. We must recognise the importance of good data collection and ensure that such data is used in the best possible way.

Data (Use and Access) Bill [HL]

Baroness Kidron (CB)

My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.

I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes over the coming months. I reiterate the point made by many noble Lords in Committee: if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.

Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.

The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.

The issue of how the data Bill fails to address AI— and how the AI Opportunities Action Plan, and the government response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.

Lord Clement-Jones (LD)

My Lords, this is clearly box-office material, as ever.

I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.

Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals, going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—but nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.

Viscount Camrose (Con)

I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.

I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?

I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?

I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.

--- Later in debate ---
Before I sit down, I wanted to acknowledge that the AI action plan recommends in many places making it easier for organisations, including commercial companies, to access datasets, but it is silent on how citizens might be able to access and share their data collectively. Instead, it appears to assume that data mining is something that will happen to them, rather than by them or on their behalf. Matt Clifford, its author, is an AI tech investor. While there is much on which to agree with him when it comes to skills or investment in infrastructure, the relentless tech sector viewpoint, rather than that of worker, creator, citizen or child, is a weakness in itself and a problem in its timing. Those of us who would most like to be supportive of the UK being a tech-enabled nation find the needs of our communities and fellow citizens unserved by this unbridled tech utopianism that both recent history and some of the sector’s greatest innovators would suggest is very unwise. I beg to move.
Lord Clement-Jones (LD)

My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.

A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.

There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; and they can promote innovation by facilitating data-sharing and the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.

It is high time we move forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from that.

Viscount Camrose (Con)

I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party, to act on the group’s behalf in their data use.

There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.

I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.

Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.

--- Later in debate ---
Lord Leong (Lab)

My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.

Lord Clement-Jones (LD)

Again before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?

Lord Leong (Lab)

I thank the noble Lord for that request, and I am sure my officials would be willing to do that.

--- Later in debate ---
Lord Arbuthnot of Edrom (Con)

My Lords, I support my noble friend. I have a confession to make. Before this Bill came up, I foolishly thought that sex and gender were the same thing. I have discovered that they are not. Gender is not a characteristic defined in UK law. I believe that you are born with a biological sex, male or female, and that some people will choose, or need, to have a gender reassignment or to identify as a different gender. I thank the charity Sex Matters, which works to provide clarity on this issue of sex in law.

As my noble friend Lord Lucas said, the digital verification system currently operates on the basis of chosen gender, not of sex at birth. You can change your records on request without even having a gender recognition certificate. That means that, over the last five years, at least 3,000 people have changed their passports to show the wrong sex. Over the last six years, at least 15,000 people have changed their driving licences. The NHS has no records of how many people now have different sexes recorded from those they had at birth. It is thought that perhaps 100,000 people have one sex indicated in one record and a different sex in another. We cannot go on like that.

The consequences of this are really concerning. It means people with mismatched identities risk being flagged up as a synthetic identity risk. It means authorities with statutory safeguarding responsibilities will not be able to assess the risk that they are trying to deal with. It means that illnesses may be misdiagnosed and treatments misprescribed if the wrong sex is stated in someone’s medical records. The police will be unable to identify people if they are looking in the wrong records. Disclosure and Barring Service checks may fail to match individuals with the wrong sex. I hope that the Government will look again at correcting this. It is a really important issue.

Lord Clement-Jones (LD)

My Lords, I will speak to Amendments 7 and 9. Amendment 7 would require the Secretary of State to lay the DVS trust framework before Parliament. Given the volume of sensitive data that digital ID providers will be handling, it is crucial for Parliament to oversee the framework rules governing digital verification service providers.

The amendment is essentially one that was tabled in Committee by the noble Viscount, Lord Camrose. I thought that he expressed this well in Committee, emphasising that such a fundamental framework demands parliamentary approval for transparency and accountability, regardless of the document’s complexity. This is an important framework with implications for data privacy and security, and should not be left solely to the discretion of the Secretary of State.

The DPRRC, in its ninth report, and the Constitution Committee, in its third report of the Session, also believed that the DVS trust framework should be subject to parliamentary scrutiny. The former, because the framework has legislative effect, recommended using the affirmative procedure, which would require Parliament actively to approve the framework, since the Secretary of State otherwise has significant power without adequate parliamentary involvement. The Constitution Committee said:

“We reiterate our statement from our report on the Data Protection and Digital Information Bill that ‘[d]ata protection is a matter of great importance in maintaining a relationship of trust between the state and the individual. Access to personal data is beneficial to the provision of services by the state and assists in protecting national security. However, the processing of personal data affects individual rights, including the right to respect for private life and the right to freedom of expression. It is important that the power to process personal data does not become so broad as to unduly limit those rights’”.


Those views are entirely consistent with the committee’s earlier stance on a similar provision in the previous Data Protection and Digital Information Bill. That was why it was so splendid that the noble Viscount tabled that amendment in Committee. It was like a Damascene conversion.

The noble Baroness, Lady Jones, argued in Committee and in correspondence that the trust framework is a highly technical document that Parliament might find difficult to understand. That is a bit of a red rag to a bull. However, this argument fails to address the core concerns about democratic oversight. The framework aims to establish a trusted digital identity marketplace by setting requirements for providers to gain certification as trusted providers.

I am extremely grateful to the Minister, the Bill team and the department for allowing officials to give the noble Viscount, Lord Camrose, and me a tutorial on the trust framework. It depends heavily on being voluntary in nature, with the UK Accreditation Service essentially overseeing the certifiers, such as BSI, Kantara and the Age Check Certification Scheme, which certify the providers, with ISO 17065 as the governing standard.

Compliance is assured through the certification process, in which services are assessed against the framework rules by independent conformity assessment bodies accredited by the UK Accreditation Service. The trust framework establishes rules and standards for digital identity verification, but it does not directly contain specific provision for regulatory oversight, for redress mechanisms such as a specific ombudsman service, industry-led dispute resolution or set contract terms for consumer redress, or for enforcement powers. The Government say, however, that they intend to monitor the types of complaints received. Ultimately, the scope of the framework is limited to the rules that providers must follow in order to remain certificated; it does not address governance matters.

Periodic certification alone is not enough to ensure ongoing compliance and highlights the lack of an independent mechanism to hold the Secretary of State accountable. The noble Baroness, Lady Jones, stated in Committee that the Government preferred a light-touch approach to regulating digital verification services. She believed that excessive parliamentary scrutiny would hinder innovation and flexibility in this rapidly evolving sector.

The Government have consistently emphasised that they have no plans to introduce mandatory digital IDs or ID cards. The focus is on creating a secure and trusted system that gives citizens more choice and control over their data. The attributes trust framework is a crucial step towards achieving the goal of a secure, trusted and innovative digital identity market—all the more reason to get the process for approval right.

These services will inevitably be high-profile. Digital ID is a sensitive area which potentially also involves age verification. These services could have a major impact on data privacy and security. Public debate on such a critical issue is crucial to build trust and confidence in these systems. Laying the DVS trust framework before Parliament would allow for a wider range of voices and perspectives to be heard, ensuring a more robust and democratic approval process.

--- Later in debate ---
Lord Vallance of Balham (Lab)

I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support for this being such an important thing to make life easier for people.

I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.

However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.

Lord Clement-Jones (LD)

I am sorry to interrupt the Minister. He is equating digital identity theft to fraud, and that is not always the case. Is that the advice that he has received?

Lord Vallance of Balham (Lab)

The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything falls under that classification as fraud.

--- Later in debate ---
The rules in the framework are likely to act as a robust baseline for the independent conformity assessment process. Schemes such as this exist in many sectors, as I have said, and draw heavily on existing standards. The Secretary of State will have to undertake an annual review and consult the Information Commissioner and other appropriate stakeholders as part of that process. The trust framework’s development will be informed by industry and regulatory knowledge as the market evolves.
Lord Clement-Jones (LD)

I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?

Lord Vallance of Balham (Lab)

For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity accreditation bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.

On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.

On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards for key entities and their attributes, to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also been commenced via the domain expert group on the person entity, which has representatives from the Home Office, HMRC, the Office for National Statistics—importantly—NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.

The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.

--- Later in debate ---
Moved by
7: Clause 28, page 31, line 22, at end insert—
“(11) The Secretary of State must lay the DVS trust framework before Parliament.”
Member’s explanatory statement
This amendment will ensure Parliamentary oversight of the rules with which digital verification service providers must comply.
Lord Clement-Jones (LD)

My Lords, I support the conclusions of the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, and I beg leave to seek the opinion of the House.

--- Later in debate ---
Lord Vallance of Balham (Lab)

My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.

Viscount Camrose (Con)

My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.

--- Later in debate ---
Baroness Kidron (CB)

My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.

In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.

There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.

Lord Clement-Jones (LD)

My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.

The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.

Lord Sentamu (CB)

My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by a wonderful person from Gray’s Inn, who was responsible for drafting the constitution of Uganda’s independence, Sir Dingle Foot, he said a phrase which struck me, and which has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks that if there is to be scientific work, it must be conducted “in the public interest”. Law simply does not express itself for itself; it does it for the public, as a public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.

--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.

This one caught my eye in Committee. I felt suddenly, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government’s DNA—that is, to bring forward the right solution to the problems that we have been seeking to change in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials and people went away with it and came back with other versions, it got worse and not better. How children are dealt with and how important it is to make sure that they are prioritised appears to be one of those problems.

The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible way in which we drafted the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found a way of finding the right words that bridged the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.

When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we all would say does not quite go far enough; it needs a bit more, but not that much more—to make it meet with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 18 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.

Lord Clement-Jones (LD)

My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.

All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.

Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.

Viscount Camrose (Con)

My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.

Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.

There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.

That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.

As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.

I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.

On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.

However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.

--- Later in debate ---
Lord Clement-Jones (LD)

We take that as a yes.

Lord Stevenson of Balmacara (Lab)

This is a self-governing House.

Moved by
17: Clause 70, page 78, leave out lines 9 to 30
Member’s explanatory statement
This amendment removes powers for the Secretary of State to override primary legislation and modify key aspects of UK data protection law via statutory instrument.
Lord Clement-Jones (LD)

I apologise for interrupting the Minister, in what sounded almost like full flow. I am sure that he was so eager to move his amendment.

In moving Amendment 17, I will speak also to Amendment 21. These aim to remove the Secretary of State’s power to override primary legislation and modify key aspects of the UK data protection law via statutory instruments. They are similar to those proposed by me to the previous Government’s Data Protection and Digital Information Bill, which the noble Baroness, Lady Jones of Whitchurch, then in opposition, supported. These relate to Clauses 70(4) and 71(5).

There are a number of reasons to support accepting these amendments. The Delegated Powers and Regulatory Reform Committee has expressed concerns about the broad scope of the Secretary of State’s powers, as it did previously in relation to the DBS scheme. It recommended removing the power from the previous Bill, and in its ninth report it maintains this view for the current Bill. The Constitution Committee has said likewise; I will not read out what it said at the time, but I think all noble Lords know that both committees were pretty much on the same page.

The noble Baroness, Lady Jones, on the previous DPDI Bill, argued that there was no compelling reason for introducing recognised legitimate interests. On these Benches, we agree. The existing framework already allows for data sharing with the public sector and data use for national security, crime detection and safeguarding vulnerable individuals. However, the noble Baroness, in her ministerial capacity, argued that swift changes might be needed—hence the necessity for the Secretary of State’s power. Nevertheless, the DPRRC’s view is that the grounds for the lawful processing of personal data are fundamental and should not be subject to modification by subordinate legislation.

The letter from the Minister, the noble Lord, Lord Vallance, to the Constitution Committee and the DPRRC pretty much reiterates those arguments. I will not go through all of it again, but I note, in closing, that in his letter he said:

“I hope it will reassure the Committee that the power will be used only when necessary and in the public interest”.


He could have come forward with an amendment to that effect at any point in the passage of the Bill, but he has not. I hope that, on reflection—in the light of both committees’ repeated recommendations, the potential threats to individual privacy and data adequacy, and the lack of strong justification for these powers—the Minister will accept these two amendments. I beg to move.

The Deputy Speaker (Baroness Morris of Bolton) (Con)

My Lords, I must inform the House that if Amendment 17 is agreed to, I cannot call Amendment 18 for reasons of pre-emption.

--- Later in debate ---
Lord Vallance of Balham (Lab)

My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.

Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.

In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23.1 of the UK GDPR.

The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.

I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23.1 of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for that considered reply. It went into more detail than the letter he sent to the two committees, so I am grateful for that, and it illuminated the situation somewhat. But at the end of the day, the Minister is obviously intent on retaining the regulation-making power.

I thank the noble Viscount, Lord Camrose, for his support—sort of—in principle. I am not quite sure where that fitted; it was post-ministerial language. I think he needs to throw off the shackles of ministerial life and live a little. These habits die hard but in due course, he will come to realise that there are benefits in supporting amendments that do not give too much ministerial power.

Turning to one point of principle—I am not going to press either amendment—it is a worrying trend that both the previous Government and this Government seem intent on simply steamrollering through powers for Secretaries of State in the face of pretty considered comment by House of Lords committees. This trend has been noted, first for skeletal Bills and secondly for Bills that, despite being skeletal, include a lot of regulation-making power for Secretaries of State, and Henry VIII powers. So I just issue a warning that we will keep returning to this theme and we will keep supporting and respecting committees of this House, which spend a great deal of time scrutinising secondary legislation and warning of overweening executive power. In the meantime, I beg leave to withdraw Amendment 17.

Amendment 17 withdrawn.
--- Later in debate ---
I apologise again for all the detail, but this is how we create economic growth: by preventing regulators stifling activity such as this. I beg to move.
Lord Clement-Jones (LD)

My Lords, I do not think the noble Baroness, Lady Harding, lost the audience at all; she made an excellent case. Before speaking in support of the noble Baroness, I should say, “Blink, and you lose a whole group of amendments”. We seem to have completely lost sight of the group starting with Amendment 19—I know the noble Lord, Lord Holmes, is not here—and including Amendments 23, 74 and government Amendment 76, which seems to have been overlooked. I suggest that we degroup next week and come back to Amendments 74 and 76. I do not know what will happen to Amendment 23; I am sure there is a cunning plan on the Opposition Front Bench to reinstate that in some shape or form. I just thought I would gently point that out, since we are speeding along and forgetting some of the very valuable amendments that have been tabled.

I very much support, as I did in Committee, what the noble Baroness, Lady Harding, said about Amendment 24, which aims to clarify the use of open electoral register data for direct marketing. The core issue is the interpretation of Article 14 of the GDPR, specifically regarding the disproportionate effort exemption. The current interpretation, influenced by recent tribunal rulings, suggests that companies using open electoral register—OER—data would need to notify every individual whose data is used, even if they have not opted out. As the noble Baroness, Lady Harding, implied, notifying millions of individuals who have not opted out is unnecessary and burdensome. Citizens are generally aware of the OER system, and those who do not opt out reasonably expect to receive direct marketing materials. The current interpretation leads to excessive, unhelpful notifications.

There are issues about financial viability. Requiring individual notifications for the entire OER would be financially prohibitive for companies, potentially leading them to cease using the register altogether. On respect for citizens’ choice, around 37% of voters choose not to opt out of OER use for direct marketing, indicating their consent to such use. The amendment upholds this choice by exempting companies from notifying those individuals, which aligns with the GDPR’s principle of respecting data subject consent.

On clarity and certainty, Amendment 24 provides clear exemptions for OER data use, offering legal certainty for companies while maintaining data privacy and adequacy. This addresses the concerns about those very important tribunal rulings creating ambiguity and potentially disrupting legitimate data use. In essence, Amendment 24 seeks to reconcile the use of OER data for direct marketing with the principles of transparency and data subject rights. On that basis, we on these Benches support it.

I turn to my amendment, which seeks a soft opt-in for charities. As we discussed in Committee, a soft opt-in in Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 allows organisations to send electronic mail marketing to existing customers without their consent, provided that the communication is for similar products and services and the messages include an “unsubscribe” link. The soft opt-in currently does not apply to non-commercial organisations such as charities and membership organisations. The Data & Marketing Association estimates that extending the soft opt-in to charities would

“increase … annual donations in the UK by £290 million”.

Extending the soft opt-in as proposed in both the Minister’s and my amendment would provide charities with a level playing field, as businesses have enjoyed this benefit since the introduction of the Privacy and Electronic Communications Regulations. Charities across the UK support this change. For example, the CEO of Mind stated:

“Mind’s ability to reach people who care about mental health is vital. We cannot deliver life changing mental health services without the financial support we receive from the public”.


Oxfam’s individual engagement director noted:

“It’s now time to finally level the playing field for charities too and to allow them to similarly engage their passionate and committed audiences”.


Topically, too, this amendment is crucial to help charities overcome the financial challenges they face due to the cost of living crisis and the recent increase in employer national insurance contributions. So I am delighted, as I know many other charities will be, that the Government have proposed Amendment 49, which achieves the same effect as my Amendment 50.

Lord Stevenson of Balmacara (Lab)

My Lords, I declare an interest that my younger daughter works for a charity which will rely heavily on the amendments that have just been discussed by the noble Lord, Lord Clement-Jones.

I want to explain that my support for the amendment moved by the noble Baroness, Lady Harding, was not inspired by any quid pro quo for earlier support elsewhere—certainly not. Looking through the information she had provided, and thinking about the issue and what she said in her speech today, it seemed there was an obvious injustice happening. It seemed wrong, in a period when we were trying to support growth, that we could not see our way through it. It was in that spirit that I suggested we should push on with it and bring it back on Report, and I am very happy to support it.

--- Later in debate ---
I strongly commend this amendment to the House, and I am minded to test the opinion of the House.
Lord Clement-Jones (LD)

My Lords, I will speak to Amendments 28, 29, 33, 34 and 36. I give notice that I will only speak formally to Amendment 33. For some reason, it seems to have escaped this group and jumped into the next one.

As we discussed in Committee, and indeed on its previous versions, the Bill removes the general prohibition on solely automated decisions and places the responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. The Bill also amends Article 22 of the GDPR so that protection against solely automated decision-making applies only to decisions made using sensitive data such as race, religion and health data. This means that decisions based on other personal data, such as postcode, nationality, sex or gender, would be subject to weaker safeguards, increasing the risk of unfair or discriminatory outcomes. This will allow more decisions with potentially significant impacts to be made without human oversight, even if they do not involve sensitive data. This represents a significant weakening of existing protection against unsafe automated decision-making. That is why I tabled Amendment 33 to leave out the whole clause.

However, the Bill replaces the existing Article 22 with Articles 22A to 22D, which redefine automated decisions and allow for solely automated decision-making in a broader range of circumstances. This change raises concerns about transparency and the ability of individuals to challenge automated decisions. Individuals may not be notified about the use of ADM, making it difficult to exercise their rights. Moreover, the Bill’s safeguards for automated decisions, particularly in the context of law enforcement, are weaker compared with the protections offered by the existing Article 22. This raises serious concerns about the potential for infringement of people’s rights and liberties in areas such as policing, where the use of sensitive data in ADM could become more prevalent. Additionally, the lack of clear requirements for personalised explanations about how ADM systems reach decisions further limits individuals’ understanding of and ability to challenge outcomes.

In the view of these Benches, the Bill significantly weakens safeguards around ADM, creates legal uncertainty due to vague definitions, increases the risk of discrimination, and limits transparency and redress for individuals—ultimately undermining public trust in the use of these technologies. I retabled Amendments 28, 29, 33 and 34 from Committee to address continuing concerns regarding these systems. The Bill lacks clear definitions of crucial terms such as “meaningful human involvement” and, similarly, “significant effect”, which are essential for determining the scope of protection. That lack of clarity could lead to varying interpretations and inconsistencies in application, creating legal uncertainty for individuals and organisations.

In Committee, the noble Baroness, Lady Jones, emphasised the Government’s commitment to responsible ADM and argued against defining meaningful human involvement in the Bill, but instead for allowing the Secretary of State to define those terms through delegated legislation. However, that raises concerns about transparency and parliamentary oversight, as these are significant policy decisions. Predominantly automated decision-making should be included in Clause 80, as in Amendment 28, as a decision may lack meaningful human involvement and significantly impact individuals’ rights. The assertion by the noble Baroness, Lady Jones, that predominantly automated decisions inherently involve meaningful human oversight can be contested, particularly given the lack of a clear definition of such involvement in the Bill.

There are concerns that changes in the Bill will increase the risk of discrimination, especially for marginalised groups. The noble Baroness, Lady Jones, asserted in Committee that the data protection framework already requires adherence to the Equality Act. However, that is not enough to prevent algorithmic bias and discrimination in ADM systems. There is a need for mandatory bias assessments of all ADM systems, particularly those used in the public sector, as well as for greater transparency in how those systems are developed and deployed.

We have not returned to the fray on the ATRS, but it is clear that a statutory framework for the ATRS is necessary to ensure its effectiveness and build trust in public sector AI. Despite the assurance by the noble Baroness, Lady Jones, that the ATRS is mandatory for government departments, its implementation relies on a cross-government policy mandate that lacks statutory backing and may prove insufficient to ensure the consistent and transparent use of algorithmic tools.

My Amendment 34 seeks to establish requirements for public sector organisations using ADM systems. Its aim is to ensure transparency and accountability in the use of these systems by requiring public authorities to publish details of the systems they use, including the purpose of the system, the data used and any mitigating measures to address risks. I very much welcome Amendment 35 from the noble Baroness, Lady Freeman, which would improve it considerably and which I have also signed. Will the ATRS do as good a job as that amendment?

Concerns persist about the accessibility and effectiveness of this mechanism for individuals seeking redress against potentially harmful automated decisions. A more streamlined and user-friendly process for challenging automated decisions is needed in the age of increasing ADM. The lack of clarity and specific provisions in the Bill raises concerns about its effectiveness in mitigating the risks posed by automated systems, particularly in safeguarding vulnerable groups such as children.

My Amendment 36 would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the Information Commissioner’s Office, or to clearly set out their reasoning as to why that is not required within six months of the Act passing. The amendment is aimed at addressing the ambiguity surrounding “meaningful human involvement” and ensuring that there is a clear understanding of what constitutes appropriate human oversight in ADM processes.

I am pleased that the Minister has promised a code of practice, but what assurance can he give regarding the forthcoming ICO code of practice about automated decision-making? How will it provide clear guidance on how to implement and interpret the safeguards for ADM, and will it address the definition of meaningful human involvement? What forms of redress will it require to be established? What level of transparency will be required? A code of conduct offered by the Minister would be acceptable, provided that the Secretary of State did not have the sole right to determine the definition of meaningful human involvement. I therefore hope that my Amendment 29 will be accepted alongside Amendment 36, because it is important that the definition of such a crucial term should be developed independently, and with the appropriate expertise, to ensure that ADM systems are used fairly and responsibly, and that individual rights are adequately protected.

Amendments 31 and 32 from the Opposition Front Bench seem to me to have considerable merit, particularly Amendment 32, in terms of the nature of the human intervention. However, I confess to some bafflement as to the reasons for Amendment 26, which seeks to insert the OECD principles set out in the AI White Paper. Indeed, they were the G20 principles as well and are fully supportable in the context of an AI Bill, for instance, and I very much hope that will form Clause 1 of a new AI Bill going forward. I am not going to go into great detail, but I wonder whether those principles are already effectively addressed in data protection legislation. If we are not careful, we are going to find a very confused regulator in these circumstances. So, although there is much to commend the principles as such, whether they are a practical proposition in a Bill of this nature is rather moot.

Baroness Freeman of Steventon Portrait Baroness Freeman of Steventon (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I support Amendment 34 from the noble Lord, Lord Clement-Jones, and will speak to my own Amendment 35, which amends it. When an algorithm is being used to make important decisions about our lives, it is vital that everyone is aware of what it is doing and what data it is based on. On Amendment 34, I know from having had responsibility for algorithmic decision support tools that users are very interested in how recent the data it is based on is, and how relevant it is to them. Was the algorithm derived from a population that included people who share their characteristics? Subsection (1)(c)(ii) of the new clause proposed in Amendment 34 refers to regular assessment of the data used by the system. I would hope that this would be part of the meaningful explanation to individuals to be prescribed by the Secretary of State in subsection (1)(b).

Amendment 35 would add to this that it is vital that all users and procurers of such a system understand its real-world efficacy. I use the word “efficacy” rather than “accuracy” because it might be difficult to define accuracy with regard to some of these systems. The procurer of any ADM system should want to know how accurate it is using realistic testing, and users should also be aware of those findings. Does the system give the same outcome as a human assessor 95% or 60% of the time? Is that the same for all kinds of queries, or is it more accurate for some groups of people than others? The efficacy is really one of the most important aspects and should be public. I have added an extra line that ensures that this declaration of efficacy would be kept updated. One would hope that the performance of any such system would be monitored anyway, but this ensures that the outcomes of such monitoring are in the public domain.

In Committee, the Minister advised us to wait for publication of the algorithmic transparency records that were released in December. Looking at them, I think they make clear the much greater need for guidance and stringency in what should be mandated. I will give two short examples from those records. For the DBT: Find Exporters algorithm, under “Model performance” it merely says that it uses Brier scoring and other methods, without giving any actual results of that testing to indicate how well it performs. It suggests looking at the GitHub pages. I followed that link, and it did not allow me in. The public have no access to those pages. This is why these performance declarations need to be mandated and forced to be in the public domain.

In the second example, the Cambridgeshire trial of an externally supplied object detection system just cites the company’s test data, claiming average precision in a “testing environment” of 43.5%. This does not give the user a lot of information. Again, it links to GitHub pages produced by the supplier. Admittedly, this is a trial, so perhaps the Cambridgeshire Partnership will update it with its real-world trial data. But that is why we need to ensure annual updates of performance data and ensure that that data is not just a report of the supplier’s claims in a test environment.

The current model of algorithmic transparency records is demonstrably not fit for purpose, and these provisions would help put them on a much firmer footing. These systems, after all, are making life-changing decisions for all of us and we all need to be sure how well they are doing and put appropriate levels of trust in them accordingly.

--- Later in debate ---
Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- View Speech - Hansard - - - Excerpts

I start with Amendment 26, tabled by the noble Viscount, Lord Camrose. As he said in Committee, a principles-based approach ensures that our rules remain fit in the face of fast-evolving technologies by avoiding being overly prescriptive. The data protection framework achieves this by requiring organisations to apply data protection principles when personal data is processed, regardless of the technology used.

I agree with the principles that are present for AI, which are useful in the context in which they were put together, but introducing separate principles for AI could cause confusion around how data protection principles are interpreted when using other technologies. I note the comment that there is a significant overlap between the principles, and the comment from the noble Viscount that there are situations in which one would catch things and another would not. I am unable to see what those particular examples are, and I hope that the noble Viscount will agree with the Government’s rationale for seeking to protect the framework’s technology-neutral set of principles, rather than having two separate sets.

Amendment 28 from the noble Lord, Lord Clement-Jones, would extend the existing safeguards for decisions based on solely automated processing to decisions based on predominantly automated processing. These safeguards protect people when there is no meaningful human involvement in the decision-making. The introduction of predominantly automated decision-making, which already includes meaningful human involvement—and I shall say a bit more about that in a minute—could create uncertainty over when the safeguards are required. This may deter controllers from using automated systems that have significant benefits for individuals and society at large. However, the Government agree with the noble Viscount on strengthening the protections for individuals, which is why we have introduced a definition for solely automated decision-making as one which lacks “meaningful human involvement”.

I thank noble Lords for Amendments 29 and 36 and the important points raised in Committee on the definition of “meaningful human involvement”. This terminology, introduced in the Bill, goes beyond the current UK GDPR wording to prevent cursory human involvement being used to rubber stamp decisions as not being solely automated. The point at which human involvement becomes meaningful is context specific, which is why we have not sought to be prescriptive in the Bill. The ICO sets out in its guidance its interpretation that meaningful human involvement must be active: someone must review the decision and have the discretion to alter it before the decision is applied. The Government’s introduction of “meaningful” into primary legislation does not change this definition, and we are supportive of the ICO’s guidance in this space.

As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do this, or to advise Parliament or the Government if it considers that the law needs clarification. The Government also acknowledge that there may be a need to provide further legal certainty in future. That is why there are a number of regulation-making powers in Article 22D, including the power to describe meaningful human involvement or to add additional safeguards. These could be used, for example, to impose a timeline on controllers to provide human intervention upon the request of the data subject, if evidence suggested that this was not happening in a timely manner following implementation of these reforms. Any regulations must follow consultation with the ICO.

Amendment 30 from the noble Baroness, Lady Kidron, would prevent law enforcement agencies seeking the consent of a young person to the processing of their special category or sensitive personal data when using automated decision-making. I thank her for this amendment and agree about the importance of protecting the sensitive personal data of children and young adults. We believe that automated decision-making will continue to be rarely deployed in the context of law enforcement decision-making as a whole.

Likewise, consent is rarely used as a lawful basis for processing by law enforcement agencies, which are far more likely to process personal data for the performance of a task, such as questioning a suspect or gathering evidence, as part of a law enforcement process. Where consent is needed—for example, when asking a victim for fingerprints or something else—noble Lords will be aware that Clause 69 clearly defines consent under the law enforcement regime as

“freely given, specific, informed and unambiguous”

and

“as easy … to withdraw … as to give”.

So the tight restrictions on its use will be crystal clear to law enforcement agencies. In summary, I believe the taking of an automated decision based on a young person’s sensitive personal data, processed with their consent, to be an extremely rare scenario. Even when it happens, the safeguards that apply to all sensitive processing will still apply.

I thank the noble Viscount, Lord Camrose, for Amendments 31 and 32. Amendment 31 would require the Secretary of State to publish guidance specifying how law enforcement agencies should go about obtaining the consent of the data subject to process their data. To reiterate a point made by my noble friend Lady Jones in Committee, Clause 69 already provides a definition of “consent” and sets out the conditions for its use; they apply to all processing under the law enforcement regime, not just automated decision-making, so the Government believe this amendment is unnecessary.

Amendment 32 would require the person reviewing an automated decision to have sufficient competence and authority to amend the decision if required. In Committee, the noble Viscount also expressed the view that a person should be “suitably qualified”. Of course, I agree with him on that. However, as my noble friend Lady Jones said in Committee, the Information Commissioner’s Office has already issued guidance which makes it clear that the individual who reconsiders an automated decision must have the “authority and competence” to change it. Consequently, the Government do not feel that it is necessary to add further restrictions in the Bill as to the type of person who can carry out such a review.

The noble Baroness, Lady Freeman, raised extremely important points about the performance of automated decision-making. The Government already provide a range of products, but A Blueprint for Modern Digital Government, laid this morning, makes it clear that part of the new digital centre’s role will be to offer specialist assurance support, including, importantly in relation to this debate,

“a service to rigorously test models and products before release”.

That function will be in place and available to departments.

On Amendments 34 and 35, my noble friend Lady Jones previously advised the noble Lord, Lord Clement-Jones, that the Government would publish new algorithmic transparency recording standard records imminently. I am pleased to say that 14 new records were published on 17 December, with more to follow. I accept that these are not yet in the state in which we would wish them to be. Where these amendments seek to ensure that the efficacy of such systems is evaluated, A Blueprint for Modern Digital Government, as I have said, makes it clear that part of the digital centre’s role will be to offer such support, including this service. I hope that this provides reassurance.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, before the Minister sits down, I was given considerable assurance between Committee and Report that a code of practice, drawn up with the ICO, would be quite detailed in how it set out the requirements for those engaging in automated decision-making. The Minister seems to have given some kind of assurance that it is possible that the ICO will come forward with the appropriate provisions, but he has not really given any detail as to what that might consist of and whether that might meet some of the considerations that have been raised in Committee and on Report, not least Amendments 34 and 35, which have just been discussed as if the ATRS was going to cover all of that. Of course, any code would no doubt cover both the public and private sectors. What more can the Minister say about the kind of code that would be expected? We seem to be in somewhat of a limbo in this respect.

Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

I apologise; I meant to deal with this at the end. I think I am dealing with the code in the next group.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Well, that is something to look forward to.

--- Later in debate ---
Moved by
33: Leave out Clause 80
Member's explanatory statement
This is a probing amendment intended to elicit assurances from the Minister regarding the forthcoming ICO code of practice about automated decision-making.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, we have waited with bated breath for the Minister to share his hand, and I very much hope that he will reveal the nature of his bountiful offer of a code of practice on the use of automated decision-making.

I will wear it as a badge of pride to be accused of introducing an analogue concept by the noble Viscount, Lord Camrose. I am still keen to see the word “predominantly” inserted into the Bill in reference to automated decision-making.

As the Minister can see, there is considerable unhappiness with the nature of Clause 80. There is a view that it does not sufficiently protect the citizen in the face of automated decision-making, so I hope that he will be able to elaborate further on the nature of those protections.

I will not steal any of the thunder of the noble Baroness, Lady Kidron. For some unaccountable reason, Amendment 33 is grouped with Amendment 41. The groupings on this Bill have been rather peculiar and at this time of night I do not think any long speeches are in order, but it is important that we at least have some debate about the importance of a code of conduct for the use of AI in education, because it is something that a great many people in the education sector believe is necessary. I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà-vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen rollout of AI in almost all products and services that children use. In one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.

Amendment 41 does not seek to solve what is a global issue of an industry arrogantly flying a little too close to the sun and it does not grasp how we could use this extraordinary technology and put it to use for humankind on a more equitable basis than the current extractive and winner-takes-all model; it is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is not new nor confined to these uses. It is all around us and, in particular, it is all around children.

In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10 year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13 year-old girl tips on how to lose her virginity with a 31 year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14 year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that encouraged him to commit suicide.

In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves, take their own lives and so on before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risk and put procedures in place to deal with emerging risks.

One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three to six year-olds believed that smart speakers have thoughts, feelings and social abilities and are more reliable than human beings when it came to answering fact-based questions.

I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.

--- Later in debate ---
Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

It will be clear to the ICO from the amendments that have been tabled and my comments that there is an expectation that it should take into account the discussion we have had on this Bill.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I thank the Minister for his very considered response. In the same way as the noble Baroness, Lady Kidron, I take it that, effectively, the Minister is pledging to engage directly with us and others about the nature and contents of the code, and that the ICO will also engage on that. As the Minister knows, the definition of terms such as meaningful human engagement is something that we will wish to discuss and consider in the course of that engagement. I hope that the AI edtech code will also be part of that.

I thank the Minister. I know he has had to think about this quite carefully during the Bill’s passage. Currently, Clause 80 is probably the weakest link in the Bill, and this amendment would go some considerable way towards repairing it. My final question is not to the Minister, but to the Opposition: what on earth have they got against the UN? In the meantime, I beg leave to withdraw my amendment.

Amendment 33 withdrawn.
--- Later in debate ---
Moved by
37: After Clause 84, insert the following new Clause—
“Impact of this Act and other developments at national and international level on EU data adequacy decision
Before the European Union’s reassessment of data adequacy in June 2025, the Secretary of State must carry out an assessment of the likely impact on the European Union data adequacy decisions relating to the United Kingdom of the following—
(a) this Act;
(b) other changes to the United Kingdom’s domestic frameworks which are relevant to the matters listed in Article 45(2) of the UK GDPR (transfers on the basis of an adequacy decision);
(c) relevant changes to the United Kingdom’s international commitments or other obligations arising from legally binding conventions or instruments, as well as from its participation in multilateral or regional systems, in particular in relation to the protection of personal data.”
Member's explanatory statement
This amendment requires the Secretary of State to carry out an assessment of the impact of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy.
--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, Amendment 37 is on the subject of data adequacy, which has been a consistent issue throughout the passage of the Bill. The mechanism put forward in the amendment would review the question of data adequacy.

Safeguarding data adequacy is crucial for the UK’s economy and international partnerships. Losing data adequacy status would impose significant costs and administrative burdens on businesses and public sector organisations that share data between the UK and the EU. It would also hinder international trade and economic co-operation, and trust in the UK’s digital economy, contradicting the Government’s objective of economic growth. I hope very much that the Government are proactively engaging with the European Commission to ensure a smooth adequacy renewal process this year.

Early engagement and reassurance about the retention of adequacy status are of crucial importance, given the looming deadline of June this year. This includes explaining and providing reassurance regarding any planned data protection reforms, particularly concerning the independence of the Information Commissioner’s Office, ministerial powers to add new grounds—for instance, recognised legitimate interest for data processing—and the new provisions relating to automated decision-making.

Despite assurances from the noble Baroness, Lady Jones, that the proposed changes will not dilute data subjects’ rights or threaten EU adequacy, proactive engagement with the EU and robust safeguards are necessary to ensure the continued free flow of data while maintaining high data protection standards. The emphasis on proportionality as a safeguard against the dilution of data subjects’ rights, as echoed by the noble Baroness, Lady Jones, and the ICO, is insufficient. The lack of a clear definition of proportionality within the context of data subjects’ rights could provide loopholes for controllers and undermine the essential equivalence required for data adequacy. The Bill’s reliance on the ICO’s interpretation of proportionality without explicit legislative clarity could be perceived as inadequate by the European Commission, particularly in areas such as web scraping for AI training.

The reassurance that the Government are taking data adequacy seriously and are committing to engaging with the EU needs to be substantiated by concrete actions. The Government do not, it appears, disclose assessments and reports relating to the compatibility of the UK’s domestic data protection framework with the Council of Europe’s Convention 108+, and that raises further concerns about transparency and accountability. Access to this information would enable scrutiny and informed debate, ultimately contributing to building trust and ensuring compatibility with international data protection standards.

In conclusion, while the Government maintain that this Bill would not jeopardise data adequacy, the concerns raised by myself and others during its passage mean that I continue to believe that a comprehensive review of EU data adequacy, as proposed in Amendment 37, is essential to ensure the continued free flow of data, while upholding high data protection standards and maintaining the UK’s position as a trusted partner in international data exchange. I beg to move.

--- Later in debate ---
Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- View Speech - Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.

The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.

It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.

That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.

I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, the noble and learned Lord, Lord Thomas, whose intervention I very much appreciated, particularly at this time of the evening, talked about a fresh pair of eyes. What kind of reassurance can the Minister give on that?

Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

It is worth remembering that the ultimate decision is with the EU Commission and we are quite keen to have its eyes on it now, which is why we are engaging with it very carefully. It is looking at it as we are going through it—we are talking to it and we have dedicated teams of people brought together specifically to do this. There are several people from outside the direct construct of the Bill who are looking at this to make sure that we have adequacy and are having very direct conversations with the EU to ensure that that process is proceeding as we would wish it to.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I thank the Minister for his response. It would be very reassuring if it was our own fresh pair of eyes rather than across the North Sea. That is all I can say as far as that is concerned. I appreciate what he said—that the Government are taking this seriously. It is a continuing concern precisely because the chair of the European Affairs Committee wrote to the Government. It is a continuing issue for those of us observing the passage of the Bill and we will continue to keep our eyes on it as we go forward. I very much hope that June 2025 passes without incident and that the Minister’s predictions are correct. In the meantime, I beg leave to withdraw the amendment.

Amendment 37 withdrawn.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to open the second day on Report on the Data (Use and Access) Bill. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 38 in my name, I will not speak to any other amendments in this group.

Amendment 38 goes to the heart of the issue du jour: regulators have seldom been so much in the press and in the public eye. As the press would have it, they were hauled into No. 11 just a few days ago, but this speaks to what we want from our regulators across our economy and society. At their best, our regulators are the envy of the world. Just consider the FCA and the fintech regulatory sandbox: as a measure of its success, it was replicated in well over 50 jurisdictions around the world.

We know how to do right-sized regulation and how to set up our regulators to succeed in that most difficult of tasks—balancing innovation, economic growth, and consumers’ and citizens’ rights. That is what all regulators should be about. It is not straightforward; it is complex but entirely doable.

Amendment 38 simply proposes wording to assist the Information Commissioner’s Office. When it comes to the economic growth duty—“#innovation”—it simply refers back to Section 108 of the 2015 Act. I believe that bringing this clarity into the Bill will assist the regulator and enable all the conversations that are rightly going on right now, and all the plans that are being produced and reported on, such as those around AI, to be properly discussed and given proper context, with an Information Commissioner’s Office that is supported through clarity as to its responsibilities and obligations when it comes to economic growth. In simple terms, this would mean that these responsibilities are restricted and clearly set out according to Section 108 of the 2015 Act. It is critical that this should be the case if we are to have clarity around the commissioner’s independence as a supervisory authority on data protection, an absolutely essential condition for EU adequacy decisions.

I look forward to the Minister’s response. I hope that he likes my drafting. I hope that he will accept and incorporate my amendment into the Bill. I look forward to the debate. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I rise to support Amendment 38 in the name of the noble Lord, Lord Holmes. More than ever before, the commissioner, alongside other regulators, is being pressured to support the Government’s growth and innovation agenda. In Clause 90, the Bill places unprecedented obligations on the ICO to support innovation. The question, in respect of both the existing growth duty and Clause 90, is whether they are in any sense treated as overriding the ICO’s primary responsibilities in data protection and information rights. How does the ICO aim to balance those duties, ensuring that its regulatory actions support economic growth while maintaining necessary protections?

We need to be vigilant. As it is, there are criticisms regarding the way the Information Commissioner’s Office carries out its existing duties. Those criticisms can be broadly categorised into issues with enforcement, independence and the balancing of competing interests. The ICO has a poor record on enforcement; it has been reluctant to issue fines, particularly to public sector organisations, and, as I described in Committee, it has been relying heavily on reprimands rather than stronger enforcement actions. It has also been accused of being too slow with its investigations.

There are concerns about these new duties, which could pose threats to the ability of the Information Commissioner’s Office to effectively carry out its primary functions. For that reason, we support the amendment from the noble Lord, Lord Holmes.

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I support the amendment in the name of the noble Baroness, Lady Kidron, to which I have added my name. I will speak briefly because I wish to associate myself with everything that she has said, as is normal on these topics.

Those of us who worked long and hard on the Online Safety Act had our fingers burnt quite badly when things were not written into the Bill. While I am pleased—and expect to be even more pleased in a few minutes—that the Government are in favour of some form of code of conduct for edtech, whether through the age-appropriate design code or not, I am nervous. As the noble Baroness, Lady Kidron said, every day with Ofcom we are seeing the risk-aversion of our regulators in this digital space. Who can blame them when it appears to be the flavour of the month to say that, if only the regulators change the way they behave, growth will magically come? We have to be really mindful that, if we ask the ICO to do this vaguely, we will not get what we need.

The noble Baroness, Lady Kidron, as ever, makes a very clear case for why it is needed. I would ask the Minister to be absolutely explicit about the Government’s intention, so that we are giving very clear directions from this House to the regulator.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. I have added a few further words to my speech in response, because she made an extremely good point. I pay tribute to the noble Baroness, Lady Kidron, and her tenacity in trying to make sure that we secure a code for children’s data and education, which is so needed. The education sector presents unique challenges for protecting children’s data.

Like the noble Baronesses, Lady Kidron and Lady Harding, I look forward to what the Minister has to say. I hope that whatever is agreed is explicit; I entirely agree with the noble Baroness, Lady Harding. I had my own conversation with the Minister about Ofcom’s approach to categorisation which, quite frankly, does not follow what we thought the Online Safety Act was going to imply. It is really important that we absolutely tie down what the Minister has to say.

The education sector is a complex environment. The existing regulatory environment does not adequately address the unique challenges posed by edtech, as we call it, and the increasing use of children’s data in education. I very much echo what the noble Baroness, Lady Kidron, said: children attend school for education, not to be exploited for data mining. Like her, I cross over into considering the issues related to the AI and IP consultation.

The worst-case scenario is using an opt-in system that might incentivise learners or parents to consent, whether that is to state educational institutions, to providers such as Pearson, to exam boards or to any other entity. I hope that, in the other part of the forest, so to speak, that will not take place to the detriment of children. In the meantime, I very much look forward to what the Minister has to say on Amendment 44.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.

AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.

However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.

We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.

However, as I said earlier at Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law, and specifically, in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I can be pretty brief. We have had some fantastic speeches, started by the noble Baroness, Lady Kidron, with her superb rallying cry for these amendments, which we 100% support on these Benches. As she said, there is cross-party support. We have heard support from all over the House and, as the noble and learned Baroness, Lady Butler-Sloss, has just said, there has not been a dissenting voice.

I have a long association with the creative industries and with AI policy and yield to no one in my enthusiasm for AI—but, as the noble Baroness said, it should not come at the expense of the creative industries. It should not just be for the benefit of DeepSeek or Silicon Valley. We are very clear where we stand on this.

I pay tribute to the Creative Rights in AI Coalition and its campaign, which has been so powerful in garnering support, and to all those in the creative industries and creators themselves who briefed noble Lords for this debate.

These amendments respond to deep concerns that AI companies are using copyright material without permission or compensation. With the new government consultation, I do not believe that their preferred option is a straw man: a text and data mining exemption, with an opt-out, which we thought was settled under the previous Government. It starts from the false premise of legal uncertainty, as we have heard from a number of noble Lords. As the News Media Association has said, the Government’s consultation is based on a mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue. The use of copyrighted content without a licence by gen AI firms is theft on a mass scale and there is no objective case for a new text and data mining exception.

No effective opt-out system for the use of content by gen AI models has been proposed or implemented anywhere in the world, making the Government’s proposals entirely speculative. It is vital going forward that we ensure that AI companies cannot use copyrighted material without permission or compensation; that AI development does not exploit loopholes to bypass copyright laws; that AI developers disclose the sources of the data they use for training their models, allowing for accountability and addressing infringement; and that we reinforce the existing copyright framework, rather than creating new exceptions that disadvantage creators.

These amendments would provide a mechanism for copyright holders to contest the use of their work and ensure a route for payment. They seek to ensure that AI innovation does not come at the expense of the rights and livelihoods of creators. There is no market failure. We have a well-established licensing system as an alternative to the Government’s proposed opt-out scheme for AI developers using copyrighted works. A licensing system is the only sustainable solution that benefits both creative industries and the AI sector. We have some of the most effective collective rights organisations in the world. Licensing is their bread and butter. Merely because AI platforms are resisting claims does not mean that the law in the UK is uncertain.

Amending UK law to address the challenges posed by AI development, particularly in relation to copyright and transparency, is essential to protect the rights of creators, foster responsible innovation and ensure a sustainable future for the creative industries. This should apply to developers who market their product in the UK, regardless of the country in which the scraping of copyright material or the training of the model takes place. It would also ensure that AI start-ups based in the UK are not put at a competitive disadvantage due to the ability of international firms to conduct training in a different jurisdiction.

As we have heard throughout this debate, it is clear that the options proposed by the Government have no proper economic assessment underpinning them, no technology for an opt-out underpinning them and no enforcement mechanism proposed. It baffles me why the Conservative Opposition is not supporting these amendments, and I very much hope that the voices we have heard on the Conservative Benches will make sure that these amendments pass with acclamation.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to the noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.

In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, our view is that the Government’s consultation is in mid-flight, and we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will evidence some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.

Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model; and, secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.

I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.

--- Later in debate ---
Moved by
46: After Clause 104, insert the following new Clause—
“Review of court jurisdiction
Within one year of the day on which this Act is passed the Secretary of State must review the impact that transferring the jurisdiction of courts that relate to all data protection provisions to tribunals would have on—
(a) the complexity of the appeals system, and
(b) legal barriers to representation and redress.”
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, Amendment 46 seeks a review of court jurisdiction. As I said in Committee, the current system’s complexity leads to confusion regarding where to bring data protection claims—tribunals or courts? This is exacerbated by contradictory legal precedents from different levels of the judiciary, and it creates barriers for individuals seeking to enforce their rights.

Transferring jurisdiction to tribunals would simplify the process and reduce costs for individuals, and it would align with the approach for statutory appeals against public bodies, which are typically handled by tribunals. In the Killock v Information Commissioner case, Mrs Justice Farbey explicitly called for a “comprehensive strategic review” of the appeal mechanisms for data protection rights. That is effectively what we seek to do with this amendment.

In Committee, the noble Baroness, Lady Jones, raised concerns about transferring jurisdiction and introducing a new appeals regime. She argued that the tribunals lacked the capacity to handle complex data protection cases, but tribunals are, in fact, better suited to handle such matters due to their expertise and lower costs for individuals. Additionally, the volume of applications under Section 166—“Orders to progress complaints”—suggests significant demand for tribunal resolution, despite its current limitations.

The noble Baroness, Lady Jones, also expressed concern about the potential for a new appeal right to encourage “vexatious challenges”, but introducing a tribunal appeal system similar to the Freedom of Information Act could actually help filter out unfounded claims. This is because the tribunal would have the authority to scrutinise cases and potentially dismiss those deemed frivolous.

The noble Baroness, Lady Jones, emphasised the existing judicial review process as a sufficient safeguard against errors by the Information Commissioner. However, judicial review is costly and complex, presenting a significant barrier for individuals. A tribunal system would offer a much more accessible and less expensive avenue for redress.

I very much hope that, in view of the fact that this is a rather different amendment—it calls for a review—the Government will look at this. It is certainly called for by the judiciary, and I very much hope that the Government will take this on board at this stage.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for moving his amendment, which would require the Secretary of State to review the potential impact of transferring to tribunals the jurisdiction of courts that relate to all data protection provisions. As I argued in Committee, courts have a long-standing authority and expertise in resolving complex legal disputes, including data protection cases, and removing the jurisdiction of the courts could risk undermining the depth and breadth of legal oversight required in such critical areas.

That said, as the noble Baroness, Lady Jones of Whitchurch, said in Committee, we have a mixed system of jurisdiction for legal issues relating to data, and tribunals have an important role to play. So, although we agree with the intentions behind the amendment from the noble Lord, Lord Clement-Jones, we do not support the push to transfer all data protection provisions from the courts to tribunals, as we believe that there is still an important role for courts to play. Given the importance of the role of the courts in resolving complex cases, we do not feel that this review is necessary.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, before the noble Viscount sits down, I wonder whether he has actually read the amendment; it calls for a review, not for transfer. I think that his speech is a carryover from Committee.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

A review to that end, as set out by the noble Lord.

Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- View Speech - Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.

Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.

I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I thank the Minister for that dusty reply. I wonder whether he has been briefed about particular legal cases, such as Killock or Delo, where the judiciary themselves were confused about the nature of the different jurisdictions of tribunal and court. The Minister and, indeed, the noble Viscount, Lord Camrose, seemed to make speeches on the basis that all is wonderful and the jurisdiction of the courts and tribunals is so clearly defined that we do not need a review. That is not the case and, if the Minister were better briefed about the obiter, if not the judgments, in Delo and Killock, he might appreciate that there is considerable confusion about jurisdiction, as several judges have commented.

I am very disappointed by the Minister’s reply. I think that there will be several judges jumping up and down, considering that he has not really looked at the evidence. The Minister always says that he is very evidence-based. I very much hope that he will take another look at this—or, if he does not, that the MoJ will—as there is considerably greater merit in the amendment than he accords. However, I shall not press this to a vote and I beg leave to withdraw the amendment.

Amendment 46 withdrawn.
--- Later in debate ---
Earl of Erroll Portrait The Earl of Erroll (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I too support this. I well remember the passage of the Computer Misuse Act, and we were deeply unhappy about some of its provisions defining hacker tools et cetera, because they had nothing about intention. The Government simply said, “Yes, they will be committing an offence, but we will just ignore it if they are good people”. Leaving it to faceless people in some Civil Service department to decide who is good or bad, with nothing in the Bill, is not very wise. We were always deeply unhappy about it but had to go along with it because we had to have something; otherwise, we could not do anything about hacking tools being freely available. We ended up with a rather odd situation where there is no defence for being a good guy. This is a very sensible amendment to clean up an anomaly that has been sitting in our law for a long time and should probably have been cleaned up a long time ago.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I support Amendments 47 and 48, which I was delighted to see tabled by the noble Lords, Lord Holmes and Lord Arbuthnot. I have long argued for changes to the Computer Misuse Act. I pay tribute to the CyberUp campaign, which has been extremely persistent in advocating these changes.

The CMA was drafted some 35 years ago—an age ago in computer technology—when internet usage was much lower and cybersecurity practices much less developed. This makes the Act in its current form unfit for the modern digital landscape and inhibits security professionals from conducting legitimate research. I will not repeat the arguments made by the two noble Lords. I know that the Minister, because of his digital regulation review, is absolutely apprised of this issue, and if he were able to make a decision this evening, I think he would take them on board. I very much hope that he will express sympathy for the amendments, however he wishes to do so—whether by giving an undertaking to bring something back at Third Reading or by doing something in the Commons. Clearly, he knows what the problem is. This issue has been under consideration for a long time, in the bowels of the Home Office—what worse place is there to be?—so I very much hope that the Minister will extract the issue and deal with it as expeditiously as he can.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank my noble friend Lord Holmes for tabling the amendments in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as the new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so the provisions they make on access become legally relevant, and the amendment reflects this.

When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I will speak to Amendment 48B. In our view, cookie paywalls create an unfair choice for users, essentially forcing them to pay for privacy. We tabled an amendment in Committee to ban cookie paywalls, but in the meantime, as the noble Baroness, Lady Jones, heralded at the time, the Information Commissioner’s Office has provided updated guidance on the “consent or pay” model for cookie compliance. It is now available for review. This guidance clarifies how organisations can offer users a choice between accepting personalised ads for free access or paying for an ad-free experience while ensuring compliance with data protection laws. It has confirmed that the “consent or pay” model is acceptable for UK publishers, provided certain conditions are met. Key requirements for valid consent under this model include: users must have genuine free choice; the alternative to consent—that is, payment—must be reasonably priced; and users must be fully informed about their options.

The guidance is, however, contradictory. On the one hand, it says that cookie paywalls

“can be compliant with data protection law”

and that providers must document their assessments of how it is compliant with data protection law. On the other, it says that, to be compliant with data protection law, cookie paywalls must allow users to choose freely without detriment. However, users who do not wish to pay the fee to access a website will be subject to detriment, because with a cookie paywall they must pay a fee if they wish to refuse consent. This is what is described as the “power imbalance”. It is also worth noting that this guidance does not constitute legal advice; it leaves significant latitude for legal interpretation and argument as to the compatibility of cookie paywalls with data protection law.

The core argument against “consent or pay” models is that they undermine the principle of freely given consent. The ICO guidance emphasises that organisations using these models must be able to demonstrate that users have a genuine choice and are not unfairly penalised for refusing to consent to data processing for personalised advertising. Yet in practice, given the power imbalance, on almost every occasion this is not possible. This amendment seeks to ensure that individuals maintain control over their personal data. By banning cookie paywalls, users can freely choose not to consent to cookies without having to pay a fee. I very much hope that the Government will reconsider the ICO’s guidance in particular, and consider banning cookie paywalls altogether.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.

Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee as it actually seeks to curtail choice. Currently, users have the options to pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that it is the fact that so much of the internet is apparently, but not actually, free that has caused a great deal of damage, rather than having an open charging model. This approach finally reveals the exact cash value of individuals’ data that websites are harvesting and offers users choice. We do not agree with attempts to remove that choice.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, these amendments have to do with research access for online safety. Having sat on the Joint Committee on the draft Online Safety Bill back in 2021, I put on record that I am delighted that the Government have taken the issue of research access to data very seriously. It was a central plank of what we suggested and it is fantastic that they have done it.

Of the amendments in my name, Amendment 51 would simply ensure that the provisions of Clause 123 are acted on by removing the Government’s discretion as to whether they introduce regulations. It also introduces a deadline of 12 months for the Government to do so. Amendment 53 seeks to ensure that the regulators will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users, including children. Given the excitements we have already had this evening, I do not propose to press any of them, but I would like to hear from the Minister that he has heard me and that the Government will seek to enshrine the principle of different ages, different stages, different people, when he responds.

I note that the noble Lord, Lord Bethell, who has the other amendments in this group, to which I added my name, is not in his place, but I understand that he has sought—and got—reassurance on his amendments. So there is just one remaining matter on which I would like further reassurance: the scope of the legal privilege exception. A letter from the Minister on 10 January explains:

“The clause restates the existing law on legally privileged information as a reassurance that regulated services will not be asked to break the existing legislation on the disclosure of this type of data”.


It seems that the Minister has veered tantalisingly close to answering my question, but not in a manner that I can quite understand. So I would really love to understand—and I would be grateful to the Minister if he would try to explain to me—how the Government will prevent tech companies using legal privilege as a shield. Specifically, would CCing a lawyer on every email exchange, or having a lawyer in every team, allow companies to prevent legitimate scrutiny of their safety record? I have sat in Silicon Valley headquarters and each team came with its own lawyer—I would really appreciate clarity on this issue. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I can only support what the noble Baroness, Lady Kidron, had to say. This is essentially unfinished business from the Online Safety Act, which we laboured in the vineyard to deliver some time ago. These amendments aim to strengthen Clause 123 and try to make sure that this actually happens and that we do not get the outcomes of the kind that the noble Baroness has mentioned.

I, too, have read the letter from the Minister to the noble Lord, Lord Bethell. It is hedged about with a number of qualifications, so I very much hope that the Minister will cut through it and give us some very clear assurances, because I must say that I veer back and forth when I read the paragraphs. I say, “There’s a win”, and then the next paragraph kind of qualifies it, so perhaps the Minister will give us true clarity when he responds.

Earl of Erroll Portrait The Earl of Erroll (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I wanted to add something, having spent a lot of time on Part 3 of the Digital Economy Act, which after many assurances and a couple of years, the Executive decided not to implement, against the wishes of Parliament. It worries me when the Executive suddenly feel that they can do those sorts of things. I am afraid that leopards sometimes do not change their spots, and I would hate to see this happen again, so Amendment 51 immediately appeals. Parliament needs to assert its authority.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I share in the congratulations of my noble friend Lady Owen. It has taken me about 10 years to begin to understand how this House works and it has taken her about 10 minutes.

I want to pursue something which bewilders me about this set of amendments, which is the amendment tabled by the noble Baroness, Lady Gohir. I do not understand why we are talking about a different Bill in relation to audio fakes. Audio has been with us for many years, yet video deepfakes are relatively new. Why are we talking about a different Bill in relation to audio deepfakes?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, this has been a very interesting debate. I too congratulate the noble Baroness, Lady Owen, on having brought forward these very important amendments. It has been a privilege to be part of her support team and she has proved an extremely persuasive cross-party advocate, including in being able to bring out the team: the noble Baroness, Lady Kidron, the noble Lord, Lord Pannick, who has cross-examined the Minister, and the noble Lord, Lord Stevenson. There is very little to follow up on what noble Lords have said, because the Minister now knows exactly what he needs to reply to.

I was exercised by this rather vague issue of whether the elements that were required were going to come back at Third Reading or in the Commons. I did not think that the Minister was specific enough in his initial response. In his cross-examination, the noble Lord, Lord Pannick, really went through the key elements that were required, such as the no intent element, the question of reasonable excuse and how robust that was, the question of solicitation, which I know is very important in this context, and the question of whether it is really an international law matter. I have had the benefit of talking to the noble Lord, Lord Pannick, and surely the mischief is delivered and carried out here, so why is that an international law issue? There is also the question of deletion of data, which the noble Lord has explained pretty carefully, and the question of timing of knowledge of the offence having been committed.

The Minister needs to describe the stages at which those various elements are going to be contained in a government amendment. I understand that there may be a phasing, but there are a lot of assurances. As the noble Lord, Lord Stevenson, said, is it six or seven? How many assurances are we talking about? I very much hope that the Minister can see the sentiment and the importance we place on his assurances on these amendments, so I very much hope he is going to be able to give us the answers.

In conclusion, as the noble Baroness, Lady Morgan, said—and it is no bad thing to be able to wheel on a former Secretary of State at 9 o’clock in the evening—there is a clear link between gender-based violence and image-based abuse. This is something which motivates us hugely in favour of these amendments. I very much hope the Minister can give more assurance on the audio side of things as well, because we want future legislation to safeguard victims, improve prosecutions and deter potential perpetrators from committing image-based and audio-based abuse crimes.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank the Minister and my noble friend Lady Owen for bringing these amendments to your Lordships’ House. Before I speak to the substance of the amendments, I join others in paying tribute to the tenacity, commitment and skill that my noble friend Lady Owen has shown throughout her campaign to ban these awful practices. She not only has argued her case powerfully and persuasively but, as others have remarked, seems to have figured out the machinery of this House in an uncanny way. Whatever else happens, she has the full support of these Benches.

I am pleased that the Government have engaged constructively with my noble friend and are seeking to bring this back at Third Reading. The Minister has been asked some questions and we all look forward with interest to his responses. I know from the speeches that we have heard that I am not alone in this House in believing that we have an opportunity here and now to create these offences, and we should not delay. For the sake of the many people who have been, and will otherwise be, victims of the creation of sexually explicit deepfakes, I urge the Government to continue to work with my noble friend Lady Owen to get this over the line as soon as possible.

--- Later in debate ---
Lord Freyberg Portrait Lord Freyberg (CB)
- View Speech - Hansard - - - Excerpts

I support the amendment, to which I have attached my name, along with the noble Lord, Lord Bassam, and the noble Earl, Lord Clancarty. I declare my interest as a member of DACS, the Design and Artists Copyright Society, and I, too, thank the Minister for meeting us prior to this debate.

Today’s digital landscape presents unique and pressing challenges for visual artists that we can no longer ignore. A 2022 YouGov survey commissioned by DACS uncovered a revealing paradox in our digital culture. While 75% of people regularly access cultural content at least three times a week, with 63% downloading it for free, an overwhelming 72% of the same respondents actively support compensating artists for digital sharing of their work. These figures paint a stark picture of the disconnect between the public’s consumption habits and their ethical convictions about fair compensation.

The Netherlands offers a compelling blueprint for change through DACS’ partner organisation Pictoright. Its innovative private copying scheme has successfully adapted to modern consumption habits while protecting artists’ interests. Consider a common scenario in museums: visitors now routinely photograph artworks instead of purchasing traditional postcards. Under Pictoright’s system, artists receive fair compensation for these digital captures, demonstrating that we can embrace the convenience of digital access without sacrificing creators’ right to earn from their work. This proven model shows that the tension between accessibility and fair compensation is not insurmountable.

The smart fund offers a similar balanced solution for the UK. This approach would protect our cultural ecosystem while serving the interests of creators, platforms and the public alike. I hope the Government will look favourably upon this scheme.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I thank the noble Lord, Lord Bassam, for retabling his Committee amendment, which we did not manage to discuss. Sadly, it always appears to be discussed rather late in the evening, but I think that the time has come for this concept and I am glad that the Government are willing to explore it.

I will make two points. Many countries worldwide, including in the EU, have their own version of the smart fund to reward creators and performers for the private copy and use of their works and performances. Our own CMS Select Committee found that, despite the creative industries’ economic contribution—about which many noble Lords have talked—many skilled and successful professional creators are struggling to make a living from their work. The committee recommended that

“the Government work with the UK’s creative industries to introduce a statutory private copying scheme”.

This has a respectable provenance and is very much wanted by the collecting societies ALCS, BECS, Directors UK and DACS. Their letter said that the scheme could generate £250 million to £300 million a year for creatives, at no cost to the Government or to the taxpayer. What is not to like? They say that similar schemes are already in place in 45 countries globally, including most of Europe, and many of them include an additional contribution to public cultural funding. That could be totally game-changing. I very much hope that there is a fair wind behind this proposal.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the noble Lord, Lord Bassam of Brighton, for laying this amendment and introducing the debate on it.

As I understand it, a private copying levy is a surcharge on the price of digital content. The idea is that the money raised from the surcharge is either redistributed directly to rights holders to compensate them for any loss suffered because of copies made under the private copying exceptions or contributed straight to other cultural events. I recognise what the noble Lord is seeking to achieve and very much support his intent.

I have two concerns. First—it may be that I have misunderstood it; if so, I would be grateful if the noble Lord would set me straight—it sounds very much like a new tax of some kind is being raised, albeit a very small one. Secondly, those who legitimately pay for digital content end up paying twice. Does this not incentivise more illegal copying?

We all agree how vital it is for those who create products of the mind to be fairly rewarded and incentivised for doing so. We are all concerned by the erosion of copyright or IP caused by both a global internet and increasingly sophisticated AI. Perhaps I could modestly refer the noble Lord to my Amendment 75 on digital watermarking, which I suggest may be a more proportionate means of achieving the same end or at least paving the way towards it. For now, we are unable to support Amendment 57 as drafted.

--- Later in debate ---
Lord Lucas Portrait Lord Lucas (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I very much encourage the Government to go down this road. Everyone talks about the NHS just because the data is there and organised. If we establish a structure like this, there are other sources of data that we could develop to equivalent value. Education is the obvious one. What works in education? We have huge amounts of data, but we do nothing with it—both in schools and in higher education. What is happening to biodiversity? We do not presently collect the data or use it in the way we could, but if we had that, and if we took advantage of all the people who would be willing to help with that, we would end up with a hugely valuable national resource.

HMRC has a lot of information about employment and career patterns, none of which we use. We worry about what is happening and how we can improve seaside communities, but we do not collect the data which would enable us to do it. We could become a data-based society. This data needs guarding because it is not for general use—it is for our use, and this sort of structure seems a really good way of doing it. It is not just the NHS—there is a whole range of areas in which we could greatly benefit the UK.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, all our speakers have made it clear that this is a here-and-now issue. The context has been set out by noble Lords, whether it is Stargate, the AI Opportunities Action Plan or, indeed, the Palantir contract with the NHS. This has been coming down the track for some years. There are Members on the Government Benches, such as the noble Lords, Lord Mitchell and Lord Hunt of Kings Heath, who have been telling us that we need to work out a fair way of deriving a proper financial return for the benefits of public data assets, and Future Care Capital has done likewise. The noble Lord, Lord Freyberg, has form in this area as well.

The Government’s plan for the national data library and the concept of sovereign data assets raises crucial questions about how to balance the potential benefits of data sharing with the need to protect individual rights, maintain public trust and make sure that we achieve proper value for our public digital assets. I know that the Minister has a particular interest in this area, and I hope he will carry forward the work, even if this amendment does not go through.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank the noble Baroness, Lady Kidron, for moving her amendment. The amendments in this group seek to establish a new status for data held in the public interest, and to establish statutory oversight rules for a national data library. I was pleased during Committee to hear confirmation from the noble Baroness, Lady Jones of Whitchurch, that the Government are actively developing their policy on data held in the public interest and developing plans to use our data assets in a trustworthy and ethical way.

We of course agree that we need to get this policy right, and I understand the Government’s desire to continue their policy development. Given that this is an ongoing process, it would be helpful if the Government could give the House an indication of timescales. Can the Minister say when the Government will be in a position to update the House on any plans to introduce a new approach to data held in the public interest? Will the Government bring a statement to this House when plans for a national data library proceed to the next stage?

I suggest that a great deal of public concern about nationally held datasets is a result of uncertainty. The Minister was kind enough to arrange a briefing from his officials yesterday, and this emerged very strongly. There is a great deal of uncertainty about what is being proposed. What are the mechanics? What are the risks? What are the costs? What are the eventual benefits to UK plc? I urge the Minister, as and when he makes such a statement, to bring a maximum of clarity to these fundamental questions, because I suspect that many members of the public will find this deeply reassuring.

Given the stage the Government are at with these plans, we do not think it would be appropriate to legislate at this stage, but we of course reserve the right to revisit this issue in the future.

--- Later in debate ---
In his response to the last group, I sensed that the Minister let rather an important cat out of the bag. I am always looking for the Government to be doing policy development, but he used a key word in that answer which I would very much like him to use in his answer on this group. For the Government, it seems that it is not policy development that matters but “active” policy development. I look forward to his response and the debate on this group. I beg to move.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, we have had some discussion already this week on data centres. The noble Lord, Lord Holmes, is absolutely right to raise this broad issue, but I was reassured to hear from the noble Lord, Lord Hunt of Kings Heath, earlier in the week that the building of data centres, their energy requirements and the need for them may well be included in NESO’s strategic spatial energy plan and the centralised strategic network plan. Clearly, in one part of the forest there is a great deal of discussion about energy use and the energy needs of data centres. What is less clear, and in a sense reflected in the AI Opportunities Action Plan, is exactly how the Government will decide the location of these data centres, which clearly—at least on current thinking about the needs of large language models, AI and so on—will be needed. It is about where they will be and how that will be decided. If the Minister can cast any light on that, we would all be grateful.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank my noble friend Lord Holmes of Richmond for moving this amendment. Amendment 59 is an important amendment that addresses some of the key issues relating to large language models. We know that large language models have huge potential, and I agree with him that the Government should keep this under review. Perhaps the noble Baroness, Lady Jones of Whitchurch, would be willing to update the House on the Government’s policy on large language model regulation on her return.

Data centre availability is another emerging issue as we see growth in this sector. My noble friend is absolutely right to bring this to the attention of the House. We firmly agree that we will have a growing need for additional data centres. In Committee, the noble Baroness, Lady Jones, did not respond substantively to Amendments 60 and 66 from my noble friend on data centres, which I believe was—not wholly unreasonably—to speed the Committee to its conclusion just before Christmas. I hope the Minister can give the House a fuller response on this today, as it would be very helpful to hear what the Government’s plans are on the need for additional data centres.

--- Later in debate ---
Earl of Erroll Portrait The Earl of Erroll (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I spoke on this before, and I will repeat what I said previously. The only way out of this one is to have two fields against someone: one that we will call “sex” and another that we will call “gender”. I will use the terminology of the noble Lord, Lord Lucas, for this. “Sex” is what you are biologically and were born as, and that you cannot change. There are instances where we need to use that field, particularly when it comes to delivering medicine to people—knowing how to treat them medically—and, possibly, in other things such as sports. There are one or two areas where we need to know what they are biologically.

Then we have another field which is called “gender”. In society, in many cases, we wish that people did not have to go around saying that they are not what they were born but what they want to be—but I do not have a problem with that. We could use that field where society decides that people can use it, such as on passports, other documents and identity cards—all sorts of things like that. It does not matter; I am not worried about what someone wants to call themselves or how they want to present themselves to society.

Researchers will have the “sex” field, and they can carry out medical research— they can find out about all the different things related to that—and, societally, we can use the other field for how people wish to project themselves in public. That way we can play around with what you are allowed to use in what scenarios; it allows you to do both. What we need is two fields; it will solve a lot of problems.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, it is clear that Amendment 67 in the name of the noble Lord, Lord Lucas, is very much of a piece with the amendments that were debated and passed last week. On these Benches, our approach will be exactly the same. Indeed, we can rely on what the Minister said last week, when he gave a considerable assurance:

“I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important”.—[Official Report, 21/1/25; col. 1620.]


That is, the work of the Central Digital and Data Office. We are content to rely on his assurance.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank my noble friend Lord Lucas for bringing his Amendment 67, which builds on his previous work to ensure accuracy of data. On these Benches, we agree wholeheartedly with him that the information we have access to—for example, to verify documents—must be accurate. His amendment would allow the Secretary of State to make regulations establishing definitions under the Bill for the purposes of digital verification services, registers of births and deaths, and other provisions. Crucially, this would enable the Government to put measures in place to ensure the consistency of the definitions of key personal attributes, including sex. We agree that consistency and accuracy of data is vital. We supported him on the first day at Report, and, if he pushes his amendment to a Division, we will support him today.

--- Later in debate ---
Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- View Speech - Hansard - - - Excerpts

My Lords, as so often, I listened with awe to the noble Baroness. Apart from saying that I agree with her wholeheartedly, which I do, there is really no need for me to add anything, so I will not.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I too am lost in admiration for the noble Baroness, Lady Kidron—still firing on all cylinders at this time of night. Current law is clearly out of touch with the reality of computer systems. It assumes an untruth about computer reliability that has led to significant injustice. We know that that assumption has contributed to miscarriages of justice, such as the Horizon scandal.

Unlike the amendment in Committee, Amendment 68 does not address the reliability of computers themselves but focuses rather on the computer evidence presented in court. That is a crucial distinction as it seeks to establish a framework for evaluating the validity of the evidence presented, rather than questioning the inherent reliability of computers. We believe that the amendment would be a crucial step towards ensuring fairness and accuracy in legal proceedings by enabling courts to evaluate computer evidence effectively. It offers a balanced approach that would protect the interests of both the prosecution and the defence, ensuring that justice is served. The Government really must move on this.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank the noble Baroness, Lady Kidron, for her amendments. The reliability of computer-based evidence, needless to say, has come into powerful public focus following the Post Office Horizon scandal and the postmasters’ subsequent fight for justice. As the noble Baroness has said previously and indeed tonight, this goes far beyond the Horizon scandal. We accept that there is an issue with the way in which the presumption that computer evidence is reliable is applied in legal proceedings.

The Government accepted in Committee that this is an issue. While we have concerns about the way that the noble Baroness’s amendment is drafted, we hope the Minister will take the opportunity today to set out clearly the work that the Government are doing in this area. In particular, we welcome the Government’s recently opened call for evidence, and we hope Ministers will work quickly to address this issue.

--- Later in debate ---
Moved by
74: After Clause 132, insert the following new Clause—
“Retrospective application
Within one month of the day on which this Act is passed, the Secretary of State must publish a statement clarifying whether the changes enacted by its commencement will apply to controllers and processors retrospectively, or only to data first processed following its commencement.”
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I have the very dubious privilege of moving the final amendment on Report to this Bill. This is a probing amendment and the question is: what does retrospectivity mean? The noble Lord, Lord Cameron of Lochiel, asked a question of the noble Baroness, Lady Jones, in Committee in December:

“Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold?”


She replied that

“the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively”.—[Official Report, 10/12/24; cols. GC 435-437.]

But the question is not really whether the lawfulness is retrospective, but whether the changes made in the new law can be applied to any personal data previously collected and already held on the commencement date of the Act—so that is the exam question.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

It is indeed getting late. I thank the noble Lord, Lord Clement-Jones, for moving his amendment, and I really will be brief.

We do not oppose the government amendment in the name of the noble Lord, Lord Vallance. I think the Minister should be able to address the concerns raised by the noble Lord, Lord Clement-Jones, given that the noble Lord’s amendment merely seeks clarification on the retrospective application of the provisions of the Bill within a month of the coming into force of the Act. It seems that the Government could make this change unnecessary by clarifying the position today. I hope the Minister will be able to address this in his remarks.

Lord Vallance of Balham (Lab)

I will speak first to Amendment 76. I reassure noble Lords that the Government do not believe that this amendment has a material policy effect. Instead, it simply corrects the drafting of the Bill and ensures that an interpretation provision in Clause 66 commences on Royal Assent.

Amendment 74, in the name of the noble Lord, Lord Clement-Jones, would require the Secretary of State to publish a statement setting out whether any provisions in the Bill apply to controllers and processors retrospectively. Generally, provisions in Bills apply from the date of commencement unless there are strong policy or legal reasons for applying them retrospectively. The provisions in this Bill follow that general rule. For instance, data controllers will only be able to rely on the new lawful ground of recognised legitimate interests introduced by Clause 70 in respect of new processing activities in relation to personal data that take place after the date of commencement.

I recognise that noble Lords might have questions as to whether any of the Bill’s clauses can apply to personal data that is already held. That is the natural intent in some areas and, where appropriate, commencement regulations will provide further clarity. The Government intend to publish their plans for commencement on GOV.UK in due course and the ICO will also be updating its regulatory guidance in several key areas to help organisations prepare. We recognise that there can be complex lifecycles around the use of personal data and we will aim to ensure that how and when any new provisions can be relied on is made clear as part of the implementation process.

I hope that explanation goes some way to reassuring the noble Lord and that he will agree to withdraw his amendment.

Lord Clement-Jones (LD)

My Lords, I thank the Minister. There is clearly no easy answer. I think we were part-expecting a rather binary answer, but clearly there is not one, so we look forward to the guidance.

But that is a bit worrying for those who have to tackle these issues. I am thinking of the data protection officers who are going to grapple with the Bill in its new form and I suspect that that is going to be quite a task. In the meantime, I withdraw the amendment.

Amendment 74 withdrawn.