All 5 Baroness Harding of Winscombe contributions to the Data Protection and Digital Information Bill 2022-23


Wed 20th Mar 2024
Data Protection and Digital Information Bill
Grand Committee

Committee stage

Data Protection and Digital Information Bill

Baroness Harding of Winscombe Excerpts
Ultimately, information should be retained within a locked box, where it stays, and the medical researchers, who are crucial, come up with their programme, using a sandbox, that is then applied to the locked-away data. The researchers would just get the results; they would not go anywhere near the data. The outcome of the research is identical but people’s medical information —their genetic information—would be kept away, secure. We have to work to that objective. I do not quite know yet whether the Bill gets that far, but it is crucial.
Baroness Harding of Winscombe (Con)

My Lords, I, too, support the amendments in the name of the noble Lord, Lord Clement-Jones. As this is the first time I have spoken during the passage of the Bill, I should also declare my interests, but it seems that all the organisations I am involved in process data, so I refer the Committee to all the organisations in my entry in the register of interests.

I want to tell a story about the challenges of distinguishing between personal data and pseudonymised data. I apologise for bringing everyone back to the world of Covid, but that was when I realised how possible it is to track down individuals without any of their personal data. Back in November or December 2020, when the first variant of Covid, the Kent variant, was spreading, one test that was positive for the Kent variant came with no personal details at all. The individual who had conducted that test had not filled in any of the information. I was running NHS Test and Trace and we had to try to find that individual, in a very public way. In the space of three days, with literally no personal information—no name, address or sense of where they lived—the team was able to find that human being. Through extraordinary ingenuity, it tracked them down based on the type of tube the test went into—the packaging that was used—and by narrowing the geography down to a number of postcodes where the person might have been: someone who was ill and in need of help, but all of whose contacts also needed to be identified.

I learned that it was possible to find that one human being, out of a population of 60 million, within three days and without any of their personal information. I tell this story because my noble friend Lord Kamall made such an important point: at the heart of data legislation is the question of how you build trust in the population. We have to build on firm foundations if the population are to trust that there are reasons why sharing data is hugely valuable societally. To have a data Bill that does not have firm foundations in absolutely and concretely defining personal data is quite a fatal flaw.

Personal data being subjective, as the noble Lord, Lord Clement-Jones, so eloquently set out, immediately starts citizens on a journey of distrusting this world. There is so much in this world that is hard to trust, and I feel strongly that we have to begin with some very firm foundations. They will not be perfect, but we need to go back to a solid definition of “personal data”, which is why I wholeheartedly support the noble Lord’s amendments.

Lord Bassam of Brighton (Lab)

My Lords, I hesitate to make a Second Reading speech, and I know that the noble Lord, Lord Clement-Jones, cannot resist rehearsing these points. However, it is important, at the outset of Committee, to reflect on the Bill in its generality, and the noble Lord did a very good job of precisely that. This is fundamental.

The problem for us with the Bill is not just that it is a collection of subjects—of ideas about how data should be handled, managed and developed—but that it is flawed from the outset. It is a hotchpotch of things that do not really hang together. Several of us have chuntered away in the margins and suggested that it would have been better if the Bill had fallen and there had been a general election—not that the Minister can comment on that. But it would be better, in a way. We need to go back to square one, and many in the Committee are of a like mind.

The noble Baroness, Lady Harding, made a good point about data management, data control and so on. Her example was interesting, because this is about building trust, having confidence in data systems and managing data in the future. Her example was very good, as was that of the noble Lord, Lord Davies, who raised a challenge about how the anonymisation, or pseudonymisation, of data will work and how effective it will be.

We have two amendments in this group. Taken together, they are designed to probe exactly what the practical impacts will be of the proposed changes to Section 3 of the 2018 Act and the insertion of new Section 3A. Amendment 4 calls for the Secretary of State to publish an assessment of the changes within two months of the Bill passing, while Amendment 301 would ensure that the commencement of Clause 1 takes place no earlier than that two-month period. Noble Lords might think this is unduly cautious, but, given our wider concerns about the Bill and its departure from the previously well-understood—

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, in the nearly nine years that I have been in this House, I have often played the role of bag carrier to the noble Baroness, Lady Kidron, on this issue. In many ways, I am rather depressed that once again we need to make the case that children deserve a higher bar of protection than adults in the digital world. As the noble Baroness set out—I will not repeat it—the age-appropriate design code was a major landmark in establishing that you can regulate the digital world just as you can the physical world. What is more, it is rather joyful that when you do, these extraordinarily powerful tech companies change their products in the way that you want them to.

This is extremely hard-fought ground that we must not lose. It takes us to what feels like a familiar refrain from the Online Safety Act and the Digital Markets, Competition and Consumers Bill, which we are all still engaged in: the question of whether you need to write something in the Bill and whether, by doing so, you make it more clear or less clear.

Does my noble friend the Minister agree with the fundamental principle, enshrined in the Data Protection Act 2018, that children deserve a higher bar of protection in the online world and that children’s data needs to be protected at a much higher level? If we can all agree on that principle first, then the question is: how do we make sure that this Bill does not weaken the protection that children have?

I am trying to remember on which side of the “put it in the Bill or not” debate I have been during discussions on each of the digital Bills that we have all been working on over the last couple of years. We have a really vicious problem where, as I understand it, the Government keep insisting that the Bill does not water down data protection and therefore there is no need to write anything into it to protect children’s greater rights. On the other hand, I also hear that it will remove bureaucracy and save businesses a lot of money. I have certainly been in rooms over the last couple of years where business representatives have told me, not realising I was one of the original signatories to the amendment that created the age-appropriate design code, how dreadful it was because it made their lives much more complicated.

I have no doubt that if we create a sense—which is what it is—that companies do not need to do quite as much as they used to for children in this area, that sense will create, if not a wide-open door, an ajar door that enables businesses to walk through and take the path of least resistance, which is doing less to protect children. That is why, in this case, I come down on the side of wanting to put it explicitly in the Bill, in whatever wording my noble friend the Minister thinks appropriate, that we are really clear that this creates no change at all in the approach for children and children’s data.

That is what this group of amendments is about. I know that we will come back to a whole host of other areas where there is a risk that children’s data could be handled differently from the way envisaged in that hard-fought battle for the age-appropriate design code but, on this group alone, it would be helpful if my noble friend the Minister could help us establish that firm principle and commit to coming back with wording that will firmly establish it in the Bill.

Lord Clement-Jones (LD)

My Lords, I keep getting flashbacks. This one is to the Data Protection Act 2018, although I think it was 2017 when we debated it. It is one of the huge achievements of the noble Baroness, Lady Kidron, to have introduced, and persuaded the Government to introduce, the age-appropriate design code into the Act, and—as she and the noble Baroness, Lady Harding, described—to see it spread around the world and become the gold standard. It is hardly surprising that she is so passionate about wanting to make sure that the Bill does not water down the data rights of children.

I think the most powerful amendment in this group is Amendment 290. For me, it absolutely bottles what we need to do in making sure that nothing in the Bill waters down children’s rights. If I were to choose one of the noble Baroness’s amendments in this group, it would be that one: it would absolutely give the assurance and scotch the point about legal uncertainty created by the Bill.

Both noble Baronesses asked: if the Government are not watering down the Bill, why can they not say that they are not? Why can they not, in a sense, repeat the words of Paul Scully when he was debating the Bill? He said:

“We are committed to protecting children and young people online. The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.


He uses “our”, so he is taking full ownership of it. He went on:

“Any breach of our data protection laws will result in enforcement action by the Information Commissioner’s Office”.—[Official Report, Commons, 17/4/23; col. 101.]


I would love that enshrined in the Bill. It would give us a huge amount of assurance.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, not to put too fine a point on it, the Minister is saying that nothing in the Bill diminishes children’s rights, whether in Clause 1, Clause 6 or the legitimate interest in Clause 5. He is saying that absolutely nothing in the Bill diminishes children’s rights in any way. Is that his position?

Baroness Harding of Winscombe (Con)

Can I add to that question? Is my noble friend the Minister also saying that there is no risk of companies misinterpreting the Bill’s intentions and assuming that this might be some form of diminution of the protections for children?

Viscount Camrose (Con)

In answer to both questions, what I am saying is that, first, any risk of misinterpreting the Bill with respect to children’s safety is diminished, rather than increased, by the Bill. Overall, it is the Government’s belief and intention that the Bill in no way diminishes the safety or privacy of children online. Needless to say, if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data.

--- Later in debate ---
Baroness Kidron (CB)

My Lords, I speak to Amendments 8, 21, 23 and 145 in my name and thank the other noble Lords who have added their names to them. In the interests of brevity, and as the noble Lord, Lord Clement-Jones, has done some of the heavy lifting on this, I will talk first to Amendment 8.

The definition of scientific research has been expanded to include commercial and non-commercial activity, so far as it

“can reasonably be described as scientific”,

but “scientific” is not defined. As the noble Lord said, there is no public interest requirement, so a commercial company can, in reality, develop almost any kind of product on the basis that it may have a scientific purpose, even—or maybe especially—if it measures your propensity to impulse buy or other commercial things. The spectrum of scientific inquiry is almost infinite. Amendment 8 would exclude children simply by adding proposed new paragraph (e), which says that

“the data subject is not a child or could or should be known to be a child”,

so that their personal data cannot be used for scientific research purposes to which they have not given their consent.

I want to be clear that I am pro-research and understand the critical role that data plays in enabling us to understand societal challenges and innovate towards solutions. Indeed, I have signed the amendment in the name of the noble Lord, Lord Bethell, which would guarantee access to data for academic researchers working on matters of public interest. Some noble Lords may have been here last night, when the US Surgeon General, Vice Admiral Dr Murthy, who gave the Lord Speaker’s lecture, made a fierce argument in favour of independent public interest research, not knowing that such a proposal had been laid. I hope that, when we come to group 17, the Government heed his wise words.

In the meantime, Clause 3 simply embeds the inequality of arms between academics and corporates and extends it, making it much easier for commercial companies to use personal data for research while academics continue to be held to much higher ethical and professional standards. They continue to require express consent, DBS checks and complex ethical approvals. For academics, simply using personal data for research without those safeguards would be unethical, yet commercial players can rely on Clause 3 to process data without consent, in pursuit of profit. Like the noble Lord, Lord Clement-Jones, I would prefer an overall solution to this but, in its absence, this amendment would protect data from being commoditised in this way.

Amendments 21 and 23 would specifically protect children from changes to Clause 6. I have spoken on this a little already, but I would like it on the record that I am absolutely in favour of a safeguarding exemption. The additional purposes, which are compatible with but go beyond the original purpose, are not a safeguarding measure. Amendment 21 would amend the list of factors that a data controller must take into account to include the fact that children are entitled to a higher standard of protection.

Amendment 23 would not be necessary if Amendment 22 were agreed. It would commit the Secretary of State to ensuring that, when exercising their power under new Article 8A, as inserted by Clause 6(5), to add, vary or omit provisions of Annex 2, they take the 2018 Act and children’s data protection into account.

Finally, Amendment 145 proposes a code of practice on the use of children’s data in scientific research. This code would, in contrast, ensure that all researchers, commercial or in the public interest, are held to the same high standards by developing detailed guidance on the use of children’s data for research purposes. A burning question for researchers is how to properly research children’s experience, particularly regarding the harms defined by the Online Safety Act.

Proposed new subsection (1) sets out the broad headings that the ICO must cover to promote good practice. Proposed new subsection (2) confirms that the ICO must have regard to children’s rights under the UNCRC, and that they are entitled to a higher standard of protection. It would also ensure that the ICO consulted with academics, those who represent the interests of children and data scientists. There is something of a theme here: if the changes to UK GDPR did not diminish data subjects’ privacy and rights, there would be no need for amendments in this group. If there were a code for independent public research, as is so sorely needed, the substance of Amendment 145 could usefully form a part. If commercial companies can extend scientific research that has no definition, and if the Bill expands the right to further processing and the Secretary of State can unilaterally change the basis for onward processing, can the Minister explain, when he responds, how he can claim that the Bill maintains protections for children?

Baroness Harding of Winscombe (Con)

My Lords, I will be brief because I associate myself with everything that the noble Baroness, Lady Kidron, just said. This is where the rubber hits the road from our previous group. If we all believe that it is important to maintain children’s protection, I hope that my noble friend the Minister will be able to accept if not the exact wording of the children-specific amendments in this group then the direction of travel—and I hope that he will commit to coming back and working with us to make sure that we can get wording into the Bill.

I am hugely in favour of research in the private sector as well as in universities and the public sector; we should not close our minds to that at all. We need to be realistic that all the meaningful research in AI is currently happening in the private sector, so I do not want to close that door at all, but I am extremely uncomfortable with a Secretary of State having the ability to amend access to personal data for children in this context. It is entirely sensible to have a defined code of conduct for the use of children’s data in research. We have real evidence that a code of conduct setting out how to protect children’s rights and data in this space works, so I do not understand why it would not be a good idea here, where we want the research to happen but also want children’s rights to be protected at a much higher level.

It seems to me that this group is self-evidently sensible, in particular Amendments 8, 22, 23 and 145. I put my name to all of them except Amendment 22 but, the more I look at the Bill, the more uncomfortable I get with it; I wish I had put my name to Amendment 22. We have discussed Secretary of State powers in each of the digital Bills that we have looked at and we know about the power that big tech has to lobby. It is not fair on Secretaries of State in future to have this ability to amend—it is extremely dangerous. I express my support for Amendment 22.

Lord Davies of Brixton (Lab)

I just want to say that I agree with what the previous speakers have said. I particularly support Amendment 133; in effect, I have already made my speech on it. At that stage, I spoke about pseudonymised data but I focused my remarks on scientific research. Clearly, I suspect that the Minister’s assurances will not go far enough, although I do not want to pre-empt what he says and I will listen carefully to it. I am sure that we will have to return to this on Report.

I make a small additional point: I am not as content as the noble Baroness, Lady Harding of Winscombe, about commercial research. Different criteria apply; if we look in more detail at ensuring that research data is protected, there may be special factors relating to commercial research that need to be covered in a potential code of practice or more detailed regulations.

Data Protection and Digital Information Bill

Baroness Harding of Winscombe Excerpts
Once again, I am finding it hard to marry the Government’s assurance—given privately, from the Dispatch Box in the other place, and by the noble Viscount the Minister—that the Government remain fully committed to the high standards of protection they set out for children with the proposal routinely to expose them to direct marketing. The changes in UK data law proposed by Clause 5, and numerous others scattered throughout the Bill, expose the reality that the Bill is intended to reduce privacy for UK citizens and, as a knock-on, the privacy and safety protection of children. The Government have a choice: to let the House decide whether children deserve a lesser standard of protection, or to amend the Bill to maintain the current standards.
Baroness Harding of Winscombe (Con)

My Lords, I support the noble Baroness, Lady Kidron, in Amendments 13 and 15, to which I have added my name. Rather than repeat her arguments—as we are now all trying not to do—I want to build on them and point to the debate we had on the first group in Committee, when my noble friend the Minister insisted that the Government had no desire to water down the protections for children in the Bill. In Clause 5, in proposed new paragraph (7) of Article 6, the Government have felt it necessary to be explicit, in that paragraph only, that children might need extra protection. This, on its own, makes me worried that the whole Bill is reducing the protection children have, because the Government felt it necessary to insert new paragraph (7)(b). Interestingly, it refers to,

“where relevant, the need to provide children”

with additional support. But where is that not relevant?

Amendment 13 simply looks to strengthen this—to accept the premise on which the Bill is currently drafted that we need to be explicit where children deserve the right to a higher level of protection, and to get the wording right. Will my noble friend the Minister reconsider? There are two choices here: to state right at the beginning of the Bill that there is a principle that there will be no reduction in children’s right to a higher level of protection, or to do as the Bill currently does and make sure that we get the wording right at every stage as we work through.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank noble Lords who have spoken to this group. As ever, I am grateful to the Delegated Powers and Regulatory Reform Committee for the care it has taken in scrutinising the Bill. In its 10th report it made a number of recommendations addressing the Henry VIII powers in the Bill, which are reflected in a number of amendments that we have tabled.

In this group, we have Amendment 12 to Clause 5, which addresses the committee’s concerns about the new powers for the Secretary of State to amend new Annex 1 of Article 6. This sets out the grounds for treating data processing as a recognised legitimate interest. This issue was raised by the noble Lord, Lord Clement-Jones, in his introduction. The Government argue that they are starting with a limited number of grounds and that the list might need to be changed swiftly, hence the need for the Secretary of State’s power to make changes by affirmative regulations.

However, the Delegated Powers and Regulatory Reform Committee argues:

“The grounds for lawful processing of personal data go to the heart of the data protection legislation, and therefore in our view should not be capable of being changed by subordinate legislation”.


It also argues that the Government have not provided strong reasons for needing this power. It recommends that the delegated power in Clause 5(4) should be removed from the Bill, which is what our Amendment 12 seeks to do.

These concerns were echoed by the Constitution Committee, which went one stage further by arguing:

“Data protection is a matter of great importance in maintaining a relationship of trust between the state and the individual”.


It is important to maintain these fundamental individual rights. On that basis, the Constitution Committee asks us to consider whether the breadth of the Secretary of State’s powers in Clauses 5 and 6 is such that those powers should be subject to primary rather than secondary legislation.

I make this point about the seriousness of these issues as they underline the points made by other noble Lords in their amendments in this group. In particular, the noble Lord, Lord Clement-Jones, asked whether any regulations made by the Secretary of State should be the subject of the super-affirmative procedure. We will be interested to hear the Minister’s response, given the concerns raised by the Constitution Committee.

Will the Minister also explain why it was necessary to remove the balancing test, which would require organisations to show why their interest in processing data outweighs the rights of data subjects? Again, this point was made by the noble Lord, Lord Clement-Jones. It would also be helpful if the Minister could clarify whether the new powers for the Secretary of State to amend the recognised legitimate interest could have consequences for data adequacy and whether this has been checked and tested with the EU.

Finally, we also welcome a number of other amendments tabled by the noble Lord, Lord Clement-Jones, in particular those to ensure that direct marketing should be considered a legitimate interest only if there is proper consent. This was one of the themes of the noble Baroness, Lady Kidron, who made, as ever, a very powerful case for ensuring that children specifically should not be subject to direct marketing as routine and that there should be clear consent.

The noble Baronesses, Lady Kidron and Lady Harding, have once again, quite rightly, brought us back to the Bill needing to state explicitly that children’s rights are not being watered down by it, otherwise we will come back to this again and again in all the clauses. The noble Baroness, Lady Kidron, said that this will be decided on the Floor of the House, or the Minister could give in now and come back with some government amendments. I heartily recommend to the Minister that he considers doing that because it might save us some time. I look forward to the Minister’s response on that and on the Delegated Powers and Regulatory Reform Committee’s recommendations about removing the Secretary of State’s right to amend the legitimate interest test.

--- Later in debate ---
Baroness Bennett of Manor Castle (GP)

My Lords, I follow the noble Baroness, Lady Jones of Whitchurch, with pleasure, as I agree with everything that she just said. I apologise for having failed to notice this in time to attach my name; I certainly would have done, if I had had the chance.

As the noble Baroness said, we are in an area of great concern for the level of democracy that we already have in our country. Downgrading it further is the last thing that we should be looking at doing. Last week, I was in the Chamber looking at the statutory instrument that saw a massive increase in the spending limits for the London mayoral and assembly elections and other mayoral elections—six weeks before they are held. This is a chance to spend an enormous amount of money; in reality, it is the chance for one party that has the money from donations from interesting and dubious sources, such as the £10 million, to bombard voters in clearly deeply dubious and concerning ways.

We see a great deal of concern about issues such as deepfakes, what might happen in the next general election, malicious actors and foreign actors potentially interfering in our elections. We have to make sure, however, that the main actors conduct elections fairly on the ground. As the noble Baroness, Lady Jones, just set out, this potentially drives a coach and horses through that. As she said, these clauses did not get proper scrutiny in the Commons—as much as that ever happens. As I understand it, there is the potential for us to remove them entirely later, but I should like to ask the Minister some direct questions, to understand what the Government’s intentions are and how they understand the meaning of the clauses.

Perhaps no one would have any problems with these clauses if they were for campaigns to encourage people to register to vote, given that we do not have automatic voter registration, as so many other countries do. Would that be covered by these clauses? If someone were conducting a “get out the vote” campaign in a non-partisan way, simply saying, “Please go out and vote. The election is on this day. You will need to bring along your voter ID”, would it be covered by these clauses? What about an NGO campaigning to stop a proposed new nuclear power station, or a group campaigning for stronger regulations on pesticides or for the Government to take stronger action against ultra-processed food? How do those kinds of politics fit with Clauses 114 and 115? As they are currently written, I am not sure that it is clear what is covered.

There is cause for deep concern, because no justification has been made for these two clauses. I look forward to hearing the Minister’s responses.

Baroness Harding of Winscombe (Con)

My Lords, this weekend, as I was preparing for the amendments to which I have put my name, I made the huge mistake of looking at the other amendments being discussed. As a result, I had a look at this group. I probably should declare an interest as the wife of a Conservative MP; therefore, our household is directly affected by this amendment and these clause stand part notices. I wholeheartedly agree with everything said by the noble Baronesses, Lady Jones and Lady Bennett of Manor Castle.

I have two additional points to make, because I am horrified by these clauses. First, did I miss something, in that we are now defining an adult as being 14-plus? At what point did that happen? I thought that you had the right to vote at 18, so I do not understand why electoral direct marketing should be free to bombard our 14-year-olds. That was my first additional point.

Secondly, I come back to what I said on the first day of Committee: this is all about trust. I really worry that Clauses 114 and 115 risk undermining two important areas where trust really matters. The first is our electoral system and the second is the data that we give our elected representatives, when we go to them not as party representatives but as our representatives elected to help us.

--- Later in debate ---
Baroness Bennett of Manor Castle (GP)

Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people of 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.

Baroness Harding of Winscombe (Con)

I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.

Viscount Camrose (Con)

A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to being necessary and proportionate, we can think about to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—

--- Later in debate ---
Moved by
27: Clause 11, page 23, line 10, leave out “to the extent that” and insert “when any one or more of the following is true”
Member’s explanatory statement
This amendment would clarify that only one condition under paragraph 5 must be present for paragraphs 1 to 4 to not apply.
--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, in moving Amendment 27 in my name, I will also express my support for Amendments 28 to 34. I thank my noble friend Lord Black, the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for supporting and signing a number of these amendments.

This is quite a specific issue compared to the matters of high policy that we have been debating this afternoon. There is a specific threat to the continuing ability of companies to use the open electoral register for marketing purposes without undue burdens. Some 37% of registered voters choose not to opt out of their data being used for direct marketing via the open electoral register, so quite a significant proportion of the population openly agrees that that data can be used for direct marketing. It is an essential resource for accurate postal addresses and for organisations such as CACI—I suspect that a number of us speaking have been briefed by it; I thank it for its briefing—and it has been used for more than 40 years without detriment to consumers and with citizens’ full knowledge. The very fact that 63% of people on the electoral register have opted out tells you that this is a conscious choice that people have knowingly made.

Why is it in doubt? A recent First-tier Tribunal ruling stated, by implication, that every company using open electoral register data must, by 20 May 2024, notify individuals at their postal addresses whenever their data on the electoral register is used, and that cost cannot be considered “disproportionate effort”. That means that organisations using the electoral roll would need to contact 24.2 million individuals between now and the middle of May, making it completely unviable, practically and financially, to use the electoral register at scale.

This group of amendments to Clause 11 aims to address this issue. I fully acknowledge that we have tried to hit the target with a number of shots in this group, and I encourage the Minister, first, to acknowledge that he recognises that this is a real problem that the Bill should be able to address and, secondly, if the wording in individual amendments is not effective or has some unintended consequences that we have missed, I encourage him to respond appropriately.

To be clear, the amendments provide legal certainty about the use of the open electoral register without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained from them directly if that data was obtained from the open electoral register. They provide further clarification of what constitutes “disproportionate effort” under new paragraph (e) in Article 14(5) of the GDPR. These additional criteria include the effort and cost of compliance, the damage and distress caused to the data subjects and the reasonable expectation of the data subjects, which the percentage of people not opting out shows.

Why is this a problem that we need to fix? First, if we do not fix this, we might create in the physical world the very problem that parts of the Bill are trying to address in the digital world: the bombarding of people with lots of information that they do not want to receive, lots of letters telling us that a company is using the electoral roll that we gave it permission to use in the first place. It will also inadvertently give more power to social media companies for targeting because it will make physical direct marketing much harder to target, so SMEs will be forced into a pretty oligopolistic market for social media targeting. Finally, it will mean that we lose jobs and reduce productivity at a time when we are trying to do the opposite.

This is quite a simple issue and there is cross-party support. It is not an issue of great philosophical import, but for the companies in this space, it is very real, and for the people working in this industry, it is about their jobs. Inch by inch, we need to look at things that improve productivity rather than actively destroy it, even when people have agreed to it. With that, I note the hour and I beg to move.

Lord Black of Brentwood (Con)

My Lords, I support Amendments 27 to 34, tabled variously by my noble friend Lady Harding, and the noble Lord, Lord Clement-Jones, to which I have added my name. As this is the first time I have spoken in Committee, I declare my interests as deputy chairman of the Telegraph Media Group and president of the Institute of Promotional Marketing and note my other declarations in the register.

The direct marketing industry is right at the heart of the data-driven economy, which is crucial not just to the future of the media and communications industries but to the whole basis of the creative economy, which will power economic growth into the future. The industry has quite rightly welcomed the Bill, which provides a long-term framework for economic growth as well as protecting customers.

However, there is one area of great significance, as my noble friend Lady Harding has just eloquently set out, on which this Bill needs to provide clarity and certainty going forward, namely, the use of the open electoral register. That register is an essential resource for a huge number of businesses and brands, as well as many public services, as they try to build new audiences. As we have heard, it is now in doubt because of a recent legal ruling that could, as my noble friend said, lead to people being bombarded with letters telling them that their data on the OER has been used. That is wholly disproportionate and is not in the interests of the marketing and communications industry or customers.

These sensible amendments would simply confirm the status quo that has worked well for so long. They address the issue by providing legal certainty around the use of the OER. I believe they do so in a proportionate manner that does not in any way compromise any aspect of the data privacy of UK citizens. I urge the Minister carefully to consider these amendments. As my noble friend said, there are considerable consequences of not acting for the creative economy, jobs in direct marketing, consumers, the environment and small businesses.

--- Later in debate ---
Viscount Camrose (Con)

I thank my noble friend Lady Harding for moving this important amendment. I also thank the cosignatories—the noble Lords, Lord Clement-Jones and Lord Black, and the noble Baroness, Lady Jones. As per my noble friend’s request, I acknowledge the importance of this measure and the difficulty of judging it quite right. It is a difficult balance and I will do my best to provide some reassurance, but I welcomed hearing the wise words of all those who spoke.

I turn first to the clarifying Amendments 27 and 32. I reassure my noble friend Lady Harding that, in my view, neither is necessary. Clause 11 amends the drafting of the list of cases when the exemption under Article 14(5) applies but the list closes with “or”, which makes it clear that you need to meet only one of the criteria listed in paragraph (5) to be exempt from the transparency requirements.

I turn now to Amendments 28 to 34, which collectively aim to expand the grounds of disproportionate effort to exempt controllers from providing certain information to individuals. The Government support the use of public data sources, such as the OER, which may be helpful for innovation and may have economic benefits. Sometimes, providing this information is simply not possible or is disproportionate. Existing exemptions apply when the data subject already has the information or in cases where personal data has been obtained from someone other than the data subject and it would be impossible to provide the information or disproportionate effort would be required to do so.

We must strike the right balance between supporting the use of these datasets and ensuring transparency for data subjects. We also want to be careful about protecting the integrity of the electoral register, open or closed, to ensure that it is used within the data subject’s reasonable expectations. The exemptions that apply when the data subject already has the information or when there would be a disproportionate effort in providing the information must be assessed on a case-by-case basis, particularly if personal data from public registers is to be combined with other sources of personal data to build a profile for direct marketing.

These amendments may infringe on transparency—a key principle in the data protection framework. The right to receive information about what is happening to your data is important for exercising other rights, such as the right to object. This could be seen as going beyond what individuals might expect to happen to their data.

The Government are not currently convinced that these amendments would be sufficient to prevent negative consequences to data subject rights and confidence in the open electoral register and other public registers, given the combination of data from various sources to build a profile—that was the subject of the tribunal case being referenced. Furthermore, the Government’s view is that there is no need to amend Article 14(6) explicitly to include the “reasonable expectation of the data subjects” as the drafting already includes reference to “appropriate safeguards”. This, in conjunction with the fairness principle, means that data controllers are already required to take this into account when applying the disproportionate effort exemption.

The above notwithstanding, the Government understand that the ICO may explore this question as part of its work on guidance in the future. That seems a better way of addressing this issue in the first instance, ensuring the right balance between the use of the open electoral register and the rights of data subjects. We will continue to work closely with the relevant stakeholders involved and monitor the situation.

Baroness Harding of Winscombe (Con)

I wonder whether I heard my noble friend correctly. He said “may”, “could” and “not currently convinced” several times, but, for the companies concerned, there is a very real, near and present deadline. How is my noble friend the Minister suggesting that deadline should be considered?

Viscount Camrose (Con)

On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.

Viscount Camrose (Con)

Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.

On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.

Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.

The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency, so that even if data controllers rely on this exemption they should consider other ways to make the processing they undertake as fair and transparent as possible.

Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.

Baroness Harding of Winscombe (Con)

My Lords, I am not quite sure that I understand where my noble friend the Minister is on this issue. The noble Lord, Lord Clement-Jones, summed it up well in his recent intervention. I will try to take at face value my noble friend’s assurances that he is happy to continue to engage with us on these issues, but I worry that he sees this as two sides of an issue—I hear from him that there may be some issues and there could be some problems—whereas we on all sides of the Committee have set out a clear black and white problem. I do not think they are the same thing.

I appreciate that the wording might create some unintended consequences, but I have not really understood what my noble friend’s real concerns are, so we will need to come back to this on Report. If anything, this debate has made it even clearer to me that it is worth pushing for clarity on this. I look forward to ongoing discussions with a cross-section of noble Lords, my noble friend and the ICO to see if we can find a way through to resolve the very real issues that we have identified today. With that, and with thanks to all who have spoken in this debate, I beg leave to withdraw my amendment.

Amendment 27 withdrawn.

Data Protection and Digital Information Bill

Baroness Harding of Winscombe Excerpts
Irrespective of that wider point, I trust that the Minister will at least agree with me that having regard to something is quite different from ensuring something. There is a difference between a vague notion that all is changed but nothing diminished and the certainty demanded by the children’s amendments. I ask the Minister to be certain when he replies to address the question of whether “having regard” is the same bar as “ensuring no diminution of standards”.
Baroness Harding of Winscombe (Con)

My Lords, as is so often the case on these issues, it is daunting to follow the noble Baroness as she has addressed the issues so comprehensively. I speak in support of Amendment 57, to which I have added my name, and register my support for my noble friend Lord Holmes’s Amendment 59A, but I will begin by talking about the Clause 14 stand part notice.

Unfortunately, I was not able to stay for the end of our previous Committee session so I missed the last group on automated decision-making; I apologise if I cover ground that the Committee has already covered. It is important to start by saying clearly that I am in favour of automated decision-making and the benefits that it will bring to society in the round. I see from all the nodding heads that we are all in the same place—interestingly, my Whip is shaking his head. We are trying to make sure that automated decision-making is a force for good, while recognising that anything involving human beings—and even automated decision-making involves them, because human beings create it—has the potential for harm as well. Creating the right guard-rails is really important.

Like the noble Baroness, Lady Kidron, until I understood the Bill a bit better, I mistakenly thought that the Government’s position was not to regulate AI. But that is exactly what we are doing in the Bill, in the sense that we are loosening regulation and the ability to make use of automated decision-making. While that may be the right answer, I do not think we have thought about it in enough depth or scrutinised it in enough detail. There are so few of us here; I do not think we quite realise the scale of the impact of this Bill and this clause.

I too feel that the clause should be removed from the Bill—not because it might not ultimately be the right answer but because this is something that society needs to debate fully and comprehensively, rather than it sneaking into a Bill that not enough people, either in this House or the other place, have really scrutinised.

I assume I am going to lose that argument, so I will briefly talk about Amendment 57. Even if the Government remain firm that there is “nothing to see here” in Clause 14, we know that automated decision-making can do irreparable harm to children. Any of us who has worked on child internet safety—most of us have worked on it for at least a decade—regret that we failed to get in greater protections earlier. We know of the harm done to children because there have not been the right guard-rails in the digital world. We must have debated together for hours and hours why the harms in the algorithms of social media were not expressly set out in the Online Safety Act. This is the same debate.

It is really clear to me that it should not be possible to amend the use of automated decision-making to in any way reduce protections for children. Those protections have been hard fought and ensure a higher bar for children’s data. This is a classic example of where the Bill reduces that, unless we are absolutely explicit. If we are unable to persuade the Government to remove Clause 14, it is essential that the Bill is explicit that the Secretary of State does not have the power to reduce data protection for children.

Lord Kamall (Con)

My Lords, I speak in favour of the clause stand part notice in my name and that of the noble Lord, Lord Clement-Jones.

--- Later in debate ---
Viscount Camrose (Con)

I reject the characterisation of Clause 14 or any part of the Bill as loosening the safeguards. It focuses on the outcomes and by being less prescriptive and more adaptive, its goal is to heighten the levels of safety of AI, whether through privacy or anything else. That is the purpose.

On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—

Baroness Harding of Winscombe (Con)

I may be tired or just not very smart, but I am not really sure that I understand how being less prescriptive and more adaptive can heighten safeguards. Can my noble friend the Minister elaborate a little more and perhaps give us an example of how that can be the case?

Viscount Camrose (Con)

Certainly. Being prescriptive and applying one-size-fits-all measures for all processes covered by the Bill encourages organisations to follow a process, but focusing on outcomes encourages organisations to take better ownership of the outcomes and pursue the optimal privacy and safety mechanisms for those organisations. That is guidance that came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.

--- Later in debate ---
Lord Kamall (Con)

My Lords, I apologise for not being here on Monday, when I wanted to speak about automated decision-making. I was not sure which group to speak on today; I am thankful that my noble friend Lord Harlech intervened to ensure that I spoke on this group and made my choice much easier.

I want to speak on Amendments 74 to 77 because transparency is essential. However, one of the challenges about transparency is to ensure you understand what you are reading. I will give noble Lords a quick example: when I was in the Department of Health and Social Care, we had a scheme called the voluntary pricing mechanism for medicines. Companies would ask whether that could be changed and there could be a different relationship because they felt that they were not getting enough value from it. I said to the responsible person in the department, “I did engineering and maths, so can you send me a copy of algorithm?” He sent it to me, and it was 100 pages long. I said, “Does anyone understand this algorithm?”, and he said, “Oh yes, the analysts do”. I was about to get a meeting, but then I was moved to another department. That shows that even if we ask for transparency, we have to make sure that we understand what we are being given. As the noble Lord, Lord Clement-Jones, has worded this, we have to make sure that we understand the functionality and what it does at a high enough level.

My noble friend Lady Harding often illustrates her points well with short stories. I am going to do that briefly with two very short stories. I promise to keep well within the time limit.

A few years ago, I was on my way to fly to Strasbourg because I was a Member of the European Parliament. My train got stuck, and I missed my flight. My staff booked me a new ticket and sent me the boarding pass. I got to the airport, which was fantastic, and got through the gate and was waiting for my flight in a waiting area. They called to start boarding and, when I went to go on, they scanned my pass again and I was denied boarding. I asked why I was denied, having been let into the gate area in the first place, but no one could explain why. To cut a long story short, over two hours, four or five people from that company gaslighted me. Eventually, when I got back to the check-in desk, which the technology was supposed to avoid in the first place, it was explained that they had sent me an email the day before. In fact, they had not sent me an email the day before, which they admitted the day after, but no one ever explained why I was not allowed on that flight.

Imagine that in the public sector. I can accept it, although it was awful behaviour by that company, but imagine that happening for a critical operation that had been automated to cut down on paperwork. Imagine turning up for your operation when you are supposed to scan your barcode to be let into the operating theatre. What happens if there is no accountability or transparency in that case? This is why the amendments tabled by the noble Lord, Lord Clement-Jones, are essential.

Here is another quick story. A few years ago, someone asked me whether I was going to apply for an account with one of these new fintech banks. I submitted the application and the bank said that it would get back to me within 48 hours. It did not. Two weeks later, I got a message on the app saying that I had been rejected, that I would not be given an account and that “by law, we do not have to explain why”.

Can you imagine that same technology being used in the public sector, with a WYSIWYG on the fantastic NHS app that we have now? Imagine booking an appointment then suddenly getting a message back saying, “Your appointment has been denied but we do not have to explain why”. These Amendments 74 to 78 must be given due consideration by the Government because it is absolutely essential that citizens have full transparency on decisions made through automated decision-making. We should not allow the sort of technology that was used by easyJet and Monzo in this case to permeate the public sector. We need more transparency—it is absolutely essential—which is why I support the amendments in the name of the noble Lord, Lord Clement-Jones.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, I associate myself with the comments that my noble friend Lord Kamall just made. I have nothing to add on those amendments, as he eloquently set out why they are so important.

In the spirit of transparency, my intervention enables me to point out, were there any doubt, who I am, as opposed to the noble Baroness, Lady Bennett, who was not here earlier but for whom I was mistaken. Obviously, we are not graced with the presence of my noble friend Lord Maude, but I am sure that we all know what he looks like as well.

I will speak to two amendments. The first is Amendment 144, to which I have added my name. As usual, the noble Baroness, Lady Kidron, has said almost everything that can be said on this but I want to amplify two things. I have yet to meet a politician who does not get excited about the two-letter acronym that is AI. The favoured statement is that it is as big a change in the world as the discovery of electricity or the invention of the wheel. If it is that big—pretty much everyone in the world who has looked at it probably thinks it is—we need properly to think about the pluses and the minuses of the applications of AI for children.

The noble Baroness, Lady Kidron, set out really clearly why children are different. I do not want to repeat that, but children are different and need different protections; this has been established in the physical world for a very long time. With this new technology that is so much bigger than the advent of electricity and the creation of the first automated factories, it is self-evident that we need to set out how to protect children in that world. The question then is: do we need a separate code of practice on children and AI? Or, as the noble Baroness set out, is this an opportunity for my noble friend the Minister to confirm that we should write into this Bill, with clarity, an updated age-appropriate design code that recognises the existence of AI and all that it could bring? I am indifferent between those two options, but I feel strongly that, as we have now said on multiple groups, we cannot just rely on the wording of a previous Act, which this Bill aims to update, without recognising that, at the same time, we need to update what an age-appropriate design code looks like in the age of AI.

The second amendment that I speak to is Amendment 252, on the open address file. I will not bore noble Lords with my endless stories about the use of the address file during Covid, but I lived through and experienced the challenges of this. I highlight an important phrase in the amendment. Proposed new subsection (1) says:

“The Secretary of State must regularly publish a list of UK addresses as open data to an approved data standard”.


One reason why it is a problem for this address data to be held by an independent private company is that the quality of the data is not good enough. That is a real problem if you are trying to deliver a national service, whether in the public sector or the private sector. If the data quality is not good enough, it leaves us substantially poorer as a country. This is a fundamental asset for the country and a fundamental building block of our geolocation data, as the noble Lord, Lord Clement-Jones, set out. Anybody who has tried to build a service that delivers things to human beings in the physical world knows that errors in the database can cause huge problems. It might not feel like a huge problem if it concerns your latest Amazon delivery but, if it concerns the urgent dispatch of an ambulance, it is life and death. Maintaining the accuracy of the data and holding it close as a national asset is therefore hugely important, which is why I lend my support to this amendment.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, the noble Lord, Lord Clement-Jones, has, as ever, ably introduced his Amendments 74, 75, 76, 77 and 78, to the first of which we on the Labour Benches have added our name. We broadly support all the amendments, but in particular Amendment 74. We also support Amendment 144, which was tabled by the noble Baroness, Lady Kidron, and co-signed by the noble Baroness, Lady Harding, the noble Lord, Lord Clement-Jones, and my noble friend Lady Jones.

Amendments 74 to 78 cover the use of the Government’s Algorithmic Transparency Recording Standard—ATRS. We heard a fair bit about this in Committee on Monday, when the Minister prayed it in aid during debates on Clause 14 and Article 22A. The noble Lord, Lord Clement-Jones, outlined its valuable work, which I think everyone in the Committee wants to encourage and see writ large. These amendments seek to aid the transparency that the Minister referred to by requiring public bodies to publish reports where algorithmic tools have a significant influence on the decision-making process. The amendments also seek to oblige the Secretary of State to ensure that public bodies, government departments and contractors using public data have a compulsory transparency reporting scheme in place. Finally, the amendments legislate for impact assessments and would root ADM processes in public services in practices that minimise harm and are fair and non-discriminatory in their effect.

The noble Lord, Lord Kamall, made some valuable points about the importance of transparency. His two stories were very telling. It is only right that we have that transparency for the public service and in privately provided services. I think the Minister would be well advised to listen to him.

The noble Lord, Lord Clement-Jones, also alighted on the need for government departments to publish reports under the ATRS in line with their position as set out in the AI regulation White Paper consultation process and response. This would put it on a legislative basis, and I think that is fairly argued. The amendments would in effect create a statutory framework for transparency in the public service use of algorithmic tools.

We see these amendments as forming part of the architecture needed to begin building a place of trust around the increased use of ADM and the introduction of AI into public services. Like the Government and everyone in this Committee, we see all the advantages, but take the view that we need to take the public with us on this journey. If we do not do that, we act at our peril. Transparency, openness and accountability are key to securing trust in what will be something of a revolution in how public services are delivered and procured in the future.

We also support Amendment 144 in the name of the noble Baroness, Lady Kidron, for the very simple reason that, in the development of AI technology, higher standards should apply where the technology affects the interests of children, and those higher standards should be hardwired into practice and procedure. This has been a constant theme in our Committee deliberations and our approach to child protection. In her earlier speech, the noble Baroness, Lady Harding, passionately argued for the need to get this right. We have been found wanting over the past decade in that regard, and now is the moment to put that right and begin to make progress in this policy area.

The noble Baroness, Lady Kidron, has made the argument for higher standards of protection for children persuasively during all our deliberations, and a code of practice makes good sense. As the noble Baroness, Lady Harding, said, it can be either stand-alone or integrated. In the end, that matters little, but having it there to set the standard is critical to getting this policy area to the right place. The amendment sets out with admirable clarity the detail that the commissioner must cover, so that data processors always prioritise children’s interests and fundamental rights in their thinking. I am sure that is something that is broadly supported by the whole Committee.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I will speak to almost all the amendments in this group, other than those proposed by the noble Baroness, Lady Kidron. I am afraid that this is a huge group; we probably should have split it to have a better debate, but that is history.

I very much support what the noble Baroness said about her amendments, particularly Amendment 79. The mandation of ethics by design is absolutely crucial. There are standards from organisations such as the IEEE for exactly that kind of ethics by design in AI systems. I believe that it is possible to do exactly what she suggested, and we should incorporate it into the Bill. It illustrates that process is as important as outcomes. We are getting to a kind of philosophical point here, which shows the difference between how some of us and the Government approach these things. How you do something, the way you design it and the fact that it needs to be ethical are absolutely cardinal in any discussion—particularly one about artificial intelligence. I do not think that it is good enough simply to talk about the results of what AI does without examining how it does it.

Having said that, I turn to Amendment 80 and the Clause 16 stand part notice. Under Clause 16, the Government are proposing to remove Article 27 of the UK GDPR without any replacement. By removing the legal requirement on non-UK companies to retain a UK representative, the Government would deprive individuals of a local, accessible point of contact through which people can make data protection rights requests. That decision threatens people’s capacity to exercise their rights, reducing their ability to remain in control of their personal information.

The Government say that removing Article 27 will boost trade with the UK by reducing the compliance burden on non-UK businesses. But they have produced little evidence to support the notion that this will be the case and have overlooked the benefits in operational efficiency and cost savings that the representative can bring to non-UK companies. Even more worryingly, the Government appear to have made no assessment of the impact of the change on UK individuals, in particular vulnerable groups such as children. It is an ill-considered policy decision that would see the UK take a backward step in regulation at a time when numerous other jurisdictions, such as Switzerland, Turkey, South Korea, China and Thailand, are choosing to safeguard the extraterritorial application of their data protection regimes through the implementation of the legal requirement to appoint a representative.

The UK representative ensures that anyone in the UK wishing to make a privacy-related request has a local, accessible point of contact through which to do so. The representative plays a critical role in helping people to access non-UK companies and hold them accountable for the processing of their data. The representative further provides a direct link between the ICO and non-UK companies to enable the ICO to enforce the UK data protection regime against organisations outside the UK.

On the trade issue, the Government argue that by eliminating the cost of retaining a UK representative, non-UK companies will be more inclined to offer goods and services to individuals in the UK. Although there is undeniably a cost to non-UK companies of retaining a representative, the costs are significantly lower than the rather disproportionately inflated figures that were cited in the original impact assessment, which in some cases were up to 10 times the average market rate for representative services. The Government have put forward very little evidence to support the notion that removing Article 27 will boost trade with the UK.

There is an alternative approach. Currently, the Article 27 requirement to appoint a UK representative applies to data controllers and processors. An alternative approach to the removal of Article 27 in its entirety would be to retain the requirement but limit its scope so that it applies only to controllers. Along with the existing exemption at Article 27(2), this would reduce the number of non-UK companies required to appoint a representative, while arguably still preserving a local point of contact through which individuals in the UK can exercise their rights, as it is data controllers that are obliged under Articles 15 to 22 of the UK GDPR to respond to data subject access requests. That is a middle way that the Government could adopt.

Moving to Amendment 82, at present, the roles of senior responsible individual in the Bill and data protection officer under the EU GDPR appear to be incompatible. That is because the SRI is part of the organisation’s senior management, whereas a DPO must be independent of an organisation’s senior management. This puts organisations caught by both the EU GDPR and the UK GDPR in an impossible situation. At the very least, the Government must explain how they consider that these organisations can comply with both regimes in respect of the SRI and DPO provisions.

The idea of getting rid of the DPO runs completely contrary to the way in which we need to think about accountability for AI systems. We need senior management who understand the corporate significance of the AI systems they are adopting within the business. The ideal way forward would be for the DPO to be responsible for that when AI regulation comes in, but the Government seem to be completely oblivious to that. Again, it is highly frustrating for those of us who thought we had a pretty decent data protection regime to find this kind of watering down taking place in the face of the risks from artificial intelligence that are becoming more and more apparent as the days go by. I firmly believe that it will inhibit the application and adoption of AI within businesses if we do not have public trust and business certainty.

I now come to oppose the question that Clause 18, on the duty to keep records, stand part of the Bill. This clause masquerades as an attempt to get rid of red tape. In reality, it makes organisations less likely to comply with the main obligations in the UK GDPR, as it will be amended by the Bill, and therefore heightens the risk both to the data subjects whose data they hold and to the organisations themselves in terms of non-compliance. It is particularly unfair on small businesses that do not have the resources to take advice on these matters. Records of processing activities are one of the main ways in which organisations can meet the requirement of Article 5(2) of the UK GDPR to demonstrate their compliance. The obligation to demonstrate compliance remains unaltered under the Bill. Therefore, dispensing with the main means of achieving compliance with Article 5(2) is impractical and unhelpful.

At this point, I should say that we support Amendment 81 in the name of the noble Baroness, Lady Jones, which concerns the assessment of high-risk processing.

Our amendments on data protection impact assessments are Amendments 87, 88 and 89. Such assessments are currently required under Article 35 of the UK GDPR and are essential to ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to unlawful, rights-violating or discriminatory outcomes. The Government’s data consultation response noted:

“The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments”.


However, under Clause 20, the requirement to perform an impact assessment would be seriously diluted. That is really all I need to say. The Government frequently pray the consultation in aid—they say, “Well, we did that because of the consultation”—so why are they flying in the face of it here? That seems an extraordinary thing to do when impact assessments are regarded as a useful tool and businesses have, through training, clearly adjusted to them in the years since the Data Protection Act 2018.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, I rise to speak in support of Amendments 79, 83, 85, 86, 93, 96, 97, 105 and 107, to which I have added my name. An awful lot has already been said. Given the hour of the day, I will try to be brief, but I want to speak both to the child amendments to which I have put my name and to the non-child ones, and to raise things up a level.

The noble Lord, Lord Clement-Jones, talked about trust. I have spent the best part of the past 15 years running digitally enabled consumer and citizen services. The benefit that technology brings to life is clear to me but—this is a really important “but”—our customers and citizens need to trust what we do with their data, so establishing trust is really important.

One of the bedrocks of that trust is forcing—as a non-technologist, I use that word advisedly—technologists to set out what they are trying to do, what the technology they propose to build will do, and what the risks and opportunities of that technology are. My experience as a non-engineer is that, when you put engineers under pressure, they can speak English, but it is not their preferred language. They do not find it easy to articulate the risks and opportunities of the technology they are building, which is why forcing businesses that build these services to set out in advance the data protection impacts of the services they are building is so important. It is also why you have to design with safety in mind up front, because technology is so hard to retrofit. If you do not design it up front with ethics and safety at its core, that opportunity is gone by the time you see the impact in the real world.

Data Protection and Digital Information Bill

Baroness Harding of Winscombe Excerpts
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I will speak to Amendments 142, 143 and 150 in my name, and I thank other noble Lords for their support.

We have spent considerable time across the digital Bills—the online safety, digital markets and data Bills—talking about the speed at which industry moves and the corresponding need for a more agile regulatory system. Sadly, we have not really got to the root of what that might look like. In the meantime, we have to make sure that regulators and Governments are asked to fulfil their duties in a timely manner.

Amendment 142 puts a timeframe of 18 months on the creation of codes under the Act. Data protection is a mature area of regulatory oversight, and 18 months is a long time for people to wait for the benefits that accrue to them under legislation. Similarly, Amendment 143 ensures that the transition period from a code being set to it being implemented is no more than 12 months. Together, that still allows up to two and a half years. In future legislation on digital matters, I would like to see a very different approach that starts with the outcome and gives companies 12 months to comply, in any way they like, to ensure that outcome. But while we remain in the world of statutory code creation, it must be bound by a timeframe.

I have seen time and again that, after the passage of a Bill, Parliament and civil society move on—including Ministers, key officials and those who work at the regulator—and codes lose their champions. It would be wonderful to imagine that matters progress as intended, but they do not. In the absence of champions, and without ongoing parliamentary scrutiny, codes can languish in the inboxes of people who have many calls on their time. Amendments 142 and 143 simply mirror what the Government agreed to in the OSA—it is a piece of good housekeeping to ensure continuity of attention.

I am conscious that I have spent most of my time highlighting areas where the Bill falls short, so I will take a moment to welcome the reporting provisions that the Government have put forward. Transparency is a critical aspect of effective oversight, and the introduction of an annual report on regulatory action would be a valuable source of information for all stakeholders with an interest in understanding the work of the ICO and its impact.

Amendment 150 proposes that those reporting obligations also include a requirement to provide details of all activities carried out by the Information Commissioner to support, strengthen and uphold the age-appropriate design code. It also proposes that, when meeting its general reporting obligations, the ICO should provide the information separately for children. The ICO published a one-off evaluation of the AADC in March 2023 and its code strategy on 3 April this year. I recognise the effort that the commissioner has made towards transparency, and the timing of his report indicates that reporting on children specifically is something that the ICO sees as relevant and useful. However, neither of those is sufficient in terms of the level of detail provided, the reporting cadence or the focus on impact rather than on the efforts that the ICO has made.

There are many frustrations for those of us who spend our time advocating for children’s privacy and safety. Among them is having to try to extrapolate child-specific data from generalised reporting. When it is not reported separately, it is usually to hide inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snap provides a breakdown of the violation rate data by age group, even though this would provide valuable information for academics, Governments, legislators and NGOs. Amendment 150 would go some way to addressing this gap by ensuring that the ICO is required to break down its reporting for children.

Having been momentarily positive, I would like to put on the record my concerns about the following extract from the email that accompanied the ICO’s children’s code strategy of 2 April. Having set out the very major changes to companies that the code has ushered in and explained how the Information Commissioner would spend the next few months looking at default settings, geolocation, profiling, targeting children and protecting under-13s, the email goes on to say:

“With the ongoing passage of the bill, our strategy deliberately focusses in the near term on compliance with the current code. However, once we have more clarity on the final version of the bill we will of course look to publicly signal intentions about our work on implementation and children’s privacy into the rest of the year and beyond”.


The use of the phrase “current code”, and the fact that the ICO has decided it is necessary to put its long-term enforcement strategy on hold, contradict government assurances that standards will remain the same.

The email from the ICO arrived in my inbox on the same day as a report from the US Institute of Digital Media and Child Development, which was accompanied by an impact assessment on the UK’s age-appropriate design code. It stated:

“The Institute’s review identifies an unprecedented wave of … changes made across leading social media and digital platforms, including YouTube, TikTok, Snapchat, Instagram, Amazon Marketplace, and Google Search. The changes, aimed at fostering a safer, more secure, and age-appropriate online environment, underscore the crucial role of regulation in improving the digital landscape for children and teens”.


In June, the Digital Futures Commission will publish a similar report, written by the former Deputy Information Commissioner, Steve Wood, which has similarly positive but much more detailed findings. Meanwhile, we hear the steady drumbeat of adoption of the code in South America, Australia and Asia, and in additional US states following California’s lead. Experts both in the US and here in the UK evidence that this is a regulation that works to make digital services safer and better for children.

I therefore have to ask the Minister once again why the Government are downgrading child protection. If he, or those in the Box advising him, are even slightly tempted to say that they are not, I ask that they reread the debates from the last two days in Committee, in which the Government removed the balancing test from automated decision-making and the Secretary of State’s powers were changed so as to have regard to children rather than to mandate child protections. The data impact assessment provisions have also been downgraded, among the other sleights of hand that diminish the AADC.

The ICO has gone on record to say that it has put its medium to long-term enforcement strategy on hold, and the Minister’s letter sent on the last day before recess says that the AADC will be updated to reflect the Bill. I would like nothing more than a proposal from the Government to put the AADC back on a firm footing. I echo the words said earlier by the noble Baroness, Lady Jones, that it is time to start talking and stop writing. I am afraid that, otherwise, I will be tabling amendments on Report that will test the appetite of the House for protecting children online. In the meantime, I hope the Minister will welcome and accept the very modest proposals in this group.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, as is so often the case on this subject, I support the noble Baroness, Lady Kidron, and the three amendments that I have added my name to: Amendments 142, 143 and 150. I will speak first to Amendments 142 and 143, and highlight a couple of issues that the noble Baroness, Lady Kidron, has already covered.

--- Later in debate ---
There is more than one way to protect children at school, and the amendment in front of us is the simplest and most comprehensive. But whatever the route, the outcome must be that we act. We are seeing school districts in the US take tech providers to court; the UK can do better than that. UK tax-funded schools and teachers must not be left with responsibility for the regulatory compliance of multibillion-dollar private companies. Not only is it patently unfair, but it illustrates and exacerbates a power asymmetry for which children at school pay the price. To unlock edtech’s full potential, regulatory action and government leadership are not just beneficial but essential. I beg to move.
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, I rise once again in my Robin role to support the noble Baroness, Lady Kidron, on this amendment. On 23 November last year, we had a debate, brought by the noble Baroness, on this very issue of edtech. Rather than repeat all the points that were made in that very useful debate, I point my noble friend the Minister to it.

I would just like to highlight a couple of quick points. First, in supporting this amendment, I am not anti-edtech in any way, shape or form. It is absolutely clear that technology can bring huge benefits to students of all ages but it is also clear that education is not unique. It is exactly like every other part of society: where technology brings benefit, it also brings substantial risk. We are learning the hard way that thinking that any element of society can mitigate the risks of technology without legal guard-rails is a mistake.

We have seen really clearly with the age-appropriate design code that commercial organisations operating under its purview changed the way they protected children’s data as a result of that code. The absence of an equivalent code for the edtech sector means that we will not have seen those same benefits there. If we bring edtech into scope, either through this amendment or simply by extending the age-appropriate design code, I would hazard a strong guess that we would start to see very real improvements in the protection of children’s data.

In the debate on 23 November, I asked my noble friend the Minister, the noble Baroness, Lady Barran, why the age-appropriate design code did not include education. I am not an expert in education, by any stretch of the imagination. The answer I received was that it was okay because the keeping children safe in education framework covered edtech. Since that debate, I have had a chance to read that framework, and I cannot find a section in it that specifically addresses children’s data. There is lots of really important stuff in it, but there is no clearly signposted section in that regard. So even if all the work fell on schools, that framework on its own, as published on GOV.UK, does not seem to meet the standards of a framework for data protection for children in education. However, as the noble Baroness, Lady Kidron, said, this is not just about schools’ responsibility but the edtech companies’ responsibility, and it is clear that there is no section on that in the keeping children safe in education framework either.

The answer that we received last year in this House does not do justice to the real question: in the absence of a specific code—the age-appropriate design code or a specific edtech code—how can we be confident that there really are the guardrails, which we know we need to put in place in every sector, in this most precious and important sector, which is where we teach our children?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I am absolutely delighted to be able to support this amendment. Like the noble Baroness, Lady Harding, I am not anti-edtech at all. I did not take part in the debate last year. But listening to the noble Baroness, Lady Kidron, and having read the excellent A Blueprint for Education Data from the 5Rights Foundation and the Digital Futures for Children brief in support of a code of practice for education technology, I submit that it is chilling to hear what is happening with edtech as we speak: data being extracted and data protection not being properly complied with.

I got involved some years ago with the advisory board of the Institute for Ethical AI in Education, which Sir Anthony Seldon set up with Professor Rose Luckin and Priya Lakhani. Our intention was slightly broader—it was designed to create a framework for the use of AI specifically in education. Of course, one of the very important elements was the use of data, and the safe use of data, both by those procuring AI systems and by those developing them and selling them into schools. That was in 2020 and 2021, and we have not moved nearly far enough since that time. Obviously, this is data specific, because we are talking about the data protection Bill, but what is being proposed here would cure some of the issues that are staring us in the face.

As we have been briefed by Digital Futures for Children, and as the noble Baroness, Lady Kidron, emphasised, there is widespread invasion of children’s privacy in data collection. Sometimes there is little evidence to support the claimed learning benefits, while schools and parents lack the technical and legal expertise to understand what data is collected. As has been emphasised throughout the passage of this Bill, children deserve the highest standards of privacy and data protection—especially in education, of course.

From this direction, I wholly support what the noble Baroness, Lady Kidron, is proposing, so well supported by the noble Baroness, Lady Harding. Given that it again appears that the Government gave an undertaking to bring forward a suitable code of practice but have not done so, there is double reason to want to move forward on this during the passage of the Bill. We very much support Amendment 146 on that basis.

Data Protection and Digital Information Bill

Baroness Harding of Winscombe Excerpts
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, I want to speak briefly in support of, first, the amendments in the name of my noble friend Lord Holmes, which would recreate the office of the Biometrics and Surveillance Camera Commissioner.

As I have done on a number of occasions, I shall tell a short story; it is about the Human Fertilisation and Embryology Authority. Noble Lords may wonder why I am starting there. I remember very clearly one of the first debates that I participated in when I was at university—far too long ago. It was at the Oxford Union, and Dame Mary Warnock came to speak about what was then a highly contentious use of new technology. In this country, we had that debate early; we established an authority to oversee what are very complex scientific and ethical issues. It has remained a settled issue in this country, one that has enabled many families to bear children, bringing life and joy to people in a safe way.

This data issue is quite similar, I think. Other countries did not have that early debate, which I remember as a teenager, and did not establish a regulator in the form of the HFEA. I point to the US, which was torn apart by those very issues. As the noble Lord, Lord Vaux, has just set out, the public are very concerned about the use of biometric data. This is an issue that many sci-fi novels and films have been made about, because it preys on our deepest fears. I think that technology can be hugely valuable to society, but only if we build and maintain trust in it. In order to do that, you need consistent, long-standing, expert regulation.

Like the noble Lord, Lord Vaux, I do not understand why the changes that this Bill brings will make things better. The Bill narrows the scope of protection to data protection only when, actually, the issues are much broader, much subtler and much more sophisticated. For that reason and that reason alone, I think that we need to remove these clauses and reinstate the regulator that exists today.

Viscount Stansgate Portrait Viscount Stansgate (Lab)
- Hansard - - - Excerpts

My Lords, I find myself in a fortunate position: we have made progress fast enough to enable me to go from one end of the Room to the other and play a modest part in this debate. I do so because, at an earlier stage, I identified the amendments tabled by the noble Lord, Lord Holmes, and I very much wish to say a few words in support of them.

Reference has already been made to the briefing that we have had from CRISP. I pay tribute to the authors of that report—I do not need to read long chunks of it into the record—and am tempted to follow the noble Lord in referring to both of them. I sometimes wonder whether, had their report been officially available before the Government drafted the Bill, we would find ourselves in the position we are now in. I would like to think that that would have had an effect on the Government’s thinking.

When I first read about the Government’s intention to abolish the post of the Biometrics and Surveillance Camera Commissioner, I was concerned, but I am not technically adept enough to know about it in detail. I am grateful for the advice that I have had from CRISP and from Professor Michael Zander, a distinguished and eminent lawyer who is a Professor Emeritus at LSE, and I thank him for contacting me about this issue. I want to make a few points on his behalf and on CRISP’s.

In the short time available to me, this is the main thing I want to say. The Government argue that abolishing these joint roles will

“reduce duplication and simplify oversight of the police use of biometrics”.

Making that simpler and rationalising it is at the heart of the Government’s argument. It sounds as if this is merely a tidying-up exercise, but I believe that that is far from the case. It is fair to accept that the current arrangements for the oversight of public surveillance and biometric techniques are complex, but a report published on 30 October, to which noble Lords’ attention has already been drawn, makes a powerful case that what the Government intend to do will result in losses that are a great deal more significant than the problems caused by the complexity of the present arrangements. That is the paper’s argument.

The report’s authors, who produced a briefing for Members’ use today, have presented a mass of evidence and provided an impressively detailed analysis of the issues. The research underpinning the report includes a review of relevant literature, interviews with leading experts and regulators—