Committee (1st Day)
Scottish, Welsh and Northern Ireland Legislative Consent sought.
16:15
Clause 1: Information relating to an identifiable living individual
Amendment 1
Moved by
1: Clause 1, page 2, line 8, at end insert “and in the absence of appropriate organisational measures such as technical or contractual safeguards prohibiting reidentification.”
Member’s explanatory statement
To avoid confusion between the reversible pseudonymisation mentioned in the Bill regarding medical data and non-reversible pseudonymisation, this amendment seeks to distinguish between the two.
Lord Clement-Jones (LD)

My Lords, we are beginning rather a long journey—at least, it feels a bit like that. I will speak to Amendments 1, 5 and 288, and the Clause 1 stand part notice.

I will give a little context about Clause 1. In a recent speech, the Secretary of State said something that Julia Lopez repeated this morning at a conference I was at:

“The Data Bill that I am currently steering through Parliament with my wonderful team of ministers”—


I invite the Minister to take a bow—

“is just one step in the making of this a reality—on its own it will add £10 billion to our economy and most crucially—we designed it so that the greatest benefit would be felt by small businesses across our country. Cashing in on a Brexit opportunity that only we were prepared to take, and now those rewards are going to be felt by the next generation of founders and business owners in local communities”.

In contrast, a coalition of 25 civil society organisations wrote to the Secretary of State, calling for the Bill to be dropped. The signatories included trade unions as well as human rights, healthcare, racial justice and other organisations. On these Benches, we share the concerns about the government proposals. They will seriously weaken data protection rights in the UK and will particularly harm people from marginalised communities.

So that I do not have to acknowledge them at every stage of the Bill, I will now thank a number of organisations. I am slightly taking advantage of the fact that our speeches are not limited but will be extremely limited from Monday onwards—the Minister will have 20 minutes; I, the noble Baroness, Lady Jones, and colleagues will have 15; and Back-Benchers will have 10. I suspect we are into a new era of brevity, but I will take advantage today, believe me. I thank Bates Wells, Big Brother Watch, Defend Digital Me, the Public Law Project, Open Rights Group, Justice, medConfidential, Chris Pounder, the Data & Marketing Association, CACI, Preiskel & Co, AWO, Rights and Security International, the Advertising Association, the National AIDS Trust, Connected by Data and the British Retail Consortium. That is a fair range of organisations that see flaws in the Bill. We on these Benches agree with them and believe that it greatly weakens the existing data protection framework. Our preference, as we expressed at Second Reading, is that the Bill is either completely revised on a massive scale or withdrawn in the course of its passage through the Lords.

I will mention one thing; I do not think the Government are making any great secret of it. The noble Baroness, Lady Kidron, drew my attention to the Keeling schedule, which gives the game away, and Section 2(2). The Information Commissioner will no longer have to pay regard to certain aspects of the protection of personal data—all the words have been deleted, which is quite extraordinary. It is clear that the Bill will dilute protections around personal data processing, reducing the scope of data protected by the safeguards within the existing law. In fact, the Bill gives more power to data users and takes it away from the people the data is about.

I am particularly concerned about the provisions that change the definition of personal data and the purposes for which it can be processed. There is no need to redraft the definitions of personal data, research or the boundaries of legitimate interests. We have made it very clear over a period of time that guidance from the ICO would have been adequate in these circumstances, rather than a whole piece of primary legislation. The recitals are readily available for guidance, and the Government should have used them. More data will be processed, with fewer safeguards than currently permitted, as it will no longer meet the threshold of personal data, or it will be permitted under the new recognised legitimate interest provision, which we will debate later. That combination is a serious threat to privacy rights in the UK, and that is the context of a couple of our probing amendments to Clause 1—I will come on to the clause stand part notice.

As a result of these government changes, data in one organisation’s hands may be anonymous, while that same information in another organisation’s hands can be personal data. The factor that determines whether personal data can be reidentified is whether the appropriate organisational measures and technical safeguards exist to keep the data in question separate from the identity of specific individuals. That is a very clear decision by the CJEU; the case is SRB v EDPS, if the Minister is interested.

The ability to identify an individual indirectly with the use of additional information is due to the lack of appropriate organisational and technical measures. If the organisation had such appropriate measures that separated data into different silos, it would not be able to use the additional information to identify such an individual. The language of technical and organisational measures is used in the definition of pseudonymisation in Clause 1(3)(d), which refers to “indirectly identifiable” information. If such measures existed, the data would be properly pseudonymised, in which case it would no longer be indirectly identifiable.

A lot of this depends on how data savvy organisations are, so those that are not well organised and do not have the right technology will get a free pass. That cannot be right, so I hope the Minister will respond to that. We need to make sure that personal data remains personal data, even if some may claim it is not.

Regarding my Amendment 5, can the Government explicitly confirm that personal data that is

“pseudonymised in part, but in which other indirect identifiers remain unaltered”

will remain personal data after this clause is passed? Can the Government also confirm that if an assessment is made that some data is not personal data, but that assessment is later shown to be incorrect, the data will have been personal data at all times and should be treated as such by controllers, processors and the Information Commissioner, about whom we will talk when we come to the relevant future clauses?

Amendment 288 simply asks the Government for an impact assessment. If they are so convinced that the definition of personal data will change, they should be prepared to submit to some kind of impact assessment after the Bill comes into effect. Those are probing amendments, and it would be useful to know whether the Government have any intention to assess what the impact of their changes to the Bill would be if they were passed. More importantly, we believe broadly that Clause 1 is not fit for purpose, and that is why we have tabled the clause stand part notice.

As we said, this change will erode people’s privacy en masse. The impacts could include more widespread use of facial recognition and an increase in data processing with minimal safeguards in the context of facial recognition, as the threshold for personal data would be met only if the data subject is on a watchlist and therefore identified. If an individual is not on a watchlist and images are deleted after being checked against it, the data may not be considered personal and so would not qualify for data protection obligations.

People’s information could be used to train AI without their knowledge or consent. Personal photos scraped from the internet and stored to train an algorithm would no longer be seen as personal data, as long as the controller does not recognise the individual, is not trying to identify them and will not process the data in such a way that would identify them. The police would have increased access to personal information. Police and security services will no longer have to go to court if they want access to genetic databases; they will be able to access the public’s genetic information as a matter of routine.

Personal data should be defined by what type of data it is, not by how easy it is for a third party to identify an individual from it. That is the bottom line. Replacing a stable, objective definition that grants rights to the individual with an unstable, subjective definition that determines the rights an individual has over their data according to the capabilities of the processor is illogical, complex, bad law-making. It is contrary to the very premise of data protection law, which is founded upon personal data rights. We start on the wrong foot in Clause 1, and it continues. I beg to move.

Lord Kamall (Con)

My Lords, I rise to speak in favour of Amendments 1 and 5 in this group and with sympathy towards Amendment 4. The noble Lord, Lord Clement-Jones, will remember when I was briefly Minister for Health. We had lots of conversations about health data. One of the things we looked at was a digitised NHS. It was essential if we were to solve many problems of the future and have a world-class NHS, but the problem was that we had to make sure that patients were comfortable with the use of their data and the contexts in which it could be used.

When we were looking to train AI, it was important that we made sure that the data was as anonymous as possible. For example, we looked at things such as synthetic and pseudonymised data. There is another point: having done the analysis and looked at the dataset, if you see an identifiable group of people who may well be at risk, how can you reverse-engineer that data perhaps to notify those patients that they should be contacted for further medical interventions?

I know that that makes it far too complicated; I just wanted to rise briefly to support the noble Lord, Lord Clement-Jones, on this issue, before the new rules come in next week. It is essential that the users, the patients—in other spheres as well—have absolute confidence that their data is theirs and are given the opportunity to give permission or opt out as much as possible.

One of the things that I said when I was briefed as a Health Minister was that we can have the best digital health system in the world, but it is no good if people choose to opt out or do not have confidence. We need to make sure that the Bill gives those patients that confidence where their data is used in other areas. We need to toughen this bit up. That is why I support Amendments 1 and 5 in the name of the noble Lord, Lord Clement-Jones.

Lord Davies of Brixton (Lab)

My Lords, anonymisation of data is crucially important in this debate. I want to see, through the Bill, a requirement for personal data, particularly medical data, to be held within trusted research environments. This is a well-developed technique and Britain is the leader. It should be a legal requirement. I am not quite sure that we have got that far in the Bill; maybe we will need to return to the issue on Report.

The extent to which pseudonymisation—I cannot say it—is possible is vastly overrated. There is a sport among data scientists of being able to spot people within generally available datasets. For example, the data available to TfL through people’s use of Oyster cards and so on tells you an immense amount of information about individuals. Medical data is particularly susceptible to this, although it is not restricted to medical data. I will cite a simple example from publicly available data.

16:30
Let us say that you know that someone, with a date of birth of 6 May 1953, had two minor cardiac operations, once on 19 October 2003 and again on 24 September 2004. With that information, you would know virtually everything there is to know about Tony Blair. It is that easy to get hold of information. Of course, you will not have a medical dataset without a date of birth or pre-existing conditions. The whole idea that you can easily anonymise data is a blind alley.
Ultimately, information should be retained within a locked box, where it stays, and the medical researchers, who are crucial, come up with their programme, using a sandbox, that is then applied to the locked-away data. The researchers would just get the results; they would not go anywhere near the data. The outcome of the research is identical but people’s medical information —their genetic information—would be kept away, secure. We have to work to that objective. I do not quite know yet whether the Bill gets that far, but it is crucial.
Baroness Harding of Winscombe (Con)

My Lords, I, too, support the amendments in the name of the noble Lord, Lord Clement-Jones. As this is the first time I have spoken during the passage of the Bill, I should also declare my interests, but it seems that all the organisations I am involved in process data, so I refer the Committee to all the organisations in my entry in the register of interests.

I want to tell a story about the challenges of distinguishing between personal data and pseudonymised data. I apologise for bringing everyone back to the world of Covid, but that was when I realised how possible it is to track down individuals without any of their personal data. Back in November or December 2020, when the first variant of Covid, the Kent variant, was spreading, one test that was positive for the Kent variant came with no personal details at all. The individual who had conducted that test had not filled in any of the information. I was running NHS Test and Trace and we had to try to find that individual, in a very public way. In the space of three days, with literally no personal information—no name, address or sense of where they lived—the team was able to find that human being. Through extraordinary ingenuity, it tracked them down based on the type of tube the test went into—the packaging that was used—and by narrowing down the geography to a small number of postcodes where the person might have been: someone who could be ill and in need of help, but all of whose contacts also needed to be identified.

I learned that it was possible to find that one human being, out of a population of 60 million, within three days and without any of their personal information. I tell this story because my noble friend Lord Kamall made such an important point that, at the heart of data legislation is the question of how you build trust in the population. We have to build on firm foundations if the population are to trust that there are reasons why sharing data is hugely valuable societally. To have a data Bill that does not have firm foundations in absolutely and concretely defining personal data is quite a fatal flaw.

Personal data being subjective, as the noble Lord, Lord Clement-Jones, so eloquently set out, immediately starts citizens on a journey of distrusting this world. There is so much in this world that is hard to trust, and I feel strongly that we have to begin with some very firm foundations. They will not be perfect, but we need to go back to a solid definition of “personal data”, which is why I wholeheartedly support the noble Lord’s amendments.

Lord Bassam of Brighton (Lab)

My Lords, I hesitate to make a Second Reading speech, and I know that the noble Lord, Lord Clement-Jones, cannot resist rehearsing these points. However, it is important, at the outset of Committee, to reflect on the Bill in its generality, and the noble Lord did a very good job of precisely that. This is fundamental.

The problem for us with the Bill is not just that it is a collection of subjects—of ideas about how data should be handled, managed and developed—but that it is flawed from the outset. It is a hotchpotch of things that do not really hang together. Several of us have chuntered away in the margins and suggested that it would have been better if the Bill had fallen and there had been a general election—not that the Minister can comment on that. But it would be better, in a way. We need to go back to square one, and many in the Committee are of a like mind.

The noble Baroness, Lady Harding, made a good point about data management, data control and so on. Her example was interesting, because this is about building trust, having confidence in data systems and managing data in the future. Her example was very good, as was that of the noble Lord, Lord Davies, who raised a challenge about how the anonymisation, or pseudonymisation, of data will work and how effective it will be.

We have two amendments in this group. Taken together, they are designed to probe exactly what the practical impacts will be of the proposed changes to Section 3 of the 2018 Act and the insertion of new Section 3A. Amendment 4 calls for the Secretary of State to publish an assessment of the changes within two months of the Bill passing, while Amendment 301 would ensure that the commencement of Clause 1 takes place no earlier than that two-month period. Noble Lords might think this is unduly cautious, but, given our wider concerns about the Bill and its departure from the previously well-understood—

The Deputy Chairman of Committees (Baroness Scott of Needham Market) (LD)

My Lords, a Division having been called, we will adjourn for 10 minutes and resume at 4.48 pm.

16:38
Sitting suspended for a Division in the House.
16:48
Lord Bassam of Brighton (Lab)

As I was saying, it is important for the framework on data protection that we take a precautionary approach. I hope that the Minister will this afternoon be able to provide a plain English explanation of the changes, as well as giving us an assurance that those changes to definitions do not result in watering down the current legislation.

We broadly support Amendments 1 and 5 and the clause stand part notice, in the sense that they provide additional probing of the Government’s intentions in this area. We can see that the noble Lord, Lord Clement-Jones, is trying with Amendment 1 to bring some much-needed clarity to the anonymisation issue and, with Amendment 5, to secure that data remains personal data in any event. I suspect that the Minister will tell us this afternoon that that is already the case, but a significant number of commentators have questioned this, since the definition of “personal data” is seemingly moving away from the EU GDPR standard towards a definition that is more subjective from the perspective of the controller, processor or recipient. We must be confident that the new definition does not narrow the circumstances in which the information is protected as personal data. That will be an important standard for this Committee to understand.

Amendment 288, tabled by the noble Lord, Lord Clement-Jones, seeks a review and an impact assessment of the anonymisation and identifiability of data subjects. Examining that in the light of the EU GDPR seems to us to be a useful and novel way of making a judgment over which regime better suits and serves data subjects.

We will listen with interest to the Minister’s response. We want to be more than reassured that the previous high standards and fundamental principles of data protection will not be undermined and compromised.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

I thank all noble Lords who have spoken in this brief, interrupted but none the less interesting opening debate. I will speak to the amendments tabled by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones; I note that I plan to use that form of words quite a lot in the next eight sessions on this Bill. I thank them for tabling these amendments so that we can debate what are, in the Government’s view, the significant benefits of Clause 1.

In response to the points from the noble Lord, Lord Clement-Jones, on the appetite for the reforms in the Bill, we take very seriously the criticisms of the parties that he mentioned—the civil society groups—but it is important to note that, when the Government consulted on these reforms, we received almost 3,000 responses. At that time, we proposed to clarify when data would be regarded as anonymous and proposed legislating to confirm that the test for whether anonymous data can be reidentified is relative to the means available to the controller to reidentify the data. The majority of respondents agreed that greater clarity in legislation would indeed be beneficial.

As noble Lords will know, the UK’s data protection legislation applies only to personal data, which is data relating to an identified or identifiable living individual. It does not apply to non-personal, anonymous data. This is important because, if organisations can be sure that the data they are handling is anonymous, they may be able to more confidently put it to good use in important activities such as research and product development. The current data protection legislation is already clear that a person can be identified in a number of ways by reference to details such as names, identification numbers, location data and online identifiers, or via information about a person’s physical, genetic, mental, economic or cultural characteristics. The Bill does not change the existing legislation in this respect.

With regard to genetic information, which was raised by my noble friend Lord Kamall and the noble Lord, Lord Davies, any information that includes enough genetic markers to be unique to an individual is personal data and special category genetic data, even if names and other identifiers have been removed. This means that it is subject to the additional protections set out in Article 9 of the UK GDPR. The Bill does not change this position.

However, the existing legislation is unclear about the specific factors that a data controller must consider when assessing whether any of this information relates to an identifiable living person. This uncertainty is leading to inconsistent application of anonymisation and to anonymous data being treated as personal data out of an abundance of caution. This, in turn, reduces the opportunities for anonymous data to be used effectively for projects in the public interest. It is this difficulty that Clause 1 seeks to address by providing a comprehensive statutory test on identifiability. The test will require data controllers and processors to consider the likelihood of people within or outside their organisations reidentifying individuals using reasonable means. It is drawn from recital 26 of the EU GDPR and should therefore not be completely unfamiliar to most organisations.

I turn now to the specific amendments that have been tabled in relation to this clause. Amendment 1 in the name of the noble Lord, Lord Clement-Jones, would reiterate the position currently set out in the UK GDPR and its recitals: where individuals can be identified without the use of additional information because data controllers fail to put in place appropriate organisational measures, such as technical or contractual safeguards prohibiting reidentification, they would be considered directly identifiable. Technical and organisational measures put in place by organisations are factors that should be considered alongside others under new Section 3A of the Data Protection Act when assessing whether an individual is identifiable from the data being processed. Clause 1 sets out the threshold at which data is identifiable—and therefore personal data—and clarifies when data is anonymous.

On the technical capabilities of a respective data controller, these are already relevant factors under current law and ICO guidance in determining whether data is personal. This means that the test of identifiability is already a relative one today in respect of the data controller, the data concerned and the purpose of the processing. However, the intention of the data controller is not a relevant factor under current law, and nor does Clause 1 make it a factor. Clause 1 merely clarifies the position under existing law and follows very closely the wording of recital 26. Let me state this clearly: nothing in Clause 1 introduces the subjective intention of the data controller as a relevant factor in determining identifiability, and the position will remain the same as under the current law and as set out in ICO guidance.

In response to the points made by the noble Lord, Lord Clement-Jones, and others on pseudonymised personal data, noble Lords may be aware that the definition of personal data in Article 4(1) of the UK GDPR, when read in conjunction with the definition of pseudonymisation in Article 4(5), makes it clear that pseudonymised data is personal data, not anonymous data, and is thus covered by the UK’s data protection regime. I hope noble Lords are reassured by that. I also hope that, for the time being, the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment and not press the related Amendment 5, which seeks to make it clear that pseudonymised data is personal data.

Amendment 4 would require the Secretary of State to assess the difference in meaning and scope between the current statutory definition of personal data and the new statutory definition that the Bill will introduce, within two months of the Bill’s passing. Similarly, Amendment 288 seeks to review the impact of Clause 1 six months after the enactment of the Bill. The Government feel that neither of these amendments is necessary as the clause is drawn from recital 26 of the EU GDPR and case law and, as I have already set out, is not seeking to substantially change the definition of personal data. Rather, it is seeking to provide clarity in legislation.

Lord Bassam of Brighton (Lab)

I follow the argument, but what we are suggesting in our amendment is some sort of impact assessment for the scheme, including how it currently operates and how the Government wish it to operate under the new legislation. Have the Government undertaken a desktop exercise or any sort of review of how the two pieces of legislation might operate? Has any assessment of that been made? If they have done so, what have they found?

Viscount Camrose (Con)

Obviously, the Bill has been in preparation for some time. I completely understand the point, which is about how we can be so confident in these claims. I suggest that I work with the Bill team to get an answer to that question and write to Members of the Committee, because it is a perfectly fair question to ask what makes us so sure.

In the future tense, I can assure noble Lords that the Department for Science, Innovation and Technology will monitor and evaluate the impact of this Bill as a whole in the years to come, in line with cross-government evaluation guidance and through continued engagement with stakeholders.

The Government feel that the first limb of Amendment 5 is not necessary given that, as has been noted, pseudonymised data is already considered personal data under this Bill. In relation to the second limb of the amendment, if the data being processed is actually personal data, the ICO already has powers to require organisations to address non-compliance. These include requiring it to apply appropriate protections to personal data that it is processing, and are backed up by robust enforcement mechanisms.

That said, it would not be appropriate for the processing of data that was correctly assessed as anonymous at the time of processing to retrospectively be treated as processing of personal data and subject to data protection laws, simply because it became personal data at a later point in the processing due to a change in circumstances. That would make it extremely difficult for any organisation to treat any dataset as anonymous and would undermine the aim of the clause, significantly reducing the potential to use anonymous data for important research and development activities.

17:00
The third part of Amendment 5 seeks to repeal provisions in Section 191 of the Data Protection Act 2018 that would allow the Secretary of State to prepare a framework for data processing by government. This could include guidance about the processing of personal data in connection with the work of government departments and others. Although the powers to issue such a framework have not been used, as the existing transparency requirements have proven sufficient, they may yet be helpful in the future.
I turn to the Clause 1 stand part notice. For the reasons I have described above, noting the help that this clause will deliver to organisations trying to assess identifiability and use anonymous data, I respectfully encourage the noble Lord not to oppose the clause standing part of the Bill.
On Amendment 301, as the Bill is currently drafted, Clause 1 is not to be automatically commenced either immediately upon Royal Assent or after two months. The probable impact of the amendments tabled by the noble Baroness, Lady Jones, would therefore be to bring forward commencement of Clause 1, rather than delay it. I therefore encourage the noble Lord, Lord Bassam, not to move Amendment 301.
For the reasons I have set out, I am not able to accept these amendments.
Lord Clement-Jones (LD)

My Lords, I thank the noble Lords, Lord Kamall, Lord Davies of Brixton and Lord Bassam, and the noble Baroness, Lady Harding, for their support for a number of these amendments. Everybody made a common point about public trust, particularly in the context of health data.

As the noble Lord, Lord Kamall, said, we had a lot of conversations during the passage of the Health and Care Act and the noble Lord and his department increasingly got it: proper communication about the use of personal, patient data is absolutely crucial to public trust. We made quite a bit of progress with NHSE and the department starting to build in safeguards and develop the concept of access to, rather than sharing of, personal data. I heard what the noble Lord, Lord Davies, said about a locked box and I think that having access for research, rather than sharing data around, is a powerful concept.

I found what the Minister said to be helpful. I am afraid that we will have to requisition a lot of wet towels during the passage of the Bill. There are a number of aspects to what he said, but the bottom line is that he is saying that there is no serious divergence from the current definition of personal data. The boot is on the other foot: where is the Brexit dividend? The Minister cannot have it both ways.

I am sure that, as we go through this and the Minister says, “It’s all in recital 26”, my response would be that the ICO could easily develop guidance based on that. That would be splendid; we would not have to go through the agony of contending with this data protection Bill. It raises all those issues and creates a great deal of angst. There are 26 organisations, maybe more—42, I think—writing to the Secretary of State about one aspect of it or another. The Government have really created a rod for their own back, when they could have created an awful lot of guidance, included a bit on digital identity in the Bill and done something on cookies. What else is there not to like? As I say, the Government have created a rod for their own back.

As regards pseudonymised data, that is also helpful. We will hold the Minister to that as we go through, if the Minister is saying that that is personal data. I am rather disappointed by the response to Amendment 5, but I will take a very close look at it with several wet towels.

We never quite know whether CJEU judgments will be treated as precedent by this Government or where we are under the REUL Act. I could not tell you at this moment. However, it seems that the Minister is again reassuring us that the CJEU’s judgments on personal data are valid and are treated as being part of UK law for this purpose, which is why there is no change to the definition of personal data as far as he is concerned. All he is doing is importing the recitals into Clause 1. I think I need to read the Minister’s speech pretty carefully if I am going to accept that. In the meantime, we move on. I beg leave to withdraw the amendment.

Amendment 1 withdrawn.
Amendment 2
Moved by
2: Clause 1, page 2, line 16, leave out “and (3)” and insert “, (3) and (3A)”
Member’s explanatory statement
This amendment, and another to Clause 1 in the name of Baroness Kidron, would ensure that controllers have a duty to identify when a user is or may be a child to give them the data protection codified by the Data Protection Act 2018.
Baroness Kidron (CB)

My Lords, I speak to Amendments 2, 3, 9 and 290 in my name. I thank the noble Baronesses, Lady Jones and Lady Harding, and the noble Lord, Lord Clement-Jones, for their support.

This group seeks to secure the principle that children should enjoy the same protections in UK law after this Bill passes into law as they do now. In 2018, this House played a critical role in codifying the principle that children merit special, specific protection in relation to data privacy by introducing the age-appropriate design code into the DPA. Its introduction created a wave of design changes to tech products: Google introduced safe search as its default; Instagram made it harder for adults to contact children via private messaging; Play Store stopped making adult apps available to under-18s; and TikTok stopped sending notifications through the night and hundreds of thousands of underage children were denied access to age-inappropriate services. These are just a handful of the hundreds of changes that have been made, many of them rolled out globally. The AADC served as a blueprint for children’s data privacy, and its provisions have been mirrored around the globe. Many noble Lords will have noticed that, only two weeks ago, Australia announced that it is going to follow the many others who have incorporated or are currently incorporating it into their domestic legislation, saying in the press release that it would align as closely as possible with the UK’s AADC.

As constructed in the Data Protection Act 2018, the AADC sets out the requirements of the UK GDPR as they relate to children. The code is indirectly enforceable; that is to say that the action the ICO can take against those failing to comply is based on the underlying provisions of UK GDPR, which means that any watering down, softening of provisions, unstable definitions—my new favourite—or legal uncertainty created by the Bill automatically waters down, softens and creates legal uncertainty and unstable definitions for children and therefore for child protection. I use the phrase “child protection” deliberately because the most important contribution that the AADC has made at the global level was the understanding that online privacy and safety are interwoven.

Clause 1(2) creates an obligation on the controller or processor to know, or reasonably to know, that an individual is an identifiable living individual. Amendments 2 and 3 would add a further requirement to consider whether that living individual is a child. This would ensure that providers cannot wilfully ignore the presence of children, something that tech companies have a long track record of doing. I want to quote the UK Information Commissioner, who fined TikTok £12.7 million for failing to prevent under-13s accessing that service; he said:

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws … TikTok should have known better. TikTok should have done better … They did not do enough to check who was using their platform”.


I underline very clearly that these amendments would not introduce any requirement for age assurance. The ICO’s guidance on age assurance in the AADC and the provisions in the Online Safety Act already detail those requirements. The amendments simply confirm the need to offer a child a high bar of data privacy or, if you do not know which of your users are children, offer all users that same high bar of data privacy.

As we have just heard, it is His Majesty’s Government’s stated position that nothing in the Bill lessens children’s data privacy because nothing in the Bill lessens UK GDPR, and that the Bill is merely an exercise to reduce unnecessary bureaucracy. The noble Lords who spoke on the first group have perhaps put paid to that and I imagine that this position will be sorely tested during Committee. In the light of the alternative view that the protections afforded to children’s personal data will decline as a result of the Bill, Amendment 9 proposes that the status of children’s personal data be elevated to that of “sensitive personal data”, or special category data. The threshold for processing special category data is higher than for general personal data and the specific conditions include, for example, processing with the express consent of the data subject, processing to pursue a vital interest, processing by not-for-profits or processing for legal claims or matters of substantial public interest. Bringing children’s personal data within that definition would elevate the protections by creating an additional threshold for processing.

Finally, Amendment 290 enshrines the principle that nothing in the Bill should lead to a diminution in existing levels of privacy protections that children currently enjoy. It is essentially a codification of the commitment made by the Minister in the other place:

“The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]


Before I sit down, I just want to highlight the Harvard Gazette, which looked at ad revenue from the perspective of children. On Instagram, children account for 16% of ad revenue; on YouTube, 27%; on TikTok, 35%; and on Snap, an extraordinary 41.4%. Collectively, YouTube, Instagram and Facebook made nearly $2 billion from children aged nought to 12, and it will not escape many noble Lords that children aged nought to 12 are not supposed to be on those platforms. Instagram, YouTube and TikTok together made more than $7 billion from 13 to 17 year-olds. The amendments in this group give a modicum of protection to a demographic who have no electoral capital, who are not developmentally adult and whose lack of care is not an unfortunate by-product of the business model, but who have their data routinely extracted, sold, shared and scraped as a significant part of the ad market. It is this that determines the features that deliberately spread, polarise and keep children compulsively online, and it is this that the AADC—born in your Lordships’ House—started a global movement to contain.

This House came together on an extraordinary cross-party basis to ensure that the Online Safety Bill delivered for children, so I say to the Minister: I am not wedded to my drafting, nor to the approach that I have taken to maintain, clause by clause, the bar for children, even when that bar is changed for adults, but I am wedded to holding the tech sector accountable for children’s privacy, safety and well-being. It is my hope and—if I dare—expectation that noble Lords will join me in making sure that the DPDI Bill does not leave this House with a single diminution of data protection for children. To do so is, in effect, to give with one hand and take away with the other.

I hope that during Committee the Minister will come to accept that children’s privacy will be undermined by the Bill, and that he will work with me and others to resolve these issues so that the UK maintains its place as a global leader in children’s privacy and safety. I beg to move.

17:15
Baroness Harding of Winscombe (Con)

My Lords, in the nearly nine years that I have been in this House, I have often played the role of bag carrier to the noble Baroness, Lady Kidron, on this issue. In many ways, I am rather depressed that once again we need to make the case that children deserve a higher bar of protection than adults in the digital world. As the noble Baroness set out—I will not repeat it—the age-appropriate design code was a major landmark in establishing that you can regulate the digital world just as you can the physical world. What is more, it is rather joyful that when you do, these extraordinarily powerful tech companies change their products in the way that you want them to.

This is extremely hard-fought ground that we must not lose. It takes us to what feels like a familiar refrain from the Online Safety Act and the Digital Markets, Competition and Consumers Bill, which we are all still engaged in: the question of whether you need to write something in the Bill and whether, by doing so, you make it more clear or less clear.

Does my noble friend the Minister agree with the fundamental principle, enshrined in the Data Protection Act 2018, that children deserve a higher bar of protection in the online world and that children’s data needs to be protected at a much higher level? If we can all agree on that principle first, then the question is: how do we make sure that this Bill does not weaken the protection that children have?

I am trying to remember on which side of the “put it in the Bill or not” debate I have been during discussions on each of the digital Bills that we have all been working on over the last couple of years. We have a really vicious problem where, as I understand it, the Government keep insisting that the Bill does not water down data protection and therefore there is no need to write anything into it to protect children’s greater rights. On the other hand, I also hear that it will remove bureaucracy and save businesses a lot of money. I have certainly been in rooms over the last couple of years where business representatives have told me, not realising I was one of the original signatories to the amendment that created the age-appropriate design code, how dreadful it was because it made their lives much more complicated.

I have no doubt that if we create a sense—which is what it is—that companies do not need to do quite as much as they used to for children in this area, that sense will create, if not a wide-open door, an ajar door that enables businesses to walk through and take the path of least resistance, which is doing less to protect children. That is why, in this case, I come down on the side of wanting to put it explicitly in the Bill, in whatever wording my noble friend the Minister thinks appropriate, that we are really clear that this creates no change at all in the approach for children and children’s data.

That is what this group of amendments is about. I know that we will come back to a whole host of other areas where there is a risk that children’s data could be handled differently from the way envisaged in that hard-fought battle for the age-appropriate design code but, on this group alone, it would be helpful if my noble friend the Minister could help us establish that firm principle and commit to coming back with wording that will firmly establish it in the Bill.

Lord Clement-Jones (LD)

My Lords, I keep getting flashbacks. This one is to the Data Protection Act 2018, although I think it was 2017 when we debated it. It is one of the huge achievements of the noble Baroness, Lady Kidron, to have introduced, and persuaded the Government to introduce, the age-appropriate design code into the Act, and—as she and the noble Baroness, Lady Harding, described—to see it spread around the world and become the gold standard. It is hardly surprising that she is so passionate about wanting to make sure that the Bill does not water down the data rights of children.

I think the most powerful amendment in this group is Amendment 290. For me, it absolutely bottles what we need to do in making sure that nothing in the Bill waters down children’s rights. If I were to choose one of the noble Baroness’s amendments in this group, it would be that one: it would absolutely give the assurance and scotch the point about legal uncertainty created by the Bill.

Both noble Baronesses asked: if the Government are not watering down the Bill, why can they not say that they are not? Why can they not, in a sense, repeat the words of Paul Scully when he was debating the Bill? He said:

“We are committed to protecting children and young people online. The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.


He uses “our”, so he is taking full ownership of it. He went on:

“Any breach of our data protection laws will result in enforcement action by the Information Commissioner’s Office”.—[Official Report, Commons, 17/4/23; col. 101.]


I would love that enshrined in the Bill. It would give us a huge amount of assurance.

Lord Bassam of Brighton (Lab)

My Lords, we on the Labour Benches have become co-signatories to the amendments tabled by the noble Baroness, Lady Kidron, and supported by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding. The noble Baroness set out very clearly and expertly the overarching purpose of retaining the level of protection currently afforded by the Data Protection Act 2018. Amendments 2 and 3 specifically stipulate that, where data controllers know, or should reasonably know, that a user is a child, they should be given the data protection codified in that Act. Amendment 9 takes it a stage further and includes children’s data in the definition of sensitive personal data, and gives it the benefit of being treated to a heightened level of protection—quite rightly, too. Finally, Amendment 290—the favourite of the noble Lord, Lord Clement-Jones—attempts to hold Ministers to the commitment made by Paul Scully in the Commons to maintain existing standards of data protection carried over from that 2018 Act.

Why is all this necessary? I suspect that the Minister will argue that it is not needed because Clause 5 already provides for the Secretary of State to consider the impact of any changes to the rights and freedoms of individuals and, in particular, of children, who require special protection.

We disagree with that argument. In the interests of brevity and the spirit of the recent Procedure Committee report, which says that we should not repeat each other’s arguments, I do not intend to speak at length, but we have a principal concern: to try to understand why the Government want to depart from the standards of protection set out in the age-appropriate design code—the international gold standard—which they so enthusiastically signed up to just five or six years ago. Given the rising levels of parental concern over harmful online content and well-known cases highlighting the harms that can flow from unregulated material, why do the Government consider it safe to water down the regulatory standards at this precise moment in time? The noble Baroness, Lady Kidron, valuably highlighted the impact of the current regulatory framework on companies’ behaviour. That is exactly what legislation is designed to do: to change how we look at things and how we work. Why change that? As she has argued very persuasively, it is and has been hugely transformative. Why throw away that benefit now?

My attention was drawn to one example of what can happen by a briefing note from the 5Rights Foundation. As it argued, children are uniquely vulnerable to harm and risk online. I thought its set of statistics was really interesting. By the age of 13, 72 million data points have already been collected about children. They are often not used in children’s best interests; for example, the data is often used to feed recommender systems and algorithms designed to keep attention at all costs and have been found to push harmful content at children.

When this happens repeatedly over time, it can have catastrophic consequences, as we know. The coroner in the Molly Russell inquest found that she had been recommended a stream of depressive content by algorithms, leading the coroner to rule that she

“died from an act of self-harm whilst suffering from depression and the negative effects of online content”.

We do not want more Molly Russell cases. Progress has already been made in this field; we should consider dispensing with it at our peril. Can the Minister explain today the thinking and logic behind the changes that the Government have brought forward? Can he estimate the impact that the new lighter-touch regime, as we see it, will have on child protection? Have the Government consulted extensively with those in the sector who are properly concerned about child protection issues, and what sort of responses have the Government received?

Finally, why have the Government decided to take a risk with the sound framework that was already in place and built on during the course of the Online Safety Act? We need to hear very clearly from the Minister how they intend to engage with groups that are concerned about these child protection issues, given the apparent loosening of the current framework. The noble Baroness, Lady Harding, said that this is hard-fought ground; we intend to continue making it so because these protections are of great value to our society.

Viscount Camrose (Con)

I am grateful to the noble Baroness, Lady Kidron, for her Amendments 2, 3, 9 and 290 and to all noble Lords who have spoken, as ever, so clearly on these points.

All these amendments seek to add protections for children to various provisions in the Bill. I absolutely recognise the intent behind them; indeed, let me take this opportunity to say that the Government take child safety deeply seriously and agree with the noble Baroness that all organisations must take great care, both when making decisions about the use of children’s data and throughout the duration of their processing activities. That said, I respectfully submit that these amendments are not necessary for three main reasons; I will talk in more general terms before I come to the specifics of the amendments.

First, the Bill maintains a high standard of data protection for everybody in the UK, including—of course—children. The Government are not removing any of the existing data protection principles in relation to lawfulness, fairness, transparency, purpose limitation, data minimisation, storage limitation, accuracy, data security or accountability; nor are they removing the provisions in the UK GDPR that require organisations to build privacy into the design and development of new processing activities.

The existing legislation acknowledges that children require specific protection for their personal data, as they may be less aware of the risks, consequences and safeguards concerned, and of their rights in relation to the processing of personal data. Organisations will need to make sure that they continue to comply with the data protection principles on children’s data and follow the ICO’s guidance on children and the UK GDPR, following the changes we make in the Bill. Organisations that provide internet services likely to be accessed by children will need to continue to comply with their transparency and fairness obligations and the ICO’s age-appropriate design code. The Government welcome the AADC, as Minister Scully said, and remain fully committed to the high standards of protection that it sets out for children.

Secondly, some of the provisions in the Bill have been designed specifically with the rights and safety of children in mind. For example, one reason that the Government introduced the new lawful ground of recognised legitimate interest in Clause 5, which we will debate later, was that some consultation respondents said that the current legislation can deter organisations, particularly in the voluntary sector, from sharing information that might help to prevent crime or protect children from harm. The same goes for the list of exemptions to the purpose limitation principle introduced by Clause 6.

There could be many instances where personal data collected for one purpose may have to be reused to protect children from crime or safeguarding risks. The Bill will provide greater clarity around this and has been welcomed by stakeholders, including in the voluntary sector.

17:31
Sitting suspended for Divisions in the House.
18:12
Viscount Camrose (Con)

While some provisions in the Bill do not specifically mention children or children’s rights, data controllers will still need to carefully consider the impact of their processing activities on children. For example, the new obligations on risk assessments, record keeping and the designation of senior responsible individuals will apply whenever an organisation’s processing activities are likely to result in high risks to people, including children.

Thirdly, the changes we are making in the Bill must be viewed in a wider context. Taken together, the UK GDPR, the Data Protection Act 2018 and the Online Safety Act 2023 provide a comprehensive legal framework for keeping children safe online. Although the data protection legislation and the age-appropriate design code make it clear how personal data can be processed, the Online Safety Act makes clear that companies must take steps to make their platforms safe by design. It requires social media companies to protect children from illegal, harmful and age-inappropriate content, to ensure they are more transparent about the risks and dangers posed to children on their sites, and to provide parents and children with clear and accessible ways to report problems online when they do arise.

After those general remarks, I turn to the specific amendments. The noble Baroness’s Amendments 2 and 3 would amend Clause 1 of the Bill, which relates to the test for assessing whether data is personal or anonymous. Her explanatory statement suggests that these amendments are aimed at placing a duty on organisations to determine whether the data they are processing relates to children, thereby creating a system of age verification. However, requiring data controllers to carry out widespread age verification of data subjects could create its own data protection and privacy risks, as it would require them to retain additional personal information such as dates of birth.

The test we have set out for reidentification is intended to apply to adults and children alike. If any person is likely to be identified from the data using reasonable means, the data protection legislation will apply. Introducing one test for adults and one for children is unlikely to be workable in practice and fundamentally undermines the clarity that this clause seeks to bring to organisations. Whether a person is identifiable will depend on a number of objective factors, such as the resources and technology available to organisations, regardless of whether they are an adult or a child. Creating wholly separate tests for adults and children, as set out in the amendment, would add unnecessary complexity to the clause and potentially lead to confusion.

Lord Bassam of Brighton (Lab)

As I understand it, the basis on which we currently operate is that children get a heightened level of protection. Is the Minister saying that that is now unnecessary and is captured by the way in which the legislation has been reframed?

Viscount Camrose (Con)

I am saying, specifically on Clause 1, that separating the identifiability of children and the identifiability of adults would be detrimental to both but particularly, in this instance, to children.

Amendment 9 would ensure that children’s data is included in the definition of special category data and is subject to the heightened protections afforded to this category of data by Article 9 of the UK GDPR. This could have unintended consequences, because the legal position would be that processing of children’s data would be banned unless specifically permitted. This could create the need for considerable additional legislation to exempt routine and important processing from the ban; for example, banning a Girl Guides group from keeping a list of members unless specifically exempted would be disproportionate. However, more sensitive data, such as records relating to children’s health or safeguarding concerns, would already be subject to heightened protections under the UK GDPR as soon as it is processed.

I am grateful to the noble Baroness, Lady Kidron, for raising these issues and for the chance to set out why the Government feel that children’s protection is at least maintained, if not enhanced. I hope my answers have, for the time being, persuaded her of the Government’s view that the Bill does not reduce standards of protection for children’s data. On that basis, I ask her also not to move her Amendment 290 on the grounds that a further overarching statement on this is unnecessary and may cause confusion when interpreting the legislation. For all the reasons stated above, I hope that she will now reconsider whether her amendments in this group are necessary and agree not to press them.

Lord Bassam of Brighton (Lab)

Can I press the Minister more on Amendment 290 from the noble Baroness, Lady Kidron? All it does is seek to maintain the existing standards of data protection for children, as carried over from the 2018 Act. If that is all it does, what is the problem with that proposed new clause? In its current formulation, does it not put the intention of the legislation in a place of certainty? I do not quite get why it would be damaging.

Viscount Camrose (Con)

I believe it restates what the Government feel is clearly implied or stated throughout the Bill: that children’s safety is paramount. Therefore, putting it there is either duplicative or confusing; it reduces the clarity of the Bill. In no way is this to say that children are not protected—far from it. The Government feel it would diminish the clarity and overall cohesiveness of the Bill to include it.

Lord Clement-Jones (LD)

My Lords, not to put too fine a point on it, the Minister is saying that nothing in the Bill diminishes children’s rights, whether in Clause 1, Clause 6 or the legitimate interest in Clause 5. He is saying that absolutely nothing in the Bill diminishes children’s rights in any way. Is that his position?

Baroness Harding of Winscombe (Con)

Can I add to that question? Is my noble friend the Minister also saying that there is no risk of companies misinterpreting the Bill’s intentions and assuming that this might be some form of diminution of the protections for children?

Viscount Camrose (Con)

In answer to both questions, what I am saying is that, first, any risk of misinterpreting the Bill with respect to children’s safety is diminished, rather than increased, by the Bill. Overall, it is the Government’s belief and intention that the Bill in no way diminishes the safety or privacy of children online. Needless to say, if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data.

Lord Bassam of Brighton (Lab)

My Lords, that creates another question, does it not? If that is the case, why amend the original wording from the 2018 Act?

Viscount Camrose (Con)

Sorry, the 2018 Act? Or is the noble Lord referring to the amendments?

Lord Bassam of Brighton (Lab)

Why change the wording that provides the protection that is there currently?

Viscount Camrose (Con)

I assume the noble Lord is referring to Amendment 290.

Viscount Camrose (Con)

Okay. The Government feel that, in terms of the efficient and effective drafting of the Bill, that paragraph diminishes the clarity by being duplicative rather than adding to it by making a declaration. For the same reason, we have chosen not to make a series of declarations about other intentions of the Bill overall in the belief that the Bill’s intent and outcome are protected without such a statement.

Baroness Kidron (CB)

My Lords, before our break, the noble Baroness, Lady Harding, said that this is hard-fought ground; I hope the Minister understands from the number of questions he has just received during his response that it will continue to be hard-fought ground.

I really regret having to say this at such an early stage on the Bill, but I think that some of what the Minister said was quite disingenuous. We will get to it in other parts of the Bill, but the thing that we have all agreed to disagree on at this point is the statement that the Bill maintains data privacy for everyone in the UK. That is a point of contention between noble Lords and the Minister. I absolutely accept and understand that we will come to a collective view on it in Committee. However, the Minister appeared to suggest—I ask him to correct me if I have got this wrong—that the changes on legitimate interest and purpose limitation are child safety measures because some people are saying that they are deterred from sharing data for child protection reasons. I have to tell him that they are not couched or formed like that; they are general-purpose shifts. There is absolutely no question but that the Government could have made specific changes for child protection, put them in the Bill and made them absolutely clear. I find that very worrying.

I also find it worrying, I am afraid—this is perhaps where we are heading and the thing that many organisations are worried about—that bundling the AADC in with the Online Safety Act and saying, “I’ve got it over here so you don’t need it over there”, is not the same as maintaining a high level of data protection for children. It is not the same set of things. I specifically said that this was not an age-verification measure and would not require it; whatever response there was on that was therefore unnecessary because I made that quite clear in my remarks. The Committee can understand that, in order to set a high bar of data protection, you must either identify a child or give it to everyone. Those are your choices. You do not have to verify.

I will withdraw the amendment, but I must say that the Government may not have it both ways. The Bill cannot be different or necessary and at the same time do nothing. The piece that I want to leave with the Committee is that it is the underlying provisions that allow the ICO to take action on the age-appropriate design code. It does not matter what is in the code; if the underlying provisions change, so does the code. During Committee, I expect that there will be a report on the changes that have happened all around the world as a result of the code, and we will be able to measure whether the new Bill would be able to create those same changes. With that, I beg leave to withdraw my amendment.

Amendment 2 withdrawn.
Amendments 3 to 5 not moved.
Clause 1 agreed.
Clause 2: Meaning of research and statistical purposes
Amendment 6
Moved by
6: Clause 2, page 4, line 8, leave out from “study” to end of line 9
Member’s explanatory statement
This amendment would ensure all uses under this Clause are in the public interest, however they may be described.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I am going to get rather used to introducing a smorgasbord of probing amendments and stand part notices throughout most of the groups of amendments as we go through them. Some of them try to find out the meaning of areas in the Bill and others are rather more serious and object to whole clauses.

I am extremely sympathetic to the use of personal data for research purposes, but Clause 2, which deals with research, is rather deceptive in many ways. That is because “scientific research” and “scientific research purposes” will now be defined to mean

“any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.

The rub lies in the words “commercial or non-commercial activity”. A loosening of requirements on purpose limitation will assist commercial and non-commercial organisations in research and reusing personal data obtained from third parties but will do nothing to increase protection for individual data subjects in these circumstances. That is the real Pandora’s box that we are opening as regards commercial activity. It opens the door to Meta to use our personal data for its own purposes under the guise of research. That seems very much to be a backward step. That is why I tabled Amendment 6, which would require the public interest to apply to all uses under this clause, not just public health uses.

Then there is the question of consent under Clause 3. How is the lawful and moral right of patients, constituents or data subjects to dissent from medical research, for instance, enshrined in this clause? We have seen enough issues relating to health data, opt-outs and so on to begin to destroy public trust, if we are not careful. We have to be extremely advertent to the fact that the communications have to be right; there has to be the opportunity to opt out.

In these circumstances, Amendment 7 would provide that a data subject has been given the opportunity to express dissent or an objection and has not so expressed it. That is then repeated in Clause 26. Again, we are back to public trust: we are not going to gain it. I am very much a glass-half-full person as far as new technology, AI and the opportunities for the use of patient data in the health service are concerned. I am an enthusiast for that, but it has to be done in the right circumstances.

18:30
We come to Clause 6. The Explanatory Notes say:
“Subsection (3) clarifies that meeting a condition under Article 8A for further processing does not permit controllers to continue relying on the same lawful basis under Article 6(1) that they relied on for their original purpose if that basis is no longer valid for the new purpose”.
“Clarifies” is a bit of a weasel word. Is the Minister going to tell me that everything is absolutely fine and all we have done is import the recital into the Bill? That tells me again that we could have done it through guidance and there was absolutely no need to have legislation.
You have to read the runes with this Bill. We will be saying to the Minister all the time, as we go through almost every single clause, “Really? Is that the position?” He will be horrified to hear this, but his words will be pored over by legions of data protection lawyers as we go through. I assure him that he will have bags of correspondence to contend with due to the technicalities of this Bill; I do not think that it is like any other Bill I have ever had to deal with, partly because the nature of the amendments being made in the Bill to the original Act and to the GDPR makes it so complicated. I have followed and responded to the consultation, so I am not ignorant about what the Government have ostensibly said about the changes they want to make, but I really would like to know what the Minister thinks the purpose of Clause 6 is and whether it is simply a clarification or a body of new law in these circumstances.
I know that this is a bit like the prosecution: the Minister will protest his innocence throughout the passage of the Bill, with “Not me, guv” or something to that effect. I look forward to his reply, but I think we will really have to dig under the surface as we go through. I very much hope that the Minister can clarify whether this is new. I certainly believe that the addition of commercial purposes is potentially extremely dangerous, needs to be qualified and is novel. I beg to move.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I speak to Amendments 8, 21, 23 and 145 in my name and thank the other noble Lords who have added their names to them. In the interests of brevity, and as the noble Lord, Lord Clement-Jones, has done some of the heavy lifting on this, I will talk first to Amendment 8.

The definition of scientific research has been expanded to include commercial and non-commercial activity, so far as it

“can reasonably be described as scientific”,

but “scientific” is not defined. As the noble Lord said, there is no public interest requirement, so a commercial company can, in reality, develop almost any kind of product on the basis that it may have a scientific purpose, even—or maybe especially—if it measures your propensity to impulse buy or other commercial things. The spectrum of scientific inquiry is almost infinite. Amendment 8 would exclude children simply by adding proposed new paragraph (e), which says that

“the data subject is not a child or could or should be known to be a child”,

so that their personal data cannot be used for scientific research purposes to which they have not given their consent.

I want to be clear that I am pro-research and understand the critical role that data plays in enabling us to understand societal challenges and innovate towards solutions. Indeed, I have signed the amendment in the name of the noble Lord, Lord Bethell, which would guarantee access to data for academic researchers working on matters of public interest. Some noble Lords may have been here last night, when the US Surgeon General Vice Admiral Dr Murthy, who gave the Lord Speaker’s lecture, made a fierce argument in favour of independent public interest research, not knowing that such a proposal has been laid. I hope that, when we come to group 17, the Government heed his wise words.

In the meantime, Clause 3 simply embeds the inequality of arms between academics and corporates and extends it, making it much easier for commercial companies to use personal data for research while academics continue to be held to much higher ethical and professional standards. They continue to require express consent, DBS checks and complex ethical requirements. Not doing so, and simply using personal data for research, is unethical, yet commercial players can rely on Clause 3 to process data without consent, in pursuit of profit. Like the noble Lord, Lord Clement-Jones, I would prefer an overall solution to this but, in its absence, this amendment would protect data from being commoditised in this way.

Amendments 21 and 23 would specifically protect children from changes to Clause 6. I have spoken on this a little already, but I would like it on the record that I am absolutely in favour of a safeguarding exemption. The additional purposes, which are compatible with but go beyond the original purpose, are not a safeguarding measure. Amendment 21 would amend the list of factors that a data controller must take into account to include the fact that children are entitled to a higher standard of protection.

Amendment 23 would not be necessary if Amendment 22 were agreed. It would commit the Secretary of State to ensuring that, when exercising their power under new Article 8A, as inserted by Clause 6(5), to add, vary or omit provisions of Annex 2, they take the 2018 Act and children’s data protection into account.

Finally, Amendment 145 proposes a code of practice on the use of children’s data in scientific research. This code would, in contrast, ensure that all researchers, commercial or in the public interest, are held to the same high standards by developing detailed guidance on the use of children’s data for research purposes. A burning question for researchers is how to properly research children’s experience, particularly regarding the harms defined by the Online Safety Act.

Proposed new subsection (1) sets out the broad headings that the ICO must cover to promote good practice. Proposed new subsection (2) confirms that the ICO must have regard to children’s rights under the UNCRC, and that they are entitled to a higher standard of protection. It would also ensure that the ICO consulted with academics, those who represent the interests of children and data scientists. There is something of a theme here: if the changes to UK GDPR did not diminish data subjects’ privacy and rights, there would be no need for amendments in this group. If there were a code for independent public research, as is so sorely needed, the substance of Amendment 145 could usefully form part of it. If commercial companies can extend scientific research that has no definition, and if the Bill expands the right to further processing and the Secretary of State can unilaterally change the basis for onward processing, can the Minister explain, when he responds, how he can claim that the Bill maintains protections for children?

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, I will be brief because I associate myself with everything that the noble Baroness, Lady Kidron, just said. This is where the rubber hits the road from our previous group. If we all believe that it is important to maintain children’s protection, I hope that my noble friend the Minister will be able to accept if not the exact wording of the children-specific amendments in this group then the direction of travel—and I hope that he will commit to coming back and working with us to make sure that we can get wording into the Bill.

I am hugely in favour of research in the private sector as well as in universities and the public sector; we should not close our minds to that at all. We need to be realistic that all the meaningful research in AI is currently happening in the private sector, so I do not want to close that door at all, but I am extremely uncomfortable with a Secretary of State having the ability to amend access to personal data for children in this context. It is entirely sensible to have a defined code of conduct for the use of children’s data in research. We have real evidence that a code of conduct setting out how to protect children’s rights and data in this space works, so I do not understand why it would not be a good idea to do research if we want the research to happen but we want children’s rights to be protected at a much higher level.

It seems to me that this group is self-evidently sensible, in particular Amendments 8, 22, 23 and 145. I put my name to all of them except Amendment 22 but, the more I look at the Bill, the more uncomfortable I get with it; I wish I had put my name to Amendment 22. We have discussed Secretary of State powers in each of the digital Bills that we have looked at and we know about the power that big tech has to lobby. It is not fair on Secretaries of State in future to have this ability to amend—it is extremely dangerous. I express my support for Amendment 22.

Lord Davies of Brixton Portrait Lord Davies of Brixton (Lab)
- Hansard - - - Excerpts

I just want to say that I agree with what the previous speakers have said. I particularly support Amendment 133; in effect, I have already made my speech on it. At that stage, I spoke about pseudonymised data but I focused my remarks on scientific research. Clearly, I suspect that the Minister’s assurances will not go far enough, although I do not want to pre-empt what he says and I will listen carefully to it. I am sure that we will have to return to this on Report.

I make a small additional point: I am not as content as the noble Baroness, Lady Harding of Winscombe, about commercial research. Different criteria apply; if we look in more detail at ensuring that research data is protected, there may be special factors relating to commercial research that need to be covered in a potential code of practice or more detailed regulations.

18:45
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am grateful to all noble Lords who have spoken on this group. Amendment 6 to Clause 2, tabled by the noble Lord, Lord Clement-Jones, rightly tests the boundaries on the use of personal data for scientific research and, as he says, begins to ask, “What is the real purpose of this clause? Is it the clarification of existing good practice or is it something new? Do we fully understand what that new proposition is?”

As he said, there is particular public concern about the use of personal health data where it seems that some private companies are stretching the interpretation of “the public good”, for which authorisation for the use of this data was initially freely given, to something much wider. Although the clause seeks to provide some reassurance on this, we question whether it goes far enough and whether there are sufficient protections against the misuse of personal health data in the way the clause is worded.

This raises the question of whether it is only public health research that needs to be in the public interest, which is the way the clause is worded at the moment, because it could equally apply to research using personal data from other public services, such as measuring educational outcomes or accessing social housing. There is a range of uses for personal data. In an earlier debate, we heard about the plethora of data already held on people, much of which individuals do not understand or know about and which could be used for research or to make judgments about them. So we need to be sensitive about the way this might be used. It would be helpful to hear from the Minister why public health research has been singled out for special attention when, arguably, it should be a wider right across the board.

Noble Lords have asked questions about the wider concerns around Clause 2, which could enable private companies to use personal data to develop new products for commercial benefit without needing to inform the data subjects. As noble Lords have said, this is not what people would normally expect to be described as “scientific research”. The noble Baroness, Lady Kidron, was quite right that it has the potential to be unethical, so we need some standards and some clear understanding of what we mean by “scientific research”.

That is particularly important for Amendments 7 and 132 to 134 in the name of the noble Lord, Lord Clement-Jones, which underline the need for data subjects to be empowered and given the opportunity to object to their data being used for a new purpose. Arguably, without these extra guarantees—particularly because there is a lack of trust about how a lot of this information is being used—data subjects will be increasingly reluctant to hand over personal data on a voluntary basis in the first place. It may well be that this is an area where the Information Commissioner needs to provide additional advice and guidance to ensure that we can reap the benefits of good-quality scientific research that is in the public interest and in which the citizens involved can have absolute trust. Noble Lords around the Room have stressed that point.

Finally, we have added our names to the amendments tabled by the noble Baroness, Lady Kidron, on the use of children’s data for scientific research. As she rightly points out, the 2018 Act gave children a higher standard of protection on the uses for which their data is collected and processed. It is vital that this Bill, for all its intents to simplify and water down preceding rights, does not accidentally put at risk the higher protection agreed for children. In the earlier debate, the Minister said that he believed it will not do so. I am not sure that “believe” is a strong enough word here; we need guarantees that go beyond that. I think that this is an issue we will come back to again and again in terms of what is in the Bill and what guarantees exist for that protection.

In particular, there is a concern that relaxing the legal basis on which personal data can be processed for scientific research, including privately funded research carried out by commercial entities, could open the door for children’s data to be exploited for commercial purposes. We will consider the use of children’s data collected in schools in our debate on a separate group but we clearly need to ensure that the handling of pupils’ data by the Department for Education and the use of educational apps by private companies do not lead to a generation of exploited children who are vulnerable to direct marketing and manipulative messaging. The noble Baroness’s amendments are really important in this regard.

I also think that the noble Baroness’s Amendment 145 is a useful initiative to establish a code of practice on children’s data and scientific research. It would give us an opportunity to balance the best advantages of children’s research, which is clearly in the public and personal interest, with the maintenance of the highest level of protection from exploitation.

I hope that the Minister can see the sense in these amendments. In particular, I hope that he will take forward the noble Baroness’s proposals and agree to work with us on the code of practice principles and to put something like that in the Bill. I look forward to his response.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for this series of amendments.

I will first address Amendment 6, which seeks to amend Clause 2. As the noble Lord said, the definitions created by Clause 2, including “scientific research purposes”, are based on the current wording in recital 159 to the UK GDPR. We are changing not the scope of these definitions but their legal status. This amendment would require individual researchers to assess whether their research should be considered to be in the public interest, which could create uncertainty in the sector and discourage research. This would be more restrictive than the current position and would undermine the Government’s objectives to facilitate scientific research and empower researchers.

We have maintained a flexible scope as to what is covered by “scientific research” while ensuring that the definition is still sufficiently narrow in that it can cover only what would reasonably be seen as scientific research. This is because the legislation needs to be able to adapt to the emergence of new areas of innovative research. Therefore, the Government feel that it is more appropriate for the regulator to add more nuance and context to the definition. This includes the types of processing that are considered—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I am sorry to interrupt but it may give the Box a chance to give the Minister a note on this. Is the Minister saying that recital 159 includes the word “commercial”?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am afraid I do not have an eidetic memory of recital 159, but I would be happy to—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

That is precisely why I ask this question in the middle of the Minister’s speech to give the Box a chance to respond, I hope.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Researchers must also comply with the required safeguards to protect individuals’ privacy. All organisations conducting scientific research, including those with commercial interests, must also meet all the safeguards for research laid out in the UK GDPR and comply with the legislation’s core principles, such as fairness and transparency. Clause 26 sets out several safeguards that research organisations must comply with when processing personal data for research purposes. The ICO will update its non-statutory guidance to reflect many of the changes introduced by this Bill.

Scientific research currently holds a privileged place in the data protection framework because, by its nature, it is already viewed as generally being in the public interest. As has been observed, the Bill already applies a public interest test to processing for the purpose of public health studies in order to provide greater assurance for research that is particularly sensitive. Again, this reflects recital 159.

In response to the noble Baroness, Lady Jones, on why public health research is being singled out, as she stated, this part of the legislation just adds an additional safeguard to studies into public health ensuring that they must be in the public interest. This does not limit the scope for other research unrelated to public health. Studies in the area of public health will usually be in the public interest. For the rare, exceptional times that a study is not, this requirement provides an additional safeguard to help prevent misuse of the various exemptions and privileges for researchers in the UK GDPR. “Public interest” is not defined in the legislation, so the controller needs to make a case-by-case assessment based on its purposes.

On the point made by the noble Lord, Lord Clement-Jones, about recitals and ICO guidance, although we of course respect and welcome ICO guidance, it does not have legislative effect and does not provide the certainty that legislation does. That is why we have placed these provisions in the Bill itself.

Amendment 7 to Clause 3 would undermine the broad consent concept for scientific research. Clause 3 places the existing concept of “broad consent”, currently found in recital 33 to the UK GDPR, on a statutory footing with the intention of improving awareness and confidence for researchers. This clause applies only to scientific research processing that is reliant on consent. It already contains various safeguards. For example, broad consent can be used only where it is not possible to identify at the outset the full purposes for which personal data might be processed. Additionally, to give individuals greater agency, where possible individuals will have the option to consent to only part of the processing and can withdraw their consent at any time.

Clause 3 clarifies an existing concept of broad consent which outlines how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. This will enable consent to be obtained for an area of scientific research when researchers cannot at the outset identify fully the purposes for which they are collecting the data. For example, the initial aim may be the study of cancer, but it later becomes the study of a particular cancer type.

Furthermore, as part of the reforms around the reuse of personal data, we have further clarified that when personal data is originally collected on the basis of consent, a controller would need to get fresh consent to reuse that data for a new purpose, unless a public interest exemption applies and it is unreasonable to expect the controller to obtain that consent. A controller cannot generally reuse personal data originally collected on the basis of consent for research purposes.

Turning to Amendments 132 and 133 to Clause 26, the general rule described in Article 13(3) of the UK GDPR is that controllers must inform data subjects about a change of purposes, which provides an opportunity to withdraw consent or object to the proposed processing where relevant. There are existing exceptions to the right to object, such as Article 21(6) of the UK GDPR, where processing is necessary for research in the public interest, and in Schedule 2 to the Data Protection Act 2018, when applying the right would prevent or seriously impair the research. Removing these exemptions could undermine life-saving research and compromise long-term studies so that they are not able to continue.

Regarding Amendment 134, new Article 84B of the UK GDPR already sets out the requirement that personal data should be anonymised for research, archiving and statistical—RAS—purposes unless doing so would mean the research could not be carried through. Anonymisation is not always possible as personal data can be at the heart of valuable research, archiving and statistical activities, for example, in genetic research for the monitoring of new treatments of diseases. That is why new Article 84C of the UK GDPR also sets out protective measures for personal data that is used for RAS purposes, such as ensuring respect for the principle of data minimisation through pseudonymisation.

The stand part notice in this group seeks to remove Clause 6 and, consequentially, Schedule 2. In the Government’s consultation on data reform, Data: A New Direction, we heard that the current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. This has led to uncertainty about when controllers can reuse personal data, causing delays for researchers and obstructing innovation. Clause 6 and Schedule 2 address the existing uncertainty around reusing personal data by setting out clearly the conditions in which the reuse of personal data for a new purpose is permitted. Clause 6 and Schedule 2 must therefore remain to give controllers legal certainty and individuals greater transparency.

Amendment 22 seeks to remove the power to add to or vary the conditions set out in Schedule 2. These conditions currently constitute a list of specific public interest purposes, such as safeguarding vulnerable individuals, for which an organisation is permitted to reuse data without needing consent or to identify a specific law elsewhere in legislation. Since this list is strictly limited and exhaustive, a power is needed to ensure that it is kept up to date with future developments in how personal data is used for important public interest purposes.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I am interested that the safeguarding requirement is already in the Bill, so, in terms of children, which I believe the Minister is going to come to, the onward processing is not a question of safeguarding. Is that correct? As the Minister has just indicated, that is already a provision.

19:00
Sitting suspended for Divisions in the House.
19:34
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Just before we broke, I was on the verge of attempting to answer the question from the noble Baroness, Lady Kidron; I hope my coming words will do that, but she can intervene again if she needs to.

I turn to the amendments that concern the use of children’s data in research and reuse. Amendment 8 would also amend Clause 3; the noble Baroness suggests that the measure should not apply to children’s data, but this would potentially prevent children, or their parents or guardians, from agreeing to participate in broad areas of pioneering research that could have a positive impact on children, such as on the causes of childhood diseases.

On the point about safeguarding, the provisions on recognised legitimate interests and further processing are required for safeguarding children in order to comply with, respectively, the lawfulness and purpose limitation principles. The purpose limitation provision in this clause is meant for situations where the original processing purpose was not safeguarding and the controller then realises that there is a need to further process the data for safeguarding.

Research organisations are already required to comply with the data protection principles, including on fairness and transparency, so that research participants can make informed decisions about how their data is used; and, where consent is the lawful basis for processing, children, or their parents or guardians, are free to choose not to provide their consent, or, if they do consent, they can withdraw it at any time. In addition, the further safeguards that are set out in Clause 26, which I mentioned earlier, will protect all personal data, whether it relates to children or adults.

Amendment 21 would require data controllers to have specific regard to the fact that children’s data requires a higher standard of protection for children when deciding whether reuse of their data is compatible with the original purpose for which it was collected. This is unnecessary because the situations in which personal data could be reused are limited to public interest purposes designed largely to protect the public and children, in so far as they are relevant to them. Controllers must also consider the possible consequences for data subjects and the relationship between the controller and the data subject. This includes taking into account that the data subject is a child, in addition to the need to generally consider the interests of children.

Amendment 23 seeks to limit use of the purpose limitation exemptions in Schedule 2 in relation to children’s data. This amendment is unnecessary because these provisions permit further processing only in a narrow range of circumstances and can be expanded only to serve important purposes of public interest. Furthermore, it may inadvertently be harmful to children. Current objectives include safeguarding children or vulnerable people, preventing crime or responding to emergencies. In seeking to limit the use of these provisions, there is a risk that the noble Baroness’s amendments might make data controllers more hesitant to reuse or disclose data for public interest purposes and undermine provisions in place to protect children. These amendments could also obstruct important research that could have a demonstrable positive impact on children, such as research into children’s diseases.

Amendment 145 would require the ICO to publish a statutory code on the use of children’s data in scientific research and technology development. Although the Government recognise the value that ICO codes can play in promoting good practice and improving compliance, we do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by the new codes. Clause 33 of the Bill already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that it sees fit, so this is an issue that we could return to in the future if the evidence supports it.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I will read Hansard very carefully, because I am not sure that I absolutely followed the Minister, but we will undoubtedly come back to this. I will ask two questions. Earlier, before we had a break, in response to some of the early amendments in the name of the noble Lord, Lord Clement-Jones, the Minister suggested that several things were being taken out of the recital to give them solidity in the Bill; so I am using this opportunity to suggest that recital 38, which provides for special consideration of children’s data, might usefully be treated in a similar way and that we could then have a schedule that is the age-appropriate design code in the Bill. Perhaps I can leave that with the Minister, and perhaps he can undertake to have some further consultation with the ICO on Amendment 145 specifically.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

With respect to recital 38, that sounds like a really interesting idea. Yes, let us both have a look and see what the consultation involves and what the timing might look like. I confess to the Committee that I do not know what recital 38 says, off the top of my head. For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore not press them.

Returning to the questions by the noble Lord, Lord Clement-Jones, on the contents of recital 159, the current UK GDPR and EU GDPR are silent on the specific definition of scientific research. This does not preclude commercial organisations performing scientific research; indeed, the ICO’s own guidance on research and its interpretation of recital 159 already mention commercial activities. Scientific research can be done by commercial organisations—for example, much of the research done into vaccines, and the research into AI referenced by the noble Baroness, Lady Harding. The recital itself does not mention it but, as the ICO’s guidance is clear on this already, the Government feel that it is appropriate to put this on a statutory footing.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, that was intriguing. I thank the Minister for his response. It sounds as though, again, guidance would have been absolutely fine, but what is there not to like about the ICO bringing clarity? It was quite interesting that the Minister used the phrase “uncertainty in the sector” on numerous occasions and that is becoming a bit of a mantra as the Bill goes on. We cannot create uncertainty in the sector, so the poor old ICO has been labouring in the vineyard for the last few years to no purpose at all. Clearly there has been uncertainty in the sector of a major description, and all its guidance and all the work that it has put in over the years have been wholly fruitless, really. It is only this Government that have grabbed the agenda with this splendid 300-page data protection Bill that will clarify this for business. I do not know how much they will have to pay to get new compliance officers or whatever it happens to be, but the one thing that the Bill will absolutely not create is greater clarity.

I am a huge fan of making sure that we understand what the recitals have to say, and it is very interesting that the Minister is saying that the recital is silent but the ICO’s guidance is pretty clear on this. I am hugely attracted by the idea of including recital 38 in the Bill. It is another lightbulb moment from the noble Baroness, Lady Kidron, who has these moments, rather like with the age-appropriate design code, which was a huge one.

We are back to the concern, whether in the ICO guidance, the Bill or wherever, that scientific research needs to be in the public interest to qualify and not have all the consents that are normally required for the use of personal data. The Minister said, “Well, of course we think that scientific research is in the public interest; that is its very definition”. So why does only public health research need that public interest test and not the other aspects? Is it because, for instance, the opt-out was a bit of a disaster and 3 million people opted out of allowing their health data to be shared or accessed by GPs? Yes, it probably is.

Do the Government want a similar kind of disaster to happen, in which people get really excited about Meta or other commercial organisations getting hold of their data, a public outcry ensues and they therefore have to introduce a public interest test on that? What is sauce for the goose is sauce for the gander. I do not think that personal data should be treated in a particularly different way in terms of its public interest, just because it is in healthcare. I very much hope that the Minister will consider that.

19:45
I am a fan of the amendments tabled by the noble Baroness, Lady Kidron. The noble Baroness, Lady Harding, put her finger on it when talking about Amendment 22 in the name of the noble Baroness, Lady Jones. I do not like what is in the current clauses, but of greater importance is the Secretary of State’s power to extend this part of the Bill on what is covered by scientific research. I absolutely support Amendment 22. It is déjà vu all over again, as somebody once said. Basically, we are back to talking about the Secretary of State’s powers, as we did when discussing the Online Safety Bill and the digital markets Bill. This Government are addicted to giving the Secretary of State additional powers. It is like a fix that has to be taken. On any Bill produced by this Government in the last few years, there has been criticism by the Secondary Legislation Scrutiny Committee and the Delegated Powers Committee but it has fallen on deaf ears.
As the noble Baroness, Lady Harding, said, the lobbying power of big tech is the real danger here: you get into the department, have a chat with the Secretary of State and Bob’s your uncle. That is why we passed amendments to the digital markets Bill. We did not want a coach and horses driven through that Bill, and we do not want a coach and horses arriving in this Bill as a result of this power. There will be plenty more to argue about at later stages of the Bill, but in the meantime I beg leave to withdraw my amendment.
Amendment 6 withdrawn.
Clause 2 agreed.
Clause 3: Consent to processing for the purposes of scientific research
Amendments 7 and 8 not moved.
Clause 3 agreed.
Clause 4 agreed.
Amendment 9 not moved.
Amendment 10
Moved by
10: After Clause 4, insert the following new Clause—
““Data community”
In this Act, a “data community” means an entity established to facilitate the collective activation of data subjects’ data rights in Chapters III and VIII of the UK GDPR and members of a data community assign specific data rights to a nominated entity to exercise those rights on their behalf.”
Member’s explanatory statement
This amendment provides a definition of “data community”. It is one of a series of amendments that would establish the ability to assign data rights to a third party.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I hope this is another lightbulb moment, as the noble Lord, Lord Clement-Jones, suggested. As well as Amendment 10, I will speak to Amendments 35, 147 and 148 in my name and the names of the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones. I thank them both. The purpose of these amendments is to move the Bill away from nibbling around the edges of GDPR in pursuit of post-Brexit opportunities and to actually deliver a post-Brexit opportunity.

These amendments would put the UK on an enhanced path of data sophistication while not challenging equivalence, which we will undoubtedly discuss during the Committee. I echo the voice of the noble Lord, Lord Allan, who at Second Reading expressed deep concern that equivalence was not a question of an arrangement between the Government and the EU but would be a question picked up by data activists taking strategic litigation to the courts.

Data protection as conceived by GDPR and in this Bill is primarily seen as an arrangement between an individual and an entity that processes that data—most often a commercial company. But, as evidenced by the last 20 years, the real power lies in holding either vast swathes of general data, such as those used by LLMs, or large groups of specialist data such as medical scans. In short, the value—in all forms, not simply financial—lies in big data.

As the value of data became clear, ideas such as “data is the new oil” and data as currency emerged, alongside the notion of data fiduciaries or data trusts, where you can place your data collectively. One early proponent of such ideas was Jaron Lanier, inventor of virtual reality; I remember discussing it with him more than a decade ago. However, these ideas have not found widespread practical application, possibly because they are normally based around ideas of micropayments as the primary value—and very probably because they rely on data subjects gathering their data, so they are for the boffins.

During the passage of the DPA 2018, one noble Lord counted the number of times the Minister said the words “complex” and “complicated” while referring to the Bill. Data law is complex, and the complicated waterfall of its concepts and provisions eludes most non-experts. That is why I propose the four amendments in this group, which would give UK citizens access to data experts for matters that concern them deeply.

Amendment 10 would define the term “data community”, and Amendment 35 would give a data subject the power to assign their data rights to a data community for specific purposes and for a specific time period. Amendment 147 would require the ICO to set out a code of conduct for data communities, including guidance on establishing, operating and joining a data community, as well as guidance for data controllers and data processors on responding to requests made by data communities. Amendment 148 would require the ICO to keep a register of data communities, to make it publicly available and to ensure proper oversight. Together, they would provide a mechanism for non-experts—that is, any UK citizen—to assign their data rights to a community run by representatives that would benefit the entire group.

Data communities diverge from previous attempts to create big data for the benefit of users, in that they are not predicated on financial payments, nor does each data subject need to access their own data via the complex rules and often obstructive interactions with individual companies. They put rights holders together with experts who act on their behalf, by allowing data subjects to assign their rights so that an expert can gather the data and crunch it.

This concept is based on a piece of work done by a colleague of mine at the University of Oxford, Dr Reuben Binns, an associate professor in human-centred computing, in association with the Worker Info Exchange. Since 2016, individual Uber drivers, with help from their trade unions and the WIE, have asked Uber for the data showing their jobs, earnings, movements, waiting times and so on. It took many months of negotiation, conducted via data protection lawyers, as each driver individually asked for successive pieces of information that Uber, at first, resisted giving them and then, after litigation, provided.

After a period of time, a new cohort of drivers was recruited, and it was only when several hundred drivers were poised to ask the same set of questions that a formal arrangement was made between Uber and WIE, so that they could be treated as a single group and the data on all the drivers would be provided. This practical decision allowed Dr Binns to look at the data en masse. While an individual driver knew what they earned and where they were, what became visible when looking across several hundred drivers was how the algorithm reacted to those who refused a poorly paid job, who was assigned the lucrative airport runs, whether where you started impacted on your daily earnings, whether those who worked short hours were given less lucrative jobs, and so on.

This research project continues after several years and benefits from a bespoke arrangement that could, by means of these amendments, be strengthened and made an industry-wide standard with the involvement of the ICO. If it were routine, it would provide opportunity equally for challenger businesses, community groups and research projects. Imagine if a group of elderly people who spend a lot of time at home were able to use a data community to negotiate cheap group insurance, or imagine a research project where I might assign my data rights for the sole purpose of looking at gender inequality. A data community would allow any group of people to assign their rights, rights that are more powerful together than apart. This is doable—I have explained how it has been done. With these amendments, it would be routinely available, contractual, time-limited and subject to a code of conduct.

As it stands, the Bill is regressive for personal data rights and does not deliver the promised Brexit dividends. But there are great possibilities, without threatening adequacy, that could open markets, support innovation in the UK and make data more available to groups in society that rarely benefit from data law. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I think this is a lightbulb moment—it is inspired, and this suite of amendments fits together really well. I entirely agree with the noble Baroness, Lady Kidron, that this is a positive aspect. If the Bill contained these four amendments, I might have to alter my opinion of it—how about that for an incentive?

This is an important subject. It is a positive aspect of data rights. We have not got this right yet in this country. We still have great suspicion about sharing and access to personal data. There is almost a conspiracy theory around the use of data, the use of external contractors in the health service and so on, which is extremely unhelpful. If individuals were able to share their data with a trusted hub—a trusted community—that would make all the difference.

Like the noble Baroness, Lady Kidron, I have come across a number of influences over the years. I think the first time many of us came across the idea of data trusts or data institutions was in the Hall-Pesenti review carried out by Dame Wendy Hall and Jérôme Pesenti in 2017. They made a strong recommendation to the Government that they should start thinking about how to operationalise data trusts. Subsequently, organisations such as the Open Data Institute did some valuable research into how data trusts and data institutions could be used in a variety of ways, including in local government. Then the Ada Lovelace Institute did some very good work on the possible legal basis for data trusts and data institutions. Professor Irene Ng was heavily engaged in setting up what was called the “hub of all things”. I was not quite convinced by how it was going to work legally in terms of data sharing and so on, but in a sense we have now got to that point. I give all credit to the academic whom the noble Baroness mentioned. If he has helped us to get to this point, that is helpful. It is not that complicated, but we need full government backing for the ICO and the instruments that the noble Baroness put in her amendments, including regulatory oversight, because it will not be enough simply to have codes that apply. We have to have regulatory oversight.

20:00
For many of us, this could be the way to crack the public trust issue. It could be the way that the ordinary person deals with the asymmetry between their position as holders of personal data and the power of big tech, with social media trawling their data in the way that Shoshana Zuboff wrote about in her book, The Age of Surveillance Capitalism. It helps with that asymmetry, which we talked about in our debates on the digital markets Bill and which is so obviously the case here.
It is also the case with data capture, which is often extremely untransparent. There was a case in the paper the other day in which somebody had made a data subject access request and received 20,000 pages from Meta. They spent several weeks trying to work out exactly what it was because it was not in a user-friendly form. A data community aggregating the rights of data subjects could really help to crack exactly that sort of thing: it could debate and discuss this with social media companies and others that hold data in order, in a sense, to create protocols meaning that this data is used, categorised and downloaded in subject access requests in a much more user-friendly and transparent way.
There is a lot to like here. The noble Baroness has done the Government’s job for them; the drafting seems terrific to me. I am sure that the Minister will talk about the amendments being technically flawed and so on and so forth—that is bound to happen—but I think that they are pretty good and very much hope that he will give them a fair wind. I know that he will be personally enthusiastic about this.
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am also pleased to support these amendments in the name of the noble Baroness, Lady Kidron, to which I have added my name. I am hugely enthusiastic about them, too, and think that this has been a lightbulb moment from the noble Baroness. I very much thank her for doing all of this background work because she has identified the current weakness in the data protection landscape: it is currently predicated on an arrangement between an individual and the organisation that holds their data.

That is an inherently unbalanced power construct. As the noble Baroness said, as tech companies become larger and more powerful, it is not surprising that many individuals feel overwhelmed by the task of questioning or challenging those that are processing their personal information. It assumes a degree of knowledge about their rights and a degree of digital literacy, which we know many people do not possess.

In the very good debate that we had on digital exclusion a few weeks ago, it was highlighted that around 2.4 million people are unable to complete a single basic task to get online, such as opening an internet browser, and that more than 5 million employed adults cannot complete essential digital work tasks. These individuals cannot be expected to access their digital data on their own; they need the safety of a larger group to do so. We need to protect the interests of an entire group that would otherwise be locked out of the system.

The noble Baroness referred to the example of Uber drivers who were helped by their trade union to access their data, sharing patterns of exploitation and subsequently strengthening their employment package, but this does not have to be about just union membership; it could be about the interests of a group of public sector service users who want to make sure that they are not being discriminated against, a community group that wants its bid for a local grant to be treated fairly, and so on. We can all imagine examples of where this would work in a group’s interest. As the noble Baroness said, these proposals would allow any group of people to assign their rights—rights that are more powerful together than apart.

There could be other benefits; if data controllers are concerned about the number of individual requests that they are receiving for data information—and a lot of this Bill is supposed to address that extra work—group requests, on behalf of a data community, could provide economies of scale and make the whole system more efficient.

Like the noble Baroness, I can see great advantages from this proposal; it could lay the foundation for other forms of data innovation and help to build trust with many citizens who currently see digitalisation as something to fear—this could allay those fears. Like the noble Lord, Lord Clement-Jones, I hope the Minister can provide some reassurance that the Government welcome this proposal, take it seriously and will be prepared to work with the noble Baroness and others to make it a reality, because there is the essence of a very good initiative here.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Baroness, Lady Kidron, for raising this interesting and compelling set of ideas. I turn first to Amendments 10 and 35 relating to data communities. The Government recognise that individuals need to have the appropriate tools and mechanisms to easily exercise their rights under the data protection legislation. It is worth pointing out that current legislation does not prevent data subjects authorising third parties to exercise certain rights. Article 80 of the UK GDPR also explicitly gives data subjects the right to appoint not-for-profit bodies to exercise certain rights, including their right to bring a complaint to the ICO, to appeal against a decision of the ICO or to bring legal proceedings against a controller or processor and the right to receive compensation.

The concept of data communities exercising certain data subject rights is closely linked with the wider concept of data intermediaries. The Government recognise the existing and potential benefits of data intermediaries and are committed to supporting them. However, given that data intermediaries are new, we need to be careful not to distort the sector at such an early stage of development. As in many areas of the economy, officials are in regular contact with businesses, and the data intermediary sector is no different. One such engagement is the DBT’s Smart Data Council, which includes a number of intermediary businesses that advise the Government on the direction of smart data policy. The Government would welcome further and continued engagement with intermediary businesses to inform how data policy is developed.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I am sorry, but the Minister used a pretty pejorative word: “distort” the sector. What does he have in mind?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I did not mean to be pejorative; I merely point out that before embarking on quite a far-reaching policy—as noble Lords have pointed out—we would not want to jump the gun prior to consultation and researching the area properly. I certainly do not wish to paint a negative portrait.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

Is this one of those “in due course” moments?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

It is a moment at which I cannot set a firm date for a firm set of actions, but on the other hand I am not attempting to punt it into the long grass either. The Government do not want to introduce a prescriptive framework without assessing potential risks, strengthening the evidence base and assessing the appropriate regulatory response. For these reasons, I hope that for the time being the noble Baroness will not press these amendments.

The noble Baroness has also proposed Amendments 147 and 148 relating to the role of the Information Commissioner’s Office. Given my response just now to the wider proposals, these amendments are no longer necessary and would complicate the statute book. We note that Clause 35 already includes a measure that will allow the Secretary of State to request the Information Commissioner’s Office to publish a code on any matter that she or he sees fit, so this is an issue we could return to in future if such a code were deemed necessary.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I am sorry to keep interrupting the Minister. Can he give us a bit of a picture of what he has in mind? He said that he did not want to distort things at the moment, that there were intermediaries out there and so on. That is all very well, but is he assuming that a market will be developed or is developing? What overview of this does he have? In a sense, we have a very clear proposition here, which the Government should respond to. I am assuming that this is not a question just of letting a thousand flowers bloom. What is the government policy towards this? If you look at the Hall-Pesenti review and read pretty much every government response—including to our AI Select Committee, where we talked about data trusts and picked up the Hall-Pesenti review recommendations—you see that the Government have been pretty much positive over time when they have talked about data trusts. The trouble is that they have not done anything.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Overall, as I say and as many have said in this brief debate, this is a potentially far-reaching and powerful idea with an enormous number of benefits. But the fact that it is far-reaching implies that we need to look at it further. I am afraid that I am not briefed on long-standing—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

May I suggest that the Minister writes? On the one hand, he is saying that we will be distorting something—that something is happening out there—but, on the other hand, he is saying that he is not briefed on what is out there or what the intentions are. A letter unpacking all that would be enormously helpful.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am very happy to write on this. I will just say that I am not briefed on previous government policy towards it, dating back many years before my time in the role.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

It was a few Prime Ministers ago.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

It was even further. Yes, I am very happy to write on that. For the reasons I have set out, I am not able to accept these amendments for now. I therefore hope that the noble Baroness will withdraw her amendment.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I thank the co-signatories of my amendments for their enthusiasm. I will make three very quick points. First, the certain rights that the Minister referred to are complaints after the event when something has gone wrong, not positive rights. The second point of contention I have is whether these are so far-reaching. We are talking about people’s existing rights, and these amendments do not introduce any other right apart from access to put them together. It is very worrying that the Government would see these as a threat when data subjects put together their rights but not when commercial companies put together their data.

Finally, what is the Bill for? If it is not for creating a new and vibrant data protection system for the UK, I am concerned that it undermines a lot of existing rights and will not allow for a flourishing of uses of data. This is the new world: the world of data and AI. We have to have something to offer UK citizens. I would like the Minister to say that he will discuss this further, because it is not quite adequate to nay-say it. I beg leave to withdraw.

Amendment 10 withdrawn.
Committee adjourned at 8.14 pm.