Data Protection and Digital Information Bill Debate
Viscount Camrose (Conservative - Excepted Hereditary)
Department for Science, Innovation & Technology
(9 months ago)
Grand Committee

As I was saying, it is important for the framework on data protection that we take a precautionary approach. I hope that the Minister will this afternoon be able to provide a plain English explanation of the changes, as well as giving us an assurance that those changes to definitions do not result in watering down the current legislation.
We broadly support Amendments 1 and 5 and the clause stand part notice, in the sense that they provide additional probing of the Government’s intentions in this area. We can see that the noble Lord, Lord Clement-Jones, is trying with Amendment 1 to bring some much-needed clarity to the anonymisation issue and, with Amendment 5, to secure that data remains personal data in any event. I suspect that the Minister will tell us this afternoon that that is already the case, but a significant number of commentators have questioned this, since the definition of “personal data” is seemingly moving away from the EU GDPR standard towards a definition that is more subjective from the perspective of the controller, processor or recipient. We must be confident that the new definition does not narrow the circumstances in which the information is protected as personal data. That will be an important standard for this Committee to understand.
Amendment 288, tabled by the noble Lord, Lord Clement-Jones, seeks a review and an impact assessment of the anonymisation and identifiability of data subjects. Examining that in the light of the EU GDPR seems to us to be a useful and novel way of making a judgment over which regime better suits and serves data subjects.
We will listen with interest to the Minister’s response. We want to be more than reassured that the previous high standards and fundamental principles of data protection will not be undermined and compromised.
I thank all noble Lords who have spoken in this brief, interrupted but none the less interesting opening debate. I will speak to the amendments tabled by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones; I note that I plan to use that form of words quite a lot in the next eight sessions on this Bill. I thank them for tabling these amendments so that we can debate what are, in the Government’s view, the significant benefits of Clause 1.
In response to the points from the noble Lord, Lord Clement-Jones, on the appetite for the reforms in the Bill, we take very seriously the criticisms of the parties that he mentioned—the civil society groups—but it is important to note that, when the Government consulted on these reforms, we received almost 3,000 responses. At that time, we proposed to clarify when data would be regarded as anonymous and proposed legislating to confirm that the test for whether anonymous data can be reidentified is relative to the means available to the controller to reidentify the data. The majority of respondents agreed that greater clarity in legislation would indeed be beneficial.
As noble Lords will know, the UK’s data protection legislation applies only to personal data, which is data relating to an identified or identifiable living individual. It does not apply to non-personal, anonymous data. This is important because, if organisations can be sure that the data they are handling is anonymous, they may be able to more confidently put it to good use in important activities such as research and product development. The current data protection legislation is already clear that a person can be identified in a number of ways by reference to details such as names, identification numbers, location data and online identifiers, or via information about a person’s physical, genetic, mental, economic or cultural characteristics. The Bill does not change the existing legislation in this respect.
With regard to genetic information, which was raised by my noble friend Lord Kamall and the noble Lord, Lord Davies, any information that includes enough genetic markers to be unique to an individual is personal data and special category genetic data, even if names and other identifiers have been removed. This means that it is subject to the additional protections set out in Article 9 of the UK GDPR. The Bill does not change this position.
However, the existing legislation is unclear about the specific factors that a data controller must consider when assessing whether any of this information relates to an identifiable living person. This uncertainty is leading to inconsistent application of anonymisation and to anonymous data being treated as personal data out of an abundance of caution. This, in turn, reduces the opportunities for anonymous data to be used effectively for projects in the public interest. It is this difficulty that Clause 1 seeks to address by providing a comprehensive statutory test on identifiability. The test will require data controllers and processors to consider the likelihood of people within or outside their organisations reidentifying individuals using reasonable means. It is drawn from recital 26 of the EU GDPR and should therefore not be completely unfamiliar to most organisations.
I turn now to the specific amendments that have been tabled in relation to this clause. Amendment 1 in the name of the noble Lord, Lord Clement-Jones, would reiterate the position currently set out in the UK GDPR and its recitals: where individuals can be identified without the use of additional information because data controllers fail to put in place appropriate organisational measures, such as technical or contractual safeguards prohibiting reidentification, they would be considered directly identifiable. Technical and organisational measures put in place by organisations are factors that should be considered alongside others under new Section 3A of the Data Protection Act when assessing whether an individual is identifiable from the data being processed. Clause 1 sets out the threshold at which data is identifiable, and is therefore personal data, and clarifies when data is anonymous.
The technical capabilities of the respective data controller are already relevant factors under current law and ICO guidance in determining whether data is personal. This means that the test of identifiability is already a relative one today in respect of the data controller, the data concerned and the purpose of the processing. However, the intention of the data controller is not a relevant factor under current law, and nor does Clause 1 make it a factor. Clause 1 merely clarifies the position under existing law and follows very closely the wording of recital 26. Let me state this clearly: nothing in Clause 1 introduces the subjective intention of the data controller as a relevant factor in determining identifiability, and the position will remain the same as under the current law and as set out in ICO guidance.
In response to the points made by the noble Lord, Lord Clement-Jones, and others on pseudonymised personal data, noble Lords may be aware that the definition of personal data in Article 4(1) of the UK GDPR, when read in conjunction with the definition of pseudonymisation in Article 4(5), makes it clear that pseudonymised data is personal data, not anonymous data, and is thus covered by the UK’s data protection regime. I hope noble Lords are reassured by that. I also hope that, for the time being, the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment and not press the related Amendment 5, which seeks to make it clear that pseudonymised data is personal data.
Amendment 4 would require the Secretary of State, two months after the Bill’s passing, to assess the difference in meaning and scope between the current statutory definition of personal data and the new statutory definition that the Bill will introduce. Similarly, Amendment 288 seeks to review the impact of Clause 1 six months after the enactment of the Bill. The Government feel that neither of these amendments is necessary as the clause is drawn from recital 26 of the EU GDPR and case law and, as I have already set out, is not seeking to substantially change the definition of personal data. Rather, it is seeking to provide clarity in legislation.
I follow the argument, but what we are suggesting in our amendment is some sort of impact assessment for the scheme, including how it currently operates and how the Government wish it to operate under the new legislation. Have the Government undertaken a desktop exercise or any sort of review of how the two pieces of legislation might operate? Has any assessment of that been made? If they have done so, what have they found?
Obviously, the Bill has been in preparation for some time. I completely understand the point, which is about how we can be so confident in these claims. I suggest that I work with the Bill team to get an answer to that question and write to Members of the Committee, because it is a perfectly fair question to ask what makes us so sure.
Looking to the future, I can assure noble Lords that the Department for Science, Innovation and Technology will monitor and evaluate the impact of this Bill as a whole in the years to come, in line with cross-government evaluation guidance and through continued engagement with stakeholders.
The Government feel that the first limb of Amendment 5 is not necessary given that, as has been noted, pseudonymised data is already considered personal data under this Bill. In relation to the second limb of the amendment, if the data being processed is actually personal data, the ICO already has powers to require organisations to address non-compliance. These include requiring it to apply appropriate protections to personal data that it is processing, and are backed up by robust enforcement mechanisms.
That said, it would not be appropriate for the processing of data that was correctly assessed as anonymous at the time of processing to retrospectively be treated as processing of personal data and subject to data protection laws, simply because it became personal data at a later point in the processing due to a change in circumstances. That would make it extremely difficult for any organisation to treat any dataset as anonymous and would undermine the aim of the clause, significantly reducing the potential to use anonymous data for important research and development activities.
My Lords, we on the Labour Benches have become co-signatories to the amendments tabled by the noble Baroness, Lady Kidron, and supported by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding. The noble Baroness set out very clearly and expertly the overarching purpose of retaining the level of protection currently afforded by the Data Protection Act 2018. Amendments 2 and 3 specifically stipulate that, where data controllers know, or should reasonably know, that a user is a child, they should be given the data protection codified in that Act. Amendment 9 takes it a stage further and includes children’s data in the definition of sensitive personal data, and gives it the benefit of being treated to a heightened level of protection—quite rightly, too. Finally, Amendment 290—the favourite of the noble Lord, Lord Clement-Jones—attempts to hold Ministers to the commitment made by Paul Scully in the Commons to maintain existing standards of data protection carried over from that 2018 Act.
Why is all this necessary? I suspect that the Minister will argue that it is not needed because Clause 5 already provides for the Secretary of State to consider the impact of any changes to the rights and freedoms of individuals and, in particular, of children, who require special protection.
We disagree with that argument. In the interests of brevity and the spirit of the recent Procedure Committee report, which says that we should not repeat each other’s arguments, I do not intend to speak at length, but we have a principal concern: to try to understand why the Government want to depart from the standards of protection set out in the age-appropriate design code—the international gold standard—which they so enthusiastically signed up to just five or six years ago. Given the rising levels of parental concern over harmful online content and well-known cases highlighting the harms that can flow from unregulated material, why do the Government consider it safe to water down the regulatory standards at this precise moment in time? The noble Baroness, Lady Kidron, valuably highlighted the impact of the current regulatory framework on companies’ behaviour. That is exactly what legislation is designed to do: to change how we look at things and how we work. Why change that? As she has argued very persuasively, it is and has been hugely transformative. Why throw away that benefit now?
My attention was drawn to one example of what can happen by a briefing note from the 5Rights Foundation. As it argued, children are uniquely vulnerable to harm and risk online. I thought its set of statistics was really interesting. By the age of 13, 72 million data points have already been collected about a child. That data is often not used in children’s best interests; for example, it is often used to feed recommender systems and algorithms designed to keep attention at all costs, which have been found to push harmful content at children.
When this happens repeatedly over time, it can have catastrophic consequences, as we know. The coroner in the Molly Russell inquest found that she had been recommended a stream of depressive content by algorithms, leading the coroner to rule that she
“died from an act of self-harm whilst suffering from depression and the negative effects of online content”.
We do not want more Molly Russell cases. Progress has already been made in this field; we should consider dispensing with it at our peril. Can the Minister explain today the thinking and logic behind the changes that the Government have brought forward? Can he estimate the impact that the new lighter-touch regime, as we see it, will have on child protection? Have the Government consulted extensively with those in the sector who are properly concerned about child protection issues, and what sort of responses have the Government received?
Finally, why have the Government decided to take a risk with the sound framework that was already in place and built on during the course of the Online Safety Act? We need to hear very clearly from the Minister how they intend to engage with groups that are concerned about these child protection issues, given the apparent loosening of the current framework. The noble Baroness, Lady Harding, said that this is hard-fought ground; we intend to continue making it so because these protections are of great value to our society.
I am grateful to the noble Baroness, Lady Kidron, for her Amendments 2, 3, 9 and 290 and to all noble Lords who have spoken, as ever, so clearly on these points.
All these amendments seek to add protections for children to various provisions in the Bill. I absolutely recognise the intent behind them; indeed, let me take this opportunity to say that the Government take child safety deeply seriously and agree with the noble Baroness that all organisations must take great care, both when making decisions about the use of children’s data and throughout the duration of their processing activities. That said, I respectfully submit that these amendments are not necessary for three main reasons; I will talk in more general terms before I come to the specifics of the amendments.
First, the Bill maintains a high standard of data protection for everybody in the UK, including—of course—children. The Government are not removing any of the existing data protection principles in relation to lawfulness, fairness, transparency, purpose limitation, data minimisation, storage limitation, accuracy, data security or accountability; nor are they removing the provisions in the UK GDPR that require organisations to build privacy into the design and development of new processing activities.
The existing legislation acknowledges that children require specific protection for their personal data, as they may be less aware of the risks, consequences and safeguards concerned, and of their rights in relation to the processing of personal data. Organisations will need to make sure that they continue to comply with the data protection principles on children’s data and follow the ICO’s guidance on children and the UK GDPR, following the changes we make in the Bill. Organisations that provide internet services likely to be accessed by children will need to continue to comply with their transparency and fairness obligations and the ICO’s age-appropriate design code. The Government welcome the AADC, as Minister Scully said, and remain fully committed to the high standards of protection that it sets out for children.
Secondly, some of the provisions in the Bill have been designed specifically with the rights and safety of children in mind. For example, one reason that the Government introduced the new lawful ground of recognised legitimate interest in Clause 5, which we will debate later, was that some consultation respondents said that the current legislation can deter organisations, particularly in the voluntary sector, from sharing information that might help to prevent crime or protect children from harm. The same goes for the list of exemptions to the purpose limitation principle introduced by Clause 6.
There could be many instances where personal data collected for one purpose may have to be reused to protect children from crime or safeguarding risks. The Bill will provide greater clarity around this and has been welcomed by stakeholders, including in the voluntary sector.
While some provisions in the Bill do not specifically mention children or children’s rights, data controllers will still need to carefully consider the impact of their processing activities on children. For example, the new obligations on risk assessments, record keeping and the designation of senior responsible individuals will apply whenever an organisation’s processing activities are likely to result in high risks to people, including children.
Thirdly, the changes we are making in the Bill must be viewed in a wider context. Taken together, the UK GDPR, the Data Protection Act 2018 and the Online Safety Act 2023 provide a comprehensive legal framework for keeping children safe online. Although the data protection legislation and the age-appropriate design code make it clear how personal data can be processed, the Online Safety Act makes clear that companies must take steps to make their platforms safe by design. It requires social media companies to protect children from illegal, harmful and age-inappropriate content, to ensure they are more transparent about the risks and dangers posed to children on their sites, and to provide parents and children with clear and accessible ways to report problems online when they do arise.
After those general remarks, I turn to the specific amendments. The noble Baroness’s Amendments 2 and 3 would amend Clause 1 of the Bill, which relates to the test for assessing whether data is personal or anonymous. Her explanatory statement suggests that these amendments are aimed at placing a duty on organisations to determine whether the data they are processing relates to children, thereby creating a system of age verification. However, requiring data controllers to carry out widespread age verification of data subjects could create its own data protection and privacy risks, as it would require them to retain additional personal information such as dates of birth.
The test we have set out for reidentification is intended to apply to adults and children alike. If any person is likely to be identified from the data using reasonable means, the data protection legislation will apply. Introducing one test for adults and one for children is unlikely to be workable in practice and fundamentally undermines the clarity that this clause seeks to bring to organisations. Whether a person is identifiable will depend on a number of objective factors, such as the resources and technology available to organisations, regardless of whether they are an adult or a child. Creating wholly separate tests for adults and children, as set out in the amendment, would add unnecessary complexity to the clause and potentially lead to confusion.
As I understand it, the basis on which we currently operate is that children get a heightened level of protection. Is the Minister saying that that is now unnecessary and is captured by the way in which the legislation has been reframed?
I am saying, specifically on Clause 1, that separating the identifiability of children and the identifiability of adults would be detrimental to both but particularly, in this instance, to children.
Amendment 9 would ensure that children’s data is included in the definition of special category data and is subject to the heightened protections afforded to this category of data by Article 9 of the UK GDPR. This could have unintended consequences, because the legal position would be that processing of children’s data would be banned unless specifically permitted. This could create the need for considerable additional legislation to exempt routine and important processing from the ban; for example, banning a Girl Guides group from keeping a list of members unless specifically exempted would be disproportionate. However, more sensitive data, such as records relating to children’s health or safeguarding concerns, would already be subject to the heightened protections in the UK GDPR as soon as such data is processed.
I am grateful to the noble Baroness, Lady Kidron, for raising these issues and for the chance to set out why the Government feel that children’s protection is at least maintained, if not enhanced. I hope my answers have, for the time being, persuaded her of the Government’s view that the Bill does not reduce standards of protection for children’s data. On that basis, I ask her also not to move her Amendment 290 on the grounds that a further overarching statement on this is unnecessary and may cause confusion when interpreting the legislation. For all the reasons stated above, I hope that she will now reconsider whether her amendments in this group are necessary and agree not to press them.
Can I press the Minister more on Amendment 290 from the noble Baroness, Lady Kidron? All it does is seek to maintain the existing standards of data protection for children, as carried over from the 2018 Act. If that is all it does, what is the problem with that proposed new clause? In its current formulation, does it not put the intention of the legislation in a place of certainty? I do not quite get why it would be damaging.
I believe it restates what the Government feel is clearly implied or stated throughout the Bill: that children’s safety is paramount. Therefore, putting it there is either duplicative or confusing; it reduces the clarity of the Bill. In no way is this to say that children are not protected—far from it. The Government feel it would diminish the clarity and overall cohesiveness of the Bill to include it.
My Lords, not to put too fine a point on it, the Minister is saying that nothing in the Bill diminishes children’s rights, whether in Clause 1, Clause 6 or the legitimate interest in Clause 5. He is saying that absolutely nothing in the Bill diminishes children’s rights in any way. Is that his position?
Can I add to that question? Is my noble friend the Minister also saying that there is no risk of companies misinterpreting the Bill’s intentions and assuming that this might be some form of diminution of the protections for children?
In answer to both questions, what I am saying is that, first, any risk of misinterpreting the Bill with respect to children’s safety is diminished, rather than increased, by the Bill. Overall, it is the Government’s belief and intention that the Bill in no way diminishes the safety or privacy of children online. Needless to say, if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data.
My Lords, that creates another question, does it not? If that is the case, why amend the original wording from the 2018 Act?
Sorry, the 2018 Act? Or is the noble Lord referring to the amendments?
Why change the wording that provides the protection that is there currently?
Okay. The Government feel that, in terms of the efficient and effective drafting of the Bill, that paragraph, being duplicative, would diminish the clarity of the Bill rather than add to it by making a declaration. For the same reason, we have chosen not to make a series of declarations about other intentions of the Bill overall, in the belief that the Bill’s intent and outcome are protected without such a statement.
My Lords, before our break, the noble Baroness, Lady Harding, said that this is hard-fought ground; I hope the Minister understands from the number of questions he has just received during his response that it will continue to be hard-fought ground.
I really regret having to say this at such an early stage on the Bill, but I think that some of what the Minister said was quite disingenuous. We will get to it in other parts of the Bill, but the thing that we have all agreed to disagree on at this point is the statement that the Bill maintains data privacy for everyone in the UK. That is a point of contention between noble Lords and the Minister. I absolutely accept and understand that we will come to a collective view on it in Committee. However, the Minister appeared to suggest—I ask him to correct me if I have got this wrong—that the changes on legitimate interest and purpose limitation are child safety measures because some people are saying that they are deterred from sharing data for child protection reasons. I have to tell him that they are not couched or formed like that; they are general-purpose shifts. There is absolutely no question but that the Government could have made specific changes for child protection, put them in the Bill and made them absolutely clear. I find that very worrying.
I also find it worrying, I am afraid—this is perhaps where we are heading and the thing that many organisations are worried about—that bundling the AADC in with the Online Safety Act and saying, “I’ve got it over here so you don’t need it over there” is not the same as maintaining a high level of data protection for children. It is not the same set of things. I specifically said that this was not an age-verification measure and would not require it; whatever response there was on that was therefore unnecessary because I made that quite clear in my remarks. The Committee can understand that, in order to set a high bar of data protection, you must either identify a child or give that protection to everyone. Those are your choices. You do not have to verify.
I will withdraw the amendment, but I must say that the Government may not have it both ways. The Bill cannot be different or necessary and at the same time do nothing. The piece that I want to leave with the Committee is that it is the underlying provisions that allow the ICO to take action on the age-appropriate design code. It does not matter what is in the code; if the underlying provisions change, so does the code. During Committee, I expect that there will be a report on the changes that have happened all around the world as a result of the code, and we will be able to measure whether the new Bill would be able to create those same changes. With that, I beg leave to withdraw my amendment.
My Lords, I am grateful to all noble Lords who have spoken on this group. Amendment 6 to Clause 2, tabled by the noble Lord, Lord Clement-Jones, rightly tests the boundaries on the use of personal data for scientific research and, as he says, begins to ask, “What is the real purpose of this clause? Is it the clarification of existing good practice or is it something new? Do we fully understand what that new proposition is?”
As he said, there is particular public concern about the use of personal health data where it seems that some private companies are stretching the interpretation of “the public good”, for which authorisation for the use of this data was initially freely given, to something much wider. Although the clause seeks to provide some reassurance on this, we question whether it goes far enough and whether there are sufficient protections against the misuse of personal health data in the way the clause is worded.
This raises the question of whether it is only public health research that needs to be in the public interest, which is the way the clause is worded at the moment, because it could equally apply to research using personal data from other public services, such as measuring educational outcomes or accessing social housing. There is a range of uses for personal data. In an earlier debate, we heard about the plethora of data already held on people, much of which individuals do not understand or know about and which could be used for research or to make judgments about them. So we need to be sensitive about the way this might be used. It would be helpful to hear from the Minister why public health research has been singled out for special attention when, arguably, it should be a wider right across the board.
Noble Lords have asked questions about the wider concerns around Clause 2, which could enable private companies to use personal data to develop new products for commercial benefit without needing to inform the data subjects. As noble Lords have said, this is not what people would normally expect to be described as “scientific research”. The noble Baroness, Lady Kidron, was quite right that it has the potential to be unethical, so we need some standards and some clear understanding of what we mean by “scientific research”.
That is particularly important for Amendments 7 and 132 to 134 in the name of the noble Lord, Lord Clement-Jones, which underline the need for data subjects to be empowered and given the opportunity to object to their data being used for a new purpose. Arguably, without these extra guarantees—particularly because there is a lack of trust about how a lot of this information is being used—data subjects will be increasingly reluctant to hand over personal data on a voluntary basis in the first place. It may well be that this is an area where the Information Commissioner needs to provide additional advice and guidance to ensure that we can reap the benefits of good-quality scientific research that is in the public interest and in which the citizens involved can have absolute trust. Noble Lords around the Room have stressed that point.
Finally, we have added our names to the amendments tabled by the noble Baroness, Lady Kidron, on the use of children’s data for scientific research. As she rightly points out, the 2018 Act gave children a higher standard of protection on the uses for which their data is collected and processed. It is vital that this Bill, for all its intent to simplify and water down preceding rights, does not accidentally put at risk the higher protection agreed for children. In the earlier debate, the Minister said that he believed it will not do so. I am not sure that “believe” is a strong enough word here; we need guarantees that go beyond that. I think that this is an issue we will come back to again and again in terms of what is in the Bill and what guarantees exist for that protection.
In particular, there is a concern that relaxing the legal basis on which personal data can be processed for scientific research, including privately funded research carried out by commercial entities, could open the door for children’s data to be exploited for commercial purposes. We will consider the use of children’s data collected in schools in our debate on a separate group but we clearly need to ensure that the handling of pupils’ data by the Department for Education and the use of educational apps by private companies do not lead to a generation of exploited children who are vulnerable to direct marketing and manipulative messaging. The noble Baroness’s amendments are really important in this regard.
I also think that the noble Baroness’s Amendment 145 is a useful initiative to establish a code of practice on children’s data and scientific research. It would give us an opportunity to balance the best advantages of children’s research, which is clearly in the public and personal interest, with the maintenance of the highest level of protection from exploitation.
I hope that the Minister can see the sense in these amendments. In particular, I hope that he will take forward the noble Baroness’s proposals and agree to work with us on the code of practice principles and to put something like that in the Bill. I look forward to his response.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for this series of amendments.
I will first address Amendment 6, which seeks to amend Clause 2. As the noble Lord said, the definitions created by Clause 2, including “scientific research purposes”, are based on the current wording in recital 159 to the UK GDPR. We are changing not the scope of these definitions but their legal status. This amendment would require individual researchers to assess whether their research should be considered to be in the public interest, which could create uncertainty in the sector and discourage research. This would be more restrictive than the current position and would undermine the Government’s objectives to facilitate scientific research and empower researchers.
We have maintained a flexible scope as to what is covered by “scientific research” while ensuring that the definition is still sufficiently narrow in that it can cover only what would reasonably be seen as scientific research. This is because the legislation needs to be able to adapt to the emergence of new areas of innovative research. Therefore, the Government feel that it is more appropriate for the regulator to add more nuance and context to the definition. This includes the types of processing that are considered—
I am sorry to interrupt but it may give the Box a chance to give the Minister a note on this. Is the Minister saying that recital 159 includes the word “commercial”?
I am afraid I do not have an eidetic memory of recital 159, but I would be happy to—
That is precisely why I ask this question in the middle of the Minister’s speech to give the Box a chance to respond, I hope.
Researchers must also comply with the required safeguards to protect individuals’ privacy. All organisations conducting scientific research, including those with commercial interests, must also meet all the safeguards for research laid out in the UK GDPR and comply with the legislation’s core principles, such as fairness and transparency. Clause 26 sets out several safeguards that research organisations must comply with when processing personal data for research purposes. The ICO will update its non-statutory guidance to reflect many of the changes introduced by this Bill.
Scientific research currently holds a privileged place in the data protection framework because, by its nature, it is already viewed as generally being in the public interest. As has been observed, the Bill already applies a public interest test to processing for the purpose of public health studies in order to provide greater assurance for research that is particularly sensitive. Again, this reflects recital 159.
In response to the noble Baroness, Lady Jones, on why public health research is being singled out, as she stated, this part of the legislation just adds an additional safeguard to studies into public health, ensuring that they must be in the public interest. This does not limit the scope for other research unrelated to public health. Studies in the area of public health will usually be in the public interest. For the rare, exceptional times that a study is not, this requirement provides an additional safeguard to help prevent misuse of the various exemptions and privileges for researchers in the UK GDPR. “Public interest” is not defined in the legislation, so the controller needs to make a case-by-case assessment based on its purposes.
On the point made by the noble Lord, Lord Clement-Jones, about recitals and ICO guidance, although we of course respect and welcome ICO guidance, it does not have legislative effect and does not provide the certainty that legislation does. That is why we have placed these provisions on a statutory footing in this Bill.
Amendment 7 to Clause 3 would undermine the broad consent concept for scientific research. Clause 3 places the existing concept of “broad consent” currently found in recital 33 to the UK GDPR on a statutory footing with the intention of improving awareness and confidence for researchers. This clause applies only to scientific research processing that is reliant on consent. It already contains various safeguards. For example, broad consent can be used only where it is not possible to identify at the outset the full purposes for which personal data might be processed. Additionally, to give individuals greater agency, where possible individuals will have the option to consent to only part of the processing and can withdraw their consent at any time.
Clause 3 clarifies an existing concept of broad consent which outlines how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. This will enable consent to be obtained for an area of scientific research when researchers cannot at the outset identify fully the purposes for which they are collecting the data. For example, the initial aim may be the study of cancer, but it later becomes the study of a particular cancer type.
Furthermore, as part of the reforms around the reuse of personal data, we have clarified that when personal data is originally collected on the basis of consent, a controller would need to get fresh consent to reuse that data for a new purpose unless a public interest exemption applies and it would be unreasonable to expect the controller to obtain that consent. A controller cannot generally reuse personal data originally collected on the basis of consent for research purposes.
Turning to Amendments 132 and 133 to Clause 26, the general rule described in Article 13(3) of the UK GDPR is that controllers must inform data subjects about a change of purposes, which provides an opportunity to withdraw consent or object to the proposed processing where relevant. There are existing exceptions to the right to object, such as Article 21(6) of the UK GDPR, where processing is necessary for research in the public interest, and in Schedule 2 to the Data Protection Act 2018, when applying the right would prevent or seriously impair the research. Removing these exemptions could undermine life-saving research and compromise long-term studies so that they are not able to continue.
Regarding Amendment 134, new Article 84B of the UK GDPR already sets out the requirement that personal data should be anonymised for research, archiving and statistical—RAS—purposes unless doing so would mean the research could not be carried out. Anonymisation is not always possible, as personal data can be at the heart of valuable research, archiving and statistical activities—for example, in genetic research for the monitoring of new treatments for diseases. That is why new Article 84C of the UK GDPR also sets out protective measures for personal data that is used for RAS purposes, such as ensuring respect for the principle of data minimisation through pseudonymisation.
The stand part notice in this group seeks to remove Clause 6 and, consequentially, Schedule 2. In the Government’s consultation on data reform, Data: A New Direction, we heard that the current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. This has led to uncertainty about when controllers can reuse personal data, causing delays for researchers and obstructing innovation. Clause 6 and Schedule 2 address the existing uncertainty around reusing personal data by setting out clearly the conditions in which the reuse of personal data for a new purpose is permitted. Clause 6 and Schedule 2 must therefore remain to give controllers legal certainty and individuals greater transparency.
Amendment 22 seeks to remove the power to add to or vary the conditions set out in Schedule 2. These conditions currently constitute a list of specific public interest purposes, such as safeguarding vulnerable individuals, for which an organisation is permitted to reuse data without needing consent or to identify a specific law elsewhere in legislation. Since this list is strictly limited and exhaustive, a power is needed to ensure that it is kept up to date with future developments in how personal data is used for important public interest purposes.
I am interested that the safeguarding requirement is already in the Bill, so, in terms of children, which I believe the Minister is going to come to, the onward processing is not a question of safeguarding. Is that correct? As the Minister has just indicated, that is already a provision.
Just before we broke, I was on the verge of attempting to answer the question from the noble Baroness, Lady Kidron; I hope my coming words will do that, but she can intervene again if she needs to.
I turn to the amendments that concern the use of children’s data in research and reuse. Amendment 8 would also amend Clause 3; the noble Baroness suggests that the measure should not apply to children’s data, but this would potentially prevent children, or their parents or guardians, from agreeing to participate in broad areas of pioneering research that could have a positive impact on children, such as on the causes of childhood diseases.
On the point about safeguarding, the provisions on recognised legitimate interests and further processing are required for safeguarding children for compliance with, respectively, the lawfulness and purpose limitation principles. The purpose limitation provision in this clause is meant for situations where the original processing purpose was not safeguarding and the controller then realises that there is a need to further process the data for safeguarding purposes.
Research organisations are already required to comply with the data protection principles, including on fairness and transparency, so that research participants can make informed decisions about how their data is used; and, where consent is the lawful basis for processing, children, or their parents or guardians, are free to choose not to provide their consent, or, if they do consent, they can withdraw it at any time. In addition, the further safeguards that are set out in Clause 26, which I mentioned earlier, will protect all personal data, whether it relates to children or adults.
Amendment 21 would require data controllers to have specific regard to the fact that children’s data requires a higher standard of protection when deciding whether reuse of their data is compatible with the original purpose for which it was collected. This is unnecessary because the situations in which personal data could be reused are limited to public interest purposes designed largely to protect the public and children, in so far as they are relevant to them. Controllers must also consider the possible consequences for data subjects and the relationship between the controller and the data subject. This includes taking into account that the data subject is a child, in addition to the need to generally consider the interests of children.
Amendment 23 seeks to limit use of the purpose limitation exemptions in Schedule 2 in relation to children’s data. This amendment is unnecessary because these provisions permit further processing only in a narrow range of circumstances and can be expanded only to serve important purposes of public interest. Furthermore, it may inadvertently be harmful to children. Current objectives include safeguarding children or vulnerable people, preventing crime or responding to emergencies. In seeking to limit the use of these provisions, there is a risk that the noble Baroness’s amendments might make data controllers more hesitant to reuse or disclose data for public interest purposes and undermine provisions in place to protect children. These amendments could also obstruct important research that could have a demonstrable positive impact on children, such as research into children’s diseases.
Amendment 145 would require the ICO to publish a statutory code on the use of children’s data in scientific research and technology development. Although the Government recognise the value that ICO codes can play in promoting good practice and improving compliance, we do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by the new codes. Clause 33 of the Bill already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that it sees fit, so this is an issue that we could return to in the future if the evidence supports it.
I will read Hansard very carefully, because I am not sure that I absolutely followed the Minister, but we will undoubtedly come back to this. I will ask two questions. Earlier, before we had a break, in response to some of the early amendments in the name of the noble Lord, Lord Clement-Jones, the Minister suggested that several things were being taken out of the recital to give them solidity in the Bill; so I am using this opportunity to suggest that recital 38, which is the special consideration of children’s data, might usefully be treated in a similar way and that we could then have a schedule that is the age-appropriate design code in the Bill. Perhaps I can leave that with the Minister, and perhaps he can undertake to have some further consultation with the ICO on Amendment 145 specifically.
With respect to recital 38, that sounds like a really interesting idea. Yes, let us both have a look and see what the consultation involves and what the timing might look like. I confess to the Committee that I do not know what recital 38 says, off the top of my head. For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore not press them.
Returning to the questions by the noble Lord, Lord Clement-Jones, on the contents of recital 159, the current UK GDPR and EU GDPR are silent on the specific definition of scientific research. It does not preclude commercial organisations performing scientific research; indeed, the ICO’s own guidance on research and its interpretation of recital 159 already mention commercial activities. Scientific research can be done by commercial organisations—for example, much of the research done into vaccines, and the research into AI referenced by the noble Baroness, Lady Harding. The recital itself does not mention it but, as the ICO’s guidance is clear on this already, the Government feel that it is appropriate to put this on a statutory footing.
My Lords, that was intriguing. I thank the Minister for his response. It sounds as though, again, guidance would have been absolutely fine, but what is there not to like about the ICO bringing clarity? It was quite interesting that the Minister used the phrase “uncertainty in the sector” on numerous occasions and that is becoming a bit of a mantra as the Bill goes on. We cannot create uncertainty in the sector, so the poor old ICO has been labouring in the vineyard for the last few years to no purpose at all. Clearly there has been uncertainty in the sector of a major description, and all its guidance and all the work that it has put in over the years have been wholly fruitless, really. It is only this Government that have grabbed the agenda with this splendid 300-page data protection Bill that will clarify this for business. I do not know how much they will have to pay to get new compliance officers or whatever it happens to be, but the one thing that the Bill will absolutely not create is greater clarity.
I am a huge fan of making sure that we understand what the recitals have to say, and it is very interesting that the Minister is saying that the recital is silent but the ICO’s guidance is pretty clear on this. I am hugely attracted by the idea of including recital 38 in the Bill. It is another lightbulb moment from the noble Baroness, Lady Kidron, who has these moments, rather like with the age-appropriate design code, which was a huge one.
We are back to the concern, whether in the ICO guidance, the Bill or wherever, that scientific research needs to be in the public interest to qualify and not have all the consents that are normally required for the use of personal data. The Minister said, “Well, of course we think that scientific research is in the public interest; that is its very definition”. So why does only public health research need that public interest test and not the other aspects? Is it because, for instance, the opt-out was a bit of a disaster and 3 million people opted out of allowing their health data to be shared or accessed by GPs? Yes, it probably is.
Do the Government want a similar kind of disaster to happen, in which people get really excited about Meta or other commercial organisations getting hold of their data, a public outcry ensues and they therefore have to introduce a public interest test on that? What is sauce for the goose is sauce for the gander. I do not think that personal data should be treated in a particularly different way in terms of its public interest, just because it is in healthcare. I very much hope that the Minister will consider that.
My Lords, I am also pleased to support these amendments in the name of the noble Baroness, Lady Kidron, to which I have added my name. I am hugely enthusiastic about them, too, and think that this has been a lightbulb moment from the noble Baroness. I very much thank her for doing all of this background work because she has identified the current weakness in the data protection landscape: it is currently predicated on an arrangement between an individual and the organisation that holds their data.
That is an inherently unbalanced power construct. As the noble Baroness said, as tech companies become larger and more powerful, it is not surprising that many individuals feel overwhelmed by the task of questioning or challenging those that are processing their personal information. It assumes a degree of knowledge about their rights and a degree of digital literacy, which we know many people do not possess.
In the very good debate that we had on digital exclusion a few weeks ago, it was highlighted that around 2.4 million people are unable to complete a single basic task to get online, such as opening an internet browser, and that more than 5 million employed adults cannot complete essential digital work tasks. These individuals cannot be expected to access their digital data on their own; they need the safety of a larger group to do so. We need to protect the interests of an entire group that would otherwise be locked out of the system.
The noble Baroness referred to the example of Uber drivers who were helped by their trade union to access their data, sharing patterns of exploitation and subsequently strengthening their employment package, but this does not have to be about just union membership; it could be about the interests of a group of public sector service users who want to make sure that they are not being discriminated against, a community group that wants its bid for a local grant to be treated fairly, and so on. We can all imagine examples of where this would work in a group’s interest. As the noble Baroness said, these proposals would allow any group of people to assign their rights—rights that are more powerful together than apart.
There could be other benefits; if data controllers are concerned about the number of individual requests for data that they are receiving—and a lot of this Bill is supposed to address that extra work—group requests, on behalf of a data community, could provide economies of scale and make the whole system more efficient.
Like the noble Baroness, I can see great advantages from this proposal; it could lay the foundation for other forms of data innovation and help to build trust with many citizens who currently see digitalisation as something to fear—this could allay those fears. Like the noble Lord, Lord Clement-Jones, I hope the Minister can provide some reassurance that the Government welcome this proposal, take it seriously and will be prepared to work with the noble Baroness and others to make it a reality, because there is the essence of a very good initiative here.
I thank the noble Baroness, Lady Kidron, for raising this interesting and compelling set of ideas. I turn first to Amendments 10 and 35 relating to data communities. The Government recognise that individuals need to have the appropriate tools and mechanisms to easily exercise their rights under the data protection legislation. It is worth pointing out that current legislation does not prevent data subjects authorising third parties to exercise certain rights. Article 80 of the UK GDPR also explicitly gives data subjects the right to appoint not-for-profit bodies to exercise certain rights, including their right to bring a complaint to the ICO, to appeal against a decision of the ICO or to bring legal proceedings against a controller or processor and the right to receive compensation.
The concept of data communities exercising certain data subject rights is closely linked with the wider concept of data intermediaries. The Government recognise the existing and potential benefits of data intermediaries and are committed to supporting them. However, given that data intermediaries are new, we need to be careful not to distort the sector at such an early stage of development. As in many areas of the economy, officials are in regular contact with businesses, and the data intermediary sector is no different. One such engagement is the DBT’s Smart Data Council, which includes a number of intermediary businesses that advise the Government on the direction of smart data policy. The Government would welcome further and continued engagement with intermediary businesses to inform how data policy is developed.
I am sorry, but the Minister used a pretty pejorative word: “distort” the sector. What does he have in mind?
I did not mean to be pejorative; I merely point out that before embarking on quite a far-reaching policy—as noble Lords have pointed out—we would not want to jump the gun prior to consultation and researching the area properly. I certainly do not wish to paint a negative portrait.
It is a moment at which I cannot set a firm date for a firm set of actions, but on the other hand I am not attempting to punt it into the long grass either. The Government do not want to introduce a prescriptive framework without assessing potential risks, strengthening the evidence base and assessing the appropriate regulatory response. For these reasons, I hope that for the time being the noble Baroness will not press these amendments.
The noble Baroness has also proposed Amendments 147 and 148 relating to the role of the Information Commissioner’s Office. Given my response just now to the wider proposals, these amendments are no longer necessary and would complicate the statute book. We note that Clause 35 already includes a measure that will allow the Secretary of State to request the Information Commissioner’s Office to publish a code on any matter that she or he sees fit, so this is an issue we could return to in future if such a code were deemed necessary.
My Lords, I am sorry to keep interrupting the Minister. Can he give us a bit of a picture of what he has in mind? He said that he did not want to distort things at the moment, that there were intermediaries out there and so on. That is all very well, but is he assuming that a market will be developed or is developing? What overview of this does he have? In a sense, we have a very clear proposition here, which the Government should respond to. I am assuming that this is not a question just of letting a thousand flowers bloom. What is the government policy towards this? If you look at the Hall-Pesenti review and read pretty much every government response—including to our AI Select Committee, where we talked about data trusts and picked up the Hall-Pesenti review recommendations —you see that the Government have been pretty much positive over time when they have talked about data trusts. The trouble is that they have not done anything.
Overall, as I say and as many have said in this brief debate, this is a potentially far-reaching and powerful idea with an enormous number of benefits. But the fact that it is far-reaching implies that we need to look at it further. I am afraid that I am not briefed on long-standing—
May I suggest that the Minister writes? On the one hand, he is saying that we will be distorting something—that something is happening out there—but, on the other hand, he is saying that he is not briefed on what is out there or what the intentions are. A letter unpacking all that would be enormously helpful.
I am very happy to write on this. I will just say that I am not briefed on previous government policy towards it, dating back many years before my time in the role.
It was even further. Yes, I am very happy to write on that. For the reasons I have set out, I am not able to accept these amendments for now. I therefore hope that the noble Baroness will withdraw her amendment.