Grand Committee

My Lords, I remind the Committee that if there is a Division in the Chamber, the Committee will adjourn for 10 minutes from the sound of the Division Bells.
Clause 67: Meaning of research and statistical purposes
Amendment 59
I have tabled Amendments 59, 62, 63 and 65, and I thank the noble Lord, Lord Clement-Jones, my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their names to them. I am sure that the Committee will agree that these amendments have some pretty heavyweight support. I also support Amendment 64, in the name of the noble Lord, Lord Clement-Jones, which is an alternative to my Amendment 63. Amendments 68 and 69 in this group also warrant attention.
I very much support the Government’s aim in Clause 67 to ensure that valuable research does not get discarded due to a lack of clarity around its use or because of an overly narrow distinction between the original and new purposes of the use of the data. The Government’s position is that this clause clarifies the law by incorporating into the Bill recitals to the original GDPR. However, while the effect is to encourage scientific research and development, it has to be seen in the context of the fast-evolving world of developments in AI and the way that AI developers, given the need for huge amounts of data to train their large language models, are reusing data.
My concern is that the scraping of vast amounts of data by these AI companies is often positioned as scientific research and in some cases is even supported by the production of academic papers. I ask the Minister to understand my concerns and those of many in the data community and beyond. The fact is that the lines between scientific research, as set out in Clause 67, and AI product development are blurred. This might not be the concern of the original recitals, but I beg to suggest to the Minister that, in the new world of AI, there should be concern about the definition presented in the Bill.
Like other noble Lords, I very much hope to make this country a centre of AI development, but I do not want this to happen at the expense of data subjects’ privacy and data protection. It costs at least £1 billion—even more, sometimes—to develop a large language model and, although the cost will soon go down, there is a huge financial incentive to scrape data that pushes the boundaries of what is legitimate. In this climate, it is important that the Bill closes any loopholes that allow AI developers to claim the protections offered by Clause 67. My Amendments 59, 62, 63 and 65 go some way to ensuring that this will not happen.
The definition of scientific research in proposed new paragraph 2, in Clause 67(1)(b), is drawn broadly. My concern is that many commercial developments of digital products, particularly those involving AI, could still claim to be, in the words of the clause, “reasonably … described as scientific”. AI model development usually involves a mix of purposes—not just developing its capabilities but also commercialising as it develops services. The exemption allowed for “purposes of technological development” makes me concerned that this vague area creates a threat whereby AI developers will misuse the provisions of the Bill to reuse personal data for any AI developments, provided that one of their goals is technological advancement.
Amendments 59 and 62, by inserting the word “solely” into proposed new paragraphs 2 and 3 in Clause 67, would disaggregate reuse of data for scientific research purposes from other purposes, ensuring that the only goal of reuse is scientific research.
An example of the threat under the present definition is shown by Meta’s recently allowing the reuse of Instagram users’ data to train its new generation of Llama models. When the news got out, it created a huge backlash, with more than half a million people reposting a viral hoax image that claimed to deny Meta the right to reuse their data to train AI. This caused the ICO to say that it was pleased that Meta had paused its data processing in response to users’ concerns, adding:
“It is crucial that the public can trust that their privacy rights will be respected from the outset”.
However, Meta could well claim under this clause that it is creating technological advancement which would allow it to reuse any data collected by users under the legitimate interest grounds for training the model. The Bill as it stands would not require the company to conduct its research in accordance with any of the features of genuine scientific research. These amendments go some way to rectify that.
Amendment 63 increases the test for what is deemed to be scientific interest. At the moment, the public interest test is applied only to public health. I am pleased that NHS researchers will have to recognise this threshold, but why should all researchers doing scientific work not have to adhere to this threshold? Why should that test not be applied to all data reuse for scientific research? By deleting the public health exception, the public interest test would apply to all data reuse for scientific purposes.
The original intention of the RAS—research, archiving and statistics—purposes of the GDPR supports the public interest in scientific research. This is complemented by Amendment 65, which uses the tests for consent already laid out in Clause 68. The inclusion of ethical thresholds in the reuse of data should meet the highest levels of academic rigour and oversight envisaged in the original GDPR. It will demand not just ethical standards in research but that the research be supervised by an independent research ethics committee that meets UKRI guidance. These requirements will ensure that the high standards of ethics that we expect from scientific research will be applied in evaluating the exemption in Clause 67.
I do not want noble Lords to think that these amendments are thwarting the development of AI. There is plenty of AI research that is clearly scientific. Look at DeepMind's AlphaFold, which uses AI to analyse the shapes of proteins so that they can be incorporated into future drug treatments, advancing pharmaceutical development. It is an AI model developed in accordance with the ethical standards expected of modern scientific research.
The Minister will argue that the definition has been taken straight from EU recitals. I therefore ask her to consider very seriously what has been said about this definition by the EU’s premier data body, the European Data Protection Supervisor, in its preliminary opinion on data protection and scientific research. In its executive summary, it states:
“The boundary between private sector research and traditional academic research is blurrier than ever, and it is ever harder to distinguish research with generalisable benefits for society from that which primarily serves private interests. Corporate secrecy, particularly in the tech sector, which controls the most valuable data for understanding the impact of digitisation and specific phenomena like the dissemination of misinformation, is a major barrier to social science research … there have been few guidelines or comprehensive studies on the application of data protection rules to research”.
It suggests that the rules should be interpreted in such a way that permits reuse only for genuine scientific research.
For the purpose of this preliminary opinion by the EDPS, the special data protection regime for scientific research is understood to apply if each of three criteria is met: first, personal data is processed; secondly, relevant sectorial standards of methodology and ethics apply, including the notion of informed consent, accountability and oversight; and, thirdly, the research is carried out with the aim of growing society’s collective knowledge and well-being, as opposed to serving primarily one or several private interests. I hope that noble Lords will recognise that these are features that the amendments before the Committee would incorporate into Clause 67.
In the circumstances, I hope that the Minister, who I know has thought deeply about these issues, will recognise that the EU’s institutions are worried about the definition of scientific research that has been incorporated into the Bill. If they are worried, I suggest that we should be worried. I hope that these amendments will allay those fears and ensure that true scientific research is encouraged by Clause 67 and that it is not abused by AI companies. I beg to move.
My Lords, I support the amendments from the noble Viscount, Lord Colville, which I have signed, and will put forward my Amendments 64, 68, 69, 130 and 132 and my Clause 85 stand part debate.
This part of the GDPR is a core component of how data protection law functions. It makes sure that organisations use personal data only for the reason that it was collected. One of the exceptional circumstances is scientific research. Focus on the definitions and uses of data in research increased in the wake of the Covid-19 pandemic, when some came to the view that legal uncertainty and related risk aversion were a barrier to clinical research.
There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or very narrow distinctions between the original and new purpose. The Government’s position seems to be that the Bill will only clarify the law, incorporating recitals to the original GDPR in the legislation. While this may be the policy intention, the Bill must be read in the context of recent developments in artificial intelligence and the practice of AI developers.
The Government need to provide reassurance that the intention and impact of the research provisions are not to enable the reuse of personal data, as the noble Viscount said, scraped from the internet or collected by tech companies under legitimate interest for training AI. Large tech companies could abuse the provisions to legitimise mass scraping of personal data from the internet or the collection of data under legitimate interest—for example, by a social media platform about its users. This could be legally reused for training AI systems under the new provisions if developers can claim that it constitutes scientific research. That is why we very much support what the noble Viscount said.
In our view, the definition of scientific research adopted in the Bill is too broad and will permit abuse by commercial interests outside the policy intention. The Bill must recognise the reality that companies will likely position any AI development as “reasonably described as scientific”. Combined with the inclusion of commercial activities in the Bill, that opens the door to data reuse for any data-driven product development under the auspices that it represents scientific research, even where the relationship to real scientific progress is unclear or tenuous. That is not excluded in these provisions.
I turn to Amendments 64, 68, 69, 130 and 132 and the Clause 85 stand part debate. The definition of scientific research in proposed new paragraph 2 under Clause 67(1)(b) is drawn so broadly that most commercial development of digital products and services, particularly those involving machine learning, could ostensibly be claimed by controllers to be “reasonably described as scientific”. Amendment 64, taken together with those tabled by the noble Viscount that I have signed, would radically reduce the scope for misuse of data reuse provisions by ensuring that controllers cannot mix their commercial purposes with scientific research and that such research must be in the public interest and conducted in line with established academic practice for genuine scientific research, such as ethics approval.
Since the Data Protection Act was introduced in 2018, based on the 2016 GDPR, the education sector has seen enormous expansion of state and commercial data collection, partly normalised in the pandemic, of increased volume, sensitivity, intrusiveness and high risk. Children need particular care in view of the special environment of educational settings, where pupils and families are disempowered and have no choice over the products procured, which they are obliged to use for school administrative purposes, for learning in the classroom, for homework and for digital behavioural monitoring.
The implications of broadening the definition of research activities conducted within the state education sector include questions of the appropriateness of applying the same rules where children are in a compulsory environment without agency or routine practice for research ethics oversight, particularly if the definition is expanded to commercial activity.
Parental and family personal data is often inextricably linked to the data of a child in education, such as home address, heritable health conditions or young carer status. The Responsible Technology Adoption Unit within DSIT commissioned research in the Department for Education to understand how parents and pupils feel about the use of AI tools in education and found that, while parents and pupils did not expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement.
My Lords, I will speak to Amendments 59, 62, 63 and 65 in the name of my noble friend Lord Colville, and Amendment 64 in the name of the noble Lord, Lord Clement-Jones, to which I added my name. I am also very much in sympathy with the other amendments in this group more broadly.
My noble friend Lord Colville set out how he is seeking to understand what the Government intend by “scientific research” and to make sure that the Bill does not offer a loophole so big that any commercial company can avoid data protections of UK citizens in the name of science.
At Second Reading, I read out a dictionary definition of science:
“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”—
i.e. everything. I also ask the Minister if the following scenarios could reasonably be considered scientific. Is updating or improving a new tracking app for fitness, or a bot for an airline, scientific? Is the behavioural science of testing children’s response to persuasive design strategies in order to extend the stickiness of commercial products scientific? These are practical scenarios, and I would be grateful for an answer in order to understand what is in and out of the scope of the Bill.
When I raised Clause 67 at a briefing meeting, it was said that it was, as my noble friend Lord Colville suggested, just housekeeping. The law firm Taylor Wessing suggests that what can
“‘reasonably be described as scientific’ is arguably very wide and fairly vague, so it will be interesting to see how this is interpreted, but the assumption is that it is intended to be a very broad definition”.
Each of the 14 law firm blogs and briefings that I read over the weekend described it variously as loosening, expanding or broadening. Not one suggested that it was a tightening and not one said that it was a no-change change. As we have heard, the European Data Protection Supervisor published an opinion stating that
“scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.
When the Minister responds, perhaps she could say whether the particular scenarios I have set out fall within the definition of scientific and why the Government have failed to reflect the critical clarification of the European Data Protection Supervisor in transferring the recital into the Bill.
I turn briefly to Amendment 64, which would limit the use of children’s personal data for the purposes of research and education by making it subject to a public interest requirement and opt-in from the child or a parent. I will speak in our debate on a later grouping to amendments that would enshrine children’s right to higher protection and propose a comprehensive code of practice on the use of children’s data in education, which is an issue of increasing scandal and concern. For now, it would be good to understand whether the Government agree that education is an area of research where a public interest requirement is necessary and appropriate and that children’s data should always be used to support their right to learn, rather than to commoditise them.
During debate on the DPDI Bill, a code of practice on children’s data and scientific research was proposed; the Minister added her name to it. It is by accident rather than by design that I have failed to lay it here, but I will listen carefully to the Minister’s reply to see whether children need additional protections from scientific research as the Government now define it.
My Lords, I have in subsequent groups a number of amendments that touch on many of the issues that are raised here, so I will not detain the Committee by going through them at this stage and repeating them later. However, I feel that, although the Government have had the best intentions in bringing forward a set of proposals in this area that were to update and to bring together rather conflicting and difficult pieces of legislation that have been left because of the Brexit arrangements, they have managed to open up a gap between where we want to be and where we will be if the Bill goes forward in its present form. I say that in relation to AI, which is a subject requiring a lot more attention and a lot more detail than we have before us. I doubt very much whether the Government will have the appetite for dealing with that in time for this Bill, but I hope that at the very least—it would be a minor concession at this stage—they will commit at the Dispatch Box to seeking to resolve these issues in the legislation within a very short period because, as we have heard from the arguments made today, it is desperately needed.
More importantly, if, by bringing together documentation that is thought to represent the current situation, either inadvertently or otherwise, the Government have managed to open up a loophole that will devalue the way in which we currently treat personal data—I will come on to this when I get to my groups in relation to the NHS in particular—that would be a grievous situation. I hope that, going forward, the points that have been made here can be accommodated in a statement that will resolve them, because they need to be resolved.
My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as adviser to Socially Recruited, an AI business.
I support the noble Viscount, Lord Colville, in his amendments and all the other amendments in this group. They were understandably popular, to the extent that when I got my pen out, there was no space left for me to co-sign them, so I was left with the oral tradition in which to reflect my support for them. Before going into the detail, I just say that we have had three data Bills in just over three years: DPDI, DISD and this Bill. Over that period, though the names have changed, much of the meat remains the same in the legislation. Yet, in that period, everything and nothing has changed—everything in terms of what has happened with generative AI.
Considering that seismic shift that has occurred over these three Bills, could the Minister say what in this Bill specifically has changed, not least in this part, to reflect that seismic change? Regarding “nothing has changed”, nothing has changed in terms of the incredibly powerful potential of AI for positive or negative outcomes, ably demonstrated with this set of amendments.
If you went on to Main Street and polled the public, I believe that you would get a pretty clear understanding of what they considered scientific research to be. You know it. You understand why we would want to have a specified definition of scientific research and what that would mean for the researchers and for the country.
However, if we are to draw that definition as broadly as it currently is in the Bill, why would we bother to have such a definition at all? If the Government’s intention is to enable so much to come within the perimeter, let us not have the definition at all and let us allow to continue what is happening right now, not least in the reuse of scraped data or in how data is being treated in these generative AI models.
We have seen what has happened in terms of the training, but when you look at what could be called development and improvement, as the noble Viscount has rightly pointed out, all this and more could easily fit within the scientific research definition. It could even more easily fit in when lawyers are deployed to ensure that that is so. I know we are going to come on to rehearsing a number of these subjects in the next group but, for this group, I support all the amendments as set out.
I ask the Minister these two questions. First, what has changed in all the provisions that have gone through all these three iterations of the data Bill? Secondly, what is the Government’s intention when it comes to scientific research, if it is not truly to mean scientific research, if it is not to have ethics committee involvement and if it is not to feel sound and be defined as what most people on Main Street would recognise as scientific research?
I start by apologising because, due to a prior commitment, I am not able to stay for many of the proceedings today, but I see these groupings and others as critical. In the few words that I will say, I hope to bring to bear to this area some of my experience as a Health Minister, particularly in charge of technology and development of AI.
I can see a lot of good intent behind these clauses, to make sure that we do not stop a lot of the research that we need. I was recently very much involved in the negotiation of the pandemic accord regarding the next pandemic and how you make sure that any vaccines that you develop on a worldwide basis can be distributed on a worldwide basis as well. One of the main stumbling blocks was that the so-called poorer countries were trying to demand, as part of that, the intellectual property to be able to develop the vaccines in their own countries.
The point we were trying to make was that, although we could see the good intentions behind that, it would have a real chilling effect on pharmaceutical companies investing the hundreds of millions or even billions of pounds, which you often need with vaccines, to find a cure, because if they felt that they were going to lose their intellectual property and rights at the end, it would be much harder for them to justify the investment up front.
I start by thanking all noble Lords who spoke for their comments and fascinating contributions. We on these Benches share the concern of many noble Lords about the Bill allowing the use of data for research purposes, especially scientific research purposes.
Amendment 59 has, to my mind, the entirely right and important intention of preventing misuse of the scientific research exemption for data reuse by ensuring that the only purpose for which the reuse is permissible is scientific research. Clearly, there is merit in this idea, and I look forward to hearing the Minister give it due consideration.
However, there are two problems with the concept and definition of scientific research in the Bill overall, and, again, I very much look forward to hearing the Government’s view. First, I echo the important points raised by my noble friend Lord Markham. Almost nothing in research or, frankly, life more broadly, is done with only one intention. Even the most high-minded, curiosity-driven researcher will have at the back of their mind the possibility of commercialisation. Alongside protecting ourselves from the cynical misuse of science as a cover story for commercial pursuit, we have to be equally wary of creating law that pushes for the complete absence of the profit motive in research, because to the extent that we succeed in doing that, we will see less research. Secondly—the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, made this point very powerfully—I am concerned that the broad definition of scientific research in the Bill might muddy the waters further. I worry that, if the terminology itself is not tightened, restricting the exemption might serve little purpose.
On Amendment 62, to which I have put my name, the same arguments very much apply. I accept that it is very challenging to find a form of words that both encourages research and innovation and does not do so at the expense of data protection. Again, I look forward to hearing the Government’s view. I am also pleased to have signed Amendment 63, which seeks to ensure that personal data can be reused only if doing so is in the public interest. Having listened carefully to some of the arguments, I feel that the public interest test may be more fertile ground than a kind of research motivation purity test to achieve that very difficult balance.
On Amendment 64, I share the curiosity to hear how the Minister defines research and statistical processes—again, not easy, but I look forward to her response.
Amendment 65 aims to ensure that research seeking to use the scientific research exemption to obtaining consent meets the minimum levels of scientific rigour. The aim of the amendment is, needless to say, excellent. We should seek to avoid creating opportunities which would allow companies—especially but not uniquely AI labs—to cloak their commercial research as scientific, thus reducing the hoops they must jump through to reuse data in their research without explicit consent. However, Amendment 66, tabled in my name, which inserts the words:
“Research considered scientific research that is carried out as a commercial activity must be subject to the approval of an independent ethics committee”,
may be a more adaptive solution.
Many of these amendments show that we are all quite aligned in what we want but that it is really challenging to codify that in writing. Therefore, the use of an ethics committee to conduct these judgments may be the more agile, adaptive solution.
I confess that I am not sure I have fully understood the mechanism behind Amendments 68 and 69, but I of course look forward to the Minister’s response. I understand that they would essentially mean consent by failing to opt out. If so, I am not sure I could get behind that.
Amendment 130 would prevent the processing of personal data for research, archiving and statistical purposes if it permits the identification of a living individual. This is a sensible precaution. It would prevent the sharing of unnecessary or irrelevant information and protect people’s privacy in the event of a data breach.
Amendment 132 appears to uphold existing patient consent for the use of their data for research, archiving and statistical purposes. I just wonder whether this is necessary. Is that not already the case?
Finally, I turn to the Clause 85 stand part notice. I listened carefully to the noble Lord, Lord Clement-Jones, but I am not, I am afraid, at a point where I can support this. There need to be safeguards on the use of data for this purpose; I feel that Clause 85 is our way of having them.
My Lords, it is a great pleasure to be here this afternoon. I look forward to what I am sure will be some excellent debates.
We have a number of debates on scientific research; it is just the way the groupings have fallen. This is just one of several groupings that will, in different ways and from different directions, probe some of these issues. I look forward to drilling down into all the implications of scientific research in the round. I should say at the beginning—the noble Lord, Lord Markham, is absolutely right about this—that we have a fantastic history of and reputation for doing R&D and scientific research in this country. We are hugely respected throughout the world. We must be careful that we do not somehow begin to demonise some of those people by casting aspersions on a lot of the very good research that is taking place.
A number of noble Lords said that they are struggling to know what the definition of “scientific research” is. A lot of scientific research is curiosity driven; it does not necessarily have an obvious outcome. People start a piece of research, either in a university or on a commercial basis, and they do not quite know where it will lead them. Then—it may be 10 or 20 years later—we begin to realise that the outcome of their research has more applications than we had ever considered in the past. That is the wonderful thing about human knowledge: as we build and we learn, we find new applications for it. So I hope that whatever we decide and agree on in this Bill does not put a dampener on that great aspect of human knowledge and the drive for further exploration, which we have seen in the UK in life sciences in particular but also in other areas such as space exploration and quantum. Noble Lords could probably identify many more areas where we are increasingly getting a reputation for being at the global forefront of this thinking. We have to take the public with us, of course, and get the balance right, but I hope we do not lose sight of the prize we could have if we get the regulations and legislation right.
Let me turn to the specifics that have been raised today. Amendments 59 and 62 to 65 relate to scientific provisions, and the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and others have commented on them. I should make it clear that this Bill is not expanding the meaning of “scientific research”. If anything, it is restricting it, because the reasonableness test that has been added to the legislation—along with clarification of the requirement for research to have a lawful basis—will constrain the misuse of the existing definition. The definition is tighter, and we have attempted to do that in order to make sure that some of the new developments and technologies coming on stream will fall clearly within the constraints we are putting forward in the Bill today.
Amendments 59 and 62 seek to prevent misuse of the exceptions for data reuse. I assure the noble Viscount, Lord Colville, that the existing provisions for research purposes already prevent the controller taking advantage of them for any other purpose they may have in mind. That is controlled.
I thank the Minister very much, but is she not concerned by the preliminary opinion from the EDPS, particularly that traditional academic research is blurrier than ever and that it is even harder to distinguish research which generally benefits society from that which primarily serves private interest? People in the street would be worried about that and the Bill ought to be responding to that concern.
I have not seen that observation, but we will look at it. It goes back to my point that the provisions in this Bill are designed to be future facing as well as for the current day. The strength of those provisions will apply regardless of the technology, which may well include AI. Noble Lords may know that we will bring forward a separate piece of legislation on AI, when we will be able to debate this in more detail.
My Lords, this has been a very important debate about one of the most controversial areas of this Bill. My amendments are supported across the House and by respected civic institutions such as the Ada Lovelace Institute. I understand that the Minister thinks they will stifle scientific research, particularly by nascent AI companies, but the rights of the data subject must be borne in mind. As it stands, under Clause 67, millions of data subjects could find their information mined by AI companies, to be reused without consent.
The concerns about this definition being too broad were illustrated very well across the Committee. The noble Lord, Lord Clement-Jones, said that it was too broad and must recognise that AI development will be open to using data research for any AI purposes and talked about his amendment on protecting children’s data, which is very important and worthy of consideration. This was supported by my noble friend Lady Kidron, who pointed out that the definition of scientific research could cover everything and warned that Clause 67 is not just housekeeping. She quoted the EDPS and talked about its critical clarification not being included in the transfer of the scientific definition into the Bill. The noble Lord, Lord Holmes, asked what in the Bill has changed when you consider how much has changed in AI. I was very pleased to have the support of the noble Viscount, Lord Camrose, who warned against the abuse and misuse of data and the broad definition in this Bill, which could muddy the waters. He supported the public interest test, which would be fertile ground for helping define scientific data.
Surely this Bill should walk the line between encouraging the AI rollout to boost research and development in our science sector and protecting the rights of data subjects. I ask the Minister to meet me and other concerned noble Lords to tighten up Clauses 67 and 68. On that basis, I beg leave to withdraw my amendment.
My Lords, I have tabled Amendment 60 to add to our discussion and establish some further clarity from the Minister on the impact of widening the scope of the interpretation of scientific research to include commercial and private activities. I thank her for her letter of 27 November to all noble Lords who spoke at Second Reading, a copy of which was placed in the Lords Library; it provides some reassurance that scientific research activities must still pass a reasonableness test. However, I move this probing amendment out of concern that the change in definition may have unintended consequences for copyright law. It is vital that we do not just look at this Bill in isolation but consider the wider impact that changing definitions and interpretations will have on other aspects of legislation.
Research activities are identified under the Copyright, Designs and Patents Act 1988. Some researchers require access to and reproduction of data and copyright-protected material for research purposes. Under Section 29A, researchers can avail themselves of an exemption from copyright which allows data mining and analysis of copyright-protected works for non-commercial research only, without permission from the copyright holder. The UK copyright framework is popularly known as the “gold standard” internationally, as it carefully balances the rights of copyright holders with the need for certain uses to take place, such as non-commercial research, educational uses and those that protect free speech. That balance is fragile, and we must be very careful not to disrupt it unintentionally.
The previous Government sought to widen Section 29A of the Act by allowing text and data mining of copyright-protected works for commercial purposes, but this recommendation was quickly reversed when the Government considered that the decision was made without appropriate evidence. That was a sensible move. The current Government are still due to consult with stakeholders on the exemption to the law, against the backdrop of AI companies using copyright-protected works for training large language models without permission or fair pay. Given the global presence of AI, it is expected that this consultation will consider how the UK policy on copyright works within an international context. Therefore, while the Government are carefully considering this, we must ensure that we do not fast forward to a conclusion before that important work has taken place.
If the Minister can confirm that this definition has no impact on existing copyright law, I will happily withdraw this amendment. However, if there are potential implications on the Copyright, Designs and Patents Act 1988, I would urge the Minister to table her own amendment to explicitly preserve the current definition of “scientific research” within that Act. This would ensure that we maintain legal clarity while the broader international considerations are fully examined. I beg to move.
I advise the Committee that, if this amendment is agreed, I cannot call Amendment 61 by reason of pre-emption.
My Lords, it is a pleasure to take part in the debate on these amendments. I very much support Amendment 60 as introduced. I was delighted to hear the Minister tell the Grand Committee that the Government are coming forward with an AI Bill. I wonder if I might tempt her into sharing a bit more detail with your Lordships on when we might see that Bill or indeed the consultation. Will it be before Santa or sometime after his welcome appearance later this month?
We touched on a number of areas related to Amendment 65A in the previous group. This demonstrates the importance of and concern about Clause 67, as so many amendments pertain to it. I ask the Minister whether a large language model that comes up with medically significant conclusions but, prior to that, gained a considerable amount of that data from scraping, would be fine within Clause 67 as drafted.
Similarly, there are overriding and broader reuse possibilities in the drafting as set out. Again, as has already been debated, scientific research has a clear meaning in many respects; that clarity comes when you add public interest and ethics. Could a model that has taken vast quantities of others’ data without consent and—nodding more towards Amendment 60—without remuneration still potentially fit within the definition of “scientific research”?
In many ways, we are debating these points around data in the context of scientific research, but we could go to the very nub or essence of the issue. All that noble Lords are asking, in their many eloquent and excellent ways, is: whose data is it, to what purpose is it being put, and have those data owners consented, been respected and, where appropriate—particularly when it comes to IP and copyrighted data—been remunerated? This is an excellent opportunity to expand on the earlier debate on Clause 67. I look forward to the Minister’s response.
My Lords, I declare an interest in that I checked yesterday and Copilot has clearly scraped data from behind the paywall on the Good Schools Guide. It very kindly does not publish the whole of the review, but it publishes a summary of it. It concerns me how we police copyright and how we get things right in this Bill.
However, I do not think that trying to draw a boundary around “scientific” is the right way to do it. Looking at all the evidence on engineering biology that we have just taken for the Science and Technology Committee, they are all doing science, but they all want to make money out of it at the end, if things go right. There is no sensible boundary between science and commerce. We should expect that, with anything that is done for science, even if it is done in the social sciences, someone at the end of the day will want to build a consultancy on it. There is no defendable boundary between the two.
As my noble friend Lord Camrose said, getting a working definition of public interest is key, as is, in the context of this amendment, recognising the importance of the concepts of intellectual property, copyright, trademark, patents and so on. They are international concepts, and we should seek to hold the line in the face of technological challenges because the concepts as they are have shown their worth. We may have to adapt them in one way or another, but this should be an international endeavour, and we should not support local infringement, because we would then make the UK a much less worthwhile place to hold intellectual property. Not all intellectual property is mobile, but a lot of it is, and it wants to be held in a place where it can be defended. If we do not offer that in our legal system, we will lose a great deal by it.
My Lords, I have to admit that I am slightly confused by the groupings at this point. It is very easy to have this debate in the medical space, to talk about the future of disease, fixing diseases and longevity, but my rather mundane questions have now gone unanswered twice. Perhaps the Minister will write to me about where the Government see scientific research on product development in some of these other spaces.
We will come back to the question of scraping and intellectual copyright, but I want to add my support to my noble friend Lord Freyberg’s amendment. I also want to add my voice to the question of the AI Bill that is coming. Data is fundamental to the AI infrastructure; data is infrastructure. I do not understand how we can have a data Bill that does not have one eye on AI, looking towards it, or how we are supposed to understand the intersection between the AI Bill and the data Bill if the Government are not more forthcoming about their intentions. At the moment, we are seeing a reduction in data protection that looks as though it is anticipating, or creating a runway for, certain sorts of companies.
Finally, I am sorry that the noble Lord is no longer in his place, but later amendments look at creating sovereign data assets around the NHS and so on, and I do not think that those of us who are arguing to make sure that it is not a free-for-all are unwilling to create, or are not interested in creating, ways in which the huge investment in the NHS and other datasets can be realised for UK plc. I do not want that to appear to be where we are starting just because we are unhappy about the roadway that Clause 67 appears to create.
Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
By tightening up the definition of “scientific research” to exclude activities that are primarily commercial, it prevents companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.
With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. On the suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research”, this would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.
I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.
This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.
One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.
I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.
My Lords, I feel we are getting slightly repetitive, but before I, too, repeat myself, I should like to say something that I did not get the chance to say to the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, and others: I will write, we will meet—all the things that you have asked for, you can take it for granted that they will happen, because we want to get this right.
I say briefly to the noble Baroness: we are in danger of thinking that the only good research is health research. If you go to any university up and down the country, you find that the most fantastic research is taking place in the most obscure subjects, be it physics, mechanical engineering, fabrics or, as I mentioned earlier, quantum. A lot of great research is going on. We are in danger of thinking that life sciences are the only thing that we do well. We need to open our minds a bit to create the space for those original thinkers in other sectors.
Perhaps I did not make myself clear. I was saying that the defence always goes to space or to medicine, and we are trying to ascertain the product development that is not textiles, and so on. I have two positions in two different universities; they are marvellous places; research is very important.
I am glad we are on the same page on all that.
I now turn to the specifics of the amendments. I thank the noble Lords, Lord Freyberg and Lord Holmes, and the noble Viscount, Lord Camrose, for their amendments, and the noble Lord, Lord Lucas, for his contribution. As I said in the previous debate, I can reassure all noble Lords that if an area of research does not count as scientific research at the moment, it will not under the Bill. These provisions do not expand the meaning of scientific research. If noble Lords still feel unsure about that, I am happy to offer a technical briefing to those who are interested in this issue to clarify that as far as possible.
Moreover, the Bill’s requirement for a reasonableness test will help limit the misuse of this definition more than the current UK GDPR, which says that scientific research should be interpreted broadly. We are tightening up the regulations. Whether an activity qualifies is best assessed on a case-by-case basis, with the help of ICO guidance, rather than by automatically disqualifying or approving entire sectors of activity.
Scientific research that is privately funded or conducted by commercial organisations can also have a life-changing impact. The noble Lord, Lord Markham, was talking earlier about health; the development of the Covid vaccines is just one example of this. It was commercial research that was absolutely life-saving, at the end of the day.
Can the Minister say whether this will be a Bill, a draft Bill or a consultation?
We will announce this in the usual way—in due course. I refer the noble Lord to the King’s Speech on that issue. I feel that noble Lords want more information, but they will just have to go with what I am able to say at the moment.
Perhaps another aspect the Minister could speak to is whether this will be coming very shortly, shortly or imminently.
Let me put it this way: other things may be coming before it. I think I promised at the last debate that we would have something on copyright in the very, very, very near future. This may not be as very, very, very near future as that. We will tie ourselves in knots if we carry on pursuing this discussion.
On that basis, I hope that this provides noble Lords with sufficient reassurance not to press their amendments.
I thank your Lordships for this interesting debate. I apologise to the Committee for degrouping the amendment on copyright, but I thought it was important to establish from the Minister that there really was no effect on the copyright Act. I am very reassured that she has said that. It is also reassuring to hear that there will be more of an opportunity to look at this issue in greater detail. On that basis, I beg leave to withdraw the amendment.
My Lords, Amendments 66, 67 and 80 in this group are all tabled in my name. Amendment 66 requires scientific research carried out for commercial purposes to
“be subject to the approval of an independent ethics committee”.
Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.
We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects or even without them knowing. An independent ethics committee would be an excellent way to monitor scientific research that would be part of commercial activities, without capping data access for scientific research, which aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is also surely fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.
Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection. As the Bill stands, consent will be considered obtained for the purposes of scientific research if, at the time consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed. I fully understand that there needs to be some scope to take advantage of research opportunities that are not always foreseeable at the start of studies, particularly multi-year longitudinal studies, but which emerge as such studies continue. I am concerned, however, that the current provisions are a little too broad. In other words: is consent not actually being given at the start of the process for, effectively, any future purpose?
Amendment 80 would prevent the data reuse test being automatically passed if the reuse is for scientific purposes. Again, I have tabled this amendment due to my concerns that research which is part of commercial activities could be artificially classed as scientific, and that other clauses in the Bill would therefore allow too broad a scope for data harvesting. I beg to move.
My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.
My Lords, I welcome the noble Viscount to the sceptics’ club because he has clearly had a damascene conversion. It may be that this goes too far. I am slightly concerned, like him, about the bureaucracy involved in this, which slightly gives the game away. It could be seen as a way of legitimising commercial research, whereas we want to make it absolutely certain that that research is for the public benefit, rather than imposing an ethical board on every single aspect of research which has any commercial content.
We keep coming back to this, but we seem to be degrouping all over the place. Even the Government Whips Office seems to have given up trying to give titles for each of the groups; they are just called “degrouped” nowadays, which I think is a sign of deep depression in that office. It does not tell us anything about what the different groups contain, for some reason. Anyway, it is good to see the noble Viscount, Lord Camrose, kicking the tyres on the definition of the research aspect.
I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.
As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring researchers working for a commercial company to submit their research to an ethics committee. As I said on the previous group, making this a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector, and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared by an ethics committee at an early stage expects too much and will put off a lot of the new innovations that might otherwise come forward.
Amendment 80 relates to Clause 71 and the reuse of personal data. This would put at risk valuable research that relies on data originally generated from diverse contexts, since the difference between the purposes may not always be compatible.
Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise definition of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and he feels content to withdraw his amendment on this basis.
I thank the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, for their remarks and support, and the Minister for her helpful response. Just over 70% of scientific research in the UK is privately funded, 28% is taxpayer funded and around 1% comes through the charity sector. Perhaps the two most consequential scientific breakthroughs of the last five years, Covid vaccines and large language models, have come principally from private funding.
My Lords, I rise to move the amendment standing in my name and to speak to my other amendments in this group. I am grateful to the noble Baroness, Lady Kidron and the noble Lord, Lord Clement-Jones, for signing a number of those amendments, and I am also very grateful to Foxglove Legal and other bodies that have briefed me in preparation for this.
My amendments are in a separate group, and I make no apology for that because although some of these points have indeed been covered in other amendments, my focus is entirely on NHS patient data, partly because it is the subject of a wider debate going on elsewhere about whether value can be obtained for it to help finance the National Health Service and our health in future years. This changes the nature of the relationship between research and the data it is using, and I think it is important that we focus hard on this and get some of the points that have already been made into a form where we can get reasonable answers to the questions that it leaves.
If my amendments are accepted or agreed—a faint hope—they would make it clear beyond peradventure that the consent protections in the Bill apply to the processing of data for scientific research, that a consistent definition of consent is applied and that that consistent definition is the one with which researchers and the public are already familiar and can trust going forward.
The Minister said at the end of Second Reading, in response to concerns I and others raised about research data in general and NHS data in particular, that the provisions in this Bill
“do not alter the legal obligations that apply in relation to decisions about whether to share data”.—[Official Report, 19/11/24; col. 196.]
I accept that that may be the intention, and I have discussed this with officials, who make the same point very strongly. However, Clause 68 introduces a novel and, I suggest, significantly watered-down definition of consent in the case of scientific research. Clause 71 deploys this watered-down definition of consent to winnow down the “purpose limitation” where the processing is for the purposes of scientific research in the public interest. Taken together, this means that there has been a change in the legal obligations that apply to the need to obtain consent before data is shared.
Clause 68 amends the pivotal definition of consent in Article 4(11). Instead of consent requiring something express—freely given, specific, informed, and unambiguous through clear affirmative action—consent can now be imputed. A data subject’s consent is deemed to meet these strict requirements even when it does not, as long as the consent is given to the processing of personal data for the purposes of an area of scientific research; at the time the consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed; seeking consent in relation to the area of scientific research is consistent with generally recognised ethical standards relevant to the area of research; and, so far as the intended purposes of the processing allow, the data subject is given the opportunity to consent to processing for only part of the research. These all sound very laudable, but I believe they cut down the very strict existing standards of consent.
Proposed new paragraph 7, in Clause 68, then extends the application of this definition across the regulation:
“References in this Regulation to consent given for a specific purpose (however expressed) include consent described in paragraph 6.”
Thus, wherever you read “consent” in the regulation, you can also have imputed consent as set out in proposed new paragraph 6 of Article 4. This means that “consent” within the meaning of Article 6(1)(a)—that is, consent as a lawful basis for processing—can be imputed consent in the new way introduced by the Bill, so there is a new type of lawful basis for processing.
The Minister is entitled to disagree, of course; I expect him to say that when he comes to respond. I hope that, when he does, he will agree that we share a concern on the importance of giving researchers a clear framework, as it is this uncertainty about the legal framework that could inadvertently act as a barrier to the good research we all need. So my first argument today is that, as drafted, the Bill leaves too much room for different interpretations, which will lead to exactly the kind of uncertainty that the Minister—indeed, all of us—wish to avoid.
As we have heard already, as well as the risk of uncertainty among researchers, there is also the risk of distrust among the general public. The public rightly want and expect to have a say in what uses their data is put to. Past efforts to modernise how the NHS uses data, such as care.data, have been expensive failures, in part because they have failed to win the public’s trust. More than 3.3 million people have already opted out of NHS data sharing under the national data opt-out; that is nearly 8% of the adults whose data could have contributed to research. We have talked about the value of our data and being the gold standard or gold attractor for researchers but, if we do not have all the people who could contribute, we are definitely devaluing and debasing that research. Although we want to respect people’s choice as to whether to participate, of course, this enormous vote against research reflects a pretty spectacular failure to win public trust—one that undermines the value and quality of the data, as I said.
So my second point is that watering down the rights of those whose data is held by the NHS will not put that data for research purposes on a sustainable, long-term footing. Surely, we want a different outcome this time. We cannot afford more opt-outs; we want people opting back in. I argue that this requires a different approach—one that wins the public’s trust and gains public consent. The Secretary of State for Health is correct to say that most of the public want to see the better use of health data to help the NHS and to improve the health of the nation. I agree, but he must accept that the figures show that the general public also have concerns about privacy and about private companies exploiting their data without them having a say in the matter. The way forward must be to build trust by genuinely addressing those concerns. There must not be even a whiff of watering down legal protections, so that those concerns can instead be turned into support.
This is also important because NHS healthcare includes some of the most intimate personal data. It cannot make sense for that data to have a lower standard of consent protection going forward if it is being used for research. Having a different definition of consent and a lower standard of consent will inevitably lead to confusion, uncertainty and mistrust. Taken together, these amendments seek to avoid uncertainty and distrust, as well as the risk of backlash, by making it abundantly clear that Article 4 GDPR consent protections apply despite the new wording introduced by this Bill. Further, these are the same protections that apply to other uses of data; they are identical to the protections already understood by researchers and by the public.
I turn now to a couple of the amendments in this group. Amendment 71 seeks to address the question of consent, but in a rather narrow way. I have argued that Clause 68 introduces a novel and significantly watered-down definition of consent in the case of scientific research; proposed new paragraph 7 deploys this watered-down definition to winnow down the purpose limitation. There are broader questions about the wisdom of this, which Amendments 70, 79 and 81 seek to address, but Amendment 71 focuses on the important case of NHS health data.
If the public are worried that their health data might be shared with private companies without their consent, we need an answer to that. We see from the large number of opt-outs that there is already a problem; we have also seen it recently in NHS England’s research on public attitudes to health data. This amendment would ensure that the Bill does not increase uncertainty or fuel patient distrust of plans for NHS data. It would help to build the trust that data-enabled transformation of the NHS requires.
The Government may well retort that they are not planning to share NHS patient data with commercial bodies without patient consent. That is fine, but it would be helpful if, when he comes to respond, the Minister could say that clearly and unambiguously at the Dispatch Box. However, I put it to him that, if he could accept these amendments, the law would in fact reflect that assurance and ensure that any future Government would need to come back to Parliament if they wanted to take a different approach.
It is becoming obvious that whether research is in the public interest will be the key issue that we need to resolve in this Bill, and Amendment 72 provides a proposal. The Bill makes welcome references to health research being in the public interest, but it does not explain how on earth we decide or how that requirement would actually bite. Who makes the assessment? Do we trust a rogue operator to make its own assessment of how its research is in the public interest? What would be examples of the kind of research that the Government expect this requirement to prevent? I look forward to hearing the answer to that, but perhaps it would be more helpful if the Minister responded in a letter. In the interim, this amendment seeks to introduce some procedural clarity about how research will be certified as being in the public interest. This would provide clarity and reassurance, and I commend it to the Minister.
Finally, Amendment 131 seeks to improve the appropriate safeguards that would apply to processing for research, archiving and scientific purposes, including a requirement that the data subject has given consent. This has already been touched on in another amendment, but it is a way of seeking to address the issues that Amendments 70, 79 and 81 are also trying to address. Perhaps the Government will continue to insist that this is addressing a non-existent problem because nothing in Clauses 69 or 71 waters down the consent or purpose limitation protections and therefore the safeguards themselves add nothing. However, as I have said, informed readers of the Bill are interpreting it differently, so spelling out this safeguard would add clarity and avoid uncertainty. Surely such clarity on such an important matter is worth a couple of lines of additional length in a 250-page Bill. If the Government are going to argue that our Amendment 131 adds something objectionable, let them explain what is objectionable about consent protections applying to data processing for these purposes. I beg to move.
My Lords, I support Amendments 70 to 72, which I signed, in the name of the noble Lord, Lord Stevenson of Balmacara. I absolutely share his view about the impact of Clause 68 on the definition of consent and the potential and actual mistrust among the public about sharing of their data, particularly in the health service. It is highly significant that 3.3 million people have opted out of sharing their patient data.
I also very much share the noble Lord’s views about the need for public interest. In a sense, this takes us back to the discussion that we had on previous groups about whether we should add that in a broader sense—not purely for health data but for scientific research more broadly, as he specifies. I very much support what he had to say.
Broadly speaking, the common factor between my clause stand part and what he said is health data. Data subjects cannot make use of their data rights if they do not even know that their data is being processed. Clause 77 allows a controller reusing data under the auspices of scientific research not to notify a data subject in accordance with their rights under Articles 13 and 14 if doing so
“is impossible or would involve a disproportionate effort”.
We on these Benches believe that Clause 77 should be removed from the Bill. The safeguards are easily circumvented. The newly articulated compatibility test in new Article 8A, inserted by Clause 71, which specifies how closely related the new and existing purposes for data use need to be to permit reuse, is essentially automatically passed if the processing is conducted
“for the purposes of scientific research or historical research”.
This makes it even more necessary for the definition of scientific research to be tightened to prevent abuse.
Currently, data controllers must provide individuals with information about the collection and use of their personal data. These transparency obligations generally do not require the controller to contact each data subject. Such obligations can usually be satisfied by providing privacy information using different techniques that can reach large numbers of individuals, such as relevant websites, social media, local newspapers and so on.
My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.
In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.
The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.
I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.
So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.
We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, in as much as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.
My Lords, I support Amendment 71 and others in this group from the noble Lords, Lord Clement-Jones and Lord Stevenson. I apologise for not being able to speak at Second Reading. The noble Lord, Lord Clement-Jones, will remember that we took a deep interest in this issue when I was a Health Minister and the conversations that we had.
I had a concern at the time. We all know that the NHS needs to be digitised and that relevant health professionals need to be able to access relevant data when they need to, so that there is no need to be stuck with one doctor when you go to another part of the country. There are so many efficiencies that we could have in the system, as long as they are accessed by relevant and appropriate health professionals at the right time. But it is also important that patients have confidence in the system and that their personal data cannot be shared with commercial organisations without them knowing. As other noble Lords have said, this is an issue of trust.
For that reason, when I was in that position, I reached out to civil liberties organisations to understand their concerns. For example, medConfidential was very helpful and had conversations with DHSC and NHS officials. In fact, after those conversations, officials told me that its demands were reasonable and that some of the things being asked for were common sense and not that difficult to give.
I asked a Written Question of the noble Baroness’s ministerial colleague, the noble Baroness, Lady Merron, about whether patients will be informed of who has had access to their patient record, because that is important for confidence. The Answer I got back was that the Government were proposing a single unified health record. We all know that. She said that:
“Ensuring that patients’ confidential information remains protected and is seen only by those who need to see it will be a priority. Public engagement next month will help us understand what safeguards patients would want to see”.
Surely the fact that patients have opted out shows that they already have concerns and have raised them.
The NHS can build the best data system—or the federated data platform, as it is called—but without patient confidence it is simply a castle made of sand. As one of my heroes, Jimi Hendrix, once said, castles made of sand fall into the sea eventually. We do not want to see that with the federated data platform. We want to see a modernised system of healthcare digital records, allowing joined-up thinking on health and care right across a patient’s life. We should be able to use machine learning to analyse those valuable datasets to improve preventive care. But, for that to happen, the key has to be trust and patients being confident that their data is secure and used in the appropriate way. I look forward to the Minister’s response.
My Lords, I support these amendments in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. It is a pleasure to follow the second ex-Health Minister this afternoon. In many ways, the arguments are just the same for health data as they are for all data. It is just that, understandably, it is at the sharpest end of this debate. Probably the most important point for everybody to realise, although it is espoused so often, is that there is no such thing as NHS data. It is a collection of the data of every citizen in this country, and it matters. Public trust matters significantly for all data but for health data in particular, because it goes so close to our identity—our very being.
Yet we know how to do public trust in this country. We know how to engage and have had significant success in public engagement decades ago. What we could do now with human-led technology-supported public engagement could be on such a positive and transformational scale. But, so far, there has been so little on this front. Let us not talk of NHS data; let us always come back to the fundamental principle encapsulated in this group of amendments and across so many of our discussions on the Bill. Does the Minister agree that it is about not NHS data but our data—our decisions—and, through that, if we get it right, our human-led digital futures?
Many thanks to all noble Lords who have proposed and supported these amendments. I will speak to just a few of them.
Amendment 70 looks to mitigate the lowering of the consent threshold for scientific research. As I have set out on previous groups, I too have concerns about that consent threshold. However, for me the issue is more with the definition of scientific research than with the consent threshold, so I am not yet confident that the amendment is the right way to achieve those desirable aims.
Amendment 71 would require that no NHS personal data can be made available for scientific research without the explicit consent of the patient. I thank the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, for raising this because it is such an important matter. While we will discuss this in other groups, as the noble Baroness, Lady Kidron, points out, we need to get it right.
I regret to advise my noble friend Lord Holmes that I was going to start my next sentence with the words “Our NHS data”, but I will not. The data previously referred to is a very significant and globally unique national asset, comprising many decades of population-wide, cradle-to-grave medical data. No equivalent at anything like the same scale or richness exists anywhere, which makes it incredibly valuable. I thank my noble friend Lord Kamall for stressing this point with, as ever, the help of Jimi Hendrix.
However, that data is valuable only to the extent that it can be safely exploited for research and development purposes. The data can collectively help us develop new medicines or improve the administration and productivity of the NHS, but we need to allow it to do so properly. I am concerned that this amendment, if enacted, would create too high an operational and administrative barrier to the safe exploitation of this data. I have no interest in compromising on the safety, but we have to find a more efficient and effective way of doing it.
Amendments 79, 81 and 131 all look to clarify that the definition of consent to be used is in line with the definition in Article 4.11 of the UK GDPR:
“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
This amendment would continue the use of a definition that is well understood. However, paragraph 3(a) of new Article 8A appears sufficient, in that the purpose for which a data subject consents is “specified, explicit and legitimate”.
Finally, with respect to Clause 77 stand part, I take the point and believe that we will be spending a lot of time on these matters going forward. But, on balance and for the time being, I feel that this clause needs to remain, as there must be clear rules on what information should be provided to data subjects. We should leave it in for now, although we will no doubt be looking to polish it considerably.
My Lords, I thank noble Lords for another thought-provoking debate on consent in scientific research. First, let me set out my staunch agreement with all noble Lords that a data subject’s consent should be respected.
Regarding Amendment 70, Clause 68 reproduces the text from the current UK GDPR recitals, enabling scientists to obtain “broad consent” for an area of research from the outset and to focus on potentially life-saving research. This has the same important limitations, including that it cannot be used if the researcher already knows its specific purpose and that consent can be revoked at any point.
I turn to Amendments 71 and 72, in the name of my noble friend Lord Stevenson, on assessments for research. Requiring all research projects to be submitted for assessments could discourage or delay researchers in their important work, as various noble Lords mentioned. However, I understand that my noble friend’s main concern is around NHS data. I assure him that, if NHS data is used for research, individual patients cannot be identified unless either a patient has specifically agreed for that data to be shared or the Health Research Authority has approved an application for this information to be used, informed by advice from the independent and expert Confidentiality Advisory Group. Research projects using confidential patient data are always subject to rigorous governance, including the approval of an ethics committee; the Minister, my noble friend Lady Jones, mentioned this earlier. There are also strict controls around who can see the data and how it is used and stored. Nothing in this clause will change that approach.
I turn to Amendments 81 and 131 on consent. I understand the motivations behind adding consent as a safeguard. However, organisations such as the Health Research Authority have advised researchers against relying on consent under the UK GDPR; for instance, an imbalance of power may mean that consent cannot truly be “freely given”.
On Amendment 79, I am happy to reassure my noble friend Lord Stevenson that references to “consent” in Clause 71 do indeed fall under the definition in Article 4.11.
Lastly, I turn to Clause 77, which covers the notification exemption; we will discuss this in our debates on upcoming groups. The Government have identified a gap in the UK GDPR that may disproportionately affect researchers. Where data is not collected from the data subject, there is an exemption from notifying them if getting in contact would mean a disproportionate amount of effort. This does not apply to data collected from the data subject. However, in certain studies, such as those of degenerative neurological conditions, it can be impossible or involve a disproportionate effort to recontact data subjects to inform them of any change in the study. The Bill will therefore provide a limited exemption with strong safeguards for data subjects.
Numerous noble Lords asked various questions. They touched on matters that we care about very much: trust in the organisation asking for data; the transparency rules; public interest; societal value; the various definitions of “consent”; and, obviously, whether we can have confidence in what is collected. I will not do noble Lords’ important questions justice if I stand here and try to give answers on the fly, so I will do more than just write a letter to them: I will also ask officials to organise a technical briefing and meeting so that we can go into everyone’s concerns in detail.
With that, I hope that I have reassured noble Lords that there are strong protections in place for data subjects, including patients; and that, as such, noble Lords will feel content to withdraw or not press their amendments.
My Lords, I thank those who participated in this debate very much indeed. It went a little further than I had intended in drafting these amendments, but it has raised really important issues which I think we will probably come back to, if not later in Committee, certainly at Report.
At the heart of what we discussed, we recognise, as the noble Baroness, Lady Kidron, put it, that our data held by the NHS—if that is a better way of saying it—is valuable both in financial terms and because it should and could bring better health in future. Therefore, we value it specifically among some of the other datasets that we are talking about, because it has a returning loop in it. It is of benefit not just to the individual but to the UK as a whole, and we must respect that.
However, the worry that underlies framing it in that way is that, at some point, a tempting offer will be made by a commercial body—perhaps one is already on the table—which would generate new funding for the NHS and our health more generally, but the price obtained for that will not reflect the value that we have put into it over the years and the individual data that is being collected. That lack of trust is at the heart of what we have been talking about. In a sense, these amendments are about trust, but they are also bigger. They are also about the whole question of what it is that the Government as a whole do on our behalf in holding our data and what value they will obtain for that—something which I think we will come back to on a later amendment.
I agree with much of what was said from all sides. I am very grateful to the noble Lords, Lord Kamall and Lord Holmes, from the Opposition for joining in the debate and discussion, and their points also need to be considered. The Minister replied in a very sensible and coherent way; I will read very carefully what he said in Hansard and we accept his kind offer of a technical briefing on the Bill—that would be most valuable. I beg leave to withdraw the amendment.
My Lords, I start with an apology, because almost every amendment in this group is one of mine and I am afraid I have quite a long speech to make about the different amendments, which include Amendments 73, 75, 76, 77, 78, 78A, 83, 84, 85, 86, 89 and 90, and stand part debates on Schedules 4, 5 and 7 and Clause 74. But I know that the Members of this Committee are made of strong stuff.
Clause 70 and Schedule 4 introduce a new ground of recognised legitimate interest, which in essence counts as a lawful basis for processing if it meets any of the descriptions in the new Annexe 1 to the UK GDPR, which is at Schedule 4 to the Bill—for example, processing necessary for the purposes of responding to an emergency or detecting crime. These have been taken from the previous Government’s Data Protection and Digital Information Bill. This is supposed to reduce the burden on data controllers and the cost of legal advice when they have to assess whether it is okay to use or share data or not. Crucially, while the new ground shares its name with “legitimate interest”, it does not require the controller to make any balancing test taking the data subject’s interests into account. It just needs to meet the grounds in the list. The Bill gives the Secretary of State powers to define additional recognised legitimate interests beyond those in Annexe 1—a power heavily criticised by the Delegated Powers and Regulatory Reform Committee’s report on the Bill.
Currently where a private body shares personal data with a public body in reliance on Article 6(1)(e) of the GDPR, it can rely on the condition that the processing is
“necessary for the performance of a task carried out in the public interest”.
New conditions in Annexe 1, as inserted by Schedule 4, would enable data sharing between the private and public sectors to occur without any reference to a public interest test. In the list of recognised legitimate interests, the most important is the ability of any public body to ask another controller, usually in the private sector, for the disclosure of personal data it needs to deliver its functions. This applies to all public bodies. The new recognised legitimate interest legal basis in Clause 70 and Schedule 4 should be dropped.
Stephen Cragg KC, giving his legal opinion on the DPDI Bill, which, as I mentioned, has the same provision, stated that this list of recognised legitimate interests
“has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned”.
The ICO has also flagged concerns about recognised legitimate interests. In its technical drafting comments on the Bill, it said:
“We think it would be helpful if the explanatory notes could explicitly state that, in all the proposed new recognised legitimate interests, an assessment of necessity involves consideration of the proportionality of the processing activity”.
An assessment of proportionality is precisely what the balancing test is there to achieve. Recognised legitimate interests undermine the fundamental rights and interests of individuals, including children, in specific circumstances.
When companies are processing data without consent, it is essential that they do the work to balance the interests of the people who are affected by that processing against their own interests. Removing recognised legitimate interests from the Bill will not stop organisations from sharing data with the public sector or using data to advance national security, detect crime or safeguard children and vulnerable people. The existing legitimate interest lawful basis is more than flexible enough for these purposes. It just requires controllers to consider and respect people’s rights as they do so.
During the scrutiny of recognised legitimate interests in the DPDI Bill—I am afraid to have to mention this—the noble Baroness, Lady Jones of Whitchurch, who is now leading on this Bill as the Minister, raised concerns about the broad nature of the objectives. She rightly said:
“There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests”.—[Official Report, 25/3/24; col. GC 106.]
She never spoke a truer word.
However, this Government have reintroduced the same extra power with no new articulation of any strong reason for needing it. The constraints placed on the Secretary of State are slightly tighter in this Bill than they were in the DPDI Bill, as new paragraph (9), inserted by Clause 70(4), means that they are able to add new recognised legitimate interests only if they consider the processing to be necessary to safeguard an objective listed in UK GDPR Article 23(1)(c) to (j). However, this list includes catch-alls, such as
“other important objectives of general public interest”.
To give an example of what this power would allow, the DPDI Bill included a recognised legitimate interest relating to the ability of political parties to use data about citizens during election campaigns on the basis that democratic participation is an objective of general public interest. I am glad to say that this is no longer included. Another example is that a future Secretary of State could designate workplace productivity as a recognised legitimate interest—which, without a balancing test, would open the floodgates to intrusive workplace surveillance and unsustainable data-driven work intensification. That does not seem to be in line with the Government’s objectives.
Amendment 74 is rather more limited. Alongside the BMA, we are unclear about the extent of the impact of Clause 70 on the processing of health data. It is noted that the recognised legitimate interest avenue appears to be available only to data controllers that are not public authorities. Therefore, NHS organisations appear to be excluded. We would welcome confirmation that health data held by an NHS data controller is excluded from the scope of Clause 70 now and in the future, regardless of the lawful basis that is being relied on to process health data.
I cannot compete with that tour de force. I shall speak to Amendments 73 and 75 in the name of the noble Lord, Lord Clement-Jones, to which I have added my name; Amendments 76, 83 and 90 on the Secretary of State’s powers; and Amendments 85 and 86, to which I wish I had added my name, but it is hard to keep up with the noble Lord. I am in sympathy with the other amendments in the group.
The issue of recognised legitimate interest has made a frequent appearance in the many briefings I have received and, despite reading the Explanatory Notes for the Bill several times, I have struggled to understand in plain English the Government’s intent and purpose. I went to the ICO website to remind myself of the definition of legitimate interest, to try to understand why recognised legitimate interest was necessary. It states:
“Legitimate interests is the most flexible lawful basis for processing, but you cannot assume it will always be the most appropriate.”
It then goes on:
“If you choose to rely on legitimate interests, you are taking on extra responsibility for considering and protecting people’s rights and interests.”
That seems to strike a balance between compelling justifications for processing and the need to consider and protect individual data rights and interests. I would be very interested to hear from the Minister why the new category of “recognised legitimate interest” is necessary. Specifically, why do the Government believe that when processing may have far-reaching consequences, such as in national security, crime prevention and safeguarding, there is no need to undertake a legitimate interest assessment? What is the justification for the ability of any public body to demand data from private companies for any purpose? I ask those questions to be precise about the context and purpose.
I am not suggesting that there is no legitimate interest for processing personal data without consent, but the legitimate interest assessment is a check and balance that ensures oversight and reduces the risk of overreach. It is a test, not a blocker, and does not in itself prevent processing if the balancing test determines that processing should go ahead. Amendment 85 illustrates this point in relation to vulnerable users. Given that a determination that a person is at risk would have far-reaching consequences for that person, the principles of fairness and accountability demand that those making the decision must follow due process and that those subject to the decision are made aware—if not in an emergency, certainly at some point in the proceedings.
In laying Amendment 86, the noble Lord, Lord Clement-Jones, raises an important question that I am keen to hear from Ministers on: namely, what is the Government’s plan for ensuring that a designation that an individual is vulnerable is monitored and removed when it is no longer appropriate? If a company or organisation has a legitimate interest in processing someone’s data, having balanced it against the interests of data subjects, it is free to do so. I ask the Minister again to give concrete examples of circumstances in which the current legitimate interest basis is insufficient, so that we understand the problem the Government are trying to solve.
At Second Reading, the Government’s curious defence of this new measure was the idea that organisations had concerns about whether they were doing the balancing test correctly, so the new measure is there to help, but perhaps the Minister can explain what benefits accrue from introducing the new measure that could not have been better achieved by the ICO providing more concrete guidance on the balancing test. Given that the measure is focused on the provision of public interest areas, such as national security and the detection of crime, how does the creation of the recognised legitimate interest help the majority of data controllers, rather than simply serving the interests of incumbents and/or government departments by removing an important check and balance?
Amendments 76, 83 and 90 seek to curb the power of the Secretary of State to override primary legislation and to modify key aspects of UK data protection law via statutory instrument. The proposed provisions in Clauses 70, 71 and 74 put one person in control, rather than Parliament. Elon Musk’s new role in the upcoming US Administration gives him legitimacy as an incoming officeholder in the Executive, but that role is complicated by the fact that he is also CEO and majority shareholder of X. Whether at OpenAI, Google, Amazon, Palantir or any other tech behemoth, tech executives are not elected or bound to fulfil social goods or commitments, other than making a profit for their shareholders. They also fund many of the think tanks, reports and events in the political ecosystem, and there is a well-worn path of employment between industry, government and regulators.
No single person should be the carrier of that incredible burden. For now, Parliament is the only barrier in the increasingly confused picture of regulatory and political capture by the tech sector. We should fight to keep it that way.
My Lords, I support Amendment 74 from the noble Lords, Lord Scriven and Lord Clement-Jones, on excluding personal health data from being a recognised legitimate interest. I also support Amendment 78 on having a statement by the Secretary of State to recognise that legitimate interest and Amendments 83 and 90, which would remove powers from the Secretary of State to override primary legislation to modify data protection via an SI. There is not much to add to what I said on the previous group, so I will not repeat all the arguments made then. In simple terms, I repeat the necessity for trust—in health, particularly for patient trust. You do not gain trust simply by defining personal health data as a legitimate interest or by overriding primary legislation on the say-so of a Secretary of State, even if it is laid as a statutory instrument.
My Lords, I want to ask the Minister and the noble Lord, Lord Clement-Jones, in very general terms for their views on retrospectivity. Do they believe that the changes to data protection law in the Bill are intended to be applied to data already held at this time or will the new regime apply only to personal data collected going forwards from this point? I ask that specifically of data pertaining to children, from whom sensitive data has already been collected. Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold, or will they apply only to data collected going forward?
I thank in particular the noble Lord, Lord Clement-Jones, who has clearly had his Weetabix this morning. I will comment on some of the many amendments tabled.
On Amendments 73, 75, 76, 77, 83 and 90, I agree it is concerning that the Secretary of State can amend such important legislation via secondary legislation. However, the regulations in question are subject to the affirmative procedure and, therefore, to parliamentary scrutiny. Since the DPDI Bill proposed the same, I have not changed my views; I remain content that this is the right level of oversight and that these changes do not need to be made via primary legislation.
As for Amendment 74, preventing personal health data from being considered a legitimate interest seems wise. It is best to err on the side of caution when it comes to sharing personal health data.
Amendment 77 offers an interesting suggestion, allowing businesses affiliated by contract to be treated in the same way as large businesses that handle data from multiple companies in a group. This would certainly be beneficial for SMEs collaborating on a larger project. However, each such business may have different data protection structures and terms of use. Therefore, while this idea certainly has merit, I am a little concerned that it may benefit from some refining to ensure that the data flows between businesses in a way to which the data subject has consented.
On Amendment 78A and Schedule 4 standing part, there are many good, legitimate interest reasons why data must be quickly shared and processed, many of which are set out in Schedule 4: for example, national security, emergencies, crimes and safeguarding. This schedule should therefore be included in the Bill to set out the details on these important areas of legitimate interest processing. Amendment 84 feels rather like the central theme of all our deliberations thus far today, so I will listen with great interest, as ever, to the Minister’s response.
I have some concerns about Amendment 85, especially the use of the word “publicly”. The information that may be processed for the purposes of safeguarding vulnerable individuals is likely to be deeply sensitive and should not be publicly available. Following on from this point, I am curious to hear the Minister’s response to Amendment 86. It certainly seems logical that provisions should be in place so that individuals can regain control of their personal data should the reason for their vulnerability be resolved. As for the remaining stand part notices in this group, I do not feel that these schedules should be removed because they set out important detail on which we will come to rely.
My Lords, when the noble Lord, Lord Clement-Jones, opened his speech he said that he hoped that noble Lords would be made of strong stuff while he worked his way through it. I have a similar request regarding my response: please bear with me. I will address these amendments slightly out of order to ensure that related issues are grouped together.
The Schedule 4 stand part notice, and Amendments 73 and 75, tabled by the noble Lord, Lord Clement-Jones, and supported by the noble Baroness, Lady Kidron, would remove the new lawful ground of “recognised legitimate interests” created by Clause 70 and Schedule 4 to the Bill. The aim of these provisions is to give data controllers greater confidence about processing personal data for specified and limited public interest objectives. Processing that is necessary and proportionate to achieve one of these objectives can take place without a person’s consent and without undertaking the legitimate interests balancing test. However, they would still have to comply with the wider requirements of data protection legislation, where relevant, ensuring that the data is processed in compliance with the other data protection principles.
I say in response to the point raised by the noble Lord, Lord Cameron, that the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively.
The activities listed include processing of data where necessary to prevent crime, safeguarding national security, protecting children or responding to emergencies. They also include situations where a public body requests that a non-public body share personal data with it to help deliver a public task that is sanctioned by law. In these circumstances, it is very important that data is shared without delay, and removal of these provisions from the Bill, as proposed by the amendment, could make that harder.
Amendment 74, tabled by the noble Lord, Lord Scriven, would prevent health data being processed as part of this new lawful ground, but this could have some unwelcome effects. For example, the new lawful ground is designed to give controllers greater confidence about reporting safeguarding concerns, but if these concerns relate to a vulnerable person’s health, they would not be able to rely on the new lawful ground to process the data and would have to identify an alternative lawful ground.
On the point made by the noble Lord, Lord Clement-Jones, about which data controllers can rely on the new lawful ground, it would not be available to public bodies such as the NHS; it is aimed at non-public bodies.
I reassure noble Lords that there are still sufficient safeguards in the wider framework. Any processing that involves special category data, such as health data, would also need to comply with the conditions and safeguards in Article 9 of the UK GDPR and Schedule 1 to the Data Protection Act 2018.
Amendment 78A, tabled by the noble Lord, Lord Clement-Jones, would remove the new lawful ground for non-public bodies or individuals to disclose personal data at the request of public bodies, where necessary, to help those bodies deliver their public interest tasks without carrying out a legitimate interests balancing test. We would argue that, without it, controllers may lack certainty about the correct lawful ground to rely on when responding to such requests.
Amendment 76, also tabled by the noble Lord, Lord Clement-Jones, would remove the regulation-making powers in Clause 70 that would allow the Secretary of State to keep the list of recognised legitimate interests up to date. Alternatively, the noble Lord’s Amendment 78 would require the Secretary of State to publish a statement every time he added a new processing activity to the list, setting out its purpose, which controllers it was aimed at and for how long they could use it. I reassure the noble Lord that the Government have already taken steps to tighten up these powers since the previous Bill was considered by this House.
Any new processing activities added would now also have to serve
“important objectives of … public interest”
as described in Article 23.1 of the UK GDPR and, as before, new activities could be added to the list only following consultation with the ICO and other interested parties. The Secretary of State would also have to consider the impact of any changes on people’s rights and have regard to the specific needs of children. Although these powers are likely to be used sparingly, the Government think it important that they be retained. I reassure the Committee that we will be responding to the report from the Delegated Powers Committee within the usual timeframes and we welcome its scrutiny of the Bill.
The noble Lord’s Amendment 77 seeks to make it clear that organisations should also be able to rely on Article 6.1(f) to make transfers between separate businesses affiliated by contract. The list of activities mentioned in Clause 70 is intended to be illustrative only and is drawn from the recitals to the UK GDPR. This avoids providing a very lengthy list that might be viewed as prescriptive. Article 6.1(f) of the UK GDPR is flexible. The transmission of personal data between businesses affiliated by contract may constitute a legitimate interest, like many other commercial interests. It is for the controller to determine this on a case-by-case basis.
I will now address the group of amendments tabled by the noble Lord, Lord Clement-Jones, concerning the purpose limitation principle, specifically Amendments 83 to 86. This principle limits the ways that personal data collected for one purpose can be used for another, but Clause 71 aims to provide more clarity and certainty around how it operates, including how certain exemptions apply.
Amendment 84 seeks to clarify whether the first exemption in proposed new Annexe 2 to the UK GDPR would allow personal data to be reused for commercial purposes. The conditions for using this exemption are that the requesting controller has a public task or official authority laid down in law that meets a public interest objective in Article 23.1 of the UK GDPR. As a result, I and the Government are satisfied that these situations would be for limited public interest objectives only, as set out in law.
Amendments 85 and 86 seek to introduce greater transparency around the use of safeguarding exemptions in paragraph 8 of new Annexe 2. These conditions are drawn from the Care Act 2014 and replicated in the existing condition for sensitive data processing for safeguarding purposes in the Data Protection Act 2018. I can reassure the Committee that processing cannot occur if it does not meet these conditions, including if the vulnerability of the individual no longer exists. In addition, requiring that an assessment be made and given to the data subject before the processing begins could result in safeguarding delays and would defeat the purpose of this exemption.
Amendment 83 would remove the regulation-making powers associated with this clause so that new exceptions could not be added in future. I remind noble Lords that there is already a power to create exemptions from the purpose limitation principle in the DPA 2018. This Bill simply moves the existing exemptions to a new annexe to the UK GDPR. The power is strictly limited to the public objectives listed in Article 23.1 of the UK GDPR.
I now turn to the noble Lord’s Amendment 89, which seeks to set conditions under which pseudonymised data should be treated as personal data. This is not necessary as pseudonymised data already falls within the definition of personal data under Article 4.1 of the UK GDPR. This amendment also seeks to ensure that a determination by the ICO that data is personal data applies
“at all points in that processing”.
However, the moment at which data is or becomes personal should be a determination of fact based on its identifiability to a living individual.
I turn now to Clause 74 stand part, together with Amendment 90. Noble Lords are aware that special categories of data require additional protection. Article 9 of the UK GDPR sets out an exhaustive list of what is sensitive data and outlines processing conditions. Currently, this list cannot be amended without primary legislation, which may not always be available. This leaves the Government unable to respond swiftly when new types of sensitive data are identified, including as a result of emerging technologies. The powers in Clause 74 enable the Government to respond more quickly and add new special categories of data, tailor the conditions applicable to their use and add new definitions if necessary.
Finally, I turn to the amendment tabled by the noble Lord, Lord Clement-Jones, that would remove Schedule 7 from the Bill. This schedule contains measures to create a clearer and more outcomes-focused UK international data transfers regime. As part of these reforms, this schedule includes a power for the Secretary of State to recognise new transfer mechanisms for protecting international personal data transfers. Without this, the UK would be unable to respond swiftly to emerging developments and global trends in personal data transfers. In addition, the ICO will be consulted on any new mechanisms, and they will be subject to debate in Parliament under the affirmative resolution procedure.
I hope this helps explain the Government’s intention with these clauses and that the noble Lord will feel able to withdraw his amendment.
My Lords, I thank the Minister. She covered quite a lot of ground and all of us will have to read Hansard quite carefully. However, it is somewhat horrifying that, for a Bill of this size, we had about 30 seconds from the Minister on Schedule 7, which could have such a huge influence on our data adequacy when that is assessed next year. I do not think anybody has talked about international transfers at this point, least of all me in introducing these amendments. Even though it may appear that we are taking our time over this Bill, we are not fundamentally covering all its points. The importance of this Bill, which obviously escapes most Members of this House—there are just a few aficionados—is considerable and could have a far-reaching impact.
I still get Viscount Camrose vibes coming from the Minister.
Perhaps I should say that this kind of enthusiasm clearly conquers all. I should thank a former Minister, the noble Lord, Lord Kamall, and I thank the noble Baroness, Lady Kidron, for her thoughtful speech, particularly in questioning the whole recognised legitimate interest issue, especially in relation to vulnerable individuals.
It all seems to be a need for speed, whether it is the Secretary of State who has to make snappy decisions or a data controller. We are going to conquer uncertainty. We have to keep bustling along. In a way, to hell with individual data rights; needs must. I feel somewhat Canute-like, trying to hold back the tide of data that will flow over us. I feel quite uncomfortable with that. I think the DPRRC is likewise going to feel pretty cheesed off.
My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.
Additional safeguards are required for the protection of children’s data. This amendment
“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.
The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.
For most children’s data processing, adults give permission on their behalf. The extension of this for additional purposes may be incompatible with what a data subject later wishes as an adult. The only protection they may have is purpose limitation to ensure that they are reconsented or informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” at the time of collection, which this change does away with.
There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should data subjects’ current rights be undermined as a child and, through this change, never be able to be reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation that may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms more with the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that their data rights have been given away by their parents on their behalf.
Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.
The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:
“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.
As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.
During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.
The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, concerns remain, despite the strategy statement and the amendment to the original Online Safety Bill, that smaller platforms with significant potential for harm might not be sufficiently covered under the category 1 designation. Overall, while the final approach allows some flexibility, there is quite some debate about whether enough emphasis will be placed by Ofcom in its categorisation on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be rigorously addressing these issues. I beg to move.
My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.
Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.
This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.
First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling to both the Government and the regulator arguing that the proposed age of adulthood in the AADC be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cited the UNCRC that defines a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.
In contrast, on several other issues both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about the scope and form of the coroner provisions were initially diluted and had to be refought for a second time by bereaved parents. Other examples, such as promises of a mixed economy, age-assurance requirements and a focus on contact harm, features and functionalities as well as content, are some of the ministerial promises that reflected Parliament’s intention but do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.
Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s rights to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used and narrow definitions of what must and may not be taken down.
Tech is various, its contexts infinite, its rate of change giddy and the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be, what the principles are, and not what the process is. My argument for this amendment is that we need to fix our intention that in the Bill children have an established set of needs according to their evolving capacity. Similarly, they have a right to a higher bar of privacy, so that both these principles become unavoidable.
My Lords, I put my name to the amendments from the noble Baroness, Lady Kidron, and will briefly support them. I state my interest as a governor of Coram, the children’s charity. One gets a strong sense of déjà vu with this Bill. It takes me back to the Online Safety Bill and the Victims and Prisoners Bill, where we spent an inordinate amount of time trying to persuade the Government that children are children and need to be treated as children, not as adults. That was hard work. They have an absolute right to be protected and to be treated differently.
I ask the Minister to spend some time, particularly when her cold is better, with some of her colleagues whom we worked alongside during the passage of those Bills in trying to persuade the then Government of the importance of children being specifically recognised and having specific safeguards. If she has time to talk to the noble Lords, Lord Ponsonby, Lord Stevenson and Lord Knight, and the noble Baroness, Lady Thornton —when she comes out of hospital, which I hope will be soon—she will have chapter, book and verse about the arguments we used, which I hope we will not have to rehearse yet again in the passage of this Bill. I ask her please to take the time to learn from that.
As the noble Baroness said, what is fundamental is not what is hinted at or implied at the Dispatch Box, but what is actually in the Bill. When it is in the Bill, you cannot wriggle out of it—it is clearly there, stating what it is there for, and it is not open to clever legal interpretation. In a sense, we are trying to future-proof the Bill by, importantly, as she said, focusing on outcomes. If you do so, you are much nearer to future-proofing than if you focus on processes, which by their very nature will be out of date by the time you have managed to understand what they are there to do.
Amendment 135 is important because the current so-called safeguard for the Information Commissioner to look after the interests of children is woefully inadequate. One proposed new section in Clause 90 talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”.
It is not just children; most adults do not have a clue about any of that, so to expect children to have even the remotest idea is just a non-starter. To add insult to injury, that new section begins
“the Commissioner must have regard to such of the following”—
of which the part about children is one—
“as appear to the Commissioner to be relevant in the circumstances”.
That is about as vague and weaselly as it is possible to imagine. It is not adequate in any way, shape or form.
In all conscience, I hope that will be looked at very carefully. The idea that the commissioner might in certain circumstances deem that the status and importance of children is not relevant is staggering. I cannot imagine a circumstance in which that would be the case. Again, what is in the Bill really matters.
On Amendment 94, not exempting the provision of information regarding the processing of children’s data is self-evidently extremely important. On Amendment 82, ring-fencing children’s data from being used by a controller for a different purpose again seems a no-brainer.
Amendment 196, as the noble Lord, Lord Clement-Jones, says, is a probing amendment. It seems eminently sensible when creating Acts of Parliament that in some senses overlap, particularly in the digital and online world, that the left hand should know what the right hand is doing and how two Acts may be having an effect on one another, perhaps not in ways that had been understood or foreseen when the legislation was put forward. We are looking for consistency, clarity, future-proofing and a concentration on outputs, not processes. First and foremost, we are looking for the recognition, which we fought for so hard and finally got, that children are children and need to be recognised and treated as children.
My Lords, I think we sometimes forget, because the results are often so spectacular, the hard work that has had to happen over the years to get us to where we are, particularly in relation to the Online Safety Act. It is well exemplified by the previous speaker. He put his finger on the right spot in saying that we all owe considerable respect for the work of the noble Baroness, Lady Kidron, and others. I helped a little along the way. It is extraordinary to feel that so much of this could be washed away if the Bill goes forward in its present form. I give notice that I intend to work with my colleagues on this issue because this Bill is in serious need of revision. These amendments are part of that and may need to be amplified in later stages.
I managed to sign only two of the amendments in this group. I am sorry that I did not sign the others, because they are also important. I apologise to the noble Lord, Lord Clement-Jones, for not spotting them early enough to be able to do so. I will speak to the ones I have signed, Amendments 88 and 135. I hope that the Minister will give us some hope that we will be able to see some movement on this.
The noble Lord, Lord Russell, mentioned the way in which the wording on page 113 seems not only to miss the point but to devalue the possibility of seeing protections for children well placed in the legislation. New Clause 120B(e), which talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”,
almost says it all for me. I do not understand how that could possibly have got through the process by which this came forward, but it seems to speak to a lack of communication between parts of government that I hoped this new Government, with their energy, would have been able to overcome. It speaks to the fact that we need to keep an eye on both sides of the equation: what is happening in the online safety world and how data that is under the control of others, not necessarily those same companies, will be processed in support or otherwise of those who might wish to behave in an improper or illegal way towards children.
At the very least, what is in these amendments needs to be brought into the Bill. In fact, other additions may need to be made. I shall certainly keep my eye on it.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for bringing forward amendments in what is a profoundly important group. For all that data is a cornerstone of innovation and development, as we have often argued in this Committee, we cannot lose sight of our responsibility to safeguard the rights and welfare of our children.
I thank all noble Lords who have raised this important topic. I say at the outset that I appreciate and pay tribute to those who have worked on this for many years—in particular the noble Baroness, Lady Kidron, who has been a fantastic champion of these issues.
I also reassure noble Lords that these provisions are intended to build upon, and certainly not to undermine, the rights of children as they have previously been defined. We share noble Lords’ commitment to ensuring high standards of protection for children. That is why I am glad that the Bill, together with existing data protection principles, already provides robust protections for children. I hope that my response to these amendments shows that we take these issues seriously. The ICO also recognises in its guidance, following the UN Committee on the Rights of the Child, that the duties and responsibilities to respect the rights of children extend in practice to private actors and business enterprises.
Amendment 82, moved by the noble Lord, Lord Clement-Jones, would exclude children’s personal data from the exemptions to the purpose limitation principles in Schedule 5 to the Bill. The new purposes are for important public interests only, such as safeguarding vulnerable individuals or children. Broader existing safeguards in the data protection framework, such as the fairness and lawfulness principles, also apply. Prohibiting a change of purpose in processing could impede important activities, such as the safeguarding issues to which I have referred.
Amendment 88, tabled by the noble Baroness, Lady Kidron, would introduce a new duty requiring all data controllers to consider that children are entitled to higher protection than adults. We understand the noble Baroness’s intentions and, in many ways, share her aims, but we would prefer to focus on improving compliance with the current legislation, including through the way the ICO discharges its regulatory functions.
In addition, the proposed duty could have some unwelcome and unintended effects. For example, it could lead to questions about why other vulnerable people are not entitled to enhanced protections. It would also apply to organisations of all sizes, including micro-businesses and voluntary sector organisations, even if they process children’s data on only a small scale. It could also cause confusion about what they would need to do to verify age to comply with the new duty.
Amendment 94, also tabled by the noble Baroness, would ensure that the new notification exemptions under Article 13 would not apply to children. However, removing children’s data from this exemption could mean that some important research—for example, on the causes of childhood diseases—could not be undertaken if the data controller were unable to contact the individuals about the intended processing activity.
Amendment 135 would place new duties on the ICO to uphold the rights of children. The ICO’s new strategic framework, introduced by the Bill, has been carefully structured to achieve a similar effect. Its principal objective requires the regulator to
“secure an appropriate level of protection for personal data”.
This gives flexibility and nuance in the appropriateness of the level of protections; they are not always the same for all data subjects, all the time.
Going beyond this, though, the strategic framework includes the new duty relating to children. This acknowledges that, as the noble Baroness, Lady Kidron, said, children may be less aware of the risks and consequences associated with the processing of their data, as well as of their rights. As she pointed out, this is drawn from recital 38 to the UK GDPR, but the Government’s view is that the Bill’s language gives sufficient effect to the recital. We recognise the importance of clarity on this issue and hope that we have achieved it but, obviously, we are happy to talk further to the noble Baroness on this matter.
This duty will also be a consideration for the ICO and one to which the commissioner must have regard across all data protection activities, where relevant. It will inform the regulator’s thinking on everything from enforcement to guidance, including how work might need to be tailored to suit children at all stages of childhood in order to ensure that the levels of protection are appropriate.
Finally, regarding Amendment 196—
I thank the Minister for giving way. I would like her to explain why only half of the recital is in the Bill and why the fact that children merit special attention is in the Bill. How can it possibly be that, in this Bill, we are giving children adequate protection? I can disagree with some of the other things that she said, but I would like her to answer that specific question.
To be on the safe side, I will write to the noble Baroness. We feel that other bits in the provisions of the Bill cover the other aspects but, just to be clear on it, I will write to her. On Amendment 196 and the Online Safety Act—
I am sorry to interrupt but I am slightly puzzled by the way in which that exchange just happened. I take it from what the Minister is saying that there is no dissent, in her and the Bill team’s thinking, about children’s rights having to be given the correct priority, but she feels that the current drafting is better than what is now proposed because it does not deflect from the broader issues that she has adhered to. She has fallen into the trap, which I thought she never would do, of blaming unintended consequences; I am sure that she will want to rethink that before she comes back to the Dispatch Box.
Surely the point being made here is about the absolute need to make sure that children’s rights never get taken down because of the consideration of other requirements. They are on their own, separate and not to be mixed up with those considerations that are truly right for the commissioner—and the ICO, in its new form—to take but which should never deflect from the way children are protected. If the Minister agrees with that, could she not see some way of reaching out to be a bit closer to where the noble Baroness, Lady Kidron, is?
I absolutely recognise the importance of the issues being raised here, which is why I think I really should write: I want to make sure that whatever I say is properly recorded and that we can all go on to debate it further. I am not trying to duck the issue; this issue is just too important for me to give an off-the-cuff response on it. I am sure that we will have further discussions on this. As I say, let me put it in writing, and we can pick that up. Certainly, as I said at the beginning, our intention was to enhance children’s protection rather than deflect from it.
Moving on to Amendment 196, I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising this important issue and seeking clarity on how the provision relates to the categorisation of services in the Online Safety Act. These categories are, however, not directly related to Clause 122 of this Bill, as a data preservation notice can be issued to any service provider regulated under the Online Safety Act, regardless of categorisation. A list of the relevant persons is provided in paragraphs (a) to (e) of Section 100(5) of the Act; it includes any user-to-user service, search service and ancillary service.
I absolutely understand noble Lords saying that these things should cross-reference in some way but, as far as we are concerned, they complement each other, and that protection is currently in the Online Safety Act. As I said, I will write to noble Lords and am happy to meet if that would be helpful. In the meantime, I hope that the explanations I have given are sufficient grounds for noble Lords not to press their amendments at this stage.
I thank the Minister for her response. I should say at the outset that, although I may have led the group, it is clear that the noble Baroness, Lady Kidron, leads the pack as far as this is concerned. I know that she wants me to say that the noble Baroness, Lady Harding, wished to say that she was extremely sorry not to be able to attend as she wanted to associate herself wholeheartedly with these amendments. She said, “It’s so disappointing still to be fighting for children’s data to have higher protection but it seems that that’s our lot!” I think she anticipated the response, sadly. I very much thank the noble Baroness, Lady Kidron, the noble Lords, Lord Russell and Lord Stevenson, and the noble Viscount, Lord Camrose, in particular for his thoughtful response to Amendment 196.
I was very interested in the intervention from the noble Lord, Lord Stevenson, and wrote down “Not invented here” to sum up the Government’s response to some of these amendments, which has been consistently underwhelming throughout the debates on the DPDI Bill and this Bill. They have brought out such things as “the unintended effects” and said, “We don’t want to interfere with the ICO”, and so on. This campaign will continue; it is really important. Obviously, we will read carefully what the Minister said but, given the troops behind me, I think the campaign will only get stronger.
The Minister did not really deal with the substance of Amendment 196, which was not just a cunning ploy to connect the Bill with the Online Safety Act; it was about current intentions on categorisation. There is considerable concern that the current category 1 is overconservative and that we are not covering the smaller, unsafe social media platforms. When we discussed the Online Safety Bill, both in the Joint Committee and in the debates on subsequent stages of the Bill, it was clear that this was about risk, not just size, and we wanted to cover those risky, smaller platforms as well. While I appreciate the Government’s strategic statement, which made it pretty clear, and without wishing to overly terrorise Ofcom, we should make our view on categorisation pretty clear, and the Government should do likewise.
This argument and debate will no doubt continue. In the meantime, I beg leave to withdraw my amendment.
My Lords, although it is a late hour, I want to make two or three points. I hope that I will be able to finish what I wish to say relatively quickly. It is important that in looking at the whole of this Bill we keep in mind two things. One is equivalence, and the other is the importance of the rights in the Bill and its protections being anchored in something ordinary people can understand. Unfortunately, I could not be here on the first day but having sat through most of today, I deeply worry about the unintelligibility of this whole legislative package. We are stuck with it for now, but I sincerely hope that this is the last Civil Service-produced Bill of this kind. We need radical new thinking, and I shall try to explore that when we look at automated decision-making—again, a bit that is far too complicated.
Amendment 87 specifically relates to equivalence, and I want to touch on Amendment 125. There is in what I intend to suggest a fix to the problem, if it really exists, that will also have the benefit of underpinning this legislation by rights that people understand and that are applicable not merely to the state but to private companies. The problem that seems to have arisen—there are byproducts of Brexit that from time to time surface—is the whole history of the way in which we left the European Community. We left initially under the withdrawal Act, which left us with retained EU law. No doubt many of us remember the debates that took place. The then Government were wholly opposed to keeping the charter. In respect of the protection of people’s data being processed, that is probably acceptable on the basis that the rights of the charter had merged into ordinary retained EU law through the decisions of the Court of Justice of the European Union. All was relatively well until the Retained EU Law (Revocation and Reform) Act, which deleted most general retained EU law principles, including fundamental rights, from the UK statute book. What then happened, as I understand it, was that a fix to this problem was attempted by the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023, which tidied up the UK GDPR by making clear that any references to fundamental rights and freedoms were regarded as references to convention rights within the meaning of the Human Rights Act.
For good and understandable reasons, the Human Rights Act applies to public authorities and, in very limited circumstances, to private bodies, but not as a whole. That is accepted generally and certainly is accepted in the human rights memorandum in respect of this Bill. The difficulty with the Bill, therefore, is that the protections under the Human Rights Act apply only to public authorities and not to private bodies, whereas, generally speaking, the Charter of Fundamental Rights operated to protect, also on a horizontal basis, against the processing or use of data by private companies.
This seems to cause two problems. First, it is critical that there is no doubt about this, and I look forward to hearing what the Minister has to say as to the view of the Government’s legal advisers as to whether there is a doubt. Secondly, the amendment goes to the second of the two objectives which we are trying to achieve, which is to instil an understanding of the principles so that the ordinary member of the public can have trust. I defy anyone, even the experts who drafted this, to think that this is intelligible to any ordinary human being. It is simply not. I am sorry to be so rude about it, but this is the epitome of legislation that is, because of its sheer complexity, impossible to understand.
Of course, it could be made a lot better by a short series of principles introduced in the Bill, the kind of thing we have been talking about at times today, with a short, introductory summary of what the rights are under the Bill. I hope consideration can be given to that, but that is not the purpose of my amendment. One purpose that I suggest as a fix to this—to both the point of dealing with rights in a way that people can understand and the point on equivalence—is a very simple application, for the purposes of data processing, of the rights and remedies under the Human Rights Act, extending it to private bodies. One could therefore properly point, in going through the way that the Bill operates, to fundamental rights that people understand which are applicable, not merely if a public authority is processing the data but to the processing of data by private bodies. That is what I wanted to say about Amendment 87.
I wanted to add a word of support, because it is closely allied to this on the equivalence point, to the amendment in the name of the noble Lord, Lord Clement-Jones, for whose support I am grateful in respect of Amendment 87. That relates to the need to have a thorough review of equivalence. Obviously, negotiations will take place, but it really is important that thorough attention is given to the adequacy of our legislation, to ensure that there is no incompatibility with the EU regime and that we do not lose adequacy. Those are the two amendments to which I wished to speak in this group. There are two reasons why I feel it would be wrong for me to go on and deal with the others. Some are very narrow and some very broad, and it is probably easiest to listen to those who are speaking to those amendments in due course. On that basis, therefore, I beg to move.
My Lords, I will speak to Amendments 139, 140 and 109A—which was a bit of a late entry this morning—in my name. I express my thanks to those who have co-signed them.
I start by speaking to two amendments tabled in my name.
Amendment 91 seeks to change
“the definition of request by data subjects to data controllers”
that can be declined or
“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.
I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.
Amendment 97 would ensure that
“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.
If a subject does not even know that their data is being held, they cannot enforce their data rights.
Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.
Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have
“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.
I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.
I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.
Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.
My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.