Moved by
14: Clause 67, page 75, line 10, after “scientific” insert “and that is conducted in the public interest”
Member’s explanatory statement
This amendment ensures that to qualify for the scientific research exception for data reuse, that research must be in the public interest. This requirement already exists for medical research, but this amendment would apply it to all scientific research wishing to take advantage of the exception.
Viscount Colville of Culross (CB)

My Lords, I thank my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their signatures to my Amendment 14. I withdrew this amendment in Committee, but I am now asking the Minister to consider once again the definition of “scientific research” in the Bill. If he cannot satisfy me in his speech this evening, I will seek the opinion of the House.

I have been worried about the safeguards for defining scientific research since the Bill was published. This amendment will require that the research should be in “the public interest”, which I am sure most noble Lords will agree is a laudable aim and an important safeguard. This amendment has been looked at in the context of the Government’s recent announcements on turning this country into an AI superpower. I am very much a supporter of this endeavour, but across the country there are many people who are worried about the need to set up safeguards for their data. They fear data safety is threatened by this explosion of AI and its inexorable development by the big tech companies. This amendment will go some way to building public trust in the AI revolution.

The vision of Donald Trump surrounded at his inauguration yesterday by tech billionaires, most of whom have until recently been Democrats, puts the fear of God into me. I fear their companies are coming for our data. We have some of the best data in the world, and it needs to be safeguarded. The AI companies are spending billions of dollars developing their foundation models, and they are beholden to their shareholders to minimise the cost of developing these models.

Clause 67 gives a huge fillip to the scientific research community. It exempts research which falls within the definition of scientific research as laid out in the Bill from having to gain new consent from data subjects to reuse millions of points of data.

It costs time and money for the tech companies to get renewed consent from data holders before reusing their data. This is an issue we will discuss further when we debate amendments on scraping data from creatives without copyright licensing. It is clear from our debates in Committee that many noble Lords fear that AI companies will do what they can to avoid either obtaining consent or licensing the data they scrape. Defining their research as scientific will allow them to escape these constraints. I could not be a greater supporter of the wonderful scientific research that is carried out in this country, but I want the Bill to ensure that it really is scientific research and not AI development camouflaged as scientific research.

The line between product development and scientific research is often blurred. Many developers present efforts to increase model capabilities or efficiency, or indeed the study of their risks, as scientific research. The balance has to be struck between allowing this country to become an AI superpower and exploiting its data subjects. I contend that this amendment will go far to allay public fears of the abuse and use of their data to further the profits and goals of huge AI companies, most of which are based in the United States.

Noble Lords have only to look at the outrage last year at Meta’s use of Instagram users’ data without their consent to train the datasets for its new Llama AI model to understand the levels of concern. There were complaints to regulators, and the ICO posted that Meta

“responded to our request to pause and review plans to use Facebook and Instagram user data to train generative AI”.

However, so far, there has been no official change to Meta’s privacy policy that would legally bind it to stop processing data without consent for the development of its AI technologies, and the ICO has not issued a binding order to stop Meta’s plans to scrape users’ data to train its AI systems. Meanwhile, Meta has resumed reusing subjects’ data without their consent.

I thank the Minister for meeting me and talking through Amendment 14. I understand his concern that adding a public interest threshold to the definition of scientific research will create a heavy burden on researchers, but I think it is worth the risk in the name of safety. Some noble Lords are concerned about the difficulty of defining “public interest”. However, the ICO has very clear guidelines about what the public interest consists of. It states that

“you should broadly interpret public interest in the research context to include any clear and positive public benefit likely to arise from that research”.

It continues:

“The public interest covers a wide range of values and principles about the public good, or what is in society’s best interests. In making the case that your research is in the public interest, it is not enough to point to your own private interests”.

The guidance even includes further examples of research in the public interest, such as

“the advancement of academic knowledge in a given field … the preservation of art, culture and knowledge for the enrichment of society … or … the provision of more efficient or more effective products and services for the public”.

This guidance is already being applied in the Bill to sensitive data and public health data. I contend that if these carefully thought-through guidelines are good enough for health data, they should be good enough for all scientific data.

This view is supported in the EU, where

“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests.”

The Minister will tell the House that the data exempted to be used for scientific research is well protected—that it has both the lawfulness test, as set out in the UK GDPR, and a reasonableness test. I am concerned that the reasonableness test in this Bill references

“processing for the purposes of any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.

Normally, a reasonableness test requires an expert in the context of that research to decide whether it is reasonable to consider it scientific. However, in this Bill, “reasonable” means only that an ordinary person in the street can decide whether the research can reasonably be considered scientific. This must be a broadening of the threshold of the definition.

It seems “reasonable” in the current climate to ask the Government to include a public interest test before giving the AI companies extensive scope to reuse our data, without getting renewed consent, on the pretext that the work is for scientific research. In the light of possible deregulation of the sector by the new regime in America, it is incumbent on this country to ensure that our scientific research is dynamic, but safe. If the Government can bring this reassurance, they will increase trust in Britain’s AI revolution for millions of people in this country. I beg to move.

Baroness Kidron (CB)

My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.

In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.

There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.

--- Later in debate ---
I hope the noble Viscount is content to withdraw this amendment, given these reassurances and the concerns about a significant unintended consequence from going down this route.
Viscount Colville of Culross (CB)

My Lords, I am grateful and impressed that the Minister has stepped into this controversial sphere of data management at such short notice. I wish his colleague, the noble Baroness, Lady Jones, a swift recovery.

I hope that noble Lords listened to the persuasive speeches that were given across the Benches, particularly from my noble friend Lady Kidron, with her warning about blurring the definition of scientific research. I am also grateful to the Opposition Benches for their support. I am glad that the noble Lord, Lord Markham, thinks that I am threading the needle between research and public trust.

I listened very carefully to the Minister’s response and understand that he is concerned by the heavy burden that this amendment would put on scientific research. I have listened to his explanation of the OECD Frascati principles, which define scientific research. I understand his concern that requiring researchers to pass a rigorous public interest test will stop many from going ahead with research. However, I repeat what I said in my opening speech: there has to be a balance between generating an AI revolution in this country and bringing the trust of the British people along with it. The public interest test is already applied to restricted research in this field; I am simply asking for it to be extended to all scientific research.

I am glad that the reasonableness and lawfulness tests are built into Clause 67, but I ask for a test that I am sure most people would support—that the research should have a positive public benefit. On that note, I would like to seek the opinion of the House.