Baroness Hayter of Kentish Town (Lab)

My Lords, I too will speak to Motion 32A. I thank my noble friend the Minister for his confirmation of the Government’s welcome of the Supreme Court ruling and his welcome of the Sullivan report. I also very much welcome the words that he has used today and thank him for the discussions that we have been able to have.

Can he confirm that where the Equality Act allows for a women-only space, any digital IT system used for that purpose would refer to biological sex as the relevant information? With regard to public authorities, I assume that organisations such as Sport England and the GMC count as public authorities because they are statutory. At the moment the GMC records only the gender of doctors, not their biological sex. When its records also go digital, will they be confined to biological sex so that, again, patients can know the sex of their physician? I think that the Minister understands the questions I am posing and that his wording does give that reassurance, but any further clarity would be welcome.

Viscount Colville of Culross (CB)

My Lords, I stand in support of my Motion 43A. I welcome so much of this Bill. I want this country to be a champion of technology and hope that it becomes a tech powerhouse, attracting hundreds of millions of pounds-worth of investment in the development of AI. I understand the concerns expressed by the Minister, but I am still pressing ahead with this amendment because I want the people of this country to have control of their data and how it is used.

This amendment is a push-back against the way AI companies have been misusing people’s data in training their AI models. Last year, Meta reused data from Instagram users without their consent to train its Llama AI model. Once this was discovered, there was a huge outcry from the owners of the data and an appeal to the ICO. As a result, Meta stopped the processing and the ICO said,

“it is crucial that the public can trust that their privacy rights will be respected from the outset”.

I want to make sure that when the Bill becomes law, it reassures the people of this country that they can trust the new technology. The battle to stop the abuse of data is a central concern of my indomitable noble friend Lady Kidron, who is sitting beside me and whose amendment is in the next group. It responds to the theft of copyright belonging to millions of creatives, including authors and artists, by AI companies. As it stands, Clause 67 gives a powerful exemption, allowing AI companies to reuse data without consent if they can show that their work aligns with the definition of “scientific research” set out in the Bill. I fear that this definition is so widely drawn that it will allow AI models to reuse data without consent, claiming that they are carrying out scientific research when in fact they are using it for product development and their own profit.

I thank the Ada Lovelace Institute for its constant support throughout the lengthy progress of this Bill. I expressed my concern in Committee and on Report. Chi Onwurah, the very respected chair of the Science, Innovation and Technology Committee in the other place, tabled a similar amendment. However, despite meetings with Ministers, nothing has been offered to assuage our concerns, which has forced me to press this amendment at this stage.

Proposed new paragraph 2A inserted by this amendment would tighten the definition of what counts as scientific research. It is taken from the Frascati manual, developed by the OECD to compare the R&D efforts of different countries and identify the key features that underpin them. The Government support the Frascati definition. In Committee, the Minister said the research test set out in the Bill “will not operate alone”, and will

“be in the context of the Frascati definition and the ICO’s guidance”.—[Official Report, 21/1/25; col. 1637.]

He said that the Frascati definitions are merely guidance and that codification would place burdens on scientific researchers, but this is not a new requirement: it simply codifies an existing standard set out by the ICO.

The central feature of this part of the amendment is that scientific research should increase the stock of human knowledge. The Minister has told your Lordships that not all scientific research will be new knowledge, that scientific research is often refuted or confirms previous findings, and that some scientific research will fail. But if there is refutation or confirmation of an experiment, that is an extension of human knowledge. Even if research fails, the researcher will know that the experiment does not work, and that is new knowledge. The requirement for scientific research to increase the stock of knowledge is a sensible precaution to preserve our data from abuse, and it will weed out the tech companies piggybacking on the clause for their own profit.

The purpose of this amendment is not just to tighten the definition; it is also to make sure that researchers have to consider it when they start to deploy the exemption for the reuse of data. The Minister has said it would place an undue burden on scientists and stop research going ahead, but this definition is already being used by the ICO. The problem for a person whose data is being misused is that, at the moment, if they want to appeal against its use without consent, they have to go to the ICO, which then has to apply the Frascati definition.

The ICO’s latest statistics show that only 12% of data protection complaints are dealt with within 90 days, against a target of 80%. Surely that means an appeal against the reuse of data without consent will come too late: the data will already have been absorbed into the AI training model and, as we have been continually told, it is hard for AI researchers to isolate data once it forms part of the model.

Proposed new paragraph 2A inserted by this amendment would stop this happening. With a definition in the Bill, AI researchers would have to consider it before reusing data in their models, sparing data subjects from having to appeal to the ICO if they are concerned about abuse.

Proposed new paragraph 2B inserted by this amendment responds to the Government’s claim that the “reasonably described” test in this clause tightens the definition of scientific research. More than 14 of our leading law firms have examined the Government’s test as set out in the Bill and described it variously as loosening, expanding or broadening the definition. Clause 67 asks only whether the research can reasonably be described as scientific; the ICO or the courts would have to consider whether it is irrational to call it scientific research, but irrationality is very hard to prove: it is a high bar.

I hope noble Lords will agree that the usual reasonableness test asks, “Would a reasonable person conducting scientific research perform this activity in this manner?”. That test evaluates actual conduct against an objective standard of what constitutes proper scientific research.

The amendment seeks to realise what is already a requirement: that such research be conducted in line with standards based on the UK Research and Innovation Code of Practice for Research. It would ensure transparency in the use of scientific research. I am sure that during the course of the debate we will hear from scientists who say that this amendment will stifle research and stop new researchers undertaking work. However, the requirement is minimal, and the information required is that which researchers should already have to hand.

What I ask your Lordships to bear in mind when voting is that this amendment would bring transparency to how people’s data is being reused. The new tests laid out in my amendment would be a powerful weapon in the fight against the abuse of people’s data. I want the new technologies to be successful, but they will be successful only if they have the trust of the people of this country. If people think that the Government have caved in to the tech companies and allowed them to pillage our data for their own financial gain rather than for the progress of human knowledge, most will be outraged. I ask the Minister to assuage these fears and ensure that under the Bill data is used in the people’s interests. Meanwhile, I will ask the opinion of the House at the end of this debate.

--- Later in debate ---
Moved by
Viscount Colville of Culross

43A: At end insert “, and do propose Amendment 43B instead of the words so left out of the Bill—

43B: Clause 67, page 75, line 28, at end insert—
“2A. For the purposes of paragraph 2, “scientific research” means creative and systematic work undertaken in order to increase the stock of knowledge, including knowledge of humankind, culture and society, and to devise new applications of available knowledge.
2B. To meet the reasonableness test in paragraph 2, the activity being described as scientific research must be conducted according to appropriate ethical, legal and professional frameworks, obligations and standards.””
Viscount Colville of Culross (CB)

My Lords, I listened carefully to the speeches of the noble Lords, Lord Winston and Lord Tarassenko, but I am not convinced that my amendment would stop the research as they suggested. However, it would protect users’ data as the technological revolution unfolds. I beg leave to test the opinion of the House.