Data (Use and Access) Bill [Lords]

Samantha Niblett Excerpts
Wednesday 7th May 2025


Commons Chamber
Dr Ben Spencer

I thank the Minister for his clarification and reiteration of that point, and again for his work with colleagues to take forward the issue, on which I think we are in unison across the House.

New clause 21 is on directions to public authorities on recording of sex data. One does not need to be a doctor to know that data accuracy is critical, particularly when it comes to health, research or the provision of tailored services based on protected characteristics such as sex or age. The accuracy of data must be at the heart of this Bill, and nowhere has this been more high-profile or important than in the debate over the collection and use of sex and gender data. I thank the charity Sex Matters and the noble Lords Arbuthnot and Lucas for the work they have done to highlight the need for accurate data and its relevance for the digital verification system proposed in the Bill.

Samantha Niblett (South Derbyshire) (Lab)

The recent decision by the Supreme Court that “sex” in the Equality Act 2010 refers to biological sex at birth, regardless of whether someone holds a gender recognition certificate or identifies as a different gender, has already left many trans people feeling hurt and unseen. Does the shadow Minister agree with me that any ID and digital verification service must consider trans people, not risk making them more likely to feel that their country is forgetting who they are?

Dr Ben Spencer

I thank the hon. Member for her intervention, and I will shortly come on to the impact on all people of the decision of the Supreme Court. Our new clause’s focus and scope are simple. The Supreme Court ruling made it clear that public bodies must collect data on biological sex to comply with their duties under the Equality Act. The new clause ensures that this data is recorded and used correctly in accordance with the law. This is about data accuracy, not ideology.

New clause 21 is based in part on the work of Professor Alice Sullivan, who conducted a very important review, with deeply concerning findings on inaccurate data collection and the conflation of gender identity with biological sex data. She found cases of people being missed off health screening, risks to research integrity, inaccurate records in policing and in management through the criminal justice system, and many other concerns. These concerns present risks to everyone, irrespective of biological sex, gender identity or acquired gender. Trans people, like everyone else, need health screening based on their biological sex. Trans people need protecting from sexual predators, too, and they have the right to dignity and respect.

The Sullivan report shows beyond doubt that the concerns of the last Government and the current Leader of the Opposition were entirely justified. The Government have had Professor Sullivan’s report since September last year, but the Department for Science, Innovation and Technology has still not made a formal statement about it or addressed the concerns raised, which is even more surprising given its relevance to this Bill. The correction of public authority data on sex is necessary and urgent, but it is made even more critical by the implementation of the digital verification services in the Bill.

--- Later in debate ---
Victoria Collins

That has been my challenge to the tech companies, which I absolutely support in innovating and driving this—but if they are saying that it would be easy for creatives to do this, why is it not easy for big tech companies with power and resources to lead the way?

Amendments 41 to 44 would ensure that the decisions made about people, whether through data profiling, automated systems or algorithms, are fair. They would clarify that meaningful human involvement in automated decision making must be real, competent and capable of changing the outcome, not just a box-ticking exercise.

The amendments before us offer a clear choice to protect our children and creators or to continue to delay while harm grows—the choice to build a future in which technology either builds trust or destroys it. We have the evidence and the solutions, and the time for action is now. Let us choose a future in which technology empowers, rather than exploits—one that is good for society and for business. I urge all Members to support our amendments, which would put people and the wellbeing of future generations first.

Samantha Niblett

I am pleased to speak in this debate in support of new clause 14, in the name of my hon. Friend the Member for Leeds Central and Headingley (Alex Sobel), to which I have added my name. The clause would give our media and creative sectors urgently needed transparency over the use of copyright works by AI models. I am sure that my speech will come as no surprise to the Minister.

I care about this issue because of, not in spite of, my belief in the power of AI and its potential to transform our society and our economy for the better. I care because the adoption of novel technologies by businesses and consumers requires trust in the practices of firms producing the tech. I care about this issue because, as the Creative Rights in AI Coalition has said:

“The UK has the potential to be the global destination for generative firms seeking to license the highest-quality creative content. But to unlock that immense value, we must act now to stimulate a dynamic licensing market: the government must use this legislation to introduce meaningful transparency provisions.”

Although I am sure that the Government’s amendments are well meant, they set us on a timeline for change to the copyright framework that would take us right to the tail end of this Parliament. Many in this House, including myself, do not believe that an effective opt-out mechanism will ever develop; I know it is not in the Bill right now, but it was proposed in the AI and copyright consultation. Even if the Government insist on pursuing this route, it would be a dereliction of duty to fail to enforce our existing laws in the intervening period.

Big tech firms claim that transparency is not feasible, but that is a red herring. These companies are absolutely capable of letting rights holders know whether their individual works have been used, as OpenAI has been ordered to do in the Authors Guild v. OpenAI copyright case. Requiring transparency without the need for a court order would avoid wasting court time and swiftly establish a legal precedent, making the legal risk of copyright infringement too great for AI firms to continue with the mass theft that has taken place. That is why big tech objects to transparency here, just as it objects to any transparency requirements, whether they are focused on online safety, digital competition or copyright: transparency would make it accountable to the individuals and businesses from which it extracts value.

The AI companies further argue that providing transparency would compromise their trade secrets, but that is another red herring. Nobody is asking for the specific recipe of how the models are trained; rights holders are asking only to be able to query the ingredients that have gone into them. Generative AI models are made up of billions of data points, and it is the weighting of that data that is a model’s secret sauce.

The Government can do myriad things around skills, access to finance, procurement practices and energy costs to support AI firms building and deploying models in the UK. They insist that they do not see the AI copyright debate as a zero-sum game, but trading away the property rights of 2.4 million UK creatives—70% of whom live outside London—to secure tech investment would be just that.

There are no insurmountable technical barriers to transparency in the same way that there are no opt-outs. The key barrier to transparency is the desire of tech firms to obscure their illegal behaviour. It has been shown that Meta employees proactively sought, in their own words,

“to remove data that is clearly marked as pirated/stolen”

from the data that they used from the pirate shadow library, LibGen. If they have technical means to identify copyright content to cover their own backs, surely they have the technical means to be honest with creators about the use of their valuable work.

I say to the Minister, who I know truly cares about the future of creatives and tech businesses in the UK—that is absolutely not in question—that if he cannot accept new clause 14 as tabled, he should take the opportunity as the Bill goes back to the Lords to bring forward clauses that would allow him to implement granular transparency mechanisms in the next six to 12 months. I and many on the Labour Benches—as well as the entire creative industries and others who do not want what is theirs simply to be taken from them—stand ready to support the development of workable solutions at pace. It can never be too soon to protect the livelihoods of UK citizens, nor to build trust between creators and the technology that would not exist without their hard work.

Madam Deputy Speaker (Judith Cummins)

I call the Chair of the Culture, Media and Sport Committee.