Data (Use and Access) Bill [Lords] Debate
Judith Cummins (Labour, Bradford South) — Department for Science, Innovation and Technology
Commons Chamber

I am pleased to speak in this debate in support of new clause 14, in the name of my hon. Friend the Member for Leeds Central and Headingley (Alex Sobel), to which I have added my name. The clause would give our media and creative sectors urgently needed transparency over the use of copyright works by AI models. I am sure that my speech will come as no surprise to the Minister.
I care about this issue because of, not in spite of, my belief in the power of AI and its potential to transform our society and our economy for the better. I care because the adoption of novel technologies by businesses and consumers requires trust in the practices of firms producing the tech. I care about this issue because, as the Creative Rights in AI Coalition has said:
“The UK has the potential to be the global destination for generative firms seeking to license the highest-quality creative content. But to unlock that immense value, we must act now to stimulate a dynamic licensing market: the government must use this legislation to introduce meaningful transparency provisions.”
Although I am sure that the Government’s amendments are well meant, they set us on a timeline for change to the copyright framework that would take us right to the tail end of this Parliament. Many in this House, including myself, do not believe that an effective opt-out mechanism will ever develop; I know it is not in the Bill right now, but it was proposed in the AI and copyright consultation. Even if the Government insist on pursuing this route, it would be a dereliction of duty to fail to enforce our existing laws in the intervening period.
Big tech firms claim that transparency is not feasible, but that is a red herring. These companies are absolutely capable of letting rights holders know whether their individual works have been used, as OpenAI has been ordered to do in the Authors Guild v. OpenAI copyright case. Requiring transparency without the need for a court order will avoid wasting court time and swiftly establish a legal precedent, making the legal risk of copyright infringement too great for AI firms to continue with the mass theft that has taken place. That is why big tech objects to transparency, just as it objects to any transparency requirements, whether they are focused on online safety, digital competition or copyright. It would make it accountable to the individuals and businesses that it extracts value from.
The AI companies further argue that providing transparency would compromise their trade secrets, but that is another red herring. Nobody is asking for the specific recipe by which the models are trained; they are asking only to be able to query the ingredients that have gone into them. Generative AI models are made up of billions of data points, and it is the weighting of data that is a model’s secret sauce.
The Government can do myriad things around skills, access to finance, procurement practices and energy costs to support AI firms building and deploying models in the UK. They insist that they do not see the AI copyright debate as a zero-sum game, but trading away the property rights of 2.4 million UK creatives—70% of whom live outside London—to secure tech investment would be just that.
There are no insurmountable technical barriers to transparency in the same way that there are no opt-outs. The key barrier to transparency is the desire of tech firms to obscure their illegal behaviour. It has been shown that Meta employees proactively sought, in their own words,
“to remove data that is clearly marked as pirated/stolen”
from the data that they used from the pirate shadow library, LibGen. If they have technical means to identify copyright content to cover their own backs, surely they have the technical means to be honest with creators about the use of their valuable work.
I say to the Minister, who I know truly cares about the future of creatives and tech businesses in the UK—that is absolutely not in question—that if he cannot accept new clause 14 as tabled, he should take the opportunity as the Bill goes back to the Lords to bring forward clauses that would allow him to implement granular transparency mechanisms in the next six to 12 months. I and many on the Labour Benches—as well as the entire creative industries and others who do not want what is theirs simply to be taken from them—stand ready to support the development of workable solutions at pace. It can never be too soon to protect the livelihoods of UK citizens, nor to build trust between creators and the technology that would not exist without their hard work.
I call the Chair of the Culture, Media and Sport Committee.
I rise to support new clauses 2 to 5 in the name of the hon. Member for Harpenden and Berkhamsted (Victoria Collins); to pay tribute to Baroness Kidron, who has driven forward these amendments in the other place; and to speak in favour of new clause 20 in the name of the official Opposition.
I am beginning to sound a bit like a broken record on this matter, but our creative industries are such a phenomenal UK success story. They are our economic superpower and are worth more than automotive, aerospace and life sciences added together, comprising almost 10% of UK registered businesses and creating nearly 2.5 million jobs. More than that, our creative industries have so much intrinsic value; they underpin our culture and our sense of community. Intellectual property showcases our nation around the world and supports our tourism sector. As a form of soft power, there is simply nothing like it—yet these social and economic benefits are all being put at risk by the suggested wholesale transfer of copyright to AI companies.
The choice presented to us always seems, wittingly or unwittingly, to pit our innovative AI sector against our world-class creative industries and, indeed, our media sector. It is worth noting that news media is often overlooked in these debates, but newspapers, magazines and news websites license content both in print and online. In turn, that helps to support high-quality and independent journalism, which is so vital to underpinning our democratic life. That is essential considering recent news that the global average press freedom score has fallen to an all-time low.
I want to push back against the false choice that we always seem to be presented with that, somehow, our creative industries are Luddites and are not in favour of AI. I have seen time and again how our creators have been characterised by big tech and its lobbyists as somehow resistant to technological progress, which is of course nonsensical.
I am grateful to the right hon. Gentleman for making that very serious point. When the clinicians—whose duty is to protect their patients—say they are not convinced about the safety of data being handed over to a central database, we have to listen to their reactions.
I do not intend to press my new clause to the vote, but it is important that we continue to debate this matter, because this enormous database—which can contribute to the general welfare of all humanity—must be protected in such a way that it retains confidence and ensures the security of the whole system. With that, I leave the discussion to continue on other matters.
Thank you ever so much, Madam Deputy Speaker—other matters we shall attend to.
I speak in support of new clauses 2 to 6 and new clause 14, which I enthusiastically support. I believe that those new clauses represent our very last chance to guarantee at least a bit of security for our creative industries in the face of what can only be described as the almost existential threat posed by generative AI. This is critical. I listened to the Minister very carefully, but this lackadaisical approach and the pace of progress he intends do not properly reflect the scale of the threat and challenge with which our creative industries are currently confronted. I accept that we have come a long way in this debate, and I accept the positive tone the Minister tries to take when dealing with these issues. I believe that he is sincere about trying to find a solution—he wants to get to a place where both the AI companies and the creative industries are satisfied. I am not entirely sure that we will get to that place, but I wish him all the best in those efforts.
We have certainly come a long way since the first statement we had in this House. I am sure that hon. Members will remember the belligerent way in which the Secretary of State presented that first statement— I am surprised that he is not here today. He was almost saying to the creative industries that they had to take it on the chin in order to satisfy this Government’s attempts to find some economic growth—which they have so far found elusive—in the shape of unfettered artificial intelligence, and that we should just get on with that agenda.
Order. From the next speaker, there will be a five-minute time limit.
As many Members will be aware, my constituent Ellen Roome knows only too well the tragedies that can take place as a result of social media. I am pleased that Ellen joins us in the Gallery to hear this debate in her pursuit of Jools’ law.
In 2022, Ellen came home to find her son Jools not breathing. He had tragically lost his life, aged just 14. In the following months, Ellen battled the social media giants—and she is still battling them—to try to access his social media data, as she sought answers about what had happened leading up to his death. I am grateful to the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), for raising this in his speech. In her search for answers, Ellen found herself blocked by social media giants that placed process ahead of compassion. The police had no reason to suspect a crime, so they did not see any reason to undertake a full investigation into Jools’ social media. The inquest did not require a thorough analysis of Jools’ online accounts. None of the social media companies would grant Ellen access to Jools’ browsing data, and a court order was needed to access the digital data, which required eye-watering legal fees.
The legal system is unequipped to tackle the complexities of social media. In the past, when a loved one died, their family would be able to find such things in their possession—perhaps in children’s diaries, in school books or in cupboards. However, now that so much of our lives is lived online, personal data is kept by the social media giants. New clause 11 in my name would change that, although I understand that there are technical and legal difficulties.
The Minister and the Secretary of State met Ellen and me this morning, along with the hon. Member for Darlington (Lola McEvoy), and we are grateful for the time they gave us. My new clause will not go to a vote today, but we will keep pushing because Ellen and other parents like her should not have to go through this to search for answers when a child has died. I understand that there are provisions in the Bill that will be steps forward, but we will keep pushing and we will hold the Government’s and all future Governments’ feet to the fire until we get a result.
Order. Many people wish to speak in this debate, so before I call the next speaker I ask Members please to be mindful when taking interventions. I will now impose a four-minute time limit.
We live in a rapidly changing world. Like everyone else, I am sure that I am guilty of handing my data to organisations every hour of every day, oblivious to the impact on my privacy. I am also guilty of absorbing and using content assuming that it is trustworthy and that it has been obtained fairly.
On the other hand, my generation has been fortunate to have seen the introduction of social media and the online world, and to have experienced the time before it, which perhaps provides us with a level of scepticism about what we see, and an ability to switch it off and distance ourselves from the onslaught to our senses that digital content can provide.
Like other interventions of the past, we are now at a crossroads where we must pause and not simply plough on. The Bill gives us the opportunity to make it clear to the tech giants that we are not giving them everything that we have created, that they cannot own our children, and that we value our data as part of our identity.
Some of the amendments give us a great opportunity to improve the Bill—to make the most of this moment in time and to make sure that we do not leave people behind. We know that children’s brains continue to develop until they are in their early 20s. We know that young people’s development leads them to be risk takers in their adolescence and teenage years, and, as adults, we sometimes have to take decisions to curtail their fun to protect them. My own children have enjoyed social media from the age of 13, but, as the sector develops, and our understanding of its addictive nature improves, it is critical that we reflect that in law. Lifting the age of consent for social media data collection, as in new clause 1, will help to protect our children at the time they need it.
It is unimaginable to lose a child, and to do so in circumstances where the reasons behind their death are unclear, which is why I signed new clause 11, tabled by my hon. Friend the Member for Cheltenham (Max Wilkinson), which would allow bereaved parents access to their child’s social media content. This should not be necessary, given that GDPR and privacy rights do not apply to those who have died. The fact that we even need such legislation calls into question the motivation of tech giants and tells us where their interests lie. I urge the Government to support this, and I welcome the assurance today that more work will be done.
Trust is at an all-time low not only in the Government but in other authorities such as the NHS. As AI changes how we interact with the state, commerce and each other, the public should have a right to know how and when AI is involved in the decisions made. Transparency matters, which is why I am supporting the new clauses proposed by my hon. Friend the Member for Harpenden and Berkhamsted (Victoria Collins). We know that if we use each other’s content we must pay for it, or at least credit it if we are not profiting from it. We know that if we do not, we infringe that copyright, so why should tech giants, probably based in some far-flung place, have a right to scrape that content without knowledge or payment? The idea that they even need to train their systems off the backs of people who have used their talent and time and made their living through creativity is obscene.
I really must speak strongly against new clause 21. I have been overwhelmed by the scale of distress brought about by this awful proposal. It is cruel and it completely undermines the privacy of people who are transgender at a time when they are already feeling victimised.
Those who have transitioned socially, medically or surgically are protected in law, and we were told that the Supreme Court decision last month does not change that. But new clause 21 does. If it were passed, sex at birth would be recorded on a driving licence or passport, outing every trans person whenever they buy an age-restricted product, change their job, travel abroad, or even come to Parliament to visit their MP. Not only is this a fundamental breach of privacy, but it is potentially dangerous. They would be prevented from travelling to countries with poor records on rights, and they would be at higher risk of suicide and self-harm than they already are. A constituent said,
“This is a direct attempt to erase me from the public record.”
Please reject this new clause 21.
In the short time available to me, I want to speak to four amendments. On two of them, I would like to urge the Minister to think again. On one, I am in total agreement with the Minister that we should oppose it; the other is one that I want to draw to the House’s attention.
First, I join the Chair of the Culture, Media and Sport Committee, the hon. Member for Gosport (Dame Caroline Dinenage), the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah), my hon. Friend the Member for South Derbyshire (Samantha Niblett) and the indomitable Baroness Kidron, who joins us today from the Gallery, in encouraging the Minister to look again at amendments on AI and copyright. We know that this problem will come back and that we need to move at pace.
I represent Walthamstow, the home of William Morris, the creators and makers—and creatives abound. At least William Morris could protect his wallpaper patterns. With the AI technologies we see now moving so quickly, unless we stand up for British copyright, we will be in a very different place. The Minister says that if we do not pass new clause 2, we will still have copyright law tomorrow, and he is right, but we will not have the tools to deal with the technology we are dealing with now.
This issue is about not just the Elton Johns, the Ed Sheerans, the Richard Osmans or the Jilly Coopers, but the thousands of creators in our country—it is their bread and butter. Nobody is opposing technology, but they are saying that we need to act more quickly. I hope to hear from the Minister what he will do in this area. New clause 14, which has not been selected, is about the question of transparency and will help creatives exercise their rights.
Briefly, I want to support what the hon. Member for Mid Dorset and North Poole (Vikki Slade) said about new clause 21. I have always supported the appropriate collection of data, but this is not an appropriate collection of data. It is a targeting of the trans community, which is deeply regressive.
I praise the Government for what they are doing with schedule 11—and I wager that nobody else in this Chamber has looked at it. The Victims and Prisoners Act received Royal Assent in May 2024. Section 31 of the Act provides a mechanism to delete data that has been created as part of a malicious campaign of harassment. Schedule 11 is a technical amendment to GDPR laws that will make that Act, which got cross-party support, possible to enact.
For parents and carers, the thought that someone who disagrees with them might use the auspices of social services to try to remove their children because of that disagreement is impossible to comprehend. It is a nightmare that I have lived through myself. Thanks to my local authority, I am still living through it, because the record created by the person who did this to me remains on the statute book, along with the allegation that I am a risk to my children because of the views that I hold.
The primary intent of the man who made this complaint was to trigger an investigation into my private life. The judge who convicted him of harassment said that it was one of the worst examples of malicious abuse in public life that he had seen. The judge demanded that the file be stricken, as did I when it first came to light and when the man was subsequently convicted of harassment. However, Waltham Forest council continue to argue that they have to retain that data to protect my own children from me. This is an example not of how data is used to safeguard, but of how data can harm by its very existence. It is not a benign matter to have such a record associated with one’s name. Anyone who has ever been to A&E knows that the question, “Is your child known to social services?” is not a neutral inquiry. Not having a way of removing data designed to harass will perpetuate the harassment.
My local authority has not labelled the fathers who are MPs in my borough in the same way, but it argues that it must retain this data about me under section 47 of the Children Act 1989, regarding children who might reasonably be considered at risk of harm from an individual. To add insult to injury, the council has not offered to delete this data, but told me that I can add to it a note disputing the claims about my fitness to be a parent made by the person who has been convicted of harassing me, and then the council might consider including that note—adding more data to the file, therefore, rather than removing it. That will keep the link between me, my family, these allegations and the gentleman who harassed me in the first place. I have never received any form of apology or acknowledgement.
There have always been strong grounds and legal processes to remove malicious records. It is also right that we set a high bar, as the 2024 Act did. This consequential amendment in the Bill should now mean that the Government can use the affirmative resolution to make that law a reality. We cannot delete the misogyny at the heart of Waltham Forest council’s response, but we could finally delete the records and those of others like them and move on with our lives—
My new clause 7 would ensure that, alongside the creation of a digital verification framework, there would be a right to use non-digital ID. Digital exclusion is a major challenge for many communities around the country, including in North Norfolk. Part of the answer is to improve digital education and bring those numbers down, but, as Age UK rightly says,
“it will never be possible to get everyone online.”
The progress we make in the digital age must ensure provision for those who will not be supported by it, so that they are not left behind or excluded.
Older people are not the only ones who struggle with digital exclusion—poverty is also a significant driver. A study in 2021 showed that more than half of those who are offline earned less than £20,000 a year. The Government told the Lords that if it turned out that people were being excluded, they could consider legislating, but how many people earning less than £20,000 a year will be taking a case through the courts—perhaps as far as the Supreme Court—to secure their rights? Why are we waiting for it to go wrong, placing the onus on vulnerable people to generate test cases and legal precedent, when we could put this matter to bed once and for all with this simple addition to the Bill?
I will also speak in support of new clause 1. It has become abundantly clear to us all that we cannot trust the social media giants to keep our children safe. In fact, I would go as far as to say that they have very little interest in keeping children safe. The algorithms that drive these platforms, which are designed to keep users scrolling for as long as possible to maximise ad revenue, can be deeply damaging to children and young people. It is important to emphasise just how pervasive the content stream can be. Not every hon. Member may have experienced it, but pervasive, targeted content is not the same as a child seeing something distressing on the news. Once seen—if only fleetingly—there is the potential for them to be exposed to unsubstantiated, misleading or even traumatic content, or versions of that content, over and over again every few swipes as the algorithm realises it can suck them in, keep them scrolling and make profit for the social media giants. That is not what social networks set out to do, but it is what they have become.
Whatever the social media giants told the Government or the Opposition, whether “It is too complex,” “It would cost too much,” or, “The flux capacitor is not big enough,” that is just rubbish. If we simply removed the right to process personal data for under 16s, we would remove the algorithms’ ability to target them with content based on what they say and do. If the social networks cared about children’s wellbeing, they would have done that already. I hope that today we will finally take the action necessary to protect the next generation.
Overall, my views on the Bill remain broadly similar to the frustrations I expressed months ago on Second Reading. There is important, commendable and sensible stuff in the Bill, and I welcome that, but what is not in the Bill is more frustrating, because its inclusion could have put us in a much better position to harness the power of data. We could have addressed the litany of failures in public sector data use that the Government’s own review outlined just months ago. We could be equipping our civil service and public sector with the talent, culture and leadership to make us a global trailblazer in data-driven government. It is really frustrating that the Bill does not contain any of the steps necessary to make those improvements.
If we use data better, we do government better. I am sure that the whole House and all our constituents are keen to see that.