Lords Chamber
That this House do agree with the Commons in their Amendment 1.
My Lords, I will speak to some of the amendments made in the other place, starting with Amendments 1 to 31. These will ensure that smart data schemes can function optimally and that Part 1 is as clear as possible. Similarly, Amendments 35 to 42 from the other place reflect discussions on the national underground asset register with the devolved Governments. Finally, Amendments 70 to 79 make necessary consequential updates to the final provisions of the Bill and some updates to Schedules 11 and 15.
I will now speak to the amendments tabled by noble Lords, starting with those relating to sex data. Motion 32A disagrees with the amendment to remove Clause 28(3) and (4), and instead proposes changes to the initial drafting of those subsections. These would require the Secretary of State, when preparing the trust framework, to assess whether the 15 specified public authorities can reliably ascertain the data they collect, record and share. Amendment 32B limits this assessment to sex data, as defined through Amendment 32C; that definition limits sex to biological sex only and provides a definition of acquired gender.
It is also relevant to speak now to Motion 52A, which disagrees with the amendment to remove Clause 140 and, instead, suggests changes to the drafting. Clause 140, as amended by Amendment 52B, seeks, through a regulation-making power, to give the Secretary of State the ability to define sex as being only biological sex in certain areas or across public sector data processing more widely. Let me be clear that this Government accept the recent Supreme Court judgment on the definition of sex for the purposes of equality legislation. We need to work through the effects of this ruling holistically and with care, sensitivity and—dare I say it—kindness. In line with the law, we need to take care not to inappropriately extend its reach. This is not best done by giving the Secretary of State the power to define sex as biological in all cases through secondary legislation without appropriate scrutiny, given the potential impact on people’s human rights, privacy and dignity, and the potential to create legal uncertainty. Likewise, giving the Secretary of State a role in reviewing how other public authorities process sex data in all circumstances based on that definition would be inappropriate and disproportionate, and I note that the Supreme Court’s ruling relates specifically to the meaning of sex in equalities legislation.
The driver behind these amendments has been the importance of sex data being accurate when processed by public authorities. I strongly agree with that aim: accurate data is essential. This Government take data accuracy—including the existing legislation that requires personal data to be accurate—and data standards seriously. That is why we are addressing the question of sex information in public sector data. First, the EHRC is updating its statutory code of practice to support service providers in light of the Supreme Court judgment. Secondly, the Data Standards Authority is developing data standards on the monitoring of diversity information, including sex and gender data, and the effect of the Supreme Court judgment will be considered as part of that work.
Thirdly, the Office for Statistics Regulation published updated guidance on collecting and reporting data and statistics about sex and gender identity last year. Fourthly, the Office for National Statistics published a work plan in December 2024 for developing harmonised standards on data more generally. Finally, the department is currently considering the implementation of the Sullivan review, published this year, which I welcome.
On digital verification services, I reassure noble Lords that these measures do not change the evidence that individuals rely on to prove things about themselves. The measures simply enable that to be done digitally. This Government are clear that data must be accurate for the purpose for which it is being used and must not be misleading. It should be clear to digital verification services what the information public authorities are sharing with them means. I will give an important example. If an organisation needs to know a person’s biological sex, this Government are clear that a check cannot be made against passport data, as it does not capture biological sex. DVS could only verify biological sex using data that records that attribute specifically, not data that records sex or gender more widely.
I know this is a concern of the noble Lord, Lord Arbuthnot, and I hope this provides some reassurance. The data accuracy principle of GDPR is part of existing law. That includes where data is misleading—this is a point I will return to. I hope that noble Lords find this commitment reassuring and, as such, will agree with Commons Amendment 32.
Motion 34A, with Amendments 34B and 34C, addresses the security of the national underground asset register. Security has always been at the heart of the national underground asset register. We have therefore listened to the well-thought-through concerns that prompted the amendment previously tabled by the noble Viscount, Lord Camrose, regarding cybersecurity. Following consideration, the Government are instead proposing an amendment we have drafted with the support of colleagues in the security services. We believe this addresses the intention of ensuring the security of the national underground asset register data, with three key improvements.
First, it broadens the scope from cybersecurity only to the general security of information kept in or obtained from the national underground asset register. This will ensure that front-end users have guidance on a range of measures for security good practice—for example, personnel vetting, which should be considered for implementation—while avoiding the need to publish NUAR-specific cybersecurity features that should not be in the public domain. Secondly, it specifies the audience for this guidance; namely, users accessing NUAR. Finally, it broadens the scope of the amendment to include Northern Ireland alongside England and Wales, consistent with the NUAR measures overall. Clearly, it remains the case that access to NUAR data can be approved only for permitted purposes and only by eligible users, with all access controlled and auditable. As such, I hope that noble Lords will be content to support government Motion 34A and Amendments 34B and 34C.
Commons Amendment 43, made in the other place, on scientific research removes the public interest test inserted in the definition of scientific research by the noble Viscount, Lord Colville. While recognising the concern the noble Viscount raises, I want to be clear that anything that does not count as scientific research now would not do so under the Bill. Indeed, we have tightened the requirement and added a reasonableness test. The Bill contains strong safeguards. Adding precise definitions in the Bill would not strengthen these protections but would impose a significant new legal obligation on our research community at a time when, in line with the good work of the previous Government, we are trying to reduce bureaucracy for researchers, not increase it with new processes. The test proposed would lead to burgeoning bureaucracy and damage our world-leading research. This disproportionate step would chill basic and curiosity-driven research, and is not one we can support.
I beg to move that the House agree with the Commons in their Amendment 1. I have spoken to the other amendments.
My Lords, I first thank the Minister for his—as ever—clear and compelling remarks. I thank all noble Lords who have been working in a collegiate, collaborative fashion to find a way forward on the few but important remaining points of disagreement with the Government.
Before I come to the issue of accurate recording of personal data, I also thank the Minister, the noble Baroness, Lady Jones, for tabling the government amendments on the national underground asset register and her constructive engagement throughout the progress of the Bill.
As noble Lords will recall, I set out our case for stronger statutory measures to require the Secretary of State to provide guidance to relevant stakeholders on the cybersecurity measures that should be in place before they receive information from the national underground asset register. I am of course delighted that the Government have responded to the arguments that we and others made and have now tabled their own version of my amendment which would require the Secretary of State to provide guidance on the security of this data. We are happy to support them in that.
I turn to Motions 32A and 52A standing in my name, which seek to ensure that data is recorded accurately. They amend the original amendment, which my noble friends Lord Lucas and Lord Arbuthnot took through your Lordships’ House. My noble friend Lord Lucas is sadly unable to attend the House today, but I am delighted to bring these Motions forward from the Opposition Front Bench. In the other place, the Conservative Front Bench tabled new Clause 21, which would, we feel, have delivered a conclusive resolution to the problem. Sadly, the Government resisted that amendment, and we are now limited by the scope of the amendments of my noble friend Lord Lucas, so we were unable to retable the, in my view, excellent amendment in your Lordships’ House.
My Lords, I have had a misspent not-so-youth over the past 50 years. As a lawyer, when I read the wording in the amendment, I cannot see the outcome that he is suggesting. This wording does not cut across anything that he has had to say. I genuinely believe that. I understand how genuine he is in his belief that this is a threat, but I do not believe this wording is such a threat.
I also understand entirely what the noble Lord, Lord Tarassenko, had to say, but an awful lot of that was about the frustration and some of the controls over health data. That does not apply in many other areas of scientific research. The Frascati formula is universal and well accepted. The noble Viscount made an extremely good case; we should be supporting him.
I thank the noble Viscount, Lord Camrose, for his Motion 32A and Amendments 32B and 32C, and Motion 52A and Amendments 52B and 52C. I reiterate that this Government have been clear that we accept the Supreme Court judgment on the meaning of sex for equalities legislation. However, as the noble Viscount, Lord Hailsham, says, it is critically important that the Government work through the effect of this ruling with care, sensitivity and in line with the law.
When it comes to public sector data, we must work through the impacts of this judgment properly. This would involve considering the scope of the judgment and the upcoming EHRC guidance. Critically, the Equality and Human Rights Commission has indicated that it will be updating its statutory code of practice for services, public functions and associations in light of this ruling, which will include some of the examples raised this afternoon, including by my noble friend Lady Hayter.
Ministers will consider the proposals once the EHRC has submitted its updated draft. It is right that the Government and, indeed, Parliament fully consider this guidance alongside the judgment itself before amending the way that public authorities collect, hold and otherwise process data—a point made by the noble Lord, Lord Clement-Jones, about the EHRC ruling.
I set out in my opening speech that this Government take the issue of data accuracy seriously. That is why, as I outlined, there are numerous existing work streams addressing the way in which sex and gender data are collected and otherwise processed across the public sector.
The digital verification services amendments that we have discussed today are misplaced, because the Bill does not alter the evidence and does not seek to alter the content of data used by digital verification services. Instead, the Bill enables people to do digitally what they can do physically. It is for organisations to consider what specific information they need to verify their circumstances, and how they go about doing that. Any inconsistency between what they can do digitally and what they can do physically would cause further confusion.
While this Government understand the intention behind the amendments, the concerns regarding the way in which public authorities process sex and gender data should be considered holistically, taking into account the effects of the Supreme Court ruling, the upcoming guidance from the equalities regulator and the specific requirements of public authorities. It is very unlikely that the digital verification services would be used for many of the cases specifically raised by or with many noble Lords. We expect DVS to be used primarily to prove things like one’s right to work or one’s age, address or professional or educational qualifications.
The noble Viscount, Lord Hailsham, rightly highlights that the proposals have the potential to interfere with the right to respect for private and family life under the Human Rights Act by, in effect, indiscriminately and indirectly pushing public authorities to record sex as biological sex in cases where it is not necessary or proportionate in that particular circumstance. I raise the example that has been brought up several times, and again by the noble Baroness, Lady Fox: it is not relevant for the French passport officer to know your biological sex. That is not the purpose of the passport.
We acknowledge, however, that there are safeguards that address the concerns raised by noble Lords, including those of the noble Viscount, Lord Camrose, and the noble Lord, Lord Arbuthnot, regarding information being shared under Clause 45, without presenting issues that could cut across existing or prospective legislation and guidance. I remind the House that the data accuracy principle is already included in law. The principle requires that only data accurate for the purpose for which it is held can be used. Again, there are workstreams looking at data use to answer the points raised by the noble Lord, Lord Arbuthnot, and indeed by the noble and learned Baroness, Lady Butler-Sloss.
The noble Baroness, Lady Ludford, asked why it was not accurate for 15 years and what that means about our reliance on this accuracy. I am afraid the fact is that it was accurate for 15 years because there was a muddle about what was being collected. There was no requirement to push for biological sex, but that is the case now. In response to the question of whether you could end up with two different sources of digital verification showing two different biological sexes, the answer is no.
I beg the House’s indulgence and indeed the Minister’s for my interrupting him. The fact is that the Supreme Court has confirmed what was always the law: that the Equality Act meant biological sex. It is therefore not true that the data accuracy principle has ensured that the law has been followed for the past 15 years. I am sorry, I find that answer a little dismissive. I do not think we can rely on that sort of assurance, and I apologise for saying that.
I apologise to the noble Baroness if she found that dismissive. My point was to try to say that there is a clear imperative under the new situation to have biological sex verified as biological sex. As a result—though not in all cases; I have given an example where it would be inappropriate to have that information—where you need that, it would not be possible, to answer her second question, to have two different sources of verification that gave two different biological sexes.
When information is shared through the gateway, it will be clear what that information represents, including in relation to sex and gender. In the light of the Supreme Court judgment, I further reassure Members by clarifying that, before the information gateway provision is commenced, the Government will carefully consider how and when biological sex may be relevant in the context of digital verification checks, and will take that into account when preparing the DVS code of practice.
I hope that these commitments and the assurance about the EHRC will provide noble Lords with reassurances that their concerns will indeed be taken into account. The amendments proposed do not fully take into account the fact that the Gender Recognition Act gives those with gender recognition certificates a level of privacy and control over who has access to information about their gender history. It is essential that the Government have the chance to fully assess the Supreme Court judgment and update guidance accordingly. Given the need to consider this area holistically to ensure alignment with existing legislation and upcoming EHRC guidance, the breadth of work already being carried out on public data standards and data harmonisation and statistics, and the specific reassurance on compliance with the accuracy principle under the UK GDPR, I hope the noble Viscount feels comfortable not pressing his amendments.
I turn to Motion 43A from the noble Viscount, Lord Colville. Scientific research is one of the UK’s great strengths. We are home to four of the top 10 universities in the world and are in the top three in scientific outputs. Today’s researchers depend on data, and the UK data protection framework contains certain accommodations for processing personal data for purposes that meet the definition of scientific research in Clause 67. I understand the noble Viscount’s intention to avoid misuse of these research provisions, but the Royal Society has said the reasonableness test in the Bill provides adequate protection against that. The Bill actually tightens the current position, with the ICO being able to use the reasonableness test. “Reasonable” does not mean the subjective opinion of an uninformed person; it refers to an objective, fair observer with good judgment and knowledge of the relevant facts. Such tests are well known to UK courts.
The Bill does not extend and expand that definition. If something is not considered scientific research now, it will not be under the Bill. Similarly, the Bill does not provide any new permission for reusing data for other research purposes. Moreover, further safeguards are provided in Clause 86 and the wider UK GDPR, including the requirement that processing be fair. The Bill clarifies that all reuse of data must have a lawful basis, putting an end to previous confusion on the matter. Adding further specific conditions to the definition in law would be unnecessary and would impose a disproportionate burden on researchers, who already say they spend too much time on red tape. The previous Government rightly started to tackle the pernicious creep of increased bureaucracy in research. We should not add more. At worst, this could have an unintended harmful consequence and exclude genuine researchers.
The Frascati manual provides useful guidance; it is not, however, a legal definition. Requiring researchers to start complying with a new legal standard, and one that might change, would undoubtedly create more committees and more bureaucracy—the very thing that Max Perutz argued against in his guidelines on great research.
My noble friend Lord Winston and the noble Lord, Lord Tarassenko, have given powerful examples. Let me give two examples of where the proposals might cause problems. Does requiring research to be creative hinder the essential task in science of testing or reproducing existing findings? Does the Frascati manual definition of “systematic”, which means “budgeted”, exclude unfunded, early research trying to get a foothold? Let us not dampen the UK’s world-leading research sector for a protection that is already included in the Bill.
I sympathise with the intentions of the noble Viscount, Lord Colville. I assure him that the Bill also contains a power to add to the existing safeguards and narrow access to the research provisions if necessary. The Government would not hesitate to use that power if it ever became necessary to tackle misuse.
Moved by
That this House do agree with the Commons in their Amendments 2 to 31.
Moved by
That this House do agree with the Commons in their Amendment 32.
That this House do agree with the Commons in their Amendment 33.
That this House do agree with the Commons in their Amendment 34 and do propose Amendments 34B and 34C instead of the words so left out of the Bill—
That this House do agree with the Commons in their Amendments 35 to 42.
That this House do agree with the Commons in their Amendment 43.
That this House do agree with the Commons in their Amendment 44.
My Lords, with the leave of the House, I will speak also to Amendments 45 to 51 and 78. There has, quite reasonably, been significant interest in the topic of AI and copyright. This is a hugely important issue, and a complex one. I hope that noble Lords will bear with me as I set out the Government’s position, which has been the subject of some misrepresentation in recent reporting. I make it clear that this Bill does not introduce any changes to copyright law or wider intellectual property regulation. It does not introduce an opt-out system, nor does it contain any delegated powers that would allow such a system to be implemented. All existing copyright rules continue to apply to the use of material for AI training in exactly the way they did before the Bill was introduced.
This Government recognise the enormous economic and social value of our creative industries. We saw that just last week, as the nation came together to commemorate the anniversary of VE Day. Our creative sector entertains and informs us. It is the best of us as a nation. Our manifesto quite rightly pledged to work with the creative industries to unlock their potential after years of neglect. As noble Lords will know, the creative industries are worth £124 billion GVA and support 2.4 million jobs. Since 2010, they have grown at 1.5 times the rate of the rest of the economy.
The creative industries are one of our eight priority strands within our industrial strategy. In January 2025, as a first step in delivering that strategy, we announced: first, that the British Business Bank will increase its support for creative industry businesses to help them access the finances they need to grow; secondly, that UKRI will strengthen support for the sector to drive R&D-led growth; thirdly, that shorter-duration apprenticeships will be introduced as a first step towards a flexible growth and skills levy that meets creative industry employers' needs; fourthly, a commitment to devolve funding to six priority mayoral strategic authorities to drive the growth of creative clusters; and, fifthly, a £19 million package of funding for programmes including the UK Games Fund, the UK Global Screen Fund, music export growth schemes and create growth programmes. The Government will build on this support through the upcoming creative industry sector plan, which we will publish very soon.
Our manifesto also recognises both the opportunities and the risks of AI. We pledged to take early action, and one part of this was the launch of a detailed consultation on the future of copyright reform to ensure that protections are fit for purpose as technology evolves and its use becomes more widespread. That consultation closed earlier this year, and we are now analysing a large volume of responses—something in the region of 11,500—and assessing the evidence that we have received. Our proposals will be based on that evidence and what works, rather than any preferred option. This will take time to do properly and, as such, the Government did not and do not believe that this Bill is the right vehicle to make any substantial changes to the law on this issue. Yes, we must act quickly, but we must also continue our thinking and engagement to ensure that the policy outcome is the one that best balances the potential of AI and the need to support rights holders.
Although we do not believe that this Bill is the right vehicle for wholesale change to copyright law, we understand the need to demonstrate that this Government, unlike others, want to follow best practice, engage meaningfully with all sides and come to the right conclusions. This is why the elected House took the decision to remove the relevant amendments passed during Lords stages and insert new provisions to demonstrate our commitment to legislate on AI in a fair, evidence-based way.
Of course we agree that there should be greater transparency about the use of protected material to train AI models. We agree that there should be more work done to identify the technical solutions that will empower rights holders to decide whether and how their material is used. We must continue to talk to all sides and to ensure that a reformed copyright regime is carefully thought through, effectively and robustly supported by the evidence. As our amendments set out, we will report on four substantive areas within 12 months. These will clearly signpost what we want to deliver and how we propose to do so. We will also carry out an economic impact assessment of the proposed changes once we have come to a settled view.
My Lords, I was IP Minister for nearly three years and I am a long-standing member of the APPG on IP. It is a great pleasure to speak from the Back Benches and to support the Motion in the name of the noble Baroness, Lady Kidron, and my noble friend Lord Camrose’s amendment.
What concerns me is that we are witnessing an assault on a sector worth £160 billion to the UK, as we have heard. Actually, I suspect that may be an underestimate, because IP and copyright are to be found in the nooks and crannies of so much of our life and our industry. There has been a lot of mention of music and media. Nobody has mentioned breeding and performance data on racehorses, information on art and antiques, or—close to my heart—the design, by young graduates, of gorgeous new clothing and fancy footwear of the kind that I wear. It is the small operators that are most at risk. That is why I am speaking today.
We are going too slowly. Amendments have been knocked back. The noble Baroness, Lady Kidron, has been trying her hardest, with a great deal of support from right across Britain. As time goes by, AI and LLMs are stealing more of our creativity, hitting UK growth. I believe that the Government must get on. It is not easy, but it is a challenge they have to rise to, and very quickly.
My Lords, I support Motion 49A from the noble Baroness, Lady Kidron. I will also address claims that we have heard repeatedly in these debates: that transparency for AI data is technically unfeasible. This claim, forcefully pushed by technology giants such as Google, is not only unsupported by evidence but deliberately misleading.
As someone with a long-standing background in the visual arts, and as a member of DACS—the Design and Artists Copyright Society—I have witnessed first-hand how creators’ works are being exploited without consent or compensation. I have listened carefully to the concerns expressed by the noble Lord, Lord Tarassenko, in both his email to colleagues today and the letter from entrepreneurs to the Secretary of State. Although I deeply respect their expertise and commitment to innovation, I must firmly reject their assessment, which echoes the talking points of trillion-dollar tech corporations.
The claims by tech companies that transparency requirements are technically unfeasible have been thoroughly debunked. The LAION dataset already meticulously documents over 5 billion images, with granular detail. Companies operate crawler services on this dataset to identify images belonging to specific rights holders. This irrefutably demonstrates that transparency at scale is not only possible but already practised when it suits corporate interests.
Let us be clear about what is happening: AI companies are systematically ingesting billions of copyrighted works without permission or payment, then claiming it would be too difficult to tell creators which works have been taken. This is theft on an industrial scale, dressed up as inevitable technological progress.
The claim from the noble Lord, Lord Tarassenko, that these amendments would damage UK AI start-ups while sparing US technology giants is entirely backwards. Transparency would actually level the playing field by benefiting innovative British companies while preventing larger firms exploiting creative works without permission. I must respectfully suggest that concerns about potential harm to AI start-ups should be balanced against the devastating impact on our creative industries, thousands of small businesses and individual creators whose livelihoods depend on proper recognition and compensation for their work. Their continued viability depends fundamentally on protecting intellectual property rights. Without transparency, how can creators even begin to enforce these rights? The question answers itself.
This is not about choosing between technology and creativity; it is about ensuring that both sectors can thrive through fair collaboration based on consent and compensation. Transparency is not an obstacle to innovation; it is the foundation on which responsible, sustainable innovation is built.
Google’s preferred approach would reverse the fundamental basis of UK copyright law by placing an unreasonable burden on rights holders to opt out of having their work stolen. This approach is unworkable and would, effectively, legalise mass copyright theft to benefit primarily American technology corporations.
Rather than waiting for a consultation outcome that may take years, while creative works continue to be misappropriated, Motion 49A offers a practical step forward that would benefit both sectors while upholding existing law. I urge the House to support it.
I shall make a very brief speech. I stood up when the noble Lord, Lord Clement-Jones, stood up, but unfortunately, as so often in my life, he completely ignored me, so I will just slip in after him and just before our Front Bench. I declare my interest in the register as an adviser to ProRata.ai, which is a company that seeks to pay royalties to creatives for the use of their content in AI models. It was good to see not only the Secretary of State, Peter Kyle, standing at the Bar, but also the Creative Industries Minister, Chris Bryant, which shows that something is up. They were very clearly wanting to be seen by the 400 or so creatives who wrote to the newspapers over the weekend expressing their concerns about the Government’s AI legislation and also to seek, as we all do, to curry favour with the noble Baroness, Lady Kidron, who has led so well on so many of these issues.
As she was speaking and making the point that creatives and technologists are not apart at all, but are together, it reminded me that I became the Technology Minister in the Cameron Government because I was the Creative Industries Minister, and the reason I became the Technology Minister was because I was the only Minister in the Cameron Government in 2010 meeting the technology companies. The reason I was meeting the technology companies was because the technology companies were busily ripping off the intellectual property of the creative industries. At that time, in 2010, you would sit down with Google and say, “Anyone can search for any material on your website, come up with it illegally, stream it and download it without paying the creators of that material. What are you going to do about it?” Of course, they said, “We’re going to do absolutely nothing because you are just a little British Minister, and we only do what the White House tells us to do”.
The Labour Government had passed legislation that was concluded in the wash-up in 2010 that effectively criminalised, to coin a phrase, the teenager in their bedroom downloading music, just as perhaps some of us as teenagers might have taped music off the radio in the past. I knew when I became a Minister that that legislation was completely unworkable. It was pointless to be prosecuting teenagers when you should be taking on big tech. Actually, the music industry found a solution by using the Fraud Act and began to take action in the courts against websites that were completely ripping off IP. It allowed courts to order those websites to be blocked.
I also knew that there would be no solution until there was a commercial solution. In fact, that commercial solution has come about. In 2010, people were predicting the entire death of intellectual property, the death of the music industry, the death of the film industry and the death of television. They have never been healthier: there are commercial models because more people are prepared to pay a subscription to Spotify, Netflix or Amazon Prime to get great content for a reasonable price, so a commercial solution is possible when people work together.
It was interesting to hear the noble Lord, Lord Clement-Jones, talking about the opt-out model because it implies that you can have a conversation between big tech and creatives. The creatives can either opt out or opt in. We referred earlier to licensing deals. If anyone reads FT Weekend—in fact, everyone in this Chamber obviously reads FT Weekend as it is the Bible of the chattering classes—Sam Altman from OpenAI was featured in “Lunch with the FT”, an honour he shares with the noble Baroness, Lady Kidron. In fact, I texted her when she was in “Lunch with the FT” and said that it is better than a peerage. At the beginning of that lunch, it says that the FT has a licensing deal with OpenAI, so it is possible to have licensing deals.
What I think none of us can really stand is the utter hypocrisy of people saying that, for the national interest, we have to rip off intellectual property. It is completely hypocritical and nonsensical. You would not find a single tech chief saying, “I think it is fine if people take our patents because that is how you get economic growth. Just take my patent”. In fact, you will not find a CEO saying that. You will see them saying in court, “He’s ripped off my patent, and I want my money back”. That is intellectual property that big tech is prepared to fight for, yet big tech is still prepared to tell us, just as they told us 15 years ago, that they can grow only by ripping off the IP of the creative industries. Let us face it: there may be AI start-ups that need open source. I totally accept that. It is a complicated landscape, but we are still talking about big tech. We are talking about Microsoft, OpenAI, xAI and Meta. We are talking about the role of the United States. Donald Trump wants to make Hollywood great again. This is where he could start.
My Lords, I first thank all noble Lords from across the House for their many eloquent and well-made speeches. The Government share the passion displayed today. We all care about the creative sector and want to see it flourish. We all want to find ways to make that a reality. We are talking here about the practicalities of how we can do that in a proper way; that is what we are addressing today. Nobody doubts the fantastic contribution that the creative sector makes to the UK. I thought I had set out some of that in my opening speech, but I am very happy to confirm it again.
On the practicalities, the amendment tabled by the noble Baroness, Lady Kidron, sets out wide-ranging obligations on businesses that make AI models available in the UK and would require the Secretary of State to nominate a body to enforce them. I agree with the noble Baroness that the creative sector has always been an early adopter of technology, and that the creative and AI sectors go hand in hand. A number of noble Lords made that point, and made it well.
I also completely recognise the value generated by the creators—again a point well made by a number of noble Lords—and their great cultural and economic contributions to society. The noble Lords, Lord Black and Lord Berkeley, my noble friend Lord Brennan and many other speakers spoke about that.
It is the Government’s view—and, moreover, morally right—that creators should license and be paid for the use of their content. The Government have always been clear that we want to see more licensing by the AI sector. The obligations in the amendment of the noble Baroness, Lady Kidron, however, would affect a wide range of businesses and require detailed disclosure of information. This would include a mechanism to identify individual works, but it is very uncertain whether it would be possible to meet that requirement when a significant proportion of material on the internet does not have clear metadata to facilitate this. The scale of the impact on those businesses is unknown but, without a proper impact assessment, there is a real risk that the obligations could lead to AI innovators, including many home-grown British companies, thinking twice about whether they wish to develop and provide their services in the UK.
We agree that, if transparency obligations are to be created in this way, there will need to be provision for their oversight and enforcement, but that is not something that can be dropped on the first regulator that comes to mind. There is currently no body with the skills and resources to perform this function. We need a proper discussion about funding, clarity over what enforcement powers are required, and answers to a whole range of other questions.
It should also be noted that one of the main issues that creative industries are struggling with is enforcement of their rights under the current rules. As was said earlier—and I am happy to reiterate—we are not saying that the copyright laws are broken; at the heart of this is the question of enforcement.
Transparency would help with knowing what is being used, but that alone will not be a silver bullet for small creators and businesses seeking redress through our legal system. As many noble Lords will know, there are live court cases in train in the UK and other key jurisdictions. The Government, and I, recognise the urgency of the problem, as so fantastically put by the noble Baroness, Lady Benjamin.
This is why DCMS and DSIT Ministers are prioritising meetings with creative and AI stakeholders to discuss potential solutions as a top priority. Indeed, they held meetings and discussions with both sectors last September. We have moved quickly to consult, having hosted round tables and bilateral meetings with creatives and their representatives. These have been of great value and we will continue to hold those meetings.
However, all these moving parts mean that something needs to be developed as a full working approach. The amendment from the noble Baroness, Lady Kidron, does not offer an instant solution, instead asking the Government to come up with regulations in 12 months. We cannot make such significant interventions without properly understanding the impact. This is why our position is to report on four substantive issues within 12 months and set out our proposals in that time. As I said in my opening speech, our proposals will be based on the evidence from the 11,500 responses and, indeed, will concentrate on what works rather than any preferred option. As the noble Lord, Lord Tarassenko, said, the solution must indeed involve creators and AI developers being in the same room, and this is what we will endeavour to do.
I further agree with the noble Lord that AI should not become a way to whitewash copyright piracy. The Government support strong action against copyright piracy and we will continue to do so. I also agree that it is important to support transparency. I cannot say this strongly enough. Noble Lords have seemed to suggest that we are not taking that issue seriously. Of course we are. The Government fully support and are encouraged by the work of the IETF and other fora developing new standards to help identify metadata, which will make this easier.
That this House do agree with the Commons in their Amendment 45.
That this House do agree with the Commons in their Amendment 46.
That this House do agree with the Commons in their Amendments 47 and 48.
That this House do agree with the Commons in their Amendment 49.
That this House do agree with the Commons in their Amendments 50 and 51.
That this House do agree with the Commons in their Amendment 52.
That this House do agree with the Commons in their Amendment 53.
My Lords, with the leave of the House, I will also speak to Amendments 54 to 74 and 79.
We all agree that tackling the abuse of intimate image deepfakes is incredibly important. I am delighted that these provisions are returning to this House, having been strengthened in the other place, enabling us once again to discuss this key issue. I extend my heartfelt thanks to the noble Baroness, Lady Owen, for her dedication on this issue. I am also grateful to the noble Lords, Lord Pannick—who unfortunately is not in his place—and Lord Clement-Jones, and others who have generously given much of their time to discussing this issue with me. Their engagement with me and my ministerial colleagues has been instrumental as we have refined our approach to this important topic. It has been a fantastic example of parliamentarians working across the House to get policy in the strongest possible position.
At Third Reading I committed that the Government would bring forward further amendments in the Commons, including on solicitation and time limits. We have delivered on those commitments. I will begin with Commons Amendment 56, which introduces the requesting offence. This addresses the commitment made on solicitation. It replaces, but builds on and delivers the same intent as, the amendment that your Lordships made to the Bill. It comprehensively criminalises asking someone to create a deepfake intimate image for you without the consent of the person in the image or the reasonable belief in their consent. This is an offence regardless of where the person you are asking is based or whether the image was in fact created.
I turn to the commitment on time limits. Commons Amendment 63 was passed to extend the statutory time limit so that prosecutions can be brought at any date that is both within six months of when sufficient evidence comes to the prosecutor’s knowledge and within three years of when the offence was committed. This means that perpetrators will not get away with creating or requesting the creation of a deepfake just because no one knew about it at the time.
A further change was made in the Commons through Commons Amendment 55, to add a defence of reasonable excuse to both the creating and requesting offences. I know that this is likely to be the subject of much debate today, so I will spend some time setting out the Government’s position.
First, I want to reassure the House that the Government’s priority is to create comprehensive, robust defences which ensure that perpetrators cannot evade justice. It is not our intention that the defences provide defendants with a get-out clause, and we do not believe that they do so. This is especially important to stress for the creation of sexual deepfakes, which are so extraordinarily harmful. In our view, it is extremely unlikely that there will ever be a situation where someone creating a sexually explicit deepfake will be able to prove that they had a reasonable excuse. Indeed, we anticipate that the defences would apply only in an extremely narrow set of circumstances, such as for covert law enforcement operations.
It is also our view that, for a very small minority of cases, such as the creation of genuinely satirical images that are not sexually explicit, the defence to the creating offence is legally necessary for it to be compatible with Article 10 of the European Convention on Human Rights. Without the “reasonable excuse” defence, we consider that the creating offence will not be legally robust, and that any legal challenge to its compatibility with Article 10 is likely to be successful. This will not provide the best protection for the victims. Let me labour this very important point: our intention is to create comprehensive, robust offences that will ensure that those who create or request intimate deepfake images without consent, particularly sexual deepfake images, face grave consequences.
I also want to stress that abusers will not be able to evade justice by using spurious excuses. The defendant must provide enough evidence to prove that the creation, or that particular request, without consent was reasonable. They cannot just say it is art or satire without sufficient compelling evidence. It will be for the court, not the defendant, to decide whether something is in fact art or satire. From my many years as a magistrate, I can also reassure the House that it is simply not the case that a defendant can offer up any excuse and assert that it is reasonable. The CPS will challenge spurious arguments, and the courts are extremely well equipped and used to dealing with such arguments quickly.
The Government share the House’s desire to ensure that criminal law, and these defences in particular, work as well as the Government intend. I therefore speak to support the noble Baroness’s Amendments 55E and 56B, which place a binding obligation on the Government to review the operation of the “reasonable excuse” defence, for both the creating and requesting offences, by putting it in the Bill. As part of this review, we will carry out targeted engagement with external stakeholders and subject matter experts to ensure that we make a broad and informed assessment of the defence.
I hope this addresses the concerns about these defences. The best way to protect victims is to ensure that Parliament passes legally sound and robust offences that can bring perpetrators to justice. I urge the House to do that by supporting Motion 55C and Amendment 56B. I beg to move.
My Lords, I speak to my amendments in this group. In doing so, I declare my interest as a guest of Google at its AI policy conference.
I start by thanking both the Minister and Minister Davies-Jones for taking the time to engage on this issue and for their endless patience. I know they have worked incredibly hard to secure progress on this and I am very grateful for their efforts.
We are down to the issue of whether we believe a person can have a reasonable excuse to create content that looks like a photograph or film of another person without their consent. Noble Lords will recall that this House overwhelmingly indicated that we did not believe “reasonable excuse” should be included as a defence and highlighted concern that it may be misinterpreted or viewed too widely.
I have concerns over the position the Government outlined in their letter from Minister Bryant to the Joint Committee on Human Rights. Minister Bryant argues that the inclusion of “reasonable excuse” is necessary as, without it, the offence would breach the ECHR due to limiting a person’s freedom to create photorealistic satirical art of scenarios such as a person on the toilet or in boxer shorts. Additionally, the Government argued the need for tech companies to be able to red team against this offence.
I share the Government’s strong desire that this Bill should not carry a memorandum warning that it may breach the ECHR, however precarious the arguments laid out may be. I do not want those who abuse women in this way to be able to claim that prosecution may contravene their human rights.
With this in mind, I turn to my first amendments, Amendments 55C and 56B, written in conjunction with the Government, which offer a review of the implementation of “reasonable excuse” for both the creation and requesting offences after two years. I am grateful to the Minister for the compromise. He will know the conflicts I feel about this issue and the great concern I have that, without guardrails, “reasonable excuse” may be used to allow those who abuse others in this sickening way to escape justice.
I know the Minister will offer me reassurance that the courts will be used to hearing precarious excuses. However, my concern—as noble Lords know—is that image-based sexual abuse has been consistently misunderstood, with the Law Commission itself only arguing three years ago that the harm from creating non-consensual sexually explicit content was not serious enough to criminalise. In 2023, Refuge found that, despite steady year-on-year increases in recorded offences for image-based abuse, only 4% of offenders were charged. Even when a conviction was achieved, only 3% of cases resulted in the perpetrator being deprived of the images used for the offence.
We have seen consistent failure by prosecutors to understand and tackle the issue. I therefore have a very real concern that, by allowing “reasonable excuse” to sit in this offence, we risk it being misunderstood and the offence being undermined. Further, while I am grateful for the offer of a review, I am worried that if after two years we find “reasonable excuse” is allowing perpetrators to evade justice, there will not be a legislative vehicle in which to correct the issue, and the time it takes to correct may be lengthy. I would be grateful if the Minister could offer me reassurance on this point.
Additionally, I am concerned by the very premise of the argument that legislation without “reasonable excuse” would breach the ECHR. I have sought the legal counsel of the noble Lord, Lord Pannick, KC—who apologises for not being here this evening—and he believes that the inclusion of “reasonable excuse” in the defence is not necessary in order to be compliant with the ECHR.
The noble Lord, Lord Pannick, advised, as the Joint Committee on Human Rights already highlighted in its letter, that
“the Government has stated that prosecutorial discretion is sufficient to ensure that an offence that could violate a qualified right under the ECHR is nevertheless compliant with it”.
Additionally, all legislation must, so far as possible, be read and given effect to in a manner that is compliant with the ECHR, according to Section 3 of the Human Rights Act 1998. So, even if there were to be a prosecution in the sort of circumstances contemplated by the Government, the defendant could rely on their Article 10 rights, which means that an all-encompassing reasonable excuse is not necessary.
Additionally, I would be grateful if the Minister could outline to the House the reasons why tech companies cannot red team by prompting with the images of people who do consent and, therefore, not requiring a reasonable excuse, should their model fail and end up creating the content that it is trying to avoid. I would go as far as to say that testing prompts on a model using the image of a person who does not consent would be deeply unethical. It is my belief—and the view of the noble Lord, Lord Pannick, and the noble Baroness, Lady Chakrabarti—that such specific examples do not justify general reasonable excuse. To quote my friend and human rights advocate, the noble Baroness, Lady Chakrabarti:
“Spurious ECHR arguments for weakening 21st century cyber sex offences do not help the cause of those seeking to defend human rights from its many detractors”.
My Lords, I rise to speak to the Motion standing in the name of my noble friend Lady Owen of Alderley Edge. Her amendments fall into two categories, and we support her in all of them. I start by joining the noble Baroness, Lady Chakrabarti, and others in paying tribute to her tenacity in pursuing this issue by standing up for women who should not have to live in fear of becoming victims of sexually explicit deepfakes. As mentioned, she has won the deep respect of this House and, at the same time, won many, many friends through her actions. The cross-party support that she has managed to gain from this shows this House at its best—a House of which I am proud to be a Member.
First, my noble friend has tabled reviews to ensure that the offence that is being created as a result of her tireless campaigning is effective. We support her in her Motion and agree with her that we must do everything we can to ensure that the law is robust and effective in protecting women. Secondly, like many others, I have been puzzled by the ECHR reasonable excuse approach being used by the Government. It was very helpful, as ever, to have experts on hand in this matter, and to have my noble friend Lady Cash bring her expertise and agree with the basic position that, while we understand the defence, it is very widely drawn as it is currently set up.
I think what my noble friend is seeking to do in her Motion 55A and Amendment 56A, tightening the definition of reasonable excuse and removing reasonable excuse in the case of requesting sexually explicit deepfakes, is very sensible. I completely understand why she has brought them and, while they would appear to be an alternative to the reviews, which we also support, we feel that my noble friend is right to challenge the inclusion of reasonable excuse as a defence to these offences. On that, she has our complete support.
My Lords, I have listened carefully to the arguments, particularly those in favour of the noble Baroness’s amendment to the creating offence, which seeks to replace the “reasonable excuse” defence with a targeted defence for red-team software testing and reasonable political satire. We share the noble Baroness’s desire to ensure that any defence to the creating offence functions tightly, and share her belief that only in narrow and limited circumstances would a person have a reasonable excuse for the creation of such images without consent. That is how our reasonable excuse defence will apply in practice, which is why the Government believe that the defence is the right way forward.
However, we are unable to agree to these targeted defences that the noble Baroness proposes to the creating offence in place of a reasonable excuse defence. This is a novel offence, tackling behaviour that is changing rapidly along with the technology itself. We cannot anticipate all the ways in which people will use technology as it develops. A defence of reasonable excuse which, as I have said, we believe will be interpreted very carefully by the courts, will ensure that the offence can be used effectively to target culpable perpetrators, even as technology and its uses change. The targeted defences proposed by the noble Baroness would also, crucially, not eliminate the risk of successful legal challenge, which I explained in my opening speech. Even with such targeted defences, the creating offence risks successful challenge in the courts, leading to uncertainty and reduced protection for victims.
I turn briefly to Amendment 56A on the requesting offence. As I have set out, the reasonable excuse defence to the requesting offence will only apply in an extremely narrow set of circumstances, such as covert law enforcement operations. The legal issue which applies to the creating offence does not apply to the requesting offence. However, we always aim for consistency and parity across similar offences and so urge this House not to pass Amendment 56A to the requesting offence. Also, without the defence that the Commons included for the requesting offence, law enforcement and intelligence officials may be unable to effectively carry out their functions.
We made a manifesto commitment to ban the creation of sexually explicit deepfakes. This legislation, as amended in the Commons, does just that. For the first time, there will be protection for victims and punishment for the perpetrators who create, or ask other people to create, intimate deepfakes of adults without consent or a reasonable belief of consent. These provisions represent an important and necessary response to intimate image deepfakes. The Government are clear that these offences are comprehensive and robust. While a defence of reasonable excuse to both offences is necessary, it does not provide a get-out clause for the many perpetrators creating intimate deepfakes, especially sexual deepfakes, without consent. We remain firmly of the view that this is the most effective way to protect victims from this appalling abuse. It is our duty to act decisively. For those reasons, I urge your Lordships to support, with confidence, Motion 55C, containing as amendments in lieu Amendments 55D and 55E, and Amendment 56B. I urge the noble Baroness, Lady Owen, to withdraw her Motion 55A and Amendment 56A.
The noble Baroness asked about deprivation orders. We share her frustration with this. The ability for courts to apply deprivation orders has been in place but these have not been used as extensively as they could be, so the judges are looking at sentencing guidelines to see how that lack of implementation of deprivation orders can be remedied. My noble friend Lady Chakrabarti asked whether offenders of the requesting offence would also be deprived of images by the court. Yes, they would be. We want to ensure parity across the creating and the requesting offence, so that includes their computers and any images that are stored anywhere.
A number of noble Lords have expressed scepticism about whether the courts would adequately apply the reasonable excuse defence, which really is the nub of the issue which we are debating now. I have had this discussion many times with the noble Baroness, Lady Owen, in private. I must say, as a magistrate for nearly 20 years, that we often hear completely ridiculous defences. It is certainly not unusual in magistrates’ court—or, I am sure, in Crown Court—and magistrates and judges are well able to deal with those types of defences. I know that the noble Baroness is sceptical of that, which is one of the prime reasons why we have put the review in the Bill. She will know it is very unusual for Governments to commit in a Bill to have a review, but it is because we understand that this is a new area of law and that the way we are defining “reasonable excuse” is a politically contentious area. I urge her to continue to work with us, which I am sure she will do in any event, and I urge her not to move her amendments to a vote. I beg to move.
That this House do agree with the Commons in their Amendment 54.
That this House do agree with the Commons in their Amendment 55.
That this House do agree with the Commons in their Amendment 56.
That this House do agree with the Commons in their Amendments 57 to 79.