Lords Chamber

To ask His Majesty’s Government what assessment they have made of the Advanced Research and Invention Agency’s handling of an Environmental Information Regulations request regarding its “Scoping Our Planet” programme.
ARIA fully complies with its responsibilities under the Environmental Information Regulations. ARIA is committed to transparency; it publishes regular information on its programmes in its annual reports and accounts, in the corporate plan and through the quarterly transparency disclosures on its website. It publishes its responses to all EIR requests.
My Lords, the Minister mentions ARIA being committed to transparency, but that highlights the fact that it is not subject to the general freedom of information provisions under the ARIA Act. I note that on Report on the ARIA Bill the Labour Opposition Front Bench signed and supported in a Division an amendment tabled by me to bring ARIA into the provisions of the Freedom of Information Act. In fact, the noble Baroness, Lady Chapman of Darlington, said:
“The Government’s determination to keep ARIA’s projects and decision-making secret is worrying. This is a matter of principle: do they believe in transparency, or not?”—[Official Report, 14/12/2021; col. 209.]
I can now ask the same question of this Labour Government: do they believe in transparency? Will they bring ARIA within the Freedom of Information Act?
I thank the noble Lord for his question. I know that he is prone to shaking his head when Ministers answer. I fear that I may give him a neck injury during this answer.
Of course we are committed to transparency, but we have no plans to bring ARIA into the scope of the FoI Act. ARIA is a unique organisation with unique freedoms; it has been designed deliberately to be a small, agile body with limited administrative capacity so that most of its effort can be devoted to finding the answers to some of the missions that it funds—long-term transformational research for the benefit of the UK. However, both the Government and ARIA understand the importance of transparency, and ARIA publishes all its information on recipients of programme funding, transactional information on its operational costs, and data on the regional distribution of its programmes and funding. It complies with the Environmental Information Regulations, is audited annually by the NAO, and publishes its annual reports and accounts.
My Lords, the Minister’s arguments are sounding dangerously like those made by the noble Lord, Lord Callanan, on Report, which I am sure he will be delighted by. Does he accept that DARPA is covered by US freedom of information legislation, whereas ARIA is not?
DARPA is a much larger organisation and the ARPA family overall probably has close to 1,000 people working in it in total. DARPA is covered by the US Act, but it has a much larger base and many more people working with it. As the noble Lord, Lord Patel, said, the amount of information that ARIA puts in the public domain is more than that of almost any other body in the world.
Lords Chamber

The interaction between the public and private sectors is crucial in this, as it is in many other areas. UKRI is leading a number of public programmes which support universities and the ability to get spin-outs and developments from them, so there is considerable interaction at the beginning of the process. There is also interaction throughout the process; for example, the AI Security Institute is working with some of the largest companies and looking at their models to ensure that, as they are developed, issues that could come up are foreseen and, we hope, mitigated in advance. Collaboration between the public and private sectors is crucial in AI, as in many areas of technology development.
My Lords, the Government have agreed to create a new function—UK sovereign AI—to partner with the private sector and maximise the UK’s stake in what is described as frontier AI. Further details were promised by spring 2025. By my calculation, spring is over. What powers will this unit have to invest directly in companies, create joint ventures or provide advanced market commitments, as recommended in the plan, and how will it ensure economic benefit and influence on AI governance in the UK?
AI sovereignty is a crucial issue. It ranges from questions of what infrastructure and companies we need in this country to what public work we need to do to make sure that we can access the AI required. AI sovereignty is very much part of the AI action plan; the spending review is under way and there will be more information on what exactly will happen in its different areas post spending review. The areas the noble Lord raises are all important—they are the right ones. Spring is nearly over. It will not be in spring, but we hope to give more information shortly.
Lords Chamber

To ask His Majesty’s Government what steps they are taking to strengthen science and innovation following reports that the Alan Turing Institute is cutting research projects.
My Lords, the Government are protecting record levels of R&D investment, with £20.4 billion allocated in 2025-26. Through UKRI and other mechanisms, we are supporting science and innovation across the UK to better deliver on the Government’s priorities and maximise the potential of UK science. The Alan Turing Institute is of course an important part of the R&D system and is currently focusing its research activities on fewer projects, in line with its refreshed Turing 2.0 strategy. The Alan Turing Institute is an independent organisation, and this realignment process is being handled internally.
My Lords, I welcome today’s funding announcements. However, after a review by the EPSRC, a revised strategy and a further external review, the Turing is shutting down at least 21 science and innovation projects, three out of the four science and innovation directors have resigned, together with the chief technology officer, and at the end of last year staff sent a letter of no confidence in the leadership, saying there had been a “catastrophic decline in trust” and claiming that the viability of the institute was under question. What does all this mean for the future of the Turing, which has an enormously valuable track record and role in the AI research and innovation ecosystem? Will it continue to have a leading role in advising on AI ethics, regulation, standards and responsible innovation?
The Alan Turing Institute was set up by six universities and now has some 65 university partners. The 2023 quinquennial review identified a number of governance and programme issues that needed to be addressed, including that the institute was spread thinly across a broad area. The Turing 2.0 strategy will focus on fewer areas, put more resource behind those projects and ensure that there is real progress to build on the strengths that the noble Lord has rightly identified. The four Alan Turing Institute challenges are in health, the environment, defence and security—in which it has a very major role to play—and fundamental AI. Going through this repositioning is a major undertaking, involving a lot of current upheaval.
Lords Chamber

I thank my noble friend. We have always been the beneficiaries of brain gain; we have been attractive to top-class overseas researchers for many years. Indeed, about one-third of our Nobel prize winners are first or second-generation immigrants. For 2025-26, UKRI has roughly £770 million for talent funding, of which £170 million is for future leader fellowships. There is an opportunity, as there always is, to attract people from overseas to the UK, both individuals and groups; indeed, there are mechanisms in place to do so. I am looking very carefully at what further mechanisms can be put in place to make sure we remain a country that attracts the very brightest and best.
My Lords, the Government’s immigration White Paper, as the Minister said, expresses the ambition to attract top global talent, including scientists. However, measures such as the increased skills charge, alongside high UK visa costs and the challenging context of flat cash real-terms cuts in core research funding, create barriers to recruitment. The Government do not seem clear about whether they want to attract international scientists or not. Do we not need a proper long-term plan with increasing investment to maintain the UK’s research leadership and attract talent?
The current SR period has £20.4 billion for R&D, which is the highest amount there has been. Of course, a proportion of that is about talent attraction. The talent attraction announced in the White Paper was geared towards the global talent visa—the level of highly skilled people who can bring great value added to this country. The desire is to increase the threshold for the skilled worker visa to aim for more qualified, more talented people. On the high-talent end of the system, there are clear measures in the immigration White Paper to try to get those systems to work better and faster. The cost of visas and the health surcharge is now met on UKRI grants and on Horizon Europe grants.
Lords Chamber

As I mentioned, there are three AI exemplars being used at the moment. They are: future customer experience; citizen AI agents—so starting with an AI agent to help young people to find a job or an education pathway; and the government efficiency accelerator. In all these examples, procurement is exactly one of the things that needs to be looked at. I have mentioned previously in this House that AI assurance services are part of this as well. The point raised, which is that it is easy to get the wrong thing, is right, and we need to look very carefully at this.
Back in January, the Blueprint for Modern Digital Government stated the intention to establish
“an AI adoption unit to build and deploy AI into public services, growing AI capacity and capability across government, and building trust, responsibility and accountability into all we do”.
How will this new AI adoption unit ensure that ethical principles, safety standards and human rights considerations are embedded from the very beginning of the AI adoption process throughout the public sector rather than being treated as a secondary concern after deployment?
The deployment of AI has started, as the noble Lord recognised, and I have given the three headline exemplars—and others are being put in through the incubator for AI that sits within DSIT. He raises a crucial point, and that is why the responsible AI advisory panel is being set up, which will include civil society, industry and academia to make sure that this is looked at properly. An ethics unit is already looking at this, and there are many diverse groups across government. What the Government Digital Service is trying to do is to pull it together into something more coherent, of which I think the responsible AI advisory panel is an important part.
Lords Chamber

I am sure that the noble Lord is aware that the creative industries are some of the greatest users of AI. Of course, it is important that creativity is protected. That is why a consultation has been put out around the copyright issue, which has been discussed many times in this Chamber. In all walks of life, it is important that we understand what AI brings and where it must be controlled in order to allow other things to happen. That is true not only in the creative industries but in many other areas.
My Lords, the Government failed to sign up to the declaration signed by 60 other countries at the recent Paris AI Action Summit. How much confidence can that now give us that any new AI Bill will prioritise a requirement for AI, in the words of the declaration, to be
“open, inclusive, transparent, ethical, safe, secure … trustworthy”
and sustainable? Given that the Government did sign up to the Seoul communiqué last year and hosted the Bletchley Park summit, are they now going backwards in this respect?
I can assure the noble Lord that the Government are most certainly not going backwards in this respect. I can also assure him that the AI Security Institute which has been set up has driven much of this across the world. It is linked to similar units elsewhere; it is undertaking work on many models that are evolving; and it is making its own work open, including the approach it takes. There is a very robust system being developed to make sure that the UK is at the forefront of this, not in the following stream.
Grand Committee

My Lords, I congratulate my noble friend Lord Foster of Bath on securing the debate today and on his penetrating introduction, which included a number of extremely important questions for the Minister.
AI clearly has many creative uses, as Sir Paul McCartney himself emphasised last Sunday. But it is one thing to use the tech and another to be at the mercy of it, as so many noble Lords emphasised in their thoughtful but passionate speeches, both on Tuesday and today. So many outside organisations—I thank them for their briefings—have also made that very clear in what they have said.
The use of IP-protected content for training is a key issue, which has also arisen in relation to generative AI models outside the UK. It is rather a delicious irony that OpenAI is now complaining of its own IP being used to train DeepSeek, as my noble friend said. Here in the UK, the Government’s intentions are clear. The new consultation on AI and copyright, reinforced by the AI opportunities plan, has set out a preferred option—this is the key thing—to change the UK’s copyright framework by creating a text and data mining exception where rights holders have not expressly reserved their rights: in other words, an opt-out system.
We all thought this had been put to bed under the last Government, but this Government seem even more intent on creating a Singapore-on-Thames. In response, we have seen the creation of a new campaign across the creative and news industries, Creative Rights In AI Coalition, and Ed Newton-Rex has gathered over 37,000 signatures from creators and creative organisations.
Frankly, the creative and news industries are in uproar. As my noble friend Lord Foster says, the proposals were not underpinned by a robust economic case, but the consultation also starts from the false premise of legal uncertainty. As we heard in the debate on the amendment in the name of the noble Baroness, Lady Kidron, on Tuesday, there is no lack of clarity over how AI developers can legally access training data. UK law is clear that commercial organisations, including gen AI developers, must license the data they use to train their large language models. AI developers have already reached agreement with news publishers in a number of cases. OpenAI has signed deals with publishers internationally, such as News Corp, Axel Springer, the Atlantic and Reuters. There can be no excuse of market failure. There are well-established licensing solutions administered by a variety of well-established mechanisms and collecting societies.
The consultation says:
“The government believes that the best way to achieve these objectives is through a package of interventions that can balance the needs of the two sectors”.
But what kind of balance is this when it is all take and no give on the part of creatives? The Government have stated that they will move ahead with their preferred “rights reservation” option only if the transparency and rights reservation provisions are
“effective, accessible, and widely adopted”.
However, as we have heard from across the Room today, no effective rights reservation system for the use of content by gen AI models has been proposed or implemented anywhere in the world, which makes the government proposals entirely speculative. The technology does not exist.
The laws around transparency of these activities have not caught up. At present, developers can scrape content from the internet without declaring their identity, or they may use content scraped for one purpose for the completely different commercial purpose of training AI models. How can rights owners opt out of something they do not know about? Once used to train these models, the commercial value has already been extracted from IP scraped without permission, with no way to delete data from these models.
We need transparency and a clear statement about copyright. We absolutely should not expect artists to have to opt out. AI developers must be transparent about the identity and purposes of their crawlers, and have separate crawlers for distinct purposes. Unless news publishers and the broader creative industries can retain control over their data, this will not only reduce investment in creative output but will ultimately harm innovation in the AI sector and, as we have heard, tech developers will lack the high-quality data that is the essential fuel in generative AI.
Retaining the Kidron amendments to address the challenges posed by AI development, particularly in relation to copyright and transparency, is in my view, and that of those on these Benches, essential. This should apply regardless of in which country the scraping of copyright material takes place if developers market their product in the UK. It is clear that AI developers have used their lobbying clout to persuade the Government that a new exemption from copyright in their favour is required. As a result, the Government seem to have gone soft on big tech. In response, my party, creators, the creative industries and many other supporters will be vigorously opposing Government plans for a new text and data-mining exemption.
The Minister has been posed a number of key questions by my noble friend Lord Foster and many others, including the noble Lord, Lord Black of Brentwood. I put another question to him: will he now agree to withdraw the TDM with an opt-out as the preferred solution? That is one of the key requests of the creative industries; they would be dancing in the streets if the Minister said that today.
Lords Chamber

My Lords, it is a pleasure to open the second day on Report on the Data (Use and Access) Bill. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 38 in my name, I will not speak to any other amendments in this group.
Amendment 38 goes to the heart of the issue du jour: regulators have seldom been so much in the press and in the public eye. As the press would have it, they were hauled into No. 11 just a few days ago, but this speaks to what we want from our regulators across our economy and society. At their best, our regulators are the envy of the world. Just consider the FCA when we did the fintech regulatory sandbox: as a measure of success, it was replicated in well over 50 jurisdictions around the world.
We know how to do right-sized regulation and how to set up our regulators to succeed at that most difficult of tasks: balancing innovation, economic growth, and consumers’ and citizens’ rights. That is what all regulators should be about. It is not straightforward; it is complex but entirely doable.
Amendment 38 simply proposes wording to assist the Information Commissioner’s Office. When it comes to the economic growth duty—“#innovation”—it simply refers back to Section 108 of the 2015 Act. I believe that bringing this clarity into the Bill will assist the regulator and enable all the conversations that are rightly going on right now, and all the plans that are being produced and reported on, such as those around AI, to be properly discussed and given proper context, with an Information Commissioner’s Office that is supported through clarity as to its responsibilities and obligations when it comes to economic growth. In simple terms, this would mean that these responsibilities are restricted and clearly set out according to Section 108 of the 2015 Act. It is critical that this should be the case if we are to have clarity around the commissioner’s independence as a supervisory authority on data protection, an absolutely essential condition for EU adequacy decisions.
I look forward to the Minister’s response. I hope that he likes my drafting. I hope that he will accept and incorporate my amendment into the Bill. I look forward to the debate. I beg to move.
My Lords, I rise to support Amendment 38 in the name of the noble Lord, Lord Holmes. More than ever before, the commissioner, alongside other regulators, is being pressured to support the Government’s growth and innovation agenda. In Clause 90, the Bill places unprecedented obligations on the ICO to support innovation. The question, in respect of both the existing growth duty and Clause 90, is whether they are in any sense treated as overriding the ICO’s primary responsibilities in data protection and information rights. How does the ICO aim to balance those duties, ensuring that its regulatory actions support economic growth while maintaining necessary protections?
We need to be vigilant. As it is, there are criticisms regarding the way the Information Commissioner’s Office carries out its existing duties. Those criticisms can be broadly categorised into issues with enforcement, independence and the balancing of competing interests. The ICO has a poor record on enforcement; it has been reluctant to issue fines, particularly to public sector organisations. There has been an overreliance on reprimands, as I described in Committee. The ICO has been relying heavily on reprimands, rather than stronger enforcement actions. It has also been accused of being too slow with its investigations.
There are concerns about these new duties, which could pose threats to the ability of the Information Commissioner’s Office to effectively carry out its primary functions. For that reason, we support the amendment from the noble Lord, Lord Holmes.
My Lords, I support the amendment in the name of the noble Baroness, Lady Kidron, to which I have added my name. I will speak briefly because I wish to associate myself with everything that she has said, as is normal on these topics.
Those of us who worked long and hard on the Online Safety Act had our fingers burnt quite badly when things were not written into the Bill. While I am pleased—and expect to be even more pleased in a few minutes—that the Government are in favour of some form of code of conduct for edtech, whether through the age-appropriate design code or not, I am nervous. As the noble Baroness, Lady Kidron said, every day with Ofcom we are seeing the risk-aversion of our regulators in this digital space. Who can blame them when it appears to be the flavour of the month to say that, if only the regulators change the way they behave, growth will magically come? We have to be really mindful that, if we ask the ICO to do this vaguely, we will not get what we need.
The noble Baroness, Lady Kidron, as ever, makes a very clear case for why it is needed. I would ask the Minister to be absolutely explicit about the Government’s intention, so that we are giving very clear directions from this House to the regulator.
My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. I have added a few further words to my speech in response, because she made an extremely good point. I pay tribute to the noble Baroness, Lady Kidron, and her tenacity in trying to make sure that we secure a code for children’s data and education, which is so needed. The education sector presents unique challenges for protecting children’s data.
Like the noble Baronesses, Lady Kidron and Lady Harding, I look forward to what the Minister has to say. I hope that whatever is agreed is explicit; I entirely agree with the noble Baroness, Lady Harding. I had my own conversation with the Minister about Ofcom’s approach to categorisation which, quite frankly, does not follow what we thought the Online Safety Act was going to imply. It is really important that we absolutely tie down what the Minister has to say.
The education sector is a complex environment. The existing regulatory environment does not adequately address the unique challenges posed by edtech, as we call it, and the increasing use of children’s data in education. I very much echo what the noble Baroness, Lady Kidron, said: children attend school for education, not to be exploited for data mining. Like her, I cross over into considering the issues related to the AI and IP consultation.
The worst-case scenario is using an opt-in system that might incentivise learners or parents to consent, whether that is to state educational institutions such as Pearson, exam boards or any other entity. I hope that, in the other part of the forest, so to speak, that will not take place to the detriment of children. In the meantime, I very much look forward to what the Minister has to say on Amendment 44.
My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.
AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.
However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.
We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.
However, as I said earlier on Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law, and specifically, in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.
My Lords, I can be pretty brief. We have had some fantastic speeches, started by the noble Baroness, Lady Kidron, with her superb rallying cry for these amendments, which we 100% support on these Benches. As she said, there is cross-party support. We have heard support from all over the House and, as the noble and learned Baroness, Lady Butler-Sloss, has just said, there has not been a dissenting voice.
I have a long association with the creative industries and with AI policy and yield to no one in my enthusiasm for AI—but, as the noble Baroness said, it should not come at the expense of the creative industries. It should not just be for the benefit of DeepSeek or Silicon Valley. We are very clear where we stand on this.
I pay tribute to the Creative Rights in AI Coalition and its campaign, which has been so powerful in garnering support, and to all those in the creative industries and creators themselves who briefed noble Lords for this debate.
These amendments respond to deep concerns that AI companies are using copyright material without permission or compensation. With the new government consultation, I do not believe that their preferred option is a straw man for a text and data mining exemption, with an opt-out, an issue that we thought was settled under the previous Government. It starts from the false premise of legal uncertainty, as we have heard from a number of noble Lords. As the News Media Association has said, the Government’s consultation is based on a mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue. The use of copyrighted content without a licence by gen AI firms is theft on a mass scale and there is no objective case for a new text and data mining exception.
No effective opt-out system for the use of content by gen AI models has been proposed or implemented anywhere in the world, making the Government’s proposals entirely speculative. It is vital going forward that we ensure that AI companies cannot use copyrighted material without permission or compensation; that AI development does not exploit loopholes to bypass copyright laws; that AI developers disclose the sources of the data they use for training their models, allowing for accountability and addressing infringement; and that we reinforce the existing copyright framework, rather than creating new exceptions that disadvantage creators.
These amendments would provide a mechanism for copyright holders to contest the use of their work and ensure a route for payment. They seek to ensure that AI innovation does not come at the expense of the rights and livelihoods of creators. There is no market failure. We have a well-established licensing system as an alternative to the Government’s proposed opt-out scheme for AI developers using copyrighted works. A licensing system is the only sustainable solution that benefits both creative industries and the AI sector. We have some of the most effective collective rights organisations in the world. Licensing is their bread and butter. Merely because AI platforms are resisting claims does not mean that the law in the UK is uncertain.
Amending UK law to address the challenges posed by AI development, particularly in relation to copyright and transparency, is essential to protect the rights of creators, foster responsible innovation and ensure a sustainable future for the creative industries. This should apply, if developers market their product in the UK, regardless of the country in which the scraping of copyright material or the training takes place. It would also ensure that AI start-ups based in the UK are not put at a competitive disadvantage due to the ability of international firms to conduct training in a different jurisdiction.
As we have heard throughout this debate, it is clear that the options proposed by the Government have no proper economic assessment underpinning them, no technology for an opt-out and no proposed enforcement mechanism. It baffles me why the Conservative Opposition is not supporting these amendments, and I very much hope that the voices we have heard on the Conservative Benches will make sure that these amendments pass with acclamation.
I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to the noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.
In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, our view is that the Government’s consultation is in mid-flight, and we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will evidence some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.
Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model; and secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.
I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.
My Lords, Amendment 46 seeks a review of court jurisdiction. As I said in Committee, the current system’s complexity leads to confusion regarding where to bring data protection claims—tribunals or courts? This is exacerbated by contradictory legal precedents from different levels of the judiciary, and it creates barriers for individuals seeking to enforce their rights.
Transferring jurisdiction to tribunals would simplify the process and reduce costs for individuals, and it would align with the approach for statutory appeals against public bodies, which are typically handled by tribunals. In the Killock v Information Commissioner case, Mrs Justice Farbey explicitly called for a “comprehensive strategic review” of the appeal mechanisms for data protection rights. That is effectively what we seek to do with this amendment.
In Committee, the noble Baroness, Lady Jones, raised concerns about transferring jurisdiction and introducing a new appeals regime. She argued that the tribunals lacked the capacity to handle complex data protection cases, but tribunals are, in fact, better suited to handle such matters due to their expertise and lower costs for individuals. Additionally, the volume of applications under Section 166—“Orders to progress complaints”—suggests significant demand for tribunal resolution, despite its current limitations.
The noble Baroness, Lady Jones, also expressed concern about the potential for a new appeal right to encourage “vexatious challenges”, but introducing a tribunal appeal system similar to the Freedom of Information Act could actually help filter out unfounded claims. This is because the tribunal would have the authority to scrutinise cases and potentially dismiss those deemed frivolous.
The noble Baroness, Lady Jones, emphasised the existing judicial review process as a sufficient safeguard against errors by the Information Commissioner. However, judicial review is costly and complex, presenting a significant barrier for individuals. A tribunal system would offer a much more accessible and less expensive avenue for redress.
I very much hope that, in view of the fact that this is a rather different amendment—it calls for a review—the Government will look at this. It is certainly called for by the judiciary, and I very much hope that the Government will take this on board at this stage.
I thank the noble Lord, Lord Clement-Jones, for moving his amendment, which would require the Secretary of State to review the potential impact of transferring to tribunals the jurisdiction of the courts in relation to all data protection provisions. As I argued in Committee, courts have a long-standing authority and expertise in resolving complex legal disputes, including data protection cases, and removing the jurisdiction of the courts could risk undermining the depth and breadth of legal oversight required in such critical areas.
That said, as the noble Baroness, Lady Jones of Whitchurch, said in Committee, we have a mixed system of jurisdiction for legal issues relating to data, and tribunals have an important role to play. So, although we agree with the intentions behind the amendment from the noble Lord, Lord Clement-Jones, we do not support the push to transfer all data protection provisions from the courts to tribunals, as we believe that there is still an important role for courts to play. Given the importance of the role of the courts in resolving complex cases, we do not feel that this review is necessary.
My Lords, before the noble Viscount sits down, I wonder whether he has actually read the amendment; it calls for a review, not for transfer. I think that his speech is a carryover from Committee.
I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.
Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.
I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.
My Lords, I thank the Minister for that dusty reply. I wonder whether he has been briefed about particular legal cases, such as Killock or Delo, where the judiciary themselves were confused about the nature of the different jurisdictions of tribunal and court. The Minister and, indeed, the noble Viscount, Lord Camrose, seemed to make speeches on the basis that all is wonderful and the jurisdiction of the courts and tribunals is so clearly defined that we do not need a review. That is not the case and, if the Minister were better briefed about the obiter, if not the judgments, in Delo and Killock, he might appreciate that there is considerable confusion about jurisdiction, as several judges have commented.
I am very disappointed by the Minister’s reply. I think that there will be several judges jumping up and down, considering that he has not really looked at the evidence. The Minister always says that he is very evidence-based. I very much hope that he will take another look at this—or, if he does not, that the MoJ will—as there is considerably greater merit in the amendment than he accords. However, I shall not press this to a vote and I beg leave to withdraw the amendment.
My Lords, I too support this. I well remember the passage of the Computer Misuse Act, and we were deeply unhappy about some of its provisions defining hacker tools et cetera, because they had nothing about intention. The Government simply said, “Yes, they will be committing an offence, but we will just ignore it if they are good people”. Leaving it to faceless people in some Civil Service department to decide who is good or bad, with nothing in the Bill, is not very wise. We were always deeply unhappy about it but had to go along with it because we had to have something; otherwise, we could not do anything about hacking tools being freely available. We ended up with a rather odd situation where there is no defence against being a good guy. This is a very sensible amendment to clean up an anomaly that has been sitting in our law for a long time and should probably have been cleaned up a long time ago.
My Lords, I support Amendments 47 and 48, which I was delighted to see tabled by the noble Lords, Lord Holmes and Lord Arbuthnot. I have long argued for changes to the Computer Misuse Act. I pay tribute to the CyberUp campaign, which has been extremely persistent in advocating these changes.
The CMA was drafted some 35 years ago—an age ago in computer technology—when internet usage was much lower and cybersecurity practices much less developed. This makes the Act in its current form unfit for the modern digital landscape and inhibits security professionals from conducting legitimate research. I will not repeat the arguments made by the two noble Lords. I know that the Minister, because of his digital regulation review, is absolutely apprised of this issue, and if he were able to make a decision this evening, I think he would take them on board. I very much hope that he will express sympathy for the amendments, however he wishes to do so—whether by giving an undertaking to bring something back at Third Reading or by doing something in the Commons. Clearly, he knows what the problem is. This issue has been under consideration for a long time, in the bowels of the Home Office—what worse place is there to be?—so I very much hope that the Minister will extract the issue and deal with it as expeditiously as he can.
I thank my noble friend Lord Holmes for tabling the amendments in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as this new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so their provisions on access become legally relevant, and the amendment reflects this.
When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.
My Lords, I will speak to Amendment 48B. In our view, cookie paywalls create an unfair choice for users, essentially forcing them to pay for privacy. We tabled an amendment in Committee to ban cookie paywalls, but in the meantime, as the noble Baroness, Lady Jones, heralded at the time, the Information Commissioner’s Office has provided updated guidance on the “consent or pay” model for cookie compliance. It is now available for review. This guidance clarifies how organisations can offer users a choice between accepting personalised ads for free access or paying for an ad-free experience while ensuring compliance with data protection laws. It has confirmed that the “consent or pay” model is acceptable for UK publishers, provided certain conditions are met. Key requirements for valid consent under this model include: users must have genuine free choice; the alternative to consent—that is, payment—must be reasonably priced; and users must be fully informed about their options.
The guidance is, however, contradictory. On the one hand, it says that cookie paywalls
“can be compliant with data protection law”
and that providers must document their assessments of how it is compliant with data protection law. On the other, it says that, to be compliant with data protection law, cookie paywalls must allow users to choose freely without detriment. However, users who do not wish to pay the fee to access a website will be subject to detriment, because with a cookie paywall they must pay a fee if they wish to refuse consent. This is described as the “power imbalance”. It is also worth noting that this guidance does not constitute legal advice; it leaves significant latitude for legal interpretation and argument as to the compatibility of cookie paywalls with data protection law.
The core argument against “consent or pay” models is that they undermine the principle of freely given consent. The ICO guidance emphasises that organisations using these models must be able to demonstrate that users have a genuine choice and are not unfairly penalised for refusing to consent to data processing for personalised advertising. Yet in practice, given the power imbalance, on almost every occasion this is not possible. This amendment seeks to ensure that individuals maintain control over their personal data. By banning cookie paywalls, users can freely choose not to consent to cookies without having to pay a fee. I very much hope that the Government will reconsider the ICO’s guidance in particular, and consider banning cookie paywalls altogether.
My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.
Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee as it actually seeks to curtail choice. Currently, users have the options to pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that it is the fact that so much of the internet is apparently, but not actually, free that has caused a great deal of damage, rather than having an open charging model. This approach finally reveals the exact cash value of individuals’ data that websites are harvesting and offers users choice. We do not agree with attempts to remove that choice.
My Lords, these amendments have to do with research access for online safety. Having sat on the Joint Committee of the draft Online Safety Bill back in 2021, I put on record that I am delighted that the Government have taken the issue of research access to data very seriously. It was a central plank of what we suggested and it is fantastic that they have done it.
Of the amendments in my name, Amendment 51 would simply ensure that the provisions of Clause 123 are acted on by removing the Government’s discretion as to whether they introduce regulations. It also introduces a deadline of 12 months for the Government to do so. Amendment 53 seeks to ensure that the regulators will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users, including children. Given the excitements we have already had this evening, I do not propose to press any of them, but I would like to hear from the Minister that he has heard me and that the Government will seek to enshrine the principle of different ages, different stages, different people, when he responds.
I note that the noble Lord, Lord Bethell, who has the other amendments in this group, to which I added my name, is not in his place, but I understand that he has sought—and got—reassurance on his amendments. So there is just one remaining matter on which I would like further reassurance: the scope of the legal privilege exception. A letter from the Minister on 10 January explains:
“The clause restates the existing law on legally privileged information as a reassurance that regulated services will not be asked to break the existing legislation on the disclosure of this type of data”.
It seems that the Minister has veered tantalisingly close to answering my question, but not in a manner that I can quite understand. So I would really love to understand—and I would be grateful to the Minister if he would try to explain to me—how the Government will prevent tech companies using legal privilege as a shield. Specifically, would CCing a lawyer on every email exchange, or having a lawyer in every team, allow companies to prevent legitimate scrutiny of their safety record? I have sat in Silicon Valley headquarters and each team came with its own lawyer—I would really appreciate clarity on this issue. I beg to move.
My Lords, I can only support what the noble Baroness, Lady Kidron, had to say. This is essentially unfinished business from the Online Safety Act, which we laboured in the vineyard to deliver some time ago. These amendments aim to strengthen Clause 123 and try to make sure that this actually happens and that we do not get the outcomes of the kind that the noble Baroness has mentioned.
I, too, have read the letter from the Minister to the noble Lord, Lord Bethell. It is hedged about with a number of qualifications, so I very much hope that the Minister will cut through it and give us some very clear assurances, because I must say that I veer back and forth when I read the paragraphs. I say, “There’s a win”, and then the next paragraph kind of qualifies it, so perhaps the Minister will give us true clarity when he responds.
My Lords, I wanted to add something, having spent a lot of time on Part 3 of the Digital Economy Act, which after many assurances and a couple of years, the Executive decided not to implement, against the wishes of Parliament. It worries me when the Executive suddenly feel that they can do those sorts of things. I am afraid that leopards sometimes do not change their spots, and I would hate to see this happen again, so Amendment 51 immediately appeals. Parliament needs to assert its authority.
(5 months, 2 weeks ago)
Lords Chamber
My Lords, I share in the congratulations of my noble friend Lady Owen. It has taken me about 10 years to begin to understand how this House works and it has taken her about 10 minutes.
I want to pursue something which bewilders me about this set of amendments, which is the amendment tabled by the noble Baroness, Lady Gohir. I do not understand why we are talking about a different Bill in relation to audio fakes. Audio has been with us for many years, yet video deepfakes are relatively new. Why are we talking about a different Bill in relation to audio deepfakes?
My Lords, this has been a very interesting debate. I too congratulate the noble Baroness, Lady Owen, on having brought forward these very important amendments. It has been a privilege to be part of her support team and she has proved an extremely persuasive cross-party advocate, including in being able to bring out the team: the noble Baroness, Lady Kidron, the noble Lord, Lord Pannick, who has cross-examined the Minister, and the noble Lord, Lord Stevenson. There is very little to follow up on what noble Lords have said, because the Minister now knows exactly what he needs to reply to.
I was exercised by this rather vague issue of whether the elements that were required were going to come back at Third Reading or in the Commons. I did not think that the Minister was specific enough in his initial response. In his cross-examination, the noble Lord, Lord Pannick, really went through the key elements that were required, such as the no intent element, the question of reasonable excuse and how robust that was, the question of solicitation, which I know is very important in this context, and the question of whether it is really an international law matter. I have had the benefit of talking to the noble Lord, Lord Pannick, and surely the mischief is delivered and carried out here, so why is that an international law issue? There is also the question of deletion of data, which the noble Lord has explained pretty carefully, and the question of timing of knowledge of the offence having been committed.
The Minister needs to describe the stages at which those various elements are going to be contained in a government amendment. I understand that there may be a phasing, but there are a lot of assurances. As the noble Lord, Lord Stevenson, said, is it six or seven? How many assurances are we talking about? I very much hope that the Minister can see the sentiment and the importance we place on his assurances on these amendments, so I very much hope he is going to be able to give us the answers.
In conclusion, as the noble Baroness, Lady Morgan, said—and it is no bad thing to be able to wheel on a former Secretary of State at 9 o’clock in the evening—there is a clear link between gender-based violence and image-based abuse. This is something which motivates us hugely in favour of these amendments. I very much hope the Minister can give more assurance on the audio side of things as well, because we want future legislation to safeguard victims, improve prosecutions and deter potential perpetrators from committing image-based and audio-based abuse crimes.
I thank the Minister and my noble friend Lady Owen for bringing these amendments to your Lordships’ House. Before I speak to the substance of the amendments, I join others in paying tribute to the tenacity, commitment and skill that my noble friend Lady Owen has shown throughout her campaign to ban these awful practices. She not only has argued her case powerfully and persuasively but, as others have remarked, seems to have figured out the machinery of this House in an uncanny way. Whatever else happens, she has the full support of these Benches.
I am pleased that the Government have engaged constructively with my noble friend and are seeking to bring this back at Third Reading. The Minister has been asked some questions and we all look forward with interest to his responses. I know from the speeches that we have heard that I am not alone in this House in believing that we have an opportunity here and now to create these offences, and we should not delay. For the sake of the many people who have been, and will otherwise be, victims of the creation of sexually explicit deepfakes, I urge the Government to continue to work with my noble friend Lady Owen to get this over the line as soon as possible.
I support the amendment, to which I have attached my name, along with the noble Lord, Lord Bassam, and the noble Earl, Lord Clancarty. I declare my interest as a member of DACS, the Design and Artists Copyright Society, and I, too, thank the Minister for meeting us prior to this debate.
Today’s digital landscape presents unique and pressing challenges for visual artists that we can no longer ignore. A 2022 YouGov survey commissioned by DACS uncovered a revealing paradox in our digital culture. While 75% of people regularly access cultural content at least three times a week, with 63% downloading it for free, an overwhelming 72% of the same respondents actively support compensating artists for digital sharing of their work. These figures paint a stark picture of the disconnect between the public’s consumption habits and their ethical convictions about fair compensation.
The Netherlands offers a compelling blueprint for change through DACS’ partner organisation Pictoright. Its innovative private copying scheme has successfully adapted to modern consumption habits while protecting artists’ interests. Consider a common scenario in museums: visitors now routinely photograph artworks instead of purchasing traditional postcards. Under Pictoright’s system, artists receive fair compensation for these digital captures, demonstrating that we can embrace the convenience of digital access without sacrificing creators’ right to earn from their work. This proven model shows that the tension between accessibility and fair compensation is not insurmountable.
The smart fund offers a similar balanced solution for the UK. This approach would protect our cultural ecosystem while serving the interests of creators, platforms and the public alike. I hope the Government will look favourably upon this scheme.
My Lords, I thank the noble Lord, Lord Bassam, for retabling his Committee amendment, which we did not manage to discuss. Sadly, it always appears to be discussed rather late in the evening, but I think that the time has come for this concept and I am glad that the Government are willing to explore it.
I will make two points. Many countries worldwide, including in the EU, have their own version of the smart fund to reward creators and performers for the private copy and use of their works and performances. Our own CMS Select Committee found that, despite the creative industries’ economic contribution—about which many noble Lords have talked—many skilled and successful professional creators are struggling to make a living from their work. The committee recommended that
“the Government work with the UK’s creative industries to introduce a statutory private copying scheme”.
This has a respectable provenance and is very much wanted by the collecting societies ALCS, BECS, Directors UK and DACS. Their letter said that the scheme could generate £250 million to £300 million a year for creatives, at no cost to the Government or to the taxpayer. What is not to like? They say that similar schemes are already in place in 45 countries globally, including most of Europe, and many of them include an additional contribution to public cultural funding. That could be totally game-changing. I very much hope that there is a fair wind behind this proposal.
My Lords, I thank the noble Lord, Lord Bassam of Brighton, for laying this amendment and introducing the debate on it.
As I understand it, a private copying levy is a surcharge on the price of digital content. The idea is that the money raised from the surcharge is either redistributed directly to rights holders to compensate them for any loss suffered because of copies made under the private copying exceptions or contributed straight to other cultural events. I recognise what the noble Lord is seeking to achieve and very much support his intent.
I have two concerns. First—it may be that I have misunderstood it; if so, I would be grateful if the noble Lord would set me straight—it sounds very much like a new tax of some kind is being raised, albeit a very small one. Secondly, those who legitimately pay for digital content end up paying twice. Does this not incentivise more illegal copying?
We all agree how vital it is for those who create products of the mind to be fairly rewarded and incentivised for doing so. We are all concerned by the erosion of copyright or IP caused by both a global internet and increasingly sophisticated AI. Perhaps I could modestly refer the noble Lord to my Amendment 75 on digital watermarking, which I suggest may be a more proportionate means of achieving the same end or at least paving the way towards it. For now, we are unable to support Amendment 57 as drafted.
My Lords, I very much encourage the Government to go down this road. Everyone talks about the NHS just because the data is there and organised. If we establish a structure like this, there are other sources of data that we could develop to equivalent value. Education is the obvious one. What works in education? We have huge amounts of data, but we do nothing with it—both in schools and in higher education. What is happening to biodiversity? We do not presently collect the data or use it in the way we could, but if we had that, and if we took advantage of all the people who would be willing to help with that, we would end up with a hugely valuable national resource.
HMRC has a lot of information about employment and career patterns, none of which we use. We worry about what is happening and how we can improve seaside communities, but we do not collect the data which would enable us to do it. We could become a data-based society. This data needs guarding because it is not for general use—it is for our use, and this sort of structure seems a really good way of doing it. It is not just the NHS—there is a whole range of areas in which we could greatly benefit the UK.
My Lords, all our speakers have made it clear that this is a here-and-now issue. The context has been set out by noble Lords, whether it is Stargate, the AI Opportunities Action Plan or, indeed, the Palantir contract with the NHS. This has been coming down the track for some years. There are Members on the Government Benches, such as the noble Lords, Lord Mitchell and Lord Hunt of Kings Heath, who have been telling us that we need to work out a fair way of deriving a proper financial return for the benefits of public data assets, and Future Care Capital has done likewise. The noble Lord, Lord Freyberg, has form in this area as well.
The Government’s plan for the national data library and the concept of sovereign data assets raises crucial questions about how to balance the potential benefits of data sharing with the need to protect individual rights, maintain public trust and make sure that we achieve proper value for our public digital assets. I know that the Minister has a particular interest in this area, and I hope he will carry forward the work, even if this amendment does not go through.
I thank the noble Baroness, Lady Kidron, for moving her amendment. The amendments in this group seek to establish a new status for data held in the public interest, and to establish statutory oversight rules for a national data library. I was pleased during Committee to hear confirmation from the noble Baroness, Lady Jones of Whitchurch, that the Government are actively developing their policy on data held in the public interest and developing plans to use our data assets in a trustworthy and ethical way.
We of course agree that we need to get this policy right, and I understand the Government’s desire to continue their policy development. Given that this is an ongoing process, it would be helpful if the Government could give the House an indication of timescales. Can the Minister say when the Government will be in a position to update the House on any plans to introduce a new approach to data held in the public interest? Will the Government bring a statement to this House when plans for a national data library proceed to the next stage?
I suggest that a great deal of public concern about nationally held datasets is a result of uncertainty. The Minister was kind enough to arrange a briefing from his officials yesterday, and this emerged very strongly. There is a great deal of uncertainty about what is being proposed. What are the mechanics? What are the risks? What are the costs? What are the eventual benefits to UK plc? I urge the Minister, as and when he makes such a statement, to bring a maximum of clarity about these fundamental questions, because I suspect that many members of the public will find this deeply reassuring.
Given the stage the Government are at with these plans, we do not think it would be appropriate to legislate at this stage, but we of course reserve the right to revisit this issue in the future.
My Lords, we have had some discussion already this week on data centres. The noble Lord, Lord Holmes, is absolutely right to raise this broad issue, but I was reassured to hear from the noble Lord, Lord Hunt of Kings Heath, earlier in the week that the building of data centres, their energy requirements and the need for them may well be included in NESO’s strategic spatial energy plan and the centralised strategic network plan. Clearly, in one part of the forest there is a great deal of discussion about energy use and the energy needs of data centres. What is less clear and, in a sense, reflected in the opportunities plan is exactly how the Government will decide the location of these data centres, which clearly—at least on current thinking about the needs of large language models, AI and so on—will be needed. It is about where they will be and how that will be decided. If the Minister can cast any light on that, we would all be grateful.
I thank my noble friend Lord Holmes of Richmond for moving this amendment. Amendment 59 is an important amendment that addresses some of the key issues relating to large language models. We know that large language models have huge potential, and I agree with him that the Government should keep this under review. Perhaps the noble Baroness, Lady Jones of Whitchurch, would be willing to update the House on the Government’s policy on large language model regulation on her return.
Data centre availability is another emerging issue as we see growth in this sector. My noble friend is absolutely right to bring this to the attention of the House. We firmly agree that we will have a growing need for additional data centres. In Committee, the noble Baroness, Lady Jones, did not respond substantively to Amendments 60 and 66 from my noble friend on data centres, which I believe was—not wholly unreasonably—to speed the Committee to its conclusion just before Christmas. I hope the Minister can give the House a fuller response on this today, as it would be very helpful to hear what the Government’s plans are on the need for additional data centres.
My Lords, I spoke on this before, and I will repeat what I said previously. The only way out of this one is to have two fields against someone: one that we will call “sex” and another that we will call “gender”. I will use the terminology of the noble Lord, Lord Lucas, for this. “Sex” is what you are biologically and were born as, and it cannot be changed. There are instances where we need to use that field, particularly when it comes to delivering medicine to people—knowing how you treat them medically—and, possibly, in other things such as sports. There are one or two areas where we need to know what they are biologically.
Then we have another field which is called “gender”. In society, in many cases, we wish that people did not have to go around saying that they are not what they were born but what they want to be—but I do not have a problem with that. We could use that field where society decides that people can use it, such as on passports, other documents and identity cards—all sorts of things like that. It does not matter; I am not worried about what someone wants to call themselves or how they want to present themselves to society.
Researchers will have the “sex” field, and they can carry out medical research—they can find out about all the different things related to that—and, societally, we can use the other field for how people wish to project themselves in public. That way we can play around with what you are allowed to use in what scenarios; it allows you to do both. What we need is two fields; it will solve a lot of problems.
My Lords, it is clear that Amendment 67 in the name of the noble Lord, Lord Lucas, is very much of a piece with the amendments that were debated and passed last week. On these Benches, our approach will be exactly the same. Indeed, we can rely on what the Minister said last week, when he gave a considerable assurance:
“I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important”.—[Official Report, 21/1/25; col. 1620.]
That is, the work of the Central Digital and Data Office. We are content to rely on his assurance.
I thank my noble friend Lord Lucas for bringing his Amendment 67, which builds on his previous work to ensure accuracy of data. On these Benches, we agree wholeheartedly with him that the information we have access to—for example, to verify documents—must be accurate. His amendment would allow the Secretary of State to make regulations establishing definitions under the Bill for the purposes of digital verification services, registers of births and deaths, and other provisions. Crucially, this would enable the Government to put measures in place to ensure the consistency of the definitions of key personal attributes, including sex. We agree that consistency and accuracy of data is vital. We supported him on the first day at Report, and, if he pushes his amendment to a Division, we will support him today.
My Lords, as so often, I listened with awe to the noble Baroness. Apart from saying that I agree with her wholeheartedly, which I do, there is really no need for me to add anything, so I will not.
My Lords, I too am lost in admiration for the noble Baroness, Lady Kidron—still firing on all cylinders at this time of night. Current law is clearly out of touch with the reality of computer systems. It assumes an untruth about computer reliability that has led to significant injustice. We know that that assumption has contributed to miscarriages of justice, such as the Horizon scandal.
Unlike the amendment in Committee, Amendment 68 does not address the reliability of computers themselves but focuses rather on the computer evidence presented in court. That is a crucial distinction as it seeks to establish a framework for evaluating the validity of the evidence presented, rather than questioning the inherent reliability of computers. We believe that the amendment would be a crucial step towards ensuring fairness and accuracy in legal proceedings by enabling courts to evaluate computer evidence effectively. It offers a balanced approach that would protect the interests of both the prosecution and the defence, ensuring that justice is served. The Government really must move on this.
I thank the noble Baroness, Lady Kidron, for her amendments. The reliability of computer-based evidence, needless to say, has come into powerful public focus following the Post Office Horizon scandal and the postmasters’ subsequent fight for justice. As the noble Baroness has said previously and indeed tonight, this goes far beyond the Horizon scandal. We accept that there is an issue with the way in which the presumption that computer evidence is reliable is applied in legal proceedings.
The Government accepted in Committee that this is an issue. While we have concerns about the way that the noble Baroness’s amendment is drafted, we hope the Minister will take the opportunity today to set out clearly the work that the Government are doing in this area. In particular, we welcome the Government’s recently opened call for evidence, and we hope Ministers will work quickly to address this issue.
My Lords, I have the very dubious privilege of moving the final amendment on Report to this Bill. This is a probing amendment and the question is: what does retrospectivity mean? The noble Lord, Lord Cameron of Lochiel, asked a question of the noble Baroness, Lady Jones, in Committee in December:
“Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold?”
She replied that
“the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively”.—[Official Report, 10/12/24; cols. GC 435-437.]
But the question is not really whether the lawfulness is retrospective, but whether the changes made in the new law can be applied to any personal data previously collected and already held on the commencement date of the Act—so that is the exam question.
It is indeed getting late. I thank the noble Lord, Lord Clement-Jones, for moving his amendment, and I really will be brief.
We do not oppose the government amendment in the name of the noble Lord, Lord Vallance. I think the Minister should be able to address the concerns raised by the noble Lord, Lord Clement-Jones, given that the noble Lord’s amendment merely seeks clarification on the retrospective application of the provisions of the Bill within a month of the coming into force of the Act. It seems that the Government could make this change unnecessary by clarifying the position today. I hope the Minister will be able to address this in his remarks.
I will speak first to Amendment 76. I reassure noble Lords that the Government do not believe that this amendment has a material policy effect. Instead, it simply corrects the drafting of the Bill and ensures that an interpretation provision in Clause 66 commences on Royal Assent.
Amendment 74, in the name of the noble Lord, Lord Clement-Jones, would require the Secretary of State to publish a statement setting out whether any provisions in the Bill apply to controllers and processors retrospectively. Generally, provisions in Bills apply from the date of commencement unless there are strong policy or legal reasons for applying them retrospectively. The provisions in this Bill follow that general rule. For instance, data controllers will only be able to rely on the new lawful ground of recognised legitimate interests introduced by Clause 70 in respect of new processing activities in relation to personal data that take place after the date of commencement.
I recognise that noble Lords might have questions as to whether any of the Bill’s clauses can apply to personal data that is already held. That is the natural intent in some areas and, where appropriate, commencement regulations will provide further clarity. The Government intend to publish their plans for commencement on GOV.UK in due course and the ICO will also be updating its regulatory guidance in several key areas to help organisations prepare. We recognise that there can be complex lifecycles around the use of personal data and we will aim to ensure that how and when any new provisions can be relied on is made clear as part of the implementation process.
I hope that explanation goes some way to reassuring the noble Lord and that he will agree to withdraw his amendment.
My Lords, I thank the Minister. There is clearly no easy answer. I think we were part-expecting a rather binary answer, but clearly there is not one, so we look forward to the guidance.
But that is a bit worrying for those who have to tackle these issues. I am thinking of the data protection officers who are going to grapple with the Bill in its new form and I suspect that that is going to be quite a task. In the meantime, I withdraw the amendment.
(5 months, 3 weeks ago)
Lords Chamber
My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.
I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes over the coming months. I reiterate the point made by many noble Lords in Committee that, if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.
Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.
The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.
The issue of how the data Bill fails to address AI—and how the AI Opportunities Action Plan, and the government response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.
My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.
My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.
A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.
There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; they can promote innovation by facilitating data-sharing; and they can promote innovation in the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.
It is high time we move forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from that.
I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party, to act on the group’s behalf in their data use.
There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.
I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.
Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.
My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.
Again before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?
I thank the noble Lord for that request, and I am sure my officials would be willing to do that.
My Lords, I support my noble friend. I have a confession to make. Before this Bill came up, I foolishly thought that sex and gender were the same thing. I have discovered that they are not. Gender is not a characteristic defined in UK law. I believe that you are born with a biological sex, as being male or female, and that some people will choose, or need, to have a gender reassignment or to identify as a different gender. I thank the charity Sex Matters, which works to provide clarity on this issue of sex in law.
As my noble friend Lord Lucas said, the digital verification system currently operates on the basis of chosen gender, not of sex at birth. You can change your records on request without even having a gender recognition certificate. That means that, over the last five years, at least 3,000 people have changed their passports to show the wrong sex. Over the last six years, at least 15,000 people have changed their driving licences. The NHS has no records of how many people now have different sexes recorded from those they had at birth. It is thought that perhaps 100,000 people have one sex indicated in one record and a different sex in another. We cannot go on like that.
The consequences of this are really concerning. It means people with mismatched identities risk being flagged up as a synthetic identity risk. It means authorities with statutory safeguarding responsibilities will not be able to assess the risk that they are trying to deal with. It means that illnesses may be misdiagnosed and treatments misprescribed if the wrong sex is stated in someone’s medical records. The police will be unable to identify people if they are looking in the wrong records. Disclosure and Barring Service checks may fail to match individuals with the wrong sex. I hope that the Government will look again at correcting this. It is a really important issue.
My Lords, I will speak to Amendments 7 and 9. Amendment 7 would require the Secretary of State to lay the DVS trust framework before Parliament. Given the volume of sensitive data that digital ID providers will be handling, it is crucial for Parliament to oversee the framework rules governing digital verification service providers.
The amendment is essentially one that was tabled in Committee by the noble Viscount, Lord Camrose. I thought that he expressed this well in Committee, emphasising that such a fundamental framework demands parliamentary approval for transparency and accountability, regardless of the document’s complexity. This is an important framework with implications for data privacy and security, and should not be left solely to the discretion of the Secretary of State.
The DPRRC in its ninth report and the Constitution Committee in its third report of the Session also believed the DVS trust framework should be subject to parliamentary scrutiny; the former because it has legislative effect, and it recommended using the affirmative procedure, which would require Parliament to actively approve the framework, as the Secretary of State has significant power without adequate parliamentary involvement. The latter committee, the Constitution Committee, said:
“We reiterate our statement from our report on the Data Protection and Digital Information Bill that ‘[d]ata protection is a matter of great importance in maintaining a relationship of trust between the state and the individual. Access to personal data is beneficial to the provision of services by the state and assists in protecting national security. However, the processing of personal data affects individual rights, including the right to respect for private life and the right to freedom of expression. It is important that the power to process personal data does not become so broad as to unduly limit those rights’”.
Those views are entirely consistent with the committee’s earlier stance on a similar provision in the previous Data Protection and Digital Information Bill. That was why it was so splendid that the noble Viscount tabled that amendment in Committee. It was like a Damascene conversion.
The noble Baroness, Lady Jones, argued in Committee and in correspondence that the trust framework is a highly technical document that Parliament might find difficult to understand. That is a bit of a red rag to a bull. However, this argument fails to address the core concerns about democratic oversight. The framework aims to establish a trusted digital identity marketplace by setting requirements for providers to gain certification as trusted providers.
I am extremely grateful to the Minister, the Bill team and the department for allowing officials to give the noble Viscount, Lord Camrose, and me a tutorial on the trust framework. It depends heavily on being voluntary in nature, with the UK Accreditation Service essentially overseeing the certifiers, such as BSI, Kantara and the Age Check Certification Scheme, certifying the providers, with ISO/IEC 17065 as the governing standard.
Compliance is assured through the certification process, where services are assessed against the framework rules by independent conformity assessment bodies accredited by the UK Accreditation Service. The trust framework establishes rules and standards for digital identity verification but does not directly contain specific provision for regulatory oversight or for redress mechanisms such as a specific ombudsman service, industry-led dispute resolution or set contract terms for consumer redress or enforcement powers. The Government say, however, that they intend to monitor the types of complaints received. Ultimately, the scope of the framework is limited to the rules providers must follow in order to remain certificated, and it does not address governance matters.
Periodic certification alone is not enough to ensure ongoing compliance, and this highlights the lack of an independent mechanism to hold the Secretary of State accountable. The noble Baroness, Lady Jones, stated in Committee that the Government preferred a light-touch approach to regulating digital verification services. She believed that excessive parliamentary scrutiny would hinder innovation and flexibility in this rapidly evolving sector.
The Government have consistently emphasised that they have no plans to introduce mandatory digital IDs or ID cards. The focus is on creating a secure and trusted system that gives citizens more choice and control over their data. The attributes trust framework is a crucial step towards achieving the goal of a secure, trusted and innovative digital identity market—all the more reason to get the process for approval right.
These services will inevitably be high-profile. Digital ID is a sensitive area which potentially also involves age verification. These services could have a major impact on data privacy and security. Public debate on such a critical issue is crucial to build trust and confidence in these systems. Laying the DVS trust framework before Parliament would allow for a wider range of voices and perspectives to be heard, ensuring a more robust and democratic approval process.
I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support for this being such an important thing to make life easier for people.
I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.
However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.
I am sorry to interrupt the Minister. He is equating digital identity theft to fraud, and that is not always the case. Is that the advice that he has received?
The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything falls under the classification of fraud.
I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?
For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity assessment bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.
On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.
On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards for key entities and their attributes, to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also commenced via the domain expert group on the person entity, which has representatives from the Home Office, HMRC, the Office for National Statistics—importantly—NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.
The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.
My Lords, I support the conclusions of the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, and I beg leave to seek the opinion of the House.
My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.
My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.
In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.
There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say whether hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.
My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.
The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.
My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by a wonderful person from Gray’s Inn, Sir Dingle Foot, who was responsible for drafting the constitution of Uganda’s independence, he used a phrase which struck me and has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks that if there is to be scientific work, it must be conducted “in the public interest”. The law does not express itself for its own sake; it does so for the public, as public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.
My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.
This one caught my eye in Committee. I felt suddenly, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government’s DNA—that is, to bring forward the right solution to the problems that we have been seeking to change in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials and people went away with it and came back with other versions, it got worse and not better. How children are dealt with, and how important it is to make sure that they are prioritised, appears to be one of those problems.
The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible way in which we drafted the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found a way of finding the right words that bridged the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.
When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we all would say does not quite go far enough; it needs a bit more, but not that much more—to make it meet with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 18 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explained, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.
Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.
There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.
That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.
As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.
I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.
On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.
However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. While these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.