My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.
I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes, over the coming months. But I reiterate the point made by many noble Lords in Committee: if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.
Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.
The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.
The issue of how the data Bill fails to address AI—and how the AI Opportunities Action Plan, and the government response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.
My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals, going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—but, nevertheless, the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of the state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals, bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee, and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies, as set out in Clauses 11 and 12, and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk that it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here would be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping to generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess, an assumption or an inference. Should we therefore require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I would personally welcome this. I look forward to hearing the Minister’s responses.
My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.
A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.
There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address the complexity of data law and protect individual rights; and they can promote innovation by facilitating data-sharing and the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.
It is high time we moved forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from them.
I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party, to act on the group’s behalf in their data use.
There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.
I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.
Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.
My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.
Again, before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?
I thank the noble Lord for that request, and I am sure my officials would be willing to do that.
My Lords, I support my noble friend. I have a confession to make. Before this Bill came up, I foolishly thought that sex and gender were the same thing. I have discovered that they are not. Gender is not a characteristic defined in UK law. I believe that you are born with a biological sex, male or female, and that some people will choose, or need, to have gender reassignment or to identify as a different gender. I thank the charity Sex Matters, which works to provide clarity on the issue of sex in law.
As my noble friend Lord Lucas said, the digital verification system currently operates on the basis of chosen gender, not of sex at birth. You can change your records on request without even having a gender recognition certificate. That means that, over the last five years, at least 3,000 people have changed their passports to show the wrong sex. Over the last six years, at least 15,000 people have changed their driving licences. The NHS has no records of how many people now have different sexes recorded from those they had at birth. It is thought that perhaps 100,000 people have one sex indicated in one record and a different sex in another. We cannot go on like that.
The consequences of this are really concerning. It means people with mismatched identities risk being flagged up as a synthetic identity risk. It means authorities with statutory safeguarding responsibilities will not be able to assess the risk that they are trying to deal with. It means that illnesses may be misdiagnosed and treatments misprescribed if the wrong sex is stated in someone’s medical records. The police will be unable to identify people if they are looking in the wrong records. Disclosure and Barring Service checks may fail to match individuals with the wrong sex. I hope that the Government will look again at correcting this. It is a really important issue.
My Lords, I will speak to Amendments 7 and 9. Amendment 7 would require the Secretary of State to lay the DVS trust framework before Parliament. Given the volume of sensitive data that digital ID providers will be handling, it is crucial for Parliament to oversee the framework rules governing digital verification service providers.
The amendment is essentially one that was tabled in Committee by the noble Viscount, Lord Camrose. I thought that he expressed this well in Committee, emphasising that such a fundamental framework demands parliamentary approval for transparency and accountability, regardless of the document’s complexity. This is an important framework with implications for data privacy and security, and should not be left solely to the discretion of the Secretary of State.
The DPRRC in its ninth report and the Constitution Committee in its third report of the Session also believed that the DVS trust framework should be subject to parliamentary scrutiny. The former, because the framework has legislative effect, recommended using the affirmative procedure, which would require Parliament actively to approve the framework, since otherwise the Secretary of State would have significant power without adequate parliamentary involvement. The latter committee, the Constitution Committee, said:
“We reiterate our statement from our report on the Data Protection and Digital Information Bill that ‘[d]ata protection is a matter of great importance in maintaining a relationship of trust between the state and the individual. Access to personal data is beneficial to the provision of services by the state and assists in protecting national security. However, the processing of personal data affects individual rights, including the right to respect for private life and the right to freedom of expression. It is important that the power to process personal data does not become so broad as to unduly limit those rights’”.
Those views are entirely consistent with the committee’s earlier stance on a similar provision in the previous Data Protection and Digital Information Bill. That was why it was so splendid that the noble Viscount tabled that amendment in Committee. It was like a Damascene conversion.
The noble Baroness, Lady Jones, argued in Committee and in correspondence that the trust framework is a highly technical document that Parliament might find difficult to understand. That is a bit of a red rag to a bull. However, this argument fails to address the core concerns about democratic oversight. The framework aims to establish a trusted digital identity marketplace by setting requirements for providers to gain certification as trusted providers.
I am extremely grateful to the Minister, the Bill team and the department for allowing officials to give the noble Viscount, Lord Camrose, and me a tutorial on the trust framework. It depends heavily on being voluntary in nature, with the UK Accreditation Service essentially overseeing the certifiers, such as BSI, Kantara and the Age Check Certification Scheme, which certify the providers, with ISO 17065 as the governing standard.
Compliance is assured through the certification process, whereby services are assessed against the framework rules by independent conformity assessment bodies accredited by the UK Accreditation Service. The trust framework establishes rules and standards for digital identity verification, but it does not directly contain specific provision for regulatory oversight, nor for redress mechanisms such as a specific ombudsman service, industry-led dispute resolution, set contract terms for consumer redress or enforcement powers. The Government say, however, that they intend to monitor the types of complaints received. Ultimately, the scope of the framework is limited to the rules providers must follow in order to remain certificated; it does not address governance matters.
Periodic certification alone is not enough to ensure ongoing compliance, and this highlights the lack of an independent mechanism to hold the Secretary of State accountable. The noble Baroness, Lady Jones, stated in Committee that the Government preferred a light-touch approach to regulating digital verification services. She believed that excessive parliamentary scrutiny would hinder innovation and flexibility in this rapidly evolving sector.
The Government have consistently emphasised that they have no plans to introduce mandatory digital IDs or ID cards. The focus is on creating a secure and trusted system that gives citizens more choice and control over their data. The attributes trust framework is a crucial step towards achieving the goal of a secure, trusted and innovative digital identity market—all the more reason to get the approval process right.
These services will inevitably be high-profile. Digital ID is a sensitive area which potentially also involves age verification. These services could have a major impact on data privacy and security. Public debate on such a critical issue is crucial to build trust and confidence in these systems. Laying the DVS trust framework before Parliament would allow for a wider range of voices and perspectives to be heard, ensuring a more robust and democratic approval process.
I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and their interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support and for recognising how important these services are in making life easier for people.
I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.
However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.
I am sorry to interrupt the Minister. He is equating digital identity theft with fraud, and that is not always the case. Is that the advice that he has received?
The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything of this kind falls within the classification of fraud.
I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?
For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity accreditation bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.
On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.
On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards for key entities and their attributes, to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also commenced via the domain expert group on the person entity, which has representation from the Home Office, HMRC, the Office for National Statistics—importantly—NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.
The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.
My Lords, I support the conclusions of the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, and I beg leave to seek the opinion of the House.
My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week and will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.
My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.
In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.
There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say whether hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.
My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.
The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.
My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by Sir Dingle Foot, a wonderful person from Gray’s Inn who was responsible for drafting Uganda’s independence constitution, he used a phrase which struck me and has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks to ensure that, if there is to be scientific work, it is conducted “in the public interest”. Law does not express itself simply for itself; it speaks for the public, as public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.
My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.
This one caught my eye in Committee. I felt suddenly, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government’s DNA—that is, to bring forward the right solution to the problems that we have been seeking to address in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials and people went away with it and came back with other versions, it got worse and not better. How children are dealt with, and how important it is to make sure that they are prioritised, appears to be one of those problems.
The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible way in which we drafted the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found a way of finding the right words that bridged the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.
When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we all would say does not quite go far enough; it needs a bit more, but not that much more—to make it meet with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 18 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explained, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so that we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.
Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.
There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.
That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.
As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.
I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.
On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.
However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.
I apologise for interrupting the Minister, in what sounded almost like full flow. I am sure that he was so eager to move his amendment.
In moving Amendment 17, I will speak also to Amendment 21. These aim to remove the Secretary of State’s power to override primary legislation and modify key aspects of the UK data protection law via statutory instruments. They are similar to those proposed by me to the previous Government’s Data Protection and Digital Information Bill, which the noble Baroness, Lady Jones of Whitchurch, then in opposition, supported. These relate to Clauses 70(4) and 71(5).
There are a number of reasons to accept these amendments. The Delegated Powers and Regulatory Reform Committee has expressed concerns about the broad scope of the Secretary of State’s powers, as it did previously in relation to the DVS scheme. It recommended removing the power from the previous Bill and, in its ninth report, it maintains this view for the current Bill. The Constitution Committee has said likewise; I will not read out what it said at the time, but I think all noble Lords know that both committees were pretty much on the same page.
The noble Baroness, Lady Jones, on the previous DPDI Bill, argued that there was no compelling reason for introducing recognised legitimate interests. On these Benches, we agree. The existing framework already allows for data sharing with the public sector and data use for national security, crime detection and safeguarding vulnerable individuals. However, the noble Baroness, in her ministerial capacity, argued that swift changes might be needed—hence the necessity for the Secretary of State’s power. Nevertheless, the DPRRC’s view is that the grounds for the lawful processing of personal data are fundamental and should not be subject to modification by subordinate legislation.
The letter from the Minister, the noble Lord, Lord Vallance, to the Constitution Committee and the DPRRC pretty much reiterates those arguments. I will not go through all of it again, but I note, in closing, that in his letter he said:
“I hope it will reassure the Committee that the power will be used only when necessary and in the public interest”.
He could have come forward with an amendment to that effect at any point in the passage of the Bill, but he has not. I hope that, on reflection—in the light of both committees’ repeated recommendations, the potential threats to individual privacy and data adequacy, and the lack of strong justification for these powers—the Minister will accept these two amendments. I beg to move.
My Lords, I must inform the House that if Amendment 17 is agreed to, I cannot call Amendment 18 for reasons of pre-emption.
My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.
Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.
In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23.1 of the UK GDPR.
The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.
I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23.1 of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.
My Lords, I thank the Minister for that considered reply. It went into more detail than the letter he sent to the two committees, so I am grateful for that, and it illuminated the situation somewhat. But at the end of the day, the Minister is obviously intent on retaining the regulation-making power.
I thank the noble Viscount, Lord Camrose, for his support—sort of—in principle. I am not quite sure where that fitted; it was post-ministerial language. I think he needs to throw off the shackles of ministerial life and live a little. These habits die hard but in due course, he will come to realise that there are benefits in supporting amendments that do not give too much ministerial power.
Turning to one point of principle—I am not going to press either amendment—it is a worrying trend that both the previous Government and this Government seem intent on simply steamrollering through powers for Secretaries of State in the face of pretty considered comment by House of Lords committees. This trend has been noted, first for skeletal Bills and, secondly, for Bills that, despite not being skeletal, include a lot of regulation-making powers for Secretaries of State, and Henry VIII powers. So I just issue a warning that we will keep returning to this theme, and we will keep supporting and respecting the committees of this House, which spend a great deal of time scrutinising secondary legislation and warning of overweening executive power. In the meantime, I beg leave to withdraw Amendment 17.
My Lords, I do not think the noble Baroness, Lady Harding, lost the audience at all; she made an excellent case. Before speaking in support of the noble Baroness, I should say, “Blink, and you lose a whole group of amendments”. We seem to have completely lost sight of the group starting with Amendment 19—I know the noble Lord, Lord Holmes, is not here—and including Amendments 23, 74 and government Amendment 76, which seems to have been overlooked. I suggest that we degroup next week and come back to Amendments 74 and 76. I do not know what will happen to Amendment 23; I am sure there is a cunning plan on the Opposition Front Bench to reinstate that in some shape or form. I just thought I would gently point that out, since we are speeding along and forgetting some of the very valuable amendments that have been tabled.
I very much support, as I did in Committee, what the noble Baroness, Lady Harding, said about Amendment 24, which aims to clarify the use of open electoral register data for direct marketing. The core issue is the interpretation of Article 14 of the GDPR, specifically regarding the disproportionate effort exemption. The current interpretation, influenced by recent tribunal rulings, suggests that companies using open electoral register—OER—data would need to notify every individual whose data is used, even if they have not opted out. As the noble Baroness, Lady Harding, implied, notifying millions of individuals who have not opted out is unnecessary and burdensome. Citizens are generally aware of the OER system, and those who do not opt out reasonably expect to receive direct marketing materials. The current interpretation leads to excessive, unhelpful notifications.
There are issues about financial viability. Requiring individual notifications for the entire OER would be financially prohibitive for companies, potentially leading them to cease using the register altogether. On respect for citizens’ choice, around 37% of voters choose not to opt out of OER use for direct marketing, indicating their consent to such use. The amendment upholds this choice by exempting companies from notifying those individuals, which aligns with the GDPR’s principle of respecting data subject consent.
On clarity and certainty, Amendment 24 provides clear exemptions for OER data use, offering legal certainty for companies while maintaining data privacy and adequacy. This addresses the concerns about those very important tribunal rulings creating ambiguity and potentially disrupting legitimate data use. In essence, Amendment 24 seeks to reconcile the use of OER data for direct marketing with the principles of transparency and data subject rights. On that basis, we on these Benches support it.
I turn to my amendment, which seeks a soft opt-in for charities. As we discussed in Committee, a soft opt-in in Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 allows organisations to send electronic mail marketing to existing customers without their consent, provided that the communication is for similar products and services and the messages include an “unsubscribe” link. The soft opt-in currently does not apply to non-commercial organisations such as charities and membership organisations. The Data & Marketing Association estimates that extending the soft opt-in to charities would
“increase … annual donations in the UK by £290 million”.
Extending the soft opt-in as proposed in both the Minister’s and my amendment would provide charities with a level playing field, as businesses have enjoyed this benefit since the introduction of the Privacy and Electronic Communications Regulations. Charities across the UK support this change. For example, the CEO of Mind stated:
“Mind’s ability to reach people who care about mental health is vital. We cannot deliver life changing mental health services without the financial support we receive from the public”.
Oxfam’s individual engagement director noted:
“It’s now time to finally level the playing field for charities too and to allow them to similarly engage their passionate and committed audiences”.
Topically, too, this amendment is crucial to help charities overcome the financial challenges they face due to the cost of living crisis and the recent increase in employer national insurance contributions. So I am delighted, as I know many other charities will be, that the Government have proposed Amendment 49, which achieves the same effect as my Amendment 50.
My Lords, I declare an interest that my younger daughter works for a charity which will rely heavily on the amendments that have just been discussed by the noble Lord, Lord Clement-Jones.
I want to explain that my support for the amendment moved by the noble Baroness, Lady Harding, was not inspired by any quid pro quo for earlier support elsewhere—certainly not. Looking through the information she had provided, and thinking about the issue and what she said in her speech today, it seemed that there was an obvious injustice happening. It seemed wrong, in a period when we are trying to support growth, that we cannot see our way through it. It was in that spirit that I suggested we should push on with it and bring it back on Report, and I am very happy to support it.
My Lords, I will speak to Amendments 28, 29, 33, 34 and 36. I give notice that I will only speak formally to Amendment 33. For some reason, it seems to have escaped this group and jumped into the next one.
As we discussed in Committee, and indeed on its previous versions, the Bill removes the general prohibition on solely automated decisions and places the responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. The Bill also amends Article 22 of the GDPR so that protection against solely automated decision-making applies only to decisions made using sensitive data such as race, religion and health data. This means that decisions based on other personal data, such as postcode, nationality, sex or gender, would be subject to weaker safeguards, increasing the risk of unfair or discriminatory outcomes. This will allow more decisions with potentially significant impacts to be made without human oversight, even if they do not involve sensitive data. This represents a significant weakening of existing protection against unsafe automated decision-making. That is why I tabled Amendment 33 to leave out the whole clause.
However, the Bill replaces the existing Article 22 with Articles 22A to 22D, which redefine automated decisions and allow for solely automated decision-making in a broader range of circumstances. This change raises concerns about transparency and the ability of individuals to challenge automated decisions. Individuals may not be notified about the use of ADM, making it difficult to exercise their rights. Moreover, the Bill’s safeguards for automated decisions, particularly in the context of law enforcement, are weaker compared with the protections offered by the existing Article 22. This raises serious concerns about the potential for infringement of people’s rights and liberties in areas such as policing, where the use of sensitive data in ADM could become more prevalent. Additionally, the lack of clear requirements for personalised explanations about how ADM systems reach decisions further limits individuals’ understanding of and ability to challenge outcomes.
In the view of these Benches, the Bill significantly weakens safeguards around ADM, creates legal uncertainty due to vague definitions, increases the risk of discrimination, and limits transparency and redress for individuals—ultimately undermining public trust in the use of these technologies. I retabled Amendments 28, 29, 33 and 34 from Committee to address continuing concerns regarding these systems. The Bill lacks clear definitions of crucial terms such as “meaningful human involvement” and, similarly, “significant effect”, which are essential for determining the scope of protection. That lack of clarity could lead to varying interpretations and inconsistencies in application, creating legal uncertainty for individuals and organisations.
In Committee, the noble Baroness, Lady Jones, emphasised the Government’s commitment to responsible ADM and argued that, rather than defining meaningful human involvement in the Bill, the Secretary of State should define those terms through delegated legislation. However, that raises concerns about transparency and parliamentary oversight, as these are significant policy decisions. Predominantly automated decision-making should be included in Clause 80, as in Amendment 28, as a decision may lack meaningful human involvement and significantly impact individuals’ rights. The assertion by the noble Baroness, Lady Jones, that predominantly automated decisions inherently involve meaningful human oversight can be contested, particularly given the lack of a clear definition of such involvement in the Bill.
There are concerns that changes in the Bill will increase the risk of discrimination, especially for marginalised groups. The noble Baroness, Lady Jones, asserted in Committee that the data protection framework already requires adherence to the Equality Act. However, that is not enough to prevent algorithmic bias and discrimination in ADM systems. There is a need for mandatory bias assessments of all ADM systems, particularly those used in the public sector, as well as for greater transparency in how those systems are developed and deployed.
We have not returned to the fray on the algorithmic transparency recording standard—the ATRS—but it is clear that a statutory framework for it is necessary to ensure its effectiveness and to build trust in public sector AI. Despite the assurance by the noble Baroness, Lady Jones, that the ATRS is mandatory for government departments, its implementation relies on a cross-government policy mandate that lacks statutory backing and may prove insufficient to ensure the consistent and transparent use of algorithmic tools.
My Amendment 34 seeks to establish requirements for public sector organisations using ADM systems. Its aim is to ensure transparency and accountability in the use of these systems by requiring public authorities to publish details of the systems they use, including the purpose of the system, the data used and any mitigating measures to address risks. I very much welcome Amendment 35 from the noble Baroness, Lady Freeman, which would improve it considerably and which I have also signed. Will the ATRS do as good a job as that amendment?
Concerns persist about the accessibility and effectiveness of this mechanism for individuals seeking redress against potentially harmful automated decisions. A more streamlined and user-friendly process for challenging automated decisions is needed in the age of increasing ADM. The lack of clarity and specific provisions in the Bill raises concerns about its effectiveness in mitigating the risks posed by automated systems, particularly in safeguarding vulnerable groups such as children.
My Amendment 36 would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the Information Commissioner’s Office, or to clearly set out their reasoning as to why that is not required within six months of the Act passing. The amendment is aimed at addressing the ambiguity surrounding “meaningful human involvement” and ensuring that there is a clear understanding of what constitutes appropriate human oversight in ADM processes.
I am pleased that the Minister has promised a code of practice, but what assurance can he give regarding the forthcoming ICO code of practice on automated decision-making? How will it provide clear guidance on how to implement and interpret the safeguards for ADM, and will it address the definition of meaningful human involvement? What forms of redress will it require to be established? What level of transparency will be required? A code of practice offered by the Minister would be acceptable, provided that the Secretary of State did not have the sole right to determine the definition of meaningful human involvement. I therefore hope that my Amendment 29 will be accepted alongside Amendment 36, because it is important that the definition of such a crucial term should be developed independently, and with the appropriate expertise, to ensure that ADM systems are used fairly and responsibly, and that individual rights are adequately protected.
Amendments 31 and 32 from the Opposition Front Bench seem to me to have considerable merit, particularly Amendment 32, in terms of the nature of the human intervention. However, I confess to some bafflement as to the reasons for Amendment 26, which seeks to insert the OECD principles set out in the AI White Paper. Indeed, they were the G20 principles as well and are fully supportable in the context of an AI Bill, for instance, and I very much hope that they will form Clause 1 of a new AI Bill going forward. I am not going to go into great detail, but I wonder whether those principles are already effectively addressed in data protection legislation. If we are not careful, we are going to find a very confused regulator in these circumstances. So, although there is much to commend the principles as such, whether they are a practical proposition in a Bill of this nature is rather moot.
My Lords, I support Amendment 34 from the noble Lord, Lord Clement-Jones, and will speak to my own Amendment 35, which amends it. When an algorithm is being used to make important decisions about our lives, it is vital that everyone is aware of what it is doing and what data it is based on. On Amendment 34, I know from having had responsibility for algorithmic decision support tools that users are very interested in how recent the data underlying a tool is, and how relevant it is to them. Was the algorithm derived from a population that included people who share their characteristics? Subsection (1)(c)(ii) of the new clause proposed in Amendment 34 refers to regular assessment of the data used by the system. I would hope that this would be part of the meaningful explanation to individuals to be prescribed by the Secretary of State in subsection (1)(b).
Amendment 35 would add to this that it is vital that all users and procurers of such a system understand its real-world efficacy. I use the word “efficacy” rather than “accuracy” because it might be difficult to define accuracy with regard to some of these systems. The procurer of any ADM system should want to know how accurate it is under realistic testing, and users should also be aware of those findings. Does the system give the same outcome as a human assessor 95% or 60% of the time? Is that the same for all kinds of queries, or is it more accurate for some groups of people than others? The efficacy is really one of the most important aspects and should be public. I have added an extra line that ensures that this declaration of efficacy would be kept updated. One would hope that the performance of any such system would be monitored anyway, but this ensures that the outcomes of such monitoring are in the public domain.
In Committee, the Minister advised us to wait for publication of the algorithmic transparency records that were released in December. Looking at them, I think they make clear the much greater need for guidance and stringency in what should be mandated. I will give two short examples from those records. For the DBT: Find Exporters algorithm, under “Model performance” it merely says that it uses Brier scoring and other methods, without giving any actual results of that testing to indicate how well it performs. It suggests looking at the GitHub pages. I followed that link, and it did not allow me in. The public have no access to those pages. This is why these performance declarations need to be mandated and forced to be in the public domain.
In the second example, the Cambridgeshire trial of an externally supplied object detection system just cites the company’s test data, claiming average precision in a “testing environment” of 43.5%. This does not give the user a lot of information. Again, it links to GitHub pages produced by the supplier. Admittedly, this is a trial, so perhaps the Cambridgeshire Partnership will update it with its real-world trial data. But that is why we need to ensure annual updates of performance data and ensure that that data is not just a report of the supplier’s claims in a test environment.
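Purely by way of illustration of what such a public efficacy declaration might contain, here is a minimal sketch in Python. Everything in it is hypothetical: the figures, group labels and function names are invented for this example and are not drawn from any actual ATRS record or supplier claim.

```python
# Illustrative sketch only: the sort of efficacy figures Amendment 35 would
# require to be published. All data here is invented; no real ATRS record,
# system or supplier is represented.
from collections import defaultdict

def brier_score(probabilities, outcomes):
    # Mean squared difference between each predicted probability and the
    # actual 0/1 outcome; lower is better, 0 is perfect.
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(outcomes)

def agreement_by_group(records):
    # Share of cases where the automated decision matched the human assessor,
    # reported separately for each group so that disparities are visible.
    totals, matches = defaultdict(int), defaultdict(int)
    for group, automated, human in records:
        totals[group] += 1
        matches[group] += int(automated == human)
    return {g: matches[g] / totals[g] for g in totals}

# Hypothetical audit sample: (group label, automated decision, human decision).
sample = [("A", 1, 1), ("A", 0, 1), ("A", 1, 1),
          ("B", 0, 0), ("B", 1, 0), ("B", 0, 0)]
print(agreement_by_group(sample))               # roughly 0.667 for each group
print(brier_score([0.9, 0.2, 0.7], [1, 0, 1]))  # about 0.047
```

Publishing figures of this kind, refreshed annually against real-world decisions rather than a supplier’s test environment, is precisely what the amendment would put into the public domain.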
The current model of algorithmic transparency records is demonstrably not fit for purpose, and these provisions would help put them on a much firmer footing. These systems, after all, are making life-changing decisions for all of us and we all need to be sure how well they are doing and put appropriate levels of trust in them accordingly.
I start with Amendment 26, tabled by the noble Viscount, Lord Camrose. As he said in Committee, a principles-based approach ensures that our rules remain fit in the face of fast-evolving technologies by avoiding being overly prescriptive. The data protection framework achieves this by requiring organisations to apply data protection principles when personal data is processed, regardless of the technology used.
I agree with the principles that are present for AI, which are useful in the context in which they were put together, but introducing separate principles for AI could cause confusion around how data protection principles are interpreted when using other technologies. I note the comment that there is a significant overlap between the principles, and the comment from the noble Viscount that there are situations in which one would catch things and another would not. I am unable to see what those particular examples are, and I hope that the noble Viscount will agree with the Government’s rationale for seeking to protect the framework’s technology-neutral set of principles, rather than having two separate sets.
Amendment 28 from the noble Lord, Lord Clement-Jones, would extend the existing safeguards for decisions based on solely automated processing to decisions based on predominantly automated processing. These safeguards protect people when there is no meaningful human involvement in the decision-making. The introduction of predominantly automated decision-making, which already includes meaningful human involvement—and I shall say a bit more about that in a minute—could create uncertainty over when the safeguards are required. This may deter controllers from using automated systems that have significant benefits for individuals and society at large. However, the Government agree with the noble Viscount on strengthening the protections for individuals, which is why we have introduced a definition for solely automated decision-making as one which lacks “meaningful human involvement”.
I thank noble Lords for Amendments 29 and 36 and the important points raised in Committee on the definition of “meaningful human involvement”. This terminology, introduced in the Bill, goes beyond the current UK GDPR wording to prevent cursory human involvement being used to rubber stamp decisions as not being solely automated. The point at which human involvement becomes meaningful is context specific, which is why we have not sought to be prescriptive in the Bill. The ICO sets out in its guidance its interpretation that meaningful human involvement must be active: someone must review the decision and have the discretion to alter it before the decision is applied. The Government’s introduction of “meaningful” into primary legislation does not change this definition, and we are supportive of the ICO’s guidance in this space.
As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do this, or to advise Parliament or the Government if it considers that the law needs clarification. The Government also acknowledge that there may be a need to provide further legal certainty in future. That is why there are a number of regulation-making powers in Article 22D, including the power to describe meaningful human involvement or to add additional safeguards. These could be used, for example, to impose a timeline on controllers to provide human intervention upon the request of the data subject, if evidence suggested that this was not happening in a timely manner following implementation of these reforms. Any regulations must follow consultation with the ICO.
Amendment 30 from the noble Baroness, Lady Kidron, would prevent law enforcement agencies seeking the consent of a young person to the processing of their special category or sensitive personal data when using automated decision-making. I thank her for this amendment and agree about the importance of protecting the sensitive personal data of children and young adults. We believe that automated decision-making will continue to be rarely deployed in the context of law enforcement decision-making as a whole.
Likewise, consent is rarely used as a lawful basis for processing by law enforcement agencies, which are far more likely to process personal data for the performance of a task, such as questioning a suspect or gathering evidence, as part of a law enforcement process. Where consent is needed—for example, when asking a victim for fingerprints or something else—noble Lords will be aware that Clause 69 clearly defines consent under the law enforcement regime as
“freely given, specific, informed and unambiguous”
and
“as easy … to withdraw … as to give”.
So the tight restrictions on its use will be crystal clear to law enforcement agencies. In summary, I believe the taking of an automated decision based on a young person’s sensitive personal data, processed with their consent, to be an extremely rare scenario. Even when it happens, the safeguards that apply to all sensitive processing will still apply.
I thank the noble Viscount, Lord Camrose, for Amendments 31 and 32. Amendment 31 would require the Secretary of State to publish guidance specifying how law enforcement agencies should go about obtaining the consent of the data subject to process their data. To reiterate a point made by my noble friend Lady Jones in Committee, Clause 69 already provides a definition of “consent” and sets out the conditions for its use; they apply to all processing under the law enforcement regime, not just automated decision-making, so the Government believe this amendment is unnecessary.
Amendment 32 would require the person reviewing an automated decision to have sufficient competence and authority to amend the decision if required. In Committee, the noble Viscount also expressed the view that a person should be “suitably qualified”. Of course, I agree with him on that. However, as my noble friend Lady Jones said in Committee, the Information Commissioner’s Office has already issued guidance which makes it clear that the individual who reconsiders an automated decision must have the “authority and competence” to change it. Consequently, the Government do not feel that it is necessary to add further restrictions in the Bill as to the type of person who can carry out such a review.
The noble Baroness, Lady Freeman, raised extremely important points about the performance of automated decision-making. The Government already provide a range of products, but A Blueprint for Modern Digital Government, laid this morning, makes it clear that part of the new digital centre’s role will be to offer specialist assurance support, including, importantly in relation to this debate,
“a service to rigorously test models and products before release”.
That function will be in place and available to departments.
On Amendments 34 and 35, my noble friend Lady Jones previously advised the noble Lord, Lord Clement-Jones, that the Government would publish new algorithmic transparency recording standard records imminently. I am pleased to say that 14 new records were published on 17 December, with more to follow. I accept that these are not yet in the state in which we would wish them to be. Where these amendments seek to ensure that the efficacy of such systems is evaluated, A Blueprint for Modern Digital Government, as I have said, makes it clear that part of the digital centre’s role will be to offer such support, including this service. I hope that this provides reassurance.
My Lords, before the Minister sits down, I was given considerable assurance between Committee and Report that a code of practice, drawn up with the ICO, would be quite detailed in how it set out the requirements for those engaging in automated decision-making. The Minister seems to have given some kind of assurance that it is possible that the ICO will come forward with the appropriate provisions, but he has not really given any detail as to what that might consist of and whether that might meet some of the considerations that have been raised in Committee and on Report, not least Amendments 34 and 35, which have just been discussed as if the ATRS was going to cover all of that. Of course, any code would no doubt cover both the public and private sectors. What more can the Minister say about the kind of code that would be expected? We seem to be in somewhat of a limbo in this respect.
I apologise; I meant to deal with this at the end. I think I am dealing with the code in the next group.
My Lords, we have waited with bated breath for the Minister to share his hand, and I very much hope that he will reveal the nature of his bountiful offer of a code of practice on the use of automated decision-making.
I will wear it as a badge of pride to be accused of introducing an analogue concept by the noble Viscount, Lord Camrose. I am still keen to see the word “predominantly” inserted into the Bill in reference to automated decision-making.
As the Minister can see, there is considerable unhappiness with the nature of Clause 80. There is a view that it does not sufficiently protect the citizen in the face of automated decision-making, so I hope that he will be able to elaborate further on the nature of those protections.
I will not steal any of the thunder of the noble Baroness, Lady Kidron. For some unaccountable reason, Amendment 33 is grouped with Amendment 41. The groupings on this Bill have been rather peculiar and at this time of night I do not think any long speeches are in order, but it is important that we at least have some debate about the importance of a code of conduct for the use of AI in education, because it is something that a great many people in the education sector believe is necessary. I beg to move.
My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen the rollout of AI in almost all products and services that children use. In one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.
Amendment 41 does not seek to solve what is a global issue of an industry arrogantly flying a little too close to the sun, an industry that does not grasp how we could use this extraordinary technology and put it to use for humankind on a more equitable basis than the current extractive and winner-takes-all model; it is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is not new, nor is it confined to these uses. It is all around us and, in particular, it is all around children.
In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10 year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13 year-old girl tips on how to lose her virginity with a 31 year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14 year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that encouraged him to take his own life.
In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves, take their own lives and so on before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risks and put procedures in place to deal with emerging risks.
One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three to six year-olds believed that smart speakers have thoughts, feelings and social abilities and are more reliable than human beings when it came to answering fact-based questions.
I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.
It will be clear to the ICO from the amendments that have been tabled and my comments that there is an expectation that it should take into account the discussion we have had on this Bill.
My Lords, I thank the Minister for his very considered response. In the same way as the noble Baroness, Lady Kidron, I take it that, effectively, the Minister is pledging to engage directly with us and others about the nature and contents of the code, and that the ICO will also engage on that. As the Minister knows, the definition of terms such as meaningful human involvement is something that we will wish to discuss and consider in the course of that engagement. I hope that the AI edtech code will also be part of that.
I thank the Minister. I know he has had to think about this quite carefully during the Bill’s passage. Currently, Clause 80 is probably the weakest link in the Bill, and this amendment would go some considerable way towards repairing it. My final question is not to the Minister, but to the Opposition: what on earth have they got against the UN? In the meantime, I beg leave to withdraw my amendment.
My Lords, Amendment 37 is on the subject of data adequacy, which has been a consistent issue throughout the passage of the Bill. The mechanism put forward in the amendment would review the question of data adequacy.
Safeguarding data adequacy is crucial for the UK’s economy and international partnerships. Losing data adequacy status would impose significant costs and administrative burdens on businesses and public sector organisations that share data between the UK and the EU. It would also hinder international trade and economic co-operation, and erode trust in the UK’s digital economy, contradicting the Government’s objective of economic growth. I hope very much that the Government are proactively engaging with the European Commission to ensure a smooth adequacy renewal process this year.
Early engagement and reassurance about the retention of adequacy status are of crucial importance, given the looming deadline of June this year. This includes explaining and providing reassurance regarding any planned data protection reforms, particularly concerning the independence of the Information Commissioner’s Office, ministerial powers to add new grounds—for instance, recognised legitimate interest for data processing—and the new provisions relating to automated decision-making.
Despite assurances from the noble Baroness, Lady Jones, that the proposed changes will not dilute data subjects’ rights or threaten EU adequacy, proactive engagement with the EU and robust safeguards are necessary to ensure the continued free flow of data while maintaining high data protection standards. The emphasis on proportionality as a safeguard against the dilution of data subjects’ rights, as echoed by the noble Baroness, Lady Jones, and the ICO, is insufficient. The lack of a clear definition of proportionality within the context of data subjects’ rights could provide loopholes for controllers and undermine the essential equivalence required for data adequacy. The Bill’s reliance on the ICO’s interpretation of proportionality without explicit legislative clarity could be perceived as inadequate by the European Commission, particularly in areas such as web scraping for AI training.
The reassurance that the Government are taking data adequacy seriously and are committing to engaging with the EU needs to be substantiated by concrete actions. The Government do not, it appears, disclose assessments and reports relating to the compatibility of the UK’s domestic data protection framework with the Council of Europe’s Convention 108+, and that raises further concerns about transparency and accountability. Access to this information would enable scrutiny and informed debate, ultimately contributing to building trust and ensuring compatibility with international data protection standards.
In conclusion, while the Government maintain that this Bill would not jeopardise data adequacy, the concerns raised by myself and others during its passage mean that I continue to believe that a comprehensive review of EU data adequacy, as proposed in Amendment 37, is essential to ensure the continued free flow of data, while upholding high data protection standards and maintaining the UK’s position as a trusted partner in international data exchange. I beg to move.
I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.
The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.
It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.
That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.
I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.
My Lords, the noble and learned Lord, Lord Thomas, whose intervention I very much appreciated, particularly at this time of the evening, talked about a fresh pair of eyes. What kind of reassurance can the Minister give on that?
It is worth remembering that the ultimate decision is with the EU Commission, and we are quite keen to have its eyes on the Bill now, which is why we are engaging with it very carefully. The Commission is looking at the Bill as we are going through it—we are talking to it and we have dedicated teams of people brought together specifically to do this. There are several people from outside the direct construct of the Bill who are looking at this to make sure that we have adequacy and are having very direct conversations with the EU to ensure that that process is proceeding as we would wish it to.
I thank the Minister for his response. It would be very reassuring if it was our own fresh pair of eyes rather than across the North Sea. That is all I can say as far as that is concerned. I appreciate what he said—that the Government are taking this seriously. It is a continuing concern precisely because the chair of the European Affairs Committee wrote to the Government. It is a continuing issue for those of us observing the passage of the Bill and we will continue to keep our eyes on it as we go forward. I very much hope that June 2025 passes without incident and that the Minister’s predictions are correct. In the meantime, I beg leave to withdraw the amendment.
(1 week ago)
Lords Chamber
I thank my noble friend for that important question. Where there is evidence of non-compliance, Ofcom has set out that it will move quickly to enforcement, and that action will follow in spring this year, because companies will have had three months to get their positions sorted out—I think that 16 March is the date by which they have to do it. Ofcom will be able to apply fines, including global levies, and it will be able to apply to the courts for business disruption measures and have the flexibility to submit these applications urgently.
My Lords, the Minister’s response is somewhat baffling. Given the amendment to the Bill as it passed through the House, as a result of the amendment from the noble Baroness, Lady Morgan, it was quite clear that high-risk smaller platforms would be included in category 1 and bear all the consequences. Yet, despite the Secretary of State’s concerns, which were expressed in a letter last September, the Government have not insisted that Ofcom include those platforms in category 1. What does that mean? Why are the Government not taking proper legal advice and insisting that these smaller, high-risk platforms bear all the duties of category 1 services?
I thank the noble Lord for his question. Category 1, in the way that the Bill was ultimately approved, was for large sites with many users. The possibility remains that this threshold can be amended. It is worth remembering that category 1 imposes two additional duties: a duty that the company must apply its service agreements properly and a duty to make it possible for users to choose not to see certain things. For many of the small and harmful sites, those things would not apply anyway, because users have gone there deliberately to see what is there, but the full force of the Act applies to those small companies, which is why there is a special task force to make sure that that is applied properly.
(1 week ago)
Lords Chamber
I welcome the Secretary of State’s Statement in this space, and I start with an apology. When I agreed to speak to this, I was told it would be first business after Questions, and I am afraid I have to leave for a flight midway through, so I apologise to noble Lords and hope that they understand. My colleague will be here all the way through.
As I say, we welcome the Statement and we welcome the Matt Clifford plan, which my noble friend Lord Camrose kicked off when he was leading these efforts in government, so we see this as a positive step forward. As Health Minister during that time, I saw first-hand the potential of AI, how it can really transform our services and how the UK really does have the potential for a leadership role.
Matt Clifford’s plan, we believe, is right that the role of the Government in this is really to establish the foundations for growth: namely, making sure we have an AI-skilled workforce, the computing power, the energy needs to drive that computing power and the right regulatory framework. Then we use the assets we have, such as the data, to create the right datasets and use our public sector to help the rollout in many of them. I will focus my comments and questions on how we are going to make sure that those things happen.
Turning to the first one, the AI-skilled workforce, I must admit that when I read in the report that 5,000 AI jobs were being created in this, like most of us, I thought “5,000—that is great.” Then you realise that, actually, 4,500 of those are in construction and only 500 are in AI itself, and you start to get worried that maybe this is a bit style over substance. I am very keen to understand from the Minister here what we are specifically doing in this space. I am mindful, for instance, that we talk about having government develop training for the universities with a delivery or reporting date of autumn 2027. We all know how quickly AI is moving in this space, and we are saying we are just going to have the training in place for the universities to give these courses in two and a half years’ time. I think we all know that, in two and a half years’ time, the world will have moved on massively from that, and no doubt the training will be out of place. I hope the Minister can come back on that and give us some reassurances that we will actually have an accelerated process—I am afraid this will be a bit of a recurring theme.
On computing power, my noble friend Lord Camrose, when he was in government, had secured an £800 million commitment to build a supercomputer in Culham. Now I read, in the Government’s action plan, that they will
“start to develop the business case process”
for an AI computer. Unfortunately, like many noble Lords, I know what that means: a Treasury business case process, so you are talking about a year and a half to two years, at least. All I can guarantee is that, if you take that length of time to produce a business plan, whatever you were planning in terms of a supercomputer will be superseded by advancements and events. What is the Minister doing to streamline that business plan process and get action on this front so that we can get that new supercomputer fast?
On energy, we all accept there is a desperate need for energy; again, that is laid down in the action plan. The Government’s answer to that is to set up an AI energy quango. I think most of us would say that we need to set out what our energy needs require, but then surely it is up to the network or GB Energy to fulfil that. Why do we need another quango and another layer of bureaucracy? What powers is that quango going to have if it will not be commissioning these facilities, which I assume GB Energy will do?
On regulation and governance, the regulatory framework is another very important part of the foundation. I know the Government have plans for an AI Bill, but what is the timeline for it? Again—this is a recurrent theme—it needs to be quick so we can keep up with events.
Moving on to AI datasets, I know that this is something that the Minister is very keen on in the health space, as am I, being the former Health Minister responsible for this area. We have the best health data in the world; the beauty of having a National Health Service is that we have data on primary and secondary care going back to the Second World War. We have data coming in from the UK Biobank and other sources, such as retina scans from opticians which, we are hearing, can be used for stroke detection or maybe the early warning signs of dementia. There are fantastic opportunities for this, and we can already see its applications around the health service today. We have been doing the research with focus groups to bring the public with us on the use of their healthcare data. We have the potential to create the UK Silicon Valley in the life sciences on the back of the data that we have. We had in place a data for R&D programme, which was looking to utilise and create datasets in the health space. Could the Minister update us on where we are with that, and whether it is going to be his focus? As we discussed, that is something I would be very happy to work on together.
The last part of the foundations is to use the assets that we have in the public sector as a rollout plan for that and, again, health is a perfect place for this. We have seen brilliant applications already in cancer treatment and in overprescriptions; there are possibilities with the NHS app, which is really taking off, and to use AI in the 111 service to help triage; these are all fantastic opportunities. We put in place an NHS productivity plan which was very AI driven and AI heavy. Could the Minister update us on the AI productivity plan for the NHS and what progress we are making on it?
To conclude, we are very positive about the opportunities AI provides to transform the whole country’s economy and public services in ways that we cannot even imagine. However, it is businesses that need to drive this. It is the role of the Government to set the foundations to allow business to deliver; it is not the role of quangos, which are not going to deliver it. This area will need a Minister to drive it through and make it happen. Is the Minister the one who will do that? If he is, I give him all our support and wish him the best of luck with it.
My Lords, I also welcome this plan, perhaps with rather less baggage than the Conservative Benches. The Prime Minister and the Secretary of State invoked Babbage, Lovelace, Turing, the pioneering age of steam and even the white heat of the technological revolution, but at its core there is an important set of proposals with great potential. However, it is a wish list rather than a plan at present.
I particularly welcome the language in the plan around regulation, particularly where it refers to regulation assisting innovation, which is a change of tone. However, the plan and Statement raise many questions. In particular, how will the Government ensure that AI development mitigates risks beyond just safety to ensure responsible AI development and adoption, especially given the fact that a great deal of UK development will involve open-source applications?
On the question of the introduction of AI into the public sector, the Government are enormously enthusiastic. But, given their public sector digital transformation agenda, why are the Government watering down citizens’ rights in automated decision-making in the Data (Use and Access) Bill?
We welcome the recognition of the need to get the economic benefits for the UK from public sector data which may be used to develop AI models. What can the Minister tell us at this stage about what the national data library will look like? It is not clear that the Government yet know whether it will involve primary or secondary legislation or whatever. The plan and response also talk about “sovereign compute”, but what about sovereign cloud capability? The police cannot even find a supplier that guarantees its records will be stored in the UK.
While the focus on UK training is welcome, we must go beyond high-level skills. Not only are the tech companies calling out for technical skills, but AI is also shaping workplaces, services and lives. Will the Digital Inclusion Action Committee, chaired by the noble Baroness, Lady Armstrong, have a role in advising on this? Do the changes to funding and delivery expected for skills boot camps contribute to all of this?
On the question of energy requirements for the new data centres, will the new AI energy council be tasked with ensuring that they will have their own renewable energy sources? How will their location be decided, alongside that of the new AI growth centres?
The plan cannot be game-changing without public investment. It is about delivery, too, especially by the new sovereign data office; it cannot all be done with private sector investment. Where is the public money coming from, and over what timescale? An investment plan for compute is apparently to be married to the spending review; how does a 10-year timescale fit with this? I am very pleased that a clear role is identified for the Alan Turing Institute, but it is not yet clear what level of financial support it will get, alongside university research, exascale compute capacity, and the British Business Bank in the spin-out/start-up pipeline support. What will the funding for the Compound Semiconductor Applications Catapult and the design and manufacturing ecosystem consist of?
The major negative in the plan for many of us, as the Minister already knows, is the failure to understand that our creative industries need to be able to derive benefits from their material used for training large language models. The plan ominously recommended reforming,
“the UK text and data mining regime so that it is at least as competitive as the EU”,
and the Government have stacked the cards in the consultation over this. We on these Benches and the creative industries will be fighting tooth and nail any new text and data mining exemption requiring opt-out.
My Lords, I anticipated that this Statement would attract interest from Members of this House, and I thank the noble Lords, Lord Markham and Lord Clement-Jones, for their comments and their broad welcoming of the report. I will try to respond to as many points as I can, but first I will reiterate the importance of this announcement.
Through the publication of the AI Opportunities Action Plan and the Government’s response, we are signalling that our ambition is high when it comes to embracing the opportunities presented by AI. This is a plan to exploit the economic growth that AI will bring and to drive forward the Government’s plan for change. Training the UK’s workforce is a key part of the plan, and there are steps with clear timelines as to when we will do that. I will come back to training a little later.
We need to diffuse AI technology across the economy and public services for better productivity and opportunity, and embrace the transformational impact it is going to have on everyday lives, from health and education to business and government services.
As has rightly been pointed out, AI is advancing at an extraordinary pace. That is why you will see in this response very tight timelines for actions. The one that was picked out on training, which is 2027, is only one part of the response; you will see that Skills England is due to report very shortly with the first phase of its recommendations and will follow that in autumn with further work. So most of the timelines are very tight, recognising the challenge that the pace of advancement in AI brings.
The benefits extend far beyond economic growth. It is the catalyst that we need for a public service revolution, including, of course, in the NHS. It will drive growth and innovation and deliver better outcomes for citizens. It also lies at the heart of two important missions for the Government: kick-starting economic growth and delivering an NHS fit for the future. By investing in AI now, we are ensuring that the UK is prepared to harness the transformational potential that undoubtedly exists. This will improve the quality and delivery of public services. The plan is a way to do that with real speed and ambition.
The issue of regulation has been raised and there is no doubt that the regulatory environment will be critical in driving trust and capitalising on the opportunities the technology offers. By bringing forward the recommendations in the plan, we will continue to support the AI Safety Institute and further develop the AI assurance ecosystem, including the small companies that will arise as a result, to increase trust in and adoption of AI.
The Government are committed to supporting regulators in evaluating their AI capabilities and understanding how they can be strengthened. Part of this is the role of the regulatory innovation office. The vast majority of AI should be regulated at the point of use by the expert regulators, but some relates to fast-evolving technology. That is why we will continue to deliver on manifesto commitments by placing binding requirements on the developers of the most powerful AI models. Those commitments will build on the work that has already been done at the Seoul and Bletchley AI safety summits and will be part of strengthening the role of the AI Safety Institute. This issue of making sure that we get the safety side of this right as we develop opportunities is of course key.
The question of copyright was raised by the noble Lord, Lord Clement-Jones, and I know that this is an extremely hot issue at the moment, which will be discussed many times over the next few days and weeks. The Government have issued a consultation, in which there are three principles: the owners of copyright should have control; there should be a mechanism to allow access to data to enable companies to develop their models in the UK, rather than elsewhere in the world; and, critically, there must be transparency. Where does the data flow and how can you work out the input from the output? Those three areas are a key part of the consultation and the consultation is crucial. We have a session planned for next week to go through this in some detail, and I invite and welcome all noble Lords to it, because getting this right will be important for the country. I look forward to discussing those proposals over the next few days and weeks.
Delivering the AI Opportunities Action Plan will require a whole-of-government effort. We are starting that work immediately to deliver on the commitments, build the foundations for AI growth, drive adoption across the economy and build UK capability. We are already expecting initial updates on a series of actions by this spring. For instance, DSIT will explore options for growing the domestic AI safety market and will provide a public update on this by spring this year.
Turning to some of the very specific points, I completely agree that training is crucial and we have to get it right. There are several recommendations and, as I said, the earliest will give a readout this spring. I do understand that this is not something that can wait until 2027; it has to start immediately.
It is important to lay out for the House the situation with compute. This spring, there will be access to two new major compute facilities for AI: Dawn in Cambridge and Isambard-AI in Bristol. When fully active this year, they will increase the UK’s AI compute capacity something like thirtyfold, instantly. Those are the types of compute infrastructure that are needed. It is AI-specific compute infrastructure. It is not the case that the plan for the future starts now; it is happening now, and those compute infrastructures will be used by academia, SMEs and others over the course of the year and beyond. The plan beyond that is to increase the compute infrastructure twentyfold by 2030. That requires a 10-year plan and for us to think into the future about what will be needed for us to be at the forefront of this. Exascale of course is different; it is being looked at as part of that, but it is not the same.
On energy, the noble Lord recognises that one of the most difficult things in government is to join up across departments. That is why the AI energy council is important.
The national data library will be essential. I welcome the offer of help on health from the noble Lord, Lord Markham, and I will certainly take him up on that; this is an important area to look at. Noble Lords will be hearing much more about the national data library over the next few months. I completely agree that, as we develop this technology, we will need to ensure that citizens’ rights are properly protected. That is something that we will continue to discuss as part of the Data (Use and Access) Bill, among other issues.
Funding will be picked up; it is a fully funded programme, but then we will need to go into a spending review, as Governments always have to.
I will wrap up there to leave plenty of time for others to ask questions, but I hope that I have addressed some of the initial questions.
(2 months ago)
Lords Chamber
The cost of launch has come down by something like 95%. The UK remains committed to getting a launch and remains committed to the space strategy as laid out.
My Lords, in that National Space Strategy, the previous Government focused on encouraging low Earth orbit satellites, which are increasingly contributing to the loss of dark skies, as we have heard. Will this Government focus on incentives for the development of higher-orbit satellites, such as geostationary satellites, particularly the micro versions, of which far fewer are needed? They offer the best cost economics, compared to LEO systems, and have a lower impact on the night sky.
The noble Lord makes an extremely important point about the size of satellites, which is one of the factors in the interference with both radio and optical imaging. The smaller satellites, which the UK is extremely good at making, will become an increasing part of the solution. On orbits, we have a commitment to low Earth orbit through the OneWeb approach—where there are about 700 satellites in low orbit—and to higher orbits where it is appropriate to do so.
(2 months, 3 weeks ago)
Lords Chamber
The noble Lord knows that I know that unit extremely well. It is a very important unit globally and it was given an award of £30 million recently. The new model will allow for a longer period of funding—seven years plus seven years’ funding, so a total of 14 years—with a different process of evaluation, which is a lighter-touch, less bureaucratic process. There is no reason why there cannot be a similar number of trainees going through the new system.
My Lords, I declare an interest as chair of a university governing council. To some extent the Minister’s responses are reassuring, but is this part of a wider trend towards centralising decisions on research funding through UKRI? Are we moving towards a situation where the Government will fund research only within particular sectors set out in their industrial strategy? If that is the case, will that not stifle new research talent and innovation?
As the noble Lord may be aware, I have been very clear about the need for supporting basic curiosity-driven, investigator-led research, and I will remain resolute in that determination. Some of these new centres have specified areas, such as mental health and multi-morbidity, but there is a whole round which is unspecified, allowing for people to put forward ideas of their own for units of the future, which I believe will be important for the very reason the noble Lord says.
(6 months ago)
Lords Chamber
My Lords, I refer to my interests in the register. I join in congratulating all the new Government Ministers and Whips on their appointments. As the DSIT spokesperson on these Benches, I give a particularly warm welcome to the noble Lord, Lord Vallance of Balham, and his excellent maiden speech. While he was the Government’s Chief Scientific Adviser, he was pivotal in setting up the Vaccine Taskforce and in organising the overall strategy for the UK’s development and distribution of Covid-19 vaccines, and we should all be eternally grateful for that.
I warmly welcome the noble Baroness, Lady Jones of Whitchurch, to her role. We have worked well together outside and then inside this House, and I very much want to constructively engage with both Ministers on the Government’s science and technology agenda. I also thank the noble Viscount, Lord Camrose, for his engagement when in the department, and for his courtesy and good humour throughout.
I welcome the Government’s agenda for growth through innovation, their mission to enhance public services through the deployment of new technology and DSIT’s central role in that, opening up what can be a blocked pipeline all the way from R&D to commercialisation—from university spin-out through start-up to scale-up and IPO. Crowding in and de-risking private investment through the national wealth fund, the British Business Bank and post-Mansion House pension reforms is crucial. Digital skills and digital literacy are also crucial but, to deploy digital tools successfully, we need a pipeline of creative critical thinking and collaboration skills as well.
In this context, I very much welcome the new Government’s tone on the value of universities, long-term financial settlements and resetting relations with Europe. I hope this means that we shall soon see whether spending plans for government R&D expenditure by 2030 and 2035 match their words. Disproportionately high overseas researcher visa costs must be lowered, as the noble Lord, Lord Vallance, knows.
But support for innovation should not be unconditional or at any cost, and I hope this Government will not fall into the trap of viewing regulation as necessarily the enemy of innovation. I therefore hope that the reference to AI legislation, but the failure to announce a Bill, is a mere timing issue. Perhaps we can hear later what the Government’s intention is in this respect. Before then, we are promised a product safety and metrology Bill, which could require alignment of AI-driven products with the EU AI Act. This seems to be putting the cart well in front of the regulatory horse.
We need to ensure that high-risk systems are mandated to adopt international ethical and safety standards. We all need to establish very clearly that generative AI systems need licences to ingest copyright material for training purposes, just as Mumsnet and the New York Times are asserting, and that there is an obligation of transparency in the use of datasets and original content. The Government in particular should lead the way in ensuring that there is a high level of transparency and opportunity for redress when algorithmic and automated systems are used in the public sector, and I commend my forthcoming Private Member’s Bill to them.
As regards the Bills in the King’s Speech, I look forward to seeing the details, but the digital information and smart data Bill seems to be heading in the right direction in the areas covered. I hope that other than a few clarifications, especially in research and on the constitution of the Information Commissioner’s Office, we are not going to exhume some of the worst areas of the old DPDI Bill, and that we have ditched the idea of a Brexit-EU divergence dividend by the watering down of so many data subjects’ rights. Will the Government give a firm commitment to safeguard our data adequacy with the EU? I hope that they will confirm that the intent of the reinstated digital verification provisions is not to have some form of compulsory national digital ID, but the creation of a genuine market in digital ID providers that give a choice to the citizen. I hope also that, in the course of that Bill, Ministers will meet LinesearchbeforeUdig and provide us all with much greater clarity around the proposals for the national underground asset register.
As for the cyber security and resilience Bill, events of recent days have demonstrated the need for cybersecurity, but have also made it clear that we are not just talking about threats from bad actors. There needs to be a rethink on critical national infrastructure such as cloud services and software, which are now essential public utilities.
Finally, I hope that we will see a long-awaited amendment of the Computer Misuse Act to include a statutory public defence, as called for by CyberUp, which was recommended by the Vallance report, as I recall. I very much hope that there will be no more Horizon scandals. I look forward to the Minister’s reply.
(8 months, 1 week ago)
Lords Chamber
Many of your Lordships will be familiar with the arguments we have had on the Bill. The important point to stress is that there has been a general welcome of this legislation. I would also like to stress that a measure of cross-party co-operation was the hallmark of the scrutiny of the Bill during its passage through your Lordships’ House. Ministers and officials have given their time generously in meetings and have responded promptly and helpfully to the issues that scrutiny has thrown up.
At the heart of the Bill is the regulation of the internet in a way that should prevent market abuse, in particular by big tech. Helpful though the Government have been, they have not provided answers to some important questions, hence amendments being passed on Report. These have been sent back to us by the House of Commons without the Government—save in one respect—making concessions.
One of the areas that gave noble Lords particular concern is the inclusion of amendments in the House of Commons at a late stage, following lobbying of the Government by big tech. A prospective intervention by the regulator is unlikely to be welcomed by big tech companies and, given their enormous legal budgets, will inevitably be challenged. The change of wording from “appropriate” to “proportionate” will make such challenges easier. A reversion to the Bill’s original wording will help to restore balance, and it is hoped that the amendments in my name and those in the name of the noble Baroness, Lady Jones, on appeals against interventions, will achieve that. Our amendments on Motion C are intended to prevent a seepage of arguments on penalty, which involves a merits test, into the judicial review test, which applies to the intervention itself.
Why have the Government made this late change of “appropriate” to “proportionate”? They have been rather coy about this. There has been some waffle—I am afraid I must describe it as such—about increased clarity and the need for a regulator to act in a proportionate manner. That is quite so but, on further probing, the reasoning was revealed: it is intended to reflect the level of challenge derived from jurisprudence from the European Court of Human Rights and the CJEU, where human rights issues are engaged. I remain bewildered as to why big tech has human rights. This is not what the framers of the convention had in mind.
But if—and it is a big “if”—a convention right is engaged, proportionality is the test, or at least part of it. This is a much lower bar than the normal judicial review test. If the Bill remains unamended, this lower bar will apply to challenges whether or not a convention right is engaged. This is good news for big tech and its lawyers, but not for the Bill and its primary purpose.
I ask the Minister this specific question: if the convention right is engaged, proportionality comes into the analysis anyway, but what if a court were to decide that A1P1—the relevant “human right”—was not engaged? With the Bill unamended, proportionality would apply to a non-convention case, greatly to the advantage of big tech. Is my understanding correct?
It seems that big tech has got its way and that litigation wars can commence—a great pity, most specifically for the smaller players and for the ostensible rationale behind the legislation.
On Motion C1, the test for appeals on penalty is to be a merits-based one, rather than the higher bar that a judicial review standard would, or should, involve. The amendments before your Lordships’ House are intended to prevent seepage from one test to another. His Majesty’s Government say that the courts are well used, in different contexts, to applying different tests as part of an analysis. This is true—in theory. My concern is that if I were advising Meta or Google about an intervention and a consequent hefty fine—this is not an advertisement—it is inevitable that I would advise in favour of appealing both aspects of the intervention: against conviction and sentence, as it were.
It is relatively easy to insulate arguments in criminal cases. One question is, was the conviction unsafe? Another is, was the sentence too long? In the emerging world of internet regulation, however, it is likely to be far more difficult in practice. The question of whether an intervention was disproportionate—disproportionate to what?—will inevitably be closely allied to that of whether the penalty was excessive or disproportionate: another win for big tech, and a successful piece of lobbying on its part.
I look forward to words of reassurance from the Minister. In the meantime, I beg to move.
My Lords, I will speak to Motion B1 and briefly in support of other motions in this group.
Last December, at Second Reading, I said that we on these Benches want to see the Bill and the new competition and consumer powers make a real difference, but that they can do so only with some key changes. On Third Reading, I pointed out that we were already seeing big tech take an aggressive approach to the EU’s Digital Markets Act, and we therefore believed that the Bill needed to be more robust and that it was essential to retain the four key competition amendments passed on Report. That remains our position, and I echo the words of the noble Lord, Lord Faulks: that the degree of cross-party agreement has been quite exemplary.
As we heard on Report, noble Lords made four crucial amendments to Part 1 of the digital markets Bill: first, an amendment whereby, when the Competition and Markets Authority seeks approval of its guidance, the Secretary of State is required within 40 days to approve the guidance or to refuse to approve it and refer it back to the CMA; secondly, an amendment reverting the countervailing benefits exemption to the version originally in the Bill, which included the “indispensable” standard; thirdly, amendments reverting the requirement for the CMA’s conduct requirement and pro-competitive interventions to be “proportionate” back to “appropriate”; and fourthly, amendments reverting the appeals standard to judicial review for penalties.
We welcome the fact that the Government have proposed, through Motion D, Amendment 38A in lieu, which effectively achieves the same aims, ensuring that the approval of the CMA guidance by the Secretary of State does not unduly hold up the operationalisation of the new regime. However, the Government’s Motions A, B and C disagree with the other Lords amendments.
My Lords, I thank all noble Lords who have contributed to the debate today and, of course, throughout the development of this legislation. It has been a characteristically brilliant debate; I want to thank all noble Lords for their various and valuable views.
I turn first to the Motions tabled by the noble Lord, Lord Faulks, in relation to appeals and proportionality. I thank him for his continued engagement and constructive debate on these issues. We of course expect the CMA to behave in a proportionate manner at all times as it operates the digital market regime. However, today we are considering specifically the statutory requirement for proportionality in the Bill. We are making it clear that the DMU must design conduct requirements and PCIs to place as little burden as possible on firms, while still effectively addressing competition issues. The proposed amendments would not remove the reference to proportionality in Clause 21 and so, we feel, do not achieve their intended aim, but I shall set out the Government’s position on why proportionality is required.
On the question of the wording of “appropriate” versus “proportionate”, proportionality is a well-understood and well-precedented concept with a long history of case law. “Appropriate” would be a more subjective threshold, giving the CMA broader discretion. The Government’s position is that proportionality is the right threshold to set in legislation, because in the vast majority of cases it will apply in any event as a result of ECHR considerations. It is the Government’s view that the same requirement for proportionality should apply whether or not ECHR rights are engaged.
As Article 1 of Protocol 1—A1P1—of the European Convention on Human Rights will apply to the vast majority of conduct requirements and PCIs imposed by the CMA, with the result that the courts will apply a proportionality requirement, we consider it important that it should be explicit that there is a statutory proportionality requirement for all conduct requirements and PCIs. We believe that proportionality should be considered beyond just those cases where A1P1 may apply, in particular when a conduct requirement or PCI would impact future contracts of an SMS firm.
The courts’ approach to proportionality in relation to consideration of ECHR rights has been set out by the Supreme Court, and we do not expect them to take a different approach here. Furthermore, the CAT will accord respect to the expert judgments of the regulator and will not seek to overturn its judgments lightly. I hope this answers the question put by the noble Lord, Lord Faulks.
On appeals, I thank noble Lords for their engagement on this matter and, in particular, the noble Baroness, Lady Jones of Whitchurch, for setting out the rationale for her Amendments 32B and 32C, which seek to provide further clarity about where appeals on the merits apply. I want to be clear that the Government’s intention is that only penalty decisions will be appealable on the merits and that this should not extend to earlier decisions about whether an infringement occurred. I do not consider these amendments necessary, for the following reasons.
The Bill draws a clear distinction between penalty decisions and those about infringements, with these being covered by separate Clauses 89 and 103. There is a Court of Appeal precedent in BCL v BASF 2009 that, in considering a similar competition framework, draws a clear distinction between infringement decisions and penalty decisions. The Government consider that the CAT and the higher courts will have no difficulty in making this distinction for digital markets appeals to give effect to the legislation as drafted.
I now turn to the Motion tabled by the noble Lord, Lord Clement-Jones, in respect of the countervailing benefits exemption. I thank the noble Lord for his engagement with me and the Bill team on this important topic. The noble Lord has asked for clarification that the “indispensability” standard in Section 9 of the Competition Act 1998, and the wording,
“those benefits could not be realised without the conduct”,
are equivalent to each other. I want to be clear that the exemption within this regime and the exemption in Section 9 of the Competition Act 1998 are different. This is because they operate in wholly different contexts, with different criteria and processes. This would be the case however the exemption is worded in this Bill. That is why the Explanatory Notes refer to a “similar” exemption, because saying it is “equivalent” would be technically incorrect.
Having said that, the “indispensability” standard and the threshold of the Government’s wording,
“those benefits could not be realised without the conduct”,
are equally high. While the exemptions themselves are different, I hope I can reassure noble Lords that the Government’s view is that the standard—the height of the threshold—is, indeed, equivalent. The Government still believe that the clarity provided by simplifying the language provides greater certainty to all businesses, while ensuring that consumers get the best outcomes.
I thank the noble Lord, Lord Clement-Jones, for his question in relation to the Google privacy sandbox case. The CMA considers a range of consumer benefits under its existing consumer objective. This can include the privacy of consumers. It worked closely with the ICO to assess data privacy concerns in its Google privacy sandbox investigation and we expect it would take a similar approach under this regime.
I urge all noble Lords to consider carefully the Motions put forward by the Government and hope all Members will feel able—
Indeed. In principle I am very happy to update the Explanatory Notes, but I need to engage with ministerial colleagues. However, I see no reason why that would not be possible.
Meanwhile, I hope all noble Lords will feel able to support the Government’s position.
My Lords, I have already spoken to Motion B. I beg to move.
Motion B1 (as an amendment to Motion B)
Tabled by
Leave out from “House” to end and insert “do not insist on its Amendment 12, to which the Commons have disagreed for their Reason 13A, and do insist on its Amendment 13.”
My Lords, if this is not an unparliamentary expression, I will say that the Minister has come within a gnat’s whisker of where we need to be. I rely on his assurances about the Explanatory Notes, because they will be important, and on that basis I do not move Motion B1.
My Lords, I support Motion E1 and pay warm tribute to the noble Lord, Lord Moynihan, for his expertise and tenacity. Thanks to his efforts and those of Sharon Hodgson MP, and after a long campaign with the All-Party Group on Ticket Abuse, we were able to include certain consumer protections in the ticketing market in the Consumer Rights Act 2015. The noble Lord’s amendment on Report sought to introduce additional regulatory requirements on secondary ticketing sites for proof of purchase, ticket limits and the provision of information on the face of tickets. That would have secured greater protection for consumers and avoided the market exploitation which is currently growing exponentially on platforms such as viagogo.
As we have heard, the Ministers—the noble Lord, Lord Offord, and the noble Viscount, Lord Camrose—in their letter of 1 May to noble Lords, offered a review that would take place over nine months, which would make recommendations for Ministers to consider. But that is simply not enough, as the noble Lord, Lord Moynihan, has demonstrated. The Minister, the noble Lord, Lord Offord, seems to believe from his own experience—unlike the rest of us—that everything is fine with the secondary market and that the answer to any problem lies in the hands of the primary ticket sellers. However, the noble Lord, Lord Moynihan, in his brilliantly expert way, demonstrated extremely cogently how that is absolutely not the case for the Minister’s favourite sports of rugby and football, where the secondary resellers are flagrantly breaking the law.
(8 months, 2 weeks ago)
Lords Chamber
Indeed—and let me first thank my noble friend for bringing up this important matter. That sounds to me like something that would be likely to be caught by the false communications offence in the Online Safety Act—Section 179—although I would not be able to say for sure. The tests that it would need to meet are that the information would have to be knowingly false and cause non-trivial physical or psychological harm to its likely audience, but that would seem to be the relevant offence.
My Lords, does not the Question from the noble Baroness, Lady Jones, highlight that we must hold legally liable not only those who create this kind of deepfake content and facilitate its spread but also those who enable the production of deepfakes with software, such as by setting standards and risk-based regulation for generative AI systems, which the Government in their White Paper have resolutely refused to do?
The Government set out in their White Paper response that off-the-shelf AI software that can in part be used to create these kinds of deepfakes is not, in and of itself, something on which we are considering placing any ban. However, there is a range of software—a sort of middle layer in AI production—that can greatly facilitate the production of deepfakes of all kinds, not just political but other kinds of criminal deepfakes, and there the Government are actively considering moving against those purpose-built criminal tools.
(8 months, 4 weeks ago)
Grand Committee
My Lords, it has been a privilege to be at the ringside during these three groups. I think the noble Baroness, Lady Sherlock, is well ahead on points and that, when we last left the Minister, he was on the ropes, so I hope that, to avoid the knockout, he comes up with some pretty good responses today, especially as we have been lucky enough to have the pleasure of reading Hansard between the second and third groups. I think the best phrase the noble Baroness used was the “astonishing breadth” of Clause 128 and Schedule 11, which we explored with horror last time. I very much support what she says.
The current provisions seem to make the code non-mandatory, yet we discovered that the powers come without “reasonable suspicion”, the words that appear in the national security legislation—fancy having the Home Office as our model in these circumstances. Does that not put the DWP to shame? If we have to base best practice on the Home Office, we are in deep trouble.
That aside, we talked about “filtering” and “signals” last time; the Minister used that phrase twice, I think. We also learned about “test and learn”. Will all of that be included in the code?
All this points to the fragility and breadth of this schedule. It has been dreamt up in an extraordinarily expansive way without considering all the points that the noble Lord, Lord Anderson, has mentioned, including the KC’s opinion, all of which point to the fact that this schedule is going to infringe Article 8 of the European Convention on Human Rights. I hope the Minister comes up with some pretty good arguments.
My final question relates to the impact assessment, or rather the non-impact assessment. The Minister talked about the estimate of DWP fraud, which is £6.4 billion. What does the DWP estimate it will be after these powers are implemented, if they are ever implemented? Should we not have an idea of the DWP’s ambitions in this respect?
My Lords, this has been a somewhat shorter debate than we have been used to, bearing in mind Monday’s experience. As with the first two groups debated then, many contributions have been made today and I will of course aim to answer as many questions as I can. I should say that, on this group, the Committee is primarily focusing on the amendments brought forward by the noble Baroness, Lady Sherlock, and I will certainly do my very best to answer her questions.
From the debate that we have had on this measure, I believe that there is agreement in the Committee that we must do more to clamp down on benefit fraud. That is surely something on which we can agree. In 2022-23, £8.3 billion was overpaid due to fraud and error in the benefit system. We must tackle fraud and error and ensure that benefits are paid to those genuinely entitled to the help. These powers are key to ensuring that we can do this.
I will start by answering a question raised by the noble Lord, Lord Anderson—I welcome him to the Committee for the first time today. He described himself as a “surveillance nerd”, but perhaps I can entreat him to rename himself a “data-gathering nerd”. As I said on Monday, this is not a surveillance power and suggesting that it is simply causes unnecessary worry. This is a power that enables better data gathering; it is not a surveillance or investigation power.
The third-party data measure does not allow the DWP to see how claimants spend their money, nor does it give the DWP access to millions of people’s bank accounts, as has been inaccurately suggested. When the DWP examines the data that it receives from third parties, the data may suggest that there is fraud or error and require a further review. This will be done through our normal, regular, business-as-usual processes to determine whether incorrect payments are indeed being made. This approach is not new. As alluded to in this debate, through the Finance Act 2011, Parliament has already determined that this type of power is proportionate and appropriate, as HMRC already holds similar powers regarding banking institutions and third parties in relation to all taxpayers.
I listened very carefully to the noble Lord and will take his points back and refer again to our own legal team. The point about the legality of all this is a very important one, made with all his experience, and I will reflect on it.
That is a very fair question, and I hope that I understand it correctly. I can say that the limit for the DWP is that it can obtain only what the third party produces. Whatever goes on behind the doors of the third party is a matter for them and not for us. Whether there is a related account, and how best to operate, is a matter for the bank to decide. We may therefore end up getting very limited information, given the limits of our powers. I hope that helps, but I will add some more detail in the letter.
My Lords, the Minister extolled the green-rated nature of this impact assessment. In the midst of all that, did he answer my question?
I asked about the amount of fraud that the Government plan to detect, on top of the £6.4 billion in welfare overpayments that was detected last year.
The figure that we have is £600 million but, again, I will reflect on the question that we are looking to address—the actual amount of fraud in the system.
The Minister is saying that that figure is not to be found in this green-rated impact assessment, which most of us find to be completely opaque.
I will certainly take that back, but it is green rated.
My Lords, we have talked about proportionality and disproportionality throughout the debate on this Bill. Is it not extraordinary that that figure is not on the table, given the extent of these powers?
My Lords, the Minister was kind enough to mention me a little earlier. Can I just follow up on that? In the impact assessment, which I have here, nowhere can I find the £600 million figure, nor can I find anywhere the costs related to this. There will be a burden on the banks and clearly quite a burden on the DWP if it has to trawl through this information, as the noble Viscount says, using people rather than machines. The costs are going to be enormous in order to save, it would appear, up to £120 million per year out of £6.4 billion per year of fraud. It does seem odd. It would be really helpful to have those cost numbers and to know which document they are in, because I cannot find them in the impact assessment.
I hope I can help both noble Lords. Although I must admit that I have not read every single page, I understand that the figure of £500 million is in the IA.
Yes, £500 million. I mentioned £600 million altogether; that figure came from the OBR, which has certified it, and, by the way, it was in the Autumn Statement.
My Lords, has not that demonstrated the disproportionality of these measures?
The noble Viscount explained, in response to the noble Lord, Lord Anderson, that at every stage where the powers are to be expanded, the matter would come back as an affirmative regulation. I might have been a bit slow about this, but I have been having a look and I cannot see where it says that. Perhaps he could point that out to me, because that would provide some reassurance that each stage of this is coming back to us.
I understand, very quickly, that it is in paragraph 1(1), but again, in the interests of time, maybe we could talk about that outside the Room.
I can reassure the noble Lord that that is the case, yes.
I reassure noble Lords that that is correct—it is paragraph 1(1). It may be rather complex, but it is in there, just to reassure all noble Lords.
I am sorry to keep coming back, but did the Minister give us the paragraph in the impact assessment that referred to £500 million?
No, I did not, but that is something which surely we can deal with outside the Room. However, I can assure noble Lords that it is in there.
My Lords, I want briefly to contribute to this debate, which I think is somewhat less contentious than the previous group of amendments. As somebody who, again, worked on the Online Safety Act all the way through, I really just pay tribute to the tenacity of the noble Baroness, Lady Kidron, in pursuing this detail—it is a really important detail. We otherwise risk, having passed the legislation, ending up in scenarios where everyone would know that it was right for the data-gathering powers to apply but, simply because of the wording of the law, they would not kick in when necessary. I therefore really want to thank the noble Baroness, Lady Kidron, for being persistent with it, and I congratulate the Government on recognising that, when there is an irresistible force, it is better to be a movable object than an immovable one.
I credit the noble Viscount the Minister for tabling these amendments today. As I say, I think that this is something that can pass more quickly because there is broad agreement around the Committee that this is necessary. It will not take away the pain of families who are in those circumstances, but it will certainly help coroners get to the truth when a tragic incident has occurred, whatever the nature of that tragic incident.
My Lords, having been involved in and seen the campaigning of the bereaved families and the noble Baroness, Lady Kidron, in particular in the Joint Committee on the Draft Online Safety Bill onwards, I associate myself entirely with the noble Baroness’s statement and with my noble friend Lord Allan’s remarks.
My Lords, I thank the Minister for setting out the amendment and all noble Lords who spoke. I am sure the Minister will be pleased to hear that we support his Amendment 236 and his Amendment 237, to which the noble Baroness, Lady Kidron, has added her name.
Amendment 236 is a technical amendment. It seeks the straightforward deletion of words from a clause, accounting for the fact that investigations by a coroner, or procurator fiscal in Scotland, must start upon them being notified of the death of a child. The words
“or are due to conduct an investigation”
are indeed superfluous.
We also support Amendment 237. The deletion of this part of the clause would bring into effect a material change. It would empower Ofcom to issue a notice to an internet service provider to retain information in all cases of a child’s death, not just cases of suspected suicide. Sadly, as many of us have discovered in the course of our work on this Bill, there is an increasing number of ways in which communication online can be directly or indirectly linked to a child’s death. These include areas of material that is appropriate for adults only; the inability to filter harmful information, which may adversely affect mental health and decision-making; and, of course, the deliberate targeting of children by adults and, in some cases, by other children.
There are adults who use the internet with the intention of doing harm to children through coercion, grooming or abuse. What initially starts online can lead to contact in person. Often, this will lead to a criminal investigation, but, even if it does not, the changes proposed by this amendment could help prevent additional tragic deaths of children, not just those caused by suspected child suicides. If the investigating authorities have access to online communications that may have been a contributing factor in a child’s death, additional areas of concern can be identified by organisations and individuals with responsibility for children’s welfare and action taken to save many other young lives.
Before I sit down, I want to take this opportunity to say a big thank you to the noble Baroness, Lady Kidron, the noble Lord, Lord Kennedy, and all those who have campaigned on this issue relentlessly and brought it to our attention.
My Lords, I will be brief because we very much support these amendments. Interestingly, Amendment 239 from the noble Baroness, Lady Jones, follows closely on from a Private Member’s Bill presented in November 2021 by the Minister’s colleague, Minister Saqib Bhatti, and before that by the right honourable Andrew Mitchell, who is also currently a Minister. The provenance of this is impeccable, so I hope that the Minister will accept Amendment 239 with alacrity.
We very much support Amendment 250. The UK Commission on Bereavement’s Bereavement is Everyone’s Business is a terrific report. We welcome Clause 133 but we think that improvements can be made. The amendment from the noble Baroness, which I have signed, will address two of the three recommendations that the report made on the Tell Us Once service. It said that there should be a review, which this amendment reflects. It also said that
“regulators must make sure bereaved customers are treated fairly and sensitively”
by developing minimum standards. We very much support that. It is fundamentally a useful service but, as the report shows, it can clearly be improved. I congratulate the noble Baroness, Lady Jones, on picking up the recommendations of the commission and putting them forward as amendments to this Bill.
My Lords, I declare an interest as someone who has been through the paper death registration process and grant of probate, which has something to do with why I am in your Lordships’ House, so I absolutely understand where the noble Baroness, Lady Jones of Whitchurch, is coming from. I thank her for tabling these amendments to Clauses 133 and 142. They would require the Secretary of State to commission a review with a view to creating a single digital register for the registration of births and deaths and to conduct a review of the Government’s Tell Us Once scheme.
Clause 133 reforms how births and deaths are registered in England and Wales by enabling a move from a paper-based system of birth and death registration to registration in a single electronic register. An electronic register is already in use alongside the paper registers and has been since 2009. Well-established safety and security measures and processes are already in place with regard to the electronic infrastructure, which have proven extremely secure in practice. I assure noble Lords that an impact assessment has been completed to consider all the impacts relating to the move to an electronic register, although it should be noted that marriages and civil partnerships are already registered electronically.
The strategic direction is to progressively reduce the reliance on paper and the amount of paper in use, as it is insecure and capable of being tampered with or forged. The creation of a single electronic register will remove the risk of registrars having to transmit loose-leaf register pages back to the register office when they are registering births and deaths at service points across the district. It will also minimise the risk of open paper registers being stolen from register offices.
The Covid-19 pandemic had unprecedented impacts on the delivery of registration services across England and Wales, and it highlighted the need to offer more choice in how births and deaths are registered in the future. The provisions in the Bill will allow for more flexibility in how births and deaths are registered—for example, registering deaths by telephone, as was the case during the pandemic. Over 1 million deaths were successfully registered under provisions in the Coronavirus Act 2020. This service was well received by the public, registrars and funeral services.
Measures will be put in place to ensure that the identity of an informant is established in line with Cabinet Office good practice guidance. This will ensure that information provided by informants can be verified or validated for the purposes of registering by telephone. For example, a medical certificate of cause of death issued by a registered medical practitioner would need to have been received by the registrar before an informant could register a death by telephone. Having to conduct a review, as was proposed by the noble Baroness, Lady Jones, would delay moving to digital ways of working and the benefits this would introduce.
My Lords, I thank the Minister for his exposition. He explained the purposes of Clauses 138 to 141 and extolled their virtues, and helpfully explained what my amendments are trying to do—not that he has shot any foxes in the process.
The purpose of my amendments is much more fundamental, and that is to question the methodology of the Government in all of this. The purpose of NUAR is to prevent accidental strikes where building works damage underground infrastructure. However, the Government seem to have ignored the fact that an equivalent service—LinesearchbeforeUdig, or LSBUD—already achieves these aims, is much more widely used than NUAR and is much more cost effective. The existing system has been in place for more than 20 years and now includes data from more than 150 asset owners. It is used by 270,000 UK digging contractors and individuals—and more every day. The fact is that, without further consultation and greater alignment with current industry best practice, NUAR risks becoming a white elephant, undermining the safe working practices that have kept critical national infrastructure in the UK safe for more than two decades.
However, the essence of these amendments is not to cancel NUAR but to get NUAR and the Government to work much more closely with the services that already exist and those who wish to help. They are designed to ensure that proper consultation and democratic scrutiny is conducted before NUAR is implemented in statutory form. Essentially, the industry says that NUAR could be made much better and much quicker if it worked more closely with the private sector services that already exist. Those who are already involved with LinesearchbeforeUdig say, first of all, that NUAR will create uncertainty and reduce safety, failing in its key aims.
The Government have been developing NUAR since 2018, claiming that it would drive a reduction in unexpected damage to underground assets during roadworks. The impact assessment incorrectly states:
“No businesses currently provide a service that is the same or similar to the service that NUAR would provide”.
In fact, as I said, LSBUD has been providing a safe digging service in the UK for 20 years and has grown significantly over that time. Without a plan to work more closely with LSBUD as the key industry representative, NUAR risks creating more accidental strikes on key network infrastructure, increasing risks to workers’ safety through electrical fires, gas leaks, pollution and so on. The public, at home or at work, would also suffer more service outages and disruption.
Secondly, NUAR will add costs and stifle competition. The Government claim that NUAR will deliver significant benefits to taxpayers, reduce disruption and prevent damage to underground assets, but the impact assessment ignores the fact that NUAR’s core functions are already provided by the current system—so its expected benefits are vastly overstated. While asset owners, many of whom have not been consulted, will face costs of more than £200 million over the first 10 years, the wholesale publication of asset owners’ entire networks creates risks around commercially sensitive information, damaging innovation and competition. Combined with the uncertainties about how quickly NUAR can gain a critical mass of users and data, this again calls into question why NUAR does not properly align with and build on the current system but instead smothers competition and harms a successful, growing UK business.
Thirdly, NUAR risks undermining control over sensitive CNI data. Underground assets are integral to critical national infrastructure; protecting them is vital to the UK’s economic and national security. LSBUD deliberately keeps data separate and ensures that data owners remain in full control over who can access their data via a secure exchange platform. NUAR, however, in aiming to provide a single view of all assets, removes providers’ control over their own data—an essential security fail-safe. It would also expand opportunities for malicious actors to target sectors in a variety of ways—for instance, the theft of copper wires from telecom networks.
NUAR shifts control over data access to a centralised government body, with no clear plan for how the data is to be protected from unauthorised access, leading to serious concerns about security and theft. Safe digging is paramount; mandating NUAR will lead to uncertainty, present more health and safety dangers to workers and the public and put critical national infrastructure at risk. These plans require further review. There needs to be, as I have said, greater alignment with industry best practice. Without further consultation, NUAR risks becoming a white elephant that undermines safe digging in the UK and increases risk to infrastructure workers and the public.
I will not go through the amendments individually, as the Minister has mentioned what their effect would be, but I will dispel a few myths. The Government have claimed that NUAR has the overwhelming support of asset owners. In the view of those who briefed me, that is not an accurate reflection of the broadband and telecoms sector in particular; a number of concerns about cost and security have been raised with the NUAR team by ISPA members and have yet to be addressed. This is borne out by the notable gaps among the major telecoms asset owners signed up to NUAR at this time.
Clearly, the noble Viscount is resisting changing the procedure by which these changes are made from negative to affirmative, but I hope I have gone some way to persuading the Committee of the importance of this change to how the NUAR system is put on a statutory footing. He talked about a “handful” of data; the comprehensive nature of the existing system is pretty impressive, and it is a free service, updated on a regular basis, which covers more than 150 asset owners and 98% of high-risk assets. NUAR currently covers only one-third of asset owners. The comparisons are already not to the advantage of NUAR.
I hope the Government will at least, even if they do not agree with these amendments, think twice before proceeding at the speed they seem intent on, without the consent of, or taking on board the concerns of, those who are already heavily engaged with LinesearchbeforeUdig and who find it pretty satisfactory for their purposes.
My Lords, the Minister really did big up this section of the Bill. He said that it would revolutionise this information service, that it would bring many benefits, that it has a green rating, that it would be the Formula 1 of data transfer in mapping, and so on. We were led to expect quite a lot from this part of the legislation. It is an important part of the Bill, because it signifies some government progress towards the goal of creating a comprehensive national underground asset register, as he put it, or NUAR. We are happy to support this objective, but we have concerns about the progress being made and the time it is taking.
To digress a bit here, it took me back 50 years to when I was a labourer working by the side of a bypass. One of the guys I was working with was operating our post hole borer; it penetrated the Anglian Water system and sent a geyser some 20 metres up into the sky, completely destroying my midday retreat to the local pub between the arduous exercise of digging holes. Had he had one of the services on offer, I suspect that we would not have been so detained. It was quite an entertaining incident, but it clearly showed the dangers of not having good mapping.
As I understand it, and as was outlined by the noble Lord, Lord Clement-Jones, since 2018 the Government have been moving towards this notion of somewhere recording what lies below the surface in our communities. We have had street works legislation going back several decades, from at least 1991. In general, progress towards better co-ordination of utilities excavations has not been helped by poor mapping and low levels of knowledge of which utilities are located where underground. This is despite the various legislative attempts to make that happen, most of which have sought to bring better co-ordination of services.
I start by thanking the noble Lords, Lord Clement-Jones and Lord Bassam, for their respective replies. As I have said, the Geospatial Commission has been engaging extensively with stakeholders, including the security services, on NUAR since 2018. This has included a call for evidence, a pilot project, a public consultation, focus groups, various workshops and other interactions. All major gas and water companies have signed up, as well as several large telecoms firms.
While the Minister is speaking, maybe the Box could tell him whether the figure of only 33% of asset owners having signed up is correct? Both the noble Lord, Lord Bassam, and I mentioned it; it would be very useful to know.
It did complete a pilot phase this year. As it operationalises, more and more will sign up. I do not know the actual number that have signed up today, but I will find out.
NUAR does not duplicate existing commercial services. It is a standardised, interactive digital map of buried infrastructure, which no existing service is able to provide. It will significantly enhance data sharing and access efficiency. Current services—
I am not sure that there is doubt over the current scope of NUAR; it is meant to address all buried infrastructure in the United Kingdom. LSBUD does make extensive representations, as indeed it has to parliamentarians of both Houses, and has spoken several times to the Geospatial Commission. I am very happy to commit to continuing to do so.
My Lords, the noble Lord, Lord Bassam, is absolutely right to be asking that question. We can go only on the briefs we get. Unlike the noble Lord, Lord Bassam, I have not been underground very recently, but we do rely on the briefings we get. LSBUD is described as a
“sustainably-funded UK success story”—
okay, give or take a bit of puff—that
“responds to most requests in 5 minutes or less”.
It has
“150+ asset-owners covering nearly 2 million km and 98% of high-risk assets—like gas, electric, and fuel pipelines”.
That sounds as though we are in the same kind of territory. How can the Minister just baldly state that NUAR is entirely different? Can he perhaps give us a paragraph on how they differ? I do not think that “completely different” can possibly characterise this relationship.
As I understand it, LSBUD services are provided as a PDF, on request. The service is not interactive; it is not vector-based graphics presented on a map, so it cannot be interrogated in the same way. Furthermore, as I understand it—and I am happy to be corrected if I am misstating—LSBUD has a great many private sector asset owners, but no public sector data is provided. All of it is provided on a much more manual basis. The two services simply do not bear comparison. I would be delighted to speak to LSBUD.
My Lords, we are beginning to tease out something quite useful here. Basically, NUAR will be pretty much an automatic service, because it will be available online, I assume, which has implications for data protection, for who owns the copyright and so on. I am sure there are all kinds of issues there. It is the way the service is delivered; and then you have the public sector, which has not taken part in LSBUD. Are those the two key distinctions?
Indeed, there are two key distinctions. One is the way that the information is provided online, in a live format, and the other is the quantity and nature of the data provided: under NUAR this will eventually be all relevant data in the United Kingdom, versus the data of those who choose to sign up to LSBUD and equivalent services. I am very happy to write on the various figures. Maybe it would help if I were to arrange a demonstration of the technology. Would that be useful? I will do that.
Unlike the noble Lord, Lord Bassam, I do not have that background in seeing what happens with the excavators, but I would very much welcome that. The Minister again is really making the case for greater co-operation. The public sector has access to the public sector information, and LSBUD has access to a lot of private sector information. Does that not speak to co-operation between the two systems? We seem to have warring camps, where the Government are determined to prove that they are forging ahead with their new service and are trampling on quite a lot of rights, interests and concerns in doing so—by the sound of it. The Minister looks rather sceptical.
I am not sure whose rights are being trampled on by having a shared database of these things. However, I will arrange a demonstration, and I confidently state that nobody who sees that demonstration will have any cynicism any more about the quality of the service provided.
All I can say is that, in that case, the Minister has been worked on extremely well.
In addition to the situation that the noble Lord, Lord Bassam, described, I was braced for something really horrible, because these things very often lead to danger and death, and there is a very serious safety argument for providing this information reliably and rapidly, as NUAR will.
Before the Minister’s peroration, I just want to check something. He talked about the discovery project and contact with the industry; by that, I assume he was talking about asset owners as part of the project. What contact is proposed with the existing company, LinesearchbeforeUdig, and some of its major supporters? Can the Government assure us that they will have greater contact or try to align? Can they give greater assurance than they have been able to give today? Clearly, there is suspicion here of the Government’s intentions and how things will work out. If we are to achieve this safety agenda—I absolutely support it; it is the fundamental issue here—more work needs to be done in building bridges, to use another construction metaphor.
As I said, the Geospatial Commission has met LinesearchbeforeUdig many times, and I would be happy to meet it in order to help it adapt its business model for the NUAR future. As I said, it has attended the last three discovery workshops, allowing this data to be shared.
I close by thanking noble Lords for their contributions. I hope they look forward to the demonstration.
My Lords, I congratulate the noble Baroness, Lady Kidron, on her amendment and thank her for allowing me to add my name to it. I agree with what she said. I, too, had the benefit of a meeting with the Lord Chancellor, which was most helpful. I am grateful to Mr Paul Marshall—whom the noble Baroness mentioned and who has represented several sub-postmasters in the Horizon scandal—for his help and advice in this matter.
My first short point is that evidence derived from a computer is hearsay. There is good reason for treating hearsay evidence with caution. Computer scientists know—although the general public do not—that only the smallest and least complex computer programs can be tested exhaustively. I am told that the limit for that testing is probably around 100 lines of a well-designed and carefully written program. Horizon, which Mr Justice Fraser said was not in the least robust, consisted of a suite of programs involving millions of lines of code. It will inevitably have contained thousands of errors because all computer programs do. Most computer errors do not routinely cause malfunctions. If they did, they would be spotted at an early stage and the program would be changed—but potentially with consequential changes to the program that might not be intended or spotted.
We are all aware of how frequently we are invited to accept software updates from our mobile telephones’ software manufacturers. Those updates are not limited to closing security chinks but are also required because bugs—or, as we learned yesterday from Paula Vennells’s husband, anomalies and exceptions—are inevitable in computer programs. That is why Fujitsu had an office dedicated not just to altering the sub-postmasters’ balances, shocking as that is, but to altering and amending a program that was never going to be perfect, because no computer program is.
The only conclusion that one can draw from all this is that computer programs are, as the noble Baroness said, inherently unreliable, such that having a presumption in law that they are reliable is unsustainable. In the case of the DPP v McKeown and Jones—in 1997, I think—Lord Hoffmann said:
“It is notorious that one needs no expertise in electronics to be able to know whether a computer is working properly”.
One must always hesitate before questioning the wisdom of a man as clever as Lord Hoffmann, but he was wrong. The notoriety now attaches to his comment.
The consequence of the repeal of Section 69 of the Police and Criminal Evidence Act 1984 has been to reduce the burden of proof, so that Seema Misra was sent to prison in the circumstances set out by the noble Baroness. Further, this matter is urgent for two reasons; they slightly conflict with each other, but I will nevertheless set them out. The first is that for the presumption to remain in place for one minute longer means that there is a genuine risk that miscarriages of justice will continue to occur in other, non-Post Office cases, from as early as tomorrow. The second is that any defence lawyer will, in any event, be treating the presumption as having been fatally undermined by the Horizon issues. The presumption will therefore be questioned in every court where it might otherwise apply. It needs consideration by Parliament.
My noble friend the Minister will say, and he will be right, that the Horizon case was a disgraceful failure of disclosure by the Post Office. But it was permitted by the presumption of the correctness of computer evidence, which I hope we have shown is unsustainable. Part of the solution to the problem may lie in changes to disclosure and discovery, but we cannot permit a presumption that we know to be unfounded to continue in law.
My noble friend may also go on to say that our amendment is flawed in that it will place impossible burdens on prosecutors, requiring them to get constant certificates of proper working from Microsoft, Google, WhatsApp, and whatever Twitter is called nowadays. Again, he may be right. We do not seek to bring prosecutions grinding to a halt, nor do we seek to question the underlying integrity of our email or communications systems, so we may need another way through this problem. Luckily, my noble friend is a very clever man, and I look forward to hearing what he proposes.
My Lords, we have heard two extremely powerful speeches; I will follow in their wake but be very brief. For many years now, I have campaigned on amending the Computer Misuse Act; the noble Lord, Lord Arbuthnot, did similarly. My motivation did not start with the Horizon scandal but was more at large, because of the underlying concerns about the nature of computer evidence.
I came rather late to this understanding about the presumption of the accuracy of computer evidence. It is somewhat horrifying, the more you look into the history of this, which has been so well set out by the noble Baroness, Lady Kidron. I remember advising MPs at the time about the Police and Criminal Evidence Act. I was not really aware of what the Law Commission had recommended in terms of getting rid of Section 69, or indeed what the Youth Justice and Criminal Evidence Act did in 1999, a year after I came into this House.
The noble Baroness has set out the history of it, and how badly wrong the Law Commission got this. She set out extremely well the impact and illustration of Mrs Misra’s case, the injustice that has resulted through the Horizon cases—indeed, not just through those cases, but through other areas—and the whole aspect of the reliability of computer evidence. Likewise, we must all pay tribute to the tireless campaigning of the noble Lord, Lord Arbuthnot. I thought it was really interesting how he described computer evidence as hearsay, because that essentially is what it is, and there is the whole issue of updates and bug fixing.
The one area that I am slightly uncertain about after listening to the debate and having read some of the background to this is precisely what impact Mr Justice Fraser’s judgment had. Some people seem to have taken it as simply saying that the computer evidence was unreliable, but that it was a one-off. It seems to me that it was much more sweeping than that and was really a rebuttal of the original view the Law Commission took on the reliability of computer evidence.
My Lords, I recognise the feeling of the Committee on this issue and, frankly, I recognise the feeling of the whole country with respect to Horizon. I thank all those who have spoken for a really enlightening debate. I thank the noble Baroness, Lady Kidron, for tabling the amendment and my noble friend Lord Arbuthnot for speaking to it and—if I may depart from the script—his heroic behaviour with respect to the sub-postmasters.
There can be no doubt that hundreds of innocent sub-postmasters and sub-postmistresses have suffered an intolerable miscarriage of justice at the hands of the Post Office. I hope noble Lords will indulge me if I speak very briefly on that. On 13 March, the Government introduced the Post Office (Horizon System) Offences Bill into Parliament, which is due to go before a Committee of the whole House in the House of Commons on 29 April. The Bill will quash relevant convictions of individuals who worked, including on a voluntary basis, in Post Office branches and who have suffered as a result of the Post Office Horizon IT scandal. It will quash, on a blanket basis, convictions for various theft, fraud and related offences during the period of the Horizon scandal in England, Wales and Northern Ireland. This is to be followed by swift financial redress delivered by the Department for Business and Trade.
On the amendment laid by the noble Baroness, Lady Kidron—I thank her and the noble Lords who have supported it—I fully understand the intent behind this amendment, which aims to address issues with computer evidence such as those arising from the Post Office cases. The common law presumption, as has been said, is that the computer which has produced evidence in a case was operating effectively at the material time unless there is evidence to the contrary, in which case the party relying on the computer evidence will need to satisfy the court that the evidence is reliable and therefore admissible.
This amendment would require a party relying on computer evidence to provide proof up front that the computer was operating effectively at the time and that there is no evidence of improper use. I and my fellow Ministers, including those at the MoJ, understand the intent behind this amendment, and we are considering very carefully the issues raised by the Post Office cases in relation to computer evidence, including these wider concerns. So I would welcome the opportunity for further meetings with the noble Baroness, alongside MoJ colleagues. I was pleased to hear that she had met with my right honourable friend the Lord Chancellor on this matter.
We are considering, for example, the way reliability of evidence from the Horizon system was presented, how failures of investigation and disclosure prevented that evidence from being effectively challenged, and the lack of corroborating evidence in many cases. These issues need to be considered carefully, with the full facts in front of us. Sir Wyn Williams is examining in detail the failings that led to the Post Office scandal. These issues are not straightforward. The prosecution of those cases relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was that the Post Office chose to withhold the fact that the computer evidence itself was wrong.
This amendment would also have a significant impact on the criminal justice system. Almost all criminal cases rely on computer evidence to some extent, so any change to the burden of proof would or could impede the work of the Crown Prosecution Service and other prosecutors.
Although I am not able to accept this amendment for these reasons, I share the desire to find an appropriate way forward along with my colleagues at the Ministry of Justice, who will bear the brunt of this work, as the noble Lord, Lord Clement-Jones, alluded to. I look forward to meeting the noble Baroness to discuss this ahead of Report. Meanwhile, I hope she will withdraw her amendment.
Can the Minister pass on the following suggestion? Paul Marshall, who has been mentioned by all of us, is absolutely au fait with the exact procedure. He has experience of how it has worked in practice, and he has made some constructive suggestions. If there is not a full return to Section 69, there could be other, more nuanced, ways of doing this, meeting the Minister’s objections. But can I suggest that the MoJ has contact with him and discusses what the best way forward would be? He has been writing about this for some years now, and it would be extremely useful, if the MoJ has not already engaged with him, to do so.
My Lords, I am afraid that I will speak to every single one of the amendments in this group but one, which is in the name of the noble Baroness, Lady Jones, and which I have signed. We have already debated the Secretary of State’s powers in relation to what will be the commission, in setting strategic priorities for the commissioner under Clause 32 and in recommending the adoption of an ICO code of practice before it is submitted to Parliament for consideration under Clause 33:
“Codes of practice for processing personal data”.
We have also debated Clause 34:
“Codes of practice: panels and impact assessments”.
And we have debated Clause 35:
“Codes of practice: Secretary of State’s recommendations”.
The Secretary of State has considerable power in relation to the new commission. On top of that, Clause 143 and Schedule 15 to the Bill provide significant further powers for the Secretary of State to interfere with the objective and impartial functioning of the information commission through the appointment of the non-executive members of the newly formed commission. The guarantee of the independence of the ICO is intended to ensure the effectiveness and reliability of its regulatory function, and that the monitoring and enforcement of data protection laws are carried out objectively and free from partisan or extra-legal considerations.
These amendments would limit the Secretary of State’s power and leeway to interfere with the objective and impartial functioning of the new information commission, in particular by modifying Schedule 15 to the Bill to transfer budget responsibility and the appointment process for the non-executive members of the information commission to the relevant Select Committee. If so amended, the Bill would ensure that the new information commission has sufficient arm’s-length distance from the Government to oversee public and private bodies’ uses of personal data with impartiality and objectivity. DSIT’s delegated powers memorandum to the DPRRC barely mentions any of these powers, so I am not surprised that they have attracted little comment; yet they are of considerable significance.
We have discussed data adequacy before; of course, in his letter to us, the Minister tried to rebut some of the points we made about it. In fact, he quoted somebody who has briefed me extensively on it and who has taken a very different view from the one he alleges she took, in a rather partial quotation from evidence taken by the European Affairs Committee, which is now conducting an inquiry into data adequacy and its implications for the UK-EU relationship. We were told by Open Rights Group attendees at a recent meeting with the European Commission that it expressed concern to those present about the risk that the Bill poses to the EU adequacy agreement; this was not under Chatham House rules, and it was said at a meeting at which a number of UK groups were present, which is highly significant in itself.
I mentioned the European Affairs Committee’s inquiry. I understand that the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs has also given written evidence setting out its concerns about this Bill, its impact on adequacy and how it could affect the agreement. It put its arguments rather strongly. Has the Minister seen this? Is he aware of the written evidence that it has given to the European Affairs Select Committee? I suggest that he becomes aware of it and takes a view on whether we need to postpone Report until we have seen the committee’s report. If the committee concludes that the Bill puts data adequacy at risk, the Government will have to go back to the drawing board in a number of respects, and it would be rather foolish if we had already gone through Report by that time. Far be it from me not to want the Government to have egg on their face, but it would be peculiar if they did not carefully observe the evidence being put to the European Affairs Select Committee and the progress of its inquiry. I beg to move.
My Lords, I thank the noble Lord, Lord Clement-Jones, for introducing his amendments so ably. When I read them, I had a strong sense of déjà vu: attempts by the Government to control the appointments and functioning of new regulators have been a common theme in other pieces of legislation that we have debated in this House, and we have always resisted them. In my experience, this occurred most recently in the Government’s proposals for the Office for Environmental Protection, which deals with EU environmental legislation taken into UK law and is effectively the environment regulator. We were able to get those proposals modified to limit the Secretary of State’s involvement; we should do so again here.
I very much welcome the noble Lord’s amendments, which give us a chance to assess what level of independence would be appropriate in this case. Schedule 15 covers the transition from the Information Commissioner’s Office to the new information commission, including the appointment of its chair and non-executive members. We support this development in principle, but it is crucial that the new arrangements strengthen rather than weaken the independence of the new commission.
The noble Lord’s amendments would rightly remove the rights of the Secretary of State to decide the number of non-executive members and to appoint them. Instead, his amendments propose that the chair of the relevant parliamentary committee should oversee appointments. Similarly, the amendments would remove the right of the Secretary of State to recommend the appointment and removal of the chair; again, this should be passed to the relevant parliamentary committee. We agree with these proposals, which would build in an additional tier of parliamentary oversight and help remove any suspicion that the Secretary of State is exercising unwarranted political pressure on the new commission.
The noble Lord’s amendments raise the question of what the relevant parliamentary committee might be. Although we are supportive of the wording as it stands, it is regrettable that we have not been able to make more progress on establishing a strong bicameral parliamentary committee to oversee the work of the information commission. However, in the absence of such a committee, we welcome the suggestion made in the noble Lord’s Amendment 256 that the Commons Science, Innovation and Technology Committee could fulfil that role.
Finally, we have tabled Amendment 259, which addresses what is commonly known as the “revolving door”, whereby public sector staff switch to jobs in the private sector and end up working for the very industries that they were previously investigating and regulating. This leads to accusations of cronyism and corruption; whether or not there is any evidence of this, it brings the reputation of the whole sector into disrepute. Perhaps I should have declared an interest at the outset: I am a member of the Advisory Committee on Business Appointments and therefore have a ringside view of the scale of the revolving door, particularly at the moment. We believe that it is time to put standards in public life back at the heart of public service; setting new standards on switching sides should be part of that. Our amendment would put a two-year ban on members of the information commission accepting employment from a business that was subject to enforcement action, or acting for persons being investigated by the commission.
I hope that noble Lords will see the sense and importance of these amendments. I look forward to the Minister’s response.
My Lords, I thank the Minister for his response, dusty though it may have been. The noble Baroness, Lady Jones, is absolutely right: this Government have form in all areas of regulation. In every area where legislation related to a regulator has come down the track, the Government have taken on more power and diminished parliamentary oversight rather than enhancing it.
It is therefore a little rich to say that accountability to Parliament is the essence of all this. That is not the impression one gets reading the data protection Bill; the impression one gets is that the Government are tightening the screw on the regulator. That was the case with Ofcom in the Online Safety Act; it is the case with the CMA; and the noble Baroness, Lady Jones, mentioned her experience as regards the environment. Wherever you look, the Government are tightening their control over the regulators. It is something the Industry and Regulators Committee has been concerned about, and we have tried to suggest various formulae. A Joint Committee of both Houses was proposed by the Communications and Digital Committee and endorsed by a number of other committees, such as the Joint Committee on the Draft Online Safety Bill; I believe it has even been commended by the Industry and Regulators Committee itself.
We need to crack this one. On the issue of parliamentary accountability and oversight of the regulator, the balance is not currently right. That applies particularly to appointments, in this case of the commissioner and the non-executives. The Minister very conveniently talked about removal, but this could equally be about renewal of a term, and it is certainly about appointment. So perhaps the Minister was a little selective in the example he chose to illustrate where control lies.
We are concerned about the independence of the regulator. The Minister did not give an answer, so I hope that he will write to say whether he knows what the European Affairs Select Committee is up to; I made a bit of a case on that. Evidence is coming in, and the relevant committee of the European Parliament has given evidence. The Minister, the noble Viscount, Lord Camrose, was guilty of this in a way, but the view taken on this side of the North Sea of the data adequacy question seems rather selective. The Government need to put themselves in the position of the Commission and the Parliament on the other side of the North Sea and ask, “What do we think are the factors that will endanger our data adequacy as seen from that side?” The Government are being overly complacent in regarding adequacy as “safe” once the Bill goes through.
It was very interesting to hear what the noble Baroness had to say about the revolving door issues. The notable thing about this amendment is how limited it is; it is not blanket. It would be entirely appropriate to have this in legislation, given the sensitivity of the roles that are carried out by senior people at the ICO.
However, I think we want to make more progress tonight, so I beg leave to withdraw my amendment.
My Lords, as ever, the noble Baroness, Lady Kidron, has nailed this issue. She has campaigned tirelessly in the field of child sexual abuse and has identified a major loophole.
What has been so important is learning from experience and seeing how these new generative AI models, which we have all had to come to terms with over the past 18 months, are so powerful in the hands of ordinary people who want to cause harm and commit sexual abuse. The important thing is that, under existing legislation, there are of course a number of provisions relating to creating deepfake child pornography, the circulation of pornographic deepfakes and so on. However, as the noble Baroness said, what the legislation does not do is go upstream to the AI system, the AI model itself, to make sure that those who develop those models are caught as well. That is what a lot of the discussion around deepfakes is about at the moment; it is, I would say, the most pressing issue. But it is also about trying to nail those AI system owners and users at the very outset, not waiting until something is circulated or, indeed, created in the first place. We need to get right up there at the outset.
I very much support what the noble Baroness said; I will reserve any other remarks for the next group of amendments.
My Lords, I am pleased that we were able to sign this amendment. Once again, the noble Baroness, Lady Kidron, has demonstrated her acute ability to dissect and to make a brilliant argument about why an amendment is so important.
As the noble Lord, Lord Clement-Jones, and others have said previously, what is the point of this Bill? Passing this amendment and putting these new offences on the statute book would give the Bill the purpose and clout that it has so far lacked. As the noble Baroness, Lady Kidron, has made clear, although it is currently an offence to possess or distribute child sex abuse material, it is not an offence to create these images artificially using AI techniques. So, quite innocent images of a child—or even an adult—can be manipulated to create child sex abuse imagery, pornography and degrading or violent scenarios. As the noble Baroness pointed out, this could be your child or a neighbour’s child being depicted for sexual gratification by the increasingly sophisticated AI creators of these digital models or files.
Yesterday’s report from the Internet Watch Foundation said that a manual found on the dark web encourages the use of “nudifying” tools to remove clothes from images of children, which can then be used to blackmail those children into sending more graphic content. The IWF reports that the scale of this abuse is increasing year on year, with 275,000 web pages containing child sex abuse found last year; I suspect that this is the tip of the iceberg, as much of this activity occurs on the dark web, which is very difficult to track. The noble Baroness, Lady Kidron, made a powerful point: there is a danger that access to such material will also encourage offenders who then want to participate in real-world child sex abuse, so the scale of the horror could be multiplied. There are many reasons why these trends are shocking and abhorrent. It seems that, as ever, the offenders are one step ahead of the legislation that police enforcement needs to close down this trade.
As the noble Baroness, Lady Kidron, made clear, this amendment is “laser focused” on criminalising those who are developing and using AI to create these images. I am pleased to say that Labour is already working on a ban on creating so-called nudification tools. The prevalence of deepfakes and child abuse on the internet is increasing the public’s fear of the overall safety of AI, so we need to win their trust back if we are to harness the undoubted benefits that it can deliver to our public services and economy. Tackling this area is one step towards that.
Action to regulate AI by requiring transparency and safety reports from all those at the forefront of AI development should be a key part of that strategy, but we have a particular task to do here. In the meantime, this amendment is an opportunity for the Government to take a lead on these very specific proposals to help clean up the web and rid us of these vile crimes. I hope the Minister can confirm that this amendment, or a government amendment along the same lines, will be included in the Bill. I look forward to his response.
My Lords, I will speak to all the amendments in this group, other than Amendment 295 from the noble Baroness, Lady Jones. Without stealing her thunder, I very much support it, especially in an election year and in the light of the deepfakes we have already seen in the political arena—those of Sadiq Khan, those used in the Slovakian election and the audio deepfakes of the President of the US and Sir Keir Starmer. This is a real issue and I am delighted that she has put down this amendment, which I have signed.
In another part of the forest, the recent spread of deepfake photos purporting to show Taylor Swift engaged in explicit acts has brought new attention to the growing use of deepfake images, video and audio to harass women and commit fraud. Women constitute 99% of the victims, and the most visited deepfake site had 111 million users in October 2023. More recently, children have been found using “declothing” apps, which I think the noble Baroness mentioned, to create explicit deepfakes of other children.
Deepfakes also present a growing threat to elections and democracy, as I have mentioned, and the problems are increasingly rampant. Deepfake fraud rates rose by 3,000% globally in 2023, and it is hardly surprising that, in recent polling, 86% of the UK population supported a ban on deepfakes. I believe that the public are demanding an urgent solution to this problem. The only effective way to stop deepfakes, analogous to what the noble Baroness, Lady Kidron, has been so passionately advocating, is for the Government to ban them at every stage, from production to distribution. Legal liability must attach to those who produce deepfake technology, those who create and enable deepfake content, and those who facilitate its spread.
Existing legislation seeks to limit the spread of images on social media, but this is not enough. The recent images of Taylor Swift were removed from X and Telegram, but not before one picture had been viewed more than 47 million times. Digital watermarks are not a solution, as shown by a paper from world-leading AI researchers released in 2023, which concluded that
“strong and robust watermarking is impossible to achieve”.
Without measures across the supply chain to prevent the creation of deepfakes, the law will forever be playing catch-up.
The Government now intend to ban the creation of sexual imagery deepfakes; I welcome this and have their announcement in my hand:
“Government cracks down on ‘deepfakes’ creation”.
This will send a clear message that the creation of these intimate images is not acceptable. However, the announcement appears to cover only sexual image deepfakes. These are the most prevalent form, but other forms of deepfake are also causing noticeable and rapidly growing harms, most obviously political deepfakes, as the noble Baroness, Lady Jones, will illustrate, and deepfakes used for fraud. The announcement also appears to cover only the endpoint, the creation of deepfakes, not the supply chain leading up to that point. There are whole apps and companies dedicated to the creation of deepfakes, and they should not exist. There are industries providing legitimate services, such as generative AI and cloud computing, which fail to take adequate measures and end up enabling the creation of deepfakes. They should take such measures or face legal accountability.
The Government’s new measures are intended to be introduced through an amendment to the Criminal Justice Bill, which is, I believe, currently between Committee and Report in the House of Commons. As I understand it, however, there is no date scheduled yet for Report, as the Bill seems to be caught in a battle over amendments.
The law will, however, be extremely difficult to enforce. Perpetrators can hide behind anonymity and are often difficult to identify, even when victims or the authorities are aware that deepfakes have been created. The only reliable and effective countermeasure is to hold the whole supply chain responsible for deepfake creation and proliferation. All parties involved in the AI supply chain, from AI model developers and providers to cloud compute providers, must demonstrate that they have taken steps to preclude the creation of deepfakes. This approach is similar to the way in which society combats, or, as I hope the Minister will concede to the noble Baroness, Lady Kidron, will combat, child abuse material and malware.
My Lords, I speak to Amendments 293 and 294 from the noble Lord, Lord Clement-Jones, Amendment 295 proposed by my noble friend Lady Jones and Amendments 295A to 295F, also in the name of the noble Lord, Lord Clement-Jones.
Those noble Lords who are avid followers of my social media feeds will know that I am an advocate of technology. Advanced computing power and artificial intelligence offer enormous opportunities, and the technologies themselves are not the problem. However, the intentions of those who use them can be malign or criminal, and the speed of technological development is outpacing legislators around the world. We are constantly in danger of creating laws that close the stable door long after the virtual horse has bolted.
The remarkable progress of visual and audio technology has its roots in the entertainment industry. It has been used to complete or reshoot scenes in films when actors were unavailable or, in some cases, had died before filming was completed. It has also enabled film-makers to introduce new characters, or younger versions of iconic heroes, for sequels or prequels in movie franchises. This enabled us to see a resurrected Sir Alec Guinness, a younger version of Luke Skywalker and a de-aged Indiana Jones on our screens.
The technology that can do this is only around 15 years old and, until about five years ago, it required extremely powerful computers, expensive resources and advanced technical expertise. The first malicious use of deepfakes occurred when famous actors and celebrities, usually women, had their faces superimposed on to the bodies of participants in pornographic videos. These were then marketed online as Hollywood stars’ sex tapes or similar, making money for the producers while causing enormous distress to the women targeted. More powerful processors inevitably mean that what was once very expensive becomes much cheaper very quickly. An additional factor has turbo-boosted this issue: generative AI. Computers can now learn to create images, sound and moving video almost independently of software specialists. It is no longer just famous women who are the targets of sexually explicit deepfakes; it could be anyone.
Amendment 293 directly addresses this horrendous practice, and I hope that there will be widespread support for it. In an increasingly digital world, we spend more time in front of our screens, getting information and entertainment on our phones, laptops, iPads and smart TVs. What was once an expensive technology used to titillate, to entertain or for comedic purposes has developed an altogether darker presence, well beyond the reach of most legislation.
In addition to explicit sexual images, deepfakes are known to have been used to embarrass individuals, misrepresent public figures, enable fraud, manipulate public opinion and influence democratic political elections and referendums. This damages people individually: those whose images or voices are faked and those who are taken in by the deepfakes. Trusted public figures, celebrities and spokespeople face reputational and financial damage when their voices or images are used to endorse fake products or to harvest data. Those who are encouraged to click through risk losing money to fraudsters, being targeted for scams, or having their personal and financial data leaked or sold on. There is growing evidence that information gathered under false pretences can be used for profiling in co-ordinated misinformation campaigns, for darker financial purposes or for political exploitation.
In passing, it is worth remembering that deepfakes are not always images of people. Last year, crudely generated fake images of an explosion, purportedly at the Pentagon, caused the Dow Jones Industrial Average to drop 85 points within four minutes of the image being published and triggered emergency response procedures from local law enforcement before the image was debunked 20 minutes later. The power of a single image, carefully placed and spreading virally, shows the enormous and rapid economic damage that deepfakes can cause.
Amendment 294 would make it an offence for a person to generate a deepfake for the purpose of committing fraud, and Amendment 295 would make it an offence to create deepfakes of political figures, particularly when they risk undermining electoral integrity. We support all the additional provisions in this group of amendments; Amendments 295A to 295F outline the requirements, duties and definitions necessary to ensure that those creating deepfakes can be prosecuted.
I bring to your Lordships’ attention the wording of Amendment 295, which, as well as making it an offence to create a deepfake, goes a little further. It also makes it an offence to send a communication which has been created by artificial intelligence and which is intended to create the impression that a political figure has said or done something that is not based in fact. This touches on what I believe to be a much more alarming aspect of deepfakes: the manner in which false information is distributed.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones of Whitchurch, for tabling the amendments in this important group. I very much share the concerns about all the uses of deepfake images that are highlighted by these amendments. I will speak more briefly than I otherwise would with a view to trying to—
My Lords, I would be very happy to get a letter from the Minister.
I would be happy to write one. I will go for the abbreviated version of my speech.
I turn first to the part of the amendment that seeks to criminalise the creation, alteration or other generation of deepfake images depicting a person engaged in an intimate act. The Government recognise that there is significant public concern about the simple creation of sexually explicit deepfake images, which is why they have announced their intention to table an amendment to the Criminal Justice Bill, currently in the other place, to criminalise the creation of purported sexual images of adults without consent.
The noble Lord’s Amendment 294 would create an offence explicitly targeting the creation or alteration of deepfake content when a person knows or suspects that the deepfake will be or is likely to be used to commit fraud. It is already an offence under Section 7 of the Fraud Act 2006 to generate software or deepfakes known to be designed for or intended to be used in the commission of fraud, and the Online Safety Act lists fraud as a priority offence and as a relevant offence for the duties on major services to remove paid-for fraudulent advertising.
Amendment 295 in the name of the noble Baroness, Lady Jones of Whitchurch, seeks to create an offence of creating or sharing political deepfakes. The Government recognise the threats to democracy that harmful actors pose. At the same time, the UK also wants to ensure that we safeguard the ability for robust debate and protect freedom of expression. It is crucial that we get that balance right.
Let me first reassure noble Lords that the UK already has criminal offences that protect our democratic processes, such as the National Security Act 2023 and the false communications offence introduced in the Online Safety Act 2023. It is also already an election offence to make false statements of fact about the personal character or conduct of a candidate or about the withdrawal of a candidate before or during an election. These offences have appropriate tests to ensure that we protect the integrity of democratic processes while also ensuring that we do not impede the ability for robust political debate.
I assure noble Lords that we continue to work across government to ensure that we are ready to respond to the risks to democracy from deepfakes. The Defending Democracy Taskforce, which seeks to protect the democratic integrity of the UK, is engaging across government and with Parliament, the UK’s intelligence community, the devolved Administrations, local authorities and others on the full range of threats facing our democratic institutions. We also continue to meet regularly with social media companies to ensure that they continue to take action to protect users from election interference.
Turning to Amendments 295A to 295F, I thank the noble Lord, Lord Clement-Jones, for them. Taken together, they would in effect establish a new regulatory regime in relation to the creation and dissemination of deepfakes. The Government recognise the concerns raised around harmful deepfakes and have already taken action against illegal content online. We absolutely recognise the intention behind these amendments but they pose significant risks, including to freedom of expression; I will write to noble Lords about those in order to make my arguments in more detail.
For the reasons I have set out, I am not able to accept these amendments. I hope that the noble Lord will therefore withdraw his amendment.
My Lords, I thank the Minister for that rather breathless response and his consideration. I look forward to his letter. We have arguments about regulation in the AI field; this is, if you like, a subset of that—but a rather important subset. My underlying theme is “must try harder”. I thank the noble Lord, Lord Leong, for his support and pay tribute to Control AI, which is vigorously campaigning on this subject in terms of the supply chain for the creation of these deepfakes.
Pending the Minister’s letter, which I look forward to, I beg leave to withdraw my amendment.
My Lords, what a relief—we are at the final furlong.
The UK is a world leader in genomics, which is becoming an industry of strategic importance for future healthcare and prosperity, but, frankly, we must do more to protect the genomics sector from systemic competitors that wish to dominate this industry for economic advantage or nefarious purposes. Genomic sequencing, the process of determining the entirety of an organism’s DNA, is playing an increasing role in our NHS, which has committed to being the first national healthcare system to offer whole-genome sequencing as part of routine care. However, like other advanced technologies, the sector is exposed to data privacy and national security risks. Its dual-use potential means that it could also be used to create targeted bioweapons or genetically enhanced military personnel. We must ensure that a suitable data protection environment exists to maintain the UK’s world-leading status.
So, how are we currently mitigating such threats, and why is our existing approach so flawed? Although I welcome initiatives such as the Trusted Research campaign and the Research Collaboration Advice Team, these bodies focus specifically on research and academia. We expect foreign companies that hold sensitive genomic and DNA data to follow the GDPR. I am not a hawk about relations with other countries, but we need to provide the new Information Commissioner with much greater expertise and powers to tackle complex data security threats in sensitive industries. There must be no trade-off between scientific collaboration and data privacy; that is what this amendment is designed to prevent. I beg to move.
The Committee will be relieved to know that I will be brief. I do not have much to say because, in general terms, this seems an eminently sensible amendment.
We should congratulate the noble Lord, Lord Clement-Jones, on his drafting ingenuity. He has managed to compose an amendment that brings together the need for scrutiny of emerging national security and data privacy risks relating to advanced technology, aims to inform regulatory developments and guidance that might be required to mitigate risks, and would protect the privacy of people’s genomics data. It also picks up along the way the issue of the security services scrutinising malign entities and guiding researchers, businesses, consumers and public bodies. Bringing all those things together at the end of a long and rather messy Bill is quite a feat—congratulations to the noble Lord.
I am rather hoping that the Minister will tell the Committee either that the Government will accept this wisely crafted amendment or that everything it contains is already covered. If the latter is the case, can he point noble Lords to where those things are covered in the Bill? Can he also reassure the Committee that the safety and security issues raised by the noble Lord, Lord Clement-Jones, are covered? Having said all that, we support the general direction of travel that the amendment takes.
Nothing makes me happier than the noble Lord’s happiness. I thank him for his amendment and the noble Lord, Lord Bassam, for his points; I will write to them on those, given the Committee’s desire for brevity and the desire to complete this stage tonight.
I wish to say some final words. I sincerely thank the Committee for its vigorous—I think that is the right word—scrutiny of this Bill. We have not necessarily agreed on a great deal, but I am in awe of the level of scrutiny and the commitment to making the Bill as good as possible. Let us be absolutely honest—this is not the most entertaining subject, but it is something that we all take extremely seriously, and I pay tribute to the Committee for its work. I also extend sincere thanks to the clerks and our Hansard colleagues for agreeing to stay a little later than planned, although that may not even be necessary. I very much look forward to engaging with noble Lords again before and during Report.
My Lords, I thank the Minister, the noble Baroness, Lady Jones, and all the team. I also thank the noble Lord, Lord Harlech, whose first name we now know; these things are always useful to know. This has been quite a marathon. I hope that we will have many conversations between now and Report. I also hope that Report is not too early as there is a lot to sort out. The noble Baroness, Lady Jones, and I will be putting together our priority list imminently but, in the meantime, I beg leave to withdraw my amendment.