Lords Chamber

My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—but nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.
I thank the noble Baroness, Lady Kidron, and the noble Viscount, Lord Camrose, for their proposed amendments and continued interest in Part 1 of this Bill. I hope I can reassure the noble Baroness that the definition of customer data is purposefully broad. It encompasses information relating to a customer or a trader and the Government consider that this would indeed include inferred data. The specific data to be disclosed under a smart data scheme will be determined in the context of that scheme and I reassure the noble Baroness that there will be appropriate consultation before a smart data scheme is introduced.
I turn to Amendment 5. Clause 13 provides statutory authority for the Secretary of State or the Treasury to give financial assistance to decision-makers, enforcers and others for the purpose of meeting any expense in the exercise of their functions in the smart data schemes. Existing and trusted bodies such as sector regulators will likely lead the delivery of new schemes. These bodies will act as decision-makers and enforcers. It is intended that smart data schemes will be self-financing through the fees and levies provided for by Clauses 11 and 12. However, because of the nature of the bodies that are involved, it is deemed appropriate for there to be a statutory spending authority as a backstop provision if that is necessary. Any spending commitment of resources will, of course, be subject to the usual estimates process and to existing public sector spending controls and transparency requirements.
I hope that with this brief explanation of the types of bodies involved, and the other explanations, the noble Baroness will be content to withdraw Amendment 1 and that noble Lords will not press Amendment 5.
My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.
A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.
There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; they can promote innovation by facilitating data-sharing; and they can promote innovation in the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.
It is high time we move forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from that.
I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party, to act on the group’s behalf in their data use.
There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.
I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.
Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.
My Lords, I rise to speak to Amendments 2, 3, 4, 25, 42 and 43. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for these amendments on data communities, which were previously tabled in Committee, and for the new clauses linking these with the Bill’s clauses on smart data.
As my noble friend Lady Jones noted in Committee, the Government support giving individuals greater agency over their data. The Government are strongly supportive of a robust regime of data subject rights and believe strongly in the opportunity presented by data for innovation and economic growth. UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. Stakeholders have, however, said that there may be barriers to this in practice.
I reassure noble Lords that the Government are actively exploring how we can support data intermediaries while maintaining the highest data protection standards. It is our intention to publish a call for evidence in the coming weeks on the activities of data intermediaries and the exercise of data subject rights by third parties. This will enable us to ensure that the policy settings on this topic are right.
In the context of smart data specifically, Part 1 of the Bill does not limit who the regulations may allow customers to authorise. Bearing in mind the IT and security-related requirements inherent in smart data schemes, provisions on who a customer may authorise are best determined in the context of a specific scheme, when the regulations are made following appropriate consultation. I hope to provide some additional reassurance that exercise of the smart data powers is subject to data protection legislation and does not displace data rights under that legislation.
There will be appropriate consultation, including with the Information Commissioner’s Office, before smart data schemes are introduced. This year, the Department for Business and Trade will be publishing a strategy on future uses of these powers.
While the smart data schemes and digital verification services are initial examples of government action to facilitate data portability and innovative uses of data, my noble friend Lady Jones previously offered a meeting with officials and the noble Baroness, Lady Kidron, to discuss these proposals, which I know my officials have arranged for next week—as the noble Baroness indicated earlier. I hope she is therefore content to withdraw her amendment.
My Lords, I very much support the amendments from the noble Lords, Lord Lucas and Lord Arbuthnot, particularly Amendment 6, about accuracy. It has become apparent—and Committee stage was interesting—that there is a challenge with having gender and sex as interchangeable. The problem becomes physical, because you cannot avoid the fact that you will react differently medically to certain things according to the sex you were born with and to your DNA.
That can be very dangerous in two cases. The first case is where drugs or cures are being administered by someone who thinks they are treating a patient of one sex but they are actually a different sex. That could all too easily kill someone. The second case is if you are doing medical research and relying on something, but then find that half the research is invalid because a person is not actually that sex but has decided to choose another gender. Therefore, all the research on that person could be invalid. That could lead to cures being missed, other things being diagnosed as being all right, and a lot of dangers.
As a society, we have decided that it will be all right for people to change gender—let us say that, as I think it is probably the easiest way to describe it. I do not see any problem with that, but we need critical things to be kept on records that are clearly separate. Maybe we can make decisions in Parliament, or wherever, about what you are allowed to declare on identity documents such as a passport. We need to have two things: one is sex, which is immutable, and therefore can help with all the other things behind the scenes, including research and treatments; the other is gender, which can be what you wish to declare, and society accepts that you can declare yourself as being of another gender. I cannot see any way round that. I have had discussions with people about this, and as one who would have said that this is quite wrong and unnecessary, I was convinced by the end of those discussions that it was right. Keeping the two separate in our minds would solve a lot of problems. These two amendments are vital for that.
I agree in many ways with the points from the noble Lord, Lord Clement-Jones. Just allowing some of these changes to be made by the stroke of a pen—a bit like someone is doing across the Atlantic—without coming to Parliament is perhaps sometimes unwise. The combined wisdom of Parliament, looking at things from a different point of view, and possibly with a more societal point of view than the people who are trying to make systems work on a governmental basis, can be sensible and would avoid other mistakes being made. I certainly support his amendments, but I disagree entirely with his last statement, where he did not support the noble Lords, Lord Lucas and Lord Arbuthnot.
I thank my noble friend Lord Lucas for introducing this group and for bringing these important and sometimes very difficult matters to the attention of the House. I will address the amendments slightly out of order, if I may.
For digital verification services to work, the information they have access to and use to verify documents must be accurate; this is, needless to say, critical to the success of the entire scheme. Therefore, it is highly sensible for Amendment 8 to require public authorities, when they disclose information via the information gateway, to ensure that it is accurate and reliable and that they can prove it. By the same measure, Amendment 6, which requires the Secretary of State to assess whether the public authorities listed are collecting accurate information, is equally sensible. These amendments as a pair will ensure the reliability of DVS services and encourage the industry to flourish.
I would like to consider the nature of accurate information, especially regarding an individual’s biological sex. It is possible for an individual to change their recorded sex on their driving licence or passport, for example, without going through the process of obtaining a gender recognition certificate. Indeed, a person can change the sex on their birth certificate if they obtain a GRC, but many would argue that changing some words on a document does not change the reality of a person’s genome, physical presentation and, in some cases, medical needs, meaning that the information recorded does not accurately relate to their sex. I urge the Minister to consider how best to navigate this situation, and to acknowledge that it is crucially important, as we have heard so persuasively from the noble Earl, Lord Erroll, and my noble friends Lord Arbuthnot and Lord Lucas, that a person’s sex is recorded accurately to facilitate a fully functioning DVS system.
The DVS trust framework has the potential to rapidly transform the way identities and information are verified. It should standardise digital verification services, ensure reliability and build trust in the concept of a digital verification service. It could seriously improve existing, cumbersome methods of verifying information, saving companies, employers, employees, landlords and tenants time and money. Personally, I have high hopes of its potential to revolutionise the practices of recruitment. I certainly do not know many people who would say no to less admin. If noble Lords are minded to test the opinion of the House, we will certainly support them with respect to Amendments 6 and 8.
With the greatest respect to the noble Lord, Lord Clement-Jones, I think it is a mistake to regard this as part of some culture war struggle. As I understand it, this is about accuracy of data and the importance, for medical and other reasons, of maintaining accurate data.
All the benefits of DVS cannot be to the detriment of data privacy and data minimisation. Parliament is well-practised at balancing multiple competing concepts and doing so with due regard to public opinion. Therefore, Amendment 7 is indeed a sensible idea.
Finally, Amendment 9 would require the Secretary of State to review whether an offence of false use of identity documents created or verified by a DVS provider is needed. This is certainly worth consideration. I have no doubt that the Secretary of State will require DVS providers to take care that their services are not being used with criminal intent, and I am quite sure that DVS providers do not want to facilitate crimes. However, the history of technology is surely one of high-minded purposes corrupted by cynical practices. Therefore, it seems prudent for the Secretary of State to conduct a review into whether creating this offence is necessary and, if it is, the best way that it can be laid out in law. I look forward to hearing the Minister’s comments on this and other matters.
I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support for this being such an important thing to make life easier for people.
I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.
However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.
My Lords, I will speak to Amendments 11 and 13 in my name and that of my noble friend Lord Markham. The national underground asset register contains the details of all underground assets and apparatus in England, Wales and Northern Ireland, or at any rate it will do as it goes forward. This includes water pipes, electricity cables, internet cables and fibres—details of the critical infrastructure necessary to sustain the UK as we know it.
Needless to say, there are many hostile actors who, if they got their hands on this information, would or could use it to commit appalling acts of terror. I am mindful of and grateful for the Government’s assurances given in Committee that it is and will be subject to rigorous security measures. However, the weakest link in cyber defence is often third-party suppliers and other partners who do not recognise the same level of risk. We should take every possible measure to ensure that the vital data in NUAR is kept safe and shared only with stakeholders who have the necessary security provisions in place.
For this reason, I have tabled Amendment 11, which would require the Secretary of State to provide guidance to relevant stakeholders on the cybersecurity measures which should be in place before they receive information from NUAR. I do not believe this would place a great burden on government departments, as appropriate cybersecurity standards already exist. The key is to ensure that they are duly observed.
I cannot overstate the importance of keeping this information secure, but I doubt noble Lords need much convincing on that score. Given how frighteningly high the stakes are, I strongly urge the most proactive possible approach to cybersecurity, advising stakeholders and taking every possible step to keep us all safe.
Amendment 13, also tabled in my name, requires the Registrar-General to make provisions to ensure the cybersecurity of the newly digitised registers of births, still-births, and deaths. There are a great many benefits in moving from a paper-based register of births and deaths to a digitised version. People no longer have to make the trip to sign the register in person, saving time and simplifying the necessary admin at very busy or very difficult points in people’s lives. It also reduces the number of physical documents that need to be maintained and kept secure. However, in digitising vast quantities of personal, valuable information, we are making a larger attack surface which will appeal to malign actors looking to steal personal data.
I know we discussed this matter in Committee, when the noble Baroness the Minister made the point that this legislation is more about a digitisation drive, in that all records will now be digital rather than paper and digital. While I appreciate her summary, I am not sure it addresses my concerns about the security risks of shifting to a purely digital model. We present a large and tempting attack surface, and the absence of paper back-ups increases the value of digital information even more, as it is the only register. Of course, there are already security measures in place for the digital copies of these registers. I have no doubt we have back-ups and a range of other fallback opportunities. But the same argument applies.
Proactive cybersecurity provisions are required, taking into account the added value of these registers and the ever-evolving threat we face from cybercriminals. I will listen with great interest to the thoughts of other noble Lords and the Minister.
My Lords, I thank the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, for these amendments. Clause 56 forms part of the NUAR provisions. The security of NUAR remains of the utmost importance. Because of this, the Government have closely involved a wide range of security stakeholders in the development of NUAR, including the National Protective Security Authority and security teams from the asset owners themselves. Providing clear acceptable use and usage policies for any digital service is important. As such, we intend to establish clear guidance on the appropriate usage of NUAR, including what conditions end users must fulfil before gaining access to the service. This may include cybersecurity arrangements, as well as personal vetting. However, we do not feel it appropriate to include this in the Bill.
Care must be taken when disclosing platform-specific cybersecurity information, as this could provide bad actors with greater information to enable them to counter these measures, ultimately making NUAR less secure. Furthermore, regulations made in relation to access to information from NUAR would be subject to the affirmative procedure. As such, there will be future opportunities for relevant committees to consider in full these access arrangements, including, on an individual basis, any security impacts. I therefore reassure noble Lords that these measures will ensure that access to NUAR data is subject to appropriate safeguards.
I thank the Minister for his considered reply. It is clear that the Government and the department are taking the issue of security with all due seriousness. However, I remain concerned, particularly about the move to NUAR as a highly tempting attack surface for malign actors. In light of this, I am minded to test the opinion of the House.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.
Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.
There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.
That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.
As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.
I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.
On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.
However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.
(2 days, 23 hours ago)
My Lords, I thank the noble Lord, Lord Clement-Jones, for raising these significant issues. While I share some of the concerns expressed, I find myself unable—at least for the moment—to offer support for the amendments in their current form.
Amendment 17 seeks to remove the powers granted to the Secretary of State to override primary legislation and to modify aspects of UK data protection law via statutory instrument. I agree with the principle underpinning this amendment: that any changes to data protection law must be subject to appropriate scrutiny. It is essential that parliamentary oversight remains robust and meaningful, particularly when it comes to matters as sensitive and far-reaching as data protection.
However, my hesitation lies in the practical implications of the amendment. While I sympathise with the call for greater transparency, I would welcome more detail on how this oversight mechanism might work in practice. Would it involve enhanced scrutiny procedures or a stronger role for relevant parliamentary committees? I fear that, without this clarity, we risk creating uncertainty in an area that requires, above all, precision and confidence.
The Minister’s Amendment 18 inserts specific protections for children’s personal data into the UK GDPR framework. The Government have rightly emphasised the importance of safeguarding children in the digital age. I commend the intention behind the amendment and agree wholeheartedly that children deserve special protections when it comes to the processing of their personal data.
It is worth noting that this is a government amendment to their own Bill. While Governments amending their own legislation is not unprecedented—the previous Government may have indulged in the practice from time to time—it is a practice that can give rise to questions. I will leave my comments there; obviously it is not ideal, but these things happen.
Finally, Amendment 21, also tabled by the noble Lord, Lord Clement-Jones, mirrors Amendment 17 in seeking to curtail the Secretary of State’s powers to amend primary legislation via statutory instrument. My earlier comments on the importance of parliamentary oversight apply here. As with Amendment 17, I am of course supportive of the principle. The delegation of such significant powers to the Executive should not proceed without robust scrutiny. However, I would appreciate greater clarity on how this proposed mechanism would function in practice. As it stands, I fear that the amendment raises too many questions. If these concerns could be addressed, I would be most grateful.
In conclusion, these amendments raise important points about the balance of power between the Executive and Parliament, as well as the protection of vulnerable individuals in the digital sphere. I look forward to hearing more detail and clarity, so that we can move forward with confidence.
My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.
Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.
In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23.1 of the UK GDPR.
The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.
I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23.1 of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.
My Lords, as we reach the end of this important group, I particularly thank my noble friend Lady Harding for her contribution and detailed account of some of the issues being faced, which I found both interesting and valuable. I thought the example about the jazz concert requiring the combination of those different types of data was very illuminating. These proposed changes provide us with the opportunity to balance carefully economic growth with the fundamental right to data privacy, ensuring that the Bill serves all stakeholders fairly.
Amendment 24 introduces a significant consideration regarding the use of the open electoral register for direct marketing purposes. The proposal to include data from the OER, combined with personal data from other sources, to build marketing profiles creates a range of issues that require careful consideration.
Amendment 24 stipulates that transparency obligations must be fulfilled when individuals provide additional data to a data provider, and that this transparency should be reflected both in the privacy policy and via a data notification in a direct mail pack. While there is certainly potential to use the OER to enhance marketing efforts and support economic activity, we have to remain vigilant to the privacy implications. We need to make sure that individuals are informed of how and where their OER data is being processed, especially when it is combined with other data sources to build profiles.
The requirement for transparency is a positive step, but it is essential that these obligations are fully enforced and that individuals are not left in the dark about how their personal information is being used. I hope the Minister will explain a little more about how these transparency obligations will be implemented in practice and whether additional safeguards are proposed.
Amendment 49 introduces a change to Regulation 22, creating an exception for charities to use electronic mail for direct marketing in specific circumstances. This amendment enables charities to send direct marketing emails when the sole purpose is to further one or more of their charitable purposes, provided that certain conditions are met. These conditions include that the charity obtained the recipient’s contact details when the individual expressed interest in, or previously offered support for, the charity. This provision recognises the role of charities in fundraising and the fact that their ability to communicate with volunteers, supporters and potential donors is vital to their work.
However, I understand the argument that we must ensure that the use of email marketing does not become intrusive or exploitative. The amendment requires that recipients are clearly informed about their right to refuse future marketing communications and that this option is available both when the data is first collected and with every subsequent communication. This helps strike the right balance between enabling charities to raise funds for their causes and protecting individuals from unwanted marketing.
I welcome the Government’s commitment to ensuring that charities can continue to engage with their supporters while respecting individuals’ right to privacy. However, it is essential that these safeguards are robustly enforced to prevent exploitation. Again, I look forward to hearing from the Minister on how the Government plan to ensure that these provisions are properly implemented and monitored.
Amendment 50 introduces the concept of soft opt-ins for email marketing by charities, allowing them to connect with individuals who have previously expressed interest in their charitable causes. This can help charities maintain and grow their supporter base but, again, we must strike the right balance with the broader impact this could have on people in receipt of this correspondence. It is crucial that any system put in place respects individuals’ right to privacy and their ability to opt out easily. We must ensure that charities provide a clear, simple and accessible way for individuals to refuse future communications, and that this option is consistently available.
Finally, we should also consider the rules governing the use of personal data by political parties. This is, of course, an area where we must ensure that transparency, accountability and privacy are paramount. Political parties, like any other organisation, must be held to the highest standards in their handling of personal data. I hope the Government can offer some clear guidance on improving and strengthening the rules surrounding data use by political parties to ensure that individuals’ rights are fully respected and protected.
My Lords, I rise to speak to Amendments 26, 31 and 32 tabled in my name and that of my noble friend Lord Markham. I will address the amendments in reverse order.
Amendment 32 would ensure that, where a significant decision is taken by ADM, the data subject was able to request intervention by a human with sufficient competency and authority. While that is clearly the existing intent of the ADM provisions in the Bill, this amendment brings further clarity. I am concerned that, where data processors update their ADM procedures in the light of this Bill, it should be abundantly clear to them at every stage what the requirements are and that, as currently written, there may be a risk of misunderstanding. Given the significance of decisions that may be made by ADM, we should make sure this does not happen. Data subjects must have recourse to a person who both understands their problem and is able to do something about it. I look forward to hearing the Minister’s views on this.
Amendment 31 would require the Secretary of State to provide guidance on how consent should be obtained for ADM involving special category data. It would also ensure that this guidance was readily available and reviewed frequently. The amendment would provide guidance for data controllers who wish to use ADM, helping them to set clear processes for obtaining consent, thus avoiding complaints and potential litigation.
We all know that litigation can be slow, disruptive and sometimes prohibitively expensive. If we want to encourage the use of ADM so that customers and businesses can save both time and money, we should seek to ensure that the sector does not become a hotbed of litigation. The risk can be mitigated by providing ample guidance for the sector. For relatively minimal effort on the part of the Secretary of State, we may be able to facilitate substantial growth in the use and benefits of ADM. I would be most curious to hear the Minister’s opinions on this matter and, indeed, the opinions of noble Lords more broadly.
Amendment 26 would insert the five principles set out in the AI White Paper published by the previous Government, requiring all data controllers and processors who partake in AI-driven ADM to have due regard for them. In the event that noble Lords are not familiar with these principles, they are: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real, and popular, safeguards against the risks of AI while continuing to foster innovation.
There is a transparency requirement. Going back to the issue of principles, which was discussed earlier, one of the existing principles—which I am now trying to locate and cannot—is transparency. I expect that we would make as much of the information public as we can, in order to ensure good decision-making and to assure people as to how decisions have been reached.
I thank all noble Lords and the Minister for their comments and contributions to what has been a fascinating debate. I will start by commenting on the other amendments in this group before turning to those in my name.
First, on Amendments 28 and 29, I am rather more comfortable with the arrangements for meaningful human intervention set out in the Bill than the noble Lord, Lord Clement-Jones. For me, either a decision has meaningful human intervention or it does not. In the latter case, certain additional rights kick in. To me, that binary model is clear and straightforward, and could only be damaged by introducing some of the more analogue concepts such as “predominantly”, “principally”, “mainly” or “wholly”, so I am perfectly comfortable with that as it is.
However, I recognise that puts a lot of weight on to the precise meaning of “meaningful human involvement”. Amendment 36 in the name of the noble Lord, Lord Clement-Jones, which would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the ICO, seems to take on some value in those circumstances, so I am certainly more supportive of that one.
As for Amendments 34 and 35 in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Freeman, I absolutely recognise the value and potential of efficacy; I agree it is a very valuable term. I have more faith in the rollout and use of the ATRS on a non-statutory basis, believing, as I do, that this would allow it to continue to develop in an agile and adaptive manner. I welcome the Minister’s words on this subject, and for now I remain comfortable that the ATRS is the way forward.
I turn to the amendments in my name. I thank all noble Lords and, indeed, the Minister for their comments and contributions regarding Amendments 31 and 32. I very much take the Minister’s point that definitions of consent feature elsewhere in the Bill. That reduces my concern somewhat.
However, I continue to strongly commend Amendment 26 to the House. I believe it will foster innovation while protecting data rights. It is popular with the public and with private sector stakeholders. It will bring about outcomes that we all want to see in AI safety without stifling this new and exciting technology. In the absence of an AI Bill—and possibly even in the presence of one—it is the only AI-specific legislation that will be around. It is important somehow to get those AI principles in the Bill, at least until an AI Bill comes along. With this in mind, I wish to test the opinion of the House.
My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said as the wingman to the noble Baroness, Lady Kidron, many times, sadly, in this Chamber in child safety debates. The age-appropriate design code that we worked on together and which she championed a decade ago has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.
Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should be applying to this foundational technology the same principle that was so hard fought for 10 years ago for non-AI digital services. We know that, however well meaning, technology companies’ development stacks are always contended. They always have more good things that they think they could do—improving their products for their consumers, making money—than they have the resources to do. However much money they have, their stacks are still contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago, as electricity rolled through the factories of the mill towns in the north of England: it required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.
My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.
However, although we support the intent behind the amendment, we have concerns, which echo concerns on amendments in a previous group, about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is so important that we do not have these explicit links to international frameworks, important as they are, in UK legislation.
In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.
I thank the noble Lord, Lord Clement-Jones, for Amendment 33, and the noble Baroness, Lady Kidron, for Amendment 41, and for their thoughtful comments on AI and automated decision-making throughout this Bill’s passage.
The Government have carefully considered these issues and agree that there is a need for greater guidance. I am pleased to say that we are committing to use our powers under the Data Protection Act to require the ICO to produce a code of practice on AI and solely automated decision-making through secondary legislation. This code will support controllers in complying with their data protection obligations through practical guidance. I reiterate that the Government are committed to this work as an early priority, following the Bill receiving Royal Assent. The secondary legislation will have to be approved by both Houses of Parliament, which means it will be scrutinised by Peers and parliamentarians.
I can also reassure the noble Baroness that the code of practice will include guidance about protecting data subjects, including children. The new ICO duties set out in the Bill will ensure that where children’s interests are relevant to any activity the ICO is carrying out, it should consider the specific protection of children. This includes when preparing codes of practice, such as the one the Government are committing to in this area.
I understand that noble Lords will be keen to discuss the specific contents of the code. The ICO, as the independent data protection regulator, will have views as to the scope of the code and the topics it should cover. We should allow it time to develop those thoughts. The Government are also committed to engaging with noble Lords and other stakeholders after Royal Assent to make sure that we get this right. I hope noble Lords will agree that working closely together to prepare the secondary legislation to request this code is the right approach instead of pre-empting the exact scope.
The noble Lord, Lord Clement-Jones, mentioned edtech. I should add—I am getting into a habit now—that it is discussed in a future group.
I have added my name to this amendment, about which the noble Lord, Lord Clement-Jones, has spoken so eloquently, because of the importance to our economic growth of maintaining data adequacy with the EU. I have two points to add to what he said.
First, as I observed on some occasions in Committee, this is legislation of unbelievable complexity: it is a bad read, except if you want a cure for insomnia, and it relies on the technique of amending and re-amending earlier legislation. This is not the time to go into the detail of the legal problems that arise, some of which we canvassed in Committee, as to whether this legislation has holes in it. I do not think I would be doing any favours either to the position of the United Kingdom or to those who have been patient enough to stay and listen to this part of the debate by going into any of those in any detail, particularly those involving the European Convention on Human Rights and the fundamental charter. That is my first point, on the inherent nature of the legislative structure that we have created. As I said earlier, I very much hope we will never have such legislation again.
Secondly, in my experience, there is a tendency among lawyers steeped in an area or department to feel, “Well, we know it’s all right; we built it. The legislation’s fine”. Therefore, there is an additional and important safeguard that I think we should adopt, which is for a fresh pair of eyes—someone outside the department, or outside those who created the legislation—to look at it again to see whether there are any holes in it. We cannot afford to go into this most important assessment of data adequacy without ensuring that our tackle is in order. I appreciate what the Minister said on the last occasion in Committee—it is for the EU to pick holes in it—but the only prudent course when dealing with anything of this complexity in a legal dispute, or potential dispute, is to ensure that your own tackle is in order and not to go into a debate without being sure of that, allowing the other side to make all the running. We should be on top of this, and that is why I very much support this amendment.
My Lords, I thank the noble Lord, Lord Clement-Jones—as ever—and the noble and learned Lord, Lord Thomas, for tabling Amendment 37 in their names. It would introduce a new clause that would require the Secretary of State to carry out an impact assessment of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy before the European Union’s reassessment of data adequacy in June this year.
I completely understand the concerns behind tabling this amendment. In the very worst-case scenario, of a complete loss of data adequacy in the assessment by the EU, the effect on many businesses and industries in this country would be knocking at the door of catastrophic. It cannot be allowed to happen.
However, introducing a requirement to assess the impact of the Bill on the European Union data adequacy decision would require us to speculate on EU intentions in a public document, which runs the risk of prompting changes on its part or of revealing our hand in ways that we would rather avoid. It is important that we do two things: understand our risk, without necessarily publishing it; and continue to engage at ministerial and official level, as I know we are doing intensively. I fear that the approach set out in this amendment risks being counterproductive.
I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.
The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.
It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.
That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.
I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.
(1 month, 2 weeks ago)
The detection of breaks is done from land, but the ability to repair them is through an agreement with the commercial companies, which pay into a fund that allows a ship to be on 24/7 standby to provide protection. That is paid for by the companies that put the cables in place.
My Lords, we of course recognise and share the Government’s and House’s concern about increased Russian military activity around these undersea cables. I was pleased that the Minister a couple of times referenced the risk assessments going on, but can he tell the House a little more and expand on his earlier answers about those risk assessments? How do they take place and how often do they occur?
The national risk assessment is undertaken regularly and led by the Cabinet Office. In this instance, DSIT is the department responsible for the risk to the cables overall, but it is in collaboration with the MoD, the Cabinet Office and others, particularly in relation to assessing risks other than those that I have outlined.
(2 months ago)
My Lords, what a pleasure it is to address this compelling, balanced and, in my opinion, excellent report on large language models and generative AI. I thank not just my noble friend Lady Stowell but all noble Lords who were involved in its creation. Indeed, it was my pleasure at one point to appear before the committee in my former ministerial role. As ever, we are having an excellent debate today. I note the view of the noble Lord, Lord Knight, that it tends to be the usual suspects in these things, but very good they are too.
We have heard, particularly from my noble friend Lady Stowell and the noble Baroness, Lady Featherstone, about the need to foster competition. We have also heard about the copyright issue from a number of noble Lords, including the noble Baronesses, Lady Featherstone, Lady Wheatcroft and Lady Healy, and I will devote some more specific remarks to that shortly.
A number of speakers, and I agree with them, regretted the cancellation of the exascale project and got more deeply into the matter of compute and the investment and energy required for it. I hope the Minister will address that without rehearsing all the arguments about the black hole, which we can all probably recite for ourselves.
We had a very good corrective from the noble Lords, Lord Strasburger and Lord Griffiths of Bury Port, and my noble friend Lord Kamall, that the risks are far-reaching and too serious to treat lightly. In particular, I note the risk of deliberate misuse by powers outside our control. We heard from my noble friend Lord Ranger about the need, going forward, for greater clarity, if possible, about regulatory plans and comparisons with the EU AI Act. I very much enjoyed, and responded to, the remarks by the noble Lord, Lord Tarassenko, about data as a sovereign asset for the UK, whether in healthcare or anything else.
These points and all the points raised in the report underscore the immense potential of AI to revolutionise key sectors of our economy and our society, while also highlighting critical risks that must be addressed. I think we all recognise at heart the essential trade-off in AI policy. How do we foster the extraordinary innovation and growth that AI promises while ensuring it is deployed in ways that keep us safe?
However, today I shall focus more deeply on two areas. The first is copyright offshoring and the second is regulation strategy overall.
The issue of copyright and AI is deeply complex for many reasons. Many of them were very ably set out by my noble friend Lord Kamall. I am concerned that any solution that does not address the offshoring problem is not very far from pointless. Put simply, we could create between us the most exquisitely balanced, perfectly formed and simply explained AI regulation, but any AI lab that did not like it could, in many cases, scrape the same copyrighted content in another jurisdiction with regulations more to its liking. The EU’s AI Act addresses this problem by forbidding the use in the EU of AI tools that have infringed copyright during their training.
Even if this is workable in the EU—frankly, I have my doubts about that—there is a key ingredient missing that would make it workable anywhere. That ingredient is an internationally recognised technical standard to indicate copyright status, ownership and licence terms. Such a standard would allow content owners to watermark copyrighted materials. Whether the correct answer is an opt-in or an opt-out approach to text and data mining (TDM) is a topic for another day, but such a standard would at least make either technically feasible. Crucially, it would allow national regulators to identify copyright infringements globally. Will the Minister say whether he accepts this premise and, if so, what progress he is aware of towards the development of an international technical standard of this kind?
I turn now to the topic of AI regulation strategy. I shall make two brief points. First, as a number of noble Lords put it very well, AI regulation has to adapt to fast-moving technology changes. That means that it has to target principles rather than specific use cases, where possible. Prescriptive regulation of technology does not just face early obsolescence; it relies fatally on necessarily rigid definitions of highly dynamic concepts.
Secondly, the application of AI is completely different across sectors. That means that the bulk of regulatory heavy lifting needs to be done by existing sector regulators. As set out in the previous Government’s White Paper, this work needs to be supported by central functions. Those include horizon scanning for future developments, co-ordination where AI cuts across sectors, supporting AI skills development, the provision of regulatory sandboxes and the development of data and other standards such as the ATRS. If these and other functions were to end up as the work of a single AI regulatory body, then so much the better, but I do not believe that such an incorporation is mission critical at this stage.
I was pleased that the committee’s report was generally supportive of this position and, indeed, refined it to great effect. Do the Government remain broadly aligned to this approach? If not, where will the differences lie?
While many of us may disagree to one degree or another on AI policy, I do not believe there is really any disagreement about what we are trying to achieve. We must seize this moment to champion a forward-looking AI strategy—one that places the UK at the forefront of global innovation while preserving our values of fairness, security, and opportunity for all.
Like the committee—or as we have heard from the noble Lord, Lord Griffiths, like many members of the committee—I remain at heart deeply optimistic. We can together ensure that AI serves as a tool to enhance lives, strengthen our economy, and secure our national interests. This is a hugely important policy area, so let me close by asking the Minister if he can update this House as regularly and frequently as possible on the regulation of AI and LLMs.
(2 months ago)
Lords Chamber
This is a critical question. The Royal Institute of Navigation has recently—in fact, today—launched a paper on how to prepare for this. It is something that all critical national infrastructure will be urged to look at, to have a plan for what would happen in the event of GPS failure. There is a longer-term question about the alternatives to space-based navigation and there is active work going on in the UK on terrestrial approaches, including the use of quantum systems to try to get a robust secondary approach to PNT.
My Lords, now that over 70 nations have their own space agency, how will the Government pursue the widest and most effective possible international co-operation in support of Astra Carta’s aim,
“to care for the infinite wonders of the universe”?
There is a series of international collaborations in place. We are a member of the European Space Agency. A large proportion of the £1.9 billion of the UK Space Agency money goes to the European Space Agency and our collaborators there. We also spend through the MoD and through UKRI. We are members of the UN bodies that deal with the question of a sustainable space sector and space environment. The space environment is increasingly important and needs attention. We will continue to raise this question at the UN bodies.
(2 months, 3 weeks ago)
Lords Chamber
My Lords, it has been an absolutely brilliant debate, and I join others in thanking the noble Viscount, Lord Stansgate, for bringing it forward. I also join others in congratulating the noble Baroness, Lady Freeman. Many years from now, eventually “Walking with Dinosaurs” will be a fantastic title for her memoir, but we are not there yet. I have been asked to slightly curtail my remarks and I am very happy to do that. I hope noble Lords will forgive me if I do not reflect on everything that has been said in the debate, but rather offer, just to begin with, some of my personal highlights from what I heard.
As a theme, it is clear that we are as one in deeply recognising and valuing the contribution that science and technology can and will make to our economy. Sadly, and frustratingly, many different approaches have been advanced as to how we can best finance that. I hope that we can be on the path of constant improvement to get more investment into this crucial space. I noted a sense of ruefulness from my noble friend Lord Willetts as he said that the role of the Science Minister was to extract money from the Treasury; I am pleased to say that we have somewhat moved on from this position.
I was very struck by the noble Baroness, Lady Neville-Jones, reminding us of the growing importance of international rivalry in this space. I think that is going to play an increasing part in our deliberations here.
The noble Lords, Lord St John of Bletso, Lord Tarassenko and Lord Drayson, asked, one way or another: where are our Metas or Alphabets? It is a question that certainly bugs me. Let us hope that, between us, we can move towards more of an answer. The noble Baroness, Lady Bowles, spoke powerfully about the issue of IP retention in universities, and that is clearly something we need to continue to look at.
The noble Lord, Lord Lucas, raised the issue of standards and regulations. There are not many silver bullets in technology regulation, but standards will be one of them. International global standards, particularly for instance with the copyright issue in AI, are going to be a big part of that solution.
I absolutely share the wish of the right reverend Prelate the Bishop of Newcastle to foster a faster-growing tech community in the north-east of England. If I may, I commend to her the work of the brilliant organisation CyberNorth; she may know it already.
Innovation is not merely an advantage; it is the foundation of economic growth and global competitiveness. Science and tech are no longer confined to laboratories or research institutions; they are part of the fabric of almost all the work we are doing of any kind across this country.
As of last year, we are one of three countries in the world with a trillion-dollar tech sector. Today, that sector contributes £150 billion annually to the UK economy, a figure that reflects not only the sector’s rapid growth to this point but its remarkable potential for expansion. With emerging fields that have been mentioned many times—quantum, AI, engineering biology, and so on—we have the opportunity to cement the UK’s status as a global leader in scientific and technological innovation.
Of course, the contributions of science and tech, as I enjoyed hearing from the noble Baroness, Lady Bennett of Manor Castle, are not limited to economic growth. They enhance our resilience in the face of global challenges. I frequently argue that for all the amazing scientific advances we have seen over recent years, perhaps the most impactful was the development of the Covid vaccine, which I think we can all agree underscored, among other things, the power of UK-led scientific innovation, saving lives and demonstrating the critical impact of robust scientific infrastructure.
Investment in science and technology is also an investment in the workforce of tomorrow. The noble Lord, Lord Mair, and others raised this point very powerfully, as did my noble friend Lord Willetts and the noble Lord, Lord Taylor of Warwick. By prioritising education in STEM fields and by fostering partnerships between industry and academia, we are equipping future generations with the skills and knowledge required to thrive in a rapidly evolving landscape. It is not only essential for individual opportunity but vital to our ongoing economic competitiveness.
I want to address some pressing concerns raised by yesterday’s Budget. The Chancellor announced a significant allocation of £20.4 billion for research and development, including £6.1 billion aimed specifically at protecting core research funding. There is no doubt that this funding is crucial for advancing the core of our scientific curriculum. However, the research community has expressed some apprehensions regarding the implications of this. The Budget allocates an increased £2.7 billion for association with EU research programmes and covers the cost of the old Horizon Europe guarantee scheme. This means we are committing with this money not only to new funding but to managing the cost of past obligations. I would welcome some clarity from the Minister on how this is going to break down.
Further, as raised by my noble friend Lord Waldegrave, the abruptness of the decision over the summer to cancel the exascale computing investment—which was, by the way, fully funded through DSIT’s budget, contrary, I am afraid, to statements from the Government that I have heard from time to time—must stand as a significant red flag to AI investors, if only for its unexpectedness and suddenness. When we take this together with the additional costs and risks of hiring staff, the reduction of incentives to invest in technology and the—in my view, rather aggressive—treatment of non-domiciled investors, I think we have grounds for concern. I wonder whether, when the Minister rises, he could tell us to what he attributes our leadership today in science and tech. Is he concerned that these decisions may diminish that leadership and, if so, what do the Government propose to do about it?
That said, I am keen to close on a note of excitement and positivity. Ray Kurzweil, of “singularity” fame, argues that the time between major advances in science and technology diminishes exponentially. If he is right, the technologies available to us at the end of this Parliament will be truly staggering. So let us all be working together to make sure that as many of those breakthroughs as possible are delivered and safely exploited in this science and tech superpower, the United Kingdom.
(3 months ago)
Lords Chamber
That is an area that of course comes under several other parts of regulation already. It is also an area where there are massive changes in the way that these models perform. If one looks at GPT-4 versus GPT-3—I know it is not facial recognition, but it gives an indication of the types of advances—it is about twice as good now as it was a year ago. These things are moving fast and there is indeed a need to understand exactly how facial recognition technology is valid and where it has problems in recognition.
My Lords, the supply chain for the development of the more advanced AI systems is, in almost every case, highly global in nature. That means that it becomes quite straightforward for AI developers to offshore their activities from any jurisdiction whose regulations they might prefer not to follow. This being the case, do the Government agree that the regulations for AI development, as distinguished mostly from use, are going to have to be global in nature? If the Government agree with that, how is it reflected in their plans for AI regulation going forward?
The noble Viscount makes an important point. This will be global; there is no question about it. Therefore, there needs to be some degree of interoperability between the regulations put in place in different regions. At the moment, as I said, of the two most advanced jurisdictions, the US is the biggest AI nation in the world and, we believe, is developing regulation along similar lines to ours, while the EU is of course the most regulated place in the world for AI. We need to work out, in consultation over the coming months, where the areas of interoperability will lie.
(3 months, 1 week ago)
Lords Chamber
The convention sets out activities in the life cycle of AI systems: they should not infringe our values of human rights, democratic processes and the effectiveness of democratic institutions, or the rule of law. It applies to the public sector, including where the public sector uses the private sector, and there is an obligation to consider how private sector activities can be taken into account when the convention is implemented in a national framework.
My Lords, international bodies currently working on AI safety and regulation include the UN, UNESCO, the ITU, the G7, the G20 and the GPAI, among several others. Do the Government agree that although each of these groups is crucial and has a very important role to play in creating safe and well-regulated AI globally, they will be successful only to the extent that they are effectively co-ordinated? If so, what steps are the Government taking to bring that about?
We are in active discussion with all those partners. As we consider an AI Act, we will work closely with partners in the US and elsewhere, and will apply it only to the limited number of companies at the very forefront of AI—to those models of tomorrow which carry particular risk and, again, where guard-rails have been asked for.