Lords Chamber
My Lords, Amendment 111ZA seeks to introduce a requirement for workplace AI risk and impact assessments. This amendment is focused on addressing the profound and rapidly evolving impact of artificial intelligence systems on the modern workplace. There are many opportunities for its adoption but also risks and impacts. There is potentially massive job displacement: AI could displace 1 million to 3 million UK jobs overall. There are workplace skills gaps; more than half the UK workforce lacks essential digital skills, and the majority of the public has no AI education or training.
AI recruitment algorithms have resulted in race and sex discrimination. There are legal vulnerabilities. Companies risk facing costly lawsuits and settlements when unsuccessful job applicants claim unlawful discrimination by AI hiring systems. Meanwhile, AI adoption accelerates rapidly, and the UK’s regulatory framework is lagging behind.
Organisations such as the Trades Union Congress and the Institute for the Future of Work have consistently highlighted the critical need for robust regulation in this area. The TUC, through its artificial intelligence regulation and employment rights Bill, drafted with a multi-stakeholder task force, explicitly proposes workforce AI risk assessments and emphasises the need for worker consultation before AI systems are implemented. It also advocates for fundamental rights, such as a right to a human review for high-risk decisions. IFOW similarly calls for an accountability for algorithms Act that would mandate pre-emptive algorithmic impact assessments to identify and mitigate risks, ensuring greater transparency and accountability in the use of AI at work. Both organisations stress that existing frameworks are insufficient to protect workers from the potential harms of AI.
When I spoke to a similar amendment—Amendment 149—in Committee, the Minister acknowledged this and said:
“The Government are committed to working with trade unions, employers, workers and experts to examine what AI and new technologies mean for work, jobs and skills. We will promote best practice in safeguarding against the invasion of privacy through surveillance technology, spyware and discriminatory algorithmic decision-making … However, I assure the noble Lord, Lord Clement-Jones, that the Institute for the Future of Work will be welcome to make an input into that piece of work and the consultation that is going forward. I reassure the noble Baroness, Lady Bennett, and all noble Lords that this is an area that the Government are actively looking into, and we will consult on proposals in the make work pay plan in due course”.—[Official Report, 5/6/25; col. 878.]
This was all very reassuring, perhaps, but I have retabled this amendment precisely because we need more concrete specifics regarding this promised consultation.
The TUC and IFOW have been working on this for four years. Is it too much to ask the Government to take a clear position on what is proposed now? The Minister referred to the importance of proper consultation. This is a crucial area impacting the fundamental rights and well-being of workers right now, often without their knowledge, and AI systems are increasingly being introduced into the workforce, so the Government need to provide clarity on what kind of consultation is being undertaken, with whom they will engage beyond relevant stakeholders and what the precise timescale is for this consultation and any subsequent legislative action, particularly given the rapid introduction of AI into workplaces.
We cannot afford a wait-and-see approach. If comprehensive AI regulation cannot be addressed within this Bill as regards the workplace, we need an immediate and clear commitment to provision within dedicated AI legislation, perhaps coming down the track, to ensure that AI in the workplace truly benefits everyone. I beg to move.
My Lords, it is always a pleasure to follow my friend, the noble Lord, Lord Clement-Jones, who, in his single Nelsonian amendment, has covered a lot of the material in my more spread-out set of amendments. I support his Amendment 111ZA and will speak to my Amendments 168 to 176. I declare my interests in the register, particularly my technology interests, not least as a member of the advisory board of Endava plc and as a member of the technology and science advisory committee of the Crown Estate.
I will take one brief step backwards. From the outset, we have heard that the Government do not want to undertake cross-sector AI legislation and regulation. Rather, they want to take a domain-specific approach. That is fine; it is clearly the stated position, although it would not be my choice. But it is simultaneously interesting to ask how, if that choice is adopted, consistency across our economy and society is ensured so that, wherever an individual citizen comes up against AI, they can be assured of a consistent approach to the treatment of the challenges and opportunities of that AI. Similarly, what happens where there is no competent regulator or authority in that domain?
At the moment, by and large, neither approach seems to be being adopted. Whenever I and colleagues have raised amendments around AI in what we might call domain-specific areas, such as the Product Regulation and Metrology Bill, the data Bill and now the Employment Rights Bill, we are told, “This is not the legislation for AI”. I ask the Minister for clarity: if a cross-sector approach to AI is not being taken, is a domain-specific approach truly being taken either, given that the opportunities are not being taken up when appropriate legislation comes before your Lordships’ House?
I turn to the amendments in my name. Amendment 168 goes to the very heart of the issue around employers’ use of AI. Very good, if not excellent, principles were set out in the then Government’s White Paper of 2023. I have transposed many of these into my Amendment 168. Would it not be beneficial to have these principles set in statute for the benefit of workers, in this instance, wherever they come across employers deploying AI in their workplace?
Amendment 169 lifts a clause largely from my Artificial Intelligence (Regulation) Private Member’s Bill and suggests that an AI responsible officer in all organisations that develop, deploy and use AI would be a positive thing for workers, employees and employers alike. This should be seen not as burdensome, a mere compliance exercise or a question of audit, but as a positive, vibrant, dynamic role, so that the benefits of AI could be felt by workers right across their employment experience. It would be proportionate and right touch, with reporting requirements easily recognised as mirroring similar requirements set out for other obligations under the Companies Act. If we had AI responsible officers across our economy, across businesses and organisations deploying and using AI right now, this would be positive, dynamic and beneficial for workers, employees, employers, our economy and wider society.
Amendment 170 goes to the issue of intellectual property, copyright and labelling. It would put a responsibility on workers who are using AI to report to the relevant government department on the genesis of that IP and copyrighted material, and the data used in that AI deployment, by which means there would be clarity not only on where that IP, copyright and data had emanated from but that it had been obtained through informed consent and that all IP and copyright obligations had been respected and adhered to.
Amendments 171 and 172 similarly look at where workers’ data may be ingested right now by employers’ use of AI. These are such rich, useful and economically beneficial sources of data for employers and businesses. Amendment 171 simply suggests that there should be informed consent from those workers before any of their data can be used, ingested and deployed.
I would like to take a little time on Amendment 174, around the whole area of AI in recruitment and employment. This goes back to one of my points at the beginning of this speech: for recruitment, there currently exists no competent authority or regulator. If the Government continue with their domain-specific approach, recruitment remains a gap, because there is no domain-specific competent authority or regulator that could be held responsible for the deployment and development of AI in that sector. If, for example, somebody finds themselves not making a shortlist, they may not know that AI has been involved in making that decision. Even if they were aware, they would find themselves with no redress and no competent authority to take their claim to.
Lords Chamber
My Lords, first, I congratulate the Law Commission on its work on this Bill. The noble Lord, Lord Anderson, has mentioned the almost 1,000 pages of work from the Law Commission, its consultation paper, the final report, the supplemental report—no one could accuse the Law Commission of a lack of thoroughness as far as this two-clause Bill is concerned. Its purpose, as it said, was to ensure that the courts can respond sensitively to the complexity of emerging technology and apply the law to new fact patterns involving that technology. It also said:
“We conclude that the flexibility of common law allows for the recognition of a distinct category of personal property that can better recognise, accommodate and protect the unique features of certain digital assets (including crypto-tokens and cryptoassets). We recommend legislation to confirm the existence of this category and remove any uncertainty”.
As the noble Lord, Lord Anderson, has explained, we have thoroughly examined the resulting Bill under the Special Public Bill procedure, and it clearly fulfils that purpose, so I would like to thank our witnesses. I also thank the noble Lord, Lord Anderson, for his excellent chairing of the committee; our clerk, Matthew Burton; the Minister, of course; my fellow members for all their work on the Bill; and I agree with the Minister’s particular thanks to the noble Lord, Lord Holmes, for his stimulating and provocative input into our deliberations. But, as ever, there is more work to be done. The Law Commission recommended that the Government create a panel of industry experts who can provide guidance on technical and legal issues relating to digital assets, and I understand that the Ministry of Justice has asked the UK Jurisdiction Taskforce, an expert group chaired by the Master of the Rolls that produces non-binding guidance on areas of legal uncertainty, to take forward this work.
The Law Commission also made recommendations to provide market participants with legal tools that do not yet exist in England and Wales, let alone in Northern Ireland, such as new ways to take security over crypto tokens and tokenised securities. It recommended that this work be undertaken by a multidisciplinary project team. Whether the Minister can give us an update today, I do not know, but I very much hope that he will write to members of the committee, because that is unfinished business and it would be very useful to hear from the Minister about it.
My Lords, it is a pleasure to follow my friend, the noble Lord, Lord Clement-Jones. In doing so, I declare my technology interests as set out in the register. Like other noble Lords, I have, rightfully, a long list of thanks, not least to all those witnesses who gave oral and written evidence to our Special Bill Committee; to Matthew Burton, our clerk, and all his team; to the Minister for his careful and thoughtful engagement, and all his officials; and not least to the noble Lord, Lord Anderson, for his excellent chairing of our Special Bill Committee.
This is a short Bill, but one with significant impact for the UK, and indeed beyond our shores, because through our legislative process, the world is watching what we do in this space. We have a great fintech tradition in this country, a great fintech ecosystem, and whether it is financial market infrastructure, dematerialisation of our capital markets or crucially important financial inclusion, digital assets have a critical role to play. With some trillions of the UK and world economy due to be transacted by digital assets by the end of this decade, the UK needs to ensure that it is well set for this future. The Bill does this through non-prescription, but using the great good fortune of English and Welsh common law, with its agility and its adaptability, as the Minister rightly said, for new technologies not yet even imagined.
It was an extraordinary pleasure to be part of this legislative process. My only question for the Minister is whether there is a schedule yet in the other place, so we can ensure that the Bill becomes law as soon as possible. Not only does it send a signal to the world; it sends a signal to all those involved in digital assets in this country that, with the financial centre of London and our fantastic fintech start-ups, scale-ups and larger businesses, London and the United Kingdom are an excellent place to be involved in digital assets business.
Lords Chamber
My Lords, in moving Amendment 1, in my name, I will speak to Amendments 2 and 3 in this group.
It is a pleasure to open Report of the Property (Digital Assets etc) Bill. In doing so, I declare my technology interests as set out in the register, not least as adviser to Ecospend and Members Capital Management. I take a brief moment to thank all of those who have got the Bill to this stage, including Professor Green and her team at the Law Commission, everyone who was involved with our Special Public Bill Committee—particularly the clerk, Matthew Burton, and all his staff—and all colleagues who have shown an interest in and engaged with the Bill.
There is an extraordinary opportunity when it comes to digital assets and delivering clarity, consistency and certainty around their property classification. By 2030, it is estimated that somewhere between 10% and 14% of GDP will come from digital assets. To put it another way, transactions in 2030 involving digital assets will range between £10 trillion and £24 trillion. That is a huge opportunity for the planet and for the UK, not least because of our excellence in financial services and in fintech—financial technology—but, crucially, because of the great good fortune of English common law.
What we see with the Bill is the leading-edge deployment of that great tradition in the most modern of contexts. To take just one example, if we get effective dematerialisation of the capital markets, that will save £20 billion year-on-year in reduced costs and speeded up transactions. Clarification of digital assets will not only help capital markets but will assist with financial inclusion and financial market infrastructure transformation, impacting positively on our economy and, through that, our society. We should note that the world is watching as we pass this Bill—following, as it does, a suite of Bills from the Law Commission, not least the recent Electronic Trade Documents Bill, now Act.
This is a very good Bill, which does a very simple task of enabling a third category of property: taking a “thing in possession” and a “thing in action” and enabling a potential third category to accommodate digital assets which do not neatly fit within either of those current property classes. It is a good Bill, and it has been through an excellent Committee and Special Public Bill Committee procedure, but I believe it is worthy of stress-test through these amendments this evening.
Amendments 1 and 2 go to the very heart of the Bill and propose that the presumption that digital assets cannot be fitted within the existing two categories of property be reversed. Consider something such as an NFT, a non-fungible token. To put it in simpler terms, it is largely a piece of electronic software on the hardware of a digital ledger. It has an existence beyond its legal form, but it is difficult to possess in the way you would possess, for example, a bag of gold. In that sense, the Bill is structured to enable this third category. The amendment seeks to stress-test that and reverse that presumption, as we have seen in some of the recent judgments in Australia and Singapore.
I am not suggesting that this amendment is the right amendment; it is merely put to stress-test how the Bill is set out. It seeks to stress-test the claim made by Professor Green, when she gave evidence to our Special Public Bill Committee, that this amendment would take the bite out of the Bill. If indeed it would take the bite out of the Bill, then it would not satisfy my three Cs test of what the Bill needs to achieve if we are to realise the opportunities and the economic benefits from digital assets. Those three tests are: clarity, certainty and consistency.
Amendment 3 seeks to assist with this by suggesting codes of practice that could be brought to bear to assist the courts when they come to consider issues around digital assets. With that, I beg to move Amendment 1.
My Lords, I am a great admirer of the noble Lord, Lord Holmes, and his passion for all things digital. But this is a good yet very modest Bill, and I am not sure that we need stress-testing at this point in the proceedings. Through the Special Public Bill process that we have all been through over the last few months, we have already kicked the tyres pretty hard on this. We have taken evidence and had amendments in Committee, so I will be extremely brief and perhaps disappoint the noble Lord by not being in favour of any of his amendments.
Lords Chamber
My Lords, it is a pleasure to open the second day on Report on the Data (Use and Access) Bill. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 38 in my name, I will not speak to any other amendments in this group.
Amendment 38 goes to the heart of the issue du jour: regulators have seldom been so much in the press and in the public eye. As the press would have it, they were hauled into No. 11 just a few days ago, but this speaks to what we want from our regulators across our economy and society. At their best, our regulators are the envy of the world. Just consider the FCA when we did the fintech regulatory sandbox: as a measure of success, it was replicated in well over 50 jurisdictions around the world.
We know how to do right-sized regulation and how to set up our regulators to succeed to do that most difficult of tasks—to balance innovation, economic growth, and consumers’ and citizens’ rights. That is what all regulators should be about. It is not straightforward; it is complex but entirely doable.
Amendment 38 simply proposes wording to assist the Information Commissioner’s Office. When it comes to the economic growth duty—“#innovation”—it simply refers back to Section 108 of the 2015 Act. I believe that bringing this clarity into the Bill will assist the regulator and enable all the conversations that are rightly going on right now, and all the plans that are being produced and reported on, such as those around AI, to be properly discussed and given proper context, with an Information Commissioner’s Office that is supported through clarity as to its responsibilities and obligations when it comes to economic growth. In simple terms, this would mean that these responsibilities are restricted and clearly set out according to Section 108 of the 2015 Act. It is critical that this should be the case if we are to have clarity around the commissioner’s independence as a supervisory authority on data protection, an absolutely essential condition for EU adequacy decisions.
I look forward to the Minister’s response. I hope that he likes my drafting. I hope that he will accept and incorporate my amendment into the Bill. I look forward to the debate. I beg to move.
My Lords, I rise to support Amendment 38 in the name of the noble Lord, Lord Holmes. More than ever before, the commissioner, alongside other regulators, is being pressured to support the Government’s growth and innovation agenda. In Clause 90, the Bill places unprecedented obligations on the ICO to support innovation. The question, in respect of both the existing growth duty and Clause 90, is whether they are in any sense treated as overriding the ICO’s primary responsibilities in data protection and information rights. How does the ICO aim to balance those duties, ensuring that its regulatory actions support economic growth while maintaining necessary protections?
We need to be vigilant. As it is, there are criticisms of the way the Information Commissioner’s Office carries out its existing duties. Those criticisms can be broadly categorised as issues with enforcement, independence and the balancing of competing interests. The ICO has a poor record on enforcement: it has been reluctant to issue fines, particularly to public sector organisations, and, as I described in Committee, it has relied heavily on reprimands rather than stronger enforcement action. It has also been accused of being too slow with its investigations.
There are concerns about these new duties, which could pose threats to the ability of the Information Commissioner’s Office to effectively carry out its primary functions. For that reason, we support the amendment from the noble Lord, Lord Holmes.
Grand Committee
My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.
We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.
The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the Act—the statute when passed—that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in those intervening years; it needs, desperately and urgently, to be updated because it is currently putting every citizen in this nation at risk for want of being amended. This is the purpose of Amendments 156A and 156B.
The Computer Misuse Act 1990 is not only out of date but inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They oftentimes work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.
Let us take just two examples: vulnerability research, and threat intelligence assessment and analysis. Both could see a cybersecurity professional falling foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and its ability, and that of the wider community of cybersecurity professionals, to meet those threats.
These amendments, in essence, perform one simple but critical task: to afford a legal defence for legitimate cybersecurity activities. That is all, but it would have such a profound impact for those whom we have asked to keep us safe and for the safety they can thus deliver to every citizen in our society.
Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?
It is not time; it is well over time that these amendments become part of our law. If not now, when? If not these amendments, which amendments? If they do not accept these amendments, what will the Government say to all those people who will continue to be put in harm’s way for want of these protective provisions being passed? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.
My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.
As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.
I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,
“a catalytic effect on innovation”
within the UK’s cybersecurity sector, which possesses “considerable growth potential”.
Could the Minister say a few words on some of those points of discourse and non-consensus, to give the Committee some flavour of the type of issues where there is no consensus as well as the extent of the gap between some of those perspectives?
Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful as well, in unpacking what were clearly extremely well-informed recommendations. It should, no doubt, be taken extremely seriously.
Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.
My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as CSA. It is not often in life that in a short space of time one is afforded the opportunity in government of bringing much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.
There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were only to do this on their growth agenda, much spoken of, this would have an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see if anything further can be done but, from my perspective and others, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.
My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The reason for this group of amendments is to try to get an increased focus on the range of issues they touch on.
I turn to Amendment 211B first. It seems at least curious to have a data Bill without talking about data centres in terms of their power usage, their environmental impact and the Government’s view of the current PUE standard. Is it of a standard that they think gives the right measure of confidence to consumers and citizens across the country, in terms of how data centres are being operated and their impacts?
Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject but you have to go only one or two steps back in any supply chain to get into deep depths of opacity. With this amendment, I am seeking to gain more clarity on data supply chains and the role of data across all supply chains. Through the combination of data and AI, we could potentially enable a transformation of our supply chain in real time. That would give us so much more flexibility to try for economic benefits and environmental benefits. I look forward to the Minister’s response.
I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into some standards and approaches in this area, which would potentially go some way towards conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.
I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.
Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.
My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on that. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes, the actions of the Government seem to go contrary to that. I support the noble Lord, Lord Holmes, in his call for essentially realising the benefits of AI while making sure that we maintain public trust.