Employment Rights Bill Debate
Lord Clement-Jones (Liberal Democrat - Life peer)
Lords Chamber
My Lords, Amendment 111ZA seeks to introduce a requirement for workplace AI risk and impact assessments. This amendment is focused on addressing the profound and rapidly evolving impact of artificial intelligence systems on the modern workplace. There are many opportunities for its adoption, but also risks and impacts. There is potentially massive job displacement: AI could displace 1 million to 3 million UK jobs overall. There are workplace skills gaps; more than half the UK workforce lacks essential digital skills, and the majority of the public has no AI education or training.
AI recruitment algorithms have resulted in race and sex discrimination. There are legal vulnerabilities. Companies risk facing costly lawsuits and settlements when unsuccessful job applicants claim unlawful discrimination by AI hiring systems. Meanwhile, AI adoption accelerates rapidly, and the UK’s regulatory framework is lagging behind.
Organisations such as the Trades Union Congress and the Institute for the Future of Work have consistently highlighted the critical need for robust regulation in this area. The TUC, through its artificial intelligence regulation and employment rights Bill, drafted with a multi-stakeholder task force, explicitly proposes workforce AI risk assessments and emphasises the need for worker consultation before AI systems are implemented. It also advocates for fundamental rights, such as a right to a human review for high-risk decisions. The IFOW similarly calls for an Accountability for Algorithms Act that would mandate pre-emptive algorithmic impact assessments to identify and mitigate risks, ensuring greater transparency and accountability in the use of AI at work. Both organisations stress that existing frameworks are insufficient to protect workers from the potential harms of AI.
When I spoke to a similar amendment—Amendment 149—in Committee, the Minister acknowledged this and said:
“The Government are committed to working with trade unions, employers, workers and experts to examine what AI and new technologies mean for work, jobs and skills. We will promote best practice in safeguarding against the invasion of privacy through surveillance technology, spyware and discriminatory algorithmic decision-making … However, I assure the noble Lord, Lord Clement-Jones, that the Institute for the Future of Work will be welcome to make an input into that piece of work and the consultation that is going forward. I reassure the noble Baroness, Lady Bennett, and all noble Lords that this is an area that the Government are actively looking into, and we will consult on proposals in the make work pay plan in due course”.—[Official Report, 5/6/25; col. 878.]
This was all very reassuring, perhaps, but I have retabled this amendment precisely because we need more concrete specifics regarding this promised consultation.
The TUC and IFOW have been working on this for four years. Is it too much to ask the Government to take a clear position on what is proposed now? The Minister referred to the importance of proper consultation. This is a crucial area affecting the fundamental rights and well-being of workers right now, often without their knowledge, and AI systems are increasingly being introduced into the workplace. The Government therefore need to provide clarity on what kind of consultation is being undertaken, with whom they will engage beyond the relevant stakeholders, and the precise timescale for this consultation and any subsequent legislative action.
We cannot afford a wait-and-see approach. If comprehensive AI regulation cannot be addressed within this Bill as regards the workplace, we need an immediate and clear commitment to provision within dedicated AI legislation, perhaps coming down the track, to ensure that AI in the workplace truly benefits everyone. I beg to move.
My Lords, it is always a pleasure to follow my friend, the noble Lord, Lord Clement-Jones, who, in his single Nelsonian amendment, has covered a lot of the material in my more spread-out set of amendments. I support his Amendment 111ZA and will speak to my Amendments 168 to 176. I declare my interests in the register, particularly my technology interests, not least as a member of the advisory board of Endava plc and as a member of the technology and science advisory committee of the Crown Estate.
I will take one brief step backwards. From the outset, we have heard that the Government do not want to undertake cross-sector AI legislation and regulation. Rather, they want to take a domain-specific approach. That is fine; it is clearly the stated position, although it would not be my choice. But it is simultaneously interesting to ask how, if that choice is adopted, consistency across our economy and society is ensured so that, wherever an individual citizen comes up against AI, they can be assured of a consistent approach to the treatment of the challenges and opportunities of that AI. Similarly, what happens where there is no competent regulator or authority in that domain?
At the moment, largely, neither approach seems to be in place. Whenever I and colleagues have raised amendments around AI in what we might call domain-specific areas, such as the Product Regulation and Metrology Bill, the data Bill and now the Employment Rights Bill, we are told, “This is not the legislation for AI”. I ask the Minister for clarity: if a cross-sector approach to AI is not being taken, is a domain-specific approach being taken, given that opportunities are not being taken up when appropriate legislation comes before your Lordships’ House?
I turn to the amendments in my name. Amendment 168 goes to the very heart of the issue around employers’ use of AI. Very good, if not excellent, principles were set out in the then Government’s White Paper of 2023. I have transposed many of these into my Amendment 168. Would it not be beneficial to have these principles set in statute for the benefit of workers, in this instance, wherever they come across employers deploying AI in their workplace?
Amendment 169 lifts a clause largely from my Artificial Intelligence (Regulation) Private Member’s Bill and suggests that an AI responsible officer in all organisations that develop, deploy and use AI would be a positive thing for workers, employees and employers alike. The role would not be seen as burdensome, a matter of mere compliance or a question of audit, but as a positive, vibrant, dynamic one, so that the benefits of AI could be felt by workers right across their employment experience. It would be proportionate and right-touch, with reporting requirements easily recognised as mirroring similar requirements set out for other obligations under the Companies Act. If we had AI responsible officers across our economy, across businesses and organisations deploying and using AI right now, this would be positive, dynamic and beneficial for workers, employees, employers, our economy and wider society.
Amendment 170 goes to the issue of IP, copyright and labelling. It would put a responsibility on workers who are using AI to report to the relevant government department on the genesis of the IP and copyrighted material, and the data, used in that AI deployment, so that there would be clarity not only on where that IP, copyright and data had emanated from but that it had been obtained through informed consent and that all IP and copyright obligations had been respected and adhered to.
Amendments 171 and 172 similarly look at where workers’ data may be ingested right now by employers’ use of AI. These are such rich, useful and economically beneficial sources of data for employers and businesses. Amendment 171 simply suggests that there should be informed consent from those workers before any of their data can be used, ingested and deployed.
I would like to take a little time on Amendment 174, around the whole area of AI in recruitment and employment. This goes back to one of my points at the beginning of this speech: for recruitment, there currently exists no competent authority or regulator. If the Government continue with their domain-specific approach, recruitment remains a gap, because there is no domain-specific competent authority or regulator that could be held responsible for the deployment and development of AI in that sector. If, for example, somebody finds themselves not making a shortlist, they may not know that AI has been involved in making that decision. Even if they were aware, they would find themselves with no redress and no competent authority to take their claim to.
My Lords, I will begin with Amendment 111ZA, moved by the noble Lord, Lord Clement-Jones, and Amendments 168, 169, 171, 172, 175 and 176, tabled by the noble Lord, Lord Holmes, whom I thank for his engagement on these important issues.
I start by reassuring all noble Lords that we agree that AI should be deployed and used responsibly, including within the workplace. As the noble Lord knows, in January 2025, we published the AI Opportunities Action Plan, which included a commitment to
“support the AI assurance ecosystem to increase trust and adoption”
of AI. One of the key deliverables in this area is the AI Management Essentials tool. We are developing this tool to support businesses, particularly SMEs, to implement good AI governance practices. Following public consultation earlier this year, I hope to update your Lordships’ House on the consultation response and an updated version of that tool soon.
Regarding these amendments, I remind noble Lords that our plan to make work pay makes it clear that workers’ interests will need to inform the digital transformation happening in the workplace. Our approach is to protect good jobs, ensure good future jobs, and ensure that rights and protections keep pace with technological change.
To be clear, we are committed to working with trade unions, employers, workers and experts to examine what AI and new technologies mean for work, jobs and skills. We will promote best practice in safeguarding against the invasion of privacy through surveillance technology, spyware and discriminatory algorithmic decision-making. In response to the noble Lords, Lord Freyberg and Lord Hunt, of course we will put ethics and fairness at the heart of that.
I am keen to stress that we are taking steps to enhance our understanding of this area. This has included engagement and round-table events with a wide range of stakeholders and experts to help enrich our understanding. I reaffirm that we will consult on the make work pay proposals in due course.
The noble Lord, Lord Clement-Jones, asked what would be in the scope of the consultation. The consultation plan includes examining: what new technologies, including automation and AI, mean for work, jobs and skills; how to promote best practice in safeguarding against the invasion of privacy through surveillance technology, spyware and discriminatory algorithmic decision-making; and how best to make the introduction of surveillance technology in the workplace subject to consultation and negotiation with trade union or employee representatives.
The noble Lord, Lord Holmes, asked whether this was going to be domain-specific. As the noble Lord, Lord Hunt, just reminded us, this was dealt with in an Oral Question earlier this afternoon, when my noble friend Lord Vallance said that existing regulators will oversee most AI systems, supported by enhanced AI skills and cross-regulatory co-ordination through forums such as the Regulatory Innovation Office. Some cross-cutting issues will also be addressed in the planned consultation on AI.
Looking specifically at Amendment 171, let me reassure the noble Lord that we believe that data protection legislation provides sufficient protection for workers and individuals where their personal data is being used, in line with the key data protection principles, including lawfulness, fairness and transparency. Consent is one lawful ground for processing personal data. However, due to the power imbalance between employee and employer, it is often inappropriate for employers to rely on consent from employees to process their data. That is why data protection law provides additional lawful grounds for such processing, such as legitimate interests. We therefore do not wish to limit data processing in these situations to consent alone. I also point out that, while data protection principles establish the requirements that we expect the use of AI systems to adhere to, AI assurance provides ways to evidence that those requirements have been met in practice.
Amendment 170 tabled by the noble Lord, Lord Holmes, would require workers and employers to maintain records of data and intellectual property used in AI training and to allow independent audits of AI processes. As he will know, this issue was debated extensively during the passage through your Lordships’ House of the Data (Use and Access) Act 2025. Only last month I confirmed that we will publish a report, including on transparency in the use of intellectual property material in AI training, within nine months of Royal Assent to the Act, which will be due by 18 March next year. The Government have also committed to setting up expert stakeholder working groups to help drive forward practical, workable solutions in this area, alongside a parliamentary working group to engage with policy development.
Amendment 174 tabled by the noble Lord, Lord Holmes, proposes a review of the use of AI in recruitment and employment. As the noble Lord will be aware, last year the previous Government published detailed guidance on responsible AI in recruitment, which covers governance, accessibility requirements and testing. This was developed with stakeholders and relevant regulators, such as the Information Commissioner’s Office and the Equality and Human Rights Commission. Employers and recruiters may find this guidance useful to help integrate AI into their recruitment practices in a responsible way.
Furthermore, I am excited about the opportunities of AI in supporting the UK’s workforce, as well as creating jobs and growing our economy. However, we must also understand how it may affect the labour market, including any potential disruption. The AI Security Institute has begun assessing this issue, and I hope to be able to update your Lordships’ House on this as work progresses.
Regarding our position on general AI regulation and the establishment of a new AI regulator, we believe that AI is best regulated at the point of use by the UK’s existing sectoral regulators. As experts in their sector, they are in the best place to understand the uses and risks of AI in their relevant areas, and we will support them to do this. I emphasise that in response to the AI Opportunities Action Plan, we have committed to supporting regulators in evaluating their AI capabilities and understanding how they can be strengthened. I assure your Lordships’ House that we are committed to making sure that workers’ interests inform the digital transformation taking place in the workplace.
I am grateful to my noble friend Lord Pitkeathley for raising non-compete clauses. There has been extensive research and analysis in recent years looking at the prevalence of non-compete clauses in the UK labour market and their impact on both workers and the wider economy. Government research published in 2023 found that non-compete clauses were widely used across the labour market, with around 5 million employees in Great Britain working under a contract that contained a non-compete clause, with a typical duration of around six months. As my noble friend identified, this can adversely impact both the worker affected, through limiting their ability to move between jobs, and the wider economy, due to the impacts on competition.
It is often assumed that non-compete clauses are found only in contracts of high earners. However, research published last year by the Competition and Markets Authority found that while non-competes are more common in higher-paid jobs, even in lower-paid jobs 20% to 30% of workers believe that they are covered by non-compete clauses. The Government have been reviewing the research and work done to date on non-compete clauses, and I am pleased to be able to confirm that we will be consulting on options for reform of non-compete clauses in employment contracts in due course.
Finally, the noble Lord, Lord Hunt, asked for my suggested reading list following my noble friend’s kind offer earlier this afternoon. I can do no better than to recommend the excellent book by the noble Lord, Lord Clement-Jones, on AI. In that spirit, I ask the noble Lord, Lord Clement-Jones, to withdraw his Amendment 111ZA.
The noble Baroness nearly won me over at that point. I thank her. I feel like someone who was expecting a full meal but received a rather light snack. I will explain why as we go through.
I thank the noble Lord, Lord Holmes. I feel that I am somewhat upstaging him by putting an amendment at the front of the group, but we share many common themes that we have pursued together over the years. I agree with him on the desirability of a cross-sector approach. He is much more patient than I am and, in putting down individual amendments and hoping that the Minister will give satisfactory answers, he is clearly more optimistic than I am. Whether his optimism has been justified today, I am not so sure.
The Minister could not even acknowledge the work done by the TUC, which has been ground-breaking in so many ways. It has taken four years, so it is extraordinary that the Government are doing what they are doing. I acknowledge what the noble Lord, Lord Pitkeathley, had to say. I was not quite sure how it connected to AI, but he very cunningly linked the subject of non-compete clauses to innovation, which does link to AI. I was encouraged by what the Minister had to say about consultation on reform.
The noble Lord, Lord Hunt, reminded me that I was a solicitor. Unlike him, I do not still have a practising certificate, but there we are. He has much more stamina than I have. Non-compete clauses can be extremely important in making sure that know-how is preserved within an existing business. I thank the noble Lord, Lord Freyberg, for what he had to say about making sure that AI ensures human flourishing and that we preserve agency. That is what the amendments tabled by the noble Lord, Lord Holmes, and me are all about.
The Minister talked about an AI assurance ecosystem and the AI Management Essentials tool, on which there will be consultation, but I could not sense any intention to do anything other than take a sort of voluntary approach. We have a lot of employment law that has developed over the years, but the Government seem to be allergic to doing anything with any teeth. She mentioned recruitment practices, but that again seems to be very much a voluntary approach. The AI Security Institute is not a regulator. I cannot feel that the Minister has given much more than the noble Lord, Lord Leong, gave last time. For instance, the Minister talked about consultation on the make work pay proposals, which involved talking about best practice on the adoption of AI and how best to deal with surveillance technology. Again, I did not sense any real intent to make sure that we have a new set of protections in the workplace in the face of AI.
I very much hope that, as time goes on, the Government will develop a much more muscular approach to this. As many noble Lords have said, AI presents a great number of opportunities in the workplace, but we absolutely do not want to see those opportunities overwhelmed by mistrust on the part of employees and a belief that AI presents unacceptable risks. We want employees to understand that, in the face of AI adoption, they have the right to be consulted and that there is proper risk assessment of the introduction of these systems into the workplace, so that there is a consensual approach to AI adoption.
I really do not feel that the Government are keeping up to date with the issues in this respect, and I am afraid that is rather reflected in some of the issues that we are going to talk about on Wednesday as well. In the meantime, however, I beg leave to withdraw the amendment.