Grand Committee

My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.
We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.
The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the statute when passed, that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in those intervening years; it needs, desperately and urgently, to be updated because it is currently putting every citizen in this nation at risk for want of being amended. This is the purpose of Amendments 156A and 156B.
The Computer Misuse Act 1990 is not only out of date but inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They oftentimes work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.
Let us take just two examples: vulnerability research and threat intelligence assessment and analysis. Both could find a cybersecurity professional falling foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and its ability, and the ability of the cybersecurity professional community, to meet those threats.
These amendments, in essence, perform one simple but critical task: to afford a legal defence for legitimate cybersecurity activities. That is all, but it would have such a profound impact for those whom we have asked to keep us safe and for the safety they can thus deliver to every citizen in our society.
Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?
It is not time; it is well over time that these amendments become part of our law. If not now, when? If not these amendments, which amendments? If they do not accept these amendments, what will the Government say to all those people who will continue to be put in harm’s way for want of these protective provisions being passed? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.
My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.
As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.
I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,
“a catalytic effect on innovation”
within the UK’s cybersecurity sector, which possesses “considerable growth potential”.
My Lords, I turn to Amendments 156A and 156B, tabled by the noble Lord, Lord Holmes. I understand the strength of feeling and the need to provide legal protections for legitimate cybersecurity activities. I agree with the noble Lord that the UK should have the right legislative framework to allow us to tackle the harms posed by cybercriminals. We have heard examples of some of those threats this afternoon.
I reassure the noble Lord that this Government are committed to ensuring that the Computer Misuse Act remains up to date and effective in tackling criminality. We will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies to consider whether there are workable proposals on this. The noble Lord will know that this is a complex and ongoing issue being considered as part of the review of the Computer Misuse Act being carried out by the Home Office. We are considering improved defences by engaging extensively with the cybersecurity industry, law enforcement agencies, prosecutors and system owners. However, engagement to date has not produced a consensus on the issue, even within the industry, and that is holding us back at this moment—but we are absolutely determined to move forward with this and to reach a consensus on the way forward.
I think the noble Lord, Lord Clement-Jones, said in the previous debate that the amendments were premature, and here that is certainly the case. The specific amendments that the noble Lord has tabled are premature, because we need a stronger consensus on the way forward, notwithstanding all the good reasons that noble Lords have given for why it is important that we have updated legislation. With these concerns and reasons in mind, I hope that the noble Lord will feel able to withdraw his amendment.
Could the Minister say a few words on some of those points of discourse and non-consensus, to give the Committee some flavour of the type of issues where there is no consensus as well as the extent of the gap between some of those perspectives?
Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful as well, in unpacking what were clearly extremely well-informed recommendations. It should, no doubt, be taken extremely seriously.
Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.
My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as chief scientific adviser. It is not often in life that in a short space of time one is afforded the opportunity in government of bringing much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.
There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were to do this only in service of their much-spoken-of growth agenda, it would have an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see if anything further can be done but, from my perspective and others’, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.
My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The reason for this group of amendments is to try to get an increased focus on the range of issues they touch on.
I turn to Amendment 211B first. It seems at least curious to have a data Bill without talking about data centres in terms of their power usage, their environmental impact and the Government’s view of the current power usage effectiveness (PUE) standard. Is it of a standard that they think gives the right measure of confidence to consumers and citizens across the country, in terms of how data centres are being operated and their impacts?
Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject, but you have to go only one or two steps back in any supply chain to get into deep opacity. With this amendment, I am seeking to gain more clarity on data supply chains and the role of data across all supply chains. Through the combination of data and AI, we could potentially enable a transformation of our supply chains in real time. That would give us so much more flexibility to pursue economic and environmental benefits. I look forward to the Minister’s response.
I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into some standards and approaches in this area, which would potentially go some way towards conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.
I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.
Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.
My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on that. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes, the actions of the Government seem to go contrary to that. I support the noble Lord, Lord Holmes, in his call for essentially realising the benefits of AI while making sure that we maintain public trust.
My Lords, I am grateful to the noble Lord, Lord Holmes, for tabling Amendment 211B and his other amendments in this group, which are on a range of varied and important issues. Given the hour, I hope he will be content if I promise to write to him on each of these issues and, in the meantime, I ask him to withdraw the amendment.
I thank all noble Lords who participated: I will not go through them by name. I thank the Minister for her response and would very much welcome a letter. I am happy to meet her on all these subjects but, for now, I beg leave to withdraw the amendment.
Grand Committee

My Lords, in speaking to Amendment 137 in my name I thank the noble Baroness, Lady Harding, the noble Lord, Lord Stevenson, and my noble friend Lord Russell for their support. I also add my enthusiastic support to the amendments in the name of my noble friend Lord Colville.
This is the same amendment that I laid to the DPDI Bill, which at the time had the support of the Labour Party. I will not labour that point, but it is consistently disappointing that these things have gone into the “too difficult” box.
Amendment 137 would introduce a code of practice on children and AI. AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, the people they follow and the products they buy—and, as reported last weekend, AI is even helping farmers pick the ripest tomatoes for baked beans. But it no longer concerns simply the elective parts of life where, arguably, a child or a parent on their behalf can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors, the first of which is compulsory for children and the second of which is necessary for all of us.
The amendment has three parts. The first requires the ICO to create a code and sets out the expectations of its scope; the second considers who and what should be consulted and considered, including experts, children, and the frameworks that codify children’s existing rights; and the third part defines elements of the process, including risk assessment definitions, and sets out the principles to which the code must adhere.
When we debated this before, I anticipated that the Minister would say that the ICO had already published guidance, that we do not want to exclude children from the benefits of AI, and that we must not get in the way of innovation. Given that the new Government have taken so many cues from the previous one, I am afraid I anticipate a similar response.
I first point out, therefore, that the ICO’s non-binding guidance on AI and data protection is insufficient. It has only a single mention of a child in its 140 pages, which is a case study about child benefits. In the hundreds of pages of guidance, toolkits and sector information, nowhere are the specific needs and rights, or development vulnerabilities, of children comprehensively addressed in relation to AI. This absence of children is also mirrored in government publications on AI. Of course, we all want children to enjoy the benefits of AI, but consideration of their needs would increase the likelihood of those benefits. Moreover, it seems reckless and unprincipled not to protect them from known harms. Surely the last three decades of tech development have shown us that the experiment of a “build first, worry about the kids later—or never” approach has cost our children dearly.
Innovation is welcome, but not all innovation is equal. We have bots offering 13 year-olds advice on how to seduce grown men, or encouraging them to take their own lives; edtech products that profile children, producing unfair and biased outcomes that limit their education and life chances; and gen AI that perpetuates negative, racist, misogynist and homophobic stereotypes. Earlier this month, the Guardian reported a deep bias in the AI used by the Department for Work and Pensions. This “hurt first, fix later” approach creates a lack of trust, increases unfairness and has real-world consequences. Is it too much to insist that we ask better questions of systems that may result in children going hungry?
Why children? I am saddened that I must explain this, but from our deeply upsetting debate last week on the child protection amendments, in which the Government asserted that children are already catered for while deliberately downgrading their protections, it seems that the Government or their advisers have forgotten.
Children are different for three reasons. First, as has been established over decades, children are on a development journey. There are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony and learn different social skills. There are equally ages and stages at which they cannot do those things. The long-established consensus is that families, social groups and society more broadly, including government, step in to support them on this journey. Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces they inhabit have to be designed to be fit for childhood. Thirdly, we have a responsibility towards children that extends even beyond our responsibility to each other. This means that we cannot legitimatise profit at their expense. Allowing systems to play in the wild in the name of growth and innovation, leaving kids to pay the price, is a low bar.
It is worth noting that since we debated it, a proposal for this AI code for children that follows the full life cycle of development, deployment, use and retirement of AI systems has been drafted and has the support of multiple expert organisations and individuals around the globe. I am sure that all nations and intergovernmental organisations will have additional inputs and requirements, but it is worth saying that the proposed code, which was written with input from academics, computer scientists, lawyers, engineers and children’s rights activists, is mindful of and compatible with the EU AI Act, the White House Blueprint for an AI Bill of Rights, the Executive Order on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence, the Council of Europe’s Framework Convention on Artificial Intelligence and, of course, the UNCRC general comment no. 25.
This proposal will be launched early next year as an indication of what could and should be done. Unless the Government find their compass vis-à-vis children and tech, I suspect that another jurisdiction will adopt it ahead of the UK, making that the go-to destination for trusted tech development for child-safe products. It is perhaps worth reminding the Committee that one in three connected people is under 18, which is roughly 1 billion children. As the demographics change, the proportion and number of children will rise. It is a huge financial market.
Before I sit down, I shall briefly talk about the AADC because sometimes Ministers say that we already have a children’s code. The age-appropriate design code covers only information society services (ISS), which automatically limits it, and even the ICO by now agrees that its enforcement record is neither extensive nor impressive. It does not clearly cover the urgent area of edtech, which is the subject of another amendment, and, most pertinently to this amendment, it addresses AI profiling only, which means that it is limited in how it can look at the new and emerging challenges of generative AI. A revamp of the AADC to tackle the barriers to enforcement, account for technological advances, cover all products and services likely to be accessed by children and make our data regime AI-sensitive would be welcome, but rather than calling for a strengthening of the AADC, the ICO agreed to the downgrading of children’s data protection in the DPDI Bill and, again, has agreed to the downgrading of protections in the current Bill on ADM, scientific research, onward processing and so on. A stand-alone code for AI development is required because in this way we could be sure that children are in the minds of developers at the outset.
It is disappointing that the UK is failing to claim its place as the centre of regulated and trusted innovation. Although we are promised an AI Bill, the Government repeatedly talk of large frontier companies. AI is in every part of a child’s life from the news they read to the prices they pay for travel and goods. It is clear from previous groups that many colleagues feel that a data Bill with no AI provisions is dangerous commercially and for the communities of the UK. An AI Bill with no consideration of the daily impact on children may be a very poor next choice. Will the Minister say why a Labour Government are willing to abandon children to technology rather than building technology that anticipates children’s rights and needs?
My Lords, it is a pleasure to follow my friend the noble Baroness, Lady Kidron, and to give full-throated support to my friend the noble Viscount, Lord Colville, on all his amendments. Given that the noble Baroness mentioned it and that another week has passed since we asked the Minister the question, will we see an AI Bill or a consultation before Santa comes or at some stage in the new year? I support all the amendments in this group and in doing so, as it is the first time I have spoken today in Committee, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business.
I will speak particularly to my Amendment 211A. I have put down “image, likeness and personality” not because I believe that they are the most important rights being transgressed, nor the most important rights we should consider; I have put them down to give them specific focus because, right now, they are being largely cut across and ignored, so that all of our creatives find themselves with their works, and also their image, likeness and personality, disappearing into these large foundation AI models with no potential for redress.
Once parts of you such as your name, face or voice have been ingested, as the noble Lord, Lord Clement-Jones, said in the previous group, it is difficult then to have them extracted from the model. There is no sense, for example, of seeking an equitable remedy to put one back in the situation one would have been in had the breach not occurred. It is almost “once in, forever in”: works then start to be created based on those factors, features and likenesses, which compete directly with the creatives. This is already particularly prevalent in the music industry.
What plans do the Government have in terms of personality rights, image and likeness? Are they content with the current situation where there is no protection for our great creatives, not least in the music industry? What does the Bill do for our creatives? I go back to the point made by the noble Baroness, Lady Kidron. How can we have all these debates on a data Bill which is silent when it comes to AI, and a product regulation Bill where AI is specifically excluded, and yet have no AI Bill on the near horizon—unless the Minister can give us some up-to-date information this afternoon? I look forward to hearing from her.
My Lords, I should first apologise for not being able to attend Second Reading or, arguably more importantly, to be in Committee last week to support the many amendments of the noble Baroness, Lady Kidron, on child protection. I read Hansard carefully and was deeply depressed to see that we were once again needing to rehearse, as she has done again today, the importance of protecting children in the digital era. It seems to be our lot that there is a group of us who keep coming back. We play the merry-go-round and sit in different places; it is a privilege to sit next to the noble Baroness, Lady Kidron, for the first time in the decade that I have been in the House. I support her Amendment 137. She has given a good exposé as to why we should think really carefully about how we protect children in this AI world. I would just like to add one point about AI itself.
We keep being told—in a good way—that AI is an underlying and general-purpose technology. That means we need to properly establish the principles with which we should protect children there. We know that technology is morally neutral; it is the human beings who do the damage. In every other underlying, breakthrough technology, we have learned that we have needed to protect the most vulnerable, whether it was electricity when it first went into factories, toys when they were first distributed on the mass market, or social media, with the age-appropriate design code. I feel that it would be a huge mistake, on the third Bill where many of us have debated this subject matter, for us not to address the fact that, as of today, this is the biggest breakthrough technology of our lifetime. We should recognise that children will need protecting, as well as having the opportunity to benefit from it.
My Lords, I rise briefly to support my friend, the noble Lord, Lord Clement-Jones, and his string of amendments. He made the case clearly: it is simply about access, the right to redress and a clear pathway to that redress, a more efficient process and clarity and consistency across this part of our data landscape. There is precious little point in having obscure remedies or rights—or even, in some cases, as we have discussed in our debates on previous groups, no right or obvious pathways to redress. I believe that this suite of amendments addresses that issue. Again, I full-throatedly support them.
My Lords, I address the amendments tabled by the noble Lord, Lord Clement-Jones. These proposals aim to transfer jurisdiction from courts to tribunals; to establish a new right of appeal against decisions made by the Information Commissioner; and to grant the Lord Chancellor authority to implement tribunal procedure rules. I understand and recognise the noble Lord’s intent here, of course, but I have reservations about these amendments and urge caution in accepting them.
The suggestion to transfer jurisdiction from courts to tribunals raises substantial concerns. Courts have a long-standing authority and expertise in adjudicating complex legal matters, including data protection cases. By removing these disputes from the purview of the courts, the risk is that we undermine the depth and breadth of legal oversight required in such critical areas. Tribunals, while valuable for specialised and expedited decisions, may not provide the same level of rigorous legal analysis.
Cases such as those cited by the noble Lord, Lord Clement-Jones—Killock and another v the Information Commissioner and Delo v the Information Commissioner—demonstrate to me the intricate interplay between data protection, administrative discretion and broader legal principles. It is questionable whether tribunals, operating under less formal procedures, can consistently handle such complexities without diminishing the quality of justice. Further, I am not sure that the claim that this transfer will streamline the system and reduce burdens on the courts is fully persuasive. Shifting cases to tribunals does not eliminate complexity; it merely reallocates it, potentially at the expense of the detailed scrutiny that these cases demand.
I turn to the right of appeal against the commissioner’s decisions. Although the introduction of a right of appeal against these decisions may seem like a safeguard, it risks creating unnecessary layers of litigation. The ICO already operates within a robust framework of accountability, including judicial review for cases of legal error or improper exercise of discretion. Adding a formal right of appeal risks encouraging vexatious challenges, overwhelming the tribunal system and diverting resources from addressing genuine grievances.
I think we in my party understand the importance of regulatory accountability. However, creating additional mechanisms should not come at the expense of efficiency and proportionality. The existing legal remedies are designed to strike an appropriate balance, and further appeals risk creating a chilling effect on the ICO’s ability to act decisively in protecting data rights.
On tribunal procedure rules and centralised authority, the proposed amendment granting the Lord Chancellor authority to set tribunal procedure rules bypasses the Tribunal Procedure Committee, an independent body designed to ensure that procedural changes are developed with judicial oversight. This move raises concerns about the concentration of power and the erosion of established checks and balances. I am concerned that this is a case of expediency overriding the principles of good governance. While I acknowledge that consultation with the judiciary is included in the amendment, it is not a sufficient substitute for the independent deliberative processes currently in place. The amendment risks undermining the independence of our legal institutions and therefore I have concerns about it.
These amendments overall, while presented as technical fixes, and certainly I recognise the problem and the intent, would have far-reaching consequences for our data protection framework. The vision of my party for governance is one that prioritises stability, legal certainty and the preservation of integrity. We must avoid reforms that, whatever their intent, introduce confusion or inefficiency or undermine public trust in our system. Data protection is, needless to say, a cornerstone of our modern economy and individual rights. As such, any changes to its governance must be approached with the utmost care.
My Lords, my Amendment 115 would similarly act in that way by making automated decision-making processes explain themselves to the people affected by them. This would be a much better way of controlling the quality of what is going on with automated decision-making than restricting that sort of information to professionals—to people who are anyway overworked and have a lot of other things to do. There is no one more interested in the decision of an automated process than the person about whom it is being made. If we are to trust these systems then their ability, which is way beyond the human ability, to have the time to explain why they took the decision they did—which, if the machine is any good, it knows and can easily set out—is surely the way to generate trust: you can absolutely see what decision has been made and why, and you can respond to it.
This would, beyond anything else, produce a much better system for our young people when they apply for their first job. My daughter’s friends in that position are getting into the hundreds of unexplained rejections. This is not a good way to treat young people. It does not help them to improve and understand what is going on. I completely understand why firms do not explain; they have so many applications that they just do not have the time or the personnel to sit down and write a response—but that does not apply to an automated decision-making machine. It could produce a much better situation when it comes to hiring.
As I said, my principal concern, to echo that of the noble Viscount, is that it would give us sight of the decisions that have been taken and why. If it becomes evident that they are taken well and for good reasons, we shall learn to trust them. If it becomes evident that they really are not fair or understandable, we shall be in a position to demand changes.
My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.
Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is good, because all the amendments she put down in the past were of an excellent quality, well thought through, equally considered and even-handed.
As has been mentioned many times, we have had three versions of a data Bill in just over three years. One wonders whether all the elements of this current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. This goes to the heart of the amendments in this group on automated decision-making.
When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharpest end of the economy and the sharpest end of these automated decisions, often without even knowing that ADM was present. More than that, even on the discovery that ADM was in the mix, depending on which sector of the economy or society they find that decision being made in, they may find themselves with no or precious little redress—employment and recruitment, to name but one sector.
It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.
That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, the point that if someone is subject to an automated decision then they have the right to a personalised explanation of that decision. That explanation should be accessible: in plain language of their choice, at no cost to them, and in no sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.
Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.
In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.
My Lords, I shall speak to the series of amendments on automated decision-making to which I have added my name but are mostly in the name of the noble Lord, Lord Clement-Jones. As he said, we had a rehearsal for this debate last Friday when we debated his Private Member’s Bill so I will not delay the Committee by saying much about the generalities of ADMs in the public sector.
Suffice it to say that human involvement in overseeing AIs must be meaningful—for example, without those humans themselves being managed by algorithms. We must ensure that ADMs comply by design with the Equality Act and safeguard data subjects’ other rights and freedoms. As discussed in earlier groups, we must pay particular attention to children’s rights with regard to ADMs, and we must reinforce the obligation on public bodies to use the algorithmic transparency recording standards. I also counsel my noble friend the Minister that, as we have heard, there are many voices from civil society advising me and others that the new Article 22 of the GDPR takes us backwards in terms of protection.
That said, I want to focus on Amendment 123C, relating to ADMs in the workplace, to which I was too late to add my name but would have done. This amendment follows a series of probing amendments tabled by me to the former DPDI Bill. In this, I am informed by my work as the co-chair of the All-Party Parliamentary Group on the Future of Work, assisted by the Institute for the Future of Work. These amendments were also mirrored during the passage of the Procurement Act and competition Act to signal the importance of the workplace, and in particular good work, as a cross-cutting objective and lens for policy orientation.
Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?
On the question of qualification, the Minister may wish to reflect on the broad discussions we have had in the past around certification and the role it may play. I gently take her back to what she said on Amendment 123A about notification. Does she see notification as the same as a personalised response to an individual?
Noble Lords have asked several questions. First, in response to the noble Viscount, Lord Camrose, I think I am on the same page as him about binary rather than muddying the water by having degrees of meaningful intervention. The ICO already has guidance on how human review should be provided, and this will be updated after the Bill passes to ensure that it reflects what is meant by “meaningful human involvement”. Those issues will be addressed in the ICO guidance but, if it helps, I can write further on that.
I have forgotten the question that the noble Lord, Lord Holmes, asked me. I do not know whether I have addressed it.
In her response the Minister said “notification”. Does she see notification as the same as “personalised response”?
My understanding is that it would be. Every individual who was affected would receive their own notification rather than it just being on a website, for example.
Let me just make sure I have not missed anyone out. On Amendment 123B on addressing bias in automated decision-making, compliance with the data protection principles, including accuracy, transparency and fairness, will ensure that organisations take the necessary measures to address the risk of bias.
On Amendment 123C from the noble Lord, Lord Clement-Jones, I reassure him that the Government strongly agree that employment rights should be fit for a modern economy. The plan to make work pay will achieve this by addressing the challenges introduced by new trends and technologies. I agree very much with my noble friend Lord Knight that although we have to get this right, there are opportunities for a different form of work, and we should not just see this as being potentially a negative impact on people’s lives. However, we want to get the balance right with regard to the impact on individuals to make sure that we get the best rather than the possible negative effects out of it.
Employment rights law is more suitable for regulating the specific use of data and technology in the workplace rather than data protection law in isolation, as data protection law sets out general rules and principles for processing that apply in all contexts. Noble Lords can rest assured that we take the impact on employment and work very seriously, and as part of our plan to make work pay and the Employment Rights Bill, we will return to these issues.
On Amendments 119, 120, 121 and 122, tabled by the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and my noble friend Lord Knight, the Government share the noble Lords’ belief in the importance of public sector algorithmic transparency, and, as the noble Lord, Lord Clement-Jones, reminded us, we had a very good debate on this last week. The algorithmic transparency recording standard is already mandatory for government departments and arm’s-length bodies. This is a cross-government policy mandate underpinned by digital spend controls, which means that when budget is requested for a relevant tool, the team in question must commit to publishing an ATRS record before receiving the funds.
As I said on Friday, we are implementing this policy accordingly, and I hope to publish further records imminently. I very much hope that when noble Lords see what I hope will be a significant number of new records on this, they will be reassured that the nature of the mandation and the obligation on public sector departments is working.
Policy routes also enable us to provide detailed guidance to the public sector on how to carry out its responsibilities and monitor compliance. Examples include the data ethics framework, the generative AI framework, and the guidelines for AI procurement. Additionally, the data protection framework already achieves some of the intended outcomes of these amendments. It requires organisations, including public authorities, to demonstrate how they have identified and mitigated risks when processing personal data. The ICO provides guidance on how organisations can audit their privacy management and ensure a high level of data protection compliance.
I know I have given a great deal of detail there. If I have not covered all the points that the noble Lords have raised, I will write. In the meantime, given the above assurances, I hope that the noble Lord will withdraw his amendment.
Grand Committee

My Lords, I have in subsequent groups a number of amendments that touch on many of the issues that are raised here, so I will not detain the Committee by going through them at this stage and repeating them later. However, I feel that, although the Government have had the best intentions in bringing forward a set of proposals in this area that were to update and to bring together rather conflicting and difficult pieces of legislation that have been left because of the Brexit arrangements, they have managed to open up a gap between where we want to be and where we will be if the Bill goes forward in its present form. I say that in relation to AI, which is a subject requiring a lot more attention and a lot more detail than we have before us. I doubt very much whether the Government will have the appetite for dealing with that in time for this Bill, but I hope that at the very least—it would be a minor concession at this stage—they will commit at the Dispatch Box to seeking to resolve these issues in the legislation within a very short period because, as we have heard from the arguments made today, it is desperately needed.
More importantly, if, by bringing together documentation that is thought to represent the current situation, either inadvertently or otherwise, the Government have managed to open up a loophole that will devalue the way in which we currently treat personal data—I will come on to this when I get to my groups in relation to the NHS in particular—that would be a grievous situation. I hope that, going forward, the points that have been made here can be accommodated in a statement that will resolve them, because they need to be resolved.
My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as adviser to Socially Recruited, an AI business.
I support the noble Viscount, Lord Colville, in his amendments and all the other amendments in this group. They were understandably popular, to the extent that when I got my pen out, there was no space left for me to co-sign them, so I was left with the oral tradition in which to reflect my support for them. Before going into the detail, I just say that we have had three data Bills in just over three years: DPDI, DISD and this Bill. Over that period, though the names have changed, much of the meat remains the same in the legislation. Yet, in that period, everything and nothing has changed—everything in terms of what has happened with generative AI.
Considering that seismic shift that has occurred over these three Bills, could the Minister say what in this Bill specifically has changed, not least in this part, to reflect that seismic change? Regarding “nothing has changed”, nothing has changed in terms of the incredibly powerful potential of AI for positive or negative outcomes, ably demonstrated with this set of amendments.
If you went on to Main Street and polled the public, I believe that you would get a pretty clear understanding of what they considered scientific research to be. You know it. You understand why we would want to have a specified definition of scientific research and what that would mean for the researchers and for the country.
However, if we are to draw that definition as broadly as it currently is in the Bill, why would we bother to have such a definition at all? If the Government’s intention is to enable so much to come within the perimeter, let us not have the definition at all and let us allow what is happening right now to continue, not least in the reuse of scraped data or in how data is being treated in these generative AI models.
We have seen what has happened in terms of the training, but when you look at what could be called development and improvement, as the noble Viscount has rightly pointed out, all this and more could easily fit within the scientific research definition. It could even more easily fit in when lawyers are deployed to ensure that that is so. I know we are going to come on to rehearsing a number of these subjects in the next group but, for this group, I support all the amendments as set out.
I ask the Minister these two questions. First, what has changed in all the provisions that have gone through all these three iterations of the data Bill? Secondly, what is the Government’s intention when it comes to scientific research, if it is not truly to mean scientific research, if it is not to have ethics committee involvement and if it is not to feel sound and be defined as what most people on Main Street would recognise as scientific research?
I start by apologising because, due to a prior commitment, I am not able to stay for many of the proceedings today, but I see these groupings and others as critical. In the few words that I will say, I hope to bring to bear to this area some of my experience as a Health Minister, particularly in charge of technology and development of AI.
I can see a lot of good intent behind these clauses, to make sure that we do not stop a lot of the research that we need. I was recently very much involved in the negotiation of the pandemic accord regarding the next pandemic and how you make sure that any vaccines that you develop on a worldwide basis can be distributed on a worldwide basis as well. One of the main stumbling blocks was that the so-called poorer countries were trying to demand, as part of that, the intellectual property to be able to develop the vaccines in their own countries.
The point we were trying to make was that, although we could see the good intentions behind that, it would have a real chilling effect on pharmaceutical companies investing the hundreds of millions or even billions of pounds, which you often need with vaccines, to find a cure, because if they felt that they were going to lose their intellectual property and rights at the end, it would be much harder for them to justify the investment up front.
I advise the Committee that, if this amendment is agreed, I cannot call Amendment 61 by reason of pre-emption.
My Lords, it is a pleasure to take part in the debate on these amendments. I very much support Amendment 60 as introduced. I was delighted to hear the Minister tell the Grand Committee that the Government are coming forward with an AI Bill. I wonder if I might tempt her into sharing a bit more detail with your Lordships on when we might see that Bill or indeed the consultation. Will it be before Santa or sometime after his welcome appearance later this month?
We touched on a number of areas related to Amendment 65A in the previous group. This demonstrates the importance of and concern about Clause 67, as so many amendments pertain to it. I ask the Minister whether a large language model that comes up with medically significant conclusions but, prior to that, gained a considerable amount of that data from scraping, would be permissible within Clause 67 as drafted.
Similarly, there are overriding and broader reuse possibilities from the drafting as set out. Again, as has already been debated, scientific research has a clear meaning in many respects. That clarity very much comes when you add public interest and ethics. Could a model that has taken vast quantities of others’ data without consent and—nodding more towards Amendment 60—without remuneration still potentially fit within the definition of “scientific research”?
In many ways, we are debating these points around data in the context of scientific research, but we could go to the very nub or essence of the issue. All that noble Lords are asking, in their many eloquent and excellent ways, is whose data is it, to what purpose is it being put, and have those data owners consented, been respected and, where appropriate—particularly when it comes to IP and copyrighted data—been remunerated? This is an excellent opportunity to expand on the earlier debate on Clause 67. I look forward to the Minister’s response.
My Lords, I support Amendment 71 and others in this group from the noble Lords, Lord Clement-Jones and Lord Stevenson. I apologise for not being able to speak at Second Reading. The noble Lord, Lord Clement-Jones, will remember that we took a deep interest in this issue when I was a Health Minister and the conversations that we had.
I had a concern at the time. We all know that the NHS needs to be digitised and that relevant health professionals need to be able to access relevant data when they need to, so that there is no need to be stuck with one doctor when you go to another part of the country. There are so many efficiencies that we could have in the system, as long as they are accessed by relevant and appropriate health professionals at the right time. But it is also important that patients have confidence in the system and that their personal data cannot be shared with commercial organisations without them knowing. As other noble Lords have said, this is an issue of trust.
For that reason, when I was in that position, I reached out to civil liberties organisations to understand their concerns. For example, medConfidential was very helpful and had conversations with DHSC and NHS officials. In fact, after those conversations, officials told me that its demands were reasonable and that some of the things being asked for were not that difficult to give and common sense.
I asked a Written Question of the noble Baroness’s ministerial colleague, the noble Baroness, Lady Merron, about whether patients will be informed of who has had access to their patient record, because that is important for confidence. The Answer I got back was that the Government were proposing a single unified health record. We all know that. She said that:
“Ensuring that patients’ confidential information remains protected and is seen only by those who need to see it will be a priority. Public engagement next month will help us understand what safeguards patients would want to see”.
Surely the fact that patients have opted out shows that they already have concerns and have raised them.
The NHS can build the best data system—or the federated data platform, as it is called—but without patient confidence it is simply a castle made of sand. As one of my heroes, Jimi Hendrix, once said, castles made of sand fall into the sea eventually. We do not want to see that with the federated data platform. We want to see a modernised system of healthcare digital records, allowing joined-up thinking on health and care right across a patient’s life. We should be able to use machine learning to analyse those valuable datasets to improve preventive care. But, for that to happen, the key has to be trust and patients being confident that their data is secure and used in the appropriate way. I look forward to the Minister’s response.
My Lords, I support these amendments in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. It is a pleasure to follow the second ex-Health Minister this afternoon. In many ways, the arguments are just the same for health data as they are for all data. It is just that, understandably, it is at the sharpest end of this debate. Probably the most important point for everybody to realise, although it is espoused so often, is that there is no such thing as NHS data. It is a collection of the data of every citizen in this country, and it matters. Public trust matters significantly for all data but for health data in particular, because it goes so close to our identity—our very being.
Yet we know how to do public trust in this country. We know how to engage and have had significant success in public engagement decades ago. What we could do now with human-led technology-supported public engagement could be on such a positive and transformational scale. But, so far, there has been so little on this front. Let us not talk of NHS data; let us always come back to the fundamental principle encapsulated in this group of amendments and across so many of our discussions on the Bill. Does the Minister agree that it is about not NHS data but our data—our decisions—and, through that, if we get it right, our human-led digital futures?
Many thanks to all noble Lords who have proposed and supported these amendments. I will speak to just a few of them.
Amendment 70 looks to mitigate the lowering of the consent threshold for scientific research. As I have set out on previous groups, I too have concerns about that consent threshold. However, for me the issue is more with the definition of scientific research than with the consent threshold, so I am not yet confident that the amendment is the right way to achieve those desirable aims.
Amendment 71 would require that no NHS personal data can be made available for scientific research without the explicit consent of the patient. I thank the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, for raising this because it is such an important matter. While we will discuss this in other groups, as the noble Baroness, Lady Kidron, points out, it is such an important thing and we need to get it right.
I regret to advise my noble friend Lord Holmes that I was going to start my next sentence with the words “Our NHS data”, but I will not. The data previously referred to is a very significant and globally unique national asset, comprising many decades of population-wide, cradle-to-grave medical data. No equivalent at anything like the same scale or richness exists anywhere, which makes it incredibly valuable. I thank my noble friend Lord Kamall for stressing this point with, as ever, the help of Jimi Hendrix.
However, that data is valuable only to the extent that it can be safely exploited for research and development purposes. The data can collectively help us develop new medicines or improve the administration and productivity of the NHS, but we need to allow it to do so properly. I am concerned that this amendment, if enacted, would create too high an operational and administrative barrier to the safe exploitation of this data. I have no interest in compromising on safety, but we have to find a more efficient and effective way of achieving it.
Amendments 79, 81 and 131 all look to clarify that the definition of consent to be used is in line with the definition in Article 4(11) of the UK GDPR:
“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
These amendments would continue the use of a definition that is well understood. However, paragraph 3(a) of new Article 8A appears sufficient, in that it requires that the purpose for which a data subject consents is “specified, explicit and legitimate”.
Finally, with respect to Clause 77 stand part, I take the point and believe that we will be spending a lot of time on these matters going forward. But, on balance and for the time being, I feel that this clause needs to remain, as there must be clear rules on what information should be provided to data subjects. We should leave it in for now, although we will no doubt be looking to polish it considerably.
(1 month, 1 week ago)
Lords Chamber
My Lords, it is a pleasure to take part in this Second Reading debate. I thank the Minister for the way she introduced the Bill. I declare my interests as set out in the register, particularly those in technology and financial services: as an adviser to Ecospend, an open banking technology, and to Socially Recruited, an AI business.
It is a pleasure to take part in a Second Reading for the third time on one Bill with three different names. We should all feel grateful that the only word to survive in all those titles is “data”, which must be a good thing. It is also a pleasure to follow so many excellent speeches, to which I find myself simply saying “Yes, agree, agree”, in particular the excellent speech of the noble Baroness, Lady Kidron, who pointed to some of the most extreme and urgent issues that we must address in data. I also support the concept from the noble Lord, Lord Knight, of the Government laying out their overall approach to all new technologies and issues around data so that we have a road map, suite, menu or whatever of everything they intend in the coming months and years, so that we can have clarity and try to enable consistency through all these Bills and statutory measures, which cover so much of our economy and society. As this is the third Second Reading for one Bill, I will cover three issues: smart data, automated decisions and the use of data in training AI.
On smart data, perhaps it would be better for the public if we called it “smart uses of data”. As has been mentioned, open banking is currently the only smart use of data. Perhaps one of the reasons why it has not been mainstreamed or reached critical mass in our society is the brand and rollout of open banking. We should all be rightly proud of the concept’s success—made in the UK and replicated in over 60 jurisdictions around the world, many of which have gone much further than us in a much shorter time. It demonstrates that we know how to do right-sized regulation and shows that we know how to regulate for innovation, consumer protection and citizens’ rights. Yet we are doing so little of this legislation and regulation.
It is one thing to pass that willed regulatory intervention, but perhaps the Government and other entities did not do anywhere near enough promotion of the opportunities and possibilities of open banking. If you polled people on the high street about open banking, I imagine they would say, “I have no idea what you’re talking about; they’ve closed all the branches”. This goes to the heart of the point raised by the noble Lord, Lord Knight. Without a coherent narrative, explained, communicated and connected across our society, it is hardly surprising that we have not only such a low level of take-up of open banking but so little connection to all the opportunities around these new technologies.
The opportunities are immense, as set out in this Bill. The extension of smart data into areas such as energy provision could be truly transformational for citizens and bill payers. What is the Government’s plan, on the passage of this Bill, to communicate these opportunities, so that all bill payers, citizens and consumers are aware of what these smart data, smart energy and smart savings provisions may bring them?
Secondly, as has rightly and understandably been mentioned by noble Lords, the Bill proposes a significant and material change to automated decision-making. It could be argued that one of the impacts of gen AI has been to cause a tidal wave of automated decisions, not least in recruitment and employment. Somebody may find themselves on the wrong end of a shortlisting decision for a role: an automated decision where the individual did not even know that AI was in the mix. I suggest that that makes as clear a case as any for the need to label all goods and products in which AI is involved.
The Bill seeks to take Article 22 and turn it into what we see in Clause 80. Would the Minister not agree that Clause 80 is largely saying, “It’s over to you, pal”? How can somebody effectively assert their right if they do not even know that AI and automated decision-making were in the mix at the time? Would the Minister not agree that, at the very least, there must be a right for an individual to have a personalised explanation of the decision, to understand what was at play, with some potential for redress if sought?
Thirdly, on the use of data in training AI, where is the Bill on this most critical point? Our creatives add billions to the UK economy and they enrich our society. They lift our souls, making music where otherwise there may be silence, filling in the blank page with words that change our lives and pictures that elevate the human condition. Yet right now, we allow their works to be purloined without consent, respect or remuneration. What does the Bill do for our creative community, a section of the economy growing at twice the rate of the rest of it?
More broadly, why is the Bill silent when it comes to artificial intelligence, impacting as it does so many elements of our economy, society and individuals’ lives right now? If we are not doing AI in this Bill, when will we be? What are we waiting to know that we do not already know to make a decent effort at AI legislation and regulation?
The danger is that, with so much running through the Bill, if we do not engender a connection with the public then there will be no trust. No matter how much potential there is in these rich datasets and potential models to transform our health, education, mobility and so much more, none of it will come to anything if there is not public trust. I guess we should not be so surprised that, while we all enjoy “Wolf Hall: The Mirror and the Light” every Sunday evening, there is more than a degree of Henry VIII spattered through this Bill as a whole.
I move to some final questions. What is the Government’s position when it comes to the reversal of the burden of proof for computer evidence? We may need to modernise the pre-1999 position, but it should certainly be the case that such evidence is put to proof. We cannot continue with the situation so shamefully and shockingly set out in the Horizon scandal, as rightly described by my noble friend Lord Arbuthnot, who has done more than any in this area.
Similarly, on the Bill in its entirety, has the “I” of GenAI been passed over in the Bill as currently constructed? So many of the clauses and so much of the wording were put together before the arrival of GenAI. Is there not a sense that the Bill needs renewal throughout, with so many clauses at least creaking as a consequence of that arrival?
Will the Government consider updating the Computer Misuse Act, legislation which came into being before we had any of this modern AI or modern computing? Will they at least look at a statutory defence for our cyber community, who do so much to keep us all safe but, for want of a statutory defence, have to do so much of that with at least one hand tied behind their back?
Does the Minister believe that this Bill presents the opportunity to move forward with data literacy? This will be required if citizens are to assert their data rights and be able to say of their data, “It is my data and I decide to whom it goes, for what and for what remuneration”.
Finally, what is the Government’s approach to data connected to AI legislation, and when may we see at least a White Paper in that respect?
Data may be, as the Minister said, the DNA of our time, or, as other noble Lords have said, the oil; perhaps more pertinently it may be the plastic of our time, for all that that entails. The critical point is this: it offers so much potential, but not inevitability, to drive economic, social and psychological growth. We need to enable and empower all our citizens to be able to say, full-throatedly, “Our data; our decisions; our human-led digital futures”.
(1 month, 1 week ago)
Lords Chamber
The noble Lord makes a very good point and I hope that that can be included in the Green Paper as one aspect of this. I reiterate that we see a future for community hubs. It may be that we need fixed premises for that to work in practice, rather than for it to be something that just visits. For more isolated communities, that may well be a solution. Whatever happens, we want to guarantee to all communities in the UK that they will be able to access a post office to do the business that they need to do in order to access public services, driving licences and all the things we were talking about earlier. They will need to have some form of post office within easy reach. That is certainly one way of looking at it.
My Lords, I declare my financial services and technology interests, as set out in the register. Would the Minister agree that the country is suffering from an epidemic of financial exclusion and digital exclusion, with the two often walking painfully hand in hand? Would not the golden principles of financial inclusion and digital inclusion be two excellent elements on which to found the Post Office going forward?
Would she also agree that a significant part of the difficulties experienced by sub-postmasters and sub-postmistresses was that computer evidence was taken almost on the nod? Would she agree that it is high time we reversed the burden of proof for computer evidence, restoring it to what it was before it was changed to this iniquitous position?
My Lords, I think we have all learned the lesson from the Horizon scandal that you cannot assume that the computer is always right. I absolutely agree with the noble Lord that we need to be much more sceptical when presented with that kind of evidence in future.
On digital exclusion, the noble Lord is absolutely right. It is a huge issue for the Government and we are taking it very seriously. A huge piece of work is going on around this. Obviously, our ambition is to make sure that everybody has the skills and capacity to go online and access services, because it is to their benefit; it makes their life easier. The proposals we have—for example, the Government’s One Login service—will always include the option for individuals to go in person to a post office to access those services as an alternative. We will make sure that people are not excluded. But the real challenge relates to the discussion we were having earlier with the noble Baroness, Lady Smith, about education and skills; it is our intention to make sure that people have the skills, education, capacity and equipment to go online and have all the advantages that the digital world will offer them.
(1 month, 2 weeks ago)
Lords Chamber
My Lords, the Government are clear that copyright law must be respected when content is used to train AI models. If copies are made of protected work, licences must be obtained from the copyright owner unless a specific copyright exception applies. The problem is that the law is not yet applied equally to generative AI models, and that is the issue we are grappling with. Our view is that this should not necessarily be left to the courts; unfortunately, it takes a long time for these legal cases to be resolved. We are trying to find a way forward that will be fair to everybody but that does not require the long legislative process that I know the noble Earl is all too aware of.
My Lords, I declare my technology interests as set out in the register. Does the Minister agree that this is not just a question of fairness? We must have a respected, remunerated, consented, dynamic licensing market for IP and copyrighted works, both for the benefit of IP and creatives and for a flourishing AI market in the UK.
The noble Lord is quite right: we have to find a way forward that reflects the importance of both these sectors to our economy. The creative industries are one of the UK’s most powerful economic sectors, worth £124 billion in GVA at the moment, so they are hugely important. We know that we have to respect the creative sector and the journalists working in it but, equally, we know that the future will be about an enhanced AI system. More and more businesses in the UK are now using AI, so that is the way forward and we have to find a way through this, but there is not a simple answer. I assure noble Lords that my colleagues, particularly Chris Bryant and Feryal Clark, are very aware of this issue. It has to be resolved, but we would just ask for a little more space to allow us to make some progress.
(3 months, 2 weeks ago)
Lords Chamber
My Lords, it is a pleasure to take part in this debate. I congratulate the noble Lord, Lord Hollick, on his excellent introduction to the debate and thank him and the committee for an excellent report that covers so much ground with such clarity and detail. “Who watches the watchdogs?” has been the cry across centuries of human society, and it has never been more applicable than today, with the proliferation of regulators covering all aspects of our economy and society. Performance, independence and accountability are exactly the right three legs of the tripod on which to rest any examination of how we regulate in the UK in the 21st century. The recommendations are clear, achievable and relevant, and I agree with all of them.
The themes running through the report are equally clear. There is a sense that it is as good as pointless—worse, harmful—simply to add more statutory objectives to regulators in the belief that this would impact performance and produce a better result for the market or consumers. Similarly, some regulators are able to fund themselves through levies and fees, and others have to go with their hand out to government. That financial structure must impact on the way that they operate, through no fault of their own.
The cry I hear running through the whole report is for clarity, consistency and coherence across the regulatory landscape. I agree entirely. This is never clearer than when we come to artificial intelligence, where, currently, there is no regulator. The previous Government had the inadequate approach of writing a letter to all regulators to ask them what they intended to do when it comes to artificial intelligence. Will the Minister say what this Government’s approach will be to getting the right regulatory framework for AI? I would certainly like to see an AI authority, as provided for in my AI Private Member’s Bill, and I thank the noble Viscount, Lord Chandos, for his kind words about it.
When I say an AI authority, I do not mean a behemothic regulator covering all aspects of AI; I mean a right-sized, agile, nimble and, crucially, horizontally focused regulator: one that looks across all the existing regulators to assess their competence, addresses the issues, challenges and opportunities of AI and identifies the gaps where currently there is no recourse. For example, if you find yourself on the wrong end of a recruitment decision, often without even knowing that AI was in the mix, there is currently nowhere in the regulatory landscape to seek redress. Similarly, we need an AI authority to be the custodian of the principles we want to see: not just right-sized regulation of AI but, going further, an ability to transform the way we regulate across the whole of our economy and society and to look at all legislation to assess its adequacy for the challenges and opportunities of AI.
Will the Minister say where the Government currently are with the Regulatory Innovation Office? What will be its scope? How will it be funded? What will be its first tasks? Does she agree that it is high time we had an AI authority if we are to gain all the economic, social and psychological benefits of AI while being wholly conscious of, and competent to address, all the risks and challenges? I suggest that such an AI authority would not only improve how we regulate AI but improve how we go about regulation and regulators across the piece: not just asking “Who watches the watchdogs?” but enabling those watchdogs to be more, enabling them to be guard dogs and guide dogs and, crucially, if the guard dog and the guide dog fail, empowering them to show their teeth.
(3 months, 3 weeks ago)
Lords Chamber
My Lords, it is a pleasure to take part in this Second Reading and to follow the noble Lord, Lord Berkeley. I congratulate the noble Lord, Lord Redesdale, on his Bill and on the impact that it has already had on the new Government. It means that, unusually for a Second Reading, I can indulge in more questions to the Minister than may otherwise be the case.
It is clear that we cannot turn our energy system green and we cannot reach net zero without batteries. The questions are which batteries, with what chemistry within them, and how they are constructed, controlled and deployed. That goes to the heart of the noble Lord’s Bill, which I support and wish well on its journey. It seems it may have a longer and more winding journey—or perhaps shorter and more winding—than other Private Members’ Bills.
I have a number of questions for the Minister, not least on the Product Regulation and Metrology Bill, which encompasses much of the noble Lord’s principles in his Private Member’s Bill. First, would it be a good idea to have a complete prohibition on charging any of these batteries, whatever device they are in, in any hallway or common parts of shared dwellings?
Secondly, are the current sanctions against those who manufacture and produce batteries that are not of the requisite standard and quality at an appropriate level? I am also interested to hear what representations the Government have had from our courageous firefighters on what is happening out there. Do we have a clear picture of the number of fires caused by lithium-ion batteries? Has that mapping exercise been done, and is it clearly understood? What do the Government need to do to support our firefighters in facing these different challenges? There will be an exponential increase in the number of these batteries, not just on our person but moving around on small, large and mega mobility devices. What is the Government’s plan to control and effectively deal with these devices when, in tragic and horrific situations, they go wrong?
Looking more broadly than the Bill, it is clear that the Government need an overall battery strategy. We saw the issues with Britishvolt in the north-east, so I am interested to hear from the Minister about the Government’s current strategy for battery use and development, and for getting the UK to the level of battery manufacture that it requires to deliver on net zero and our mobility needs.
I refer the Minister to a report on this issue from the Science and Technology Committee, on which I served, published a couple of years ago and called “Battery Strategy Goes Flat”. I cannot claim to have been the author of the title but, as it referred to the previous Government, perhaps the Minister can tell us the current Government’s strategy for the battery needs of the country.
Similarly, what level of investment is going into developing and understanding not just current battery technologies but, as has already been referred to in this debate, all the new technologies coming on stream, including the very nascent technologies that are likely to form a large part of our battery need in a short time? All have potential, but that potential is allied to risks that need to be understood and legislated for.
Finally, on the future, what is the Government’s grand vision for the role of batteries and fuel cells across our economy and society, so that we have a safe, positive transition to green energy, mobility for all in an inclusive manner, and a situation where the chemistry and science are fully understood so that, most importantly, we can all go about our business safely? I wish the Bill well and look forward to seeing how it interacts with the product safety Bill to put the country in a far better situation for the generation and storage of energy and, crucially, our safety.
(4 months, 3 weeks ago)
Lords Chamber
I thank the noble Lord for that question and for all the work he has done on the AI issue, including his new book, which I am sure is essential reading over the summer for everybody. I should say that several noble Lords in this Chamber have written books on AI, so noble Lords might want to consider that for their holiday reading.
The noble Lord will know that the use and regulation of live facial recognition is for each country to decide. We already have some regulation in place: its use is governed by data protection, equality and human rights legislation, supplemented by specific police guidance. It is absolutely vital that it is used only where necessary, proportionate and fair. We will continue to look at the legislation and at whether privacy is being sufficiently protected. That is an issue that will come forward when the future legislation is being prepared.
My Lords, would the Minister agree that the way to regulate AI is principles-based, outcomes-focused and input-understood, and always, where appropriate, remunerated? To that end, what is the Government’s plan to support our creative industries—the musicians, writers and artists who make such a contribution to our economy, society and well-being, and whose IP and copyright are currently being swallowed up by gen AI, with no respect, no consent and no remuneration? Surely it is time to legislate.
The noble Lord raises a really important point here and again I acknowledge his expertise on this issue. It is a complex and challenging area and we understand the importance of it. I can assure the noble Lord that it remains a priority for this Government and that we are determined to make meaningful progress in this area. We believe in both human-centred creativity and the potential of AI to open new creative frontiers. Finding the right balance between innovation and protection for those creators and for the ongoing viability of the creative industries will require thoughtful engagement and consultation. That is one of the things we will do when we consult on the new legislation.