Grand Committee
My Lords, I rise briefly, first, to thank everyone who has spoken so eloquently about automated decision-making, in particular its significance for public trust and the need for human intervention. The retrograde step of watering down Article 22 is to be deplored. I am therefore grateful to the noble Lord, Lord Clement-Jones, for proposing that this part of the Bill should not stand part. Secondly, the specific amendment that I have laid seeks to retain the broader application of human intervention to automated decision-making where it matters. I can see no justification for that watering down, particularly when there is such uncertainty about the scope that AI may bring to what can be done by automated decision-making.
My Lords, in speaking to this group of amendments I must apologise to the Committee that, when I spoke last week, I forgot to mention my interests in the register, specifically as an unpaid adviser to the Startup Coalition. For this Committee stage, noble Lords will realise that I have confined myself to amendments relevant to healthcare and to improving it.
I will speak to Amendments 111 and 116 in the names of my noble friends Lord Camrose and Lord Markham, and Amendment 115 from my noble friend Lord Lucas and the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth, as well as other amendments, including those from my noble friend Lord Holmes—I will probably touch on most amendments in this group. To illustrate my concerns, I return to two personal experiences that I shared during debate on the Data Protection and Digital Information Bill. I apologise to noble Lords who have heard these examples previously, but they illustrate the points being made in discussing this group of amendments.
A few years ago, when I was supposed to be travelling to Strasbourg, my train to the airport was delayed. My staff picked me up, booked me a new flight and drove me to the airport. I arrived with my new boarding pass and scanned it to get into the gate area, but when I scanned it again to board the flight, I was refused. No one there could explain why, having been allowed through security, I was not allowed on the flight. To cut a long story short, after two hours of being gaslighted by four or five staff, none of whom would even admit that they could not explain what had happened, I eventually had to return to the check-in desk—the very thing all the automation was supposed to avoid—to ask what had gone wrong. The airline claimed that it had sent me an email that day. The next day, it admitted that it had not. It then explained that a flag had gone off in its system. That was the only explanation offered.
This illustrates the point about human intervention, but it is also about telling customers and others what happens when something goes wrong. The company clearly had not trained its staff in how to speak to customers or in transparency. Companies such as that airline get away with this sort of disgraceful behaviour all the time, but imagine if such technology were being used in the NHS. Imagine the same scenario: you turn up for an operation and scan your barcode to enter the hospital—possibly even the operating theatre—but you are denied access. There must be accountability, transparency and human intervention, and in instances such as these that intervention must be immediate. These things are critical.
I know that this Bill makes some sort of differentiation between more critical and less critical ADM, but let me illustrate my point with another example. A few years ago, I paid for an account with one of those whizzy fintech banks. Its slogan was: “We are here to make money work for everyone”. I downloaded the app and filled out the fields, then a message popped up telling me, “We will get back to you within 48 hours”. Two weeks later, I got a message on the app saying that I had been rejected and that, by law, the bank did not have to explain why. Once again, I ask noble Lords to imagine. Imagine Monzo’s technology being used on the NHS app, which many people currently use for repeat prescriptions or booking appointments. What would happen if you tried to book an appointment but you received a message saying, “Your appointment has been denied and, by law, we do not have to explain why”? I hope that we would have enough common sense to ensure that there is human intervention immediately.
I realise that the noble Lord, Lord Clement-Jones, has a Private Member’s Bill on this issue—I am sorry that I have not been able to take part in those debates—but, for this Bill, I hope that the two examples I have just shared illustrate the point that I know many noble Lords are trying to make in our debate on this group of amendments. I look forward to the response from the Minister.
I thank all noble Lords who have spoken. I must confess that, of all the groups we are looking at today, I have been particularly looking forward to this one. I find this area absolutely fascinating.
Let me begin in that spirit by addressing an amendment in my name and that of my noble friend Lord Markham; I ask the Government and all noble Lords to give it considerable attention. Amendment 111 seeks to insert the five principles set out in the AI White Paper published by the previous Government and to require all those engaged in ADM—indeed, in all forms of AI—to have due regard for them. They are:
“safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress”.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real safeguards against the risks of AI while continuing to foster innovation.
I will briefly make three points to commend their inclusion in the Bill, as I have described. First, the Bill team has argued throughout that these principles are already addressed by the principles of data protection and so are covered in the Bill. There is overlap, of course, but I do not agree that they are equivalent. Data protection is a significant concern in AI, but the risks and, indeed, the possibilities of AI go far beyond data protection. We simply cannot entrust all our AI risks to data protection principles.
Secondly, I think the Government will point to their coming AI Bill and suggest that we should wait for that before we move significantly on AI. However, in practice all we have to go on about that Bill—I recognise that Ministers cannot describe much of it now—is that it will focus on the largest AI labs and the largest models; I assume it will place existing voluntary agreements on a statutory footing. In other words, we do not know when that Bill is coming, and its approach will allow a great many smaller AI fish to slip through the net. If we want to enshrine in law principles that cover all use of AI here, this Bill may not quite be the only game in town, but it is certainly the only all-encompassing, holistic game in town likely to be positively impactful. I look forward to the Minister’s comments on this point.
Grand Committee
So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.
We want to use our data well. We want it to benefit our public services, we want it to benefit UK plc and we want to make the world a better place, but not at the expense of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, inasmuch as consent is a fundamental part of the current regulatory system and nothing in the Bill replaces it wholesale with some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.
My Lords, I support Amendment 71 and others in this group from the noble Lords, Lord Clement-Jones and Lord Stevenson. I apologise for not being able to speak at Second Reading. The noble Lord, Lord Clement-Jones, will remember the deep interest we took in this issue when I was a Health Minister, and the conversations that we had.
I had a concern at the time. We all know that the NHS needs to be digitised and that relevant health professionals need to be able to access relevant data when they need it, so that your records are not stuck with one doctor when you go to another part of the country. There are so many efficiencies that we could gain in the system, as long as the data is accessed by relevant and appropriate health professionals at the right time. But it is also important that patients have confidence in the system and that their personal data cannot be shared with commercial organisations without their knowledge. As other noble Lords have said, this is an issue of trust.
For that reason, when I was in that position, I reached out to civil liberties organisations to understand their concerns. For example, medConfidential was very helpful and had conversations with DHSC and NHS officials. In fact, after those conversations, officials told me that its demands were reasonable and that some of the things being asked for were not that difficult to grant and were plain common sense.
I asked a Written Question of the noble Baroness’s ministerial colleague, the noble Baroness, Lady Merron, about whether patients would be informed of who has had access to their patient record, because that is important for confidence. The Answer I received was that the Government were proposing a single unified health record; we all know that. She said:
“Ensuring that patients’ confidential information remains protected and is seen only by those who need to see it will be a priority. Public engagement next month will help us understand what safeguards patients would want to see”.
Surely the fact that patients have opted out shows that they already have concerns and have raised them.
The NHS can build the best data system—or the federated data platform, as it is called—but without patient confidence it is simply a castle made of sand. As one of my heroes, Jimi Hendrix, once said, castles made of sand fall into the sea eventually. We do not want to see that with the federated data platform. We want to see a modernised system of healthcare digital records, allowing joined-up thinking on health and care right across a patient’s life. We should be able to use machine learning to analyse those valuable datasets to improve preventive care. But, for that to happen, the key has to be trust and patients being confident that their data is secure and used in the appropriate way. I look forward to the Minister’s response.
My Lords, I support these amendments in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. It is a pleasure to follow the second ex-Health Minister this afternoon. In many ways, the arguments are just the same for health data as they are for all data. It is just that, understandably, it is at the sharpest end of this debate. Probably the most important point for everybody to realise, although it is espoused so often, is that there is no such thing as NHS data. It is a collection of the data of every citizen in this country, and it matters. Public trust matters significantly for all data but for health data in particular, because it goes so close to our identity—our very being.
Yet we know how to do public trust in this country. We know how to engage; we had significant success with public engagement decades ago. What we could do now with human-led, technology-supported public engagement could be positive and transformational on a remarkable scale. But, so far, there has been so little on this front. Let us not talk of NHS data; let us always come back to the fundamental principle encapsulated in this group of amendments and across so many of our discussions on the Bill. Does the Minister agree that this is not about NHS data but about our data and our decisions and, through that, if we get it right, our human-led digital futures?
I am not suggesting that there is no legitimate interest in processing personal data without consent, but the legitimate interest assessment is a check and balance that ensures oversight and reduces the risk of overreach. It is a test, not a blocker, and does not in itself prevent processing if the balancing test determines that processing should go ahead. Amendment 85 illustrates this point in relation to vulnerable users. Given that a determination that a person is at risk would have far-reaching consequences for that person, the principles of fairness and accountability demand that those making the decision follow due process and that those subject to the decision are made aware of it—if not in an emergency, then certainly at some point in the proceedings.
In laying Amendment 86, the noble Lord, Lord Clement-Jones, raises an important question on which I am keen to hear from Ministers: what is the Government’s plan for ensuring that a designation that an individual is vulnerable is monitored and removed when it is no longer appropriate? If a company or organisation has a legitimate interest in processing someone’s data, having weighed the balancing interests of data subjects, it is free to do so. I ask the Minister again to give concrete examples of circumstances in which the current legitimate interest basis is insufficient, so that we understand the problem the Government are trying to solve.
At Second Reading, the Government’s curious defence of this new measure was the idea that organisations had concerns about whether they were doing the balancing test correctly, so the new measure is there to help. But perhaps the Minister can explain what benefits accrue from introducing the new measure that could not have been better achieved by the ICO providing more concrete guidance on the balancing test. Given that the measure is focused on public interest areas, such as national security and the detection of crime, how does the creation of the recognised legitimate interest help the majority of data controllers, rather than simply serving the interests of incumbents and/or government departments by removing an important check and balance?
Amendments 76, 83 and 90 seek to curb the power of the Secretary of State to override primary legislation and to modify key aspects of UK data protection law via statutory instrument. The proposed provisions in Clauses 70, 71 and 74 put one person in control, rather than Parliament. Elon Musk’s new role in the upcoming US Administration gives him legitimacy as an incoming officeholder in the Executive, but it is complicated by the fact that he is also CEO and majority shareholder of X. Like those of OpenAI, Google, Amazon, Palantir or any other tech behemoth, tech executives are not elected or bound to fulfil social goods or commitments other than making a profit for their shareholders. They also fund many of the think tanks, reports and events in the political ecosystem, and there is a well-worn path of employment between industry, government and regulators.
No single person should be the carrier of that incredible burden. For now, Parliament is the only barrier in the increasingly confused picture of regulatory and political capture by the tech sector. We should fight to keep it that way.
My Lords, I support Amendment 74 from the noble Lords, Lord Scriven and Lord Clement-Jones, on excluding personal health data from being a recognised legitimate interest. I also support Amendment 78, on requiring a statement by the Secretary of State to recognise such a legitimate interest, and Amendments 83 and 90, which would remove the powers of the Secretary of State to override primary legislation and modify data protection via an SI. There is not much to add to what I said on the previous group, so I will not repeat the arguments made then. In simple terms, I repeat the necessity of trust—in health, particularly patient trust. You do not gain trust simply by defining personal health data as a legitimate interest, or by overriding primary legislation on the say-so of a Secretary of State, even if it is laid as a statutory instrument.
My Lords, I want to ask the Minister and the noble Lord, Lord Clement-Jones, in very general terms for their views on retrospectivity. Do they believe that the changes to data protection law in the Bill are intended to apply to data already held, or will the new regime apply only to personal data collected from this point forward? I ask that specifically of data pertaining to children, from whom sensitive data has already been collected. Will the forthcoming changes apply to such data that controllers and processors already hold, or only to data collected going forward?
Lords Chamber
My Lords, we have already done considerable work on this, and I pay tribute to the noble Baroness, Lady Cumberlege, for her original work on this, as well as now to the Patient Safety Commissioner. We are looking in detail at these issues and will continue to do so. I should make it clear, however, that any changes we make to the legislation will require—as I understand it—primary legislation. They will not in any case be retrospective, so all we can do is look at products going forward. Obviously, patient safety is our primary concern and is absolutely at the forefront of our minds in taking these issues forward.
My Lords, in discussing patient safety, I pay tribute to my noble friend Lady Cumberlege for all her work on patient safety and medical devices over the years. I think noble Lords were very reassured to hear from the Minister that the Government are going to review the timeframe on medical product liability. Is she able to give us a bit more detail about that review, perhaps including its timetable, what the process will entail, and who will be consulted?