Data (Use and Access) Bill [HL] Debate

Department: Department for Business and Trade
Baroness Jones of Whitchurch (Lab)

Yes, it would be helpful if we could write and set that out in more detail. Obviously the ICO’s report is fairly recent, but I am sure he has considered how the enforcement would follow on from that. I am sure we can write and give more details.

Lord Thomas of Cwmgiedd (CB)

My Lords, I thank the Minister for her response. I wish to make three points. First, the critical question is: are our laws adequate to pass the adequacy test? Normally, when you go in for a legal test, you check that your own house is in order. I am therefore slightly disappointed by the response to Amendment 125. Normally one has the full-scale medical first, rather than waiting until you are found to be ill afterwards.

Secondly, I listened to what the Minister said about my Amendment 87 and the difference between what rights are protected by the charter and the much greater limitation of the ECHR, normally simply to do with the extent to which they apply horizontally to private individuals. I will look at her answer, but at first sight it does not seem right to me that, where you have fundamental rights, you move to a second stage of rights—namely, the rights under the Data Protection Act.

Thirdly, I want to comment on the whole concept of data communities and data trusts. This is an important area, and it takes me back to what I said last time: this legislation really needs to be reduced to principles. I am going to throw out a challenge to the very learned people behind the Minister, particularly the lawyers: can they come up with something intelligible to the people who are going to do this?

This legislation is ghastly; I am sorry to say that, but it is. It imposes huge costs on SMEs—not to say on others, but they can probably afford it—and if you are going to get trust from people, you have to explain things in simple principles. My challenge to those behind the Minister is: can they draft a Clause 1 of the Bill to say, “The principles that underpin the Bill are as follows, and the courts are to interpret it in accordance with those principles”? That is my challenge—a challenge, as the noble Baroness, Lady Kidron, points out, to be ambitious and not to sit in a tepid bath. I beg leave to withdraw the amendment.

Amendment 87 withdrawn.
--- Later in debate ---
Baroness Kidron (CB)

My Lords, I speak to Amendment 114, to which I have added my name. It is a very simple amendment that prevents controllers circumventing the duties for automated decision-making by adding trivial human elements to avoid the designation. As such, it is a very straightforward—and, I would have thought, uncontroversial—amendment. I really hope that the Government will find something in all our amendments to accept, and perhaps that is one such thing.

I am struck that previous speeches have referred to questions that I raised last week: what is the Bill for, who is it for, and why is it not dealing with a host of overlapping issues that cannot really be extrapolated one from another? In general, a bit like the noble Lord, Lord Holmes, I am very much with the spirit of all these amendments. They reflect the view of the Committee and the huge feeling of civil society—and many lawyers—that this sort of attack on Article 22 by Clause 80 downgrades UK data rights at a time when we do not understand the Government’s future plans and hear very little about protections. We hear about the excitements of AI, which I feel bound to say that we all share, but not at the expense of individuals.

I raise one last point in this group. I had hoped that the Minister would have indicated the Government’s openness to Amendment 88 last week, which proposed an overarching duty on controllers and processors to provide children with heightened protections. That seemed to me the most straightforward mechanism for ensuring that current standards were maintained and then threaded through new situations and technologies as they emerged. I put those two overarching amendments down on the understanding that Labour, when in opposition, was very much for this approach to children. We may need to bring back specific amendments, as we did throughout the Data Protection and Digital Information Bill, including Amendment 46 to that Bill, which sought to ensure

“that significant decisions that impact children cannot be made using automated processes unless they are in a child’s best interest”.

If the Minister does not support an overarching provision, can she indicate whether the Government would be more open to clause-specific carve-outs to protect children and uphold their rights?

Lord Thomas of Cwmgiedd (CB)

My Lords, I rise briefly, first, to thank everyone who has spoken so eloquently about the importance of automated decision-making, in particular its importance to public trust and the importance of human intervention. The retrograde step of watering down Article 22 is to be deplored. I am therefore grateful to the noble Lord, Lord Clement-Jones, for putting forward that this part of the Bill should not stand part. Secondly, the specific amendment that I have laid seeks to retain the broader application of human intervention for automated decision-making where it is important. I can see no justification for that watering down, particularly when there is such uncertainty about the scope that AI may bring to what can be done by automated decision-making.

Lord Kamall (Con)

My Lords, in speaking to this group of amendments I must apologise to the Committee that, when I spoke last week, I forgot to mention my interests in the register, specifically as an unpaid adviser to the Startup Coalition. Noble Lords will realise that, for Committee, I have confined myself to amendments that may be relevant to healthcare and to improving it.

I will speak to Amendments 111 and 116 in the names of my noble friends Lord Camrose and Lord Markham, and Amendment 115 from my noble friend Lord Lucas and the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth, as well as other amendments, including from my noble friend Lord Holmes—I will probably touch on most amendments in this group. To illustrate my concerns, I return to two personal experiences that I shared during debate on the Data Protection and Digital Information Bill. I apologise to noble Lords who have heard these examples previously, but they illustrate the points being made in discussing this group of amendments.

A few years ago, when I was supposed to be travelling to Strasbourg, my train to the airport got delayed. My staff picked me up, booked me a new flight and drove me to the airport. I got to the airport with my new boarding pass and scanned it to get into the gate area, but as I was about to get on the flight, I scanned my pass again and was not allowed on. No one there could explain why, having been allowed through security, I was not allowed on the flight. To cut a long story short, after two hours of being gaslighted by four or five staff, who would not even admit that they could not explain things to me, I eventually had to return to the check-in desk—the very thing all the automation was supposed to avoid—to ask what had happened. The airline claimed that it had sent me an email that day. The next day, it admitted that it had not sent me an email. It then explained what had happened by saying that a flag had gone off in its system. That was the only explanation.

This illustrates the point about human intervention, but it is also about telling customers and others what happens when something goes wrong. The company clearly had not trained its staff in how to speak to customers or in transparency. Companies such as that airline get away with this sort of disgraceful behaviour all the time, but imagine if such technology were being used in the NHS. Imagine the same scenario: you turn up for an operation, and you scan your barcode to enter the hospital—possibly even the operating theatre—but you are denied access. There must be accountability, transparency and human intervention, and, in these instances, there has to be human intervention immediately. These things are critical.

I know that this Bill makes some sort of differentiation between more critical and less critical ADM, but let me illustrate my point with another example. A few years ago, I paid for an account with one of those whizzy fintech banks. Its slogan was: “We are here to make money work for everyone”. I downloaded the app and filled out the fields, then a message popped up telling me, “We will get back to you within 48 hours”. Two weeks later, I got a message on the app saying that I had been rejected and that, by law, the bank did not have to explain why. Once again, I ask noble Lords to imagine. Imagine Monzo’s technology being used on the NHS app, which many people currently use for repeat prescriptions or booking appointments. What would happen if you tried to book an appointment but you received a message saying, “Your appointment has been denied and, by law, we do not have to explain why”? I hope that we would have enough common sense to ensure that there is human intervention immediately.

I realise that the noble Lord, Lord Clement-Jones, has a Private Member’s Bill on this issue—I am sorry that I have not been able to take part in those debates—but, for this Bill, I hope that the two examples I have just shared illustrate the point that I know many noble Lords are trying to make in our debate on this group of amendments. I look forward to the response from the Minister.