(8 months, 1 week ago)
Grand Committee
I hope that the noble Baroness does not get too carried away on that one.
I am sure that we will revisit this at some point in future. Perhaps the noble Lord will like the fact that I am saying that it is certain that we will revisit it from a different place.
These are all really serious amendments. This is a long Committee stage but, across the whole field of data, having regard to data adequacy is absolutely crucial, as the number of interventions on the Minister indicated. The Green Party’s position is that we want to be rejoin-ready: we want to remain as close as possible to EU standards so that we can rejoin the EU as soon as possible.
Even without taking that approach, this is a crucial issue as so many businesses are reliant on this adequacy ruling. I was taken by a comment from the Minister, who said that the UK is committed to data adequacy. The issue here is not what the UK is saying but convincing the EU, which is not in our hands or under our control, as numerous noble Lords said.
I have no doubt that we will return to data adequacy and I hope that we will return to the innovative and creative intervention from the noble Baroness, Lady Young of Old Scone. In the meantime, I beg leave to withdraw Amendment 195A.
(8 months, 4 weeks ago)
Grand Committee
My Lords, this is the first group of amendments covering issues relating to automated decision-making, one of the most interesting areas of data use but also one of the most contested and, for the public at large, one of the most controversial and difficult to navigate. The development of AI and data systems that easily enable automated decisions could offer huge efficiencies for consumers of public services. Equally, such systems can, if used and regulated in the wrong way, have a devastating impact on people’s lives. If we have learned one thing from the Horizon scandal, it is simply that, in the wrong hands and with the wrong system in place, the misuse of data can destroy lives and livelihoods.
Our country has a massive social security system, which includes everything from pension payments to disability income support and, of course, the universal credit system, which covers people entitled to in-work and out-of-work benefits. Over 22 million people receive DWP benefits of one sort or another. If automated decisions make errors in this field, the potential to damage lives is enormous, as I am sure the Minister will appreciate.
I turn to the four amendments in the group in the name of my noble friend Lady Jones. Amendments 36 and 37 seek to amend new Article 22A of the UK GDPR and make it clear that protection is provided for profiling operations that lead to decisions. This is important, not least because the clause further reduces the scope for the human review of automated decision-making. Profiling is used as part of this process, and these amendments seek to protect individual data subjects from its effect. We take the view that it is essential that human interaction is involved in making subject access decisions.
Amendment 40 also makes it clear that, in the context of the new Article 22A, for human involvement to be considered meaningful, the review of the decision must be completed by a competent person. One of the positive changes made by the Bill is the introduction of the concept of “meaningful human involvement” in a decision. Meaningful human review is a key component for achieving an appropriate level of oversight over automated decision-making, for protecting individuals from unfair treatment and for offering an avenue for redress. The aim of the amendment is to bring more clarity around what “meaningful human involvement” should consist of. It would require that a review needs to be performed by a person with the necessary competence, training and understanding of the data, and, of course, the authority to alter the decision.
Our Amendment 109 is not so much about building protections as introducing something new and adding to the strength of what is already there. Users have never been able to get personalised explanations of automated decisions but, given the impact that these can have, we feel that systems should be in place for people to understand why a computer has simply said yes or no.
As it stands, the Bill deletes Section 14 of the Data Protection Act 2018 in its entirety. Our amendment would undo that and then add personalisation in. The amendment would retain Section 14 of that Act, which is where most automated decision-making safeguards are currently detailed in law. It would introduce an entitlement for data subjects to receive a personalised explanation of an automated decision made about them. This is based on public attitudes research conducted by the Ada Lovelace Institute, which shows a clear demand for greater transparency over these sorts of decisions.
The amendment also draws on independent legal analysis commissioned by the Ada Lovelace Institute, which found that the generic explanations provided under current law are insufficient for individuals to understand how they have been affected by automated decision-making. This was considered to be a major barrier to meaningful protection from, and redress for, harms caused by AI. As many noble Lords have made clear in these debates, we have put building trust at the heart of how we get the most from AI and, more particularly, ADM systems.
I turn to the amendments in the name of the noble Lord, Lord Clement-Jones. In essence, they are about—as the noble Lord will, I am sure, explain better than I possibly could—the level of engagement of individuals, as data subjects, in automated decision-making processes. The common thread through the amendments is that they raise the bar in terms of the safeguards for data subjects’ rights and freedoms. We have joined the noble Lord, Lord Clement-Jones, on Amendment 47, and might equally have added our names to the other amendments in the group, as we broadly support those too.
Amendment 38A, in the name of the noble Baroness, Lady Bennett, would place an additional requirement under new Article 22A to ensure human engagement in the automated decision-making processes.
I am sure the Committee will want more than warm words from the Minister when he comes to wind up the debate. For all of us, ADM is the here and now; it shapes how we use and consume public services and defines what and who we are. Reducing our protections from its downsides is not to be done lightly and we cannot easily see how that can be justified. I want to hear from the Minister how the Government came to conclude that this was acceptable, not least because, as we will hear in later debates on the Bill, the Government are seeking powers that provide for invasive bulk access to potentially every citizen’s bank accounts. I beg to move the amendments in the name of the noble Baroness, Lady Jones.
My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need to have a human in the loop to ensure that a human interpreted, assessed and, perhaps most crucially, was able to intervene in the decision and any information on which it is based.
Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—I would say that what we currently describe as artificial intelligence is, in real terms, not truly that at all. What we have is a very large use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks amiss or wrong; there is a kind of intelligence that humans apply to this, which machines simply do not have the capacity for.
I pay tribute to Justice, the law reform and human rights organisation, which produced an excellent briefing on the issues around Clause 14. It asserts that the clause, as currently written, inadequately protects individuals from automated harm.
The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example but, while we may think first of some of the most vulnerable people in the UK, the Robodebt case in Australia is another where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that
“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.
I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.
I draw a parallel with something covered by a special Select Committee of your Lordships’ House last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From what I last heard and understood about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.
When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going round and round with machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put in the Bill a real explanation of what having that human in the loop means.
(10 months, 2 weeks ago)
Grand Committee
My Lords, I rise to speak briefly on Amendment 133 in the name of the noble Baroness, Lady Jones of Whitchurch, to which the noble Baroness, Lady Kidron, and I have attached our names. I express support in passing for the attempts to restrict fake reviews, which are clearly an absolute plague online and a cause for considerable concern. I, like many other consumers, rely very much on reviews these days. I am also interested in the amendment of the noble Lord, Lord Lucas. I very much oppose the whole structure by which students are regarded as consumers. The Green Party’s position is that education is a public good, which should be provided for free, but his point raises some interesting questions, to which I would be interested to hear the Minister’s answers.
Amendment 133 is about so-called drip pricing. I found various government surveys producing different figures on the cost of this to consumers, ranging from £1.6 billion to £2.2 billion each year. We are all familiar with this, unsurprisingly, given that more than half of entertainment providers, transport providers and communications businesses use it as a regular practice: “Get this bargain price. Get in now. Click here: it will cost you only £10”. Mysteriously, as you go through the process, the price keeps going up and up. People fill in all the steps in the forms, fill in their names, tick to say that they have read the terms and conditions—even though they have not—and spend all that time and energy, but suddenly the price is three times what it started as. They feel as though they have already spent all that time, so is it worth going hunting around again? Do they have that time?
What we are seeing is very much a change in what might have been considered service businesses; consumers are instead servicing them, with their time, energy and efforts. This is an important area, on which people need transparency. In the cost of living crisis, it is worth noting that so-called budget airlines are particular offenders. Most people think, particularly for a long-distance journey, that luggage is not an optional extra, not to mention that a family travelling should not have to pay extra for seats together. Amendment 133 is a particularly important amendment and I look forward to the Minister’s response.
My Lords, my noble friend has added her name to that of the noble Lord, Lord Clement-Jones, on his Amendment 130. We share his concern that online marketing should not be used to promote products or services by mimicking particular brands. In some ways, it is much easier to fool consumers online into thinking that a particular product has the same characteristics and spec as a branded product. As the noble Lord argued very well, we are all familiar with how cheaper and sometimes inferior products on the shelves are designed to mislead the purchaser. This simple amendment is worth supporting for that reason alone.
I was thinking back to an incident not that long ago, when I was misled into buying a product made to look like Lemsip, simply because the colour of the packaging was almost identical. It was so simple and easy to take the thing off the shelf and put it in the basket but, when I got home, the product was inferior. This is about not just price but quality. This amendment is well worth our support.
Amendment 131 from the noble Lord, Lord Lucas, asks an important question. It is a niche issue for this legislation, but I am nevertheless looking forward to hearing the Minister explain clearly whether universities can or cannot continue to market themselves to pupils and parents. All parents, along with their children, want to receive accurate information that is easily accessible and, more importantly, verifiable, so that informed choices can be made. As the noble Lord argued, this is one of the more expensive areas of parents’ expenditure on their child’s education, and it is only right that we set high standards for the content of the material made available to those making applications, and that it is verifiable.
I now turn to Amendments 132, 133 and 144 in the name of my noble friend Lady Jones. Amendments 132 and 144 should be taken together. They would insert into Schedule 19, which deals with unfair commercial practices, the marketing of counterfeit or dangerous goods among the circumstances considered unfair, and would empower enforcement officers to require the removal of relevant listings from the internet. We think that this is a fairly self-explanatory process, which should provide protection for consumers from shoddy goods. If the Minister insists that this is not the place for these amendments, perhaps he can explain how else consumers are to be protected and how else this false marketing is to be tackled.
I want to persist a bit more on that. We are now almost at the end of Committee, and Report is probably two or three weeks away. That is not a lengthy period in which to get the drafting right and for us to have that discussion, so I ask that we get a really early draft of these amendments. The wording is important and that will help my noble friend Lady Jones to form a view about whether it covers what we are after here.
This is of great concern to many consumer groups, so it is important to publish the draft and make it publicly available, so that people are able to examine it, think about it and get legal advice on it. It is not just the people in this Committee but broader society that really needs the chance to have input into this crucial issue.
(3 years, 1 month ago)
Lords Chamber
My Lords, I rise briefly to commend the noble Baroness, Lady Kramer, on her alertness in uncovering this issue, and to make a very simple comparison with something that has occupied a great deal of time in your Lordships’ House lately: the water companies and what we have seen happen with them, with, very often, hedge fund owners involved, massive profits being taken out and massive loads of debt piled on. This is a terribly important amendment; I regret not attaching my name to it and certainly would have done had I been alerted to it earlier. I encourage the noble Baroness to keep pushing.
My Lords, I do not have a great deal to add. The argument of the noble Baroness, Lady Kramer, is very sound and was well made and well researched. We had an interesting debate on this topic in Grand Committee, and I am grateful to our colleagues on the Liberal Democrat Benches for allowing us to return to it through this reformulated amendment.
During the previous debate, examples were raised of organisations that are not social enterprises or charities, but which nevertheless deliver public good through the use of dormant assets funding. This new amendment captures that reality, while introducing the safeguard that these funds, which are finite and will be highly sought after, are not used to enhance investors’ returns, where that may be a concern.
I do not really understand why the Government should not write this kind of safeguard into the Bill. Failing that, will the Minister put something on the record that will provide us with some comfort? We need that reassurance, protection and level of accountability.