Data Protection and Digital Information Bill Debate
Baroness Bennett of Manor Castle (Green Party, Life peer) — debate with the Department for Science, Innovation and Technology (8 months ago)
Grand Committee

My Lords, Amendment 19 is consequential on my more substantive Clauses 114 and 115 stand part notices, which are also in this group. I am grateful to the noble Lord, Lord Clement-Jones, for his support.
These amendments all relate to the 150 or so pages of late amendments tabled in the Commons on Report and therefore not given adequate scrutiny before now. No real explanation has been given for why the Government felt it necessary to table the amendments in this way. This group of amendments comes under the heading of so-called “democratic engagement”. Clause 113 extends a soft opt-in for direct mail marketing for furthering charitable or political objectives, while Clause 114 goes further and allows the Secretary of State to change the direct marketing rules through secondary legislation for the purpose of democratic engagement. This would allow the Government, in the run-up to an election, to switch off the direct mailing rules that apply to political parties.
Like many others, we are highly suspicious of the Government’s motives in introducing these amendments in the run-up to this election. Although we do not have a problem with a softer opt-in for direct mailing for charities, the application of Clause 114 to political parties gives politicians carte blanche to mine voters’ data given in good faith for completely different purposes. It would allow voters to be bombarded with calls, texts and personalised social media without their explicit consent.
When you consider these proposals in the context of other recent moves by the Government to make it harder for some people to vote and to vastly increase the amount of money that can be spent on campaigning in the run-up to an election, you have to wonder what the Government are up to, because these measures have certainly not been requested by Labour. In fact, these measures were not supported by the majority of respondents to the Government’s initial consultation, who wanted the existing rules upheld.
The Advertising Association has told us that it is concerned that switching off the rules could result in an increase in poor practice, such as political lobbying under the guise of research. This is apparently a practice known as “plugging”. It referred us to a report from the previous Information Commissioner on how political parties manage data protection, which provided key recommendations for how political parties could improve. These included providing clearer information about how data will be used and being more transparent about how voters are profiled and targeted via social media platforms. This is the direction our democratic engagement should be going in, with stronger and more honest rules that treat the electorate with respect, not watering down the rules that already exist.
When these proposals were challenged in the Commons on Report, the Minister, John Whittingdale, said:
“We have no immediate plans to use the regulation powers”.—[Official Report, Commons, 29/11/23; col. 912.]
If that is the case, why do the Government not take the proposals off the table, go back to the drawing board by conducting a proper consultation and test whether there is any appetite for these changes? They should also involve the Information Commissioner at an early stage, as he has already gone on record to say that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
Finally, if there are to be any changes, they should be subject to full parliamentary scrutiny and approval.
We believe that Clauses 114 and 115 are taking us in fundamentally the wrong direction, against the interests of the electorate. I look forward to the Minister’s response, but I give notice now that, unless the Government adopt a very different strategy on this issue, we will return to this on Report. I beg to move.
My Lords, I follow the noble Baroness, Lady Jones of Whitchurch, with pleasure, as I agree with everything that she just said. I apologise for having failed to notice this in time to attach my name; I certainly would have done, if I had had the chance.
As the noble Baroness said, we are in an area of great concern for the level of democracy that we already have in our country. Downgrading it further is the last thing that we should be looking at doing. Last week, I was in the Chamber looking at the statutory instrument that saw a massive increase in the spending limits for the London mayoral and assembly elections and other mayoral elections—six weeks before they are held. This is a chance to spend an enormous amount of money; in reality, it is the chance for one party that has the money from donations from interesting and dubious sources, such as the £10 million, to bombard voters in clearly deeply dubious and concerning ways.
We see a great deal of concern about issues such as deepfakes, what might happen in the next general election, and malicious actors and foreign actors potentially interfering in our elections. We have to make sure, however, that the main actors conduct elections fairly on the ground. As the noble Baroness, Lady Jones, just set out, this potentially drives a coach and horses through that. As she said, these clauses did not get proper scrutiny in the Commons—as much as that ever happens. As I understand it, there is the potential for us to remove them entirely later, but I should like to ask the Minister some direct questions, to understand what the Government’s intentions are and how they understand the meaning of the clauses.
Perhaps no one would have any problems with these clauses if they were for campaigns to encourage people to register to vote, given that we do not have automatic voter registration, as so many other countries do. Would that be covered by these clauses? If someone were conducting a “get out the vote” campaign in a non-partisan way, simply saying, “Please go out and vote. The election is on this day. You will need to bring along your voter ID”, would it be covered by these clauses? What about an NGO campaigning to stop a proposed new nuclear power station, or a group campaigning for stronger regulations on pesticides or for the Government to take stronger action against ultra-processed food? How do those kinds of politics fit with Clauses 114 and 115? As they are currently written, I am not sure that it is clear what is covered.
There is cause for deep concern, because no justification has been made for these two clauses. I look forward to hearing the Minister’s responses.
My Lords, this weekend, as I was preparing for the amendments to which I have put my name, I made the huge mistake of looking at the other amendments being discussed. As a result, I had a look at this group. I probably should declare an interest as the wife of a Conservative MP; therefore, our household is directly affected by this amendment and these clause stand part notices. I wholeheartedly agree with everything said by the noble Baronesses, Lady Jones and Lady Bennett of Manor Castle.
I have two additional points to make, because I am horrified by these clauses. First, did I miss something, in that we are now defining an adult as being 14-plus? At what point did that happen? I thought that you had the right to vote at 18, so I do not understand why electoral direct marketing should be free to bombard our 14 year-olds. That was my first additional point.
Secondly, I come back to what I said on the first day of Committee: this is all about trust. I really worry that Clauses 114 and 115 risk undermining two important areas where trust really matters. The first is our electoral system and the second is the data that we give our elected representatives, when we go to them not as party representatives but as our representatives elected to help us.
The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?
Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people of 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.
I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.
My Lords, this is the first group of amendments covering issues relating to automated decision-making, one of the most interesting areas of data use but also one of the most contested and, for the public at large, one of the most controversial and difficult to navigate. The development of AI and data systems that easily enable automatable decisions could offer huge efficiencies for consumers of public services. Equally, the use of such systems can, if used and regulated in the wrong way, have a devastating impact on people’s lives. If we have learned one thing from the Horizon scandal it is simply that, in the wrong hands and with the wrong system in place, the misuse of data can destroy lives and livelihoods.
Our country has a massive social security system, which includes everything from pension payments to disability income support and, of course, the universal credit system, which covers people entitled to in-work and out-of-work benefits. Over 22 million people receive DWP benefits of one sort or another. If automated decisions make errors in this field the potential to damage lives is enormous, as I am sure the Minister will appreciate.
I turn to the four amendments in the group in the name of my noble friend Lady Jones. Amendments 36 and 37 seek to amend new Article 22A of the UK GDPR and make it clear that protection is provided for profiling operations that lead to decisions. This is important, not least because the clause further reduces the scope for the human review of automated decision-making. Profiling is used as part of this process, and these amendments seek to protect individual data subjects from its effect. We take the view that it is essential that human interaction is involved in making subject access decisions.
Amendment 40 also makes it clear that, in the context of the new Article 22A, for human involvement to be considered meaningful, the review of the decision must be completed by a competent person. One of the positive changes made by the Bill is the introduction of the concept of “meaningful human involvement” in a decision. Meaningful human review is a key component for achieving an appropriate level of oversight over automated decision-making, for protecting individuals from unfair treatment and for offering an avenue for redress. The aim of the amendment is to bring more clarity around what “meaningful human involvement” should consist of. It would require that a review needs to be performed by a person with the necessary competence, training and understanding of the data, and, of course, the authority to alter the decision.
Our Amendment 109 is not so much about building protections as introducing something new and adding to the strength of what is already there. Users have never been able to get personalised explanations of automated decisions but, given the impact that these can have, we feel that systems should be in place for people to understand why a computer has simply said yes or no.
As it stands, the Bill deletes Section 14 of the Data Protection Act 2018 in its entirety. Our amendment would undo that and then add personalisation in. The amendment would retain Section 14 of that Act, which is where most automated decision-making safeguards are currently detailed in law. It would introduce an entitlement for data subjects to receive a personalised explanation of an automated decision made about them. This is based on public attitudes research conducted by the Ada Lovelace Institute, which shows a clear demand for greater transparency over these sorts of decisions.
The amendment also draws on independent legal analysis commissioned by the Ada Lovelace Institute, which found that the generic nature of explanations provided under current law are insufficient for individuals to understand how they have been affected by automated decision-making. This was considered to be a major barrier to meaningful protection from and redress for harms caused by AI. As many noble Lords have made clear in these debates, we have put building trust at the heart of how we get the most from AI and, more particularly, ADM systems.
I turn to the amendments in the name of the noble Lord, Lord Clement-Jones. In essence, they are about—as the noble Lord will, I am sure, explain better than I possibly could—the level of engagement of individuals, as data subjects, in automated decision-making processes. The common thread through the amendments is that they raise the bar in terms of the safeguards for data subjects’ rights and freedoms. We have joined the noble Lord, Lord Clement-Jones, on Amendment 47, and might equally have added our names to the other amendments in the group as we broadly support those too.
Amendment 38A, in the name of the noble Baroness, Lady Bennett, would place an additional requirement under new Article 22A to ensure human engagement in the automated decision-making processes.
I am sure the Committee will want more than warm words from the Minister when he comes to wind up the debate. For all of us, ADM is the here and now; it shapes how we use and consume public services and defines what and who we are. Reducing our protections from its downsides is not to be done lightly and we cannot easily see how that can be justified. I want to hear from the Minister how the Government came to conclude that this was acceptable, not least because, as we will hear in later debates on the Bill, the Government are seeking powers that provide for invasive bulk access to potentially every citizen’s bank accounts. I beg to move the amendments in the name of the noble Baroness, Lady Jones.
My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need to have a human in the loop to ensure that a human interpreted, assessed and, perhaps most crucially, was able to intervene in the decision and any information on which it is based.
Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—I would say that what we are currently describing is labelled artificial intelligence but, in real terms, it is not truly that at all. What we have is a very large use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks awry or wrong; there is a kind of intelligence that humans apply to this, which machines simply do not have the capacity for.
I pay credit to Justice, the law reform and human rights organisation which produced an excellent briefing on the issues around Clause 14. It asserts that Clause 14, as currently written, inadequately protects individuals from automated harm.
The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example but, while we may think first of some of the most vulnerable people in the UK, the Robodebt case in Australia is another example where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that
“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.
I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.
I draw a parallel with something covered by a special Select Committee of your Lordships’ House, last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From the last I understood and heard about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.
When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments, but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put a real explanation in the Bill for having that human in the loop.
My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.
Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by providing human involvement that is an ineffective safeguard and therefore not meaningful.
I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which says boldly:
“Clause 14 risks eroding trust in AI”.
That would be a very sad outcome.