Data Protection and Digital Information Bill Debate
Grand Committee

My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.
The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.
On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.
The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure it is familiar with our plans for commencement and that the timing does not benefit any party over another.
On the point made about the appropriate age for the provisions, the age of 14 reflects the variations in voting age across the UK: in some parts of the country, such as Scotland, the voting age is 16 for some elections, and a person can register to vote at 14 as an attainer. An attainer is someone who is registered to vote in advance of being able to do so, so that they are on the electoral roll as soon as they turn the required age. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—
I am interested in what the Minister says about the age of attainers. Surely it would be possible to remove attainers from those who could be subject to direct marketing. Given how young attainers could be, it would protect them from the unwarranted attentions of campaigning parties and so on. I do not see that as a great difficulty.
Indeed. It is certainly worth looking at, but I remind noble Lords that such communications have to be necessary, and the test of their being necessary for someone of that age is obviously more stringent.
The processor has to determine whether communicating with someone of that age is necessary to the desired democratic engagement outcome. But I take the point: for the vast majority of democratic engagement communications, 14 would be far too young for that to be a worthwhile or necessary activity.
I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.
I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.
I am sorry to drag this out but, on the guidance, can we be assured that the Minister will involve the Electoral Commission? It has a great deal of experience here; in fact, it has opined in the past on votes for younger cohorts of the population. It seems highly relevant to seek out its experience and the benefits of that.
I should also declare an interest. I apologise that I did not do so earlier. I worked with a think tank and wrote a series of papers on who regulates the regulators. I still have a relationship with that think tank.
My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.
On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?
On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended away from the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of, or as an agent of, the data holder?
I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?
Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.
I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have at common law. As I understand it, it is now considered best practice with all Bills to put such powers on a statutory footing, and that is what the Bill does. It does not confer substantial new powers but clarifies the powers that the regulator already has. I can also confirm that the ICO was and remains accountable to Parliament.
The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.
The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.
I want to be helpful to the Minister. I appreciate that these questions are probably irritating but I carefully read through the amendments and aligned them with the Explanatory Notes. I just wanted some clarification to make sure that we are clear on exactly what the Government are trying to do. “Minor and technical” covers a multitude of sins; I know that from my own time as a Minister.
Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.
Looking at the wording of the Members’ explanatory statements for leaving out Clauses 9 and 36, I do not think that the Minister has addressed this: does he accept that the Bill now provides a more lax approach? Is this a reduction of the standard expected? To me, “vexatious or excessive” sounds very different from “manifestly unfounded or excessive”. Does he accept that basic premise? That is really the core of the debate; if it is not, we have to look again at the issue of resources, which seems to be the argument for making this change.
If that is the case and this is a dilution, is this where the Government think they will get the savings identified in the impact assessment? It was alleged in the Public Bill Committee that this is where a lot of the savings would come from—we all have rather different views. My first information was that every SME might save about £80 a year; then, suddenly, the Secretary of State started talking about £10 billion of benefit from the Bill. Clarification of that would be extremely helpful. There seems to be a dichotomy between the noble Lord, Lord Bassam, saying that this is a way to reduce the burdens on business and the Minister saying that it is all about confident refusal and confidence. He has used that word twice, which is worrying.
First, on the point made by the noble Lord, Lord Bassam: this is not to be argumentative—I am sure that there is much discussion to be had—but the intention is absolutely not to lower the standard for a well-intended request.
Sadly, a number of requests are made that are not well intended but cynical in purpose and aimed at disruption. I can give a few examples. For instance, some requests are deliberately made with minimal time between them. Some are made to circumvent the process of legal disclosure in a trial. Some are made for other reasons designed to disrupt an organisation. The intent of using “vexatious” is not in any way to reduce well-founded, or even partially well-founded, attempts to secure information; it is to reduce less desirable, more cynical attempts to work in this way.
But the two terms have a different legal meaning, surely.
The actual application of the terms will be set out in guidance by the ICO but the intention is to filter out the more disruptive and cynical ones. Designing these words is never an easy thing but there has been considerable consultation on this in order to achieve that intention.
My Lords, this is the first group of amendments covering issues relating to automated decision-making, one of the most interesting areas of data use but also one of the most contested and, for the public at large, one of the most controversial and difficult to navigate. The development of AI and data systems that easily enable automatable decisions could offer huge efficiencies for consumers of public services. Equally, the use of such systems can, if used and regulated in the wrong way, have a devastating impact on people’s lives. If we have learned one thing from the Horizon scandal it is simply that, in the wrong hands and with the wrong system in place, the misuse of data can destroy lives and livelihoods.
Our country has a massive social security system, which includes everything from pension payments to disability income support and, of course, the universal credit system, which covers people entitled to in-work and out-of-work benefits. Over 22 million people receive DWP benefits of one sort or another. If automated decisions make errors in this field the potential to damage lives is enormous, as I am sure the Minister will appreciate.
I turn to the four amendments in the group in the name of my noble friend Lady Jones. Amendments 36 and 37 seek to amend new Article 22A of the UK GDPR to make it clear that protection is provided for profiling operations that lead to decisions. This is important, not least because the clause further reduces the scope for the human review of automated decision-making. Profiling is used as part of this process, and these amendments seek to protect individual data subjects from its effect. We take the view that it is essential that humans are involved in making decisions that affect data subjects.
Amendment 40 also makes it clear that, in the context of the new Article 22A, for human involvement to be considered meaningful, the review of the decision must be completed by a competent person. One of the positive changes made by the Bill is the introduction of the concept of “meaningful human involvement” in a decision. Meaningful human review is a key component for achieving an appropriate level of oversight over automated decision-making, for protecting individuals from unfair treatment and for offering an avenue for redress. The aim of the amendment is to bring more clarity around what “meaningful human involvement” should consist of. It would require that a review be performed by a person with the necessary competence, training and understanding of the data and, of course, the authority to alter the decision.
Our Amendment 109 is not so much about building protections as introducing something new and adding to the strength of what is already there. Users have never been able to get personalised explanations of automated decisions but, given the impact that these can have, we feel that systems should be in place for people to understand why a computer has simply said yes or no.
As it stands, the Bill deletes Section 14 of the Data Protection Act 2018 in its entirety. Our amendment would undo that, retaining Section 14, which is where most automated decision-making safeguards are currently detailed in law, and would then add personalisation in. It would introduce an entitlement for data subjects to receive a personalised explanation of an automated decision made about them. This is based on public attitudes research conducted by the Ada Lovelace Institute, which shows a clear demand for greater transparency over these sorts of decisions.
The amendment also draws on independent legal analysis commissioned by the Ada Lovelace Institute, which found that the generic explanations provided under current law are insufficient for individuals to understand how they have been affected by automated decision-making. This was considered to be a major barrier to meaningful protection from, and redress for, harms caused by AI. As many noble Lords have made clear in these debates, we have put building trust at the heart of how we get the most from AI and, more particularly, ADM systems.
I turn to the amendments in the name of the noble Lord, Lord Clement-Jones. In essence, they are about—as the noble Lord will, I am sure, explain better than I possibly could—the level of engagement of individuals in decisions made about data subjects by automated decision-making processes. The common thread through the amendments is that they raise the bar in terms of the safeguards for data subjects’ rights and freedoms. We have joined the noble Lord, Lord Clement-Jones, on Amendment 47, and might equally have added our names to the other amendments in the group, as we broadly support those too.
Amendment 38A, in the name of the noble Baroness, Lady Bennett, would place an additional requirement under new Article 22A to ensure human engagement in the automated decision-making processes.
I am sure the Committee will want more than warm words from the Minister when he comes to wind up the debate. For all of us, ADM is the here and now; it shapes how we use and consume public services and defines what and who we are. Reducing our protections from its downsides is not to be done lightly, and we cannot easily see how that can be justified. I want to hear from the Minister how the Government came to conclude that this was acceptable, not least because, as we will hear in later debates on the Bill, the Government are seeking powers that provide for invasive bulk access to potentially every citizen’s bank accounts. I beg to move the amendment in the name of the noble Baroness, Lady Jones.
My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need a human in the loop to ensure that a human interprets, assesses and, perhaps most crucially, is able to intervene in the decision and in any information on which it is based.
Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—although I would say that what we currently describe as artificial intelligence is, in real terms, not truly that at all. What we have is the very large-scale use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks astray or wrong; there is a kind of intelligence that humans apply to this which machines simply do not have the capacity for.
I pay tribute to Justice, the law reform and human rights organisation, which produced an excellent briefing on the issues around Clause 14. It asserts that the clause, as currently written, inadequately protects individuals from automated harm.
The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example and, while we may think first of some of the most vulnerable people in the UK, the Robodebt case in Australia is another where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that
“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.
I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.
I draw a parallel with something covered by a special Select Committee of your Lordships’ House last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From what I last heard about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.
When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments, but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put a real explanation in the Bill for having that human in the loop.
As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The safeguards in the present system consist—if I may characterise it in a slightly blunt way—of so much uncertainty that people do not take the decision to embrace the technology positively and safely. By bringing in this clarity, we will see an increase not only in the safety of these applications but in their use, driving up productivity in both the public and private sectors.
My Lords, I said at the outset that I thought this was the beginning of a particular debate, and, looking at the amendments coming along, I was right. The theme of the debate was touched on by the noble Baroness, Lady Bennett, when she talked about these amendments being, in essence, about keeping humans in the loop and the need for them to be able to review decisions. Support for that came from the noble Baroness, Lady Kidron, who made some important points. The point the BMA made about the risk of eroding trust cut to the heart of what we have been talking about all afternoon: trust in these processes.
The noble Lord, Lord Clement-Jones, talked about this effectively being the watering down of Article 22A, and the need for some core ethical principles in AI use and for the Government to ensure a right to human review. Clause 14 reverses the presumption in favour of human review, so that solely automated decision-making will be more widely allowed, as the Minister argued.
However, I am not satisfied by the responses, and I do not think other Members of your Lordships’ Committee will be either. We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts; I do not mind which, but that is how it seems to me. Of course, we accept that there are huge opportunities for AI in the delivery of public services, particularly in healthcare and the operation of the welfare system, but we need to ensure that citizens in this country have a higher level of protection than the Bill currently affords them.
At one point I thought the Minister said that a solely automated decision was a rubber-stamped decision. To me, that gave the game away. I will have to read carefully what he said in Hansard, but that is how it sounded, and it really gets our alarm bells ringing. I am happy to withdraw my amendment, but we will come back to this subject from time to time and throughout our debates on the rest of the Bill.
My Lords, this group, in which we have Amendments 41, 44, 45, 49, 50, 98A and 104A and have co-signed Amendments 46 and 48, aims to further the protections that we discussed in the previous group. We are delighted that the noble Lord, Lord Clement-Jones, and others joined us in signing various of these amendments.
The first amendment, Amendment 41, is a straight prohibition of any data processing that would contravene the Equality Act 2010. All legislation should conform to the terms of the Equality Act, so I expect the Minister to confirm that he is happy to accept that amendment. If he is not, I think the Committee will want to understand better why that is the case.
Amendment 44 to new Article 22B of the UK GDPR is, as it says, designed,
“to prevent data subjects from becoming trapped in unfair agreements and being unable to exercise their data rights”,
because of the contract terms. One might envisage some sensitive areas where the exercise of these rights might come into play, but there is nothing that I could see, particularly in the Explanatory Notes, that seeks to argue that point. We have no knowledge of when this might occur, and I see no reason why the legislation should be changed to that effect. Special category data can be used for automated decision-making only if certain conditions are met. It involves high-risk processing and, in our view, requires explicit consent.
The amendments remove performance of a contract as one of the requirements that allows the processing of special category data for reaching significant decisions based on automated processing. It is difficult to envisage a situation where it would be acceptable to permit special category data to be processed in high-risk decisions on a purely automated basis, simply pursuant to a contract where there is no explicit consent.
Furthermore, relying on performance of a contract for processing special category data removes the possibility for data subjects to exercise their data rights, for example, the right to object and the ability to withdraw consent, and could trap individuals in unfair agreements. There is an implicit power imbalance between data subjects and data controllers when entering a contract, and people are often not given meaningful choices or options to negotiate the terms. It is usually a take-it-or-leave-it approach. Thus, removing the performance-of-a-contract criterion reduces the risks associated with ADM and creates a tighter framework for protection. This also aligns with the current wording of Article 9 of the UK GDPR.
Amendment 45 changes the second condition to include only decisions that are required or authorised by law, with appropriate safeguards, and that are necessary for reasons of substantial public interest. The safeguards are retained from Section 14 of the DPA 2018, with amendments to strengthen transparency provisions.
Amendment 49 seeks to ensure that the protections conferred by Article 22C of the UK GDPR would apply to decisions “solely or partly” based on ADM rather than just “solely”. This would help to maximise the protections that data subjects currently enjoy.
Amendment 50 is another strengthening measure, which would make sure that the safeguards in the new Article 22C sit alongside, rather than replace, those contained in Articles 12 to 15.
Our Amendment 104A would insert a new section into the 2018 Act, requiring data controllers who undertake high-risk processing in relation to work-related decisions or activities to carry out an additional algorithmic impact assessment and to make reasonable mitigations in response to the outcome of that assessment.
I ought to have said earlier that Amendment 98A is a minor part of the consequential text.
An improved workplace-specific algorithmic impact assessment is the best way to remedy clear deficiencies in Clause 20 as drafted, and it signals Labour’s leadership and alignment with international regulatory and AI ethics initiatives. These are moving towards the pre-emptive evaluation of significant social and workplace impacts by responsible actors, combined with a procedure for ongoing monitoring, which is not always possible. It also moves towards our commitment to algorithmic assurance and will help to ensure that UK businesses are not caught up in what is sometimes described as the “Brussels effect”.
I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts.
My Lords, this has been an interesting and challenging session. I hope that we have given the Minister and his team plenty to think about—I am sure we have. A lot of questions remain unanswered, and although the Committee Room is not full this afternoon, I am sure that colleagues reading the debate will be studying the responses that we have received very carefully.
I am grateful to the noble Baroness, Lady Kidron, for her persuasive support. I am also grateful to the noble Lord, Lord Clement-Jones, for his support for our amendments. It is a shame the noble Lord, Lord Holmes, was not here this afternoon, but I am sure we will hear persuasively from him on his amendment later in Committee.
The Minister is to be congratulated on his consistency. I think I heard the phrase “not needed” or “not necessary” pretty constantly this afternoon, particularly with this group of amendments. He probably topped the lot with his response on Amendment 41 and the Equality Act.
I want to go away with my colleagues to study the responses to the amendments very carefully. That being said, however, I am happy to withdraw Amendment 41 at this stage.