Data Protection Bill [HL]

Lord Whitty Excerpts
Monday 13th November 2017


Lords Chamber
Baroness Hamwee

My Lords, I too want to say a word about Amendment 75. The Human Rights Act trumps everything. To put it another way, the fundamental rights it deals with are incorporated into UK law, and they trump everything.

Like the noble Baroness, I believe that it is quite right that those who are responsible—humans—stop and think whether fundamental human rights are engaged. The right not to be subject to unfair discrimination has been referred to. Both the Bill and the GDPR recognise that as an issue in the provisions on profiling, but we need this overarching provision. Like other noble Lords, I find it unsettling to be faced with what are clearly algorithmic decisions.

When I was on holiday I went to a restaurant in France called L’Algorithme, which was very worrying but I was allowed to choose my own meal. If this work continues in the industry, perhaps I will not be allowed to do so next year. I wondered about the practicalities of this, and whether through this amendment we are seeking something difficult to implement—but I do not think so. Law enforcement agencies under a later part of the Bill may not make significant decisions adversely affecting a data subject. Judgments of this sort must be practicable. That was a concern in my mind, and I thought that I would articulate my dismissal of that concern.

Lord Whitty (Lab)

My Lords, my name is attached to two of these amendments. This is a very difficult subject in that we are all getting used to algorithmic decisions; not many people call them that, but they are what, in effect, decide major issues in people's lives and entice them into areas where they did not previously choose to be. Their profile, based on a number of interrelated algorithms, suggests that they may be interested in a particular commercial product or lifestyle move. It is quite difficult for those of my generation to grasp that, and difficult also for the legislative process to grasp it. So some of these amendments go back to first principles. The noble Baroness, Lady Hamwee, said that the issue of human rights trumps everything. Of course, we all agree with that, but human rights do not work unless you have methods of enforcing them.

In other walks of life, there are precedents. You may not be able to identify exactly who took a decision that, for example, women in a workforce should be paid significantly less than men for what were broadly equivalent jobs; it had probably gone on for decades. There was no clear paper trail to establish that discrimination took place but, nevertheless, the outcome was discriminatory. With algorithms, it is clear that some of the outcomes may be discriminatory, but you would not be able to put your finger on why they were discriminatory, let alone who or what decided that that discrimination should take place. Nevertheless, if the outcome is discriminatory, you need a way of redressing it. That is why the amendments to which I have added my name effectively say that the data subject should be made aware of the use to which their data is being put and that they would have the right of appeal to the Information Commissioner and of redress, as you would in a human-based decision-making process that was obscure in its origin but clear in relation to its outcome. That may be a slightly simplistic way in which to approach the issue, but it is a logical one that needs to be reflected in the Bill, and I hope that the Government take the amendments seriously.

Lord Ashton of Hyde

My Lords, I thank the noble Lord, Lord Clement-Jones, who introduced this interesting debate; of course, I recognise his authority and his newfound expertise in artificial intelligence from being chairman of the Select Committee on Artificial Intelligence. I am sure that he is an expert anyway, but it will only increase his expertise. I thank other noble Lords for their contributions, which raise important issues about the increasing use of automated decision-making, particularly in the online world. It is a broad category, including everything from personalised music playlists to quotes for home insurance and far beyond that.

The noble Lord, Lord Stevenson, before speaking to his amendments, warned about some of the things that we need to think about. He contrasted the position on human embryology and fertility research and the HFEA, which is not exactly parallel because, of course, the genie is out of the bottle in that respect, and things were prevented from happening at least until the matter was debated. But I take what the noble Lord said and agree with the issues that he raised. I think that we will discuss in a later group some of the ideas about how we debate those broader issues.

The noble Baroness, Lady Jones, talked about how she hoped that the repressive bits would be removed from the Bill. I did not completely understand her point, as this Bill is actually about giving data subjects increased rights, under both the GDPR and the law enforcement directive. The GDPR will take direct effect, but we are also applying those GDPR rights to other areas not subject to EU jurisdiction. I shall come on to her amendment on the Human Rights Act in a minute—but we agree with her that human beings should be involved in significant decisions. That is exactly what the Bill tries to do. We realise that data subjects should have rights when they are confronted by significant decisions made about them by machines.

The Bill recognises the need to ensure that such processing is correctly regulated. That is why it includes safeguards, such as the right to be informed of automated processing as soon as reasonably practicable and the right to challenge an automated decision made by the controller. The noble Lord, Lord Clement-Jones, alluded to some of these things. We believe that Clauses 13, 47, 48, 94 and 95 provide adequate and proportionate safeguards to protect data subjects of all ages, adults as well as children. I can give some more examples, because it is important to recognise data rights. For example, Clause 47 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly and adversely impacts on them, either legally or otherwise, unless required by law. If a decision is required by law, Clause 48 specifies the safeguards that controllers should apply to ensure that the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and giving them 21 days within which to ask the controller to reconsider the decision or retake it with human intervention.

I turn to Amendments 74, 134 and 136, proposed by the noble Lord, Lord Clement-Jones, which seek to insert into Parts 2 and 3 of the Bill a definition of the term,

“based solely on automated processing”,

to provide that human intervention must be meaningful. I do not disagree with the meaning of the phrase put forward by the noble Lord. Indeed, I think that is precisely the meaning that the phrase already has. The test here is what type of processing the decision having legal or significant effects is based on. Mere human presence or token human involvement will not be enough. The purported human involvement has to be meaningful; it has to address the basis for the decision. If a decision was based solely on automated processing, it could not have had meaningful input by a natural person. On that basis, I am confident that there is no need to amend the Bill to clarify this definition further.

In relation to Amendments 74A and 133A, the intention here seems to be to prevent any automated decision-making that impacts on a child. By and large, the provisions of the GDPR and of the Bill, Clause 8 aside, apply equally to all data subjects, regardless of age. We are not persuaded of the case for different treatment here. The important point is that the stringent safeguards in the Bill apply equally to all ages. It seems odd to suggest that the NHS could, at some future point, use automated decision-making, with appropriate safeguards, to decide on the eligibility for a particular vaccine—