Data Protection Bill [HL]

Baroness Jones of Moulsecoomb Excerpts
Monday 13th November 2017


Lords Chamber
These are all consequences of the overall approach we are taking. I look forward to further debates and the Minister’s response.
Baroness Jones of Moulsecoomb (GP)

My Lords, I speak to Amendment 75 in particular, but the whole issue of automated decision-making is extremely worrying.

As we have gone through this Bill, I have been desperately hoping that some of the most repressive bits are a negotiating tactic on the Government’s part, and that before Report they will say, “We’ll take out this really nasty bit if you let us leave in this not really quite so nasty bit”. I feel that this issue is one of the really nasty bits.

I thank Liberty, which has worked incredibly hard on this Bill and drawn out the really nasty bits. Under the Data Protection Act 1998, individuals have a qualified right not to be subject to purely automated decision-making and, to the extent that automated decision-making is permitted, they have a right to access information relating to such decisions made about them. The GDPR clarifies and extends these rights to the point that automated decisions that engage a person’s human rights are not permissible.

This could include being subjected to unfair discrimination. The noble Lord, Lord Clement-Jones, used the phrase, “unintended discrimination”—for example, detecting sexuality or diagnosing depression. The rapidly growing field of machine learning and algorithmic decision-making presents some new and very serious risks to our right to a private life and to freedom of expression and assembly. Such automated decision-making is deeply worrying when done by law enforcement agencies or the intelligence services because the decisions could have adverse legal effects. Such processing should inform rather than determine officers’ decisions.

We must have the requirement of human involvement as a vital safeguard for human rights. After the automated decision-making result has come out, there has to be a human who decides whether or not it is reasonable.

Baroness Hamwee

My Lords, I too want to say a word about Amendment 75. The Human Rights Act trumps everything. To put it another way, the fundamental rights it deals with are incorporated into UK law, and they trump everything.

Like the noble Baroness, I believe that it is quite right that those who are responsible—humans—stop and think whether fundamental human rights are engaged. The right not to be subject to unfair discrimination has been referred to. Both the Bill and the GDPR recognise that as an issue in the provisions on profiling, but we need this overarching provision. Like other noble Lords, I find it so unsettling to be faced with what are clearly algorithmic decisions.

When I was on holiday I went to a restaurant in France called L’Algorithme, which was very worrying, but I was allowed to choose my own meal. If this work continues in the industry, perhaps I will not be allowed to do so next year. I wondered about the practicalities of this, and whether through this amendment we are seeking something difficult to implement, but I do not think so. Under a later part of the Bill, law enforcement agencies may not make significant decisions adversely affecting a data subject. Judgments of this sort must be practicable. That was a concern in my mind, and I thought that I would articulate my dismissal of that concern.