Data Protection Bill [Lords] (Eighth sitting)
New clauses 7 to 11 touch on the question of how we ensure a degree of justice when it comes to decisions that are taken about us automatically. The growth in automated decision making has been exponential, and it brings risks. We need to ensure that the law is modernised to provide new protections and safeguards for our constituents in this new world.
I should say at the outset that this group of new clauses is rooted in the excellent work of the Future of Work commission, which produced a long, thought-provoking report. The Committee will be frustrated to hear that I am not going to read through it this afternoon, but, none the less, I want to tease out a couple of points.
The new clauses that we have proposed are well thought through and carefully crafted. I put on record my thanks to Helen Mountfield QC, an expert in equality law, and to Mike Osborne, professor of machine learning. Along with Ben Jaffey QC, a specialist in data law, they have been looking at some of the implications of automated decision making, which were discussed at length by the Future of Work commission.
Central to the new clauses is a concern that unaccountable and highly sophisticated automated or semi-automated systems are now making decisions that bear on fundamental elements of people’s work, including recruitment, pay and discipline. Just today, I was hearing about the work practices at the large Amazon warehouse up in Dundee, I think, where there is in effect digital casualisation. Employees are not put on zero-hours contracts, but they are put on four-hour contracts. They are guided around this gigantic warehouse by some kind of satnav technology on a mobile phone, but the device that guides them is also the device that tracks how long it takes them to put together a basket.
That information is then arranged into a nice league table of employees, from the fastest to the slowest, and decisions are then taken about who gets an extension to their contracted hours each week and who does not. That is a pretty automated kind of decision. My hon. Friend the Member for Eltham (Clive Efford) was describing to me the phenomenon of the butty man—the individual who decided who got to work on the docks or on the construction site on a particular day. In the pub at the end of the week, he divvied up the earnings and decided who got what, and who got work the following week. That kind of casualisation is now being reinvented in a digital era, and it is something that all of us ought to be incredibly concerned about.
These algorithms are, in the jargon, socio-technical: what results is a mixture of conventional software, human judgment and statistical models. The issue is that the decisions that result are very often not transparent, and are certainly not open to challenge. Such systems are now quite commonly used by employers and prospective employers, and their agents, who are able to analyse very large datasets and then deploy artificial intelligence and machine learning to make inferences about a person. Quite apart from the ongoing debates about how we define a worker and how we define employment—the subject of an excellent report by my old friend Matthew Taylor, now at the RSA—there are real questions about how we introduce new safeguards for workers in this country.
I want to highlight the challenge with a couple of examples. Recent evidence has revealed how many recruiters use—surprise, surprise—Facebook to seek candidates in ways that routinely discriminate against older workers by targeting job advertisements in a particular way. Slater and Gordon, a firm of excellent employment lawyers, found that about one in five company executives admits to unlawful discrimination when advertising jobs online. The challenge is that when jobs are advertised in a targeted way, by definition they are not open to applicants from all walks of life, because lots of people will simply never see the ads.
Women and those over the age of 50 are now most likely to be prevented from seeing an advert. Some 32% of company executives say that they have discriminated against those who are over 50, and a quarter have discriminated in that way against women. Nearly two thirds of executives with access to a profiling tool have said that they use it to actively seek out people based on criteria as diverse as age, gender and race. If we are to deliver a truly meritocratic labour market, where the rights of us all to shoot for jobs and to develop our skills and capabilities are protected, some of those practices have to stop. If we are to stop them, the law needs to change, and it needs to change now.
This battery of new clauses sets out to do five basic things. First, it makes some enhancements and refinements to the Equality Act 2010, in a way that ensures that protection from discrimination is applied to new forms of decision making, especially when those decisions engage core rights, such as rights on recruitment, terms of work, or dismissal. Secondly, it creates a new right to algorithmic fairness at work, to ensure equal treatment. Thirdly, it creates a right to an explanation when a decision is taken in a way that affects core elements of working life, such as a decision to hire, fire or suspend someone. Fourthly, it places a new duty on employers to undertake an algorithmic impact assessment. Fifthly, it provides new, realistic ways for individuals to enforce those rights in an employment tribunal. It is quite a broad-ranging set of reforms to a number of different parts of legislation.
My right hon. Friend is making a powerful case. Does he agree that this is exactly the kind of thing we ought to have been discussing at the outset of the Bill? The elephant in the room is that the Bill seems to me, overall, to be looking backwards rather than forwards. It was developed to implement the general data protection regulation, which has been discussed over many years. We are seeing this week just how fast-moving the world is. These are the kind of ideas that should have been driving the Bill in the first place.
Exactly. My hon. Friend makes such a good point. The challenge with the way that Her Majesty’s Government have approached the Bill is that they have taken a particular problem—that we are heading for the exit door of Europe, so we had better ensure that we get a data-sharing agreement in place, or it will be curtains for Britain’s services exports—and said, “We’d better find a way of incorporating the GDPR into British law as quickly as possible.” They should have thought imaginatively and creatively about how we strengthen our digital economy, and how we protect freedoms, liberties and protections in this new world, going back to first principles and thinking through the consequences. What we have is not quite a cut-and-paste job—I will not describe it in that way—but neither is it the sophisticated exercise in public law making that my hon. Friend describes as more virtuous.
I want to give the Committee a couple of examples of why this is so serious, as sometimes a scenario or two can help. Let us take an individual whom we will call “Mr A”. He is a 56-year-old man applying for website development roles. Typically, jobs in that sector are advertised online. In fact, many such roles are advertised only online, and they target users only in the 26 to 35 age bracket, through digital advertising or social media networks, whether that is Facebook, LinkedIn or others. Because Mr A is not in the age bracket being targeted, he never sees the ad; it will never pop up on his news feed or in digital advertising aimed at him. He therefore does not apply for the role, and does not even know he has been excluded from applying, all because he is the wrong age. Since he is excluded from opportunities because of his age, he finds it much harder to find a role.
The Equality Act, which was passed with cross-party consensus, prohibits less favourable treatment because of age—direct discrimination—including in relation to recruitment practices, and protects individuals based on their age. The Act sets out a number of remedies for individuals who have been discriminated against in that way, but it is not clear how the Bill proposes to correct that sin. Injustices in the labour market are multiplying, and there is a cross-party consensus for a stronger defence of workers. In fact, the Member of Parliament for the town where I grew up, the right hon. Member for Harlow (Robert Halfon), has led the argument in favour of the Conservative party rechristening itself the Workers’ party, and the Labour party was founded on a defence of labour rights, so I do not think this is an especially contentious matter. There is cross-party consensus about the need to stand up for workers’ rights, particularly when wages are stagnating so dramatically.
We are therefore not divided on a point of principle, but the Opposition have an ambition to do something about this growing problem. The Bill could be corrected in a way that made a significant difference. There is not an argument about the rights that are already in place, because they are enshrined in the Equality Act, with which Members on both sides of the House agree. The challenge is that the law as it stands is deficient and cannot be applied readily or easily to automated decision making.
If I may, I will write to the right hon. Gentleman with that precise number, but I know that the Equality and Human Rights Commission is very clear in its guidance that employers must act within the law. The law is very clear that there are to be no direct or indirect forms of discrimination.
The hon. Member for Cambridge raised the GDPR, and talked about looking forwards not backwards. Article 5(1)(a) requires processing of any kind to be fair and transparent. Recital 71 draws a link between ensuring that processing is fair and minimising discriminatory effects. Article 35 of the GDPR requires controllers to undertake data protection impact assessments for all high-risk activities, and article 36 requires a subset of those impact assessments to be sent to the Information Commissioner for consultation prior to the processing taking place. The GDPR also gives data subjects the tools to understand the way in which their data has been processed. Processing must be transparent, details of that processing must be provided to every data subject, whether or not the data was collected directly from them, and data subjects are entitled to a copy of the data held about them.
When automated decision-making is engaged, there are yet more safeguards. Controllers must tell the data subject, at the point of collecting the data, whether they intend to make such decisions and, if they do, provide meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject. Once a significant decision has been made, it must be communicated to the data subject, who must be given the opportunity to object to that decision so that it is re-taken by a human being.
We would say that the existing equality law and data protection law are remarkably technologically agnostic. Controllers cannot hide behind algorithms, but equally they should not be prevented from making use of them when they can do so in a sensible, fair and productive way.
Going back to the point raised by my right hon. Friend, I suspect that the number of cases will prove to be relatively low. The logic of what the Minister is saying would suggest that there is no algorithmic unfairness going on out there. I do not think that that is the case. What does she think?
I would be guided by the view of the Equality and Human Rights Commission, which oversees conduct in this area. I have no doubt that the Information Commissioner and the Equality and Human Rights Commission are in regular contact. If they are not, I very much hope that this will ensure that they are.
We are clear in law that there cannot be such discrimination as has been discussed. We believe that the framework of the law is there, and that the Information Commissioner’s Office and the Equality and Human Rights Commission, with their respective responsibilities, can help, advise and cajole, and, at times, enforce the law accordingly. I suspect that we will have some interesting times ahead of us with the release of the gender pay gap information. I will do a plug now, and say that any company employing more than 250 employees should abide by the law by 4 April. I look forward to reviewing the evidence from that exercise next month.
We are concerned that new clauses 7 and 8 are already dealt with in law, and that new clauses 9 to 11 would create an entirely new regulatory structure just for computer-assisted decision-making in the workplace, layered on top of the existing requirements of both employment and data protection law. We want the message to be clear to employers that there is no distinction between the types of decision-making. They are responsible for it, whether a human being was involved or not, and they must ensure that their decisions comply with the law.
Having explained our belief that the existing law meets the concerns raised by the right hon. Member for Birmingham, Hodge Hill, I hope he will withdraw the new clause.
I thank the Minister for her co-operative words and for the invitation to be part of this developing area of public policy. Having already plugged my New Statesman article, I will plug one part of it: the news that, having worked with some of the all-party parliamentary groups, we will launch a commission on technology ethics with one of the Minister’s colleagues, whose constituency I cannot quite remember, I am afraid, so I cannot make reference to him. But he is excellent.
We look forward to working with industry, stakeholders and politicians on a cross-party basis to get into the debate about technology ethics. I accept the Minister’s warm words about co-operating positively on this issue, and I hope that the outcomes of the commission can help to influence the work of the unit, or centre, and the Government’s response to it.
I would like this new unit to be given a statutory basis, to show its importance. It is vital that it has clout across Government and across Departments, so that it does not depend on having Ministers who are willing to take part in and listen to this debate, but continues under successive Ministers, should the current Minister be promoted, and under future Governments, too. However, in return for the Minister’s warm words of co-operation, I am happy not to press the new clause to a vote today.
Very briefly, I declare an interest as the chair of the all-party parliamentary group on data analytics. This subject is, of course, very dear to our hearts, and there is a great deal of common ground on it. I commend my hon. Friend the Member for Bristol North West for trying to put it into the Bill, because I, too, think it needs to be put on a statutory basis. However, I will draw attention to the very good work that has been done by a whole range of people in bringing forward the new structures.
I will just say again that in general I think we are heaping a huge amount of responsibility on the Information Commissioner; frankly, we are now almost inviting her to save the world. She and her office will need help. So an additional body, with resources, is required.
The Royal Society and the British Academy have done a lot of work on this issue over the last few years. I will conclude by referring back to a comment made by the hon. Member for Gordon, because it is worth saying that the Royal Society and the British Academy state in the conclusions of their report:
“It is essential to have a framework that engenders trust and confidence, to give entrepreneurs and decision-makers the confidence to act now, and to realise the potential of new applications in a way that reflects societal preferences.”
That is exactly the kind of thing we are trying to achieve. This body is essential and it needs to be set up as quickly as possible.