Data Protection Bill [ Lords ] (Eighth sitting)

Debate between Liam Byrne and Daniel Zeichner
Thursday 22nd March 2018


Public Bill Committees
Liam Byrne

New clauses 7 to 11 touch on the question of how we ensure a degree of justice in the decisions that are taken about us automatically. The number of decisions made by automated systems has grown exponentially, and that growth carries risks. We need to ensure that the law is modernised to provide new protections and safeguards for our constituents in this new world.

I should say at the outset that this group of new clauses is rooted in the excellent work of the Future of Work commission, which produced a long, thought-provoking report. The Committee will be frustrated to hear that I am not going to read through that this afternoon, but, none the less, I want to tease out a couple of points.

The basket of new clauses that we have proposed is well thought through and has been carefully crafted. I put on record my thanks to Helen Mountfield QC, an expert in equality law, and to Mike Osborne, professor of machine learning. Along with Ben Jaffey QC, a specialist in data law, they have been looking at some of the implications of automated decision making, which were discussed at length by the Future of Work commission.

Central to the new clauses is a concern that unaccountable and highly sophisticated automated or semi-automated systems are now making decisions that bear on fundamental elements of people’s work, including recruitment, pay and discipline. Just today, I was hearing about the work practices at the large Amazon warehouse up in Dundee, I think, where there is in effect digital casualisation. Employees are not put on zero-hours contracts, but they are put on four-hour contracts. They are guided around this gigantic warehouse by some kind of satnav technology on a mobile phone, but the device that guides them also tracks how long it takes them to put together a basket.

That information is then arranged in a nice league table of employees, from the fastest to the slowest, and decisions are then taken about who gets an extension to their contracted hours each week and who does not. That is a pretty automated kind of decision. My hon. Friend the Member for Eltham (Clive Efford) was describing to me the phenomenon of the butty man—the individual who decided who on a particular day got to work on the docks or on the construction site. In the pub at the end of the week, he divvied up the earnings and decided who got what, and who got work the following week. That kind of casualisation is now being reinventeded in a digital era, and it is something that all of us ought to be incredibly concerned about.
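The logic being described is crude. Purely as an illustration—the names, timings and cut-off below are invented, not any real employer’s system—a minimal sketch of how tracked pick times can become an automated decision about who is offered extra hours might look like this:

```python
# Hedged illustration only: invented workers, timings and cut-off,
# sketching how tracked pick times can drive an hours decision.
from statistics import mean

# Hypothetical tracking data: seconds taken to assemble each basket.
pick_times = {
    "worker_a": [310, 295, 330],
    "worker_b": [410, 398, 405],
    "worker_c": [350, 342, 361],
}

# The league table: employees ranked from fastest to slowest average.
league_table = sorted(pick_times, key=lambda w: mean(pick_times[w]))

# Automated decision: only the top half of the table is offered hours
# beyond the four-hour contract this week.
cutoff = len(league_table) // 2
for rank, worker in enumerate(league_table, start=1):
    print(rank, worker, "offered extra hours:", rank <= cutoff)
```

No human being need look at that table before the hours are allocated; that is precisely what makes it an automated decision.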

What happens with these algorithms is called, in the jargon, socio-technical: what results is a mixture of conventional software, human judgment and statistical models. The issue is that very often the decisions that are made are not transparent, and they are certainly not open to challenge. Such systems are now quite commonly used by employers and prospective employers, and their agents, who are able to analyse very large datasets and then deploy artificial intelligence and machine learning to make inferences about a person. Quite apart from the ongoing debates about how we define a worker and how we define employment—the subject of an excellent report by my old friend Matthew Taylor, now at the RSA—there are real questions about how we introduce new safeguards for workers in this country.
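To see why no one can point to the decisive factor, consider a minimal sketch—entirely hypothetical scores and rules, not any real employer’s system—of how a statistical model, conventional screening software and a human judgment are blended into a single socio-technical decision:

```python
# Hedged illustration: hypothetical thresholds and rules, sketching a
# "socio-technical" decision that mixes a model, code and a human.

def model_score(candidate: dict) -> float:
    # Stand-in for an opaque statistical model trained on historical
    # data; any bias in that data is inherited silently here.
    return 0.7 if candidate["years_experience"] > 5 else 0.4

def passes_rules(candidate: dict) -> bool:
    # Conventional software: fixed screening rules.
    return candidate["has_right_to_work"]

def decide(candidate: dict, recruiter_approves: bool) -> bool:
    # The outcome mixes all three inputs, so a rejected candidate
    # cannot tell which element was decisive.
    return (passes_rules(candidate)
            and model_score(candidate) > 0.5
            and recruiter_approves)

candidate = {"years_experience": 3, "has_right_to_work": True}
print(decide(candidate, recruiter_approves=True))  # False - but why?
```

The rejection is real, but the reason is buried across three components, none of which is obliged to explain itself.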

I want to highlight the challenge with a couple of examples. Recent evidence has revealed how many recruiters use—surprise, surprise—Facebook to seek candidates in ways that routinely discriminate against older workers by targeting advertisements for jobs in a particular way. Slater and Gordon, which is a firm of excellent employment lawyers, showed that about one in five company executives admit to unlawful discrimination when advertising jobs online. The challenge is that when jobs are advertised in a targeted way, by definition they are not open to applicants from all walks of life, because lots of people just will not see the ads.

Women and those over the age of 50 are now most likely to be prevented from seeing an advert. Some 32% of company executives say that they have discriminated against those who are over 50, and a quarter have discriminated in that way against women. Nearly two thirds of executives with access to a profiling tool have said that they use it to actively seek out people based on criteria as diverse as age, gender and race. If we are to deliver a truly meritocratic labour market, where the rights of us all to shoot for jobs and to develop our skills and capabilities are protected, some of those practices have to stop. If we are to stop them, the law needs to change, and it needs to change now.

This battery of new clauses sets out to do five basic things. First, the new clauses enhance and refine the Equality Act 2010 to ensure that protection from discrimination applies to new forms of decision making, especially when those decisions engage core rights, such as rights on recruitment, terms of work or dismissal. Secondly, they create a new right to algorithmic fairness at work, to ensure equal treatment. Thirdly, they provide a right to an explanation when a decision is taken in a way that affects core elements of working life, such as a decision to hire, fire or suspend someone. Fourthly, they place a new duty on employers to undertake an algorithmic impact assessment, and fifthly, they create new, realistic ways for individuals to enforce those rights in an employment tribunal. It is quite a broad-ranging set of reforms to a number of different parts of legislation.
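On the third of those—the right to an explanation—it may help to sketch, purely as an illustration of the principle rather than the drafting of the new clauses, what recording the factors behind an automated decision could look like in practice:

```python
# Hedged illustration of a decision record that makes an explanation
# possible; field names and values are invented for the example.
from dataclasses import dataclass, field

@dataclass
class DecisionRecord:
    subject: str
    outcome: str
    factors: dict = field(default_factory=dict)

    def explanation(self) -> str:
        # Turn the recorded factors into a human-readable reason.
        reasons = ", ".join(f"{k}={v}" for k, v in self.factors.items())
        return f"{self.subject}: {self.outcome} because {reasons}"

record = DecisionRecord(
    subject="worker_c",
    outcome="no extension of hours",
    factors={"avg_basket_seconds": 351, "league_position": 3},
)
print(record.explanation())
```

If the factors are never recorded, no tribunal can ever test them; that is the gap these new clauses address.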

Daniel Zeichner (Cambridge) (Lab)

My right hon. Friend is making a powerful case. Does he agree that this is exactly the kind of thing we ought to have been discussing at the outset of the Bill? The elephant in the room is that the Bill seems to me, overall, to be looking backwards rather than forwards. It was developed to implement the general data protection regulation, which has been discussed over many years. We are seeing this week just how fast-moving the world is. These are the kind of ideas that should have been driving the Bill in the first place.

Liam Byrne

Exactly; my hon. Friend makes a very good point. The challenge with the way in which Her Majesty’s Government have approached the Bill is that they have taken a particular problem—that we are heading for the exit door of Europe, so we had better ensure that we get a data-sharing agreement in place, or it will be curtains for Britain’s services exports—and said, “We’d better find a way of incorporating the GDPR into British law as quickly as possible.” Instead, they should have gone back to first principles, thought imaginatively and creatively about how we strengthen our digital economy and how we protect freedoms, liberties and protections in this new world, and thought through the consequences. What we have is not quite a cut-and-paste job—I will not describe it in that way—but neither is it the sophisticated exercise in public law-making that my hon. Friend describes as the more virtuous path.

I want to give the Committee a couple of examples of why this is so serious, as sometimes a scenario or two can help. Let us take an individual whom we will call “Mr A”. He is a 56-year-old man applying for website development roles. Jobs in that sector are typically advertised online—indeed, many such roles are advertised only online—and the advertisements target users only in the age profile 26 to 35, through digital advertising or social media networks, whether that is Facebook, LinkedIn or others. Because Mr A is not in the age bracket being targeted, he never sees the ad: it will never pop up in his news feed or in digital advertising aimed at him. He therefore does not apply for the role, and he does not even know that he has been excluded from applying for it, all as a consequence of his being the wrong age. Because he is excluded from opportunities on account of his age, he finds it much harder to find a role.
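The mechanism could hardly be simpler to build. Here is a minimal sketch—hypothetical names and ages, illustrating the kind of filtering that audience targeting performs—of how Mr A is excluded without ever knowing it:

```python
# Hedged illustration: invented users, sketching age-bracket targeting.

users = [
    {"name": "Mr A", "age": 56},
    {"name": "Ms B", "age": 29},
]

def eligible_audience(users, min_age=26, max_age=35):
    # The targeting criterion silently filters out everyone else;
    # the excluded user is never shown the advert at all.
    return [u for u in users if min_age <= u["age"] <= max_age]

for user in eligible_audience(users):
    print("Show 'web developer wanted' advert to", user["name"])
# Only Ms B is listed; Mr A cannot apply for a role he never saw.
```

The discrimination happens at the point of distribution, before any application exists to be judged.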

The Equality Act, which was passed with cross-party consensus, prohibits less favourable treatment because of age—direct discrimination—including in relation to recruitment practices. The Act sets out a number of remedies for individuals who have been discriminated against in that way, but it is not clear how the Bill proposes to correct that sin. Injustices in the labour market are multiplying, and there is a cross-party consensus for a stronger defence of workers. In fact, the Member of Parliament for the town where I grew up, the right hon. Member for Harlow (Robert Halfon), has led the argument in favour of the Conservative party rechristening itself the Workers’ party, and the Labour party was founded on a defence of labour rights, so I do not think this is an especially contentious matter. There is cross-party consensus about the need to stand up for workers’ rights, particularly when wages are stagnating so dramatically.

We are therefore not divided on a point of principle, but the Opposition have an ambition to do something about this growing problem. The Bill could be corrected in a way that made a significant difference. There is not an argument about the rights that are already in place, because they are enshrined in the Equality Act, with which Members on both sides of the House agree. The challenge is that the law as it stands is deficient and cannot be applied readily or easily to automated decision making.

Data Protection Bill [ Lords ] (First sitting)

Debate between Liam Byrne and Daniel Zeichner
Tuesday 13th March 2018


Public Bill Committees
Daniel Zeichner

It is a pleasure to serve under your chairmanship, Mr Hanson. I shall begin by declaring an interest: I chair the all-party parliamentary group on data analytics, the secretariat to which is provided by Policy Connect. In that capacity, I have had the pleasure of many discussions about the GDPR with experts over the past couple of years. I reflect on what a very good process it is that British members of the European Parliament are able to intervene on such matters at an early stage, to make sure that when the legislation finally comes to us it already has our slant on it. That may not be possible in future when we come to discuss such legislation.

I represent a university city, so research is a key part of what we do. It is on that basis that I tabled the amendments, and I am grateful to the Wellcome Trust and the Sanger Institute, which have given me advice on how the amendments would help them by providing certainty for the work that they do. The purpose of amendment 141 is to ensure that university researchers and public bodies with a research function are able to use what is called the “task in the public interest” lawful basis for processing personal data, where consent is not a viable lawful basis. I apologise for going into some detail, but it is important for universities and researchers that there is clarity.

As the Bill is drafted, clause 8 provides a definition of lawfulness of processing personal data under GDPR article 6(1)(e). Paragraphs (a) to (d) of clause 8 set out a narrow list of activities that could be included in the scope of public interest. I am told that that list is imported from paragraph 5 of schedule 2 to the Data Protection Act 1998, but I am also told that the drafters have omitted a version of the final and most general sub-paragraph from that list, which reads:

“for the exercise of any other functions of a public nature exercised in the public interest by any person.”

It is speculated that the sub-paragraph may have been taken out of the list to tighten the drafting and to avoid a tautology in defining “public interest”, but the worry is that taking it out has made the clause too restrictive. The explanatory notes indicate that the list in clause 8—that is, paragraphs (a) to (d)—is not intended to be exhaustive, but the Wellcome Trust and the Sanger Institute worry that it reduces the public interest terminology to a very narrow concept, confined to public and judicial administration.

There was a very lengthy and very good debate on this matter in the other place. One of our universities’ main functions is to undertake research, which will often involve processing personal data. In some cases, GDPR-compliant consent, which may seem the obvious route, will not be the most appropriate lawful basis on which to process that data. It is therefore really important that an article 6 lawful basis for processing is available to university researchers with certainty and clarity.

The Government have included a reference to medical research purposes in the explanatory notes, but the worry is that the notes do not necessarily have weight in law, and the reference excludes many other types of research that are rightly conducted by universities. That is not a satisfactory resolution of the problems that are faced.

The amendment tries to enable research functions to be conducted by public bodies such as universities without doing what the Government fear, which is broadening the definition of “public interest” too far. The wording retains the structure of the list in the Data Protection Act, from which the current provisions were imported, but it narrows it in two ways: it specifies the purpose of the processing—research functions, which must be the reason for the processing—and it specifies who is doing the processing, the basis being available only to public bodies, as defined in the previous clause.

We are aware that the Government are worried about adding further subsections to the list; I think they said that doing so could open the floodgates in some way. However, I am told that there is not really any evidence to suggest that the current wording of paragraph 5 of schedule 2 to the Data Protection Act 1998, which contains a very broad notion of public interest, has in any way “opened the floodgates”. To give some sense of the concerns that have arisen, the processes by which university researchers seek permission to do things are quite complicated, and some of the relevant bodies have already issued guidance. I am told that the Health Research Authority issued guidance on the GDPR before Christmas, advising that a clause on using legitimate interests should be included in the Bill.

There is confusion in the research sector, and there is a wider worry that, if this is not clear, it will be open to legal challenge. While some institutions will be able to bear that risk, the worry is that smaller research bodies would conclude that, given the lack of clarity, the work would not be worth the risk. I hope that the Government will think hard about the suggestion. It comes from the research institutions themselves and would give clarity and reassurance. I hope that the Minister will accept the amendment.

Liam Byrne

I want to say a few words in support of my hon. Friend and these important amendments. I think there is an acknowledgement on both sides of the Committee that if we are to prosper in the world that is coming, we are going to need to increase the amount of money that we spend on research and development and make sure that a research-driven economy reaches every corner of the country.

The world of innovation and research is changing very quickly. I think it is next year that China becomes the world’s largest science spender for the first time in several centuries. If we are to compete in this new world, we need to invest more in our R&D base. The Government have made some helpful commitments in this area. Their proposals are not quite as ambitious as the Labour amendments, but none the less all progress is welcome.

I hope that the Minister will reflect on the reality that the way in which research is conducted in our country is changing. In the past, I have called that a shift from the cathedral to the campus. Once upon a time, big firms put a lot of people in a large building and prayed for the best. Now, they are building business parks and creating ecosystems of innovation, where they may have a shared research and development facility, otherwise known as a university. There may be big international companies with global reach organised around them, but there are also scores of much smaller firms—some as small as a couple of post-docs in a shared lab. If we look at facilities such as BT at Adastral Park, the Crick Institute or GSK in Stevenage, we see big global companies with hundreds of smaller companies around them that are undertaking research with much greater speed and much lower risk, but with an impact that could change the world.

We cannot jeopardise the conduct of that research. My hon. Friend the Member for Cambridge is right to point out that where there is doubt about the law, or the powers and freedoms of research firms, there is a risk that such firms simply will not undertake such work in the UK, and instead will seek relationships either with global companies or, increasingly, with universities that have R&D facilities elsewhere. We want to create the world’s best place to undertake new science, and that means having a research regime that is the best in the world. We therefore need a data protection regime that helps and does not hinder, which is why the Government should accept these carefully crafted amendments.