Data Protection Bill [Lords] (Eighth sitting)

Department: Home Office
Thursday 22nd March 2018

Public Bill Committees
The Chair

I call the Minister, whose birthday it is today.

The Parliamentary Under-Secretary of State for the Home Department (Victoria Atkins)

Thank you, Mr Streeter, and what a wonderful birthday present it is to be serving on the Committee.

It is a joy, actually, to be able to agree with the Opposition on the principle that equality applies not only to decisions made by human beings or with human input, but to decisions made solely by computers and algorithms. On that, we are very much agreed. The reason that we do not support the new clauses is that we believe that the Equality Act already protects workers against direct or indirect discrimination by computer or algorithm-based decisions. As the right hon. Member for Birmingham, Hodge Hill rightly said, the Act was passed with cross-party consensus.

The Act is clear that in all cases, the employer is liable for the outcome of any of their actions, or those of their managers or supervisors, or those that are the result of a computer, algorithm or mechanical process. If, during a recruitment process, applications from people with names that suggest a particular ethnicity were rejected for that reason by an algorithm, the employer would be liable for race discrimination, whether or not they designed the algorithm with that intention in mind.
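To make the liability point concrete: because the Act judges outcomes rather than intent, an employer using automated screening would be expected to check those outcomes across protected groups. The sketch below is a minimal, hypothetical audit of selection rates; the four-fifths threshold used here is a common screening heuristic, not a standard drawn from the Equality Act, and all names and data are invented.

```python
# Illustrative only: a simple selection-rate audit of the kind an employer
# might run over an automated screening tool's outcomes. All names, data
# and thresholds here are hypothetical.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, accepted) pairs -> acceptance rate per group."""
    totals, accepted = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        accepted[group] += int(ok)
    return {g: accepted[g] / totals[g] for g in totals}

def flag_disparity(rates, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` times the
    best group's rate (the 'four-fifths' heuristic; a screening check only,
    not a legal test under the Equality Act)."""
    best = max(rates.values())
    return {g: r for g, r in rates.items() if r < threshold * best}

outcomes = [("group A", True), ("group A", True), ("group A", False),
            ("group B", False), ("group B", False), ("group B", True)]
print(flag_disparity(selection_rates(outcomes)))  # {'group B': 0.333...}
```

On the Minister's account of the Act, a disparity surfaced by a check of this kind would expose the employer to liability whether or not the algorithm was designed with discriminatory intent.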

The right hon. Gentleman placed a great deal of emphasis on advertising and, again, we share his concerns that employers could seek to treat potential employees unfairly and unequally. The Equality and Human Rights Commission publishes guidance for employers to ensure that there is no discriminatory conduct and that fair and open access to employment opportunities is made clear in the way that employers advertise posts.

The same principle applies in the provision of services. An automated process that intentionally or unintentionally denies a service to someone because of a protected characteristic will lay the service provider open to a claim under the Act, subject to any exceptions.

Liam Byrne

I am grateful to the Minister for giving way, not least because it gives me the opportunity to wish her a happy birthday. Could she remind the Committee how many prosecutions there have been for discriminatory advertising because employers chose to target their adverts?

Victoria Atkins

If I may, I will write to the right hon. Gentleman with that precise number, but I know that the Equality and Human Rights Commission is very clear in its guidance that employers must act within the law. The law is very clear that there are to be no direct or indirect forms of discrimination.

The hon. Member for Cambridge raised the GDPR, and talked about looking forwards not backwards. Article 5(1)(a) requires processing of any kind to be fair and transparent. Recital 71 draws a link between ensuring that processing is fair and minimising discriminatory effects. Article 35 of the GDPR requires controllers to undertake data protection impact assessments for all high-risk activities, and article 36 requires a subset of those impact assessments to be sent to the Information Commissioner for consultation prior to the processing taking place. The GDPR also gives data subjects the tools to understand the way in which their data has been processed. Processing must be transparent, details of that processing must be provided to every data subject, whether or not the data was collected directly from them, and data subjects are entitled to a copy of the data held about them.

When automated decision-making is engaged there are yet more safeguards. Controllers must tell the data subject, at the point of collecting the data, whether they intend to make such decisions and, if they do, provide meaningful information about the logic involved, as well as the significance and the envisaged consequences for the data subject of such processing. Once a significant decision has been made, that must be communicated to the data subject, and they must be given the opportunity to object to that decision so that it is re-taken by a human being.
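The safeguards just described form a sequence: disclose at the point of collection, explain the logic and envisaged consequences, notify each significant decision, and route any objection to a human who re-takes the decision. Below is a minimal sketch of how a controller might wire that flow together; all class, function and message names are hypothetical rather than taken from the Bill or the GDPR.

```python
# Hypothetical sketch of the automated-decision safeguards described above:
# disclosure at collection, meaningful information about the logic,
# notification of each significant decision, and human review on objection.
from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    outcome: str
    logic_summary: str           # "meaningful information about the logic involved"
    reviewed_by_human: bool = False

class AutomatedDecisionProcess:
    def __init__(self, notify):
        self.notify = notify     # callback used to inform the data subject
        self.log = []

    def collect(self, subject_id):
        # Safeguard 1: tell the subject, at the point of collection, that
        # automated decisions will be made, with the logic and consequences.
        self.notify(subject_id, "Automated decisions will be made using your "
                                "data; summary of logic and envisaged "
                                "consequences attached.")

    def decide(self, subject_id, outcome, logic_summary):
        decision = Decision(subject_id, outcome, logic_summary)
        self.log.append(decision)
        # Safeguard 2: communicate each significant decision, with the
        # opportunity to object.
        self.notify(subject_id, f"Decision: {outcome}. You may object.")
        return decision

    def object(self, decision, human_reviewer):
        # Safeguard 3: on objection, the decision is re-taken by a human.
        decision.outcome = human_reviewer(decision)
        decision.reviewed_by_human = True
        return decision

proc = AutomatedDecisionProcess(notify=lambda s, msg: print(f"[{s}] {msg}"))
proc.collect("subject-1")
d = proc.decide("subject-1", "declined", "score below cut-off")
proc.object(d, human_reviewer=lambda dec: "approved")  # a human re-takes it
```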

We would say that the existing equality law and data protection law are remarkably technologically agnostic. Controllers cannot hide behind algorithms, but equally they should not be prevented from making use of them when they can do so in a sensible, fair and productive way.

Daniel Zeichner

Going back to the point raised by my right hon. Friend, I suspect that the number of cases will prove to be relatively low. The logic of what the Minister is saying would suggest that there is no algorithmic unfairness going on out there. I do not think that that is the case. What does she think?

Victoria Atkins

I would be guided by the view of the Equality and Human Rights Commission, which oversees conduct in this area. I have no doubt that the Information Commissioner and the Equality and Human Rights Commission are in regular contact. If they are not, I very much hope that this will ensure that they are.

We are clear in law that there cannot be such discrimination as has been discussed. We believe that the framework of the law is there, and that the Information Commissioner’s Office and the Equality and Human Rights Commission, with their respective responsibilities, can help, advise and cajole, and, at times, enforce the law accordingly. I suspect that we will have some interesting times ahead of us with the release of the gender pay gap information. I will do a plug now, and say that any company employing more than 250 employees should abide by the law by 4 April. I look forward to reviewing the evidence from that exercise next month.

We are concerned that new clauses 7 and 8 are already dealt with in law, and that new clauses 9 to 11 would create an entirely new regulatory structure just for computer-assisted decision-making in the workplace, layered on top of the existing requirements of both employment and data protection law. We want the message to be clear to employers that there is no distinction between the types of decision-making. They are responsible for it, whether a human being was involved or not, and they must ensure that their decisions comply with the law.

Having explained our belief that the existing law meets the concerns raised by the right hon. Member for Birmingham, Hodge Hill, I hope he will withdraw the new clause.

Liam Byrne

I think it was in “Candide” that Voltaire introduced us to the word “Panglossian”, and we have heard a rather elegant and Panglossian description of a perfect world in which all is fine in the labour market. I am much more sceptical than the Minister. I do not think the current law is sufficiently sharp, and I am concerned that the consequence of that will be injustice for our constituents.

The Minister raised a line of argument that it is important for us to consider. The ultimate test of whether the law is good enough must be what is actually happening out there in the labour market. I do not think it is good enough; she thinks it is fine. On the nub of the argument, a few more facts might be needed on both sides, so we reserve the right to come back to the issue on Report. This has been a useful debate. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 13

Review of Electronic Commerce (EC Directive) Regulations

“(1) The Secretary of State shall lay before both Houses of Parliament a review of the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of personal data.

(2) A review under subsection (1) shall be laid before Parliament by 31 January 2019.”—(Liam Byrne.)

This new clause would order the Secretary of State to review the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of data and lay that review before Parliament before 31 January 2019.

Brought up, and read the First time.

--- Later in debate ---
Liam Byrne

Yes. My hon. Friend has done an extraordinary job of exposing that minor scandal. I am surprised that it has not had more attention in the House, but hopefully once the Bill has passed it is exactly the kind of behaviour that we can begin to police rather more effectively.

I am sure that Ministers will recognise that there is a need for this. No doubt their colleagues in the Department for Education are absolutely all over it. I was talking to a headteacher in the Minister’s own constituency recently—an excellent headteacher, in an excellent school, who is a personal friend. The horror with which headteachers regard the arrival of the GDPR is something to behold. Heaven knows, our school leaders and our teachers have enough to do. I call on Ministers to make their task, their lives, and their mission that bit easier by accepting the new clause.

Victoria Atkins

Our schools handle large volumes of sensitive data about the children they educate. Anyone who has any involvement with the education system, whether personally through their families and the apps on their mobile phones, or in a professional capacity as constituency MPs, is very conscious of the huge responsibilities that school leaders have in handling that data properly and well, and in accordance with the law. As data controllers in their own right, schools and other organisations in the education system will need to ensure that they have adequate data-handling policies in place to comply with their legal obligations under the new law.

Work is going on already. The Department for Education has a programme of advice and education for school leaders, covering everything from blogs, a guidance video and speaking engagements to work encouraging system suppliers to be proactive in helping schools to become GDPR-compliant. Research is also being undertaken with parents on model privacy notices that will help schools to make parents and pupils more aware of the data about children used in the sector. The Department for Education is also shaping a toolkit that will bring together various pieces of guidance and best practice to address the specific needs of those who process education data. In parallel, the Information Commissioner has consulted on guidance specifically addressing issues about the fair and lawful processing of children’s data. Everyone is very alive to the issue of protecting children and their data.

At this point, the Government want to support the work already taking place and the provisions on guidance that are already in the Bill. Our concern is that legislating for a code now could be seen as a reason for schools to wait and see, rather than continuing their preparations for the new law. But it may be that in due course the weight of argument swings in favour of a sector-specific code of practice. That can happen, and it does not have to be in the Bill: clause 128 provides that the Secretary of State may require the Information Commissioner to prepare additional codes of practice for the processing of personal data, and the commissioner can issue further guidance under her own steam, using her powers under article 57 of the GDPR, without needing any direction from the Secretary of State.

I hope that the ongoing work reassures the right hon. Gentleman and that he will withdraw the new clause at this stage.

Liam Byrne

I am reassured by that and I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 17

Personal data ethics advisory board and ethics code of practice

‘(1) The Secretary of State must appoint an independent Personal Data Ethics Advisory Board (“the board”).

(2) The board’s functions, in relation to the processing of personal data to which the GDPR and this Act applies, are—

(a) to monitor further technical advances in the use and management of personal data and their implications for the rights of data subjects;

(b) to monitor the protection of the individual and collective rights and interests of data subjects in relation to their personal data;

(c) to ensure that trade-offs between the rights of data subjects and the use and management of personal data are made transparently, inclusively, and with accountability;

(d) to seek out good practices and learn from successes and failures in the use and management of personal data;

(e) to enhance the skills of data subjects and controllers in the use and management of personal data.

(3) The board must work with the Commissioner to prepare a data ethics code of practice for data controllers, which must—

(a) include a duty of care on the data controller and the processor to the data subject;

(b) provide best practice for data controllers and processors on measures, which in relation to the processing of personal data—

(i) reduce vulnerabilities and inequalities;

(ii) protect human rights;

(iii) increase the security of personal data; and

(iv) ensure that the access, use and sharing of personal data is transparent, and the purposes of personal data processing are communicated clearly and accessibly to data subjects.

(4) The code must also include guidance in relation to the processing of personal data in the public interest and the substantial public interest.

(5) Where a data controller or processor does not follow the code under this section, the data controller or processor is subject to a fine to be determined by the Commissioner.

(6) The board must report annually to the Secretary of State.

(7) The report in subsection (6) may contain recommendations to the Secretary of State and the Commissioner relating to how they can improve the processing of personal data and the protection of data subjects’ rights by improving methods of—

(a) monitoring and evaluating the use and management of personal data;

(b) sharing best practice and setting standards for data controllers; and

(c) clarifying and enforcing data protection rules.

(8) The Secretary of State must lay the report made under subsection (6) before both Houses of Parliament.

(9) The Secretary of State must, no later than one year after the day on which this Act receives Royal Assent, lay before both Houses of Parliament draft regulations in relation to the functions of the Personal Data Ethics Advisory Board as listed in subsections (2), (3), (4), (6) and (7) of this section.

(10) Regulations under this section are subject to the affirmative resolution procedure.’—(Darren Jones.)

This new clause would establish a statutory basis for a Data Ethics Advisory Board.

Brought up, and read the First time.

--- Later in debate ---
Liam Byrne

I beg to move, That the clause be read a Second time.

I will touch on this new clause only very briefly, because I hope the Minister will put my mind at rest with a simple answer. For some time, there has been concern that data collected by the police through automatic number plate recognition technology is not adequately ordered, organised or policed by a code of practice. A code of practice is probably required to put the police well and truly within the boundaries of the Police and Criminal Evidence Act 1984, the Data Protection Act 1998 and the Bill.

With this new clause, we are basically asking the Secretary of State, under subsection (1), to issue a code of practice in connection with the operation by the police of ANPR systems, and we ask that it conform to section 67 of the Police and Criminal Evidence Act 1984. I hope the Minister will just say that a code of practice is on the way, so that we can safely withdraw the new clause.

Victoria Atkins

I hope Committee members have had the chance to see my response to the questions of the hon. Member for Sheffield, Heeley on Tuesday about ANPR, other aspects of surveillance and other types of law enforcement activity.

I assure the right hon. Member for Birmingham, Hodge Hill that ANPR data is personal data and is therefore caught by the provisions of the GDPR and the Bill. We recognise the need to ensure the use of ANPR is properly regulated. Indeed, ANPR systems are governed by not one but two existing codes of practice. The first is the code issued by the Information Commissioner, exercising her powers under section 51 of the Data Protection Act 1998. It is entitled “In the picture: A data protection code of practice for surveillance cameras and personal information”, and was published in June 2017. It is clear that it covers ANPR. It also refers to data protection impact assessments, which we debated last week. It clearly states that where the police and others use or intend to use an ANPR system, it is important that they

“undertake a privacy impact assessment to justify its use and show that its introduction is proportionate and necessary.”

The second code is brought under section 29 of the Protection of Freedoms Act 2012, which required the Secretary of State to issue a code of practice containing guidance about surveillance camera systems. The “Surveillance camera code of practice”, published in June 2013, already covers the use of ANPR systems by the police and others. It sets out 12 guiding principles for system operators. Privacy is very much a part of that. The Protection of Freedoms Act established the office of the Surveillance Camera Commissioner, who has a number of statutory functions in relation to the code, including keeping its operation under review.

In addition, a published memorandum of understanding between the Surveillance Camera Commissioner and the Information Commissioner sets out how they will work together. We also have the general public law principles of the Human Rights Act 1998 and the European convention on human rights. I hope that the two codes I have outlined, the Protection of Freedoms Act and the Human Rights Act reassure the right hon. Gentleman, and that he will withdraw his new clause.

Liam Byrne

I am indeed mollified. I beg to ask leave to withdraw the clause.

Clause, by leave, withdrawn.

New Clause 21

Targeted dissemination disclosure notice for third parties and others (No. 2)

“In Schedule 19B of the Political Parties, Elections and Referendums Act 2000 (Power to require disclosure), after paragraph 10 (documents in electronic form) insert—

10A (1) This paragraph applies to the following organisations and individuals—

(a) a recognised third party (within the meaning of Part 6);

(b) a permitted participant (within the meaning of Part 7);

(c) a regulated donee (within the meaning of Schedule 7);

(d) a regulated participant (within the meaning of Schedule 7A);

(e) a candidate at an election (other than a local government election in Scotland);

(f) the election agent for such a candidate;

(g) an organisation or a person notified under subsection 2 of this section;

(h) an organisation or individual formerly falling within any of paragraphs (a) to (g); or

(i) the treasurer, director, or another officer of an organisation to which this paragraph applies, or has been at any time in the period of five years ending with the day on which the notice is given.

(2) The Commission may under this paragraph issue at any time a targeted dissemination disclosure notice, requiring disclosure of any settings used to disseminate material which it believes were intended to have the effect, or were likely to have the effect, of influencing public opinion in any part of the United Kingdom, ahead of a specific election or referendum, where the platform for dissemination allows for targeting based on demographic or other information about individuals, including information gathered by information society services.

(3) This power shall not be available in respect of registered parties or their officers, save where they separately and independently fall into one or more of categories (a) to (i) of sub-paragraph (1).

(4) A person or organisation to whom such a targeted dissemination disclosure notice is given shall comply with it within such time as is specified in the notice.”

This new clause would amend the Political Parties, Elections and Referendums Act 2000 to allow the Electoral Commission to require disclosure of settings used to disseminate material where the platform for dissemination allows for targeting based on demographic or other information about individuals.—(Liam Byrne.)

Brought up, and read the First time.