Data Protection Bill [Lords] (Eighth sitting)
Public Bill Committees

I want to add some further comments in support of the new clauses.
The Science and Technology Committee, one of the two Committees that I sit on, has had a detailed debate on algorithmic fairness. It is important to understand what the new clauses seek to do. There is a nervousness about regulating algorithms or making them completely transparent, because there are commercial sensitivities in the code itself and in whether and how it is published.
These new clauses seek to put the obligation on to the human beings who produce the algorithms to think about things such as equalities law, to ensure that we do not hardcode biases into them, as my hon. Friend the Member for Cambridge said on Second Reading. It is important to understand how the new clauses apply to the inputs, to what happens in the black box of the algorithm, and to the outputs. The inputs to an algorithm are the rules that a human codes and sets, and the data that they put into it for it to make a decision.
The new clauses seek to say that the human must have a consistent and legal obligation to understand the equalities impacts of their coding and data entry into the black box of the algorithm to avoid biases coming out at the other end. As algorithms are increasingly used, that is an important technical distinction to understand, and it is why the new clauses are very sensible. On that basis, I hope the Government will support them.
That sounds like a terrifying application; my hon. Friend’s daughter very much has my sympathies. He is absolutely right. Lord Knight made this point with such power in the other place. The technology is advancing so quickly, and schools know that if they can monitor things in new, more forensic ways, that helps them to do their job of improving children’s education. However, it has costs and consequences too. I hope that Her Majesty’s Government will look sympathetically on the task of teachers, as they confront this 200-and-heaven-knows-what-page Bill.
Does my right hon. Friend share my concerns that, in response to a number of written parliamentary questions that I tabled, it became clear that the Government gave access to the national pupil database, which is controlled by the Government, to commercial entities, including newspapers such as The Daily Telegraph?
Yes. My hon. Friend has done an extraordinary job of exposing that minor scandal. I am surprised that it has not had more attention in the House, but hopefully, once the Bill has passed, that is exactly the kind of behaviour that we can begin to police rather more effectively.
I am sure that Ministers will recognise that there is a need for this. No doubt their colleagues in the Department for Education are absolutely all over it. I was talking to a headteacher in the Minister’s own constituency recently—an excellent headteacher, in an excellent school, who is a personal friend. The horror with which headteachers regard the arrival of the GDPR is something to behold. Heaven knows, our school leaders and our teachers have enough to do. I call on Ministers to make their task, their lives, and their mission that bit easier by accepting the new clause.
I am reassured by that and I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 17
Personal data ethics advisory board and ethics code of practice
‘(1) The Secretary of State must appoint an independent Personal Data Ethics Advisory Board (“the board”).
(2) The board’s functions, in relation to the processing of personal data to which the GDPR and this Act apply, are—
(a) to monitor further technical advances in the use and management of personal data and their implications for the rights of data subjects;
(b) to monitor the protection of the individual and collective rights and interests of data subjects in relation to their personal data;
(c) to ensure that trade-offs between the rights of data subjects and the use and management of personal data are made transparently, inclusively, and with accountability;
(d) to seek out good practices and learn from successes and failures in the use and management of personal data;
(e) to enhance the skills of data subjects and controllers in the use and management of personal data.
(3) The board must work with the Commissioner to prepare a data ethics code of practice for data controllers, which must—
(a) include a duty of care on the data controller and the processor to the data subject;
(b) provide best practice for data controllers and processors on measures, which in relation to the processing of personal data—
(i) reduce vulnerabilities and inequalities;
(ii) protect human rights;
(iii) increase the security of personal data; and
(iv) ensure that the access, use and sharing of personal data are transparent, and the purposes of personal data processing are communicated clearly and accessibly to data subjects.
(4) The code must also include guidance in relation to the processing of personal data in the public interest and the substantial public interest.
(5) Where a data controller or processor does not follow the code under this section, the data controller or processor is subject to a fine to be determined by the Commissioner.
(6) The board must report annually to the Secretary of State.
(7) The report in subsection (6) may contain recommendations to the Secretary of State and the Commissioner relating to how they can improve the processing of personal data and the protection of data subjects’ rights by improving methods of—
(a) monitoring and evaluating the use and management of personal data;
(b) sharing best practice and setting standards for data controllers; and
(c) clarifying and enforcing data protection rules.
(8) The Secretary of State must lay the report made under subsection (6) before both Houses of Parliament.
(9) The Secretary of State must, no later than one year after the day on which this Act receives Royal Assent, lay before both Houses of Parliament draft regulations in relation to the functions of the Personal Data Ethics Advisory Board as listed in subsections (2), (3), (4), (6) and (7) of this section.
(10) Regulations under this section are subject to the affirmative resolution procedure.’—(Darren Jones.)
This new clause would establish a statutory basis for a Data Ethics Advisory Board.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
New clause 17 is in my name and that of my right hon. Friend the Member for Birmingham, Hodge Hill. I do not take it personally that my other hon. Friends have not signed up to it; that was probably my fault for not asking them to do so in advance.
The new clause would put the data and artificial intelligence ethics unit, which I am very pleased that the Government have now funded and established, through the spring statement, in the Minister’s Department, on a statutory footing. It comes off the back of conversations with the Information Commissioner in Select Committee about the differing roles of enforcing legislation and of having a public debate about what is right and wrong and where the boundaries are in this ever-changing space. The commissioner was very clear that we need to have that debate with the public, but that it is not for her to do it. The ICO is an enforcer of legislation. The commissioner has a lot on her plate and is stretched for resources as it is. She felt that the new unit in the Department would be a good place to have the debate about technology ethics, and I support that assertion.
With no disrespect to any colleagues, I do not think that the House of Commons, and perhaps even the Select Committees to a certain extent, necessarily has the time, energy or resource to get into the real detail of some of the technology ethics questions, nor to take them out to the public, who are the people we need to be having the debate with.
The new clause would therefore establish in law the monitoring, understanding and public debate obligations that I, the ICO and others agree ought to sit with the new data ethics unit, while making it clear that enforcement is reserved for the Information Commissioner. I tabled the new clause because, although I welcome the Government’s commitment to the data and AI ethics unit, I feel that there is potential for drift. The new clause would put an anchor in the technology ethics remit of the unit, so that it understands and communicates the ethical issues and does not get sidetracked into other issues, although it may take on other work on top of this anchor. However, I think this anchor needs to be placed.
Also, I recognise that the Minister and the Secretary of State supported the recommendation made previously under the Cameron Government and I welcome that, but of course, with an advisory group within the Department, it may be a future Minister’s whim that they no longer wish to be advised on these issues, or it may be the whim of the Treasury—with, potentially, budget cuts—that it no longer wishes to fund the people doing the work. I think that that is not good enough and that putting this provision in the Bill would give some security to the unit for the future.
I will refer to some of the comments made about the centre for data ethics and innovation, which I have been calling the data and AI ethics unit. When it was first discussed, in the autumn Budget of November 2017, the Chancellor of the Exchequer said that the unit would be established
“to enable and ensure safe, ethical and ground-breaking innovation in AI and data-driven technologies. This world-first advisory body will work with government, regulators and industry to lay the foundations for AI adoption”.
Although that is a positive message, it says to me that its job is to lay the foundations for AI adoption. I agree with that as an aim, but it does not mean that at its core is understanding and communicating the ethical challenges that we need to try to understand and legislate for.
I move on to the recruitment advertising, from January of this year, for personnel to run the unit, which said that the centre would be at the heart of plans to make the UK the best place in the world for AI businesses. Again, that is a positive statement, but one about AI business adoption in this country, not ethical requirements. It also said that the centre would advise on ethical and innovative uses of data-driven technology. Again, that is positive, but I just do not think it puts understanding, communicating and debating the ethics at the heart of the role.
My concern is that, while all this is very positive, there is still a gap. I agree with the Government that we need to maintain our position as a world leader in artificial intelligence and that it is something we should be very proud of; especially as we go through the regrettable process of leaving the European Union and the single market, we need to hold on to the strengths we have in the British economy. However, this week has shown that there is a need for an informed public debate on ethics. As no doubt all members of the Committee have read in my New Statesman article of today, one of the issues we have as the voice of our constituents in Parliament is that, for our constituents to understand or take a view on what is right or wrong in this quickly developing space, we all need to understand it in the first place: to understand what is happening with our data and in the technology space, to understand what is being done with it and, having understood it, then to take a view on it. The Cambridge Analytica scandal has been so newsworthy because the majority of people understandably had no idea that all this was happening with their data. How we legislate for and set ethical frameworks must first come from a position of understanding.
That is why the new clause sets out that there should be an independent advisory board. The use of such boards is commonplace across Departments, so I hope that is not a contentious proposal. Subsection (2) sets out some of the things that the board should do. The Minister will note that I have chosen the language quite carefully: the board should monitor developments, monitor the protection of rights and look out for good practice. It does not seek to step on the toes of the Information Commissioner or the powers of the Government, but merely to understand, educate and inform.
The new clause goes on to suggest that the new board would work with the commissioner to put together a code of practice for data controllers. A code of practice with a technology ethics basis is important because it says to every data controller, regardless of what they do or what type of work they do, that we require ethical boundaries to be set and understood in the culture of what we do with big data analytics in this country. In working with the commissioner, this board would add great value to the way that we work with people’s personal data, by setting out that code of practice.
I hope that the new clause adds value to the work that the Minister’s Department is already doing. My hope is that adding it to the Bill (albeit that current Parliaments cannot, of course, bind their successors, and it could be legislated away in future) would give solid grounding to the concept that we take technology ethics issues seriously; that we seek to understand them properly, not as politicians or busy civil servants but through experts who can be out with our stakeholders understanding the public policy consequences; and that we seek to have a proper debate with the public, working with enforcers such as the ICO to set, in this wild west, the boundaries of what is and is not acceptable. I commend the new clause to the Committee and hope that the Government will support it.
I thank the hon. Gentleman for raising this very important subject. He is absolutely right. Data analytics have the potential to transform whole sectors of society and the economy—law enforcement and healthcare to name but some. I agree with him that a public debate around the issues is required, and that is one of the reasons why the Government are creating the centre for data ethics and innovation, which he mentioned. The centre will advise the Government and regulators on how they can strengthen and improve the way that data and AI are governed, as well as supporting the innovative and ethical use of that data.
I thank the Minister for her co-operative words and for the invitation to be part of this developing area of public policy. Having already plugged my New Statesman article, I will plug a part of it, which is the news that, having worked with some of the all-party parliamentary groups, I am pleased that we will launch a commission on technology ethics with one of the Minister’s colleagues, whose constituency I cannot quite remember, I am afraid, so I cannot make reference to him. But he is excellent.
We look forward to working with industry, stakeholders and politicians on a cross-party basis to get into the debate about technology ethics. I accept the Minister’s warm words about co-operating on this issue positively, in the hope that the outcomes of this commission can help to influence the work of the unit, or centre, and the Government’s response to it.
I would like this new unit to be given a statutory basis, to show its importance. It is vital that it has clout across Government and across Departments, so that it is not merely a positive thing while we have Ministers who are willing to take part in and listen to this debate, but something that will continue with successive Ministers, should the current Minister be promoted, and with future Governments too. However, in return for the Minister’s warm words of co-operation, I am happy not to press the new clause to a vote today.
Very briefly, I declare an interest as the chair of the all-party parliamentary group on data analytics. This is a subject, of course, that is very dear to our hearts. I will just say that there is a great deal of common ground on it. I commend my hon. Friend the Member for Bristol North West for trying to put it into the Bill, because I, too, think it needs to be put on a statutory basis. However, I will just draw attention to a lot of the very good work that has been done by a whole range of people in bringing forward the new structures.
I will just say again that in general I think we are heaping a huge amount of responsibility on the Information Commissioner; frankly, we are now almost inviting her to save the world. She and her office will need help. So an additional body, with resources, is required.
The Royal Society and the British Academy have done a lot of work on this issue over the last few years. I will conclude by referring back to a comment made by the hon. Member for Gordon, because it is worth saying that the Royal Society and the British Academy state in the conclusions of their report:
“It is essential to have a framework that engenders trust and confidence, to give entrepreneurs and decision-makers the confidence to act now, and to realise the potential of new applications in a way that reflects societal preferences.”
That is exactly the kind of thing we are trying to achieve. This body is essential and it needs to be set up as quickly as possible.
I beg to ask leave to withdraw the new clause.
Clause, by leave, withdrawn.
New Clause 20
Automated number plate recognition (No. 2)
“(1) Vehicle registration marks captured by automated number plate recognition systems are personal data.
(2) The Secretary of State shall issue a code of practice in connection with the operation by the police of automated number plate recognition systems.
(3) Any code of practice under subsection (1) shall conform to section 67 of the Police and Criminal Evidence Act 1984.”—(Liam Byrne.)
This new clause requires the Secretary of State to issue a code of practice in connection with the operation by the police of automated number plate recognition systems, vehicle registration marks captured by which are to be considered personal data in line with the opinion of the Information Commissioner.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
I will touch on this new clause only very briefly, because I hope the Minister will put my mind at rest with a simple answer. For some time, there has been concern that data collected by the police through automatic number plate recognition technology is not adequately ordered, organised or policed by a code of practice. A code of practice is probably required to put the police well and truly within the boundaries of the Police and Criminal Evidence Act 1984, the Data Protection Act 1998 and the Bill.
With this new clause, we are simply asking the Secretary of State to issue a code of practice in connection with the operation by the police of ANPR systems, and asking that it conform to section 67 of the Police and Criminal Evidence Act 1984. I hope the Minister will just say that a code of practice is on the way, so that we can safely withdraw the new clause.