Data Protection and Digital Information (No. 2) Bill (First sitting) Debate
The Chair: We will now hear oral evidence from John Edwards, the Information Commissioner, and Paul Arnold, the deputy chief executive and chief operating officer of the Information Commissioner’s Office. I remind all Members that questions should be limited to matters within the scope of the Bill, and that we must stick to the timings in the programme order, which the Committee has agreed. For this panel, we have until 9.55 am. Will the witnesses please introduce themselves for the record?
John Edwards: Kia ora! My name is John Edwards. I am the Information Commissioner. I took up the job at the beginning of January last year. I was previously the Privacy Commissioner of New Zealand for eight years.
Paul Arnold: I am Paul Arnold, the deputy chief executive and chief operating officer of the ICO. I took up that position in 2016.
Q
John Edwards: Yes, I do believe that an empowered citizenry is best placed to enjoy these rights. However, I also believe that the complexity of the modern digital environment creates such an information asymmetry that it is important for strong advocates such as the Information Commissioner’s Office to act as a proxy on behalf of the citizenry. I do not believe that responsibility for ensuring that high standards are set and adhered to in digital industries should be devolved purely to citizens.
Q
John Edwards: I do not believe so. We have been involved right from the outset. We made a submission on the initial White Paper. We have worked closely with officials. We have said that we want to see the Bill get to a position where I, as Information Commissioner, am able to stand up and say, “I support this legislation.” We have done that, which has meant we have achieved quite significant changes for the benefit of the people of the United Kingdom. It does not mean that we have just accepted what the Government have handed out. We have worked closely together. We have acted as advocates, and I believe that the product before you shows the benefits of that.
Q
Neil Ross: I do not have one to hand, but we could certainly follow up.
Q
Neil Ross: That is in relation to clause 85?
Q
Neil Ross: We do not think it is particularly appropriate for this scenario, given that the telecoms operators are just informing the ICO about activity that is happening on their service. It is not that they are the bad actors in the first instance; they are having to manage it. Ultimately, the first step is to clarify the aims of clause 85, and then whether the fine is appropriate is a subsequent question.
Q
Neil Ross: It will vary from company to company. Most companies will always seek to comply with the law. If you feel you need some kind of deterrent, that is something for Parliament to consider. The first step is to make sure that the law is really clear about what companies are being asked to do. At the moment, that is not the situation we are in.
Q
Anna Thomas: Yes, absolutely. In our model, we suggest that the impact assessment should incorporate not just the data protection elements, which we say remain essential, but also equality of opportunity and disparity of outcome—for example, equal opportunity for promotion, or access to benefits. That should be incorporated in a model that puts impacts on work at the forefront.
Q
Anna Thomas: I would advise very clear additional rights, and a duty to notify in advance what, how and why AI is being used where it has these impacts, and where it meets the threshold that I was just asked about. I would also advise having more consultation throughout design, development and deployment, and ongoing monitoring, because AI changes, and there are impacts that we have not thought about or cannot ascertain in advance.
There should also be a separate obligation to conduct an algorithmic impact assessment. The Bill does nudge in that direction, but it says only that there should be an assessment, rather than a data protection impact assessment. We suggest grasping the opportunity to clarify that, at least in the workplace context (though arguably there are lessons more widely), the assessment ought to cover these fundamental aspects and impacts at work.
Q
Michael Birtwistle: My colleagues could not be here, unfortunately, but they would have been better representatives in that sense.