Data Protection and Digital Information (No. 2) Bill (First sitting)

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

I am a proud member of two trade unions.

Dr Rupa Huq (Ealing Central and Acton) (Lab)

Should we declare our membership of any union?

The Chair

My advice is that it is always better to declare.

Dr Huq

Okay. I am a member of Unison, formerly the National and Local Government Officers Association.

Christian Wakeford (Bury South) (Lab)

I am also a member of a union.

--- Later in debate ---
The Chair

We have a late entry—the last question will be from Rupa Huq.

Dr Huq

Q When I was on the Criminal Finances Bill Committee, a great deal was promised, but the National Crime Agency later claimed that it was not funded well enough to pursue all the unexplained wealth orders that had been promised. Do you think that a beefed-up Information Commission will be sufficiently well resourced to do all the things it is meant to do?

John Edwards: In short, yes. We are having discussions about the funding model with DSIT. We are funded by levies. There are two questions: one is about how those levies are set and where the burden of funding our office lies in the economy, and the second is about the overall quantum. We can always do more with more. If you look at the White Paper on artificial intelligence and the Vallance report, you will see that there is a role for our office to patrol the new boundaries of AI. In order to do that, we will have to be funded appropriately, but I have a good relationship with our sponsor Department and am confident that we will be able to discharge all the responsibilities in the Bill.

The Chair

Gentlemen, thank you very much indeed for your evidence. You can now breathe, relax and enjoy the rest of your day.

Examination of Witnesses

Eduardo Ustaran, Vivienne Artz and Bojana Bellamy gave evidence.

--- Later in debate ---
Mike Amesbury

Q Anna, how would you strengthen the Bill? If you were to table an amendment around employees and AI, what would it be?

Anna Thomas: I would advise very clear additional rights, and a duty to notify in advance what, how and why AI is being used where it has these impacts, and where it meets the threshold that I was just asked about. I would also advise having more consultation throughout design, development and deployment, and ongoing monitoring, because AI changes, and there are impacts that we have not thought about or cannot ascertain in advance.

There should also be a separate obligation to conduct an algorithmic impact assessment. The Bill does nudge in that direction, but it calls simply for an assessment, rather than a data protection impact assessment. We suggest grasping the opportunity to clarify that the assessment ought to cover these fundamental aspects and impacts at work—at least in the workplace context, though arguably there are lessons more widely.

Dr Huq

Q It is good to see the Ada Lovelace Institute represented; Ada Lovelace was a pioneering woman computer scientist who lived in my constituency, so it is a bit ironic that the one man here is representing the institute.

Michael Birtwistle: My colleagues could not be here, unfortunately, but they would have been better representatives in that sense.

Dr Huq

I want to touch on the equality issue again. A 2019 UN report on the digital welfare state made the point that algorithms repeat existing biases and entrench inequalities. How do we get around that? There are a lot of issues around trust and people’s rights and protections when it comes to this data. On top of those, there is this issue. Does the legislation address that? How can we overcome it?

Dr Tennison: As I have mentioned, there need to be more points in the Bill where explicit consideration of the public interest, including equality, is written into the sets of considerations that organisations, the ICO and the Secretary of State need to take into account when they are exercising their rights. That includes ensuring that public interest and equality are an explicit part of assessments of high-risk processing. That will help us to make sure that in the assessment process, organisations are made to look beyond the impacts on individuals and data subjects, and to look at the whole societal and economic impacts—even at the environmental impacts—that there might be from the processing that they are looking to carry out.

Anna Thomas: I agree. To add to what I said before, it would help to require a technical bias audit as well as a wider equality impact assessment. One idea that you may wish to consider is this: in the same way that the public sector sometimes has an obligation to consider the reduction of wider inequalities, you could have—well, not a full private sector model requiring that; that may need to be built up over time. We could, at the very least, require consideration of the desirability of reducing inequalities of opportunity and outcome as part of determining what reasonable and proportionate mitigations are in the circumstances; that would be easy to do.

Michael Birtwistle: I agree. There is also a question about institutional capability—ensuring that the institutions involved have the capability to react to the use of these technologies as they evolve. Specifically, it would be great to see the ICO asked in the Bill to produce guidance on how the safeguards in article 22C are to be implemented, as that will have a large effect on how automated decision making works in practice and is built into firms. The powers reserved for Ministers around interpreting meaningful human involvement, and legal and similarly significant effect, will also have a big impact. It would make more sense for those powers to sit with the ICO.

Dr Huq

Can I add one yes/no question?

The Chair

Yes.

Dr Huq

Q If we have an already overburdened regulatory framework, and we put AI on top of it, will it just fall through the cracks? Is there a danger that AI gets forgotten?

Michael Birtwistle: Yes, if regulators are not properly empowered.

Anna Thomas: I strongly agree, but they could be properly empowered and resourced, and in some instances given extra powers to interrogate or to redress what they have found. In 2020 we advised that there should be such a forum, and we are delighted to see the Digital Regulation Cooperation Forum. It could be given additional resources and additional bite, and we would certainly like to see work forefronted in its activities. The forum would be well placed, for example, to provide dedicated cross-cutting guidance on impacts at work.

Dr Tennison: I agree with the other panellists. The only thing I would add is that I think that the involvement of the public will be absolutely essential for moving trust forward in those circumstances.

The Chair

The last question is from Chi Onwurah.