Data Protection and Digital Information (No. 2) Bill (First sitting) Debate
Public Bill Committees
It is a really tight timetable this morning and we have nine minutes left. The Minister wants to ask some questions and there are three Members from the Opposition. I will call the Minister now. Perhaps you would be kind enough, Minister, to leave time for one question each from our three Members of the Opposition.
Q
John Edwards: The obligation to investigate every complaint does consume quite a lot of our resources. Can I ask my colleague to make a contribution on this point?
Paul Arnold: As the commissioner says, that duty to investigate all complaints can challenge us in terms of where we need to dedicate the majority of our resources.
Coming back to the previous question and answer, our role in trying to provide or maximise regulatory certainty means being able to invest as much resource as we can in that upstream advice, particularly in those novel, complex, finely balanced, context-specific areas. We add far more value if we can provide that support upstream.
The additional statutory objectives being added through the Bill will, overall, be a real asset to our accountability. Any regulator that welcomes independence also needs to welcome accountability: it is the means through which we describe how we think, how we act and the outcomes that we achieve. Those extra statutory objectives will be a real aid to us, and also to Parliament and our stakeholders. They really do crystallise and clarify why we are here and how we will prioritise our efforts and resources.
Q
John Edwards: I do not believe there is anything in the Bill that would put at risk the adequacy determination with the European Union. The test the Commission applies is whether the law is essentially equivalent. New Zealand lacks many of the features of the GDPR, as do Israel and Canada, each of which has maintained adequacy status. The importance of an independent regulator is preserved in this legislation. All the essential features of the UK GDPR, or the rights that citizens of the European Union enjoy, are present in the Bill, so I do not believe that there is a realistic prospect of the Commission reviewing the adequacy determination negatively.
It is a brutal cut-off, I am afraid, at 9.55 am. I have no discretion in this matter. It is a quick-fire round now, gentlemen. We need quick questions and quick answers, with one each from Carol Monaghan, Chi Onwurah and Mike Amesbury.
We still have not heard definitively whether our other guests can hear us or speak to us, so we are waiting for confirmation from the tech people. In the meantime, I invite the Minister to question Vivienne Artz.
Q
Vivienne Artz: The Bill provides for the opportunity for the Government to look at a range of issues and to move away from an equivalence approach to one in which we can consider more factors and features. The reality is that if you compare two pieces of legislation, you will always find differences because they come from different cultural backgrounds and different legal regimes. There will always be differences. The approach the UK is taking in the Bill is helpful because it looks at outcomes and broader issues such as the rule of law in different jurisdictions.
What is said on paper is not necessarily what always happens in practice; we need to look at it far more holistically. The legislation gives the Government the opportunity to take that broader and more common-sense view with regard to adequacy and not just do a word-by-word comparison of legislative provisions without actually looking at how the legislation is implemented in that jurisdiction and what other rights can support the outcomes. We can recognise that there is a different legal process and application but ask whether it still achieves the same end. That is what is really important. There is an opportunity not only to move more quickly in this space but to consider jurisdictions that might not be immediately obvious but none the less still offer appropriate safeguards for data.
Q
Vivienne Artz: The current process is incredibly cumbersome for businesses and, if I am honest, it provides zero transparency for individuals as well. It tends to be mostly a paperwork exercise—forgive me if that sounds provocative, but putting in place the model clauses is very often an expensive paperwork exercise. At the moment, it is difficult, time-consuming and costly, as the case may be.
The thing with adequacy is that it is achieved at a Government-to-Government level. It is across all sectors and provides certainty for organisations to move forward to share information, sell their goods and services elsewhere and receive those goods and services, and for consumers to access those opportunities as well. Adequacy is certainly the ideal. Whether it is achievable in all jurisdictions I do not know, but I think it is achievable for many jurisdictions to provide confidence for both consumers and businesses on how they can operate.
We can see Mr Ustaran and Ms Bellamy and they can hear us, but we cannot hear them, so we will carry on with questioning Vivienne Artz.
We have 12 minutes left and two Members are indicating that they wish to ask questions after you, Minister.
Q
Eduardo Ustaran: That is a very important question to address, because perhaps one of the ways in which we should be looking at this legislative reform is as a way of seeing how the existing GDPR framework in both the EU and the UK could, in fact, be made more effective, relevant and modern to deal with the issues we are facing right now. You refer to artificial intelligence as one of those issues.
GDPR in the EU and the UK is about five years old. It is not a very old piece of legislation, but a number of technological developments have happened in the past five years. More importantly, we have learned how GDPR operates in practice. This exercise in the UK is in fact very useful, not just for the UK but for the EU and the world at large, because it is looking at how to reform elements of existing law that is already in operation in order to make it more effective. That does not mean that the law needs to be more onerous or more strict, but it can be more effective at the same time as being more pragmatic. This is an important optic in terms of how we look at legislative reform, and not only from the UK’s point of view. The UK can make an effort to try to make the changes more visible outside the United Kingdom, and possibly influence the way in which EU GDPR evolves in the years to come.
Bojana Bellamy: I agree that we need a more flexible legal regime to enable the responsible use of AI and machine learning technologies. To be very frank with you, I was hoping the Bill would go a little further. I was hoping that there would be, for example, a recognition of the use of data in order to train algorithms to ensure that they are not discriminatory, not biased and function properly. I would have hoped that would be considered as an example of legitimate interests. That is certainly a way in which the Government can go further, because there are possibilities for the Secretary of State to augment those provisions.
We have seen that in the European AI Act, where they are now allowing greater use of data for algorithmic AI training, precisely in order to ensure that algorithms work properly. Dubai’s data protection law and some others are starting to do that too. I hope that we have good foundations to ensure further progression of the rules on AI. The rules on automated decision making are certainly better in this Bill than they are in GDPR. They are more realistic; they understand the fact that we are going to be faced with AI and machine learning taking more and more decisions, of course with the possibility of human intervention.
Again, to those who criticise the rules, I would say it is more important to have these ex post rights for individuals. We should emphasise, in the way we have done in the Bill, the right to information that there is AI involved, the right to make a representation, the right to contest a decision, and the right to demand human review or human intervention. To me, that is really what empowers individuals and gives them trust that the decisions will be made in a better way. There is no point in prohibiting AI in the way GDPR sort of does. In GDPR, we are going to have something of a clash between the fact that the world is moving toward greater use of AI and the fact that article 22 on automated decision making contains a prohibition that makes it subject to consent or contract. That is really unrealistic. Again, we have chosen a better way.
As a third small detail, I find the rules on research purposes to be smarter. They are rather complicated to read, to be frank, but I look forward to the consolidated, clean version. The fact that technological development research is included in commercial research will enable the organisations that are developing AI to create the rules in a responsible way that creates the right outcomes for people, and does not create harms or risks. To me, that is what matters. That is more important, and that is what is going to be delivered here. We have the exemptions from notices for research and so on, so I feel we will have better conditions for the development of AI in a responsible and trusted way. However, we must not take our eyes off it. We really need to link GDPR with our AI strategy, and ensure that we incentivise organisations to be accountable and responsible when they are developing and deploying AI. That will be a part of the ICO’s role as well.
Five minutes left. This will be the quick-fire round. I have two Members indicating that they wish to ask questions—Chi Onwurah.
Q
Neil Ross: Smart data is potentially a very powerful tool for increasing consumer choice, lowering prices and giving people access to a much broader range of services. The smart data provisions that the Government have introduced, as well as the Smart Data Council that they are leading, are really welcome. However, we need to go one step further and start to give people and industries clarity about where the Government will look first, in terms of what kind of smart data provisions they might look at and what kind of sectors they might go into. Ultimately, we need to make sure that businesses are well consulted and that there is a strong cost-benefit analysis. We then need to move ahead with the key sectors that we want to push forward on. As with nuisance calls, we will send some suggested text to the Committee to add those bits in, but it is a really welcome step forward.
Q
Neil Ross: I do not want to name specific sectors at this point. We are having a lot of engagement with our members about where we would like to see it first. The transport sector is one area where it has been used in the past and could have a large use in the future, but it is something that we are exploring. We are working directly with the Government through the Smart Data Council to try to identify the initial sectors that we could look at.
Q
Chris Combemale: I think the single biggest one that has troubled our members since the implementation of GDPR is the issue around legitimate interest, which was raised by the hon. Member for Folkestone and Hythe. The main issue is that GDPR contains six lawful bases for data processing, which in law are equal. For the data and marketing industry, the primary bases are legitimate interest and consent. For some reason it has become widely accepted through the implementation of GDPR that consent is required for marketing and for community activities. I am sure that you hear in your constituencies of many community groups that feel that they cannot go about organising local events because they must have consent to communicate. That has never been the intention behind the legislation; in fact, the European Court of Justice has always ruled that any legal interest could be a legitimate interest, including advertising and marketing.
If you look at what we do, which is effectively finding and retaining customers, the GDPR legislation says in recital 4 that privacy is a fundamental right, not an absolute right, and must be balanced against other rights, such as the right to conduct a business. You cannot conduct a business without the right to find and retain customers, just as you cannot run a charity without the right to find donors and volunteers who provide the money and the labour for your good cause. The clarification is really important across a wide range of use cases in the economy, but particularly ours. It was recognised in GDPR in recital 47. What the legislation does is give illustrative examples that are drawn from recitals 47, 48 and 49. They are not new examples; they are just given main-text credibility. It is an illustrative list. Really, any legal interest could be a legitimate interest for the purpose of data processing, subject to necessity and proportionality, which we discussed earlier with the Information Commissioner.
Q
Chris Combemale: In the sector that I represent, we have a fairly clear understanding of the gradients of risk. As I was saying earlier, many companies do not share data with other companies. They are interested solely in the relationships that they have with their existing customers or prospects. In that sense, all the customer attitudes to privacy research that we do indicates that people are generally comfortable sharing data with companies they trust and do business with regularly.