Data Protection and Digital Information (No. 2) Bill (First sitting)

The Chair

It is a brutal cut-off, I am afraid, at 9.55 am. I have no discretion in this matter. It is a quick-fire round now, gentlemen. We need quick questions and quick answers, with one each from Carol Monaghan, Chi Onwurah and Mike Amesbury.

Carol Monaghan (Glasgow North West) (SNP)

Q Clause 40 sets out the criteria by which a data controller can refuse data access requests. Do you think this is appropriate? Are you concerned that it may lead to a situation in which only those who can afford to pay a potential fee will be able to access their data?

John Edwards: Yes and no. Yes, I do believe it is an adequate provision, and no, I do not believe there will be an economic barrier to people accessing their information rights.

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

Q The Bill’s intent is to reduce burdens on organisations while maintaining high data protection standards. Do you agree that high data protection standards are promoted by well-informed and empowered citizens? What steps do you think the Bill takes to ensure greater information empowerment for citizens?

John Edwards: Yes, I do believe that an empowered citizenry is best placed to enjoy these rights. However, I also believe that the complexity of the modern digital environment creates such an information asymmetry that it is important for strong advocates such as the Information Commissioner’s Office to act as a proxy on behalf of the citizenry. I do not believe that responsibility should be devolved purely to citizens to ensure that high standards are set and adhered to in digital industries.

--- Later in debate ---
The Chair

We can see Mr Ustaran and Ms Bellamy and they can hear us, but we cannot hear them, so we will carry on with questioning Vivienne Artz.

Carol Monaghan

Q A number of organisations have expressed concerns about moving to a situation in which we can refuse subject access requests or indeed charge a fee. Do you believe the thresholds in the Bill are appropriate and proportionate?

Vivienne Artz: I do think the thresholds are appropriate and proportionate. In practice, most organisations choose not to charge, because it costs more to process the cheque than the fee is worth. Certainly, some sectors have been subject to very vexatious approaches through claims-management companies and others, where it is a bombarding exercise and it is unclear whether a genuine subject access request is in the best interests of the consumers, or made with their understanding and at their behest.

I am a great supporter of subject access requests; they are a way for individuals to exercise their right to understand what data is being processed. However, as a result of quirks of how we often operate in the UK, they are being used as a cheap pre-litigation investigative tool. That is unfortunate, and it has meant that we have had to put in place additional safeguards to ensure that they are used for the purpose for which they were provided: so that individuals can have transparency and clarity about what data is being processed, and by whom.

Carol Monaghan

Q Do you think the threshold for something to be considered vexatious or excessive is well understood?

Vivienne Artz: We have heard from the Information Commissioner that they are fairly clear on what that terminology means and that it will reflect the existing body of law in practice. I will be perfectly honest: it is not immediately clear to me, but there is certainly a boundary within which it could be determined, and that is something we would rely on the Information Commissioner to provide further guidance on. It is probably also likely to be contextual.

Carol Monaghan

Q How frequently do we expect such requests to be refused off the back of this legislation?

Vivienne Artz: I think it depends on the sector. I come from the financial services sector, so the types of subject access requests we get tend to be specific to us. I think organisations are going to be reluctant to refuse a subject access request because, at the end of the day, an individual can always escalate to the Information Commissioner if they feel they have been unfairly treated. I think organisations understand their responsibility to act in the best interests of the individual at all times.

The Chair

Q Ms Bellamy and Mr Ustaran, we can now hear both of you. Would you be kind enough to introduce yourselves?

Bojana Bellamy: Thank you for inviting me to this hearing. My name is Bojana Bellamy. I lead the Centre for Information Policy Leadership. We are a global data privacy and data policy think-and-do-tank operating out of London, Brussels and Washington, and I have been in the world of data privacy for almost 30 years.

Eduardo Ustaran: Good morning. My name is Eduardo Ustaran. I am a partner at Hogan Lovells, based in London, and I co-lead our global privacy and cyber-security practice, a team of over 100 lawyers who specialise in data protection law all over the world.

--- Later in debate ---
Sir John Whittingdale

Q Thank you. Mr Combemale, will you set out some of the obstacles for your organisation, and how you would like the Bill to reduce them?

Chris Combemale: I think the single biggest one that has troubled our members since the implementation of GDPR is the issue around legitimate interest, which was raised by the hon. Member for Folkestone and Hythe. The main issue is that GDPR contains six lawful bases for data processing, which in law are equal. For the data and marketing industry, the primary bases are legitimate interest and consent. For some reason it has become widely accepted through the implementation of GDPR that it requires consent for marketing and for community activities. I am sure that you hear in your constituencies of many community groups that feel they cannot go about organising local events because they must have consent to communicate. That was never the intention behind the legislation; in fact, the European Court of Justice has always ruled that any legal interest could be a legitimate interest, including advertising and marketing.

If you look at what we do, which is effectively finding and retaining customers, GDPR says in recital 4 that privacy is a fundamental right, not an absolute right, and must be balanced against other rights, such as the right to conduct a business. You cannot conduct a business without the right to find and retain customers, just as you cannot run a charity without the right to find the donors and volunteers who provide the money and the labour for your good cause. The clarification is really important across a wide range of use cases in the economy, but particularly ours, and it was recognised in GDPR in recital 47. What the Bill does is give illustrative examples drawn from recitals 47, 48 and 49. They are not new examples; they are simply given the authority of the main text. It is an illustrative list. Really, any legal interest could be a legitimate interest for the purposes of data processing, subject to necessity and proportionality, which we discussed earlier with the Information Commissioner.

Carol Monaghan

Q We have heard already this morning that a number of words and phrases could have some ambiguity associated with them, such as the word “excessive”, and the Bill allowing certain cookies that are “low risk”. Do you think that the phrase “low risk” is well enough understood?

Chris Combemale: In the sector that I represent, we have a fairly clear understanding of the gradients of risk. As I was saying earlier, many companies do not share data with other companies. They are interested solely in the relationships that they have with their existing customers or prospects. In that sense, all the customer attitudes to privacy research that we do indicates that people are generally comfortable sharing data with companies they trust and do business with regularly.

Carol Monaghan

Q Would that then be the definition of low risk?

Chris Combemale: I would not want to suggest what the legal definition is. To us in direct marketing and in the Data and Marketing Association, existing customer relationships—loyal customers who trust and are sometimes passionate about the brands they interact with—are low risk. Higher risk is when you come to share data with other companies, but again, much of that activity and data sharing is essential to creating relevance, and with the right protections it is not a hugely high-risk activity. Then you move up the scale: the higher the degree of automation and the higher the degree of third-party data, the greater the risk, and you have to put in place mitigations accordingly. I am not a lawyer—I am just a poor practitioner—so I cannot define it from a legal point of view, but it is clear in the context of our industry how risk elevates depending on what you are doing.

Carol Monaghan

Q I might come back to that in a second, but I think Neil wanted to add something.

Neil Ross: I was going to say that you can see how Chris has interpreted it through the lens of his industry, but the feedback we have had from our members, who operate across a range of industries, suggests that there is quite a lot of confusion about what that terminology might mean. The rest of the Bill aims to clarify elements of the GDPR and put them on the face of the Bill, but this provision seems to be going in the other direction. It raises concern and confusion.

That is why our approach has always been that you will get more clarity by aligning the Privacy and Electronic Communications Regulations 2003 more closely with the GDPR—which has clear legal bases, processes and an understanding of what is high and low risk, a balancing test, and so on—than through this fairly broad and poorly understood term “low risk”. We have concerns about how it will operate across a range of sectors.

Carol Monaghan

Q Chris, you said that you are not a lawyer and cannot define what low risk is, but there will of course have to be some sort of definition. Have we captured that well enough?

Chris Combemale: Coming back to our discussion about legitimate interest and the proportionality balancing test, or legitimate interest impact assessments: when you are thinking about what you are planning to do with your customers, it is a requirement of good marketing, even without the legislation but also within it, to think about how what you are planning to do will impact your customers’ privacy, and then to mitigate. The important thing is not to say, “There’s no risk,” “It is low risk,” or “It is high risk”; it is to understand that the higher the risk, the greater the mitigations you have to put in place. You may conclude that you should not do something because the risk level is too high. That is what balancing tests do, and decisions and outcomes result from them.

Carol Monaghan

Q The potential difficulty here is that the responsibility is being put on the company. You have described a responsible company that categorises levels of risk and takes action accordingly. Without a clear definition, if it were a less scrupulous company, would there be a grey area?

Chris Combemale: We do a lot of work combating rogue traders, and we provide evidence to cases from our work with the telephone preference service and other activities. Rogue traders, especially those with criminal intent, will generally ignore the legislation regardless of what you do and however clear it is, but I think you are right. An important part of GDPR is that it puts a lot of responsibility on companies to consider their particular activity, their particular customer base and the nature of their audience. Age UK, a charity with a lot of vulnerable elderly customers, has to have greater protections in place and put more thought into how it does things than a nightclub marketing to under-30s, who are very technologically literate and digitally conversant.

When we do customer attitudes to privacy studies, we see three broad segments—the data unconcerned, data pragmatists and data fundamentalists—and they require different treatment. It is incumbent on any company, in a marketing context, to understand who its audience and customer base are, and to design programmes appropriately to build trust and long-term relationships over time. That is an important element of GDPR, from a marketer’s perspective. I should add that it should not take legislation to force marketers to do that.

The Chair

There are five minutes left, and two Members are seeking to ask questions.

--- Later in debate ---
The Chair

Mr Birtwistle?

Michael Birtwistle: I very much agree with my fellow panellists on those points. If you are thinking about concrete ways to improve what is in the Bill, the high level of protection around automated decision making currently sits in article 22B, which looks at decisions using special category data as an input. Looking at the output, you could also add decisions that involve high-risk processing, which is terminology already used throughout the Bill. That would mean that, where automated decision making is used for decisions that involve high-risk processing, you would need meaningful human involvement, explicit consent or substantial public interest.

Carol Monaghan

Q Jeni, can I come back to you on automated decision making? You have suggested that a requirement to notify people when an automated decision is made about them would be a useful inclusion in the Bill. Do you think enough consideration has been given to that?

Dr Tennison: The main thing that we have been arguing for is that it should be the wider set of decision subjects, rather than data subjects, who get rights relating to notification, or who can have a review. It is really important that there be notification of automated decision making, and as much transparency as possible about the details of it, and the process that an organisation has gone through in making an impact assessment of what that might mean for all individuals, groups and collective interests that might be affected by that automated decision making.

Carol Monaghan

Q We can probably broadly split these decisions into two categories. Decisions are already being made by algorithms online, based on what we are looking at. If I look up a paint colour online and then start getting adverts from different paint companies, I am not too worried about that. I am more concerned that decisions could be made about me in the workplace, or about energy tariffs, as we have heard. That is more serious. Is there a danger that, if we notify individuals of all the automated decisions that are made, it will end up like the cookie scenario and we will just ignore it all?

Dr Tennison: I do not think it is a matter of notifying people about all automated decision making. The Bill suggests limiting that to legally or otherwise significant decisions, so that we have those additional rights only as regards things that will really have an impact on people’s lives.

Carol Monaghan

Q And you are not comfortable that those have been considered properly in the Bill.

Dr Tennison: I am not comfortable that they are directed to the right people.

Carol Monaghan

Q The subject, rather than the decision maker.

Dr Tennison: Yes.

Carol Monaghan

Anna, did you want to come in on that?

Anna Thomas: The last question, about the threshold, is really important, and it tends to suggest that work should have separate consideration, which is happening all over the world. Last week, Canada introduced its automated decision-making directive and extended it to work; we have been working with Canada on that. Japan has a strategy that deals expressly with work. In the United States there are various examples, including the California Privacy Rights Act, of rules that give work special attention in this context. Our proposal for addressing the issue of the threshold is that, where decision making impacts access to work, termination, pay, or contractual status or terms, you should always notify, assess, and do your best to promote positive impacts and reduce negative ones; for everything else, you should do so when there is significant impact.

Carol Monaghan

Q Is there a danger that automated decisions could fall foul of the Equality Act, if biases are not properly accounted for?

Anna Thomas: Yes, absolutely. In our model, we suggest that the impact assessment should incorporate not just the data protection elements, which we say remain essential, but equality of opportunity and disparity of outcome—for example, equal opportunity for promotion, or access to benefits. That should be incorporated in a model that foregrounds and considers impacts on work.

Mike Amesbury

Q Anna, how would you strengthen the Bill? If you were to table an amendment around employees and AI, what would it be?

Anna Thomas: I would advise very clear additional rights, and a duty to notify in advance what, how and why AI is being used where it has these impacts, and where it meets the threshold that I was just asked about. I would also advise having more consultation throughout design, development and deployment, and ongoing monitoring, because AI changes, and there are impacts that we have not thought about or cannot ascertain in advance.

There should also be a separate obligation to conduct an algorithmic impact assessment. The Bill does nudge in that direction, but it says only that there should be an assessment, rather than a data protection impact assessment. We suggest grasping the opportunity to clarify that the assessment ought to cover these fundamental aspects and impacts at work—at least in the workplace context, though arguably there are lessons more widely.