Data Protection and Digital Information (No. 2) Bill (First sitting) Debate

Mark Eastwood (Dewsbury) (Con)

Given that one of today’s witnesses is from Prospect, I wish to declare that I am a member of that union.

Stephanie Peacock (Barnsley East) (Lab)

I am a proud member of a trade union. I refer the Committee to my entry in the Register of Members’ Financial Interests.

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

I am a proud member of two trade unions.

--- Later in debate ---
The Chair

May I gently say to the witnesses that this is a big room, so you will need to project your voices so that we can hear your evidence?

Stephanie Peacock

Q Good morning and welcome. The Bill creates a new body corporate to replace the corporation sole. What impact, both in the short and long term, do you think that will have on its ability to carry out its functions?

John Edwards: The corporation sole model is fit for a number of purposes. That was the structure that I had back home in New Zealand. For an organisation such as the Information Commissioner’s Office, it is starting to buckle under the weight. It will benefit, I think, from the support of a formal board structure, with colleagues with different areas of expertise appointed to ensure that we bring an economy-wide perspective to our role, which as we have heard from the declarations of interest spans almost every aspect of human activity.

There will be some short-term, transitional challenges as we move from a corporation sole to a board structure. We will need to employ a chief executive, for example, as well as getting used to those structures and setting up our new accountability frameworks. But I think, in the longer term, the model proposed in the legislation is well proven across other regulators, both domestically and internationally.

Stephanie Peacock

Q I would like to ask about the independence of the ICO as it stands. Do you have any experience of being directed by the Secretary of State in a way that has threatened the regulator’s impartial position?

John Edwards: No, I do not.

Stephanie Peacock

Q If the Bill is passed in its current form, the Secretary of State—whoever that might be—will have the ability to approve and veto statutory codes of practice produced by the commission, as well as to set out a statement of strategic priorities to which the commission will have to adhere. Do you perceive that having any impact on your organisation’s ability to act independently of political direction?

John Edwards: No, I do not believe it will undermine our independence at all. What I think it will do is to further enhance and promote our accountability, which is very important.

To take your first challenge, about codes of conduct, we worked closely with the Department for Digital, Culture, Media and Sport and subsequently the Department for Science, Innovation and Technology to ensure that we got the appropriate balance between the independence of the commission and the right of the Executive and Parliament to oversee what is essentially delegated lawmaking. I think we have got there. It is not a right to veto out of hand; there is a clear process of transparency, which would require the Secretary of State, in the event that he or she decided not to publish a statutory code that we had recommended, to publish their reasons, and those would be available to the House. I do think there is an appropriate level of parliamentary and Executive oversight of what is, as I say, essentially a lawmaking function on the part of the commission.

Stephanie Peacock

Q If the Secretary of State can veto a code of practice that the commission has produced regarding the activities of Government, will that not mean that they are, effectively, marking their own homework?

John Edwards: I do not believe so. The code of practice would be statutory—it is only the most serious statutory guidance that we would issue, not the day-to-day opinions that we have of the way in which the law operates. But, also, it is a reflection of the commissioner’s view of the law, and a statement as to how he or she will interpret and apply the very general principles. A failure of the Secretary of State to table and issue a proposed code would not affect the way in which the commissioner discharges his or her enforcement functions. We would still be able to investigate matters and find them in breach, regardless of whether that finding was consistent with the Secretary of State’s view of the law.

Stephanie Peacock

Q I will come on to a slightly different topic now. The ICO will play a huge role in enforcing the measures in the Bill. Is there enough clarity in the Bill to ensure that the commission is able to do that effectively? For example, are you clear on how the commission will enforce the law surrounding terms like “vexatious” and “excessive” with regards to subject access requests?

John Edwards: Yes. We are in the business of statutory interpretation. We are given a law by Parliament. A term like “vexatious” has a considerable provenance and jurisprudence; it is one that I worked with back home in New Zealand. So, yes, I am quite confident that we will be able to apply those.

Stephanie Peacock

Q Linked to that, what about terms like “meaningful human involvement” and “significant decision” with regards to automated decision making?

John Edwards: Sorry, what is your question?

Stephanie Peacock

Parts of the Bill refer to there being “meaningful human involvement” and “significant decisions” within automated decision making. That might be in an application for a mortgage or in certain parts of employment. Do you feel that you can interpret those words effectively?

John Edwards: Yes, of course. You are quite right to point out that those phrases are capable of numerous different interpretations. It will be incumbent on my office to issue guidance to provide clarity. There are phrases in the legislation that Parliament could perhaps look at providing clearer criteria on to assist us in that process of issuing guidance—here I am particularly thinking of the phrase “high risk” activities. That is a new standard, which will dictate whether some of the measures apply.

Stephanie Peacock

That is useful. Thank you.

Damian Collins

Q Continuing with that theme, the Bill uses a broader definition of “recognised legitimate interests” for data controllers. How do you think the Bill will change the regime for businesses? What sort of things might they argue they should be able to do under the Bill that they cannot do now?

John Edwards: There is an argument that there is nothing under the Bill that they cannot do now, but it does respond to a perception that there is a lack of clarity and certainty about the scope of legitimate interests, and it is a legitimate activity of lawmakers to respond to such perceptions. The provision will allow doubt to be taken out of the economy in respect of aspects such as, “Is maintaining the security of my system a legitimate interest in using this data?” Uncertainty in law is very inefficient—it causes people to seek legal opinions and expend resources away from their primary activity—so the more uncertainty we can take out of the legislation, the greater the efficiency of the regulation. We have a role in that at the Information Commissioner’s Office and you as lawmakers have just as important a role.

--- Later in debate ---
The Chair

Will Eduardo Ustaran please introduce himself? Can you hear us, Mr Ustaran? No. Can you hear us, Bojana Bellamy? No. Okay, we will start with our witness who has been kind enough to join us in the room.

Stephanie Peacock

Q Welcome. Vivienne, would you be in favour of implementing a smart data regime in your industry? If so, why?

Vivienne Artz: Yes, we are interested in implementing a smart data regime because it will allow broader access to data for innovation, particularly in the context of open banking and open finance. It would require access to information, which can often be limited at the moment. There is a lot of concern from businesses around whether or not they can actually access data. Some clarification on what that means, in respect of information that is not necessarily sensitive and can be used for the public good, would be most welcome. Currently, the provisions in the legislation are pretty broad, so it is difficult to see what it will look like, but in theory we are absolutely in favour.

Stephanie Peacock

Q Could you give more detail on who you think would benefit or lose out, and in what ways?

Vivienne Artz: Consumers would absolutely benefit, and that is where our priority needs to be—with individuals. It is an opportunity for them to leverage the opportunities that the data can provide. It will enable innovators to produce more products and services that will help individuals to better understand their financial and personal circumstances, particularly in the context of utility bills and so on. There are a number of positive use cases. There is obviously always the possibility that data can be misused, but I am a great advocate of saying that we need to find the positive use cases and allow business to support society and our consumers to the fullest extent. That is what we need to support.

Stephanie Peacock

Q Brilliant. What are your thoughts on giving the Secretary of State the power to amend data protection legislation further? Do you think it is necessary to future-proof the Bill?

Vivienne Artz: It is necessary to future-proof the Bill. We are seeing such an incredible speed of innovation and change, particularly with regard to generative artificial intelligence. We need to make sure that the legislation remains technology-neutral and can keep up to date with the changes that are currently taking place.

Stephanie Peacock

I have more questions if our other witnesses are with us.

The Chair

We still have not heard definitively whether our other guests can hear us or speak to us, so we are waiting for confirmation from the tech people. In the meantime, I invite the Minister to question Vivienne Artz.

--- Later in debate ---
The Chair

Thank you. Chi Onwurah and Damian Collins are lined up to ask questions, but I want first to ask the shadow Minister whether she has any further questions, followed by the Minister. Because we have one witness in the room and two online, please will whoever is asking the question indicate whom you are asking it of?

Stephanie Peacock

Q Good morning to our guests joining us via Zoom. Ms Bellamy, in your opinion has it been difficult for businesses to adapt to the EU GDPR? If so, do you think the changes in the Bill will make it easier or harder for businesses to comply with data protection legislation?

Bojana Bellamy: Yes, certainly it has been hard to get businesses to comply with GDPR, in particular small and medium-sized businesses. I think the changes proposed in the Bill will make it easier, because it is more about outcomes-based regulation. It is more about being effective on the ground, as opposed to being prescriptive. GDPR is quite prescriptive and detailed. It tells you how to do things. In this new world of digital, that is not very helpful, because technology always goes in front of and faster than the rules.

In effect, what we see proposed in the Bill is more flexibility and more onus on organisations in both the public and private sector to deliver accountability and effective protection for people. It does not tell them and prescribe how exactly to do that, yet they are still accountable for the outcomes. From that perspective, it is a step forward. It is a better regime, in my opinion.

Stephanie Peacock

Q Mr Ustaran, what do you perceive the value of EU adequacy to be? What would be the consequences for your businesses and other businesses and the UK market of losing such an agreement?

Eduardo Ustaran: From the point of view of adequacy, it is fundamental to acknowledge that data flows between the UK and the EU and the EU and the UK are essential for global commerce and for our digital existence. Adequacy is an extremely valuable element of the way in which the current data protection regime works across both the EU and the UK.

It is really important to note at the outset that the changes being proposed to the UK framework are extremely unlikely to affect that adequacy determination by the EU, in the same way that if the EU were to make the same changes to the EU GDPR, the UK would be very unlikely to change the adequacy determination of the EU. It is important to appreciate that these changes do not affect the essence of UK data protection law, and therefore the adequacy that is based on that essence would not be affected.

Stephanie Peacock

Q You have answered my next question—thank you—but I will pose it to the other witnesses, who may have something to add. In the previous session, the Information Commissioner said that he did not think the Bill was a threat to adequacy. That is comforting, but it is not confirmation, because the only body with the power to decide whether adequacy stands is the European Commission. Do you think any of the measures in the Bill pose a risk to the adequacy agreement?

Bojana Bellamy: I certainly agree that adequacy is a political decision. In many ways—you have seen this with the Northern Ireland protocol—some of these decisions are made for different purposes. I do not believe there are elements of the Bill that would reduce adequacy; if anything, the Bill is very well balanced. Let me give you some examples of where I think the Bill goes beyond GDPR: certainly, on expectations of accountability on the senior responsible individual, which actually delivers better oversight and leadership over privacy; on the right to complain to an organisation and on organisations to respond to these complaints; and on the strong and effective Information Commissioner, who actually has more power. The regulator is smarter; that, again, is better than GDPR. There are also the safeguards that exist for scientific research and similar purposes, as well as some other detailed ones.

Yes, you will see, and you have seen in public projects as well, that there are people who are worried about the erosion of rights, but I do not believe that the exceptions to subject access requests and the other rights we talked about are actually a real erosion. I think it just clarifies what has been the law. Some of the requirements to simplify privacy impact assessment and records of processing will, in fact, deliver better accountability in practice. They are still there; they are just not as prescriptive. The Information Commissioner has strong powers; it is a robust regulator, and I do not believe its independence will be dented by this Bill. I say to those who think that we are reducing the level of protection that, actually, the balance of all the rules is going to be essential equivalence with the EU. That is really what is important.

May I say one more thing quickly? We have seen the EU make adequacy decisions regarding countries such as Japan and Korea, and even Privacy Shield. Even in these cases, you have not had a situation where the requirements were essentially equivalent. These laws are still different from GDPR—they do not have the right of portability or the concept of automated decision making—but they are still found to be adequate. That is why I really do not believe that this is a threat. One thing we have to keep absolutely clear and on par with the EU is Government access to data for national security and intelligence purposes. That is something the EU will be very interested in to ensure that that is not where the bar goes down, but there is no reason to believe so and there is nothing in the Bill to tell us so.

Vivienne Artz: I concur; I do not think the Bill poses any threat to adequacy with the EU. With regard to the national security issue that Bojana raises, I would also point out that the UN rapporteur noted that the UK has better protections for Government access to data than many EU member states, where it is often a very political approach as opposed to a practical approach and really looking at what the outcomes are. There is nothing in this Bill that would jeopardise adequacy with the EU.

The Chair

We have 12 minutes left and two Members are indicating that they wish to ask questions after you, Minister.

--- Later in debate ---
The Chair

I apologise for getting your surname pronunciation wrong, Mr Combemale.

Chris Combemale: That’s okay, it happens all the time. It is actually of French heritage, rather than Italian.

Stephanie Peacock

Q Welcome to the witnesses. techUK’s response to the withdrawn Bill last autumn stated that it

“could go further in seeking the full benefits of data driven innovation”.

Does this amended Bill go further?

Neil Ross: Yes, it does. If we go back to the statement of the Information Commissioner earlier, the most important part of the legislation is to provide increased clarity on how we can use data. I think there were about 3,000 responses to the consultation, and the vast majority—particularly around the scientific research and the legitimate interest provisions—focused on providing that extra level of clarity. What the Government have done is quite clever, in that they have lifted examples from the recitals—recital 157, as well as those related to legitimate interests—to give additional clarity on the face of the Bill, so that we can take a much more innovative approach to data management and use in the UK, while still maintaining that within the broad umbrella of what it means to qualify for EU adequacy.

Stephanie Peacock

Q How have your members found adapting to GDPR? Will the Bill make it easier or harder for those that you represent to comply?

Neil Ross: Most tech companies have adapted to GDPR. It is now a common global standard. The Bill makes the compliance burden a little easier to use, allows us to be a little more flexible in interpretation of it and will give companies much more certainty when taking decisions about data use.

One really good example is fraud. Online fraud is a massive problem in the UK and the Government have a strategy to deal with it, so having that legitimate interest that focuses on crime prevention—also those further processing rights around compliance with the law—means that we can be much more innovative and adaptive about how we share and process data to protect against and prevent fraud. That will be absolutely vital in addressing the shared objective that we all have to reduce online fraud.

Stephanie Peacock

Q On the changes to requirements to report suspicious activity related to unsolicited direct marketing, do the telecoms companies among your members have the technical capability to identify instances of mass unsolicited direct marketing in order to report as required?

Neil Ross: No. That is one area where we think further work is needed in the Bill. I think you are referring to clause 85. When we responded to the consultation, we said that the Government should try to create equivalence between the private communications requirements and the GDPR to give that extra level of flex. By not doing that and by not setting out specific cases of where telecoms companies have to identify unsolicited calls, the Government are being really unfair in what they are asking them to do. We have had concerns raised by a range of companies, both large and small, that they might not have the technical capability and that they will have to set up new systems to do it. Overall, we think that the Bill makes a bit of a misstep here and that we need to clarify exactly how it will work. techUK and some of my colleagues will be suggesting to the Committee some legal amendments for how to do that.

Stephanie Peacock

Q On that point, do the telecoms companies feel that they have been consulted properly in the making of the legislation?

Neil Ross: No, not on that clause, but yes in relation to the rest of the legislation.

Stephanie Peacock

Q I was asking about that. Chris, will the changes to the cookies set out in the Bill benefit, first, the consumer experience and, secondly, your members or businesses?

Chris Combemale: Yes. First, on the consumer experience, I think that we all recognise that the pop-up consent banners for cookies are generally ticked as a matter of course by consumers who really want to go about their business and get to the website that they want to do business on. In a way, it is not genuine consent, because people are not really thinking deeply about it.

In terms of business, a number of the cookies, which are really identifiers that help you understand what people are doing on your website, are used just on a first-party basis by websites, such as e-commerce websites and business-to-business websites, to understand the basic operational aspects and statistical measurement of how many people are going to which pages. Those are websites that do not take any advertising and do not share any data with third parties, so the exemptions in the Bill generally would make those types of companies no longer need cookie banners while providing no risk to the customers, because the company uses the cookies purely to understand the behaviours of its own website traffic and its own customers. In that sense, we strongly support the provisions and the exemptions in the Bill.

Stephanie Peacock

Q Is the technology available to centralise cookies by browser?

Chris Combemale: I think it can be eventually, but we oppose those provisions in the Bill, because they create a market imbalance and give control as a gateway to large companies that manage browser technology, at the expense of media owners and publishers that are paying journalists and investing in content. It is vital above all else that media owners are able to develop first-party relationships with their audiences and customers to better understand what they need. If anything, we need more control in the hands of the people who invest in creating the content and in paying the journalists who provide those important democratic functions.

Stephanie Peacock

Q Is there a concern that centralising cookies by browser will entrench power in the hands of the larger tech companies that own the browsers?

Chris Combemale: It certainly would give even greater market control to those companies.

Stephanie Peacock

Q Is the risk in centralising cookies by browser that we could confuse liability, for example who is responsible for a breach of cookie regulation?

Chris Combemale: I think it could be. For us, the essential principle is that a business, whether a media owner, e-commerce business or publishing business, should have control of the relationships between its products and services and its customers and prospects for its customers. By nature, when you give control to a third party, whether a large tech company or another company, you are getting in between people and the organisations that they want to do business with and giving control to an intermediary who may not understand. At the very least, if you register with a website after, for instance, changing your browser setting, that should take precedence over the browser setting: your choice to engage with a particular company should always take precedence over a centralised cookie management system.

Neil Ross: I think that what the Government have done in relation to this is quite clever: they have said that their objective is to have a centralised system in the future, but they have recognised that there are a number of different ongoing legislative and regulatory activities that have a significant bearing on that. I think it was only last week that the Government introduced the Digital Markets, Competition and Consumers Bill, clause 20 of which—on conduct requirements—would play a large role in whether you could set up a centralised system, so there is an element of co-ordinating two different but ongoing regulatory regimes. I think we agree with Chris that the steps on analytical cookies now are good but that we need to have a lot more deep thought about what a centralised system may or may not look like and whether we want to go ahead with it.

Chris Combemale: May I come in on that final point? What makes sense to us is a centralised system for managing opt-outs as opposed to managing consent. As the Data and Marketing Association, we operate the telephone preference service and the mailing preference service, which give consumers the opportunity to opt out from receiving unwanted cold calls or unwanted direct mail. There is already a system in place with digital advertising—an icon that people can use to opt out from the use of personal data for personalising digital ads. I think it makes sense that, if people do not want to receive certain things, they can opt out centrally, but a centralised consent opt-in gives too much control to the intermediaries.

Stephanie Peacock

Thank you.

Sir John Whittingdale

Q Mr Ross, I know that techUK has been supportive of a number of elements of the Bill, particularly around the opportunities created by the use of smart data. Will you set out your view of the opportunities, and how the Bill will help to attain them?

Neil Ross: Smart data is potentially a very powerful tool for increasing consumer choice, lowering prices and giving people access to a much broader range of services. The smart data provisions that the Government have introduced, as well as the Smart Data Council that they are leading, are really welcome. However, we need to go one step further and start to give people and industries clarity around where the Government will look first, in terms of what kind of smart data provisions they might look at and what kind of sectors they might go into. Ultimately, we need to make sure that businesses are well consulted and that there is a strong cost-benefit analysis. We then need to move ahead with the key sectors that we want to push forward on. Similarly to on nuisance calls, we will send some suggested text to the Committee to add those bits in, but it is a really welcome step forward.

--- Later in debate ---
The Chair

Welcome. Stephanie Peacock will start the questions.

Stephanie Peacock

Q Good morning. To go first to Dr Jeni Tennison, do you think the general public and workers have a good level of trust and understanding in terms of how their data is being used? What does the Bill do, if anything, to help build or improve on that trust and understanding?

Dr Tennison: Surveys and public attitudes polling show that when you ask people about their opinions around the use of data, they have a good understanding about the ways in which it is going wrong, and they have a good understanding about the kinds of protections that they would like to see. The levels of trust are not really there.

A poll from the Open Data Institute, for example, shows that only 30% trust the Government to use data ethically. CDEI has described this as “tenuous trust” and highlighted that about 70% of the public think that the tech sector is insufficiently regulated. I do not think that the Bill addresses those issues of trust very well; in fact, it reduces the power individuals have and also the level of collective representation people can have, particularly in the work context. I think this will diminish trust in the way in which data is used.

Stephanie Peacock

Q Do you believe the Government have consulted the public and data subjects such as workers appropriately during the process of formulating the Bill?

Dr Tennison: Obviously, there was a strong consultation exercise around the data reform Bill, as it was then characterised. However, there are elements of this Bill, in particular the recognised legitimate interests that are listed, that have not had detailed public consultation or scrutiny. There are also not the kinds of provisions that we would like to see on ongoing consultation with the public on specific questions around data processing in the future.

Stephanie Peacock

Q What value do subject access requests hold for citizens, and how will changing the threshold for refusing a request or changing a request to “vexatious or excessive” impact citizens’ ability to exercise their rights?

Dr Tennison: Subject access requests are an important way in which citizens can work out what is happening within organisations with the data that is being held about them. There are already protections under UK GDPR against vexatious or excessive requests, and strengthening those as the Bill is doing is, I think, going to put off more citizens from making these kinds of requests.

It is worth noting that this is a specific design of the Bill. If you look at the impact assessment, this is where most of the cost to business is being saved; that is being done by refusing subject access requests. So I think we should be suspicious about what that looks like. Where we have been looking at the role of subject access requests in people exercising their rights, it is clear that that is a necessary step, and delays to or refusals of subject access requests would prevent people from exercising their rights.

We think that a better way of reducing subject access requests would be to have publication of things like the risk assessments that organisations have to do when there is high-risk processing—so that there is less suspicion on the part of data subjects and they do not make those requests in the first place.

Stephanie Peacock

Q Thank you. I have a couple of questions for Anna Thomas now. Do the current laws around automated decision making do enough to protect workers and citizens from harm?

Anna Thomas: Referring partly to our work in “Mind the gap” and “The Amazonian Era”, as well as the report by the all-party parliamentary group on the future of work about use of AI in the workplace, we would say no. The aim of the Bill—to simplify—is very good. But particular areas in the Bill as it stands—eroded somewhat—are particularly problematic in the workplace. The automated ones that you ask about are really important with regard to the reduction of human involvement. But in addition to that are the need to assess in advance what the risks and impacts are, the requirement for consultation, and the access to relevant information. Those are all relevant and overlap with the automated decision making requirement.

Stephanie Peacock

Q Linked to that, do you believe that the safeguards outlined in the Bill—having a right to human review, for example—are enough to protect workers from the potential harm of automated decision making?

Anna Thomas: Not in themselves. There is potential, in those areas, to correct that or to improve it in the course of the Bill’s proceedings, in order that the opportunities, as well as the risks, of putting this new Bill through Parliament are seized. But, no, because of the transformation of work and the extent of the impact, as well as the risks, that new technologies and automated technologies are having across work, not just on access to work, but on terms, conditions, nature, quality and models for work, the safeguards—there is, I think, increasing cross-party consensus about this—should be, in those areas, moving in the other direction.

Stephanie Peacock

Q My final question is to Michael. Do you believe that the current regulation does enough to govern the use of biometric technologies?

Michael Birtwistle: No, we would say that it does not. The Ada Lovelace Institute published a couple of reports last year on the use of biometric data, arguing for a much stronger and coherent regulatory governance framework for biometric technologies. These are a set of technologies that are incredibly personal. We are used to their being talked about in terms of our faces or fingerprints, but actually it is a much wider range, involving any measurement to do with the human body, which can be used in emotional analysis—walking style or gait, your tone of voice or even your typing style. There is also a set of incoming, next-generation AI technologies that rely quite heavily on biometrics, so there is a question about future-proofing the Bill.

We have made two broad proposals. One is to increase the capability of the Information Commissioner’s Office to look specifically at biometrics—for example, to create and maintain a public register of private entities engaging in processing of biometric data, to have a proper complaints procedure, to publish annual reports and so on. There is a set of issues around increasing the capability of our institutions to deal with that.

Then there is a second question about scope. First, the current focus of biometric data and definition is on identifiability of personal data. There are many potentially problematic use cases of biometric data that do not need to know who you are in order to make a decision about you. We think it would be wise and would future-proof the regulation of this powerful technology to also include classification or categorisation as the purpose of those biometric technologies.