Data Protection and Digital Information (No. 2) Bill Debate

Full Debate: Read Full Debate
Department: Department for Science, Innovation & Technology

Data Protection and Digital Information (No. 2) Bill

Stephanie Peacock Excerpts
2nd reading
Monday 17th April 2023

(1 year ago)

Commons Chamber
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Watch Debate Read Debate Ministerial Extracts
Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- View Speech - Hansard - -

I would like to add my best wishes to the Minister and the Secretary of State on their imminent arrivals.

We are in the midst of a tech revolution, and right at the centre of this is data. From social media and online shopping to the digitisation of public services, the rate at which data is being collected, processed and shared is multiplying by the minute. This new wealth of data holds great potential for innovation, boosting economic growth and improving the delivery of public services. The aims of the Bill to unlock the economic and societal benefits of data while ensuring strong, future-proofed privacy rights are therefore ones that we support. We welcome, for example, provisions to modernise the ICO structure, and we support provisions for the new smart data regimes, so long as there are clear requirements for impact assessments.

However, the Bill in its current form does not go far enough in actually achieving its aims. Its narrow approach and lack of clarity render it a missed opportunity to implement a truly innovative and progressive data regime. Indeed, in its current form many clarifications will be needed to reassure the public that their rights will not be weakened by the Bill while sweeping powers are awarded to the Secretary of State. Currently, solely automated processing is defined by the Bill as one having “no meaningful human involvement” that results in a “significant decision”, with the Secretary of State trusted with powers to amend what counts within this definition. The lack of detail on the boundaries of such definitions as well as their ability to change over time have concerned the likes of the Ada Lovelace Institute and the TUC.

The Chair of the Business, Energy and Industrial Strategy Committee, my hon. Friend the Member for Bristol North West (Darren Jones), outlined in his powerful speech the power imbalance between big tech and the people, which is an important insight and a challenge for us in this House. Indeed, just this month Uber was found to have violated the rights of three UK-based drivers by firing them without appeal on the basis of fraudulent activity picked up by its automated decision-making system. In its judgment, the court found that the limited human intervention in Uber’s automated decision process was not

“much more than a purely symbolic act”.

This case and the justice the drivers received therefore explicitly relied on current legislation in the form of article 22 of the UK GDPR, and a clear understanding of what constitutes meaningful human involvement. Without providing clear boundaries for defining significant decisions and meaningful human involvement, this Bill therefore risks removing the exact rights that won this case and creating an environment where vital safeguards, such as the right to contest automated decisions and request human intervention, could easily become exempt from applying at the whim of the Secretary of State. This must be resolved, and the public must be reassured that they will not be denied a job, mortgage or visa by an algorithm without a method of redress.

There is also a lack of clarity around how rules allowing organisations to charge a fee or refuse subject access requests deemed “vexatious” and “excessive” will work, as the likes of Which? and the Public Law Project have argued and which my hon. Friend the Member for Cambridge (Daniel Zeichner) highlighted. Indeed, if the list of circumstances where these terms might be met is non-exhaustive, what safeguards will be in place to stop controllers from abusing this, deciding that any request they dislike is vexatious? Organisations should absolutely be supported in directing resources to good faith requests, but we must be careful to ensure that any new limits are protected against abuse.

Reform of the responsibilities of the Information Commissioner’s Office is another area in need of analysis. Indeed, more than evolving its structure, the Bill gives the Secretary of State power to set the strategic priorities of the regulator and approve codes of practice. This has sparked concern across the spectrum of stakeholders, from the Open Rights Group to techUK, over what it means for the regulator’s independence. Given these new powers, particularly in cases where guidance addresses the activity of the Government, how can Ministers assure us that a Secretary of State will not be marking their own homework?

Whether it is the Secretary of State being able to amend the “recognised legitimate interests” list or the removal of the requirement for consultation on impact assessment, this same theme is echoed throughout the Bill, which was raised by the hon. Member for Oxford West and Abingdon (Layla Moran). Without additional guidance and clear examples of how definitions apply, it is hard to grasp the full extent of the consequences of these new measures, especially given the sweeping powers of the Secretary of State to make further changes. We will look to ensure that this clarity is included in the Bill, so that everyone can be assured of their rights and of a truly independent regulator. We must also ensure that children are protected by the Bill and that the age-appropriate design code is not compromised, as raised by the hon. Member for Folkestone and Hythe (Damian Collins) and others across the House.

Clarity on the new regime is also vital for reassuring businesses who still have fears around losing EU adequacy, something raised throughout this debate and which the former Secretary of State the right hon. Member for Maldon (Sir John Whittingdale) outlined in his contribution. The Government have said that they recognise that losing adequacy would be disastrous, costing up to £460 million as a one-off and £410 million every year afterwards. Ministers have rightly rowed back on many of the more concerning suggestions from their consultation, but they must be absolutely clear on how they are sure that the measures in the Bill, particularly those that toy with the regulator’s independence and give Ministers power to create further change, will not threaten adequacy.

Having already made significant adjustments to comply with UK GDPR, the changes in the Bill must also be careful not to create further uncertainty for businesses. Indeed, although Ministers say that anyone who abides by the current rules will still be compliant after the passing of the Bill, organisations will still have to do their own legal due diligence to understand how, if at all, this set of amendments impacts them. It would therefore be good to hear from Ministers on how they plan to ensure that businesses, particularly small and medium-sized enterprises, are supported in understanding the requirements on them.

We understand the Government’s attempts to future-proof this legislation, and it would be great to see an end to constant cookie banners or nuisance calls, which the hon. Member for Aberconwy (Robin Millar) referenced, but the measures in the Bill rely on technology that does not currently operationally exist. In the case of browser-enabled cookie models, there is also the concern that this may entrench power in the hands of existing tech giants and muddy the waters on liability. We must be careful, therefore, to ensure that businesses can actually implement what the Bill requires.

Ultimately, with the exception of the section on smart data, this Bill chooses to take a very narrow view of what an innovative data regime could look like. In the context of a rapidly changing world, this Bill was a great opportunity to really consider how we can get data working in better interests, like those of the general public or small businesses. Labour would have used a Bill like this to, for example, examine how data can empower communities and collective groups such as workers in industries who have long felt that they have been on the wrong end of automated decision-making as well as the automation of jobs.

We would also have sought to improve public trust and understanding in how our data is used, particularly since the willingness to share data has been eroded after the likes of the Cambridge Analytica scandal, the NHS data opt-out, and the exam algorithm scandal, which disproportionately affected my constituents in Barnsley. As it stands, however, the Bill seems only to consider data rights when they emerge as a side product of making changes to rules for processors. Data rights and data protection have wide-ranging consequences across society, as the hon. Member for Strangford (Jim Shannon) discussed. Labour would have used this as an opportunity to look at the larger picture of data ownership. Deregulation measures such as those in the Bill might mean less work for some small businesses, but as long as a disproportionate amount of data is held by a limited number of firms, they will still be at a large competitive disadvantage. From introducing methods of collective redress to nurturing privacy-enhancing technologies, there are many positive opportunities a progressive data Bill could have explored to put our country at the forefront of innovation while genuinely strengthening rights and trust for the modern era, but the Government have missed this opportunity.

Overall, we can all agree on unlocking innovation through data while ensuring data subjects have the rights and trust they fundamentally deserve. However, there are many areas for clarity and improvement if this Bill is to match the bold vision required to truly be at the forefront of data use and data protection. I look forward to working closely with Ministers in the coming months towards legislation that better fulfils these aims.

Data Protection and Digital Information (No. 2) Bill (First sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (First sitting)

Stephanie Peacock Excerpts
Committee stage
Wednesday 10th May 2023

(11 months, 4 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 10 May 2023 - (10 May 2023)
Mark Eastwood Portrait Mark Eastwood (Dewsbury) (Con)
- Hansard - - - Excerpts

Given that one of today’s witnesses is from Prospect, I wish to declare that I am a member of that union.

Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

I am a proud member of a trade union. I refer the Committee to my entry in the Register of Members’ Financial Interests.

Chi Onwurah Portrait Chi Onwurah (Newcastle upon Tyne Central) (Lab)
- Hansard - - - Excerpts

I am a proud member of two trade unions.

--- Later in debate ---
None Portrait The Chair
- Hansard -

May I gently say to the witnesses that this is a big room, so you will need to project your voices so that we can hear your evidence?

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Good morning and welcome. The Bill creates a new body corporate to replace the corporation sole. What impact, both in the short and long term, do you think that will have on its ability to carry out its functions?

John Edwards: The corporation sole model is fit for a number of purposes. That was the structure that I had back home in New Zealand. For an organisation such as the Information Commissioner’s Office, it is starting to buckle under the weight. It will benefit, I think, from the support of a formal board structure, with colleagues with different areas of expertise appointed to ensure that we bring an economy-wide perspective to our role, which as we have heard from the declarations of interest spans almost every aspect of human activity.

There will be some short-term, transitional challenges as we make the transition from a corporation sole to a board structure. We will need to employ a chief executive, for example, as well as getting used to those structures and setting up our new accountability frameworks. But I think, in the longer term, the model proposed in the legislation is well proven across other regulators, both domestically and internationally.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q I would like to ask about the independence of the ICO as it stands. Do you have any experience of being directed by the Secretary of State in a way that has threatened the regulator’s impartial position?

John Edwards: No, I do not.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q If the Bill is passed in its current form, the Secretary of State—whoever that might be—will have the ability to approve and veto statutory codes of practice produced by the commission, as well as to set out a statement of strategic priorities to which the commission will have to adhere. Do you perceive that having any impact on your organisation’s ability to act independently of political direction?

John Edwards: No, I do not believe it will undermine our independence at all. What I think it will do is to further enhance and promote our accountability, which is very important.

To take your first challenge, about codes of conduct, we worked closely with the Department for Digital, Culture, Media and Sport and subsequently the Department for Science, Innovation and Technology to ensure that we got the appropriate balance between the independence of the commission with the right of the Executive and Parliament to oversee what is essentially delegated lawmaking. I think we have got there. It is not a right to veto out of hand; there is a clear process of transparency, which would require the Secretary of State, in the event that he or she decided not to publish a statutory code that we had recommended, to publish their reasons, and those would be available to the House. I do think there is an appropriate level of parliamentary and Executive oversight of what is, as I say, essentially a lawmaking function on the part of the commission.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q If the Secretary of State can veto a code of practice that the commission has produced regarding the activities of Government, will that not mean that they are, effectively, marking their own homework?

John Edwards: I do not believe so. The code of practice would be statutory—it is only the most serious statutory guidance that we would issue, not the day-to-day opinions that we have of the way in which the law operates. But, also, it is a reflection of the commissioner’s view of the law, and a statement as to how he or she will interpret and apply the very general principles. A failure of the Secretary of State to table and issue a proposed code would not affect the way in which the commissioner discharges his or her enforcement functions. We would still be able to investigate matters and find them in breach, regardless of whether that finding was consistent with the Secretary of State’s view of the law.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q I will come on to a slightly different topic now. The ICO will play a huge role in enforcing the measures in the Bill. Is there enough clarity in the Bill to ensure that the commission is able to do that effectively? For example, are you clear on how the commission will enforce the law surrounding terms like “vexatious” and “excessive” with regards to subject access requests?

John Edwards: Yes. We are in the business of statutory interpretation. We are given a law by Parliament. A term like “vexatious” has a considerable provenance and jurisprudence; it is one that I worked with back home in New Zealand. So, yes, I am quite confident that we will be able to apply those.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Linked to that, what about terms like “meaningful human involvement” and “significant decision” with regards to automated decision making?

John Edwards: Sorry, what is your question?

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Parts of the Bill refer to there being “meaningful human involvement” and “significant decisions” within automated decision making. That might be in an application for a mortgage or in certain parts of employment. Do you feel that you can interpret those words effectively?

John Edwards: Yes, of course. You are quite right to point out that those phrases are capable of numerous different interpretations. It will be incumbent on my office to issue guidance to provide clarity. There are phrases in the legislation that Parliament could perhaps look at providing clearer criteria on to assist us in that process of issuing guidance—here I am particularly thinking of the phrase “high risk” activities. That is a new standard, which will dictate whether some of the measures apply.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

That is useful. Thank you.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Q Continuing with that theme, the Bill uses a broader definition of “recognised legitimate interests” for data controllers. How do you think the Bill will change the regime for businesses? What sort of things might they argue they should be able to do under the Bill that they cannot do now?

John Edwards: There is an argument that there is nothing under the Bill that they cannot do now, but it does respond to a perception that there is a lack of clarity and certainty about the scope of legitimate interests, and it is a legitimate activity of lawmakers to respond to such perceptions. The provision will allow doubt to be taken out of the economy in respect of aspects such as, “Is maintaining the security of my system a legitimate interest in using this data?” Uncertainty in law is very inefficient—it causes people to seek legal opinions and expend resources away from their primary activity—so the more uncertainty we can take out of the legislation, the greater the efficiency of the regulation. We have a role in that at the Information Commissioner’s Office and you as lawmakers have just as important a role.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Will Eduardo Ustaran please introduce himself? Can you hear us, Mr Ustaran? No. Can you hear us, Bojana Bellamy? No. Okay, we will start with our witness who has been kind enough to join us in the room.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Welcome. Vivienne, would you be in favour of implementing a smart data regime in your industry? If so, why?

Vivienne Artz: Yes, we are interested in implementing a smart data regime because it will allow broader access to data for innovation, particularly in the context of open banking and open finance. It would require access to information, which can often be limited at the moment. There is a lot of concern from businesses around whether or not they can actually access data. Some clarification on what that means, in respect of information that is not necessarily sensitive and can be used for the public good, would be most welcome. Currently, the provisions in the legislation are pretty broad, so it is difficult to see what it will look like, but in theory we are absolutely in favour.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Could you give more detail on who you think would benefit or lose out, and in what ways?

Vivienne Artz: Consumers would absolutely benefit, and that is where our priority needs to be—with individuals. It is an opportunity for them to leverage the opportunities that the data can provide. It will enable innovators to produce more products and services that will help individuals to better understand their financial and personal circumstances, particularly in the context of utility bills and so on. There are a number of positive use cases. There is obviously always the possibility that data can be misused, but I am a great advocate of saying that we need to find the positive use cases and allow business to support society and our consumers to the fullest extent. That is what we need to support.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Brilliant. What are your thoughts on giving the Secretary of State the power to amendment data protection legislation further? Do you think it is necessary to future-proof the Bill?

Vivienne Artz: It is necessary to future-proof the Bill. We are seeing such an incredible speed of innovation and change, particularly with regard to generative artificial intelligence. We need to make sure that the legislation remains technology-neutral and can keep up to date with the changes that are currently taking place.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I have more questions if our other witnesses are with us.

None Portrait The Chair
- Hansard -

We still have not heard definitively whether our other guests can hear us or speak to us, so we are waiting for confirmation from the tech people. In the meantime, I invite the Minister to question Vivienne Artz.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. Chi Onwurah and Damian Collins are lined up to ask questions, but I want first to ask the shadow Minister whether she has any further questions, followed by the Minister. Because we have one witness in the room and two online, please will whoever is asking the question indicate whom you are asking it of?

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Good morning to our guests joining us via Zoom. Ms Bellamy, in your opinion has it been difficult for businesses to adapt to the EU GDPR? If so, do you think the changes in the Bill will make it easier or harder for businesses to comply with data protection legislation?

Bojana Bellamy: Yes, certainly it has been hard to get businesses to comply with GDPR, in particular small and medium-sized businesses. I think the changes proposed in the Bill will make it easier, because it is more about outcomes-based regulation. It is more about being effective on the ground, as opposed to being prescriptive. GDPR is quite prescriptive and detailed. It tells you how to do things. In this new world of digital, that is not very helpful, because technology always goes in front of and faster than the rules.

In effect, what we see proposed in the Bill is more flexibility and more onus on organisations in both the public and private sector to deliver accountability and effective protection for people. It does not tell them and prescribe how exactly to do that, yet they are still accountable for the outcomes. From that perspective, it is a step forward. It is a better regime, in my opinion.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Mr Ustaran, what do you perceive the value of EU adequacy to be? What would be the consequences for your businesses and other businesses and the UK market of losing such an agreement?

Eduardo Ustaran: From the point of view of adequacy, it is fundamental to acknowledge that data flows between the UK and the EU and the EU and the UK are essential for global commerce and for our digital existence. Adequacy is an extremely valuable element of the way in which the current data protection regime works across both the EU and the UK.

It is really important to note at the outset that the changes being proposed to the UK framework are extremely unlikely to affect that adequacy determination by the EU, in the same way that if the EU were to make the same changes to the EU GDPR, the UK would be very unlikely to change the adequacy determination of the EU. It is important to appreciate that these changes do not affect the essence of UK data protection law, and therefore the adequacy that is based on that essence would not be affected.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q You have answered my next question—thank you—but I will pose it to the other witnesses, who may have something to add. In the previous session, the Information Commissioner said that he did not think the Bill was a threat to adequacy. That is comforting, but it is not confirmation, because the only people who have the power to decide whether adequacy stands are the European Commission. Do you think any of the measures in the Bill pose a risk to the adequacy agreement?

Bojana Bellamy: I certainly agree that adequacy is a political decision. In many ways—you have seen this with the Northern Ireland protocol—some of these decisions are made for different purposes. I do not believe there are elements of the Bill that would reduce adequacy; if anything, the Bill is very well balanced. Let me give you some examples of where I think the Bill goes beyond GDPR: certainly, on expectations of accountability on the senior responsible individual, which actually delivers better oversight and leadership over privacy; on the right to complain to an organisation and on organisations to respond to these complaints; and on the strong and effective Information Commissioner, who actually has more power. The regulator is smarter; that, again, is better than GDPR. There are also the safeguards that exist for scientific research and similar purposes, as well as some other detailed ones.

Yes, you will see, and you have seen in public projects as well, that there are people who are worried about the erosion of rights, but I do not believe that exception to subject access requests and other rights we talked about are actually a real erosion. I think it just clarifies what has been the law. Some of the requirements to simplify privacy impact assessment and records of processing will, in fact, deliver better accountability in practice. They are still there; they are just not as prescriptive. The Information Commissioner has strong powers; it is a robust regulator, and I do not believe its independence will be dented by this Bill. I say to those who think that we are reducing the level of protection that, actually, the balance of all the rules is going to be essential equivalency to the EU. That is really what is important.

May I say one more thing quickly? We have seen the EU make adequacy decisions regarding countries such as Japan and Korea, and even privacy shield. Even in these cases, you have not had a situation where the requirements were essentially equivalent. These laws are still different from GDPR—they do not have the right of portability or the concept of automated decision making—but they are still found to be adequate. That is why I really do not believe that this is a threat. One thing we have to keep absolutely clear and on par with the EU is Government access to data for national security and intelligence purposes. That is something the EU will be very interested in to ensure that that is not where the bar goes down, but there is no reason to believe so and there is nothing in the Bill to tell us so.

Vivienne Artz: I concur; I do not think the Bill poses any threat to adequacy with the EU. With regard to the national security issue that Bojana raises, I would also point out that the UN rapporteur noted that the UK has better protections for Government access to data than many EU member states, where it is often a very political approach as opposed to a practical approach and really looking at what the outcomes are. There is nothing in this Bill that would jeopardise adequacy with the EU.

None Portrait The Chair
- Hansard -

We have 12 minutes left and two Members are indicating that they wish to ask questions after you, Minister.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I apologise for getting your surname pronunciation wrong, Mr Combemale.

Chris Combemale: That’s okay, it happens all the time. It is actually of French heritage, rather than Italian.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Welcome to the witnesses. TechUK’s response to the withdrawn Bill last autumn stated that it

“could go further in seeking the full benefits of data driven innovation”.

Does this amended Bill go further?

Neil Ross: Yes, it does. If we go back to the statement of the Information Commissioner earlier, the most important part of the legislation is to provide increased clarity on how we can use data. I think there were about 3,000 responses to the consultation, and the vast majority—particularly around the scientific research and the legitimate interest provisions—focused on providing that extra level of clarity. What the Government have done is quite clever, in that they have lifted examples from the recitals—recital 157, as well as those related to legitimate interests—to give additional clarity on the face of the Bill, so that we can take a much more innovative approach to data management and use in the UK, while still maintaining that within the broad umbrella of what means we qualify for EU adequacy.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q How have your members found adapting to GDPR? Will the Bill make it easier or harder for those that you represent to comply?

Neil Ross: Most tech companies have adapted to GDPR. It is now a common global standard. The Bill makes the compliance burden a little easier to use, allows us to be a little more flexible in interpretation of it and will give companies much more certainty when taking decisions about data use.

One really good example is fraud. Online fraud is a massive problem in the UK and the Government have a strategy to deal with it, so having that legitimate interest that focuses on crime prevention—also those further processing rights around compliance with the law—means that we can be much more innovative and adaptive about how we share and process data to protect against and prevent fraud. That will be absolutely vital in addressing the shared objective that we all have to reduce online fraud.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q On the changes to requirements to report suspicious activity related to unsolicited direct marketing, do the telecoms companies among your members have the technical capability to identify instances of mass unsolicited direct marketing in order to report as required?

Neil Ross: No. That is one area where we think further work is needed in the Bill. I think you are referring to clause 85. When we responded to the consultation, we said that the Government should try to create equivalence between the private communications requirements and the GDPR to give that extra level of flex. By not doing that and by not setting out specific cases of where telecoms companies have to identify unsolicited calls, the Government are being really unfair in what they are asking them to do. We have had concerns raised by a range of companies, both large and small, that they might not have the technical capability and that they will have to set up new systems to do it. Overall, we think that the Bill makes a bit of a misstep here and that we need to clarify exactly how it will work. TechUK and some of my colleagues will be suggesting to the Committee some legal amendments for how to do that.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q On that point, do the telecoms companies feel that they have been consulted properly in the making of the legislation?

Neil Ross: No, not on that clause, but yes in relation to the rest of the legislation.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q I was asking about that. Chris, will the changes to the cookies set out in the Bill benefit, first, the consumer experience and, secondly, your members or businesses?

Chris Combemale: Yes. First, on the consumer experience, I think that we all recognise that the pop-up consent banners for cookies are generally ticked as a matter of course by consumers who really want to go about their business and get to the website that they want to do business on. In a way, it is not genuine consent, because people are not really thinking deeply about it.

In terms of business, a number of the cookies, which are really identifiers that help you understand what people are doing on your website, are used just on a first-party basis by websites, such as e-commerce websites and business-to-business websites, to understand the basic operational aspects and statistical measurement of how many people are going to which pages. Those are websites that do not take any advertising and do not share any data with third parties, so the exemptions in the Bill generally would make those types of companies no longer need cookie banners while providing no risk to the customers, because the company uses the cookies purely to understand the behaviours of its own website traffic and its own customers. In that sense, we strongly support the provisions and the exemptions in the Bill.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Is the technology available to centralise cookies by browser?

Chris Combemale: I think it can be eventually, but we oppose those provisions in the Bill, because they create a market imbalance and give control as a gateway to large companies that manage browser technology, at the expense of media owners and publishers that are paying journalists and investing in content. It is incumbent upon all else that media owners are able to develop first-party relationships with their audiences and customers to better understand what they need. If anything, we need more control in the hands of the people who invest in creating the content and in paying the journalists who provide those important democratic functions.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Is there a concern that centralising cookies by browser will entrench power in the hands of the larger tech companies that own the browsers?

Chris Combemale: It certainly would give even greater market control to those companies.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Is the risk in centralising cookies by browser that we could confuse liability, for example who is responsible for a breach of cookie regulation?

Chris Combemale: I think it could be. For us, the essential principle is that a business, whether a media owner, e-commerce business or publishing business, should have control of the relationships between its products and services and its customers and prospects for its customers. By nature, when you give control to a third party, whether a large tech company or another company, you are getting in between the relationship between people and the organisations that they want to do business with and giving control to an intermediary who may not understand. At the least point, if you register with a website after, for instance, changing your browser setting, that should take precedence over the browser setting: your choice to engage with a particular company should always take precedence over a centralised cookie management system.

Neil Ross: I think that what the Government have done in relation to this is quite clever: they have said that their objective is to have a centralised system in the future, but they have recognised that there are a number of different ongoing legislative and regulatory activities that have a significant bearing on that. I think it was only last week that the Government introduced the Digital Markets, Competition and Consumers Bill, clause 20 of which—on conduct requirements—would play a large role in whether you could set up a centralised system, so there is an element of co-ordinating two different but ongoing regulatory regimes. I think we agree with Chris that the steps on analytical cookies now are good but that we need to have a lot more deep thought about what a centralised system may or may not look like and whether we want to go ahead with it.

Chris Combemale: May I come in on that final point? What makes sense to us is a centralised system for managing opt-outs as opposed to managing consent. As the Data and Marketing Association, we operate the telephone preference service and the mailing preference service, which give consumers the opportunity to opt out from receiving unwanted cold calls or unwanted direct mail. There is already a system in place with digital advertising—an icon that people can use to opt out from the use of personal data for personalising digital ads. I think it makes sense that, if people do not want to receive certain things, they can opt out centrally, but a centralised consent opt-in gives too much control to the intermediaries.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Thank you.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Q Mr Ross, I know that techUK has been supportive of a number of elements of the Bill, particularly around the opportunities created by the use of smart data. Will you set out your view of the opportunities, and how the Bill will help to attain them?

Neil Ross: Smart data is potentially a very powerful tool for increasing consumer choice, lowering prices and giving people access to a much broader range of services. The smart data provisions that the Government have introduced, as well as the Smart Data Council that they are leading, are really welcome. However, we need to go one step further and start to give people and industries clarity around where the Government will look first, in terms of what kind of smart data provisions they might look at and what kind of sectors they might go into. Ultimately, we need to make sure that businesses are well consulted and that there is a strong cost-benefit analysis. We then need to move ahead with the key sectors that we want to push forward on. Similarly to on nuisance calls, we will send some suggested text to the Committee to add those bits in, but it is a really welcome step forward.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Welcome. Stephanie Peacock will start the questions.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Good morning. To go first to Dr Jeni Tennison, do you think the general public and workers have a good level of trust and understanding in terms of how their data is being used? What does the Bill do, if anything, to help build or improve on that trust and understanding?

Dr Tennison: Surveys and public attitudes polling show that when you ask people about their opinions around the use of data, they have a good understanding about the ways in which it is going wrong, and they have a good understanding about the kinds of protections that they would like to see. The levels of trust are not really there.

A poll from the Open Data Institute, for example, shows that only 30% trust the Government to use data ethically. CDEI has described this as “tenuous trust” and highlighted that about 70% of the public think that the tech sector is insufficiently regulated. I do not think that the Bill addresses those issues of trust very well; in fact, it reduces the power individuals have and also the level of collective representation people can have, particularly in the work context. I think this will diminish trust in the way in which data is used.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Do you believe the Government have consulted the public and data subjects such as workers appropriately during the process of formulating the Bill?

Dr Tennison: Obviously, there was a strong consultation exercise around the data reform Bill, as it was then characterised. However, there are elements of this Bill, in particular the recognised legitimate interests that are listed, that have not had detailed public consultation or scrutiny. There are also not the kinds of provisions that we would like to see on ongoing consultation with the public on specific questions around data processing in the future.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q What value do subject access requests hold for citizens, and how will changing the threshold for refusing a request or changing a request to “vexatious or excessive” impact citizens’ ability to exercise their rights?

Dr Tennison: Subject access requests are an important way in which citizens can work out what is happening within organisations with the data that is being held about them. There are already protections under UK GDPR against vexatious or excessive requests, and strengthening those as the Bill is doing is, I think, going to put off more citizens from making these kinds of requests.

It is worth noting that this is a specific design of the Bill. If you look at the impact assessment, this is where most of the cost to business is being saved; that is being done by refusing subject access requests. So I think we should be suspicious about what that looks like. Where we have been looking at the role of subject access requests in people exercising their rights, it is clear that that is a necessary step, and delays to or refusals of subject access requests would prevent people from exercising their rights.

We think that a better way of reducing subject access requests would be to have publication of things like the risk assessments that organisations have to do when there is high-risk processing—so that there is less suspicion on the part of data subjects and they do not make those requests in the first place.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Thank you. I have a couple of questions for Anna Thomas now. Do the current laws around automated decision making do enough to protect workers and citizens from harm?

Anna Thomas: Referring partly to our work in “Mind the gap” and “The Amazonian Era”, as well as the report by the all-party parliamentary group on the future of work about use of AI in the workplace, we would say no. The aim of the Bill—to simplify—is very good. But particular areas in the Bill as it stands—eroded somewhat—are particularly problematic in the workplace. The automated ones that you ask about are really important with regard to the reduction of human involvement. But in addition to that are the need to assess in advance what the risks and impacts are, the requirement for consultation, and the access to relevant information. Those are all relevant and overlap with the automated decision making requirement.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Linked to that, do you believe that the safeguards outlined in the Bill—having a right to human review, for example—are enough to protect workers from the potential harm of automated decision making?

Anna Thomas: Not in themselves. There is potential, in those areas, to correct that or to improve it in the course of the Bill’s proceedings, in order that the opportunities, as well as the risks, of putting this new Bill through Parliament are seized. But, no, because of the transformation of work and the extent of the impact, as well as the risks, that new technologies and automated technologies are having across work, not just on access to work, but on terms, conditions, nature, quality and models for work, the safeguards—there is, I think, increasing cross-party consensus about this—should be, in those areas, moving in the other direction.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q My final question is to Michael. Do you believe that the current regulation does enough to govern the use of biometric technologies?

Michael Birtwistle: No, we would say that it does not. The Ada Lovelace Institute published a couple of reports last year on the use of biometric data, arguing for a much stronger and coherent regulatory governance framework for biometric technologies. These are a set of technologies that are incredibly personal. We are used to their being talked about in terms of our faces or fingerprints, but actually it is a much wider range, involving any measurement to do with the human body, which can be used in emotional analysis—walking style or gait, your tone of voice or even your typing style. There is also a set of incoming, next-generation AI technologies that rely quite heavily on biometrics, so there is a question about future-proofing the Bill.

We have made two broad proposals. One is to increase the capability of the Information Commissioner’s Office to look specifically at biometrics—for example, to create and maintain a public register of private entities engaging in processing of biometric data, to have a proper complaints procedure, to publish annual reports and so on. There is a set of issues around increasing the capability of our institutions to deal with that.

Then there is a second question about scope. First, the current focus of biometric data and definition is on identifiability of personal data. There are many potentially problematic use cases of biometric data that do not need to know who you are in order to make a decision about you. We think it would be wise and would future-proof the regulation of this powerful technology to also include classification or categorisation as the purpose of those biometric technologies.

Data Protection and Digital Information (No. 2) Bill (Second sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (Second sitting)

Stephanie Peacock Excerpts
Committee stage
Wednesday 10th May 2023

(11 months, 4 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 10 May 2023 - (10 May 2023)
None Portrait The Chair
- Hansard -

Thank you both for joining us. Stephanie Peacock.

Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

Q 82 Welcome to you both. My first question is to both witnesses. How easy is it currently for service users and care teams to access and share all of their relevant health and care data?

Jonathan Sellors: I am not sure I am the expert on this particular topic, because my experience is more research-based than in IT systems embedded in clinical care.

Tom Schumacher: I am also not as intimately familiar with that issue, but I would say that interoperability is absolutely critical. One of the challenges we experience with our technologies—I assume this is also the case for your health providers—is the ability to have high-quality data that means the same thing in different systems. That is a challenge that will be improved, but it is really a data challenge more than a privacy challenge. That is how I see it.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Will the new definition in the Bill of what constitutes scientific research help people in your field to conduct more or better research? If so, what impact would this research have on citizens and healthcare?

Jonathan Sellors: I think it is a thoroughly useful clarification of what constitutes research. It is essentially welcome, because it was not entirely clear under the provisions of the General Data Protection Regulation what the parameters of research were, so this is a helpful clarification.

Tom Schumacher: I completely concur: it is very useful. I would say that a couple of things really stand out. One is that it makes it clear that private industry and other companies can participate in research. That is really important, particularly for a company like Medtronic because, in order to bring our products through to help patients, we need to conduct research, have real-world data and be able to present that to regulators for approval. It will be extremely helpful to have that broader definition.

The other component of the definition that is quite helpful is that it makes it explicit that technology development and other applied research constitutes research. I know there is a lot of administrative churn trying to figure out what constitutes research and what does not, and I think this is a really helpful piece of clarification.

John Whittingdale Portrait The Minister for Data and Digital Infrastructure (Sir John Whittingdale)
- Hansard - - - Excerpts

Q Perhaps I could ask you both to elaborate on how the existing definition and the current lack of clarity have impeded you in carrying out the research you would like to do and how this will change as a result of the Bill.

Tom Schumacher: Maybe I can give an example. One of the businesses we purchased is a business based in the UK called Digital Surgery. It uses inter-body videos to try to improve the surgery process and create technologies to aid surgeons in prevention and care. One of the challenges has been, to what extent is the use of surgery videos to create artificial intelligence and a better outcome for patient research? Ultimately, it was often the case that a particular site or hospital would agree, but it created a lot of churn, activity and work back and forth to explain exactly what was to be done. I think this will make it much clearer and easier for a hospital to say, “We understand this is an appropriate research use” and to be in a position to share that data according to all the protections that the GDPR provides around securing and de-identifying the data and so on.

Jonathan Sellors: I think our access test, which we apply to all our 35,000 users, is to ensure they are bona fide researchers conducting health-related research in the public interest. We quite often get asked whether the research they are planning to conduct is legitimate research. For example, a lot of genetic research, rather than being based on a particular hypothesis, is hypothesis-generating—they look at the data first and then decide what they want to investigate. This definition definitely helps clear up quite a few—not major, but minor—confusions that we have. They arise quite regularly, so I think it is a thoroughly helpful development to be able to point to something with this sort of clarity.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Welcome, gentlemen. We will now hear from Harry Weber-Brown, chief engagement officer at ZILO, and Phillip Mind, director of digital technology and innovation at UK Finance. We have until 2.50pm for this session. I now invite the witnesses to please introduce themselves to the Committee for the record, starting with Mr Weber-Brown.

Harry Weber-Brown: Thank you very much. My name is Harry Weber-Brown, chief engagement officer for ZILO Technology Ltd, which is a start-up based in London. I have previously worked for the Investing and Saving Alliance. I have much experience in both smart data, which is dealt with in part 3 of the Bill, and digital identity, which relates to digital verification services in part 2.

Phillip Mind: Good afternoon. I am Phillip Mind, director of digital technology and innovation at UK Finance, a trade body representing over 300 organisations in the bank and finance community. Like Harry, my expertise resides more in parts 2 and 3 of the Bill, although I have a few insights into part 1.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Good afternoon to both witnesses. I have a broad opening question. What are the main implications of the Bill’s provisions for the finance sector?

Phillip Mind: The banking community is supportive of the Bill, which is enabling of a digital economy. The data protection reforms reduce compliance burdens on business, which is very welcome. The provisions on digital identity are enabling, and we see digital identity as an essential utility for customers in the future. The provisions on smart data extend an open data regime to other sectors. We already have an open banking regime, and we are keen for that to extend to other sectors. It offers real opportunities in terms of innovative products and services, but we would caution the Committee that there is significant cost and complexity in those measures.

Harry Weber-Brown: The Bill is key to retaining the UK’s place as a hub for technical innovation, and in particular for investment in fintech. It is critical also to make sure the UK remains a global leader in data portability. Building on the work that Phillip just mentioned on open banking, which has over 7 million users among both consumers and small and medium-sized enterprises, it is critical that we make sure we are ahead of the competition.

For the financial services sector, the provisions on ID help to reduce costs for things like onboarding and reduce fraud for things like authorised push payments. It also delivers a better customer experience, so you do not have to rummage around to find your passport every time you want to set up a new account or need to verify yourself to a financial service firm.

Smart data is an opportunity for us to extend ourselves as the world leader in open finance, building on the work of not only open banking but the pensions dashboard, which is yet to be launched but is another open finance scheme. The opportunity to widen up and give consumers more control in their ability to share data is critical for the customer, the economy and the financial services industry.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q That is great. You both mentioned smart data. For the benefit of the Committee, could you outline some of the progress that the banking and finance industries have made in developing smart data initiatives?

Phillip Mind: In the banking industry we have open banking, which allows customers to choose and consent to allow an authorised third party provider access to their account to provide products and services—access to see the data. It also allows—again, with customer choice and consent—customers to allow a third party provider to make payments on their behalf. That has been hugely enabling. It has enabled growth in all sorts of innovative products and services and growth in fintech in the UK. As Harry mentioned, there are over 7 million active customers at the moment, but it does come with a cost; it is not a free good. Making that service available has involved cost and complexity.

In extending the provisions to other sectors through secondary legislation, it is really important that we are cognisant of the impacts and the unintended consequences. Many sectors have pre-existing data-sharing arrangements, many of which are commercial, and it is important that we understand the relative costs and benefits and how they fall among different participants in the market. My caution to the Committee and to Government is to go into those smart data schemes with eyes open.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q To develop that point, do you think there are enough safeguards in the Bill to ensure that Ministers assess the commercial sense and the impact of any new smart data regimes before regulating for them?

Phillip Mind: Clauses 62 and 64 make provision for the Secretary of State and Treasury to consult on smart data schemes. We think that those provisions could be strengthened. We see a need for impact assessments, cost-benefit analysis and full consultation. The Bill already allows for a post-implementation review, and we would advise that too.

Harry Weber-Brown: I think the other one to call out is the pensions dashboard, which has been driven out of the Money and Pensions Service. Although it has not actually launched yet, it has brought the life assurance industry on the site to develop free access to information. The consumer can see all their pensions holdings in a single place, which will then help them to make better financial decisions.

I think my former employer, the Investing and Saving Alliance, was working on an open savings, investments and pensions scheme. Obviously, that is not mandatory, but this is where the provision for secondary legislation is absolutely imperative to ensure that you get a wide scope of firms utilising this. At the moment, it is optional, but firms are still lining up and wanting to use it. There is a commitment within the financial services industry to do this, but having the legislation in place—secondary legislation, in particular—will ensure that they all do it to the same standards, both technical and data, and have a trust framework that wraps around it. That is why it is so imperative to have smart data.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Q Would you say a little about the international position? You referred to the UK’s position as a leader in this field. To what extent is that the case? What are the benefits, and what is the risk to the UK’s position if we do not make the changes proposed in the Bill?

Harry Weber-Brown: In part 2 or part 3 of the Bill? The digital verification services or smart data?

--- Later in debate ---
None Portrait The Chair
- Hansard -

Welcome, Mr Rosser. We have just 15 minutes, until 3.05 pm, for this session. Would you kindly introduce yourself to the Committee for the record?

Keith Rosser: My name is Keith Rosser. I am the chair of the Better Hiring Institute.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Good afternoon. What are the main implications of the Bill for employers? Specifically, how will enabling greater use of a digital verification service help employers to make hiring decisions?

Keith Rosser: Employers have been making hiring decisions using digital identity since 1 October, so we are a live case study. The biggest impact so far has been on the speed at which employers are able to hire staff and on the disconnection between where people live and the location of their job. For example, people in a digital identity scheme could apply for work, get a job and validate who they are without ever necessarily having to go and meet the employer. It is really important across the regions, from St Austell to Glasgow, that we are opening up job opportunities across the UK, including in some of our urban areas—West Bromwich, Barnsley and others—where people get greater job opportunities from where they live because they are not tied to where the employer is. It has had a profound effect already.

We recently looked at a study of 70,000 hires or people going through a hiring process, and 83%—some 58,000—opted to take the digital identity route. They did it in an average time of three minutes and 30 seconds. If we compare that with having to meet an employer and go through a process to provide your physical documents, there is a saving of around a week. If we think about making UK hiring the fastest globally, which is our ambition, people can start work a week earlier and pay taxes earlier, and we are cutting waiting lists and workloads. There is a huge positive impact.

In terms of employers making those hiring decisions, technology is so much better than people at identifying whether a document is genuine and the person is who they say they are. In that case study, we found that 200 of the 70,000 people going through the process had fake documents or fraudulently obtained genuine documents. The question is, would the human eye have spotted that prior to the implementation of digital identity? I am certain that it would not have done. Digital identity is really driving the potential for UK hiring to be a shining example globally.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Do you think the provisions in the Bill will help to improve public trust in digital identities?

Keith Rosser: From that 70,000 example, we have not seen evidence yet that public trust has been negatively impacted. There are some very important provisions in the Bill that have to go a long way to assuring that. One is the creation of a governance body, which we think is hugely important. There has to be a monitoring of standards within the market. It also introduces the idea of certifying companies in the market. That is key, because in this market right now 30% of DVSs—nearly one in three companies—are not certified. The provision to introduce certification is another big, important move forward.

We also found, through a survey, that we had about 25% fewer objections when a user, company or employer was working with a certified company. Those are two really important points. In terms of the provision on improving the fraud response, we think there is a real opportunity to improve what DVSs do to tackle fraud, which I will probably talk about later.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Q Perhaps I could ask you to expand on that now. To what extent would you say that some providers that are not certified are not meeting the standards necessary, or in some cases even promoting fraud?

Keith Rosser: I have every reason to believe that organisations not certified will not be meeting anywhere near the standards that they should be meeting under a certified scheme. That appears really clear. They certainly will not be doing as much as they need to do to tackle fraud.

My caveat here is that across the entire market, even the certified market, I think that there is a real need for us to do more to make sure that those companies are doing far more to tackle fraud, share data and work with Government. I would say that uncertified is a greater risk, certainly, but even with certified companies we must do more to make sure that they are pushed to meet the highest possible standards.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Welcome and thank you. Aimee Reed?

Aimee Reed: Hello, everybody. This is also my first appearance in front of a Bill Committee. I am the Director of Data at the Metropolitan Police Service. For my sins, I also volunteer to lead all 43 forces on data; I am chair of the national police data board. I am here today in that capacity as well.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q You are both very welcome. My first question is to Aimee. Currently, police are required by section 62 of the Data Protection Act 2018 to log their justification for accessing specific data records; this Bill, of course, changes that. How time consuming is that requirement currently for officers?

Aimee Reed: It is a big requirement across all 43 forces, largely because, as I am sure you are aware, we are operating on various aged systems. Many of the technology systems across the policing sector do not have the capacity to log section 62 requirements, so police officers are having to record extra justification in spreadsheets alongside the searches and release of information that they deliver. So the requirement is a considerable burden across all the forces.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Helen, how, if at all, will listing as a recognised legitimate interest

“detecting, investigating or preventing crime”,

to quote the new definition, aid the tackling of serious crime in the UK?

Helen Hitching: Sorry—could you repeat that?

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Sure. My understanding of the legislation in front of us is that if the Bill becomes law,

“detecting, investigating or preventing crime”

will be listed as a recognised legitimate interest and therefore be subject to separate, or slightly amended, data rules. How will that change help tackle serious crime in the UK?

Helen Hitching: I think it will bring a level of simplicity across the data protection environment and make sure that we can share data with our policing colleagues and other services in a more appropriate way. It will make the whole environment less complex.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q I have a connected but slightly separate question. Would being able to apply for a joint designation notice with the intelligence services aid competent authorities in targeting serious and organised crime, and if so, how?

Helen Hitching: Yes, it will aid it. Again, it brings in the ability to put the data protection framework on the same level, so we can share data in an easier fashion and make it less complex.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Q Can you say a little bit more about the implications of personal data sharing between countries, the extent to which that might lead to a lowering of standards of protection and how we safeguard against that?

Helen Hitching: The agency does not believe that those safeguards will be lowered. We will still not be able to share data internationally with countries that do not have the same standards that are met by the UK. It will provide greater clarity about which regimes should be used and at which point. The standards will not reduce.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We now come to our ninth panel. We welcome Andrew Pakes, who is director of communications and research at Prospect, and Mary Towers, who is the policy officer at the Trades Union Congress. We have until 3.55 for this session. I invite the witnesses to introduce themselves to the Committee for the record—ladies first.

Mary Towers: Hi, and thanks very much for inviting the TUC to give evidence today. My name is Mary Towers. I am an employment rights policy officer at the TUC, and I have been leading a project at the TUC looking at the use of AI in the employment relationship for the past couple of years.

Andrew Pakes: Hello, everyone. Thank you for inviting Prospect to give evidence today. My name is Andrew Pakes. I am one of the deputy general secretaries and the research lead for Prospect union, which represents scientific, technical and professional workers. I am also a member of the OECD’s AI expert panel, representing trade unions.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Good afternoon to you both; you are very welcome. My first question is to Andrew. Obviously, the nature of work has changed significantly over the past few decades, particularly in the last decade. What impact has technology, particularly the rise of automated decision making and automated performance management, had on the workplace?

Andrew Pakes: We were already seeing a huge change in the use of digital technology prior to the pandemic. The pandemic itself, not least through all the means that have kept many of us working from home, has transformed that. Our approach as a trade union is to embrace technology. We believe that our economy and the jobs our members do can be made better and more productive through the good deployment of technology to improve jobs.

We also think there is a downside to it all. Everything that needs to be risked and balanced is in that. Alongside the advance in innovation and technology that has brought benefits to the UK, we have seen a rise in the darker or less savoury side of that, which is namely the rise of surveillance software; the ability of software to follow us, including while working from home, and to micromanage us and track people; and the use of technology in performance management—the so-called people analytics or HR management, which is largely an unregulated area.

If you ask me which legislation this should sit in, I would probably say an employment-type Bill, but this is the legislation we have and the Government’s choice. We would definitely like to see checks and balances at least retained in the new legislation compared with GDPR, but maybe they should be enhanced to ensure that there is some form of social partnership and that working people have a say over how technology is introduced and implemented in their workspaces.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q That makes sense. You mentioned the changes since the pandemic. How do you think those changes have impacted on the right to privacy and the right to a work-life balance? I presume that has shifted since the pandemic.

Andrew Pakes: There is increasing evidence that while technology has allowed many of us to remain connected to our workspaces—many of us can now take our work anywhere—the downside is that our work can follow us everywhere. It is about the balance of digital disconnection and the ability to switch off from work. I am probably preaching to the wrong crowd, because MPs are constantly on their phones and other technology, but many of us are able to put that away, or should do, because we are contracted workers and have a different relationship with our workplace in terms of how that balance is struck. We very much focus on wellbeing and on information and consultation, ensuring that people are aware of the information that is collected on us.

One of the troubling factors that we and the TUC have picked up is that consistently, in opinion polls and research that is done, working people do not have confidence or knowledge about what level of data is being collected and used on them. When we see the increasing power of technology through AI and automated decisions, anxiety in the workplace is best foiled by transparency, in the first place, and, we would obviously argue, a level of social partnership and negotiation over how technology is introduced.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q What effect do you believe the new rules in the Bill on automated decision making will have on workers? I think you have alluded to this, but would you like to see greater protections in place?

Andrew Pakes: Absolutely. What strikes me about the legislation you are considering is that just about all our major competitors—who are more productive and more advanced, often in innovation, including the United States—are choosing a path of greater scrutiny and accountability for AI and automated decision making. There is a concern that in this legislation we are taking an alternative path that makes us stand out in the international economy, which is about diluting existing protections we have within GDPR to a lower level. That raises concerns.

We have particular concerns about automated technology, but also about the clauses on the reduction of powers around data protection impact assessments. We think the risk is that the legislation could open the back door to the increase in dodgy surveillance and other forms of software coming into the UK market. I am worried about that for two reasons: first, because of the impact it has on individual workers and what is happening there; and secondly, because most of this technology—we have been part of a project that has tracked over 500 different surveillance software products currently on the international market—is designed largely for a US or Chinese market, with little knowledge of how it is being done.

What we know through ensuring consultation on the existing DPIA arrangements is that there is a break in the current rules that enables or ensures that employers have a consultation and check where their products are taking their data from and what they have stored. Diluting that risks ensuring that we are not sure where that data is being used and we are not sure of the power of this technology, and working people then end up with a worse deal than they currently have.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q I have a couple of questions for Mary Towers. Do you think that the changes in the Bill will do anything to improve the collective rights of workers? If not, what sort of mechanisms would you like to see in place to give workers a method of redress collectively?

Mary Towers: On the contrary, we would say that the Bill in fact reduces the collective rights of workers, particularly in relation to data protection impact assessments. As Andrew has mentioned, at the moment the right to a data protection impact assessment involves an obligation on an employer to consult with workers or their representatives. That is an absolutely key tool for trade unions to ensure that worker voice is represented in the path of the introduction of new technologies at work. Also, at the moment, missing from the Bill is the ability of trade unions to act as representatives for data subjects in a collective way. We say that that, too, is missing, could be added and would be an important role that unions could take on.

Another aspect missing from the Bill, which we say is a hugely missed opportunity, is a potential right that workers could have to have an equal right to their data that matches the right employers have over worker data. Once workers had that right, they could then collectivise their own data, which would enable them, for example, to pick up on any discriminatory patterns at work or pick up any problems with equal pay or the gender pay gap. We say that that right to collectivise data and redress the imbalance of power over data at work is really important.

The Bill misses entirely the opportunity to introduce those kinds of concepts, which are actually vital in the modern workplace, where data is everything. Data is about control; data is about influence; data is the route that workers have to establish fair conditions at work. Without that influence and control, there is a risk that only one set of interests is represented through the use of technology at work, and that technology at work, rather than being used to improve the world of work, is used to intensify work to an unsustainable level.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q In that answer, you highlighted the imbalance between employers and workers. Correct me if I am wrong, but you said that data protection impact assessments are particularly valuable to both trade unions and the collective workforce. Do you have any specific examples of this consultation tool being used successfully?

Mary Towers: Yes. This is something that Andrew’s union, Prospect, has been really active in. It has produced some absolutely brilliant guidance that looks in detail at the importance of the process of data protection impact assessments and rolled out training for its trade union reps. Again, several of our other affiliates have undertaken that really important work, which is then being rolled out into the workplace to enable reps to make good use of that process.

I will, however, add the caveat that I understand from our affiliates that there is a very low level of awareness among employers about that obligation, about the importance of that process and about exactly what it involves. So a really important piece of awareness-raising work needs to be done there. We say it is vital to build on the existing rights in the UK GDPR, not dilute or remove them.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q What impact would the Bill have on workers by taking away this tool or watering down the DPIAs into assessments of high risk, especially given that earlier today, before this Committee, the Information Commissioner himself raised concerns about the lack of clarity on what will count as high-risk processing? That question is to either of you, briefly. I have one more and then I will let someone else come in.

Andrew Pakes: We would assert that under the law of GDPR, high risk in the legislation is, I think, in recital 39. I will correct that if I picked the wrong one. It talks about high risk as being decisions that can make material or non-material impact on people. If we now have software and algorithms or automated decisions that can hire and fire us—we have examples of that—and can decide who deserves a promotion or who can be disciplined, if that information can now be used to track individuals and decide whether someone is a good or bad worker, we would assert that that is a high risk. Anything that can actually affect both your standing in your workspace or your contractual relationship, which is essentially what employment is, or which has an impact on the trust and confidence the employer has in you and, equally, your trust and confidence back in the employer, that is a very clear definition of high risk.

What is important about the existing UK GDPR is that it recognises the nature of high risk but, secondarily, it recognises that data subjects themselves must be consulted and involved either directly or, where that is not practicable, through their representatives. Our worry is that the legislation that is tabled now dilutes that and opens up risk to bad practice.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Thank you. This is my final question. Does the Bill offer enough detail on the new threshold for charging or refusing a subject access request that is either “vexatious or excessive” to assure workers that they will still be able to access their personal records from an employer when making a good-faith request?

Mary Towers: The right to a data subject access request—again, like the DPIAs—is an absolutely crucial tool for trade unions in terms of establishing transparency over how their data is being used. Really, it provides a route for workers and unions to get information about what is going on in the workplace, how technologies operate and how they are operating in relation to individuals. It is an vital tool for trade unions.

What we are concerned about is that the new test specified in the Bill will provide employers with very broad discretion to decide when they do not have to comply with a data subject access request. The use of the term “vexatious or excessive” is a potential barrier to providing the right to an access request and provides employers with a lot of scope to say, for example, “Well, look, you have made a request several times. Now, we are going to say no.” However, there may be perfectly valid reasons why a worker might make several data subject access requests in a row. One set of information that is revealed may then lead a worker to conclude that they need to make a different type of access request.

We say that it is really vital to preserve and protect the right for workers to access information. Transparency as a principle is something that, again, goes to really important issues. For example, if there is discriminatory operation of a technology at work, how does a worker get information about that technology and about how the algorithm is operating? Data subject access requests are a key way of doing that.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Q May I ask a relatively simple question? Obviously your concern is the protection of workers’ rights, and safeguards against discrimination and other potential adverse consequences of technology. We will debate the provisions of the Bill in those areas in the coming weeks—I suspect at some length—but would you nevertheless accept that the overall impact of the legislation, if we get this right, will be beneficial to your members in terms of the promotion of growth and potential future job opportunities?

Andrew Pakes: “If we get this right” is doing a lot of heavy lifting there; I will leave it to Members to decide the balance. That should be the goal. There is a wonderful phrase from the Swedish trade union movement that I have cited before: “Workers should not be scared of the new machines; they should be scared of the old ones.” There are no jobs, there is no prosperity and there is no future for the kind of society that our members want Britain to be that does not involve innovation and the use of new technology.

The speed at which technology is now changing and the power of this technology compared with previous periods of economic change make us believe that there has to be a good, robust discussion about the balances of checks and balances in the process. We have seen in larger society—whether through A-level results, the Post Office or other things—that the detriment is significant on the individuals impacted if legislators get that balance wrong. I agree with the big principle and I will leave you to debate that, but we would certainly urge that checks and balances need to be balanced, not one-sided.

Mary Towers: Why does respect for fundamental rights have to be in direct conflict with growth and innovation? There is not necessarily any conflict there. Indeed, in a workplace where people are respected, have dignity at work and are working in a healthy way, that can only be beneficial for productivity and growth.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. You are all very welcome.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q My first question is to Alexandra. What would the benefit be to the general public of the Government being transparent about their use of algorithms?

Alexandra Sinclair: Thank you for the question. In order for the public to have trust and buy-in to these systems overall, so that they can benefit from them, they have to believe that their data is being used fairly and lawfully. That requires knowing which criteria are being used when making a decision, whether those criteria are relevant, and whether they are discriminatory or not. The first step to accountability is always transparency. You can know a decision is fair or lawful only if you know how the decision was made in the first place.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q That is great. Could you tell us about your TAG transparency register and what it revealed about the level of transparency in Government algorithmic use?

Alexandra Sinclair: Currently the Government have their algorithmic reporting transparency standard—I think I have got that right; they keep changing the acronym. Currently on that system there are about six reports of the use of automated decision-making technology in government. The Public Law Project decided to create a parallel register of the evidence that we could find for automated decision making in government. Our register includes over 40 systems in use right now that involve partly automated decisions about people. It would be great if the Government themselves were providing that information.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q In the consultation, the Government said:

“There are clear benefits to organisations, individuals and society in explaining algorithmic decision-making”

in the public sector. Do you think that measures in the Bill achieve that? Do they unlock benefits and explain the Government’s algorithmic decision making to the public?

Alexandra Sinclair: No, and I think they do not do that for three reasons, if I have the time to get into this. The changes to subject access requests, to data protection impact assessments and to the prohibition on article 22 are the key issues that we see. The reason why we are particularly worried about subject access requests and data protection impact assessments is that they are the transparency provisions. They are how you find out information about what is happening. A subject access request is how you realise any other right in the Bill. You can only figure out if an error has been made about your data, or object to your data, if you know how your data is being used in the first place.

What we are worried about with the Bill is that you currently have an almost presumptive right to your data under a subject access request, but the change in the Bill changes the standard from the current “manifestly unfounded or excessive” to “vexatious or excessive”. It also gives a whole load of factors that data controllers are now allowed to take into account when declining your request for your own data. Furthermore, under the proposal in the Bill they do not have to give you the reason why they declined your request for the data. We think that is really problematic for individuals. You have got this information asymmetry there, and it is going to be really difficult for you to prove that your request was not vexatious or excessive if you do not even know why it was denied in the first place.

If we think about some examples that we have been talking about in Committee today, in a lot of the Uber and Ola-led litigation, where individuals were able to show that their employment rights had been unfairly treated, they were able to find out about that through subject access requests. Another example is the London Met police’s gangs matrix. The Information Commissioner’s Office did a review of that matrix and found that the system did not even clearly distinguish between victims and perpetrators of crime, and the only way for individuals to access the matrix and check if the information held on them is accurate is through a subject access request. That is our first concern with the Bill.

Our second concern is the changes to data protection impact assessments. The first thing to note is that they already have to apply only in high-risk processing situations, so we do not think that they are an undue or onerous burden on data controllers because they are already confined in their scope. What a data protection impact assessment does—this is what we think is beneficial about it—is not to be a brake on processing, but to force data controllers to think though the consequences of processing operations. It asks data controllers to think, “Where is that data coming from? What is the data source? Where is that data being trained? For what purpose is that data being used?” The new proposal under the Bill for data protection impact assessments significantly waters down those obligations and means that, essentially, the only requirement is accounting for the purposes for the data. So instead of explaining how the data is being used, you are only requiring that purpose.

We think that has two problems. First, data controllers will not be thinking through all the harms and consequences before they deploy a system. Secondly, if individuals affected by those systems want to get information about how their data was processed and what happened, there will be a lot less information on that impact assessment for them to assess the lawfulness of that processing.

My final critique of the Bill is this. We would say that the UK is world-leading in terms of article 22—other states are certainly looking to the UK—and it is a strange time to be looking to roll back protections. I do not know if Committee members have heard about how Australia recently experienced the Robodebt scandal, on which there is a royal commission at the moment. In that case, the system was a solely automated debt discrepancy system that ended up making over 500,000 incorrect decisions, telling people that they had committed benefit fraud when they had not. Australia is having to pay millions of dollars in compensation to those individuals and to deal with the human cost of that decision. The conversation in Australia right now is, “Maybe we should have article 22. Maybe this wouldn’t have happened if we had had a prohibition on solely automated decision making.” When other states are looking to beef up their AI protections, we need to think carefully about looking to roll them back.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Thank you for that really comprehensive answer.

Jacob, what measures do you think should be in place to ensure that data protection legislation balances the need to protect national security with the need to uphold human rights? Does the Bill strike the right balance?

Jacob Smith: Thanks for the question. To take the second part first, we argue that the Bill does not strike the right balance between protecting national security and upholding data and privacy rights. We have three main concerns with how the Bill sets out that balance at the moment, and they come from clauses 24 to 26.

We have this altered regime of national security certificates for when law enforcement is taking measures in the name of national security, and we have this new regime of derogation notices. When law enforcement and the security services are collaborating, the notices allow the law enforcement body working in that collaboration to benefit from the more relaxed rules that are generally only for the intelligence services.

From our perspective, there are three main concerns. First, we are not quite sure why these amendments are necessary. Under human rights law, for an interference with somebody’s data or privacy rights to be lawful, it needs to be necessary, and that is quite a high standard. It is not something akin to it being more convenient for us to have access to this data, or more efficient for us to have access to this data; it has to meet a high standard of strict necessity. Looking through the Second Reading debate, the impact assessment and the European convention on human rights analysis, there is no reference to anything that would be akin to necessity. It is all, “It would be easier for law enforcement to have these extra powers. It would be easier if law enforcement were potentially able to use people’s personal data in more ways than they are at the moment.” But that is not the necessity standard.

The second concern is the lack of safeguards in the Bill. Another thing that human rights law—particularly article 8 of the ECHR—focuses on is the necessity of introducing additional safeguards to prevent the misuse of legislation that allows public bodies to interfere with people’s privacy rights. At the moment, as the Bill sets out, we have very weak safeguards when both national security certificates and designation notices are in place. At the moment, there is an opportunity, at least on the face of the Bill, for both those measures to be challenged before the courts. However, the issue here is that the Secretary of State has almost a monopoly over deciding whether those notices and certificates get published. So yes, although on the face of the Bill an individual may be able to challenge a national security certificate or a designation notice that has impacted them in some way, in practice they will not be able to do that if they do not know that it exists.

Finally, one encompassing issue is the expansive powers for the Secretary of State. One thing that we advocate is increased independent oversight. In the Bill, the Secretary of State has an extremely broad role in authorising law enforcement bodies to process personal data in a way that would otherwise be unlawful and go further than the existing regimes under the Data Protection Act 2018. Those are our three broad concerns in that regard. Ultimately, we do not see that the right balance has been made.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q My final question is to all the witnesses. What are your views on the reforms to the ICO and their potential impact on its independence from Government?

Ms Irvine: We have concerns about the proposed changes and their potential impact on the independence of the Information Commissioner. I was able to listen to John Edwards speaking this morning, and I noted that he did not share those concerns, which I find surprising. The ICO is tasked with producing statutory codes of conduct, which are incredibly useful for my clients and for anyone working in this sector. The fact that the Secretary of State can, in effect, overrule these is concerning, and it must be seen as a limit on the Information Commissioner’s independence.

That leads to a concern that we have in relation to the adequacy decision that is in place between the EU and the United Kingdom. Article 52 of the GDPR states very clearly that a supervisory authority must have clear independence. The provisions relating to the independence of the Commission—the potential interference of the Secretary of State in law is enough to undermine independence—are therefore of concern to us.

Alexandra Sinclair: We would just say that it is not typical for an independent regulator to have its strategic objectives set by a Minister, and for a Minister to set those priorities without necessarily consulting. We consider that the ICO, as subject matter experts, are probably best placed to do that.

Jacob Smith: From our perspective, the only thing to add is that one way to improve the clauses on national security certificates and designation notices would be to give the ICO an increased role in oversight and monitoring, for instance. Obviously, if there are concerns about its independence, we would want to consider other mechanisms.

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - - - Excerpts

Q Laura Irvine, in your briefing about the Bill you raised concerns about some of the language. We had some discussion this morning about the language and particular terms, such as what “vexatious” means, for example. Could you elaborate on your concerns?

Ms Irvine: Certainly. There are terms that have been used in data protection law since the 1984 Act. They were used again in the 1998 Act, echoed under the GDPR and included in all the guidance that has come from the Information Commissioner’s Office over the past number of years. In addition to that, there is case law that has interpreted many of those terms. Some of the proposed changes in the Bill introduce unexpected and unusual terms that will require interpretation. Even then, once we have guidance from the Information Commissioner, that guidance is sometimes not as helpful as interpretation by tribunals and courts, which is pretty sparse in this sector. The number of cases coming through the courts is limited—albeit that there is a lot more activity in the sector than there used to be. It simply presents a lot more questions and uncertainty in certain ways.

For my business clients, that is a great difficulty, and I certainly spend a lot of time advising clients on how I believe a matter—a phrase—will be interpreted, because I have knowledge of how data protection law works in general. That is based on my experience of the power of businesses and organisations, particularly in the third sector. Smaller bodies will often be challenged by a lack of knowledge and expertise, and that is a difficulty of introducing in legislation brand-new terms that are not familiar to practitioners, far less the organisations asked to implement the changes.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you and welcome.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q What are the main implications of the Bill for people’s personal data rights?

Alex Lawrence-Archer: There is a group of changes in the Bill that, perhaps in ways that were unintended or at least not fully thought through, quite seriously undermine the protection of individuals’ privacy and data rights. A few of the most concerning ones are the change to the definition of personal data, recognising legitimate interests, purpose limitation, changes to the test for the exercise of data subject rights—I could go on. You will have heard about many of those today. It amounts to an undermining of data rights that seems not to be in proportion to the relatively modest gains in terms of reduction in bureaucracy on the part of data controllers.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Following on from that answer, what do you think the impact will be of the new definition of personal data as contained in the Bill?

Alex Lawrence-Archer: It is quite difficult to predict, because it is complicated, but it is foundational to the regime of data protection. One of the issues is that in seeking to relieve data controllers of certain bureaucratic requirements, we are tinkering with these really foundational concepts such as lawful basis and the definition of personal data.

Two things could happen, I think. Some quite bad-faith arguments could be run to take quite a lot of processing outside the scope of the data protection regime. Although I doubt that those arguments would succeed, there is an additional issue; it is quite complicated to explain, but I will try. If it is unlikely but possible that an individual might be re-identified from a pseudonymised dataset—it could happen if there were a hack, say, but it is unlikely—that processing under the new regime would not, as the Bill is drafted, benefit from the protection of the regime. It would not be considered personal data, as it would not be likely that the individual could be identified from that dataset. That is a real problem because pseudonymised datasets are very common with large datasets. There are real risks there that would not be dealt with.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q On average, how long does it currently take for data subjects to resolve basic data rights breaches?

Alex Lawrence-Archer: Under the current regime, that is a bit like asking, “How long is a piece of string?” It can take quite a long time. There are certain practices that the ICO follows in terms of requiring individuals to complain to the controller first. Some controllers are good; some are quick, but some are not. You might have a lot of back and forth about data access at the beginning, but other controllers might hand over your data really quickly. However, you could be looking at anything up to, say, 10 to 12 months.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q Do you think that any changes in the Bill, for example those surrounding subject access requests, would increase that time?

Alex Lawrence-Archer: Yes. You have heard from lots of people about the changes to the standard to be applied when any of the rights in chapter 3 are exercised by a data subject, and that includes the right of access. I think it is very likely that many more exercises of the right of access will be refused, at least initially. I think there will be many more complaints about the right of access and there is likely to be satellite litigation about those complaints as well, because you cannot proceed in finding out what has gone on with your data and rectify a problem unless you have access to the copies of it.

So, what you might find in many cases is a two-stage process whereby, first, you must resolve a complaint, maybe even a court case, about your right to access the data and then, and only then, can you figure out what has actually been going on with it and resolve the underlying unlawfulness in the processing. Effectively, therefore, it is a doubling of the process for the individual.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Q A final question: do you think that the definitions of “vexatious” and “excessive” are clear enough not to be abused by controllers who simply do not want to carry out subject access requests?

Alex Lawrence-Archer: The new definitions, particularly the list of factors to be taken into consideration in determining whether the test is met, provide a lot of breathing room for controllers, whether or not they have good intentions, to make arguments that they do not need to comply with the right of access. If you are looking not to comply or if you have an incentive not to, as many controllers do, that does not necessarily mean that you are acting in bad faith; you might just not want to hand over the data and think that you are entitled not to do so. If you are looking not to comply, you will look at the Act and see lots of hooks that you can hang arguments on. Ultimately, that will come back to individuals who are just trying to exercise their rights and who will be engaged in big arguments with big companies and their lawyers.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Q The age-appropriate design code for children was mentioned in our session this morning. Do you have any thoughts on what the Bill could mean for the application of that design code, which was obviously prepared for an environment in which GDPR was enshrined in UK data law?

Alex Lawrence-Archer: The age-appropriate design code was a real success for the UK in terms of its regulation and its reputation internationally. It clarified the rights that children have in relation to the processing of their personal data. However, those rights are only helpful if you know what is happening to your personal data, and if and when you find out that you can exercise your rights in relation to that processing.

As I have said, what the Bill does—again, perhaps inadvertently—is undermine in a whole host of ways your ability to know what is happening with your personal data and to do something about it when you find out that things have gone wrong. It seems to me that on the back of a notable success in relation to the AADC, we are now, with this Bill, moving in rather a different direction in terms of that argument for protection of personal data.

Looking at the even longer term, there will be some slightly more nuanced changes if and when the AADC comes to be amended or redrafted, because of the role of the ICO and the factors that it has to take into account in its independence, which again you have already heard about. So you could, in the long term, see a new version of the AADC that is more business-friendly, potentially, because of this Bill.

Data Protection and Digital Information (No. 2) Bill (Third sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (Third sitting)

Stephanie Peacock Excerpts
Committee stage
Tuesday 16th May 2023

(11 months, 3 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 16 May 2023 - (16 May 2023)
John Whittingdale Portrait The Minister for Data and Digital Infrastructure (Sir John Whittingdale)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Hollobone. May I thank all hon. Members for volunteering to serve on the Committee? When I spoke on Second Reading, I expressed my enthusiastic support for the Bill—just as well, really. I did not necessarily expect to be leading on it in Committee, but I believe it is a very important Bill. It is complex and will require quite a lot of scrutiny, but it will create a framework of real benefit to the UK, by facilitating the exchange of data and allowing us to take the maximum advantage of emerging technologies. I look forward to our debates over the next few days.

Clause 1 will create a test in legislation to help organisations to understand whether the data that they are processing is personal or anonymous. This is important, because personal data is subject to data protection rules but anonymous data is not. If organisations can be confident that the data they are processing is anonymous, they will be able to use it for important activities such as research and product development without concern about the potential impact on individuals’ personal data.

The new test will require data controllers considering whether data is personal or anonymous to consider two scenarios. The first is where a living individual can be identified by somebody within the data controller or processor’s own organisation using reasonable means at any point at which the data is being processed, from the initial point of collection for its use and storage to its eventual deletion or onward transmission. The second scenario is where the data controller or processor knows or should reasonably know that somebody outside the organisation is likely to obtain the information and to be able to re-identify individuals from it using reasonable means. That could be a research partner or a business client with whom the data controller intends to share the data, or an outside organisation that obtains the data as a result of the data controller not putting adequate security measures in place.

What would be considered “reasonable means” in any given case takes into account, among other things, the time, effort and cost of identifying the individual, as well as the technology available during the time the processing occurs. We hope that the clarity the test provides will give organisations greater confidence about using anonymous data for a range of purposes, from marketing to medical research. I commend the clause to the Committee.

Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

It is a pleasure to serve under your chairship, Mr Hollobone. I echo the Minister’s thanks to everyone serving on the Bill Committee; it is indeed a privilege to be here representing His Majesty’s loyal Opposition. I look forward to doing our constitutional duty as we scrutinise the Bill today and in the coming sittings.

The definition of personal data is critical, not only to this entire piece of legislation, but to the data protection regime more widely. That is because the definition of what counts as personal data sets the parameters on who will benefit from protections and safeguards set out by the legislation, and, looking at it from the other side, the various protections will not apply when data is not classed as personal. It is therefore important that the definition should be clear for both controllers and data subjects, so that everyone understands where regulations and, by extension, rights do and do not apply.

The Bill defines personal data as that where a data subject can be identified by a controller or processor, or anyone likely to obtain the information,

“by reasonable means at the time of processing”.

According to the Bill, “reasonable means” take into account the time, effort, costs, technology and resources available to the person. The addition of “reasonable” to the definition has caused major concern among civil society groups, which are worried that it will introduce an element of subjectivity from the perspective of the controller when determining whether data is personal or not. Indeed, although recital 26 of the General Data Protection Regulation also refers to reasonable means—making this, in some ways, more of a formal change than a practical one—there must still be clear parameters on how controllers or processors are to make that judgment. Without those, there may be a danger of controllers and processors avoiding the requirement to comply with rules around personal data by simply claiming they do not have the means to identify living individuals within their resources.

Has the Department undertaken an impact assessment to determine whether the definition could, first, increase subjectivity in what counts as personal data, or secondly, reduce the amount of data classified as personal data? If an assessment identifies such a risk, what steps will the Department take to mitigate that and ensure that citizens are able to exercise their rights as they can under the current definition?

Other stakeholders have raised concerns that the phrase

“at the time of the processing”

in the definition might imply that there is no continuous obligation to consider whether data is personal. Indeed, under the current definition, where personal data is

“any information that relates to an identified or identifiable living individual”,

there is an implied obligation to consider whether an individual is identifiable on an ongoing basis. Rather than assessing the identifiability of a dataset at a fixed point, the controller or processor must keep the categorisation of data that it holds under careful review, taking into account technological developments, such as sophisticated new artificial intelligence or cross-referencing tools. Inserting the phrase

“at the time of the processing”

into this definition has prompted the likes of Which? to express concern that some processors may feel that they are no longer bound by this continuous obligation. That would be particularly worrying given the potential subjectivity of the new definition. If whether an individual is identifiable is based on “reasonable means”, including one’s resources and technology, it is perfectly feasible that, with a change of resources or technology, it could become reasonable to identify a person when once it was not.

Chi Onwurah Portrait Chi Onwurah (Newcastle upon Tyne Central) (Lab)
- Hansard - - - Excerpts

My hon. Friend is making an excellent speech. Does she agree that the absence of regard for the rate of technological change, particularly the rise of artificial intelligence—datasets are now being processed at phenomenal speeds—is potentially negligent on the part of the Government?

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

My hon. Friend makes an important point, which I will come to later.

In these circumstances, it is crucial that if a person is identifiable through data at any time in the future, the data is legally treated as personal so that the relevant safeguards and rights that GDPR was designed to ensure still apply.

When arguing for increased Secretary of State powers across the Bill, Ministers have frequently cited the need to future-proof the legislation. Given that, we must also consider the need to future-proof the definition of data so that technological advances do not render it useless. Does the new definition involve a continuous obligation to assess whether data is personal? Will guidance be offered to inform both controllers and data subjects on the application of this definition, so that both sides can be clear on how it will work in practice? As 5Rights has pointed out, that could avoid clogging up the regulator’s time with claims about what counts as personal data in many individual cases.

Finally, when determining whether data is personal, it is also vital that controllers take into account how a determined stalker or malicious actor might find and use their data. It is therefore good to see the change made since the first iteration of the Data Protection and Digital Information Bill, to clarify that

“obtaining the information as a result of the processing”

also includes information obtained as a result of inaction by a controller or processor—for example, as the result of a failure to put in place appropriate measures to prevent or reduce the risk of hacking.

Overall, it is important that we give both controllers and data subjects clarity about which data is covered by which protections, and when. I look forward to hearing from the Minister about the concerns that have been raised, which could affect the definition’s ability to allow for that clarity.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I agree absolutely with the hon. Lady that the definition of personal data is central to the regime that we are putting in place. She is absolutely right that we need to be very clear and to provide organisations with clarity about what is within the definition of personal data and what is rightly considered to be anonymous. She asks whether the provision will lead to a reduction in the current level of protection. We do not believe that it will.

Clause 1 builds on the strong foundations used in GDPR recital 26 to clarify when data can be categorised as truly anonymous without creating undue risks. The aim of the provision in the Bill is to clarify when information should be considered to be personal data by including a test for identifiability in the legislation. That improved clarity will help organisations to determine when data can be considered truly anonymous and therefore pose almost no risk to the data subject.

The hon. Lady asked whether

“at the time of the processing”

extends into the future, and the answer is yes. The definition of data processing in the legislation is very broad and includes a lot of processing activities other than just the collection of data, such as alteration, retrieval, storage and disclosure by transmission, to name just a few. The phrase

“at the time of the processing”

could therefore cover a long period, depending on the nature and purpose of the processing. The test would need to be applied afresh for each new act of processing. That means that if at any point in the life cycle of processing, the data could be reasonably re-identified by someone by reasonable means, they would then not be able to legally consider to be anonymous. That includes transferring abroad to other regimes.

The clause makes it clear that a controller will have to consider the likelihood of re-identification at all stages of the processing activity. If a data controller held a dataset for several years, they would need to be mindful of the technologies available during that time that might be used to re-identify it. As the hon. Lady said, technology is advancing very fast and could well change over time from the point at which the data is first collected.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Again, yes, it will. It will be transferred abroad only if we are satisfied that the recipient will impose the same level of protection that we regard as necessary in this country.

Question put and agreed to.

Clause 1 accordingly ordered to stand part of the Bill.

Clause 2

Meaning of research and statistical purposes

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 66, clause 2, page 4, line 8, at end insert—

“(c) do not include processing of personal data relating to children for research carried out as a commercial activity.”

This amendment would exempt children’s data from being used for commercial purposes under the definition of scientific purposes in this clause.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss:

Amendment 65, clause 2, page 4, line 21, at end insert—

“7. The Commissioner must prepare a code of practice under section 124A of the Data Protection Act 2018 on the interpretation of references in this Regulation to “scientific research”.

8. The code of practice prepared under paragraph 7 must include examples of the kinds of research purposes, fields, controllers, and ethical standards that are to be considered as being scientific, and those that are excluded from being so considered.”

This amendment would require a statutory code of practice from the ICO on how the definition of scientific research in this clause is to be interpreted.

Clause stand part.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Fuelling safe scientific research through data will be vital to support the UK’s ambition to become a science superpower. We understand that, as is the case in many areas of data protection law, lack of clarity about what counts as processing for scientific purposes causes organisations to take a risk-averse approach to conducting research. An understanding of exactly what is included would therefore give organisations confidence they need to conduct vital processing that will allow for the scientific discoveries and benefits of the future.

Unfortunately, the clause makes the same mistake as the Bill does in general by focusing on easing regulations on those who hold data, rather than looking at how data can be harnessed for the general greater good. It misses the opportunity to unlock the benefits of safely redistributing and sharing data. Indeed, none of the clauses on processing for research purposes make any attempt to explore options to incentivise controllers to share their data with independent researchers. Similarly, the Bill does not explore how the likes of data trusts or co-operatives that pool data resources in the interests of a larger group of beneficiaries or organisations could create a stronger environment for research. Instead, it leaves those who already collect and hold data to benefit from the regime by processing for their own research purposes, while those who might hope to collaborate will use alternative data sets and are no better off.

By failing to think about the safe sharing of data to fuel scientific research, the Government limit the progress the UK could make as a powerhouse of science innovation. The Bill leaves only those organisations with large amounts of data able to contribute to such progress, entrenching existing power structures and neglecting the talent held in the smaller independent organisations that would otherwise be able to conduct research for the public good.

Turning to amendment 65, it has always been written into the GDPR, in recital 159, that processing for scientific purposes should be interpreted broadly. It is therefore understandable why Ministers provided a broad definition in the Bill that allows for those conducting genuine scientific research to have absolute confidence that their processing falls under this umbrella, preventing a risk-averse environment. However, stakeholders, including Reset.tech and the Ada Lovelace Institute, have expressed worries that clause 2 goes a little too far, essentially providing a blank cheque for private companies to self-identify as conducting scientific research as a guise for processing personal information for any purpose they choose.

All that must be understood in combination with clause 9, which gives organisations an exemption from purpose limitation, allowing them to reuse data as long as it is for scientific purposes, as defined in clause 2. Indeed, though the Bill contains a few clarifications of what the definition in clause 2 includes, such as publicly and privately funded processing, commercial or non-commercial processing and processing for the likes of technological development, fundamental research, or applied research, I am keen to hear from the Minister about what specific purposes would actually be ruled out under the letter of the current definition. For example, as the Ada Lovelace Institute asked, would pseudoscientific applications, such as polygraphy or experimental AI claiming to predict an individual’s religion, politics or sexuality, be categorically ruled out under the current definition?

Though it may not be the intention in the clause to enable malicious or pseudoscientific processing under the definition of science, we must ensure that the definition is not open to exploitation, or so broad that any controller could reasonably identify their processing as falling under it. Regulator guidance would be in a prime position to do that. By providing context as to what must be considered for something to be reasonably classified as scientific—for example, the purpose of the research, the field of research, the type of controller carrying it out, or the methodological and ethical standards used—controllers using the definition legitimately will feel even more assured, and malicious processing will be explicitly excluded from the application of the definition. Amendment 65 would do nothing to stop genuinely scientific research from benefiting from the changes in this Bill and would provide further clarity around how the definition can be legitimately relied upon.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I welcome the recognition of the importance of allowing genuine research and the benefits that can flow from it. Such research may well be dependent on using data and the clause is intended to provide clarity as to exactly how that can be done and in what circumstances.

I will address the amendments immediately. I am grateful to the hon. Member for Barnsley East for setting out her arguments and we understand her concerns. However, I think that the amendments go beyond what the clause proposes and, in addition, I do not think that there is a foundation for those concerns. As we have set out, clause 2 inserts in legislation a definition for processing for scientific research, historical research and statistical purposes. The definition of scientific research purposes is set out as

“any research that can be reasonably described as scientific”

and I am not sure that some of the examples that the hon. Lady gave would meet that definition.

The definitions inserted by the clause are based on the wording in the recitals to the UK GDPR. We are not changing the scope of these definitions, only their status in the legislation. They will already be very familiar to people using them, but setting them out in the Bill will provide more clarity and legal certainty. We have maintained a broad scope as to what is allowed to be included in scientific research, with the view that the regulator can add more nuance and context through guidance, as is currently the case. The power to require codes of practice provides a route for the Secretary of State to require the Information Commissioner to prepare any code of practice that gives guidance on good practice in processing personal data.

There will be situations where non-statutory guidance, which can be produced without being requested under regulations made by the Secretary of State, may be more appropriate than a statutory code of practice. Examples of the types of activity that are considered scientific research and the indicative criteria that a researcher should demonstrate are best placed in non-statutory guidance produced by the Information Commissioner’s Office. That will give flexibility to amend and change the examples when necessary, so I believe that the process does not change the provision. However, putting it in the legislation, rather than in the recitals, will impose stronger safeguards and make things clearer. Once the Bill has come into effect, the Government will continue to work with the ICO to update its already detailed and helpful guidance on the definition of scientific research as necessary.

Amendment 66 would prohibit the use of children’s data for commercial purposes under the definition of scientific research. The definition inserted by clause 2 includes the clarification that processing for scientific research carried out as a commercial activity can be considered processing for scientific research purposes. Parts of the research community asked for that clarification in response to our consultation. It reflects the existing scope, as is already clear from the ICO’s guidance, and we have seen that research by commercial bodies can have immense societal value. For instance, research into vaccines and life-saving treatments is clearly in the public interest. I entirely understand the hon. Lady’s concern for children’s privacy, but we think that her amendment could obstruct important research by commercial organisations, such as research into children’s diseases. I think that the Information Commissioner would make it clear as to whether or not the kind of example that the hon. Lady gave would fall within the definition of research for scientific purposes.

I also entirely understand the concern expressed by my hon. Friend the Member for Folkestone and Hythe. I suspect that the question about the sharing of data internationally, particularly, perhaps, by TikTok, may recur during the course of our debates. As he knows, we would share data internationally only if we were confident that it would still be protected in the same way that it is here, which would include considering the possibility of whether or not it could then be passed on to a third country, such as China.

I hope that I can reassure the hon. Lady that emphasising the safeguards that researchers must comply with in clause 22 to protect individuals relates to all data used for these purposes, including children’s data and the protections afforded to children under the UK GDPR. For those reasons, I hope that she will be willing to withdraw her amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am disappointed that the Minister does not accept amendment 66. Let me make a couple of brief points about amendment 65. The Minister said that he was not sure whether some of the examples I gave fitted under the definition, and that is what the amendment speaks to. I asked what specific purposes would be ruled out under the letter of the current definition, and that is still not clear, so I will press the amendment to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Although law enforcement agencies have the power to process personal data with the permission of the individual, there is no definition of consent in the legislation. Clause 4 again mirrors the UK GDPR definition of consent, including the conditions that must be met in order for it to be used as a lawful basis for processing. That change will address the slight risk that consent may be interpreted inconsistently with the definition used in the UK GDPR. We are taking this opportunity to make our data protection laws more consistent, by clarifying terminology for both organisations and individuals. I therefore commend the clauses to the Committee.
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

With regard to clause 3, I refer Members to my remarks on clause 2. It is sensible to clarify how controllers and processors conducting scientific research can gain consent where it is not possible to fully identify the full set of uses for that data when it is collected. However, what counts as scientific, and therefore what is covered by the clause, must be properly understood by both data subjects and controllers through proper guidance issued by the ICO.

Clause 4 is largely technical and inserts the recognised definition of consent into part 3 of the Data Protection Act 2018, for use when it is inappropriate to use one of the law enforcement purposes. I will talk about law enforcement processing in more detail when we consider clauses 16, 24 and 26, but I have no problem with the definition in clause 4 and am happy to accept it.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am grateful to the hon. Lady for her support. I agree with her on the importance of ensuring that the definition of scientific research is clear. That is something on which I have no doubt the ICO will also issue guidance.

Question put and agreed to.

Clause 3 accordingly ordered to stand part of the Bill.

Clause 4 ordered to stand part of the Bill.

Clause 5

Lawfulness of processing

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 68, in clause 5, page 6, line 37, at end insert—

“7A. The Secretary of State may not make regulations under paragraph 6 unless—

(a) following consultation with such persons as the Secretary of State considers appropriate, the Secretary of State has published an assessment of the impact of the change to be made by the regulations on the rights and freedoms of data and decision subjects (with particular reference to children),

(b) the Commissioner has reviewed the Secretary of State’s statement and published a statement of the Commissioner’s views on whether the change should be made, with reasons, and

(c) the Secretary of State has considered whether to proceed with the change in the light of the Commissioner’s statement.”

This amendment would make the Secretary of State’s ability to amend the conditions in Annex 1 which define “legitimate interests” subject to a requirement for consultation with interested parties and with the Information Commissioner, who would be required to publish their views on any proposed change.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 67, in clause 5, page 7, line 18, at end insert—

“11. Processing may not be carried out in reliance on paragraph 1(ea) unless the controller has published a statement of—

(a) which of the conditions in Annex 1 has been met which makes the processing necessary,

(b) what processing will be carried out in reliance on that condition, or those conditions, and

(c) why that processing is proportionate to and necessary for the purpose or purposes indicated in the condition or conditions.”

This amendment would require controllers to document and publish (e.g. in a privacy notice) a short statement on their reliance on a “recognised legitimate interest” for processing personal data.

Clause stand part.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

At present, the lawful bases for processing are set out in article 6 of the UK GDPR. At least one of them must apply whenever someone processes personal data. They are consent, contract, legal obligation, vital interests, public task, and legitimate interests. That is where data is being used in ways that we would reasonably expect, there is minimal privacy impact, or there is a compelling justification for processing. Of the existing lawful bases, consent is by far the most relied upon, as it is the most clear. There have therefore been calls for the other lawful bases to be made clearer and easier to use. It is welcome to see some examples of how organisations might rely on the legitimate interests lawful ground brought on to the statute book.

At the moment, in order to qualify for using legitimate interests as grounds for lawful processing, a controller must also complete a balancing test. The balancing test is an important safeguard. As per the ICO, it requires controllers to consider the interests and fundamental rights and freedoms of the individual, and whether they override the legitimate interests that the controller has identified. That means at a minimum considering the nature of the personal data being processed, the reasonable expectations of the individual, the likely impact of processing on the individual, and whether any safeguards can be put in place to mitigate any negative impacts.

As tech.UK mentioned, the introduction of a list of legitimate interests no longer requiring that test is something many have long called for. When conducting processing relating to an emergency, for example, the outcome of a balancing test often very obviously weighs in one direction, making the decision straightforward, and the test itself an administrative task that may slow processing down. It makes sense in such instances that a considered exemption might apply.

However, given the reduction in protection and control for consumers when removing a balancing test, it is vital that a list of exemptions is limited and exhaustive, and that every item on such a list is well consulted on. It is also vital that the new lawful basis cannot be relied upon in bad faith or exploited by those who simply want to process without the burden, for reasons outside of those listed in annex 1. The Bill as it currently stands does not do enough to ensure either of those things, particularly given the Secretary of State’s ability to add to the list on a whim.

I turn to amendment 67. Although it is likely not the intention for the clause to be open to exploitation, Reset.tech, among many others, has shared concerns that controllers may be able to abuse the new lawful basis of “recognised legitimate interests”, stretching the listed items in annex 1 to cover some or all of their processing, and giving themselves flexibility over a wide range of processing without an explicit requirement to consider how that processing affects the rights of data and decision subjects. That is particularly concerning where controllers may be able to conflate different elements of their processing.

Reset.tech and AWO provide a theoretical case study to demonstrate that point. Let us say that there is a gig economy food delivery company that processes a range of data on workers, including minute-by-minute location data. That location data would be used primarily for performance management, but could occasionally be used in more extreme circumstances to detect crime—for example, detecting fraud by workers who are making false claims about how long they waited for an order to be ready for delivery. By exploiting the new recognised legitimate interests basis, the company could conflate its purposes of performance management and detecting crime, and justify the tracking of location data as a whole as being exempt from the balancing test, without having to record or specify exactly which processing is for the detection of crime.

Under the current regime, there remain two tests other than the balancing test that form a complete assessment of legitimate interests and help to prevent conflation of that kind. First, there is the purpose test, which requires the controller to identify which legitimate interest the company is relying upon. Secondly, there is the necessity test, which requires the controller to consider whether the processing that the company intends to conduct is necessary and proportionate to meet its purposes.

In having to conduct those tests, the food delivery company would find it much more difficult to conflate its performance management and crime prevention purposes, as it would have to identify and publicly state exactly which elements of its processing are covered by the legitimate interest purpose of crime prevention. That would make it explicit that any processing the company conducts for the purposes of performance management is not permitted under a recognised legitimate interest, meaning that a lawful basis for that processing would be required separately.

Amendment 67 therefore seeks to ensure that the benefits of the purpose and necessity tests are retained, safeguarding the recognised legitimate interests list from being used to cynically conflate purposes and being exploited more generally. In practice, that would mean that controllers relying on a purpose listed in annex 1 for processing would be required to document and publish a notice that explains exactly which processing the company is conducting under which purpose, and why it is necessary.

It is foundational to the GDPR regime that each act of processing has a purpose, so this requirement should just be formalising and publishing what controllers are already required to consider. The measure that the amendment seeks to introduce should therefore be no extra burden on those already complying in good faith, but should still act as a barrier to those attempting to abuse the new basis.

I turn to amendment 68. As the likes of Which? have argued, any instance of removing the balancing test will inevitably enable controllers to prioritise their interests in processing over the impact on data subjects, resulting in weaker protections for data subjects and weaker consumer control. Which? research, such as that outlined in its report “Control, Alt or Delete? The future of consumer data”, also shows that consumers value control over how their data is collected and used, and that they desire more transparency, rather than less, on how their data is used.

With those two things in mind—the value people place on control of their data and the degradation of that control as a result of removing the balancing test—it is vital that the power to remove the balancing test is used extremely sparingly on carefully considered, limited purposes only. Even for those purposes already included in annex 1, it is unclear exactly what impact assessment took place to ensure that the dangers of removing the test on the rights of citizens did not outweigh the positives of that removal.

It would therefore be helpful if the Minister could outline the assessment and analysis that took place before deciding the items on the list. Although it is sensible to future-proof the list and amend it as needs require, this does not necessarily mean vesting the power to do so in the Secretary of State’s hands, especially when such a power is open to potential abuse. Indeed, to say that the Secretary of State must have regard to the interests and fundamental rights and freedoms of data subjects and children when making amendments to the list is simply not a robust enough protection for citizens. Our laws should not rely on the good nature of the Secretary of State; they must be comprehensive enough to protect us if Ministers begin to act in bad faith.

Further, secondary legislation simply does not offer the scrutiny that the Government claim it does, because it is rarely voted on. Even when it is, if the Government of the day have a majority, defeating such a vote is incredibly rare. For the method of changing the list to be protected from the whims of a bad faith Secretary of State who simply claims to have had regard to people’s rights, proper consultation should be undertaken by the regulator on any amendments before they are considered for parliamentary approval.

This amendment would move the responsibility for judging the impact of changes away from the Secretary of State and place it with the regulator on a yearly basis, ensuring that amendments proceed only if they are deemed, after consultation, to be in the collective societal interest. That means there will be independent assurance that any amendments are not politically or maliciously motivated. This safeguard should not be of concern to anyone prepared to act in good faith, particularly the current Secretary of State, as it would not prevent the progression in Parliament of any amendments that serve the common good. The amendment represents what genuine future-proofing in a way that retains appropriate safeguards looks like, as opposed to what ends up looking like little more than an excuse for a sweeping power grab.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I welcome the hon. Lady’s recognition of the value of setting out a list of legitimate interests to provide clarity, but I think she twice referred to the possibility of the Secretary of State adding to it on a whim. I do not think we would recognise that as a possibility. There is an established procedure, which I would like to go through in responding to the hon. Lady’s concerns. As she knows, one of the key principles of our data protection legislation is that any processing of personal data must be lawful. Processing will be lawful where an individual has given his or her consent, or where another specified lawful ground in article 6 of the UK GDPR applies. This includes where the processing is necessary for legitimate interests pursued by the data controller, providing that those interests are not outweighed by an individual’s privacy rights.

Clause 5 addresses the concerns that have been raised by some organisations about the difficulties in relying on the “legitimate interests” lawful ground, which is used mainly by commercial organisations and other non-public bodies. In order to rely on it, the data controller must identify what their interest is, show that the processing is necessary for their purposes and balance their interests against the privacy right of the data subject. If the rights of the data subject outweigh the interests of the organisation, the processing would not be lawful and the controller would need to identify a different lawful ground. Regulatory guidance strongly recommends that controllers document the outcome of their legitimate interests assessments.

As we have heard, and as the hon. Lady recognises, some organisations have struggled with the part of the legitimate interests assessment that requires them to balance their interests against the rights of individuals, and concern about getting the balancing test wrong—and about regulatory action that might follow as a result—can cause risk aversion. In the worst-case scenario, that could lead to crucial information in the interests of an individual or the public—for example, about safeguarding concerns—not being shared by third-sector and private-sector organisations. That is why we are taking steps in clause 5 and schedule 1 to remove the need to do the balancing test in relation to a narrow range of recognised legitimate activities that are carried out by non-public bodies. Those activities include processing, which is necessary for the purposes of safeguarding national security or defence; responding to emergencies; preventing crimes such as fraud or money laundering; safeguarding vulnerable individuals; and engaging with the public for the purposes of democratic engagement.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

We do not believe that amendment 67 would place an extra burden on those who are already complying in good faith. The idea behind it is that it will be a barrier to those attempting to abuse the new basis.

On amendment 68, we should not have laws that rely on the Secretary of State’s good faith. As the Minister said, it is pretty rare for secondary legislation to be voted on, and for the Government to lose, so I do not see that as a barrier. The hon. Member for Folkestone and Hythe highlighted that although there are some protections, we do not believe that the Government protections go as far as we would like. For that reason, I will press the amendment to a vote.

Question put, That the amendment be made.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As the Committee will be aware, data protection legislation prohibits the use of “special category” data—namely, information about a person that is sensitive in nature—unless certain conditions or exemptions apply. One such exemption is where processing is necessary on grounds of substantial public interest.

Schedule 1 to the Data Protection Act 2018 sets out a number of situations where processing would be permitted on grounds of substantial public interest, subject to certain conditions and safeguards. That includes processing by elected representatives who are acting with the authority of their constituents for the purposes of progressing their casework. The current exemption applies to former Members of the Westminster and devolved Parliaments for four days after a general election—for example, if the MP has been defeated or decides to stand down. That permits them to continue to rely on the exemption for a short time after the election to conclude their parliamentary casework or hand it over to the incoming MP. In practice, however, it can take much longer than that to conclude these matters.

New clause 6 will therefore extend what is sometimes known as the four-day rule to 30 days, which will give outgoing MPs and their colleagues in the devolved Parliaments more time to conclude casework. That could include handing over live cases to the new representative, or considering what records should be retained, stored and deleted. When MPs leave office, there is an onus on them to conclude their casework in a timely manner. However, the sheer volume of their caseload, on top of the other work that needs to be done when leaving office, means that four days is just not enough to conclude all relevant business. The new clause will therefore avoid the unwelcome situation where an outgoing MP who is doing his or her best to conclude constituency casework could be acting unlawfully if they continue to process their constituents’ sensitive data after the four-day time limit has elapsed. Extending the time limit to 30 days will provide a pragmatic solution to help outgoing MPs while ensuring the exemptions cannot be relied on for an indefinite period.

Government amendments 30 and 31 will make identical changes to other parts of the Bill that rely on the same definition of “elected representative”. Government amendment 30 will change the definition of “elected representative” when the term appears in schedule 1. As I mentioned when we debated the previous group of amendments, clause 5 and schedule 1 to the Bill create a new lawful ground for processing non-sensitive personal data, where the processing is necessary for a “recognised legitimate interest”. The processing of personal data by elected representatives for the purposes of democratic engagement is listed as such an interest, along with other processing activities of high public importance, such as crime prevention, safeguarding children, protecting national security and responding to emergencies.

Government amendment 31 will make a similar change to the definition of “elected representative” when the term is used in clause 84. Clauses 83 and 84 give the Secretary of State the power to make regulations to exempt elected representatives from some or all of the direct marketing rules in the Privacy and Electronic Communications (EC Directive) Regulations 2003. I have no doubt that we will debate the merits of those clauses in more detail later in Committee, but for now it makes sense to ensure that there is a single definition of “elected representative” wherever it appears in the Bill. I hope the hon. Member for Barnsley East and other colleagues will agree that those are sensible suggestions and will support the amendments.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

This set of Government provisions will increase the period for which former MPs and elected representatives in the devolved regions can use the democratic engagement purpose for processing. On the face of it, that seems like a sensible provision that allows for a transition period so that data can be deleted, processed, or moved on legally and safely after an election, and the Opposition have a huge amount of sympathy for it.

I will briefly put on record a couple of questions and concerns. The likes of the Ada Lovelace Institute have raised concerns about the inclusion of democratic engagement purposes in schedule 1. They are worried, particularly with the Cambridge Analytica scandal still fresh in people’s minds, that allowing politicians and elected parties to process data for fundraising and marketing without a proper balancing test could result in personal data being abused for political gain. The decision to make processing for the purposes of democratic engagement less transparent and to remove the balancing test that measures the impact of that processing on individual rights may indicate that the Government do not share the concern about political processing. Did the Minister’s Department consider the Cambridge Analytica scandal when drawing up the provisions? Further, what safeguards will be in place to ensure that all data processing done under the new democratic engagement purpose is necessary and is not abused to spread misinformation?

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I would only say to the hon. Lady that I have no doubt that we will consider those aspects in great detail when we get to the specific proposals in the Bill, and I shall listen with great interest to my hon. Friend the Member for Folkestone and Hythe, who played an extremely important role in uncovering what went on with Cambridge Analytica.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I think my hon. Friend is right. I have no doubt that we will go into these matters in more detail when we get to those provisions. As the hon. Member for Barnsley East knows, this measure makes a very narrow change to simply extend the existing time limit within which there is protection for elected representatives to conclude casework following a general election. As we will have opportunity in due course to look at the democratic engagement exemption, I hope she will be willing to support these narrow provisions.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful for the Minister’s reassurance, and we are happy to support them.

--- Later in debate ---
The purpose limitation
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 69, in clause 6, page 9, leave out lines 7 to 20.

This amendment would remove the ability of the Secretary of State to amend Annex 2, so they could not make changes through secondary legislation to the way purpose limitation operates.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause stand part.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

One of the key principles in article 5 of the EU GDPR is purpose limitation. The principle aims to ensure that personal data is collected by controllers only for specified, explicit and legitimate purposes. Generally speaking, it ensures that the data is not further processed in a manner that is incompatible with those purposes. If a controller’s purposes change over time, or they want to use data for a new purpose that they did not originally anticipate, they can go ahead only if the new purpose is compatible with the original purpose, they get the individual’s specific consent for the new purpose or they can point to a clear legal provision requiring or allowing the new processing in the public interest.

Specifying the reasons for obtaining data from the outset helps controllers to be accountable for their processing and helps individuals understand how their data is being used and whether they are happy with that, particularly where they are deciding whether to provide consent. Purpose limitation exists so that it is clear why personal data is being collected and what the intention behind using it is.

In any circumstance where we water down this principle, we reduce transparency, we reduce individuals’ ability to understand how their data will be used and, in doing so, we weaken assurances that people’s data will be used in ways that are fair and lawful. We must therefore think clearly about what is included in clause 6 and the associated annex. Indeed, many stakeholders, from Which? to Defend Digital Me, have expressed concern that what is contained in annex 2 could seriously undermine the principle of purpose limitation.

As Reset.tech illustrates, under the current regime, if data collected for a relatively everyday purpose, such as running a small business, is requested by a second controller for the purpose of investigating crime, the small business would need to assess whether this further processing—thereby making a disclosure of the data—was compatible with its original purpose. In many cases, there will be no link between the original and secondary purposes, and there are potential negative consequences for the data subjects. As such, the further processing would be unlawful, as it would breach the principle of purpose limitation.

However, under the new regime, all it would take for the disclosure to be deemed compatible with the original purpose is the second controller stating that it requires the data for processing in the public interest. In essence, this means that, for every item listed in annex 2, there are an increased number of circumstances in which data subjects’ personal information could be used for purposes outside their reasonable expectations. It seems logical, therefore, that whatever is contained in the list is absolutely necessary for the public good and is subject to the highest level of public scrutiny possible.

Instead, the clause gives the Secretary of State new Henry VIII powers to add to the new list of compatible purposes by secondary legislation whenever they wish, with no provisions made for consulting on, scrutinising or assessing the impact of such changes. It is important to remember here that secondary legislation is absolutely not a substitute for parliamentary scrutiny of primary legislation. Delegated legislation, as we have discussed, is rarely voted on, and even when it is, the Government of the day will win such a vote if they have a majority.

If there are other circumstances in which the Government think it should be lawful to carry out further processing beyond the original purpose, those should be in the Bill, rather than being left to Ministers to determine at a later date, avoiding the same level of scrutiny.

The Government’s impact assessment says that clarity on the reuse of data could help to fix the market failure caused by information gaps on how purpose limitation works. Providing such clarity is something we could all get behind. However, by giving the Secretary of State sweeping powers fundamentally to change how purpose limitation operates, the clause goes far beyond increasing clarity.

Improved and updated guidance on how the new rules surrounding reusing data work would be far more fruitful in providing clarity than further deregulation in this instance. If Ministers believe there are things missing from the clause and annex, they should discuss them here and now, rather than opening the back door to making further additions afterwards, and that is what the amendment seeks to ensure.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The clause sets out the conditions under which the reuse of personal data for a new purpose is permitted. As the hon. Lady has said, the clause expands on the purpose limitation principle. That key principle of data protection ensures that an individual’s personal data is reused only in ways they might reasonably expect.

The current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. That has led to uncertainty about when controllers can reuse personal data. The clause addresses the existing uncertainty around reusing personal data by setting out clearly when it is permitted. That includes when personal data is being reused for a very different purpose from that for which it was originally collected—for example, when a company might wish to disclose personal data for crime prevention.

The clause permits reuse of personal data by a controller when the new purpose is “compatible”; they get fresh consent; there is a research purpose; UK GDPR is being complied with, such as for anonymisation or pseudonymisation purposes; there is an objective in the public interest authorised by law; and certain specified objectives in the public interest set out in a limited list in schedule 2 are met. I will speak more about that when we come to the amendment and the debate on schedule 2.

The clause contains a power to add or amend conditions or remove conditions added by regulations from that list to ensure it can be kept up to date with any future developments in how personal data should be reused in the public interest. It also sets out restrictions on reusing personal data that the controller originally collected on the basis of consent.

The Government want to ensure that consent is respected to uphold transparency and maintain high data protection standards. If a person gives consent for their data to be processed for a specific purpose, that purpose should be changed without their consent only in limited situations, such as for certain public interest purposes, if it would be unreasonable to seek fresh consent. That acts as a safeguard to ensure that organisations address the possibility of seeking fresh consent before relying on any exemptions.

The restrictions around consent relate to personal data collected under paragraph 1(a) of article 6 of the UK GDPR, which came into force in May 2018. Therefore, they do not apply to personal data processed on the basis of consent prior to May 2018, when different requirements applied. By simplifying the rules on further processing, the clause will give controllers legal certainty on when they can reuse personal data and give individuals greater transparency. I support the clause standing part of the Bill.

Let me turn to amendment 69, which proposes to remove the power set out in the clause to amend the annex in schedule 2. As I have already said, schedule 2 will insert a new annex in the UK GDPR, which sets out certain specific public interest circumstances where personal data reuse is permitted. The list is strictly limited and exhaustive, so a power is needed to ensure that it is kept up to date with any future developments in how personal data is reused for important public interest purposes. That builds on an existing power in schedule 2 to the Data Protection Act 2018, where there is already the ability to make exceptions to the purpose limitation principle via secondary legislation.

The power in the clause also provides the possibility of narrowing a listed objective if there is evidence of any of the routes not being used appropriately. That includes limiting it, by reference, to the lawful ground of the original processing—for example, to prohibit the reuse of data that was collected on the basis of an individual’s consent.

I would like to reassure the hon. Lady that this power will be used only when necessary and in the public interest. That is why the clause contains a restriction on its use; it may be used only to safeguard an objective listed in article 23 of the UK GDPR. Clause 44 of the Bill also requires that the Secretary of State must consult the commissioner, and any other persons as the Secretary of State considers appropriate, before making any regulations.

On that basis, I hope the hon. Lady will accept that the amendment is unnecessary.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The purpose behind our amendment —this speaks to a number of our amendments—is that we disagree with the amount of power being given to the Secretary of State. For that reason, I would like to continue with my amendment.

Question put, That the amendment be made.

--- Later in debate ---
purpose
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 71, in schedule 2, page 138, line 16, leave out “states” and insert “confirms”.

This amendment would require a person who needs personal data for a purpose described in Article 6(1)(e) (a task carried out in the public interest or in the exercise of official authority vested in the controller) to confirm, and not merely to state, that they need the data for legitimate purposes.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 70, in schedule 2, page 139, line 30, at end insert

“levied by a public authority”.

This amendment would clarify that personal data could be processed as a “legitimate interest” under this paragraph only when the processing is carried out for the purposes of the assessment or collection of a tax or duty or an imposition of a similar nature levied by a public authority.

That schedule 2 be the Second schedule to the Bill.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I will begin by addressing amendment 70, which seeks only to make a wording change so that the annex cannot be misinterpreted. Paragraph 10 of annex 2 outlines that further processing is to be treated as compatible with original purposes

“where the processing is carried out for the purposes of the assessment or collection of a tax or duty or an imposition of a similar nature.”

Which? has expressed concerns that that is much too vaguely worded, especially without a definition of “tax” or “duty” for the purposes of that paragraph, leaving the data open to commercial uses beyond the intention. Amendment 70 would close any potential loopholes by linking the condition to meeting a specific statutory obligation to co-operate with a public authority such as His Majesty’s Revenue and Customs.

Moving on, amendment 71 would correct a similar oversight in paragraph 1 of annex 2, which was identified by the AWO and Reset.tech. Paragraph 1 aims to ensure that processing is treated as compatible with the original purpose when it is necessary for making a disclosure of personal data to another controller that needs to process that data for a task in the public interest or in the exercise of official authority and that has requested that data. However, the Bill says that processing is to be treated as compatible with the original purpose where such a request simply “states” that the other person needs the personal data for the purposes of carrying out processing that is a matter of public task. At very least, those matters should surely be actually true, rather than just stated. Amendment 71 would close that loophole, so that the request must confirm a genuine need for data in completing a task in the public interest or exercising official authority, rather than simply being a statement of need.

Beyond those amendments, I wish only to reiterate the thoughts that I expressed during the debate on clause 6. Everything contained in the annex provides for further processing that is hidden from data subjects and may not be within their reasonable expectations. The reliance on the new annex should therefore be closely monitored  to ensure that it is not being exploited, or we risk compromising the purpose limitation principle altogether. Does the Department plan to monitor how the new exemptions on the reuse of data are being relied on?

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As we have already discussed with clause 6, schedule 2 inserts a new annex into the UK GDPR. It sets out certain specific public interest circumstances in which personal data reuse is permitted regardless of the purpose for which the data was originally collected—for example, when the disclosure of personal data is necessary to safeguard vulnerable individuals. Taken together, clause 6 and schedule 2 will give controllers legal certainty on when they can reuse personal data and give individuals greater transparency.

Amendment 70 concerns taxation purposes, which are included in the list in schedule 2. I reassure the hon. Member for Barnsley East that the exemption for taxation is not new: it has been moved from schedule 2 to the Data Protection Act 2018. Indeed, the specific language in question goes back as far as 1998. We are not aware of any problems caused by that language.

The inclusion in the schedule of

“levied by a public authority”

would likely cause problems, since taxes and duties can be imposed only by law. Some must be assessed or charged by public authorities, but many become payable as a result of a person’s transactions or circumstances, without any intervention needed except to enforce collection if unpaid. They are not technically levied by a public authority. That would therefore lead to uncertainty and confusion about whether processing for certain important taxation purposes would be permitted under the provision.

I hope to reassure the hon. Lady by emphasising that taxation is not included in the annex 1 list of legitimate interests. That means that anyone seeking to use the legitimate interest lawful ground for that purpose would need to carry out a balancing-of-interests test, unless they were responding to a request for information from a public authority or other body with public tasks set out in law. For those reasons, I am afraid I am unable to accept the amendment, and I hope the hon. Lady will withdraw it.

Amendment 71 relates to the first paragraph in new annex 2 to the UK GDPR, as inserted by schedule 2. The purpose of that provision is to clarify that non-public bodies can disclose personal data to other bodies in certain situations to help those bodies to deliver public interest tasks in circumstances in which personal data might have been collected for a different purpose. For example, it might be necessary for a commercial organisation to disclose personal data to a regulator on an inquiry so that that body can carry out its public functions. The provision is tightly formulated and will permit disclosure from one body to another only if the requesting organisation states that it has a public interest task, that it has an appropriate legal basis for processing the data set out in law, and that the use of the data is necessary to safeguard important public policy or other objectives listed in article 23.

I recognise that the amendment is aimed at ensuring that the requesting organisation has a genuine basis for asking for the data, but suggest that changing one verb in the clause from “state” to “confirm” will not make a significant difference. The key point is that non-public bodies will not be expected to hand over personal data on entirely spurious grounds, because of the safeguards that I described. On that basis, I hope the hon. Lady will withdraw her amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am reassured by what the Minister said about amendment 70 and am happy not to move it, but I am afraid he has not addressed all my concerns in respect of amendment 71, so I will press it to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Vexatious or excessive requests by data subjects
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 74, in clause 7, page 10, line 34, at end insert—

“6. Where a controller—

(a) charges a fee for dealing with a request, in accordance with paragraph 2(a), or

(b) refuses to act on a request, in accordance with paragraph 2(b)

the controller must issue a notice to the data subject explaining the reasons why they are refusing to act on the request, or charging a fee for dealing with the request, and informing the subject of their right to make a complaint to the Commissioner and of their ability to seek to enforce this right through a judicial remedy.”

This amendment would oblige controllers to issue a notice to the data subject explaining the reasons why they are not complying with a request, or charging for a request, their right to make a complaint to the ICO, and their ability to seek to enforce this right through a judicial remedy.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 73, in clause 7, page 12, line 20, at end insert—

“(1A) When considering the resources available to the recipient for the purposes of subsection (1)(c), no account may be taken of any lack of resources which is due to a failure by the recipient to appoint staff to relevant roles where the recipient has the resources to do so.”

This amendment would make it clear that, when taking into account “resources available to the controller” for deciding whether a subject access request is vexatious or excessive, this cannot include where the organisation has neglected to appoint staff, but has the finances or resources to do so.

Amendment 72, in clause 7, page 12, line 25, at end insert—

“(3) The Commissioner must prepare a code of practice under section 124A on the circumstances in which a request may be deemed vexatious or excessive.

(4) The code of practice prepared under subsection (3) must include examples of requests which may be deemed vexatious or excessive, and of requests which may be troublesome to deal with but which should not be deemed vexatious or excessive.”

This amendment would require the ICO to produce a code of practice on how the terms vexatious and excessive are to be applied, with examples of the kind of requests that may be troublesome to deal with, but are neither vexatious nor excessive.

Clause stand part.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I will speak first to clause 7 and amendment 72. Currently, everyone has the right to ask an organisation whether or not it is using or storing their personal data and to ask for copies of that data. That is called the right of access, and exercising that right is known as making a subject access request. Stakeholders from across the spectrum, including tech companies and civil society organisations, all recognise the value of SARs in helping individuals to understand how and why their data is being used and enabling them to hold controllers to account in processing their data lawfully.

The right of access is key to transparency and often underpins people’s ability to exercise their other rights as data subjects. After all, how is someone to know that their data is being used in an unlawful way, or in a way they would object to, if they are not able to ascertain whether their personal data is being held or processed by any particular organisation? For example, as the TUC highlighted in oral evidence to the Committee, the right of data subjects to make an information access request is a particularly important process for workers and their representatives, as it enables workers to gain access to personal data on them that is held by their employer and aids transparency over how algorithmic management systems operate.

It has pleased many across the board to see the Government roll back on their suggestion of introducing a nominal fee for subject access requests. However, the Bill introduces a new threshold for when controllers are able to charge a reasonable fee, or refuse a subject access request, moving from “manifestly unfounded or excessive” to “vexatious or excessive”. When deciding whether a request is vexatious or excessive, the Bill requires the controller to have regard to the circumstances of the subject access request. That includes, but is not limited to, the nature of the request; the relationship between subject and controller; the resources available to the controller; the extent to which the request repeats a previous request made by the subject; how long ago any previous request was made; and whether the request overlaps with other requests made by the data subject to the controller.

Stakeholders such as the TUC, the Public Law Project and Which? have expressed concerns that, as currently drafted, the terms that make up the new threshold are too subjective and could be open to abuse by controllers who may define any request they do not want to answer as vexatious or excessive. Currently, all there is in the Bill to guide controllers on how to apply the threshold is a non-exhaustive list of considerations; as I raised on Second Reading, if that list is non-exhaustive, what explicit protections will be in place to stop the application of terms such as “vexatious” and “excessive” being stretched and manipulated by controllers who simply do not want to fulfil the requests they do not like?

There are concerns that without further guidance even the considerations listed could be interpreted selfishly by controllers who lack a desire to complete a request. For example, given that many subject access requests come from applicants who are suspicious of how their data is being used, or have cause to believe their data is being misused, there is a high likelihood that the relationship any given applicant has with the controller has previously involved some level of friction and, perhaps, anger. The Bill prompts controllers to consider their relationship with a data subject when determining whether their request is vexatious; what is to stop a controller simply marking any data subject who has shared suspicions as “angry and vexatious”, thereby giving them grounds to refuse a genuine request?

Without clarity on how both the new threshold and the considerations apply, the ability of data subjects to raise a legal complaint about why their request was categorised as vexatious and excessive will be severely impeded. As AWO pointed out in oral evidence, that kind of legal dispute over a subject access request may be only the first stage of court proceedings for an individual, with a further legal case on the contents of the subject access request potentially coming afterwards. There simply should not be such a long timescale and set of legal proceedings in order for a person to exercise their fundamental data rights. Even the Information Commissioner himself, despite saying that he was clear on how the phrases “vexatious” and “excessive” should be applied, mentioned to the Committee that it was right to point out that such phrases were open to numerous interpretations.

The ICO is in a great position to provide clear statutory guidance on the application of the terms, with specific examples of when they do and do not apply, so that only truly bad-natured requests that are designed to exploit the system can be rejected or charged for. Such guidance would provide clarity on the ways in which a request might be considered troublesome but neither vexatious nor excessive. That way, controllers can be sure that they have dismissed, or charged for, only requests that genuinely pass the threshold, and data subjects can be assured that they will still be able to freely access information on how their data is being used, should they genuinely need or want it.

On amendment 73, one consideration that the Bill suggests controllers rely on when deciding whether a request is vexatious or excessive is the “resources available” to them. I assume that consideration is designed to operate in relation to the “excessive” threshold and the ability to charge. For example, when a subject access request would require work far beyond the means of the controller in question, the controller would be able to charge for providing the information needed, to ensure that they do not experience a genuine crisis of resources as a result of the request. However, the Bill does not explicitly express that, meaning the consideration in its vague form could be applied in circumstances beyond that design.

Indeed, if a controller neglected to appoint an appropriate number of staff to the responsibility of responding to subject access requests, despite having the finances and resources to do so, they could manipulate the consideration to say that any request they did not like was excessive, as a result of the limited resources available to respond. As is the case across many parts of the Bill, we cannot have legislation that simply assumes that people will act in good faith; we must instead have legislation that explicitly protects against bad-faith interpretations. The amendment would ensure just that by clarifying that a controller cannot claim that a request is excessive simply because they have neglected to arrange their resources in such a way that makes responding to the request possible.

On amendment 74, as is the case with the definition of personal data in clause 1, where the onus is placed on controllers to decide whether a living individual could reasonably be identified in any dataset, clause 7 again places the power—this time to decide whether a request is vexatious or excessive—in the hands of the controller.

As the ICO notes, transparency around the use of data is fundamentally linked to fairness, and is about being

“clear, open and honest with people from the start about who you are, and how and why you use their personal data”.

If a controller decides, then, that due to a request being vexatious or excessive they cannot provide transparency on how they are processing an individual’s data at that time, the very least they could do, in the interests of upholding fairness, is to provide transparency on their justification for classifying a request in that way. The amendment would allow for just that, by requiring controllers to issue a notice to the data subject explaining the grounds on which their request has been deemed vexatious or excessive and informing them of their rights to make a complaint or seek legal redress.

In oral evidence, the Public Law Project described the Bill’s lack of a requirement for controllers to notify subjects as to why their request has been rejected as a decision that creates an “information asymmetry”. That is particularly concerning given that it is often exactly that kind of information that is needed to access the other rights and safeguards outlined in the Bill and across GDPR. A commitment to transparency, as the amendment would ensure, would not only give data subjects clarity on why their request had been rejected or required payment, but provide accountability for controllers who rely on the clause, and thereby a deterrent from misusing it to reject any requests that they dislike. For controllers, the workload of issuing such notices should surely be less than that of processing a request that is genuinely vexatious and excessive, ensuring that the provision does not counterbalance the benefits brought to controllers through the clause.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Let me start by recognising the importance of of subject access requests. I am aware that some have interpreted the change in the wording for grounds of refusal as a weakening. We do not believe that is the case.

On amendment 72, in our view the new “vexatious or excessive” language in the Bill gives greater clarity than there has previously been. The Government have set out parameters and examples in the Bill that outline how the term “vexatious” should be interpreted within a personal data protection context, to ensure that controllers understand.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I completely agree with my hon. Friend. That is an issue that both he and I regard as very serious, and is perhaps another example of the kind of legal tactic that SLAPPs—strategic lawsuits against public participation—represent, whereby oligarchs can frustrate genuine journalism or investigation. He is absolutely right to emphasise that.

It is important to highlight that controllers can already consider resource when refusing or charging a reasonable fee for a request. The Government do not wish to change that situation. Current ICO guidance sets out that controllers can consider resources as a factor when determining if a request is excessive.

The new parameters are not intended to be reasons for refusal. The Government expect that the new parameters will be considered individually as well as in relation to one another, and a controller should consider which parameters may be relevant when deciding how to respond to a request. For example, when the resource impact of responding would be minimal even if a large amount of information was requested—such as for a large organisation—that should be taken into account. Additionally, the current rights of appeal allow a data subject to contest a refusal and ultimately raise a complaint with the ICO. Those rights will not change with regard to individual rights requests.

Amendment 74 proposes adding more detail on the obligations of a controller who refuses or charges for a request from a data subject. The current legislation sets out that any request from a data subject, including subject access requests, is to be responded to. The Government are retaining that approach and controllers will be expected to demonstrate why the provision applies each time it is relied on. The current ICO guidance sets out those obligations on controllers and the Government do not plan to suggest a move away from that approach.

The clause also states that it is for the controller to show that a request is vexatious or excessive in circumstances where that might be in doubt. Thus, the Government believe that the existing legislation provides the necessary protections. Following the passage of the Bill, the Government will work with the ICO to update guidance on subject access requests, which we believe plays an important role and is the best way to achieve the intended effect of the amendments. For those reasons, I will not accept this group of amendments; I hope that the hon. Member for Barnsley East will be willing to withdraw them.

I turn to clause 7 itself. As I said, the UK’s data protection framework sets out key data subject rights, including the right of access—the right for a person to obtain a copy of their personal data. A subject access request is used when an individual requests their personal data from an organisation. The Government absolutely recognise the importance of the right of access and do not want to restrict that right for reasonable requests.

The existing legislation enables organisations to refuse or charge a reasonable fee for a request when they deem it to be “manifestly unfounded or excessive”. Some organisations, however, struggle to rely on that in cases where it may be appropriate to do so, which as a consequence impacts their ability to respond to reasonable requests.

The clause changes the legislation to allow controllers to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. The clause adds parameters for controllers to consider when relying on the “vexatious or excessive” exemption, such as the nature of the request and the relationship between the data subject and the controller. The clause also includes examples of the types of request that may be vexatious, such as those intended to cause distress, those not made in good faith or those that are an abuse of process.

We believe that the changes will give organisations much-needed clarity over when they can refuse or charge a reasonable fee for a request. That will ensure that controllers can focus on responding to reasonable requests, as well as other important data and organisational needs. I commend the clause to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I appreciate that, as the Minister said, the Government do not intend the new terms to be grounds for refusal, but his remarks do not reassure me that that will not be the case. Furthermore, as I said on moving the amendment, stakeholders such as the TUC, Public Law and Which? have all expressed concern that, as drafted, those terms are too subjective. I will press the amendment to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I expressed my thoughts on the value and importance of subject access requests when we debated clause 7, and most of the same views remain pertinent here. Clause 8 allows for subject access requests to be extended where the nature of the request is complex, or due to volume. Some civil society groups, including Reset.tech, have expressed concern that that could mean that requests are unduly delayed for months, reflecting concern that they could be disregarded altogether, which was discussed when we debated clause 7. With that in mind, can the Minister tell us what protections will be in place to ensure that data controllers do not abuse the new ability to extend subject access requests, particularly by using the excuse that it is a large amount of data, in order to delay requests that they simply do not wish to respond to?

The clause provides some clarity on clause 7 by demonstrating that just because a request is lengthy or comes in combination with many others, it is not necessarily excessive as the clause gives controllers the option to extend the timeframe for dealing with requests that are high in volume. Of course, we do not want to unnecessarily delay requests, but allowing controllers to manage their load within a reasonable extended timeframe can act as a safeguard against their automatically relying on the “excessive” threshold. With that in mind, I am happy for the clause to stand part. However, I reiterate that my comments on clause 7 should be referred to.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

May I briefly respond to the hon. Lady’s comments? I assure her that controllers will not be able to stop the clock for all subject access requests—only for those where they reasonably require further information to be able to proceed with responding. Once that information has been received from a data subject, the clock resumes and the controller must proceed with responding to the request within the applicable time period, which is usually one month from when the controller receives the request information. A data subject who has provided the requested information would also be able to complain to a controller, and ultimately to the Information Commissioner’s Office, if they feel that their request has not been processed within the appropriate time. I hope the hon. Lady will be assured that there are safeguards to ensure that this power is not abused.

Question put and agreed to.

Clause 8 accordingly ordered to stand part of the Bill.

Clause 9

Information to be provided to data subjects

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 9 provides researchers, archivists and those processing personal data for statistical purposes with a new exemption from providing certain information to individuals when they are reusing datasets for a different purpose, which will help to ensure that important research can continue unimpeded. The new exemption will apply when the data was collected directly from the individual, and can be used only when providing the additional information would involve a disproportionate effort. There is already an exemption from this requirement where the personal data was collected from a different source.

The clause also adds a non-exhaustive list of examples of factors that may constitute a disproportionate effort. This list is added to both the new exemption in article 13 and the existing exemption found in article 14. Articles 13 and 14 of the UK GDPR set out the information that must be provided to data subjects at the point of data collection: article 13 covers circumstances where data is directly collected from data subjects, and article 14 covers circumstances where personal data is collected indirectly—for example, via another organisation. The information that controllers must provide to individuals includes details such as the identity and contact details of the controller, the purposes of the processing and the lawful basis for processing the data.

Given the long-term nature of research, it is not always possible to meaningfully recontact individuals. Therefore, applying a disproportionate effort exemption addresses the specific problem of researchers wishing to reuse data collected directly from an individual. The exemption will help ensure that important research can continue unimpeded. The clause also makes some minor changes to article 14. Those do not amend the scope of the exemption or affect its operation, but make it easier to understand.

I now turn to clause 10, which introduces an exemption relating to legally professionally privileged data into the law enforcement regime, mirroring the existing exemptions under the UK GDPR and the intelligence services regime. As a fundamental principle of our legal system, legal professional privilege protects confidential communications between professional legal advisers and their clients. The existing exemption in the UK GDPR restricts an individual’s right to access personal data that is being processed or held by an organisation, and to receive certain information about that processing.

However, in the absence of an explicit exemption, organisations processing data under the law enforcement regime, for a law enforcement purpose rather than under the UK GDPR, must rely on ad hoc restrictions in the Data Protection Act. Those require them to evaluate and justify its use on a case-by-case basis, even where legal professional privilege is clearly applicable. The new exemption will make it simpler for organisations that process data for a law enforcement purpose to exempt legally privileged information, avoiding the need to justify the use of alternative exemptions. It will also clarify when such information can be withheld from the individual.

Hon. Members might wonder why an exemption for legal professional privilege was not included under the law enforcement regime of the Data Protection Act in the first place. The reason is that we faithfully transposed the EU law enforcement directive, which did not contain such an exemption. Following our exit from the EU, we are taking this opportunity to align better the UK GDPR and the law enforcement regime, thereby simplifying the obligations for organisations and clarifying the rules for individuals.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The impact of clause 9 and the concerns around it should primarily be understood in relation to the definition contained in clause 2, so I refer hon. Members to my remarks in the debate on clause 2. I also refer them to my remarks on purpose limitation in clause 6. To reiterate both in combination, I should say that purpose limitation exists so that it is clear why personal data is being collected, and what the intention is behind its use. That means that people’s data should not largely be reused in ways not initially collected for, unless a new legal basis is obtained.

It is understandable that, where genuine scientific, historical and statistical research is occurring, and there is disproportionate effort to provide the information required to data subjects, there may be a need for exemption and to reuse data without informing the subject. However, that must be done only where strictly necessary. We must be clear that, unless there are proper boundaries to the definition of scientific data, this could be interpreted far too loosely.

I am concerned that, without amendment to clause 2, clause 9 could extend the problem of scientific research being used as a guise for using people’s personal data in malicious or pseudoscientific ways. Will the Minister tell us what protections will be in place to ensure that people’s data is not reused on scientific grounds for something that they would otherwise have objected to?

On clause 10, I will speak more broadly on law enforcement processing later in the Bill, but it is good to have clarity on the legal professional privilege exemptions. I have no further comments at this stage.

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - - - Excerpts

What we are basically doing is changing the rights of individuals, who would previously have known when their data was used for a purpose other than that for which it was collected. The terms

“scientific or historical research, the purposes of archiving in the public interest or statistical purposes”

are very vague, and, according to the Public Law Project, open to wide interpretation. Scientific research is defined as

“any research that can reasonably described as scientific, whether publicly or privately funded”.

I ask the Minister: what protections are in place to ensure that private companies are not given, through this clause, a carte blanche to use personal data for the purpose of developing new products, without the need to inform the data subject?

Data Protection and Digital Information (No. 2) Bill (Fourth sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (Fourth sitting)

Stephanie Peacock Excerpts
Committee stage
Tuesday 16th May 2023

(11 months, 3 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 16 May 2023 - (16 May 2023)
John Whittingdale Portrait The Minister for Data and Digital Infrastructure (Sir John Whittingdale)
- Hansard - - - Excerpts

When the Committee adjourned this morning, I was nearly at my conclusion; I was responding to points made by the hon. Member for Barnsley East and by the hon. Member for Glasgow North West, who has not yet rejoined us. I was saying that the exemption applies where the data originally collected is historic, where to re-contact to obtain consent would require a disproportionate effort, and where that data could be of real value in scientific research. We think that there is a benefit to research and we are satisfied that the protection is there. There was some debate about the definition of scientific research, which we covered earlier; that is a point that is appealable to the Information Commissioner’s Office. On the basis of what I said earlier, and that assurance, I hope that the Committee will agree to the clause.

Question put and agreed to.

Clause 9 accordingly ordered to stand part of the Bill.

Clause 10 ordered to stand part of the Bill.

Clause 11

Automated decision-making

Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

I beg to move amendment 78, in clause 11, page 18, line 13, after “subject” insert “or decision subject”.

This amendment, together with Amendments 79 to 101, would apply the rights given to data subjects by this clause to decision subjects (see NC12).

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 79, in clause 11, page 18, line 15, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 80, in clause 11, page 18, line 16, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 81, in clause 11, page 18, line 27, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 82, in clause 11, page 18, line 31, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 83, in clause 11, page 19, line 4, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 84, in clause 11, page 19, line 7, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 85, in clause 11, page 19, line 11, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 86, in clause 11, page 19, line 12, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 87, in clause 11, page 19, line 13, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 88, in clause 11, page 19, line 15, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 89, in clause 11, page 19, line 17, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 90, in clause 11, page 19, line 26, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 91, in clause 11, page 20, line 8, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 92, in clause 11, page 20, line 10, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 93, in clause 11, page 20, line 12, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 94, in clause 11, page 20, line 23, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 95, in clause 11, page 20, line 28, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 96, in clause 11, page 20, line 31, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 97, in clause 11, page 20, line 35, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 98, in clause 11, page 20, line 37, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 99, in clause 11, page 20, line 39, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 100, in clause 11, page 21, line 1, leave out “data”.

See explanatory statement to Amendment 78.

Amendment 101, in clause 11, page 21, line 31, after “subject” insert “or decision subject”.

See explanatory statement to Amendment 78.

Amendment 106, in clause 27, page 47, line 27, after “subjects”, insert “decision subjects,”.

This amendment would require the ICO to have regard to decision subjects (see NC12) as well as data subjects as part of its obligations.

Amendment 108, in clause 29, page 53, line 11, at end insert—

“(ba) decision subjects;”.

This amendment, together with Amendments 109 and 110, would require codes of conduct produced by the ICO to have regard to decision subjects (see NC12) as well as data subjects.

Amendment 109, in clause 29, page 53, line 13, at end insert—

“(d) persons who appear to the Commissioner to represent the interests of decision subjects.”.

See explanatory statement to Amendment 108.

Amendment 110, in clause 29, page 53, line 21, after “subjects”, insert “, decision subjects”.

See explanatory statement to Amendment 108.

New clause 12—Decision subjects

“(1) The UK GDPR is amended as follows.

(2) In Article 4, after paragraph (A1), insert—

‘(A1A) “decision subject” means an identifiable individual who is subject to data-based and automated decision making;’”.

This new clause would provide a definition of “decision subjects”, enabling them to be given rights similar to those given to data subjects (see, for example, Amendment 78).

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am pleased to speak to new clause 12, which would insert a definition of decision subjects, and to amendments 79 to 101, 106 and 108 to 110, which seek to insert rights and considerations for decision subjects that mirror those of data subjects at various points throughout the Bill.

Most of our data protection legislation operates under the assumption that the only people affected by data-based and automated decision making are data subjects. The vast majority of protections available for citizens are therefore tied to being a data subject: an identifiable living person whose data has been used or processed. However, as Dr Jeni Tennison described repeatedly in evidence to the Committee, that assumption is unfortunately flawed. Although data subjects form the majority of those affected by data-based decision making, they are not the only group of people impacted. It is becoming increasingly common across healthcare, employment, education and digital platforms for algorithms created and trained on one set of people to be used to reach conclusions about another, wider set of people. That means that an algorithm can make an automated decision that affects an individual to a legal or similarly significant degree without having used their personal data specifically.

For example, as Connected by Data points out, an automated decision could be made about a neighbourhood area, such as a decision on gritting or a police patrol route, based on personal data about some of the people who live in that neighbourhood, with the outcome impacting even those residents and visitors whose data was not directly used. For those who are affected by the automated decision but are not data subjects, there is currently no protection, recognition or method of redress.

The new clause would therefore define the decision subjects who are impacted by the likes of AI without their data having been used, in the hope that we can give them protections throughout the Bill that are equal to those for data subjects, where appropriate. That is especially important because special category data is subject to stricter safeguards for data subjects but not for decision subjects.

Connected by Data illustrates that point using the following example. Imagine a profiling company that uses special category data about the mental health of some volunteers to construct a model that predicts mental health conditions based on social media feeds, which would not be special category data. From that information, the company could give an estimate of how much time people are likely to take off work. A recruitment agency could then use that model to assess candidates and reject those who are likely to have extended absences. The model would never use any special category data about the candidates directly, but those candidates would have been subject to an automated decision that made assumptions about their own special category data, based on their social media feeds. In that scenario, by virtue of being a decision subject, the individual would not have the right to the same safeguards as those who were data subjects.

Furthermore, there might be scenarios in which someone was subject to an automated decision despite having consciously prevented their personal data from being shared. Connected by Data illustrates that point by suggesting that we consider a person who has set their preferences on their web browser so that it does not retain tracking cookies or share information such as their location when they visit an online service. If the online service has collected data about the purchasing patterns of similarly anonymous users and knows that such a customer is willing to pay more for the service, it may automatically provide a personalised price on that basis. Again, no personal data about the purchaser will have been used in determining the price that they are offered, but they will still be subject to an automated decision based on the data of other people like them.

What those scenarios illustrate is that it is whether an automated decision affects an individual in a legal or similarly significant way that should be central to their rights, rather than whether any personal data is held about them. If the Bill wants to unlock innovation around AI, automated decisions and the creative use of data, it is only fair that that be balanced by ensuring that all those affected by such uses are properly protected should they need to seek redress.

This group of amendments would help our legislative framework to address the impact of AI, rather than just its inputs. The various amendments to clause 11 would extend to decision subjects rights that mirror those given to data subjects regarding automated decision making, such as the right to be informed, the right to safeguards such as contesting a decision and the right to seek human intervention. Likewise, the amendments to clauses 27 and 29 would ensure that the ICO is obliged to have regard to decision subjects both generally and when producing codes of conduct.

Finally, to enact the safeguards to which decision subjects would hopefully be entitled via the amendments to clause 11, the amendment to clause 39 would allow decision subjects to make complaints to data controllers, mirroring the rights available to data subjects. Without defining decision subjects in law, that would not be possible, and members of the general public could be left without the rights that they deserve.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am very much aware of the concern about automated decision making. The Government share the wish of the hon. Member for Barnsley East for all those who may be affected to be given protection. Where I think we differ is that we do not recognise the distinction that she tries to make between data subjects and decision subjects, which forms the basis of her amendments.

The hon. Lady’s amendments would introduce to the UK GDPR a definition of the term “decision subject”, which would refer to an identifiable individual subject to data- based and automated decision making, to be distinguished from the existing term “data subject”. The intended effect is to extend the requirements associated with provisions related to decisions taken about an individual using personal data to those about whom decisions are taken, even though personal information about them is not held or used to take a decision. It would hence apply to the safeguards available to individuals where significant decisions are taken about them solely through automated means, as amendments 78 to 101 call for, and to the duties of the Information Commissioner to have due regard to decision subjects in addition to data subjects, as part of the obligations imposed under amendment 106.

I suggest to the hon. Lady, however, that the existing reference to data subjects already covers decision subjects, which are, if you like, a sub-group of data subjects. That is because even if an individual’s personal data is not used to inform the decision taken about them, the fact that they are identifiable through the personal data that is held makes them data subjects. The term “data subject” is broad and already captures the decision subjects described in the hon. Lady’s amendment, as the identification of a decision subject would make them a data subject.

I will not, at this point, go on to set out the Government’s wider approach to the use of artificial intelligence, because that is somewhat outside the scope of the Bill and has already been set out in the White Paper, which is currently under consultation. Nevertheless, it is within that framework that we need to address all these issues.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Essentially, if anybody is affected by automated decision making on the basis of the characteristics of another person whose data is held—in other words, if the same data is used to take a decision that affects them, even if it does not personally apply to them—they are indeed within the broader definition of a data subject. With that reassurance, I hope that the hon. Member for Barnsley East will consider withdrawing her amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I appreciate the Minister’s comments, but the point is that the data could be used—I gave the example that it might affect a group of residents who were not identifiable but were still subject to that data—so I am not quite sure that I agree with the Minister’s comparison. As the use of automated decision making evolves and expands, it is crucial that even if a person’s data is not being used directly, they are afforded protections and rights if they are subject to the outcome. I would like to press my amendment to a vote.

Question put, That the amendment be made.

Division 11

Ayes: 7

Noes: 10

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 77, in clause 11, page 19, line 12, at end insert

“and about the safeguards available to the subject in accordance with this paragraph and any regulations under Article 22D(4);”.

This amendment would require controllers proactively to provide data subjects with information about their rights in relation to automated decision-making.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 120, in clause 11, page 19, line 12, at end insert—

“(aa) require the controller to inform the data subject when a decision described in paragraph 1 has been taken in relation to the data subject;”.

This amendment would require a data controller to inform a data subject whenever a significant decision about that subject based entirely or partly on personal data was taken based solely on automated processing.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

New article 22C of the UK GDPR, inserted by clause 11, sets out the safeguards available to those who are subject to automated decision making. One such safeguard is that controllers must provide information to subjects relating to significant decisions taken through solely automated processing. That includes notifying subjects when a decision has been taken or informing them of the logic involved in producing that decision.

That provision is important. After all, how can the subject of an automated decision possibly exercise their other rights surrounding that decision if they do not even know that it has been taken on a solely automated basis? By the same logic, however, the average member of the general public is not likely to be aware of those other rights in the first place, including the rights to express their point of view with respect to automated decisions, to contest them and to seek human intervention.

Amendment 77 therefore recommends that as well as controllers being required to inform subjects about the decision, the same notice should be used as a vehicle to ensure that the subject is aware of the rights and safeguards in place to protect them and offer them redress. It would require no extra administrative effort on behalf of the controllers, because they will already be informing subjects. A proactive offer of redress may also encourage controllers to have extra regard to the way in which their automated systems are operating, in order to avoid unlawful activity that may cause them to receive a complaint or a request for human intervention.

An imbalance of power between those who conduct automated decisions and those who are subject to them already largely exists. Those who conduct decisions hold the collective power of the data, whereas each individual subject to a decision has only their own personal information; I will address that issue in greater detail in relation to other amendments, but there is no reason why that power imbalance should be exacerbated by hiding an individual’s own rights from them. If the intention of new article 22C is, as stated, to ensure that controllers are required to review and correct decisions that have produced a systematically wrongful outcome, there should be no issue with ensuring that the mechanism is properly communicated to the people it purports to serve. I am pleased to see that the hon. Member for Glasgow North West has tabled a similar amendment.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

They would obviously have that right, and indeed they would ultimately have the right to appeal to the Information Commissioner if they felt that they had been subjected unfairly to a decision where they had not been properly informed of the fact. On the basis of what I have said, I hope the hon. Member for Barnsley East might withdraw her amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I appreciate the Minister’s comment, but the Government protection does not go as far as we would like. Our amendment speaks to the potential imbalance of power in the use of data and it would not require any extra administrative effort on behalf of controllers. For that reason, I will press it to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Carol Monaghan Portrait Carol Monaghan
- Hansard - - - Excerpts

I will not move it formally, Mr Hollobone, but I may bring it back on Report.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 76, in clause 11, page 19, line 34, at end insert—

“5A. The Secretary of State may not make regulations under paragraph 5 unless—

(a) following consultation with such persons as the Secretary of State considers appropriate, the Secretary of State has published an assessment of the impact of the change to be made by the regulations on the rights and freedoms of data and decision subjects (with particular reference to children),

(b) the Commissioner has reviewed the Secretary of State’s statement and published a statement of the Commissioner’s views on whether the change should be made, with reasons, and

(c) the Secretary of State has considered whether to proceed with the change in the light of the Commissioner’s statement.”

This amendment would make the Secretary of State’s ability to amend the safeguards for automated decision-making set out in new Articles 22A to D subject to a requirement for consultation with interested parties and with the Information Commissioner, who would be required to publish their views on any proposed change.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 75, in clause 11, page 19, line 36, at end insert—

“7. The Commissioner must prepare a code of practice under section 124A of the Data Protection Act 2018 on the interpretation of references in this Regulation to “meaningful human involvement” and “similarly significant”.

8. The code of practice prepared under paragraph 7 must include examples of the kinds of processing which do, and which do not, fall within the definitions which use the terms referred to in that paragraph.”

This amendment would require the ICO to produce a code of practice on the interpretation of references to “meaningful human involvement” and “similarly significant” in connection with automated decision-making, with examples of the kinds of processing that would not count as falling within these definitions.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I will begin by discussing amendment 76 in the context of the general principles of this clause. The rise of AI and algorithmic decision making has happened at an unprecedented speed—so much so, in fact, that when the first version of this Bill was published, the likes of ChatGPT were not even launched yet. Now we live in a world where the majority of people across the country have been affected by or have used some form of AI-based or automated decision-making system.

When algorithms and automation work well, not only do they reduce administrative burdens, increase efficiency and free up capacity for further innovation and growth; they can also have remarkable outcomes. Indeed, PwC UK suggests that UK GDP could be up to 10.3% higher in 2030 as a result of artificial intelligence. AI is already being used to develop vaccines and medicines, for example, which are saving lives across the country and the entire world. Labour’s belief, outlined in our industrial strategy, is that the UK should be leading the world on efforts to ensure that transformative AI is aligned with the public interest in that way, and that regulations ensure we are well positioned to do that.

Despite the potential of AI to be harnessed for the public good, however, where things go wrong, the harms can be serious. The first way in which automation is prone to go wrong is by producing discriminatory outcomes. An algorithm, although intelligent in itself, is only ever as fair as the information and the people used to train it. That means that where biases exist in our world, they can become entrenched in our automated systems too. In in 2020, thousands of students in England and Wales received A-level exam results where, due to the pandemic, their grades were determined by an algorithm rather than by sitting an exam. At the hands of the automated system, almost 40% of students received grades lower than they had anticipated, with pupils from certain backgrounds and areas such as those that I represent disproportionately impacted by the lower marks. Within days of the results being published, there was widespread public outcry about the distress caused, as well as threats of mass protests and legal action. Similarly, Amazon was reported to have used an AI tool that systematically penalised women in job application processes. The tool had been trained on a decade’s worth of CVs, predominantly submitted by men. As such examples show, AI on its own can produce discriminatory outcomes. Our regulation must therefore recognise that and seek to protect against it.

The second major way in which automated decision making tends to go wrong, or can be abused, is when it makes legal or critical decisions about our lives based on mismanaged, abused or faulty systems. In the most extreme cases, automated systems can even contribute to deciding whether someone’s employment will be terminated, with grave consequences when that goes wrong. As mentioned in the oral evidence sessions, for example, last month the courts upheld the finding that three UK-based Uber drivers were robotically fired without redress, having been accused of fraudulent activity on the basis of an automated detection system. The court found that human involvement in the firing process was

“not much more than a purely symbolic act”,

and that implementing such a decision without a mechanism for appeal was unjust. Where livelihoods are at risk, data regulation must ensure that proper safeguards are in place to protect against mismanaged and faulty automated systems.

Serious harms sometimes occur under the existing system, but there are laws under the GDPR that try to protect us against discriminatory outcomes and mismanagement. Indeed, article 21 of GDPR gives a data subject the right to object at any time to the processing of their personal data, unless the controller can demonstrate “compelling legitimate grounds” for the processing to override the data subject’s rights. In conjunction, article 22 prevents data subjects from being subject to a decision based solely on automated processing that has significant effects, except in a few circumstances, including when it is based on explicit consent and does not rely on special categories of data. In all cases where automated decision making is allowed, suitable measures to safeguard the data subjects’ rights and freedoms must also be implemented.

Albeit from different perspectives, stakeholders from techUK to the TUC have emphasised the importance of those articles and of the core principles that they promote. For example, the articles place an element of control in the hands of those that an automated decision affects. They emphasise the need for appropriate safeguards, and they consider the need for a different approach where sensitive data is concerned.

Where the clause adjusts the threshold on automated decision making to unlock innovation, therefore—as the likes of the A-level algorithm scandal and the robo- firings show—it is vital that any changes to regulation maintain and in some cases strengthen the principles set out in articles 21 and 22 of the GDPR. However, as the likes of the Ada Lovelace Institute, Which? and the TUC warn, in reality the Bill does the opposite, watering down existing protections. The amendments I have tabled are designed to rectify that.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The hon. Lady began her remarks on the broader question of the ambition to ensure that the UK benefits to the maximum extent from the use of artificial intelligence. We absolutely share that ambition, but also agree that it needs to be regulated. That is why we have published the AI regulation White Paper, which suggests that it is most appropriate that each individual regulator should develop its own rules on how that should apply. I think in the case that she was quoting of those who had lost their jobs, maybe through an automated process, the appropriate regulator—in that case, presumably, the special employment tribunal —would need to develop its own mechanism for adjudicating decisions.

I will concentrate on the amendment. On amendment 76, we feel that clause 44 already provides for an overarching requirement on the Secretary of State to consult the Information Commissioner and other persons that she or he considers appropriate before making regulations under UK GDPR, including the measures in article 22. When the new clause 44 powers are used in reference to article 22 provisions, they will be subject to the affirmative procedure in Parliament. I know that the hon. Lady is not wholly persuaded of the merits of using the affirmative procedure, but it does mean that parliamentary approval will be required. Given the level of that scrutiny, we do not think it is necessary for the Secretary of State to have to publish an assessment, as the hon. Lady would require through her amendment.

On amendment 75, as we have already debated in relation to previous amendments, there are situations where non-statutory guidance, which can be produced without being requested under regulations made by the Secretary of State, may be more appropriate than a statutory code of practice. We believe that examples of the kinds of processing that do and do not fall within the definitions of the terms “meaningful human involvement” and “similarly significant” are best placed in non-statutory guidance produced by the ICO, as this will give the flexibility to amend and change the examples where necessary. What constitutes a significant decision or meaningful human involvement is often highly context-specific, and the current wording allows for some inter-pretability to enable the appropriate application of this provision in different contexts, rather than introducing an absolute definition that risks excluding decisions that ought to fall within this provision and vice versa. For that reason, we are not minded to accept the amendments.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I appreciate the Minister’s remarks about consultation and consulting relevant experts. He is right to observe that I am not a big fan of the affirmative procedure as a method of parliamentary scrutiny but I appreciate that it is included in this Bill as part of that.

I think the problem is that we fundamentally disagree on the power to change these definitions being concentrated in the hands of the Secretary of State. It is one thing to future-proof the Bill but another to allow the Secretary of State alone to amend things as fundamental as the safeguards offered here. I would therefore like to proceed to a vote.

Question put, That the amendment be made.

--- Later in debate ---

Division 14

Ayes: 6

Noes: 10

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 121, in clause 11, page 19, line 36, at end insert—

“7. When exercising the power to make regulations under this Article, the Secretary of State must have regard to the following statement of principles:

Digital information principles at work

1. People should have access to a fair, inclusive and trustworthy digital environment at work.

2. Algorithmic systems should be designed and used to achieve better outcomes: to make work better, not worse, and not for surveillance. Workers and their representatives should be involved in this process.

3. People should be protected from unsafe, unaccountable and ineffective algorithmic systems at work. Impacts on individuals and groups must be assessed in advance and monitored, with reasonable and proportionate steps taken.

4. Algorithmic systems should not harm workers’ mental or physical health, or integrity.

5. Workers and their representatives should always know when an algorithmic system is being used, how and why it is being used, and what impacts it may have on them or their work.

6. Workers and their representatives should be involved in meaningful consultation before and during use of an algorithmic system that may significantly impact work or people.

7. Workers should have control over their own data and digital information collected about them at work.

8. Workers and their representatives should always have an opportunity for human contact, review and redress when an algorithmic system is used at work where it may significantly impact work or people. This includes a right to a written explanation when a decision is made.

9. Workers and their representatives should be able to use their data and digital technologies for contact and association to improve work quality and conditions.

10. Workers should be supported to build the information, literacy and skills needed to fulfil their capabilities through work transitions.”

This amendment would insert into new Article 22D of the UK GDPR a requirement for the Secretary of State to have regard to the statement of digital information principles at work when making regulations about automated decision-making.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 122, in clause 11, page 22, line 2, at end insert—

“(7) When exercising the power to make regulations under this section, the Secretary of State must have regard to the following statement of principles:

Digital information principles at work

1. People should have access to a fair, inclusive and trustworthy digital environment at work.

2. Algorithmic systems should be designed and used to achieve better outcomes: to make work better, not worse, and not for surveillance. Workers and their representatives should be involved in this process.

3. People should be protected from unsafe, unaccountable and ineffective algorithmic systems at work. Impacts on individuals and groups must be assessed in advance and monitored, with reasonable and proportionate steps taken.

4. Algorithmic systems should not harm workers’ mental or physical health, or integrity.

5. Workers and their representatives should always know when an algorithmic system is being used, how and why it is being used, and what impacts it may have on them or their work.

6. Workers and their representatives should be involved in meaningful consultation before and during use of an algorithmic system that may significantly impact work or people.

7. Workers should have control over their own data and digital information collected about them at work.

8. Workers and their representatives should always have an opportunity for human contact, review and redress when an algorithmic system is used at work where it may significantly impact work or people. This includes a right to a written explanation when a decision is made.

9. Workers and their representatives should be able to use their data and digital technologies for contact and association to improve work quality and conditions.

10. Workers should be supported to build the information, literacy and skills needed to fulfil their capabilities through work transitions.”

This amendment would insert into new section 50D of the DPA2018 a requirement for the Secretary of State to have regard to the statement of digital information principles at work when making regulations about automated decision-making.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Amendments 121 and 122 would ensure that close attention is paid to the specific and unique circumstances of workers and the workplace when regulations are made under the clause. Indeed, as has already been referenced, the workplace has dramatically evolved in the last decade with the introduction and growth of technology. Whether it be Royal Mail using the postal digital assistant service to calculate the length of time posties spend walking, on doorsteps and standing still, or Amazon collecting data from handheld scanners to calculate how much time workers are spending “off task”, the digital monitoring of workers and subsequent use of that data by managers to assess performance, allocate work hours and decide on levels of pay, is on the rise.

Of course it is absolutely right that workplaces embrace technology. As Andrew Pakes of Prospect said to this Committee, our economy and the jobs that people do each day can be made better and more productive through the good deployment of technology—but the key is in the phrase “good deployment”, and in order to have deployment that works for the greater good, the rights and protections in place at work must keep pace with the changing nature of the workplace and these technological advancements. As Labour outlined in our industrial strategy, we want to do just that: harness data for the public good and ensure that data and the innovation it brings with it benefit our wider society, not just large corporations. Further, as is written in our “New Deal for Working People”, Labour wants to introduce new rights to protect workers in the modern age—for example by legislating to make proposals to introduce surveillance technologies subject to consultation and agreement of trade unions, or elected staff representatives where there is no trade union. After all, we can only truly unlock the benefits of data and become a world leader in this space if there is genuine public trust in these technologies. Good regulation breeds that trust.

Currently, however, and particularly in the Bill, the kinds of measures that would allow for good deployment of technology in the workplace—technology that operates in the greater interest including that of workers—are missing from the Government’s plans. Instead, as the TUC note, we are overseeing a growing power imbalance between worker and employer. This imbalance not only exists by the nature of the relationship, but it is now being exacerbated by the increasing level of knowledge and control that employers have over personal data as the workplace becomes digitised, compared with workers, who have very little power over, expertise on or access to such data.

Some impressive projects have sought to address that imbalance. For example, in 2020 Prospect worked with a coalition of unions, tech specialists and researchers to launch a beta version of WeClock, a free mobile app that helps workers to track and manage their own data such as that related to their location, their commute and when they are doing work on their phone. Those data profiles could then potentially be used by trade union campaigners to improve rights for workers. However, it should not just be down to individual projects to ensure that there is an equal balance between worker and employer. The Bill is a huge missed opportunity to write into law this balance and the principles that we should consider with regard to worker’s rights in the modern age.

The amendment, which has been prepared in partnership with the Institute for the Future of Work, is designed to right that wrong and ensure that where regulations are made about automated decision making, the full impact on workers is considered and strong principles about worker involvement are upheld. It will mean that the Secretary of State has to consider that people have an inclusive digital environment at work, that they should be protected from harms by algorithmic systems, and that they should be meaningfully consulted before and after the use of such tools. Further, under this amendment, consideration will be given to supporting workers in building the information, literacy and skills needed to understand these transitions in the workplace, thereby addressing some of the imbalances in knowledge and understanding.

I will end with an example of the real-life consequences of employment and data laws lagging behind technology. As was revealed by a report by the Worker Info Exchange just last month, 11 Just Eat couriers in the UK were recently robotically fired after receiving allegations of fraudulent activity identified by an automated system. According to the report, these workers were falsely accused of receiving “undeserved financial gain” relating to nominal waiting time payments at restaurants. Just Eat argued that the workers left the restaurant while continuing to claim waiting fees. However, GPS evidence showed that workers had stayed in the vicinity of the restaurant, usually in the car park. In each case, the worker collected the food and completed the delivery, and the average value of the alleged undeserved payments justifying the robo-firings was just £1.44. Cases such as those, in which real livelihoods are impacted and rights infringed for the sake of profit margins, can and must be avoided.

The amendment would take the first steps in ensuring that regulations around automated decision making centre the unique experience of workers. It also highlights the Bill’s failure to move towards a legislative framework in which a distinct focus is placed on harnessing data for the public good, which is something that Labour would have placed at the heart of a data Bill such as this one.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As I was Chair of the Culture, Media and Sport Committee in 2008 when we published a report calling for legislation on online safety, I recognise the hon. Lady’s point that these things take a long time—indeed, far too long—to come about. She calls for action now on governance and regulation of the use of artificial intelligence. She will know that last month the Government published the AI regulation White Paper, which set out the proposals for a proportionate outcomes-focused approach with a set of principles that she would recognise and welcome. They include fairness, transparency and explainability, and we feel that this has the potential to address the risks of possible bias and discrimination that concern us all. As she knows, the White Paper is currently out to consultation, and I hope that she and others will take advantage of that to respond. They will have until 21 June to do so.

I assure the hon. Lady and the hon. Member for Barnsley East that the Government are keenly aware of the need to move swiftly, but we want to do so in consultation with all those affected. The Bill looks at one relatively narrow aspect of the use of AI, but certainly the Government’s general approach is one that we are developing at pace, and we will obviously respond once the consultation has been completed.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The power imbalance between employer and worker has no doubt grown wider as technology has developed. Our amendment speaks to the real-life consequences of that, and to what happens when employment and data law lags behind technology. For the reasons that have been outlined by my hon. Friend the Member for Newcastle upon Tyne Central and myself, I would like to continue with my amendment.

Question put, That the amendment be made.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

We have, I think, covered a lot of ground already in the debates on the amendments. To recap, clause 11 reforms the rules relating to automated decision making in article 22 of the UK GDP and relevant sections of the Data Protection Act 2018. It expands the lawful grounds on which solely automated decision making that produces a legal or similarly significant effect on an individual may be carried out.

Currently, article 22 of the UK GDPR restricts such activity to a narrow set of circumstances. By expanding the available lawful grounds and ensuring we are clear about the required safeguards, these reforms will boost confidence that the responsible use of this technology is lawful, and will reduce barriers to responsible data use.

The clause makes it clear that solely automated decisions are those that do not involve any meaningful human involvement. It ensures that there are appropriate constraints on the use of sensitive personal data for solely automated decisions, and that such activities are carried out in a fair and transparent manner, providing individuals with key safeguards.

The clause provides three powers to the Secretary of State. The first enables the Secretary of State to describe cases where there is or is not meaningful human involvement in the taking of a decision. The second enables the Secretary of State to further describe what is and is not to be taken as having a significant effect on an individual. The third enables the introduction of further safeguards, and allows those already set out in the reforms to be amended but not removed.

The reformed section 50 of the Data Protection Act mirrors the changes in subsection (1) for solely automated decision making by law enforcement agencies for a law enforcement purpose, with a few differences. First, in contrast to article 22, the rules on automated decision making apply only where such decisions have an adverse legal or similarly significant effect on the individual. Secondly, the processing of sensitive personal data cannot be carried out for the purposes of entering into a contract with the data subject for law enforcement purposes.

The final difference relates to the safeguards for processing. This clause replicates the UK GDPR safeguards for law enforcement processing but also allows a controller to apply an exemption to them where it is necessary for a particular reason, such as to avoid obstructing an inquiry. This exemption is available only where the decision taken by automated means is reconsidered by a human as soon as reasonably practicable.

The subsections amending relevant sections of the Data Protection Act 2018, which apply to processing by or on behalf of the intelligence services, clarify that requirements apply to decisions that are entirely automated, rather than solely automated. They also define what constitutes a decision based on this processing. I have explained the provisions of the clause, and hope the Committee will feel able to accept it.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I talked at length about my views about the changes to automated decision making when we debated amendments 77, 120, 76, 75, 121 and 122. I have nothing further to add at this stage, but those concerns still stand. As such, I cannot support this clause.

Question put, That the clause stand part of the Bill.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I can be reasonably brief on these amendments. Schedule 3 sets out the consequential changes needed to reflect references to the rules on automated decision making in reformed article 22 and section 50 and other provisions in the UK GDPR and the Data Protection Act 2018. Schedule 3 also sets out that section 14 of the Data Protection Act is repealed. Instead, reformed article 22 sets out the safeguards that must apply, regardless of the lawful ground on which such activity is carried out.

Government amendments 17 to 23 are minor technical amendments ensuring that references elsewhere in the UK GDPR and the Data Protection Act to the provisions on automated decision making are comprehensively updated to reflect the reforms related to such activity in this Bill. That means that references to article 22 UK GDPR are updated to the reformed article 22A to 22D provisions, and references to sections 49 and 50 in the Data Protection Act are updated to the appropriate new sections 50A to 50D.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I thank the Minister for outlining these technical changes. I have nothing further to add on these consequential amendments beyond what has already been discussed on clause 11 and the rules around automated decision making. Consistency across the statute book is important, but all the concerns I raised when discussing the substance of those changes remain.

Amendment 17 agreed to.

Amendments made: 18, in schedule 3, page 140, line 30, before second “in” insert “provided for”.

This amendment and Amendment 19 adjust consequential amendments of Article 23(1) of the UK GDPR for consistency with other amendments of the UK GDPR consequential on the insertion of new Articles 22A to 22D.

Amendment 19, in schedule 3, page 140, line 31, leave out “in or under” and insert

“arising under or by virtue of”.

See the explanatory statement for Amendment 18.

Amendment 20, in schedule 3, page 140, line 33, leave out from “protection” to end of line 35 and insert

“in accordance with, and with regulations made under, Articles 22A to 22D in connection with decisions based solely on automated processing (including decisions reached by means of profiling)”.

This amendment adjusts the consequential amendment of Article 47(2)(e) of the UK GDPR to reflect the way in which profiling is required to be taken into account for the purposes of provisions about automated decision-making (see Article 22A(2) inserted by clause 11).

Amendment 21, in schedule 3, page 140, line 36, leave out paragraph 10 and insert—

“10 In Article 83(5) (general conditions for imposing administrative fines)—

(a) in point (b), for “22” substitute “21”, and

(b) after that point insert—

“(ba) Article 22B or 22C (restrictions on, and safeguards for, automated decision-making);””.

This amendment adjusts the consequential amendment of Art 83(5) of the UK GDPR (maximum amount of penalty) for consistency with the consequential amendment of equivalent provision in section 157(2) of the Data Protection Act 2018.

Amendment 22, in schedule 3, page 141, line 8, leave out sub-paragraph (2) and insert—

“(2) In subsection (3), for “by the data subject under section 45, 46, 47 or 50” substitute “made by the data subject under or by virtue of any of sections 45, 46, 47, 50C or 50D”.”.

This amendment adjusts the consequential amendment of section 52(3) of the Data Protection Act 2018 for consistency with other amendments of that Act consequential on the insertion of new sections 50A to 50D.

Amendment 23, in schedule 3, page 141, line 9, leave out sub-paragraph (3) and insert—

“(3) In subsection (6), for “under sections 45 to 50” substitute “arising under or by virtue of sections 45 to 50D””.—(Sir John Whittingdale.)

This amendment adjusts the consequential amendment of section 52(6) of the Data Protection Act 2018 for consistency with other amendments of that Act consequential on the insertion of new sections 50A to 50D.

Schedule 3, as amended, agreed to.

Clause 12

General obligations

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

One of the main criticisms that the Government have received of the current legislative framework is that it sets out a number of prescriptive requirements that organisations must satisfy to demonstrate compliance. They include appointing independent data protection officers, keeping records of processing, appointing UK representatives, carrying out impact assessments and consulting the ICO about intended processing activities in specified circumstances.

Those rules can sometimes generate a significant and disproportionate administrative burden, particularly for small and medium-sized enterprises and for some third sector organisations. The current framework provides some limited exemptions for small businesses and organisations that are carrying out low-risk processing activities, but they are not always as clear or as useful as they should be.

We are therefore taking the opportunity to improve chapter 4 of the UK GDPR, and the equivalent provisions in part 3 of the Data Protection Act, in respect of law enforcement processing. Those provisions deal with the policies and procedures that organisations and law enforcement organisations must put in place to monitor and ensure compliance. Clauses 12 to 20 will give organisations greater flexibility to implement data protection management programmes that work for their organisations, while maintaining high standards of data protection for individuals.

Clause 12 is technical in nature. It will improve the terminology in the relevant articles of the UK GDPR by replacing the requirement to implement

“appropriate technical and organisational measures”.

In its place, data protection risks must be managed with

“appropriate measures, including technical and organisational measures,”.

That will give organisations greater flexibility to implement any measures that they consider appropriate to help them manage risks. A similar clarification is made to equivalent parts of the Data Protection Act.

Clause 13 will remove article 27 of the UK GDPR, ending the requirement for overseas controllers or processors to appoint a representative in the UK where they offer goods or services to, or monitor the behaviour of, UK citizens—

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I think I have covered the points that I would like to make on clause 12.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Clause 12 is a set of largely technical amendments to terminology that I hope will provide clarity to data controllers and processors. I have no further comments to make at this stage.

Question put and agreed to.

Clause 12 accordingly ordered to stand part of the Bill.

Clause 13

Removal of requirement for representatives for controllers etc outside the UK

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As I was saying, clause 13 will remove article 27 of the UK GDPR, ending the requirement for overseas controllers or processors to appoint a representative in the UK where they offer goods or services to, or monitor the behaviour of, UK citizens. By no longer mandating organisations to appoint a representative, we will be allowing organisations to decide for themselves the best way to comply with the requirements for effective communication. That may still include the appointment of a UK-based representative. The removal of this requirement is therefore in line with the Bill’s wider strategic aim of removing unnecessary prescriptive regulation.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The rules set out in the UK GDPR apply to all those who are active in the UK market, regardless of whether their organisation is based or located in the UK. Article 27 of the UK GDPR currently requires controllers and processors based outside the UK to designate a UK-based representative, unless they process only occasionally without special categories of data, providing an element of proportionality, or are a public authority or body. The idea is that the representative will act on behalf of the controller or processor regarding their UK GDPR compliance and will deal with the ICO and data subjects in that respect, acting as a primary contact for all things data within the country.

The removal of the requirement for a UK representative was not included in the Government’s consultation, “Data: a new direction”, nor was it even mentioned in their response. As a result, stakeholders have not been given an opportunity to put forward their opinions on this change. I wish to represent some of those opinions so that they are on the record for the Minister and his Department to consider.

Concern among the likes of Lexology, DataRep and Which? relates primarily to the fact that the current requirements for UK-based representatives ensure that UK data subjects can conveniently reach the companies that process their personal data, so that they can exercise their rights under the GDPR. Overseas data handlers may have a different first language, operate in a different time zone or have local methods of contact that are not easily accessible from the UK. Having a UK-based point of contact therefore ensures that data subjects do not struggle to apply the rights to which they are entitled because of the inevitable differences that occur across international borders.

As Lexology has pointed out, the Government’s own impact assessment says:

“There is limited information and data on the benefits of having an Article 27 representative as it is a relatively new and untested requirement and also one that applies exclusively to businesses and organisations outside of the UK which makes gathering evidence very challenging.”

By their own admission, then, the Government seem to recognise the challenges in gathering information from organisations outside the UK. If the Government find it difficult to get the information that they require, surely average citizens and data subjects may also face difficulties.

Not only is having a point of contact a direct benefit for data subjects, but a good UK representative indirectly helps data subjects by facilitating a culture of good data protection practice in the organisation that they represent. For example, they may be able to translate complex legal concepts into practical business terms or train fellow employees in a general understanding of the UK GDPR. Such functions may make it less likely that a data subject will need to exercise their rights in the first place.

As well as things being harder for data subjects in the ways I have outlined, stakeholders are not clear about the benefits of removing representatives for UK businesses. For example, the Government impact assessment estimates that the change could save a large organisation £50,000 per year, but stakeholders have said that that figure is an overestimation. Even if the figure is accurate, the saving will apply only to organisations outside the UK and will be made through a loss of employment for those who are actually based in the UK and performing the job.

The question therefore remains: if the clause is not in the interests of data subjects, of UK businesses or of UK-based employees who act as representatives, how will this country actually benefit from the change? I am keen to hear from the Minister on that point.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

If there are concerns that were not fed in during the consultation period, obviously we will consider them. However, it remains the case that even without the article 27 representative requirement, controllers will have to maintain contact with UK citizens and co-operate with the ICO under other provisions of the UK GDPR. For example, overseas controllers and processors must still co-operate with the ICO as a result of the specific requirements to do so under article 31 of the UK GDPR. To answer the hon. Lady’s question about where the benefit lies, the clause is part of a streamlining process to remove what we see as unnecessary administrative requirements and bureaucracy.

Question put and agreed to.

Clause 13 accordingly ordered to stand part of the Bill.

Clause 14

Senior responsible individual

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

In a number of places in the Bill, the Government have focused on trying to ensure a more proportionate approach to data protection. That often takes the form of reducing regulatory requirements on controllers and processors where low-risk processing, which presents less of a threat of harm to data subjects, is taking place. Clause 14 is one place in which Ministers have applied that principle, replacing data protection officers with a requirement to appoint a senior responsible individual, but only where high-risk processing is being carried out.

Such a proportionate approach makes sense in theory. Where the stakes are lower, less formalised oversight of GDPR compliance will be required, which will be particularly helpful in small business settings where margins and resources are tight. Where the stakes are higher, however, a senior responsible individual will have a similar duty to that of a data protection officer, but with the added benefit of being part of the senior leadership team, ensuring that data protection is considered at the highest level of organisations conducting high-risk processing.

However, the Government have admitted that the majority of respondents to their consultation disagreed with the proposal to remove the requirement to designate a data protection officer. In particular, respondents were concerned that removing DPOs would result in

“a loss of data protection expertise”

and

“a potential fall in trust and reassurance to data subjects.”

Indeed, data protection officers perform a vital role in upholding GDPR, taking on responsibility for informing people of their obligations; monitoring compliance, including raising awareness and training staff; providing advice, where requested, on data protection impact assessments; co-operating with the regulator; and acting as a contact point. That provides not only guaranteed expertise to organisations, but reassurance to data subjects that they will have someone to approach should they feel the need to exercise any of their rights under the GDPR.

The contradiction between the theory of the benefits of proportionality and the reality of the concerns expressed by respondents to the consultation emphasises a point that the Government have repeatedly forgotten throughout the Bill: although removing truly unnecessary burdens can sometimes be positive, organisations often want clear regulation more than they want less regulation. They believe in the principles of the GDPR, understand the value of rights to data subjects and often over-comply with regulation out of fear of breaking the rules.

In this context, it makes sense that organisations recognise the value of having a data protection officer. They actually want in-house expertise on data—someone they can ask questions and someone they can rely on to ensure their compliance. Indeed, according to the DPO Centre, in September 2022, the UK data protection index panel of 523 DPOs unequivocally disagreed with the idea that the changes made by the clause would be in the best interests of data subjects. Furthermore, when asked whether the proposal to remove the requirement for a DPO and replace it with a requirement for a senior responsible individual would simplify the management of privacy in their organisation, 42% of DPOs surveyed gave the lowest score of 1.

Did the Department consider offering clarification, support and guidance to DPOs, rather than just removing them? Has it attempted to assess the impact of their removal on data subjects? In practice, it is likely that many data protection officers will be rebranded as senior responsible individuals. However, many will be relieved of their duties, particularly since the requirement to be part of the organisation’s senior management team could be problematic for external DPO appointments and those in more junior positions. Has the Department assessed how many data protection officers may lose their job as a result of these changes? Is the number expected to be substantial? Will there be any protections to support those people in transitioning to skilled employment surrounding data protection and to prevent an overall reduction of data protection expertise in organisations?

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The clause does not in any way represent a lessening of the requirement on organisations to comply with data protection law. It simply introduces a degree of flexibility. An organisation could not get rid of data protection officers without ensuring that processing activities likely to pose high risks to individuals are still managed properly. The senior responsible individual will be required to ensure that that is the case.

At the moment, even small firms whose core activities do not involve the processing of sensitive data must have a data protection officer. We feel that that is an unnecessary burden on those small firms, and that allowing them to designate an individual will give them more flexibility without reducing the overall level of data protection that they require.

Question put and agreed to.

Clause 14 accordingly ordered to stand part of the Bill.

Clause 15

Duty to keep records

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clauses 15 and 16 will improve the record-keeping requirements under article 30 of the UK GDPR and the logging requirements under part 3 of the Data Protection Act, which is concerned with records kept for law enforcement purposes. Article 30 of the UK GDPR requires most organisations to keep records of their processing activities and includes a list of requirements that should be included in the record. Those requirements can add to the paperwork that organisations have to keep to demonstrate compliance. Although there is an exemption from those requirements in the UK GDPR for some small organisations, it has a limited impact because it applies only where their processing of personal data is “occasional”.

Clause 15 will replace the record-keeping requirements under article 30. It will make it easier for data controllers to understand exactly what needs to be included in the record. Most importantly, organisations of any size will no longer have to keep records of processing, unless their activities are

“likely to result in a high risk”

to individuals. That should help small businesses in particular, which have found the current small business exemption difficult to understand and apply in practice.

Clause 16 will make an important change to the logging requirements for law enforcement purposes in part 3 of the Data Protection Act. It will remove the ineffective requirement to record a justification when an officer consults or discloses personal data for the purposes of an investigation. The logging requirements are unique to the law enforcement regime and aim to assist in monitoring and auditing data use. Recording a justification for accessing data was intended to help protect against unlawful access, but the reality is that someone is unlikely to record an honest reason if their access is unlawful. That undermines the purpose of this requirement, because appropriate and inappropriate uses would both produce essentially indistinguishable data.

As officers often need to access large amounts of data quickly, especially in time-critical scenarios, the clause will facilitate the police’s ability to investigate and prevent crime more swiftly. We estimate that the change could save approximately 1.5 million policing hours. Other elements of the logs, such as the date and time of the consultation or disclosure and the identity of the person accessing them, are likely to be far more effective in protecting personal data against misuse; those elements remain in place. On that basis, I commend the clauses to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Record keeping is a valuable part of data processing. It requires controllers, and to a lesser extent processors, to stay on top of all the processing that they are conducting by ensuring that they record the purposes for processing, the time limits within which they envisage holding data and the categories of recipients to whom the data has been or will be disclosed.

Many respondents to the Government’s consultation “Data: a new direction” said that they did not think the current requirements were burdensome. In fact, they said that the records allow them easily to understand the personal data that they are processing and how sensitive it is. It is likely that that was helped by the fact that the requirements were proportionate, meaning that organisations that employed under 250 people and were not conducting high-risk processing were exempt from the obligations.

It is therefore pleasing to see the Government rolling back on the idea of removing record-keeping requirements entirely, as was suggested in their consultation. As was noted, the majority of respondents disagreed with that proposal, and it is right that it has been changed. However, some respondents indicated a preference for more flexibility in the record-keeping regime, which is what I understand the clause is trying to achieve. Replacing the current requirements with a requirement to keep an appropriate record of processing, tied to high-risk activities, will give controllers the flexibility that they require.

As with many areas of the Bill, it is important that we be clear on the definition of “appropriate” so that it cannot be used by those who simply do not want to keep records. I therefore ask the Minister whether further guidance will be available to assist controllers in deciding what counts as appropriate.

I also wish to highlight the point that although in isolation the clause does not seem to change requirements much, other than by adding an element of proportionality, it cannot be viewed in isolation. In combination with other provisions, such as the reduced requirements on DPIAs and the higher threshold for subject access requests, it seems that there will be less records overall on which a data subject might be able to rely to understand how their personal information is being used or to prove how it has been used when they seek redress. With that in mind, I ask the Minister whether the Government have assessed the potential impact of the combination of the Bill’s clauses on the ability of data subjects to exercise their rights. Do the Government have any plans to work with the commissioner to monitor any such impacts on data subjects after the Bill is passed?

I turn to clause 16. Section 62 of the Data Protection Act 2018 requires competent authorities to keep logs that show who has accessed certain datasets, and at what time. It also requires that that access be justified: the reason for consulting the data must be given. Justification logs exist to assist in disciplinary proceedings, for example if there is reason to believe that a dataset has been improperly accessed or that personal data has been disclosed in an unauthorised way. However, as Aimee Reed, director of data at the Met police and chair of the national police data board, told the Committee:

“It is a big requirement across all 43 forces, largely because…we are operating on various aged systems. Many of the technology systems…do not have the capacity to log section 62 requirements, so police officers are having to record extra justification in spreadsheets alongside the searches”.––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 56, Q118.]

That creates what she described as a “considerable burden”.

Understandably, therefore, the Bill removes the justification requirement. There are some—the Public Law Project, for example—who have expressed concern that this change would pose a threat to individual rights by allowing the police to provide a retrospective justification for accessing records. However, as the explanatory notes indicate, it is highly unlikely that in an investigation concerning inappropriate use, a justification recorded by the individual under investigation for improper access or unauthorised access could be relied on anyway. Clause 16 would therefore not stop anyone from being investigated for improper access; it would simply reduce the burden of recording a self-identified justification that could hardly be relied on anyway. I welcome the intent of the clause and the positive impact that it could have on our law enforcement processing.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The intention behind clause 15 is to reduce the burden on organisations by tying the record-keeping requirements to high-risk processing activities. If there is uncertainty about the nature of the risk, organisations will be able to refer to ICO guidance. The ICO has already published examples on its website of processing that is likely to be high-risk for the purposes of completing impact assessments; clause 17 will require it to apply the guidance to the new record-keeping requirements as well. It will continue to provide guidance on the matter, and we are happy to work with it on that.

With respect to clause 16, I am most grateful for the Opposition’s welcome recognition of the benefits for crime prevention and law enforcement.

Question put and agreed to.

Clause 15 accordingly ordered to stand part of the Bill.

Clause 16 ordered to stand part of the Bill.

Clause 17

Assessment of high risk processing

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 102, in clause 17, page 32, line 12, leave out from “with” to the end of line 28 on page 33 and insert

“subsection (2)

(2) In Article 57(1) (Information Commissioner’s tasks), for paragraph (k) substitute—

‘(k) produce and publish a document containing examples of types of processing which the Commissioner considers are likely to result in a high risk to the rights and freedoms of individuals (for the purposes of Articles 27A, 30A and 35);’.”

This amendment would remove the provisions of clause 17 which replace the existing data protection impact assessment requirements with new requirements about “high risk processing”, leaving only the requirement for the ICO to produce a document containing examples of types of processing likely to result in a high risk to the rights and freedoms of individuals.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 103, in clause 17, page 33, line 9, at end insert—

“(4A) After Article 35(11) insert—

‘(11A) Any public authority, government department, or contractor of a government department which routinely uses public data in the discharge of its functions must publish any assessments of high risk processing conducted pursuant to this Article. Any assessments published under this Article must be redacted where necessary for the purposes of—

(a) removing sensitive details,

(b) protecting public interests, or

(c) ensuring the security of data processing operations.’”

This amendment inserts a new requirement into Article 35 of UKGDPR, for any public authority which uses public data to publish any assessment of high risk processing they conduct under Article 35.

Clause stand part.

Clause 18 stand part.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

As was the intention, the Bill loosens restrictions on processing personal data in many areas: it adds a new lawful basis and creates new exceptions to purpose limitation, removes blocks to automated decision-making and allows for much thinner record keeping. Each change in isolation may make only a relatively small adjustment to the regime. Collectively, however, they result in a large-scale shift towards controllers being able to conduct more processing, with less transparency and communication, and having fewer records to keep, all of which reduces opportunities for accountability.

As mentioned, loosening restrictions is an entirely deliberate consequence of a Bill that seeks to unlock innovation through data—an aim that Members across the House, including me, are strongly behind, given the power of data to influence growth for the public good. However, given the cumulative impact of this deregulation, where increasingly opaque processing is likely to result in a large risk to people’s rights, a processor might at the very least record how they will ensure that any high-risk activities that they undertake do not lead to unlawful or discriminatory outcomes for the general public. That is exactly what the current system of DPIAs, as outlined in article 35 of GDPR, allows for. These assessments, which require processors to measure their activities against the risk to the rights and freedoms of data subjects, are not just a tick-box exercise, unnecessary paperwork or an administrative burden; they are an essential tool for ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to a fundamental breach of their rights.

Assessments of that kind are not a concept unique to data processing. The Government routinely publish impact assessments on the legislation that they want to introduce; any researcher or scientist is likely to conduct an assessment of the safety and morality of their methodology; and a teacher will routinely and formally measure the risks involved when taking pupils on a school trip. Where activities pose a high risk to others, it is simply common practice to keep a record of where the risks lie, and to make plans to ensure that they are mitigated where possible.

In the case of data, not only are DPIAs an important mechanism to ensure that risks are managed, but they act as a key tool for data subjects. That is first because the process of conducting a DPIA encourages processors to consult data subjects, either directly or through a representative, on how the type of processing might impact them. Secondly, where things go wrong for data subjects, DPIAs act as a legal record of the processing, its purpose and the risks involved. Indeed, the Public Law Project, a registered charity that employs a specialist lawyer to conduct research, provide training and take on legal casework, identified DPIAs as a key tool in litigating against the unlawful use of data processing. They show a public law record of the type of processing that has been conducted, and its impact.

The TUC and the Institute for the Future of Work echo that, citing DPIAs as a crucial process and consultation tool for workers and trade unions in relation to the use of technology at work. The clause, however, seeks to water down DPIAs, which will become “assessments of high-risk processing”. That guts both the fundamental benefit of risk management that they offer in a data protection system that is about to become increasingly transparent, and the extra benefits that they give to data subjects.

Instead of requiring a systematic description of the processing operations and purposes, under the new assessments the controller would be required only to summarise the purpose of the processing. Furthermore, instead of conducting a proportionality assessment, controllers will be required only to consider whether the processing is necessary for the stated purpose. The Public Law Project describes the proportionality assessment as a crucial legal test that weighs up whether an infringement of human rights, including the right not to be discriminated against, is justified in relation to the processing being conducted.

When it comes to consultation, where previously it was encouraged for controllers to seek the views of those likely to be impacted by the processing, that requirement to seek those views will now be entirely omitted, despite the important benefit to data subjects, workers and communities. The new tests therefore simply do not carry the same weight or benefit as DPIAs, which in truth could themselves be strengthened. It is simply not appropriate to remove the need to properly assess the risk of processing, while simultaneously removing restrictions that help to mitigate those risks. For that reason, the clause must be opposed; we would keep only the requirement for the ICO to produce that much-needed guidance on what constitutes high-risk processing.

Moving on to amendment 103, given the inherent importance of conducting risk assessments for high-risk processing, and their potential for use by data subjects when things go wrong, it seems only right that transparency be built into the system where it comes to Government use of public data. The amendment would do just that, and only that. It would not adjust any of the requirements on Government Departments or public authorities to complete high-risk assessments; it would simply require an assessment to be published in any case where one is completed. Indeed, the ICO guidance on DPIAs says:

“Although publishing a DPIA is not a requirement of UK GDPR, you should actively consider the benefits of publication. As well as demonstrating compliance, publication can help engender trust and confidence. We would therefore recommend that you publish your DPIAs, where possible, removing sensitive details if necessary.”

However, very few organisations choose to publish their assessments. This is a chance for the Government to lead by example, and foster an environment of trust and confidence in data protection

Alongside the amendment I tabled on compulsory reporting on the use of algorithms, this amendment is designed to afford the general public honesty and openness on how their data is used, especially where the process has been identified as having a high risk of causing harm. Again, a published impact assessment would provide citizens with an official record of high-risk uses of their data, should they need that when seeking redress. However, a published impact assessment would also encourage responsible use of data, so that redress does not need to be sought in the first place.

The Government need not worry about the consequences of the amendment if they already meet the requirement to conduct the correct impact assessments and process them in such a way that the benefits are not heavily outweighed by a risk to data rights. If rules are being followed, the amendment will only provide proof of that. However, if anyone using public data in a public authority’s name did so without completing the appropriate assessments, or processed that data in a reckless or malicious way, there would be proof of that. Where there is transparency, there is accountability, and where the Government are involved, accountability is always crucial in a democracy. The amendment would ensure that accountability shined through in data protection law.

Finally, I turn to clause 18. The majority of respondents to the “Data: a new direction” consultation agreed that organisations are likely to approach the ICO voluntarily before commencing high-risk processing activities if that is taken into account as a mitigating factor in any future investigation or enforcement action. The loosening of requirements in the clause is therefore not a major concern. However, when that is combined with the watering down of the impact assessments, there remains an overarching concern about the oversight of high-risk processing. I refer to my remarks on clause 17, in which I set out the broader problems that the Bill poses to protection against harms from high-risk processing.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As we have discussed, one of the principal objectives of this part of the Bill is to remove some of the prescriptive unnecessary requirements on organisations to do things to demonstrate compliance. Clauses 17 and 18 reduce the unnecessary burdens placed on organisations by articles 35 and 36 of the UK GDPR in respect of data protection impact assessments and prior consultation with the ICO respectively.

Clause 17 will replace the EU-derived notion of a data protection impact assessment with more streamline requirements for organisations to document how they intend to assess and mitigate risks associated with high-risk processing operations. The changes will apply to both the impact assessment provisions under the UK GDPR and the section of the Data Protection Act 2018 that deals with impact assessments for processing relating to law enforcement. Amendment 102 would reverse those changes to maintain the current data protection impact assessment requirements, but we feel that this would miss an important opportunity for reform.

There are significant differences between the new provisions in the Bill and current provisions on data protection impact assessments. First, the new provisions are less prescriptive about the precise processing activities for which a risk assessment will be required. We think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation, taking account of any relevant guidance from the regulator.

Secondly, we have also removed the mandatory requirement to consult individuals about the intended processing activity as part of a risk-assessment process, as that imposes unnecessary burdens. There are already requirements in the legislation to ensure that any new processing is fair, transparent and designed with the data protection principles in mind. It should be open to businesses to consult their clients about intended new processing operations if they wish, but that should not be dictated to them by the data protection legislation.

Clause 18 will make optional the previous requirement for data controllers to consult the commissioner when a risk assessment indicates a potential high risk to individuals. The Information Commissioner will be able to consider any voluntary actions that organisations have taken to consult the ICO as a factor when imposing administrative fines on a data controller. Currently, compliance with the prior consultation requirement is low, likely due to a lack of clarity in the legislation and a reluctance for organisations to engage directly with the regulator on potential high-risk processing. The clause will encourage a more proactive, open and collaborative dialogue between the ICO and organisations, so that they can work together to better mitigate the risks.

The Opposition’s amendment 103 would mandate the publication of risk assessments by all public sector bodies. That requirement would, in our view, place a disproportionate burden on public authorities of all sizes. It would apply not just to Departments but to smaller public authorities such as schools, hospitals, independent pharmacies and so on. The amendment acknowledges that each public authority would have to spend time redacting sensitive details from risk assessments prior to publication. As those assessments can already be requested by the ICO as part of its investigations, or by members of the public via freedom of information requests, we do not think it is necessary to impose that significant new burden on all public bodies. I therefore invite the hon. Member for Barnsley East to withdraw her two amendments, and I commend clauses 17 and 18 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am happy not to press amendment 103 to a vote, but on amendment 102, I simply do not think it is appropriate to remove the need to properly assess the risk of processing while removing the restrictions that help to mitigate it. For those reasons, I will press it to a vote.

Question put, That the amendment be made.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 19 introduces an ability for public bodies with the appropriate knowledge and expertise to produce codes of conduct applicable to the law enforcement regime. The clause mirrors the equivalent provision in the UK GDPR.

As with regular guidance, these codes of conduct will be drafted by law enforcement data protection experts and tailored to the specific data protection issues that affect law enforcement agencies, to help improve compliance with the legislation and encourage best practice. However, they are intended to carry more weight, because they will additionally have the formal approval of the Information Commissioner.

When a code of conduct is produced, there is a requirement to submit a draft of it to the Information Commissioner. While that is good practice, we think it is unnecessary to mandate that. Government amendment 1 replaces that requirement with a duty on the commissioner to instead encourage public bodies to do that. Government amendments 2 and 3 are consequential to that.

Where a public body has submitted a code of conduct to the commissioner for review, Government amendment 4 removes the requirement for the commissioner to review any subsequent amendments made by the public body until the initial draft has been considered. This change will promote transparency, greater clarity and confidence in how police process personal data under the law enforcement regime. Codes of conduct are not a new concept. The clause mirrors what is already available under the UK GDPR.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The Bill fails to fully recognise that the burdens that organisations face in complying with data protection legislation are not always best dealt with by simply removing the protections in place. In many cases, clarification and proper guidance can be just as fruitful in allowing data protection to work more seamlessly. Clauses such as clause 19, which seeks to create an environment in which best practice is shared on how to comply with data protection laws and deal with key data protection challenges, are therefore very welcome. It is absolutely right that we should capitalise on pockets of experience and expertise, especially in the public sector, where resources have often been stretched, particularly over the last 13 years. We should ensure that learnings are shared with those who are less familiar with how to resolve challenges around data.

It is also pleasing to see that codes that give sector-specific guidance will be approved by the commissioner before being published. That will ensure absolute coherence between guidance and the enforcement of data protection law more widely. I look forward to seeing what positive impact the codes of conduct will have on how personal data is handled by public bodies, to the benefit of the general public as well as the public bodies themselves; the burden on them will likely be lifted as a result of the clarity provided by the guidance.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I welcome the Opposition’s support.

Amendment 1 agreed to.

Amendments made: 2, in clause 19, page 35, line 26, leave out from ‘body’ to ‘, the’ in line 27 and insert ‘does so’.

This amendment is consequential on Amendment 1.

Amendment 3, in clause 19, page 35, line 28, leave out ‘draft’.

This amendment is consequential on Amendment 2.

Amendment 4, in clause 19, page 35, line 33, leave out from ‘conduct’ to the end of line 34 and insert—

‘that is for the time being approved under this section as they apply in relation to a code’.—(Sir John Whittingdale.)

This amendment makes clear that the Commissioner’s duty under new section 68A of the Data Protection Act 2018 to consider whether to approve amendments of codes of conduct relates only to amendments of codes that are for the time being approved under that section.

Clause 19, as amended, ordered to stand part of the Bill.

Clause 20

Obligations of controllers and processors: consequential amendments

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I have no comments to add on the consequential amendments in clause 20 beyond what has been discussed regarding the obligations on controllers and processors. With regard to Government amendments 40 to 44 and schedule 4, I will address changes to the ICO’s powers to refuse requests when we come to them further on in the Bill.

Question put and agreed to.

Clause 20 accordingly ordered to stand part of the Bill.

Schedule 4

Obligations of controllers and processors: consequential amendments

Amendments made: 42, in schedule 4, page 143, line 20, leave out ‘and section 135’.—(Sir John Whittingdale.)

This amendment is consequential on Amendment 40.

Amendment 43, in schedule 4, page 143, line 24, leave out paragraph 18.

This amendment is consequential on Amendment 40.

Schedule 4, as amended, agreed to.

Clause 21

Transfers of personal data to third countries and international organisations

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 104, in schedule 5, page 144, line 28, at end insert—

‘4 All provisions in this Chapter must be applied in such a way as to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.’

This amendment would reinsert into the new Article on general principles for international data transfers the principle that all provisions of this Chapter of the UK GDPR should be applied in such a way as to ensure that the level of protection of natural persons guaranteed by the Regulation is not undermined.

Government amendments 24 to 26.

That schedule 5 be the Fifth schedule to the Bill.

Government amendments 27 to 29.

That schedule 6 be the Sixth schedule to the Bill.

That schedule 7 be the Seventh schedule to the Bill.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I call Stephanie Peacock.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful to the Minister, and I will focus my remarks particularly on the contents of schedule 5 before explaining the thought process behind amendment 104.

In the globalised world in which we live, we have an obligation to be outward looking and to consider not just the activities that take place in the UK, but those that occur worldwide. When it comes to data protection, that means accepting that data will likely need to travel across borders, and inserting appropriate safeguards so that UK citizens do not lose the protection of data protection laws if their personal data is transferred away from this country. The standard of those safeguards is absolutely crucial to the integrity of our entire data protection regime. After all, if a controller can simply send the personal data of UK citizens to a country that has limited data protection laws for processing that would be unlawful here, and if they can transfer that data back afterwards, in reality our laws are only as strong as the country with the weakest protections in the world.

As things stand, there is only a limited set of circumstances under which personal data can be transferred to a third party outside the UK. One such circumstance is where there is an adequacy agreement, similar to that which we have with the EU. For such an agreement to be reached, the Secretary of State must have considered many things, including the receiver’s respect for human rights and data rules; the presence, or lack thereof, of a regulator, and its independence; and any international commitments they have made in relation to data protection. These amendments ensure that data can flow freely between the UK and another country as long as the level of protection received by citizens is not undermined by the regulatory structure in that country.

The Bill amends the adequacy-based framework and replaces it with a new outcomes-based approach through the data protection test. The test is met if the standard of the protection provided for data subjects, with regard to the general processing of personal data in the country or by the organisation, is not materially lower than the standard of protection under the UK GDPR and relevant parts of the DPA 2018.

When deciding whether the test is met, the Secretary of State must still consider many of the same things: their respect for human rights, the existence of a regulator, and international obligations. However, stakeholders such as Reset.tech and the TUC have expressed concern that the new test could mean that UK data is transferred to countries with lower standards of protection than previously. That is significant not just for data subjects in the UK, who may be faced with weaker rights, but for business, which fears that this may signify a divergence from the EU GDPR that could threaten the UK’s own adequacy status. Losing this agreement would have real-world consequences for UK consumers and businesses to the tune of hundreds of millions of pounds. What conversations has the Minister had with representatives of the European Commission to ensure that the new data protection test does not threaten adequacy? Does he expect the new data protection test to result in the data of UK citizens being passed to countries with weaker standards than are allowed under the current regime?

Moving on to amendment 104, one reason why some stakeholders are expressing concern about the new rules is because they appear to omit article 44. As it stands, for those who are concerned about the level of data protection available to them as a result of international transfers, article 44 of the UK GDPR provides a guarantee that the integrity of the UK’s data protection laws will be protected. Indeed, it sets out that all provisions relating to the international transfer of UK personal data

“shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.”

If UK data will not be transferred to countries with weaker protections, it is not clear why this simple guarantee would be removed. The amendment would clear up any confusion around that and reinsert the article so that data subjects can be reassured of the strength of this new data protection test and of their rights.

Again, it is important to emphasise that getting the clause right is absolutely essential, as it underpins the entire data protection regime in the country. Getting it wrong could cost a huge amount, rendering the Bill, the UK GDPR and the Data Protection Act 2018 essentially useless. It is likely that the Government do not intend to undermine their own regulatory framework. Reinserting the article would confirm that in the Bill, offering complete clarity that the new data protection test will not result in lower levels of protection for UK data subjects.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

We completely agree with the hon. Lady that we would not wish to see data transferred to countries that have an inferior data protection regime. However, we do not think amendment 104 is required to achieve that, because the reforms in chapter 5 already provide for a clear and high standard of protection when transferring personal data overseas. It states that the standard of protection in that country must not be “materially lower” than the standard under the UK GDPR. That ensures that high standards of data protection are maintained. In addition, we feel that the amendment would return us to the confusion of the existing regime. At present, the legislative framework makes it difficult for organisations and others to understand what standard needs to be applied when transferring personal data internationally, with several terms used in the chapter and in case law. Our reforms ensure that a clear standard applies, which maintains protection for personal data.

The hon. Lady raised the EU’s data adequacy assessment. That is something that featured earlier in our debates on the Bill, and, as we heard from a number of our witnesses, including the information commissioner, there is no reason to believe that this in any way jeopardises the EU’s assessment of the UK’s data adequacy.

Government amendment 24 revises new article 45B(3)(c) of the UK GDPR, which is inserted by schedule 5 and which makes provision about the data protection test that must be satisfied for data bridge regulations to be made. An amendment to the Bill is required for the Secretary of State to retain the flexibility to make data bridge regulations covering transfers from the UK or elsewhere. The amendment will preserve the status quo under the current regime, in which the Secretary of State’s power is not limited to covering only transfers from the UK. In addition to these amendments, four other minor and technical Government amendments —25, 26, 28 and 29—were tabled on 10 May.

Question put and agreed to. 

Clause 21 accordingly ordered to stand part of the Bill.

Schedule 5

Transfers of personal data to third countries etc: general processing

Amendments made: 24, in schedule 5, page 147, line 3, leave out “from the United Kingdom” and insert

“to the country or organisation by means of processing to which this Regulation applies as described in Article 3”.

New Article 45B(3)(c) of the UK GDPR explains how references to processing of personal data in a third country should be read (in the data protection test for regulations approving international transfers of personal data). This amendment changes a reference to data transferred from the United Kingdom to include certain data transferred from outside the United Kingdom.

Amendment 25, in schedule 5, page 147, line 12, leave out

“the transfer of personal data”

and insert “transfer”.

This amendment and Amendment 26 simplify the wording in new Article 45B(4)(b) of the UK GDPR.

Amendment 26, in schedule 5, page 147, line 14, leave out

“the transfer of personal data”

and insert “transfer”.—(Sir John Whittingdale.)

See the explanatory statement for Amendment 25.

Schedule 5, as amended, agreed to.

Schedule 6

Transfers of personal data to third countries etc: law enforcement processing

Amendments made: 27, in schedule 6, page 155, line 39, leave out “from the United Kingdom” and insert—

“to the country or organisation by means of processing to which this Act applies as described in section 207(2)”.

New section 74AB(3)(c) of the Data Protection Act 2018 explains how references to processing of personal data in a third country should be read (in the data protection test for regulations approving international transfers of personal data). This amendment changes a reference to data transferred from the United Kingdom to include certain data transferred from outside the United Kingdom.

Amendment 28, in schedule 6, page 156, line 6, leave out

“the transfer of personal data”

and insert “transfer”.

This amendment and Amendment 29 simplify the wording in new section 74AB(4)(b) of the Data Protection Act 2018.

Amendment 29, in schedule 6, page 156, line 8, leave out

“the transfer of personal data”

and insert “transfer”.—(Sir John Whittingdale.)

See the explanatory statement for Amendment 28.

Schedule 6, as amended, agreed to. 

Schedule 7 agreed to. 

Clause 22

Safeguards for processing for research etc purposes

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 22 creates a new chapter in the UK GDPR that provides safeguards for the processing of personal data for the purposes of scientific research or historical research, archiving in the public interest, and for statistical purposes. Currently, the provisions that provide safeguards for those purposes are spread across the UK GDPR and the Data Protection Act 2018.

Clause 22 consolidates those safeguards in a new chapter 8A of the UK GDPR. Those safeguards ensure that the processing of personal data for research, archiving and statistical purposes does not cause substantial damage or substantial distress and that appropriate technical and organisational measures are in place to respect data minimisation. Clause 23 sets out consequential changes to the UK GDPR and Data Protection Act 2018 required as a result of the changes being made in clause 22 to consolidate safeguards for research.

Government amendments 34 to 39 are minor, technical amendments clarifying that, as part of the pre-existing additional requirement when processing for research, archiving and statistical purposes, a controller is to use anonymous—rather that personal—data, unless that means that those purposes cannot be fulfilled. It makes clear that processing to anonymise the personal data is permitted. On that basis, I commend the clauses, and indeed the Government amendments, to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

With regards to clause 22, it is pleasing to see a clause confirming the safeguards that are applicable when processing under the new research and scientific purposes. For example, it is welcome that it is set out that such processing must not cause substantial damage or distress to a data subject, must respect the principle of data minimisation and must not make decisions related to a particular data subject unless it is for approved medical research.

Those safeguards are especially important given the concerns that I laid out over the definition of scientific research in clause 2, which could lead to the abuse of data under the guise of legitimate research. I have no further comments on the clause or the Government’s amendments to it at this stage, other than to reiterate that the definition of scientific research must have clear boundaries if any of the clauses that concern research are to be used as intended.

Clause 23 makes changes consequential on those in clause 22, so I refer to the substance of my remarks during the discussion of the previous clause.

Amendment 34 agreed to.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 24 introduces an exemption that can be applied to the processing of personal data for law enforcement purposes under the law enforcement regime for the purposes of safeguarding national security. It will replace the current, more limited national security exemptions that exist in the law enforcement regime and mirror the existing exemptions in the UK GDPR and intelligence services regime.

The clause will allow organisations to exempt themselves from specified provisions in the law enforcement regime of the Data Protection Act 2018, such as some of the data protection principles and the rights of the individual, but only where it is necessary to do so for the purposes of safeguarding national security. Like the other exemptions in the Act, it must be applied on a case-by-case basis. There are limits to what the exemption applies to. The processing of data by law enforcement authorities must always be lawful, and the protections surrounding sensitive processing remain.

Subsection (2) amends the general processing regime of the Data Protection Act, regarding processing under UK GDPR, to remove the ability of organisations to exempt themselves, on the grounds of safeguarding national security, from article 77 of the UK GDPR, which provides the right for individuals to lodge a complaint with the Information Commissioner. That is because we do not consider exemption from that provision necessary. The change will align the national security exemption applicable to UK GDPR processing with the other national security exemptions in the Data Protection Act 2018, which do not permit the exemption to be applied in relation to an individual’s right to complain to the Commissioner.

The ability of a Minister of the Crown to issue a certificate certifying the application of the exemption for the purposes of safeguarding national security, which previously existed, is retained; clause 24(8) simply updates that provision to reflect the new exemption. That change will assist closer working between organisations operating under the three distinct data protection regimes by providing greater confidence that data that, for example, may be of importance to a police investigation but also pertinent to a separate national security operation can be properly safeguarded by both organisations. I will allow the hon. Member for Barnsley East to speak to amendment 105, because I wish to respond to her.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful to the Minister. I want to speak today about a concern that has been raised about clauses 24, 25 and 26, so I will address them before speaking to amendment 105.

In essence, the clauses increase the opportunities for competent authorities to operate in darkness when it comes to personal data through both national security certificates and designation notices. Though it may of course be important in some cases to adjust data protection regulation in a minimal way to protect national security or facilitate working with the intelligence services, important too is the right to understand how any competent authority is processing our personal data—particularly given the growing mistrust around police culture.

To cite one stark example of why data transparency in law enforcement is important, after Sarah Everard was murdered, more than 30 police officers were reportedly investigated for unnecessarily looking up her personal data. First, that demonstrates that there is a temptation for officers to access personal data without due reason, perhaps particularly when it is related to a high-profile case. Secondly, however, it shows that transparency does hold people accountable. Indeed, thankfully, the individuals who were accused of accessing the data were swiftly investigated. That would not have been possible if that transparency had been restricted—for example, had there been a national security certificate or a designation notice in place.

The powers to apply for the certificates and notices that allow the police and law enforcement authorities exemptions from data protection, although sometimes needed, must be used extremely sparingly and must be proportionate to the need to protect national security. However, that proportionate approach does not appear to be guaranteed in the Bill, despite it being a requirement in human rights law.

In their oral and written evidence, representatives from Rights and Security International warned that clauses 24 to 26 could actually violate the UK’s obligations under the Human Rights Act 1998 and the European convention on human rights. Everything that the UK does, including in the name of national security or intelligence services, must comply with human rights and the ECHR. That means that any time there is interference with the privacy of people in the UK—which is considered a fundamental right—for it to be lawful, the law in question must do only what is truly necessary for national security. That necessity standard is a high one, and it does not take into account whether a change might be more convenient for a competent authority.

Will the Minister clearly explain in what way the potential powers given to law enforcement under clauses 24 to 26, in both national security certificates and designation notices, would be strictly proportionate and necessary for national security, rather than simply making the operations of law enforcement easier and more convenient?

Primarily, the concern is for those whose data could be used in a way that fundamentally infringes on their privacy, but there are practical concerns too. Any clauses that contain suspected violations of human rights could set up the Government for lengthy legal battles, both in the UK and at the European Court of Human Rights, about their data protection and surveillance regimes. Furthermore, any harm to the UK’s important relationships with the EU around data could threaten the adequacy agreement which, as we have all repeatedly heard, is vital to our economy.

It is vital, then, that Minister confirms that both national security certificates and designation notices will be used only where necessary, and exemptions will be allowed only where necessary. If that cannot be satisfied, we must oppose the clauses.

I will now focus on amendment 105. Where powers are available to provide exemptions to privacy protections on grounds of national security, it is important that they are protected from exploitation, and not unduly concentrated in any individual’s hands without appropriate checks and balances. However, Rights and Security International warned that that was not taken into appropriate consideration in clause 25. Instead, the power to issue designation notices has been concentrated almost entirely in the hands of the Secretary of State, with no accountability measures built in.

Designation notices allow for joint processing between a qualifying competent authority and the intelligence services, which could have greatly beneficial consequences for tackling crime and threats to our national security, but they will also allow for both those parties to be exempt from what are usually crucial data protections. They must therefore be used sparingly, and only when necessary and proportionate.

As we have seen—and as I will argue countless times—we cannot rely on the Secretary of State’s acting in good faith. Our legislation must instead protect against a Secretary of State who acts in bad faith. Neither can we rely on the Secretary of State having the level of expertise needed to make complex and technical decisions, especially those that impact on national security and data rights at the same time.

Despite that, under clause 25(2), the Secretary of State alone can specify which competent authorities qualify as able to apply for a designation notice. Under subsection (3), it is the Secretary of state alone to whom qualifying competent authorities will jointly apply. It is the Secretary of State who reviews a notice and has the power to withdraw it, and it is the Secretary of State who makes transition arrangements.

Although there is a requirement in the Bill to consult the commissioner, the amendment seeks to formalise some independent oversight of the designation process by ensuring that the commissioner has an actual say in approving the notices and adjusting the concentration of power so that it does not lie solely in the Secretary of State’s hands. That would mean that should the Secretary of State act in bad faith, or lack the expertise needed to make such a decision—whether aware or unaware of this fact—the commissioner would be able to help to ensure that an informed and proportionate decision was made with regard to each notice applied for. This would not present any designation notices from being issued when they were genuinely necessary; it would simply safeguard their approval when they were.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I assure the hon. Lady that clauses 25 and 26 are necessary for the improvement of national security. The reports on events such as the Manchester and Fishmongers’ Hall terrorist incidents have demonstrated that better joined-up working between the intelligence services and law enforcement is in the public interest to safeguard national security. A current barrier to such effective joint working is that only the intelligence services can operate under part 4 of the Data Protection Act, which is drafted to reflect the unique operational nature of their processing.

Data Protection and Digital Information (No. 2) Bill (Fifth sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (Fifth sitting)

Stephanie Peacock Excerpts
Committee stage
Thursday 18th May 2023

(11 months, 3 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 18 May 2023 - (18 May 2023)
Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

I spoke to amendment 105 in our last sitting. In summary, the Bill contains a requirement to consult the commissioner. The amendment seeks to formalise some of the independent oversight of the designation notice process so that the power does not lie solely in the Secretary of State’s hands. The matter of the Secretary of State’s power is obviously something with which we take issue throughout the Bill. The amendment would not stop any designation notice being issued where it is genuinely necessary; it would simply add a safeguard for its approval where it is not. For that reason, I will press the amendment to a vote.

Question put and agreed to.

Clause 24 accordingly ordered to stand part of the Bill.

Clause 25

Joint processing by intelligence services and competent authorities

Amendment proposed: 105, in clause 25, page 44, line 6, leave out “must consult the Commissioner” and insert

“must apply to the Commissioner for authorisation of the designation notice on the grounds that it satisfies subsection (1)(b).”—(Stephanie Peacock.)

This amendment seeks to increase independent oversight of designation notices by replacing the requirement to consult the Commissioner with a requirement to seek the approval of the Commissioner.

Question put, That the amendment be made.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I think we will come on to some of the questions around the fees that are potentially payable, particularly by those organisations that may be required to provide more evidence, and the costs that that could entail. I will return to that subject shortly.

The new strategic framework acknowledges the breadth of the ICO’s remit and its impact on other areas. We believe that it will provide clarity for the commissioner, businesses and the general public on the commissioner’s objectives and duties. I therefore commend clause 27 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The importance to any data protection regime of an independent, well-functioning regulator cannot be overstated. The ICO, which is soon to be the Information Commission as a result of this Bill, is no exception to that rule. It is a crucial piece of the puzzle in our regime to uphold the information rights set out in regulation. Importantly, it works in the interests of the general public. The significance of an independent regulator is also recognised by the European Commission, which deems it essential to any adequacy agreement. The general duties of our regulator, such as those set out in this clause, are therefore vital because they form the foundations on which it operates and the principles to which it must be accountable.

Although the duties are more an indicator of overarching direction than a prescriptive list of duties, they should still aim to reflect the wide range of tasks that the regulator carries out and the values with which they do so. On the whole, the clause does this well. Indeed, the principal objective for the commissioner set out in this clause, which is

“to secure an appropriate level of protection for personal data, having regard to the interests of data subjects, controllers and others and matters of general public interest, and…to promote public trust and confidence in the processing of personal data”

is a good overarching starting point. It simply outlines the basic functions of the regulator that we should all be able to get behind, even if the Bill itself does disappointingly little to encourage the promotion of public trust in data processing.

It is particularly welcome that the principal objective includes specific regard to

“matters of general public interest.”

This should cover things like the need to consider sustainability and societal impact. However, it is a shame that that is not made explicit among the sub-objectives, which require the commissioner to have regard to the likes of promoting innovation and safeguarding national security. That would have ingrained in our culture a desire to unlock data for the wider good, not just for the benefit of big tech. Overall, however, the responsibilities set out in the clause, and the need to report on fulfilling them, seem to reflect the task and value of the regulator fairly and accurately.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I think that was slightly qualified support for the clause. Nevertheless, we welcome the support of the Opposition.

Question put and agreed to.

Clause 27 accordingly ordered to stand part of the Bill.

Clause 28

Strategic priorities

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 28 provides a power for the Secretary of State to prepare a statement of strategic priorities relating to data protection as part of the new strategic framework for the Information Commissioner. The statement will contain only the Government’s data protection priorities, and the Secretary of State may choose to include both domestic and international priorities. That will enable the Government to provide a transparent statement of how their data protection priorities fit in with their wider agenda, giving the commissioner, we hope, helpful context.

Although the commissioner must take the statement into account when carrying out his functions, he is not required to act in accordance with it. That means that the statement will not be used in a way to direct what the commissioner may and may not do. Once the statement is drafted, the Secretary of State will be required to lay it before Parliament, where it will be subject to the negative resolution procedure before it can be designated. The commissioner will need to consider the statement when carrying out functions under the data protection legislation, except functions relating to a particular person, case or investigation.

Once designated, the commissioner will be required to respond to the statement, outlining how he intends to consider it in future data protection work. The commissioner will also be required to report on how he has considered the statement in his annual report. I commend the clause to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Clause 28 requires that every three years the Secretary of State publish a statement of strategic priorities for the commissioner to consider, respond to, and have regard to. The statement would be subject to the negative resolution procedure in Parliament, and the commissioner would be obliged to report on what they have done to comply with it annually. Taken in good faith, I see what the clause was intended to achieve. It is, of course, important that the Government’s data priorities are understood by the commissioner. It is also vital that we ensure that the regulator functions in line with the most relevant issues of the day, given the rapidly evolving landscape of technology.

A statement of strategic priorities could, in theory, allow the Government to set out their priorities on data policy in a transparent way, allowing both Ministers and the ICO to be held accountable for their relationship. However, there is and must be a line drawn between the ICO understanding the modern regulatory regime that it will be expected to uphold and political interference in the activities and priorities of the ICO. The Open Rights Group, among others, has expressed concern that the introduction of a statement of strategic priorities could cross that line, exposing the ICO to political direction, making it subject to culture wars and leaving it vulnerable to corporate capture or even corruption.

Although the degree to which those consequences would become a reality given the current strength of our regulator might be up for debate, the very concept of the Government setting out a statement of strategic priorities that must be adhered to by the commissioner at the very least sets out a need for the ICO to follow some sort of politically led direction, something that seems counterintuitive with respect to independence. As I have already argued, an independent ICO is vital not only directly, for data subjects to be sure that their rights will be implemented and for controllers to be sure of their obligations, but indirectly, as a crucial component of our EU adequacy agreement.

Even though the clause may not be intended to threaten independence, we must be extremely careful not to unintentionally embark on a slippery slope, particularly as there are other mechanisms for ensuring that the ICO keeps up with the times and has a transparent relationship with Government. In 2022, the ICO published its new strategic plan, ICO25, which sets out why its work is important, what it wants to be known for and by whom, and how it intends to achieve that by 2025. It describes the ICO’s purpose, objectives and values and the shift in approach that it aims to achieve through the life of the plan, acknowledging that its work is

“complex, fast moving and ever changing.”

The plan was informed by extensive stakeholder consultation and by the responsibilities that the ICO has been given by Parliament. There are therefore ways for the ICO to communicate openly with Government, Parliament and other relevant stakeholders to ensure that its direction is in keeping with the most relevant challenges and with updates to legislation and Government activity. Ministers might have been better off encouraging transparent reviews, consultations and strategies of that kind, rather than prompting any sort of interference from politicians with the ICO’s priorities.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

We agree about the importance of the independence of the Information Commissioner, but I do not think that the statement, as we have set out, is an attempt to interfere with that. I remind the hon. Lady that in relation to the statement of strategic priorities, she asked the Information Commissioner himself:

“Do you perceive that having any impact on your organisation’s ability to act independently of political direction?”,

and he replied:

“No, I do not believe it will undermine our independence at all.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 6, Q3.]

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The Minister is right to quote the evidence session, but he will perhaps also remember that in a later session Ms Irvine from the Law Society of Scotland said that she was surprised by the answer given by the Information Commissioner.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Ms Irvine may have been surprised. I have to say that we were not. What the Information Commissioner said absolutely chimed with our view of the statement, so I am afraid on this occasion I will disagree with the Law Society of Scotland.

Question put, That the clause stand part of the Bill.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Given the significant number of ways in which personal data can be used, we believe that it is important that the regulator provides guidance for data controllers, particularly on complex and technical areas of the law, and that the guidance should be accessible and enable compliance with the legislation efficiently and easily. We are therefore making a number of reforms to the process by which the Information Commissioner produces statutory codes of practice.

Clause 29 is a technical measure that ensures that all statutory codes of practice issued under the Data Protection Act 2018 follow the same parliamentary procedures, have the same legal effect, and are published and kept under review by the Information Commissioner. Under sections 121 to 124 of the Data Protection Act, the commissioner is obliged to publish four statutory codes of practice: the data sharing code, the direct marketing code, the age-appropriate design code, and the data protection and journalism code. The DPA includes provisions concerning the parliamentary approval process, requirements for publication and review by the commissioner, and details of the legal effect of each of the codes. So far, the commissioner has completed the data sharing code and the age-appropriate design code.

Section 128 of the Act permits the Secretary of State to make regulations requiring the Information Commissioner to prepare other codes that give guidance as to good practice in the processing of personal data. Those powers have not yet been used, but may be useful in the future. However, due to the current drafting of the provisions, any codes required by regulations made by the Secretary of State and issued by the commissioner would not be subject to the same formal parliamentary approval process or review requirements as the codes issued under sections 121 to 124. In addition, they do not have the same legal effect, and courts and tribunals would not be required to take a relevant provision of the code into account when determining a relevant question. Clearly, it is not appropriate to have two different standards of statutory codes of practice. To address that, clause 29 replaces the original section 128 with new section 124A, so that codes required in regulations made by the Secretary of State follow a similar procedure to codes issued under sections 121 to 124.

New section 124A provides the Secretary of State with the power to make regulations requiring the commissioner to produce codes of practice giving guidance as to good practice in the processing of personal data. Before preparing any code, the commissioner must consult the Secretary of State and other interested parties such as trade associations, data subjects and groups representing data subjects. That is similar to the consultation requirements for the existing codes. The parliamentary approval processes and requirements for the ICO to keep existing codes under review are also extended to any new codes required by the Secretary of State. The amendment also ensures that those codes requested by the Secretary of State have the same legal effect as those set out on the face of the DPA.

Clauses 30 and 31 introduce reforms to the process by which the commissioner develops statutory codes of practice for data protection. They require the commissioner to undertake and publish impact assessments, consult with a panel of experts during the development of a code, and submit the final version of a code to the Secretary of State for approval. Those processes will apply to the four statutory codes that the commissioner is already required to produce and to any new statutory codes on the processing of personal data that the commissioner is required to prepare under regulation made by the Secretary of State.

The commissioner will be required to set up and consult a panel of experts when drafting a statutory code. That panel will be made up of relevant stakeholders and, although the commissioner will have discretion over its membership, he or she will be required to explain how the panel was chosen. The panel will consider a draft of a statutory code and submit a report of its recommendations to the commissioner. The commissioner will be required to publish the panel’s response to the code and, if he chooses not to follow a recommendation, the reasons must also be published.

Clause 30 also requires the commissioner to publish impact assessments setting out who will be affected by the new or amended code and the impact it will have on them. While the commissioner currently carries out impact assessments when developing codes of practice, we believe that there are advantages to formalising an approach on the face of the legislation to ensure consistency.

Given the importance of the statutory codes, we believe it is important that there is a further degree of democratic accountability within the process. Therefore, clause 31 requires the commissioner to submit the final version of a statutory code to the Secretary of State for approval.

On that basis, I commend the relevant clauses to the Committee, but I am aware that the hon. Member for Barnsley East wishes to propose an amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I turn first to clauses 29 and 30. Codes of practice will become increasingly important as the remit of the ICO expands and modernises. As such, it is important that the codes are developed in a way that is conducive to the product being as effective and useful as possible.

Although the ICO already carries out impact assessments for new codes of practice, that is only done as best practice and currently does not have any statutory underpinning. It is therefore pleasing to see clauses that will require consistency and high standards when developing new codes, ensuring that the resulting products are as comprehensive and helpful as possible. It is welcome, for example, to see that experts will be consulted in the process of developing these codes, including Government officials, trade associations and data subjects. It is also good to see that the commissioner will be required to publish a statement relating to the establishment of the expert panel, including how and why members were selected.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The problem is that the Government are operating on the basis that everyone is acting in good faith, and although I am sure that the Minister and the current Secretary of State are doing so, we do not know what the future holds. It was incredibly encouraging that throughout the evidence sessions a number of witnesses said they did not feel that adequacy was at threat. That is welcome and reassuring, but only the EU Commission can give us adequacy. I am afraid the Minister simply has not done enough to alleviate my concerns about the independence of the ICO. I understand that the Minister disagrees with the Law Society of Scotland, but the full quote was:

“The ICO is tasked with producing statutory codes of conduct, which are incredibly useful for my clients and for anyone working in this sector. The fact that the Secretary of State can, in effect, overrule these is concerning, and it must be seen as a limit on the Information Commissioner’s independence.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 74, Q156.]

As such, I will push my amendment to a vote.

Question put and agreed to.

Clause 29 accordingly ordered to stand part of the Bill.

Clause 30 ordered to stand part of the Bill.

Clause 31

Codes of practice: approval by the Secretary of State

Amendment proposed: 111, in clause 31, page 56, line 30, leave out lines 30 and 31 and insert—

“(6) If the Commissioner submits a revised code under subsection (5)(b), the Secretary of State must approve the code.”—(Stephanie Peacock.)

This amendment seeks to limit the ability of the Secretary of State to require the Commissioner to provide a revised code to only one occasion, after which the Secretary of State must approve the revised code.

Question put, That the amendment be made.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The new threshold contained in the clause has been discussed in debates under clause 7, and I refer hon. Members to my remarks in those debates, as many of the same concerns apply. The guidance that will be needed to interpret the terms “vexatious” and “excessive” should be no less applicable to the Information Commissioner, whose co-operation with data subjects and transparency should be exemplary, not least because the functioning of the regulator inherently sets an example for other organisations on how the rules should be followed.

Question put and agreed to.

Clause 32, as amended, accordingly ordered to stand part of the Bill.

Clause 33

Analysis of performance

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 33 introduces the requirement for the Information Commissioner to prepare and publish an analysis of their performance, using key performance indicators. The regulator will be required to publish that analysis at least annually. The commissioner will have the discretion to decide which factors effectively measure their performance.

Improving the commissioner’s monitoring and reporting mechanisms will strengthen their accountability to Parliament, organisations and the public, who have an interest in the commissioner’s effectiveness. Performance measurement will also have benefits for the commissioner, including by supporting their work of measuring progress towards their objectives and ensuring that resources are prioritised in the right areas. I urge that clause 33 stand part of the Bill.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I welcome the clause, as did the majority of respondents who supported the proposal in the “Data: a new direction” consultation. As recognised by the Government’s response to their consultation, respondents felt the proposal would allow for the performance of the ICO to be assessed publicly and provide evidence of how the ICO is meeting its statutory obligations. We should do all we can to promote accountability, transparency and public awareness of the obligations and performance of the ICO. The clause allows for just that.

Question put and agreed to.

Clause 33 accordingly ordered to stand part of the Bill.

Clause 34

Power of the Commissioner to require documents

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clauses 35 to 38 stand part.

Government amendment 47.

Clause 42 stand part.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

This is a slightly chunkier set of clauses and amendments, so I will not be as brief as in the last two debates.

Clause 34 is a clarificatory amendment to the Information Commissioner’s powers in section 142 of the Data Protection Act to require information. Its purpose is to clarify the commissioner’s existing powers to put it beyond doubt that the commissioner can require specific documents as well as information when using the information notice power. Subsections (3) to (7) of the clause make consequential amendments to references to information notices elsewhere in the Data Protection Act.

Clause 35 makes provision for the Information Commissioner to require a data controller or processor to commission a report from an approved person on a specified matter when exercising the power under section 146 of the Data Protection Act to issue an assessment notice. The aim of the power is to ensure that the regulator can access information necessary to its investigations.

In the event of a data breach, the commissioner is heavily dependent on the information that the organisation provides. If it fails to share information—for example, because it lacks the capability to provide it—that can limit the commissioner’s ability to conduct a thorough investigation. Of course, if the organisation is able to provide the necessary information, it is not expected that the power would be used. The commissioner is required to act proportionately, so we expect that the power would be used only in a small minority of investigations, likely to be those that are particularly complex and technical in nature.

Clause 36 grants the Information Commissioner the power to require a person to attend an interview and answer questions when investigating a suspected failure to comply with data protection legislation. At the moment, the Information Commissioner can only interview people who attend voluntarily, which means there is a heavy reliance on documentary evidence. Sometimes that is ambiguous or incomplete and can lead to uncertainty. The ability to require a person to attend an interview will help to explain an organisation’s practices or evidence submitted, and circumvent a protracted and potentially fruitless series of back-and-forth communication via information notices. The power is based on existing comparable powers for the Financial Conduct Authority and the Competition and Markets Authority.

Clause 37 amends the provisions for the Information Commissioner to impose penalties set out in the Data Protection Act. It will allow the commissioner more time, where needed, to issue a final penalty notice after issuing a notice of intent. At the moment the Act requires the commissioner to issue a notice of intent to issue a penalty notice; the commissioner then has up to six months to issue the penalty notice unless an extension is agreed. That can prove difficult in some cases—for instance, if the organisation under investigation submits new evidence that affects the case at a late stage, or when the legal representations are particularly complex. The clause allows the regulator more time to issue a final penalty notice after issuing a notice of intent, where that is needed. That will benefit business, as it means the commissioner can give organisations more time to prepare their representations, and will result in better outcomes by ensuring that the commissioner has sufficient time to assess representations and draw his conclusions.

Clause 38 introduces the requirement for the Information Commissioner to produce and publish an annual report on regulatory activity. The report will include the commissioner’s investigatory activity and how the regulator has exercised its enforcement powers. That will lead to greater transparency of the commissioner’s regulatory activity.

Clauses 34 to 37, as I said, make changes to the Data Protection Act 2018 in respect of the Information Commissioner’s enforcement powers. Consequential on clauses 35 and 36, clause 42 makes changes to the Electronic Identification and Trust Services for Electronic Transactions Regulations 2016, known as the EITSET regulations. The EITSET regulations extend and modify the Information Commissioner’s enforcement powers to apply to its role as the supervisory body for trust service providers under the UK regulations on electronic identification and trust services for electronic transactions, known as the UK eIDAS. Clause 42 amends the EITSET regulations to ensure that the new enforcement powers introduced by clauses 34 to 37 are available to the Information Commissioner for the purposes of regulating trust service providers.

The new powers will help to ensure that the Information Commissioner is able to access the evidence needed to inform investigations. The powers will result in more informed investigations and, we believe, better outcomes. Clause 42 ensures that the Information Commissioner will continue to be able to act as an effective supervisory body for trust service providers established in the UK.

Government amendment 47 amends schedule 2 to the EITSET regulations. The amendment 2 is consequential to the amendment of section 155(3)(c) of the Data Protection Act made by schedule 4 to the Bill. The amendment to schedule 2 will remove the reference to consultation under section 65 of the Data Protection Act when section 155 is applied. It is necessary to remove reference to section 65 of the Data Protection Act when section 155 is applied with modification under schedule 2, as consultation requirements under that section are not relevant to the regulation of trust service providers under the UK eIDAS.

I hope that that is helpful to Members in explaining the merits of our approach to ensuring that the Information Commissioner has the right enforcement tools at its disposal and continues to be an effective and transparent regulator. I commend the clauses and Government amendment 47 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I will speak to each of the relevant clauses in turn. On clause 34, I am satisfied that the clarification that the Information Commissioner can require documents as well as information is necessary and will be of use to the regulator. I am pleased therefore pleased to accept the clause as drafted and to move on to the other clauses in this part.

Clause 35 provides for the commissioner to require an approved person to prepare a report on a specified matter, as well as to provide statutory guidance on, first, the factors it considers when deciding to require such a report and, secondly, the factors it considers when determining whom the approved person might be. That power to commission technical reports is one that the vast majority of respondents to the “Data: a new direction” consultation supported, as they felt it would lead to better informed ICO investigations. Any measures that help the ICO to carry out its duties rigorously and to better effect, while ensuring that relevant safeguards apply, are measures that I believe Members across the Committee will want to support.

In the consultation, however, the power was originally framed to commission a “technical report”, implying that it would be limited to particularly complex and technical investigations where there is significant risk of harm or detriment to data subjects. Although the commissioner is required to produce guidance on the circumstances in which a report might be required, I would still like clarification from the Minister of why such a limit was not included in the Bill as drafted. Does he expect it to be covered by the guidance produced by the ICO? Such a clarification is necessary not because we are against clause 35 in principle, just in acknowledgement that ICO’s powers—indeed, enforcement powers generally—must always be proportionate to the task at hand.

Furthermore, some stakeholders have said that it is unclear whether privilege will attach to reports required by the ICO and whether they may be disclosable to third parties who request copies of them. Greater clarity about how the power will operate in practice would therefore be appreciated.

Turning to clause 36, it is a core function of the ICO to monitor and enforce the UK’s data protection legislation and rules, providing accountability against the activities of all controllers, processors and individuals. To fulfil that function, the ICO may have to conduct an investigation to establish a body of evidence and determine whether someone has failed to comply with the legislation. The Government’s consultation document said that the ICO sometimes faces problems engaging organisations in those investigations, despite their having a duty to co-operate fully, especially in relation to interviews, as many people are nervous of negative consequences in their life or career if they participate in one. However, interviews are a crucial tool for investigations, as not all the relevant evidence will be available in written form. Indeed, that may become even more the case after the passing of this Bill, due to the reduced requirements to keep records, conduct data protection impact assessments and assign data protection officers—all of which contribute to a larger pool of documentation tracking data processing.

Clause 36, which will explicitly allow the ICO to compel witnesses to comply with interviews as part of an investigation, will, where necessary, ensure that as much relevant evidence as possible is obtained to inform the ICO’s judgment. That is something that we absolutely welcome. It is also welcome to see the safeguards that will be put in place under this clause, including the right not to self-incriminate and exemptions from giving answers that would infringe legal professional privilege or parliamentary privilege. That will ensure that the investigatory powers of the ICO stay proportionate to the issues at hand. In short, clause 36 is one that I am happy to support. After all, what is the purpose of us ensuring that data protection legislation is fit for purpose here today if the ICO is unable to actually determine whether anyone is complying?

On clause 37, it seems entirely reasonable that the ICO may require more than the standard six months to issue a penalty notice in particularly complex investigations. Of course, it remains important that the operations of the ICO are not allowed to slow unduly in cases where a penalty can be issued in the usual timeframe, but where the subject matter is particularly complicated, it makes sense to allow the ICO an extension to enable the investigation to be concluded in the proper, typically comprehensive manner. Indeed, complex investigations may be more common as we adjust to the new data legislation and a rapidly evolving technological landscape. By conducting the investigations properly and paying due attention to particularly technical issues, new precedents can be set that will speed up the regulator’s processes on the whole. Clause 37 is therefore welcomed by us, as it was by the majority of respondents to the Government’s consultation.

Turning to clause 38, as we have said multiple times throughout the progress of this Bill and in Committee, transparency and data protection should go hand in hand. Requiring the ICO to publish information each year on the investigations it has undertaken and the powers it has used will embed a further level of transparency into the regulatory system. Transparency breeds accountability, and requiring the regulator to publish information on the powers it is using will encourage such powers to be used proportionately and appropriately. Publishing an annual report with that information should also give us a better idea of how effectively the new regulatory regime is working. For example, a high volume of cases on a recurring issue could indicate a problem within the framework that needs addressing. Overall, it is welcome that Parliament and the public should be privy to information about how the ICO is discharging its regulatory functions. As a result, I am pleased to support clause 38.

Finally, the amendments to clause 42 are of a consequential nature, and I am happy to proceed without asking any further questions about them.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am most grateful to the hon. Lady for welcoming the vast majority of the provisions within these clauses. She did express some concern about the breadth of the powers available to the Information Commissioner, but I point out that they are subject to a number of safeguards defining how they can be used. The commissioner is required to publish how he will exercise his powers, and that will provide organisations with clarity on the circumstances in which they are to be used.

As the hon. Lady will be aware, like other regulators, the Information Commissioner is subject to the duty under the Legislative and Regulatory Reform Act to exercise their functions

“in a way which is transparent, accountable, proportionate and consistent”,

and,

“targeted only at cases in which action is needed.”

There will also be a right of appeal, which is consistent with the commissioner’s existing powers. On that basis, I hope that the hon. Lady is reassured.

Question put agreed to.

Clause 34 accordingly ordered to stand part of the Bill.

Clauses 35 to 38 ordered to stand part of the Bill.

Clause 39

Complaints to controllers

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I will focus most of my remarks on the group on clauses 39 and 40, as clause 41 and schedule 8 contain mostly consequential provisions, as the Minister outlined.

There are two major sections to the clauses. First, they require a complainant to issue their complaint to the controller directly, through allowing the commissioner to refuse to process their complaint otherwise. Secondly, they require the commissioner to refuse any complaint that is vexatious or excessive. I will speak to both in turn.

As the ICO grows and its remit expands, given the rapidly growing use of data in our society, it makes sense that its resources should be focused where they are most needed. Indeed, when giving evidence to the Committee, the Information Commissioner and Paul Arnold of the ICO stated that their current duty to investigate all complaints is creating a burden on their resources. Therefore, the proposal to require that complainants reach out to their data controller first, before contacting the ICO, seems to make sense, as it will allow the regulator to move away from handling low-level complaints, or complaints that are under way but not yet resolved. Instead, it would be able to refocus resources into handling complaints that have been mishandled or that offer a serious threat to data rights and public trust in data use.

Though that may be seen by some businesses and controllers as shifting an extra requirement on to them, the move should be viewed overall as a positive one, as it will require controllers to have clear processes in place for handling complaints and hopefully incentivise against conducting the kind of unlawful processing that prompts complaints in the first place. Indeed, the ICO already encourages that type of best practice, with complainants often encouraged to speak directly with the relevant data controller first before seeking help from the regulator. The clause would therefore simply formalise the arrangement, providing clarity on three levels. First, it would ensure that data subjects are clear on their right to complain directly to the controller. Secondly, it would ensure that controllers are clear on their duty to respond to such complaints. Finally, the ICO would be certain of its ability to refuse a request if the complainant refuses to comply with that model.

Although it is vital that the ICO is able to modernise and direct efforts where they are most needed, it is also vital that a healthy relationship is kept between the public—as data and decision subjects—and the ICO. The public must feel that the commissioner is there to support them in exercising their rights or seeking redress where necessary, not least because lodging a complaint can already be a difficult and distressing process. Indeed, even the commissioner himself said, when he first assumed his role, that he wanted to

“make it easy for people to access remedies if things go wrong.”

As such, it is pleasing to see safeguards built into the clause that ensure a complainant can still escalate their complaint to the ICO, and appeal any refusal from the commissioner to a tribunal.

Data rights groups, such as the Open Rights Group, hold much more serious concerns about the ability to refuse vexatious and excessive requests. Indeed, they worry that the new power will allow the ICO to ignore widespread and systemic abuses of data rights. As was the case with subject access requests, the difference between a complaint made in anger—which is quite likely, given that the complainant believes they have suffered an abuse of their rights—and a vexatious one must be clearly distinguished. The ICO should not be able to reject complaints of data abuses simply because the complainant acts in ways caused by distress.

As the response of the Government to their consultation reveals, only about half of respondents agreed with the proposal to set out criteria by which the ICO can decide not to investigate a complaint. The safeguard to appeal any refusal from the commissioner is therefore crucial in ensuring that there is a clear pathway for data subjects and decision subjects to dispute the decision of the ICO. It is also right that they should be informed of that safeguard, as well as told why their complaint has been refused, and given the opportunity to complain again with a more complete picture of information.

Overall, the clauses seems to strike the right balance between ensuring safeguards for data and decision subjects while helping the ICO to modernise. However, terms such as “vexatious” and “excessive” must be clearly defined to ensure that the ICO is able to exercise this new power of refusal proportionately and sensibly.

Carol Monaghan Portrait Carol Monaghan
- Hansard - - - Excerpts

I am looking for some clarification from the Minister. Under clause 39, it says:

“A controller must facilitate the making of complaints…such as providing a complaint form which can be completed electronically and by other means.”

Can the Minister clarify whether every data controller will have to provide an electronic means of making a complaint? For many small data controllers, which would include many of us in the room, providing an electronic means of complaint might require additional expertise and cost that they may not have. If it said, “and/or by other means”, which would allow a data controller to provide a paper copy, that might provide a little more reassurance to data controllers.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I understand that the clause contains legal clarifications relating to the interaction of data protection laws with other laws. On that basis, I am happy to proceed.

Question put and agreed to.

Clause 43 accordingly ordered to stand part of the Bill.

Clause 44

Regulations under the UK GDPR

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The clause outlines the process and procedure for making regulations under powers in the UK GDPR. Such provision is needed because the Bill introduces regulation-making powers into the GDPR. There is an equivalent provision in section 182 of the Data Protection Act. Among other things, the clause makes it clear that, before making regulations, the Secretary of State must consult the Information Commissioner and such other persons as they consider appropriate, other than when the made affirmative procedure applies. In such cases, the regulations can be made before Parliament has considered them, but cannot remain as law unless approved by Parliament within a 120-day period.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am sure that the Committee will be pleased to learn that we have now completed part 1 of the Bill. [Hon. Members: “Hear, hear!”]

Clause 46 provides an overview of the provisions in part 2 that are aimed at securing the reliability of digital verification services through a trust framework, a public register, an information gateway and a trust mark.

Clause 47 will require the Secretary of State to prepare and publish the digital verification services trust framework, a set of rules, principles, policies, procedures and standards that an organisation that wishes to become a certified and registered digital verification service provider must follow. The Secretary of State must consult the Information Commissioner and other appropriate persons when preparing the trust framework; that consultation requirement can be satisfied ahead of the clause coming into force. The Secretary of State must review the trust framework every 12 months and must consult the Information Commissioner and other appropriate persons when carrying out the review. I commend both clauses to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Clause 46 defines digital verification services. Central to the definition, and to the framing of the debate on part 2, is the clarification that they are

“services that are provided at the request of an individual”.

That is a crucial distinction: digital verification services and the kinds of digital identity that they enable are not the same as any kind of Government-backed digital ID card, let alone a compulsory one. As we will discuss, it is important that any such services are properly regulated and can be relied on. However, the clause seems to set out a sensible definition that clarifies that all such services operate at individual request and are entirely separate from universal or compulsory digital identities.

I will speak in more depth about clause 47. As we move towards an increasingly digitally focused society, it makes absolute sense that someone should be able, at their own choice, to prove their identity online as well as in the physical world. Providing for a trusted set of digital verification services would facilitate just that, allowing people to prove with security and ease who they are for purposes including opening a bank account or moving house, akin to using physical equivalents like a passport or a proof of address such as a utility bill. It is therefore understandable that the Government, building on their existing UK digital identity and attributes trust framework, want to legislate so that the full framework can be brought into law when it is ready.

In evidence to the Committee, Keith Rosser highlighted the benefits that a digital verification service could bring, using his industry of work and employment as a live case study. He said:

“The biggest impact so far has been on the speed at which employers are able to hire staff”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 52, Q112.]

In a study of 70,000 hires, the digital identity route took an average time of three minutes and 30 seconds, saving about a week compared with having to meet with an employer in person to provide physical documents. That has benefits not only to the individuals, who can start work a week earlier, but to the wider economy, since the same people will start contributing to taxation and their local economy a week earlier too.

Secondly, Keith identified that digital verification could open up remote jobs to people living in areas where employment opportunities are harder to come by. In theory, someone living in my constituency of Barnsley East could be hired in a role that would previously have been available only in London, thanks to their ability to prove who they are without ever having to meet their employer in person.

In the light of those benefits, as well as the potential reduction in fraud from cutting down on the usability of fake documents, in principle it seems only logical to support a framework that would allow trusted digital verification services to flourish. However, the key is to ensure that the framework breeds the trust necessary to make it work. In response to the digital identity call for evidence in 2019, the Government identified that a proportion of respondents were concerned about their privacy when it came to digital verification, saying that without assurances on privacy protections it would be hard to build trust in those systems. It is therefore curious that the Government have not accompanied their framework with any principles to ensure that services are designed and implemented around user needs and that they reflect important privacy and data protection principles.

Can the Minister say why the Government have not considered placing the nine identity assurance principles on the statute book, for example, to be considered when legislating for any framework? Those principles were developed by the Government’s own privacy and consumer advisory group back in 2014; they include ensuring that identity assurance can take place only where consent, transparency, multiplicity of choice, data minimisation and dispute resolution procedures are in place. That would give people the reassurance to trust that the framework is in keeping with their needs and rights, as well as those of industry.

Furthermore, can the Minister explain whether the Government intend to ensure that digital verification will not be the only option in any circumstance, making it mandatory? As Big Brother Watch points out, digital identity is not a practical or desired option, particularly for vulnerable or marginalised groups. Elderly people may not be familiar with such technology, while others might be priced out of it, especially given the recent rise in the cost of broadband and mobile bills attached to inflation. Although we must embrace the opportunities that technology can provide in identity verification, there must also be the ability to opt out and use offline methods of identification where needed, or we will risk leaving people out of participating in key activities such as jobseeking.

Finally, I look forward to hearing more about the governance of digital verification services and the framework. The Bill does not provide a statutory basis for the new office for digital identities and attributes, and there is therefore no established body for the functions related to the framework. It is important that when the new office is established, there is good communication from Government about its powers, duties, functions and funding model. After all, the framework and the principles it supports are only as strong as their enforcement.

Overall, I do not wish to stand in the way of this part of the Bill, with the caveat that I am keen to hear from the Minister on privacy protections, on the creation of the new office and on ensuring that digital verification is the beginning of a new way of verifying one’s identity, not the end of any physical verification options.

Chi Onwurah Portrait Chi Onwurah
- Hansard - - - Excerpts

It is a pleasure to follow my hon. Friend the Member for Barnsley East. I have some general comments, which I intend to make now, on the digital verification services framework introduced and set out in clause 46. I also have some specific comments on subsequent clauses; I will follow your guidance, Mr Hollobone, if it is your view that my comments relate to other clauses and should be made at a later point.

Like my hon. Friend, I recognise the importance of digital verification services and the many steps that the Government are taking to support them, but I am concerned about the lack of coherence between the steps set out in the Bill and other initiatives, consultations and activities elsewhere in Government.

As my hon. Friend said, the Government propose to establish an office for digital identities and attributes, which I understand is not a regulator as such. It would be good to have clarity on the position, as there is no discussion in the Bill of the duties of the new office or any kind of mechanisms for oversight or appeal. What is the relationship between the office for digital identities and attributes and this legislation? The industry has repeatedly called for clarity on the issue. I think we can all agree that a robust and effective regulatory framework is important, particularly as the Bill confers broad information-gathering powers on the Secretary of State. Will the Minister set out his vision and tell us how he sees the services being regulated, what the governance model will be, how the office—which will sit, as I understand it, in the Department for Science, Innovation and Technology—will relate to this legislation, and whether it will be independent of Government?

Will the Minister also help us to understand the relationship between the digital verification services set out in the Bill and other initiatives across Government on digital identity, such as the Government Digital Service’s One Login service, which we understand will be operated across Government services, and the initiatives of the Home Office’s fraud strategy? Is there a relationship between them, or are they separate initiatives? If they are separate, might that be confusing for the sector? I am sure the Minister will agree that we in the UK are fortunate to have world leaders in digital verification, including iProov, Yoti and Onfido. I hope the Minister agrees that for those organisations to continue their world-leading role, they need clarification and understanding of the direction of Government and how this legislation relates to that direction.

Finally, I hope the Minister will agree that digital identity is a global business. Will he say a few words about how he has worked with, or is working with, other countries to ensure that the digital verification services model set out in this legislation is complementary to other services and interoperable as appropriate, and that it builds on the learnings of other digital verification services?

Data Protection and Digital Information (No. 2) Bill (Sixth sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (Sixth sitting)

Stephanie Peacock Excerpts
Committee stage
Thursday 18th May 2023

(11 months, 3 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 18 May 2023 - (18 May 2023)
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

This is a function that will operate within Government. I do not think that it is one where there is any specific need for particular independence, but as I said, I am happy to supply further details about precisely how it will operate if that is helpful to the hon. Lady.

Let me move on from the precise operation of the body. Clause 53 sets out requirements for certified digital verification service providers in relation to obtaining top-up certificates where the Secretary of State revises and republishes the DVS trust framework.

Clause 48 provides that the Secretary of State must establish and maintain a register of digital verification service providers. The register must be made publicly available. The Secretary of State is required to add a digital verification service provider to the register, provided that it has met certain requirements. To gain a place on the register, the provider must first be certified against the trust framework by an accredited conformity assessment body. Secondly, the provider must have applied to be registered in line with the Secretary of State’s application requirements under clause 49. Thirdly, the provider must pay any fee set by the Secretary of State under the power in clause 50.

The United Kingdom Accreditation Service accredits conformity assessment bodies as competent to assess whether a digital verification service meets the requirements set out in the trust framework. That, of course, is an arm’s length body. Assessment is by independent audits, and successful DVS providers are issued with a certificate.

The Secretary of State is prohibited from registering a provider if it has not complied with the registration requirements. An application must be rejected if it is based on a certificate that has expired, has been withdrawn by the issuing body, or is required to be ignored under clause 53 because the trust framework rules have been amended and the provider has not obtained a top-up certificate in time. The Secretary of State must also refuse to register a DVS provider if the provider was removed from the register through enforcement powers under clause 52 and reapplies for registration while still within the specified removal period.

Clause 48(7) provides definitions for “accredited conformity assessment body”, “the Accreditation Regulation”, “conformity assessment body” and “the UK national accreditation body”.

Clause 49 makes provision for the Secretary of State to determine the form of an application for registration in the digital verification services register, the information that an application needs to contain, the documents to be provided with an application and the manner in which an application is to be submitted.

Clause 50 allows the Secretary of State to charge providers a fee on application to be registered in the DVS register. The fee amount is to be determined by the Secretary of State. The clause also allows the Secretary of State to charge already registered providers ongoing fees. The amount and timing of those fees are to be determined by the Secretary of State.

Clauses 51 and 52 confer powers and duties on the Secretary of State in relation to the removal of persons from the register. Clause 51 places a duty on the Secretary of State to remove a provider from the register if certain conditions are met. That will keep the register up to date and ensure that only providers that hold a certificate to prove that they adhere to the standards set in the framework are included in the register. Clause 52 provides a power to the Secretary of State to remove a provider from the register if the Secretary of State is satisfied that the provider is failing to provide services in accordance with the trust framework, or if it has failed to provide the Secretary of State with information as required by a notice issued under clause 58. Clause 52 also contains safeguards in respect of the use of that power.

Clause 53 applies where the Secretary of State revises and republishes the DVS trust framework to include a new rule or to change an existing rule and specifies in the trust framework that a top-up certificate will be required to show compliance with the new rule from a specified date.

I hope that what I have set out is reasonably clear, and on that basis I ask that clauses 48 to 53 stand part of the Bill.

Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

As has been mentioned, a publicly available register of trusted digital verification services is welcome; as a result, so is this set of clauses. A DVS register of this kind will improve transparency for anyone wanting to use a DVS service, as they will be able to confirm easily and freely whether the organisation that they hope to use complies with the trust framework.

However, the worth of the register relies on the worth of the trust framework, because only by getting the trust framework right will we be able to trust those that have been accredited as following it. That will mean including enough in the framework to assure the general public that their rights are protected by it. I am thinking of things such as data minimisation and dispute resolution procedures. I hope that the Department will consider embedding principles of data rights in the framework, as has been mentioned.

As with the framework, the detail of these clauses will come via secondary legislation, and careful attention must be paid to the detail of those measures when they are laid before Parliament. In principle, however, I have no problem with the provisions of the clauses. It seems sensible to enable the Secretary of State to determine a fee for registration, to remove a person from the register upon a change in circumstances, or to remove an organisation if it is failing to comply with the trust framework. Those are all functions that are essential to the register functioning well, although any fees should of course be proportionate to keep market barriers low and ensure that smaller players continue to have access. That facilitates competition and innovation.

Similarly, the idea of top-up certificates seems sensible. Members on both sides of the House have agreed at various points on the importance of future-proofing a Bill such as this, and the digital verification services framework should have space for modernisation and adaptation where necessary. Top-up certificates will allow for the removal of any organisation that is already registered but fails to comply with new rules added to the framework.

The detail of these provisions will be analysed as and when the regulations are introduced, but I will not object to the principle of an accessible and transparent register of accredited digital verification services.

Chi Onwurah Portrait Chi Onwurah
- Hansard - - - Excerpts

I thank the Minister for clarifying the role of the office for digital identities and attributes. Some of the comments I made on clause 46 are probably more applicable here, but I will not repeat them, as I am sure the Committee does not want to hear them a second time. However, I ask the Minister to clarify the process. If a company objects to not being approved for registration or says that it has followed the process set out by the Secretary of State but the Secretary of State does not agree, or if a dispute arises for whatever reason, what appeal process is there, if any, and who is responsible for resolving disputes? That is just one example of the clarity that is necessary for an office of this kind.

Will the Minister clarify the dispute resolution process and whether the office for digital identities and attributes will have a regulatory function? Given the lack of detail on the office, I am concerned about whether it will have the necessary powers and resources. How many people does the Minister envisage working for it? Will they be full-time employees of the office, or will they be job sharing with other duties in his Department?

My other questions are about something I raised earlier, to which the Minister did not refer: international co-operation and regulation. I imagine there will be instances where companies headquartered elsewhere want to offer digital verification services. Will there be compatibility issues with digital verification that is undertaken in other jurisdictions? Is there an international element to the office for digital identities and attributes?

Everyone on the Committee agrees that this is a very important area, and it will only get more important as digital verification becomes even more essential for our everyday working lives. What discussions is the Minister having with the Department for Business and Trade about the kind of market that we might expect to see in digital verification services and ensuring that it is competitive, diverse and across our country?

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 54 creates a permissive power to enable public authorities to share information relating to an individual with registered digital verification service providers. That the power is permissive means that public authorities are not under any obligation to disclose information. The power applies only where a digital verification service provider is registered in the DVS register and the individual has requested the digital verification service from that provider. Information disclosed using the power does not breach any duty of confidentiality or other restrictions relating to the disclosure of information, but the power does not enable the disclosure of information if disclosure would breach data protection legislation. The clause also gives public authorities the power to charge fees for disclosing information.

All information held by His Majesty’s Revenue and Customs is subject to particular statutory safeguards relating to confidentiality. Clause 55 establishes particular safeguards for information disclosed to registered digital verification service providers by His Majesty’s Revenue and Customs under clause 54. The Government will not commence measures to enable the disclosure of information held by HMRC until the commissioners for HMRC are satisfied that the technology and processes for information sharing uphold the particular safeguards relating to taxpayer confidentiality and therefore allow information sharing by HMRC to occur without adverse effect on the tax system or any other functions of HMRC.

Clause 56 obliges the Secretary of State to produce and publish a code of practice about the disclosure of information under clause 54. Public authorities must have regard to the code when disclosing information under this power. Publication of the first version of the code is subject to the affirmative resolution procedure. Publication of subsequent versions of the code is subject to the negative resolution procedure. We will work with the commissioners for HMRC to ensure that the code meets the needs of the tax system.

New clauses 3 and 4 and Government amendments 6 and 7 establish safeguards for information that reflect those already in the Bill under clause 55 for HMRC. Information held by tax authorities in Scotland and Wales—Revenue Scotland and the Welsh Revenue Authority—is subject to similar statutory safeguards relating to confidentiality. These safeguards ensure that confidence and trust in the tax system is maintained. Under these provisions, registered DVS providers may not further disclose information provided by Revenue Scotland or the Welsh Revenue Authority unless they have the consent of that revenue authority to do so. The addition of these provisions will provide an equivalent level of protection for information shared by all three tax authorities in the context of part 2 of the Bill, avoiding any disparity in the treatment of information held by different tax authorities in this context. A similar provision is not required for Northern Irish tax data, as HMRC is responsible for the collection of devolved taxes in Northern Ireland.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Many digital verification services will, to some extent, rely on public authorities being able to share information relating to an individual with an organisation on the DVS register. To create a permissive gateway that allows this to happen, as clause 54 does, is therefore important for the functioning of the entire DVS system, but there must be proper legal limits placed on these disclosures of information, and as ever, any disclosures involving personal data must abide by the minimisation principle, with only the information necessary to verify the person’s identity or the fact about them being passed on. As such, it is pleasing to see in clause 54 the clarification of some of those legal limits, as contained in the likes of data protection legislation and the Investigatory Powers Act 2016. Similarly, clause 55 and the Government new clauses apply the necessary limits on sharing of personal data from HMRC and devolved revenue authorities under clause 54.

Finally, clause 56, which seeks to ensure that a code of practice is published regarding the disclosure of information under clause 54, will be a useful addition to the previous clauses and will ensure that the safety of such disclosures is properly considered in comprehensive detail. The Information Commissioner, with their expertise, will be well placed to help with this, so it is pleasing to see that they will be consulted during the process of designing this code. It is also good to see that this consultation will be able to occur swiftly—before the clause even comes into force—and that the resulting code will be laid before both Houses.

In short, although some disclosures of personal data from public authorities to organisations providing DVS are inevitable, as they are necessary for the very functioning of a verification service, careful attention should be paid to how this is done safely and legally. These clauses, alongside a well-designed framework—as already discussed—will ensure that that is the case.

Question put and agreed to.

Clause 54 accordingly ordered to stand part of the Bill.

Clauses 55 and 56 ordered to stand part of the Bill.

Clause 57

Trust mark for use by registered persons

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 57 makes provision for the Secretary of State to designate a trust mark to a DVS provider. The trust mark is essentially a kitemark that shows that the provider complies with the rules and standards set out in the trust framework, and has been certified by an approved conformity assessment body. The trust mark must be published by the Secretary of State and can only be used by registered digital verification service providers. The clause gives the Secretary of State powers to enforce that restriction in civil proceedings.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Trust marks are useful tools that allow organisations and the general public alike to immediately recognise whether or not a product or service has passed a certain testing standard or criterion. This is especially the case online, where due to misinformation and the prevalence of scams such as phishing, trust in online services can be lower than in the physical world.

The TrustedSite certification, for example, offers online businesses an earned certification programme that helps them to demonstrate that they are compliant with good business practices and maintain high safety standards. This is a benefit not only to the business itself, which is able to convert more users into clicks and sales, but to the users, who do not have to spend time researching each individual business and can explore pages and shop with immediate certainty. A trust mark for digital verification services would serve a similar purpose, enabling certified organisations that meet the trust framework criteria to be immediately recognisable, offering them the opportunity to be used by more people and offering the public assurance that their personal data is being handled by a verified source.

Of course, as is the case with this entire section of the Bill, the trust mark is only worth as much as the framework around it. Ministers should again think carefully about how to ensure that the framework supports the rights of the individual. Furthermore, the trust mark is useful only if people recognise it; otherwise, it cannot provide the immediate reassurance that it is supposed to. When the trust mark is established, what measures will the Department take to raise public awareness of it? In the same vein, to know the mark’s value, the public must also be aware of the trust framework that the mark is measured against, so what further steps will the Department take to increase knowledge and understanding of digital verification services and frameworks? Finally, will the Department publish the details of any identified unlawful use of the trust mark, so that public faith in the reliability of the trust mark remains high?

Overall, the clause is helpful in showing that we take seriously the need to ensure that people do not use digital verification services that may mishandle their data.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am grateful to the hon. Lady for her support. I entirely take her point that a trust mark only really works if people know what it is and can look for it when seeking a DVS provider.

Regarding potential abuse, obviously that is something we will monitor and potentially publicise in due course. All I would say at this stage is that she raises valid points that I am sure we will consider as the new system is implemented.

Question put and agreed to.

Clause 57 accordingly ordered to stand part of the Bill.

Clause 58

Power of Secretary of State to require information

Amendments made: amendment 6, in clause 58, page 84, line 5, after “55” insert

“or (Information disclosed by the Welsh Revenue Authority)”

This amendment prevents the Secretary of State requesting a disclosure of information which would contravene the new clause inserted by NC3.

Amendment 7, in clause 58, page 84, line 5, after “55” insert

“or (Information disclosed by Revenue Scotland)”—(Sir John Whittingdale.)

This amendment prevents the Secretary of State requesting a disclosure of information which would contravene the new clause inserted by NC4.

Question proposed, That the clause, as amended, stand part of the Bill.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

To oversee the DVS register, it is understandable that the Secretary of State may in some cases need to require information from registered bodies to ensure that they are complying with their duties under the framework. It is good that clause 58 provides for that power, and places reasonable legal limits on it, so that disclosures of information do not disrupt legal professional privilege or other important limitations. Likewise, it is sensible that the Secretary of State be given the statutory power to delegate some oversight of the measures in this part in a paid capacity, as is ensured by clause 59.

As I have mentioned many times throughout our scrutiny of the Bill, the Secretary of State may not always have the level of expertise needed to act alone in exercising the powers given to them by such regulations. The input of those with experience and time to commit to ensuring the quality of the regulations will therefore be vital to the success of these clauses. Again, however, we will need more information about the establishment of the OfDIA and the governance of digital identities overall to be able to interpret fully both the delegated powers and the power to require information, and how they will be used. Once again, therefore, I urge transparency from the Government as those governance structures emerge.

That leads nicely to clause 60, which requires the Secretary of State to prepare and publish yearly reports on the operation of this part. A report of that nature will offer the chance to periodically review the functioning of the trust framework, register, trust mark and all other provisions contained in this part, thereby providing an opportunity to identify and rectify any recurring issues that the system may face. That is sensible for any new project, particularly one that, through its transparency, will offer accountability of the Government to the general public, who will be able to read the published reports. In short, there are no major concerns regarding any of the three clauses, though further detail on the governance of digital identities services will need proper scrutiny.

Question put and agreed to.

Clause 58 accordingly ordered to stand part of the Bill.

Clauses 59 and 60 ordered to stand part of the Bill.

Clause 61

Customer data and business data

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I beg to move amendment 46, in clause 61, page 85, line 24, after “supplied” insert “or provided”.

The definition of “business data” in clause 61 refers to the supply or provision of goods, services and digital content. For consistency with that, this amendment amends an example given in the definition so that it refers to what is provided, as well as what is supplied.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

We move on to part 3 of the Bill, concerning smart data usage, which I know is of interest to a number of Members. Before I discuss the detail of clause 61 and amendment 46, I will give a brief overview of this part and the policy intention behind it. The provisions in part 3 allow the Secretary of State or the Treasury to make regulations that introduce what we term “schemes” that compel businesses to share data that they hold on customers with the customer or authorised third parties upon the customer’s request, and to share or publish data that they hold about the services or products that they provide. Regulations under this part will specify what data is in scope within the parameters set out by the clauses, and how it should be shared.

The rest of the clauses in this part permit the Secretary of State or the Treasury to include in the regulations the measures that will underpin these data sharing schemes and ensure that they are subject to proper safeguards—for example, relating to the enforcement of regulations; the accreditation of third party businesses wanting to facilitate data sharing; and how these schemes can be funded through levies and charging. Regulations that introduce schemes, or significantly amend existing schemes, will be subject to prior consultation and parliamentary approval through the affirmative procedure.

The policy intention behind the clauses is to allow for the creation of new smart data schemes, building on the success of open banking in the UK. Smart data schemes establish the secure sharing of customer data and contextual information with authorised third parties on the customer’s request. The third parties can then be authorised by the customer to act on their behalf. The authorised third parties can therefore provide innovative services for the customer, such as analysing spending to identify cost savings or displaying data from multiple accounts in a single portal. The clauses replace existing regulation-making powers relating to the supply of customer data in sections 89 to 91 of the Enterprise and Regulatory Reform Act 2013; those powers are not sufficient for new smart data schemes to be effective.

Clause 61 defines the key terms and concepts for the powers in part 3. We have tabled a minor Government amendment to the clause, which I will explain. The definitions of data holder and trader in subsection (2) explain who may be required to provide data under the regulations. The definitions of customer data and business data deal with the two kinds of data that suppliers may be required to provide. Customer data is information relating to the transactions between the customer and supplier, such as a customer’s consumption of the relevant good or service and how much the customer has paid. Business data is wider contextual data relating to the goods or services supplied or provided by the relevant supplier. Business data may include standard prices, charges or tariffs and information relating to service performance. That information may allow customers to understand their customer data. Government amendment 46 clarifies that a specific example of business data—information about location—refers to the supply or provision of goods or services. It corrects a minor inconsistency in the list of examples of business data in subsection (2)(b).

Subsection (3) concerns who is a customer of the supplying trader, and who can therefore benefit from smart data. Customers may include both consumers and businesses. Subsection (4) enables customers to exercise smart data rights in relation to contracts they have already entered into, and subsection (5) allows the schemes to function through provision of access to data, as opposed to sending data as a one-off transfer.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The clause defines key terms in this part of the Bill, such as business data, customer data and data holder, as well as data regulations, customer and trader. These are key to the regulation-making powers on smart data in part 3, and I have no specific concerns to raise about them at this point.

I note the clarification made by the Minister in his amendment to the example given. As he outlined, that will ensure there is consistency in the definition and understanding of business data. It is good to see areas such as that being cleaned up so that the Bill can be interpreted as easily as possible, given its complexity to many. I am therefore happy to proceed with the Bill.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

I rise to ask the Minister a specific question about the use of smart data in this way. A lot of users will be giving away data a device level, rather than just accessing individual accounts. People are just going to a particular account they are signed into and making transactions, or doing whatever they are doing in that application, on a particular device, but there will be much more gathering of data at the device level. We know that many companies—certainly some of the bigger tech companies—use their apps to gather data not just about what their users do on their particular app, but across their whole device. One of the complaints of Facebook customers is that if they seek to remove their data from Facebook and get it back, the company’s policy is to give them back data only for things they have done while using its applications—Instagram, Facebook or whatever. It retains any device-level data that it has gathered, which could be quite significant, on the basis of privacy—it says that it does not know whether someone else was using the device, so it is not right to hand that data back. Companies are exploiting this anomaly to retain as much data as possible about things that people are doing across a whole range of apps, even when the customer has made a clear request for deletion.

I will be grateful if the Minister can say something about that. If he cannot do so now, will he write to me or say something in the future? When considering the way that these regulations work, particularly in the era of smart data when it will be far more likely that data is gathered across multiple applications, it should be clear what rights customers have to have all that data deleted if they request it.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I share my hon. Friend’s general view. Customers can authorise that their data be shared through devices with other providers, so they should equally have the right to take back that data if they so wish. He invites me to come back to him with greater detail on that point, and we would be very happy to do so.

Amendment 46 agreed to.

Clause 61, as amended, ordered to stand part of the Bill.

Clause 62

Power to make provision in connection with customer data

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 112, in clause 62, page 87, line 2, at end insert—

“(3A) The Secretary of State or the Treasury may only make regulations under this section if—

(a) the Secretary of State or the Treasury has conducted an assessment of the impact the regulations may have on customers, businesses, or industry,

(b) the assessment mentioned in paragraph (a) has been published, and

(c) the assessment concludes that the regulations achieve their objective without imposing disproportionate, untargeted or unnecessary cost on customers or businesses.”

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 113, in clause 62, page 87, line 12, at end insert—

“(5) The Secretary of State or the Treasury may invite a relevant sectoral regulator to contribute to, or to conduct, any impact assessment conducted in order to enable the Secretary of State or the Treasury to fulfil their obligation under subsection (4).”

This amendment would allow the Secretary of State or the Treasury to enable a relevant sectoral regulator to contribute to, or conduct, any impact assessments on smart data regulations.

Amendment 114, in clause 62, page 87, line 12, at end insert—

“(5) The Secretary of State or the Treasury must consult representatives of the relevant business or industry sector to inform their decision whether to make regulations under this section.”

This amendment would require the Secretary of State or the Treasury to consult representatives of the relevant business or industry sector before making smart data regulations.

Amendment 115, in clause 62, page 87, line 12, at end insert—

“(5) Within six months of the passage of this Act, the Secretary of State must—

(a) publish a target date for the coming into force of the first regulations under this section, and

(b) make arrangements for the completion of an assessment of the impact of those regulations.”

This amendment would require Government to identify a target for a first smart data scheme within 6 months, and make arrangements for an impact assessment for these regulations.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Of all the provisions in the Bill, the ones on smart data are those that I am most excited about and pleased to welcome. The potential of introducing smart data schemes is immense: they can bring greater choice to consumers, enable innovation, increase competition and result in the delivery of better products and services. I will address amendments 112 and 113, but I look forward to the opportunity to speak in support of this part more widely.

Most of the detail on how and where smart regimes will be regulated in practice through this Bill will follow in secondary legislation and regulation. That is deliberate and welcome, as it ensures that smart data schemes are built around the realities of the sectors to which they apply. Given that they cannot be included on the face of the Bill, however, it is important that the regulations are prepared in the way that any good data-related law is. There must be a committee of consultation to ensure that the outcome works effectively for consumers and businesses, with the appropriate data protection safeguards.

Indeed, there may be certain sectors in which the costs simply outweigh the benefits of introducing such a regime. Sky believes that there is currently no evidence that a smart data scheme in the communications sector would bring clear and tangible additional benefits to customers. Ofcom consulted on the proposal in 2020 and came to a similar conclusion. Sky argues that the communications sector already has

“a very high bar for supporting consumers to use data to find the best deal for them. For example, in 2020 Ofcom introduced End of Contract Notifications”,

which tell customers when their current contract is ending and what they could save by signing up to another deal. Sky says that Ofcom is

“also in the process of introducing One Touch Switching for fixed broadband which will make it easier for customers to move between providers who operate on different networks”.

As BT identifies, smart data initiatives require significant time and investment to implement. The Government’s impact assessment estimates that the implementation cost for the telecoms sector for a smart data initiative could be anywhere between £610 million and £732 million. That is not to say that the cost outweighs the potential benefits for all industries, including telecoms, but it is important that the Government weigh that up before making any regulations, particularly given that large costs be passed on to consumers, or that there may be less investment in other areas. In the telecoms industry, it could lead to a reduction in investment in full-fibre broadband and 5G. It is imperative, therefore, to ensure that all costs remain targeted, proportionate and necessary to bring about an overall benefit that outweighs the costs. An impact assessment would provide assurance that this has been taken into consideration before any new schemes are introduced.

When conducting such an assessment, sectoral regulators, which can provide expert insight into the impact of smart data in any particular industry, will be well placed to assess the costs and benefits in the detail needed. That is something the Government themselves recognise, as they have placed a requirement in the Bill to consult those regulators. The amendments I propose would strengthen that commitment, allowing relevant sectoral regulators the opportunity, where appropriate, to be formally involved in the process of conducting an impact assessment.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I assure the hon. Lady that I and, no doubt, the whole Committee share her excitement about the potential offered by smart data, and I have sympathy for the intention behind her amendments. However, taking each one in turn, we feel amendment 112 is unnecessary because the requirements are already set by the better regulation framework, the Small Business, Enterprise and Employment Act 2015 and, indeed, these clauses. Departments will conduct an impact assessment in line with the better regulation framework and Green Book guidance when setting up a new smart data scheme, and must demonstrate consideration of their requirements under the Equality Act 2010. That will address the proportionality, targeting and necessity of the scheme.

Moreover, the clauses require the Government to consider the effect of the regulations on matters including customers, businesses and competition. An impact assessment would be an effective approach to meeting those requirements. However, there is a risk that prescribing exactly how a Department should approach the requirements could unnecessarily constrain the policymaking process.

I turn to amendment 113. Clause 74(5) already requires the Secretary of State or the Treasury to consult with relevant sector regulators as they consider appropriate. As part of the process, sector regulators may be asked to contribute to the development of regulatory impact assessments, so we do not believe the amendment is necessary.

On amendment 114, we absolutely share the view of the importance of Government consulting businesses before making regulations. That is why, under clause 74(6), the Secretary of State or the Treasury must, when introducing a smart data scheme, consult such persons as are likely to be affected by the regulations and such sectoral regulators as they consider appropriate. Those persons will include businesses relevant to the envisaged scheme.

On amendment 115, we absolutely share the ambition to grab whatever opportunities smart data offers. In particular, I draw the hon. Lady’s attention to the commitments made last month by the Economic Secretary to the Treasury, who set out the Treasury’s plans to use the smart data powers to provide open banking with a sustainable regulatory framework, while the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), chaired the inaugural meeting of the Smart Data Council last month. That council has been established to support and co-ordinate the development of smart data schemes in a timely manner.

With respect to having a deadline for schemes, we should recognise that implementation of the regulations requires careful consideration. The hon. Member for Barnsley East clearly recognises the importance of consultation and of properly considering the impacts of any new scheme. We are committed to that, and there is a risk that a statutory deadline for making the regulations would jeopardise our due diligence. I assure her that all her concerns are ones that we share, so I hope that she will accept that the amendments are unnecessary.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful to the Minister for those assurances. I am reassured by his comments, and I am happy to beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 63 stand part.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 62 provides the principal regulation-making power to establish smart data schemes in relation to customer data. The clause enables the Secretary of State or the Treasury to make regulations that require data holders to provide customer data either directly to a customer, or to a person they have authorised, at their request. Subsection (3) of the clause also allows for an authorised person who receives the customer data, to exercise the customer’s rights in relation to their data on their behalf. We call that “action initiation”.

An illustrative example could be in open banking, where customers can give authorised third parties access to their data to compare the consumer’s current bank account with similar offers, or to group the contracts within a household together for parents or guardians to better manage children’s accounts. Subsection (3) could allow the authorised third party to update the customer’s contact details across the associated accounts, for example if an email address changes.

Clause 63 outlines the provisions that smart data scheme regulations may contain when relating to customer data. The clause establishes much of the critical framework that smart data schemes will be built on. On that basis, I commend clauses 62 and 63 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

As previously mentioned, and with the caveats that I expressed when I was discussing my amendments, I am extremely pleased to be able to welcome this part of the Bill. In essence, clauses 62 and 63 enable regulations that will allow for customer data to be provided to a third party on request. I will take the opportunity to highlight why that is the case by looking at some of the benefits that smart data can provide.

Since 2018, open banking—by far the most well known and advanced version of smart data in operation—has demonstrated what smart data can deliver over and over again. For the wider economy, the benefits have been remarkable, with the total value to the UK economy now amounting to more than £4.1 billion, according to Coadec, the Coalition for a Digital Economy. Consumers’ experience of banking has been revolutionised if they have consented of their own accord to have third-party applications access their financial data.

Indeed, a whole host of money management tools and apps can now harness people’s financial data to create personalised recommendations based on their spending habits, including how to budget or save. During a cost of living crisis, some of those tools have been extremely valuable in helping people to manage new bills and outgoings. Furthermore, online retailers can now connect directly to someone’s bank so that, rather than spending the time filling in their card details each time they make a purchase, an individual can approve the transaction via their online banking system.

It is important to reiterate that open banking is based on consent, so consumers participate only if they feel it is right for them. As it happens, millions of people have capitalised on the benefits. More than seven million consumers and 50% of small and medium-sized enterprises have used open banking services to gain a holistic view of their finances, to support applications for credit and to pay securely, quickly and cheaply.

Though open banking has brought great success for both consumers and the wider economy, it is also important that the Government learn lessons from its implementation. We must pay close attention to how the introduction of open banking has impacted both the industry and consumers and ensure that any takeaways are factored in when considering an expansion of smart data into new industries.

Further, given that the Government clearly recognise the value of open data, as shown by this section of the Bill, it is a shame that the Bill does not go further in exploring the possibilities of opening datasets in other settings. Labour has explicitly set out to do that in its industrial strategy. For example, we have identified that better, more open datasets on jobs could help us to understand where skills shortages are, allowing jobseekers, training providers and Government to better fill those gaps.

The provisions in clauses 62 and 63 to create new regimes of smart data are therefore welcome, but the Bill unfortunately remains a missed opportunity to fully capitalise on the opportunities of open, secure data flows.

Question put and agreed to.

Clause 62 accordingly ordered to stand part of the Bill.

Clause 63 ordered to stand part of the Bill.

Clause 64

Power to make provision in connection with business data

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clause 65 stand part.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 64 provides the principal regulation-making power for the creation of smart data schemes relating to business data. Regulations created through this clause allow for business data to be provided to the customer of a trader or a third-party recipient. Business data may also be published to be more widely available.

These regulations relating to business data will increase the transparency around the pricing of goods and services, which will increase competition and benefit both consumers and smaller businesses. To give just one example, the Competition and Markets Authority recently highlighted the potential of an open data scheme that compared the prices of fuel at roadside stations, increasing competition and better informing consumers. It is that kind of market intervention that the powers provide for.

Clause 65 outlines provisions that regulations relating to business data may contain. Those provisions are non-exhaustive. The clause largely mirrors clause 63, extending the same protections and benefits to schemes that make use of businesses data exclusively or in tandem with customer data. The clause differs from clause 63 in subsection (2), where an additional consideration is made as to who may make a request for business data. As action initiation relates only to an authorised person exercising a customer’s rights relating to their data, clause 65 does not include the references to that that are made in subsections (7) and (8) of clause 63.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The measures in these clauses largely mirror 62 and 63, but they refer to business data rather than customer data. I therefore refer back to my comments on clause 62 and 63 and the benefits that new regulations such as these might be able to provide. Those remarks provide context as to why I am pleased to support these measures, which will allow the making of regulations that require data holders to share business data with third parties.

However, I would like clarification from the Minister on one point. The explanatory notes explain that the powers will likely be used together with those in clauses 62 and 63, but it would be good to hear confirmation from the Minister on whether there may be circumstances in which the Department envisages using the powers regarding business data distinctly. If there are, will he share examples of those circumstances? It would be good for both industry and Members of this House to have insight into how these clauses, and the regulatory powers they provide, will actually be used.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I think it is probably sensible if I come back to the hon. Lady on that point. I am sure we would be happy to provide examples if there are ones that we can identify.

Question put and agreed to.

Clause 64 accordingly ordered to stand part of the Bill.

Clause 65 ordered to stand part of the Bill.

Clause 66

Decision-makers

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clauses 66 to 72 contain a number of provisions that will allow smart data regulations to function effectively. They are provisions on decision makers who approve and monitor third parties that can access the data, provisions on enforcement of the regulations and provisions on the funding of smart data schemes. It is probably sensible that I go through each one in more detail.

Clause 66 relates to the appointment of persons or accrediting bodies referred to as decision makers. The decision makers may approve the third parties that can access customer and business data, and act on behalf of customers. The decision makers may also revoke or suspend their accreditation, if that is necessary. An accreditation regime provides certainty about the expected governance, security and conduct requirements for businesses that can access data. Customers can be confident their chosen third party meets an appropriate standard. Clause 66 allows the decision maker to monitor compliance with authorisation conditions, subject to safeguards in clause 68.

Clause 67 enables regulations to confer powers of enforcement on a public body. The public body will be the enforcer, responsible for acting upon any breaches of the regulations. We envisage that the enforcer for a smart data scheme is likely to be an existing sectoral regulator, such as the Financial Conduct Authority in open banking. While the clause envisages civil enforcement of the regulations, subsection (6) allows for criminal offences in the case of falsification of information or evidence. Under subsections (3) and (10), the regulations may confer powers of investigation on the enforcer. That may include powers to require the provision of information and powers of entry, search and seizure. Those powers are subject to statutory restrictions in clause 68.

Clause 68 contains provisions limiting the investigatory powers given to enforcers. The primary restriction is that regulations may not require a person to give an enforcer information that would infringe the privileges of Parliament or undermine confidentiality, legal privilege and, subject to the exceptions in subsection (7), privilege against self-incrimination. Subsection (8) prevents any written or oral statement given in response to a request for information in the course of an investigation from being used as evidence against the person being prosecuted for an offence, other than that created by the data regulations.

Clause 69 contains provisions relating to financial penalties and the relevant safeguards. It sets out what regulations must provide for if enabling the use of financial penalties. Subsection (2) requires that the amount of a financial penalty is specified in, or determined in accordance with, the regulations. For example, the regulations may set a maximum financial penalty that an enforcer can impose and they may specify the methodology to be used to determine a specific financial penalty.

Clause 70 enables actors in smart data schemes to require the payment of fees. The circumstances and conditions of the fee charging process will be specified in the regulations. The purpose of the clause, along with clause 71, is to seek to ensure that the costs of smart data schemes, and of bodies exercising functions under them, can be met by the relevant sector.

It is intended that fees may be charged by accrediting bodies and enforcers. For example, regulations could specify that an accrediting body may charge third parties to cover the cost of an accreditation process and ongoing monitoring. Enforcers may also be able to charge to cover or contribute to the cost of any relevant enforcement activities. The regulations may provide for payment of fees only by persons who are directly affected by the performance of duties, or exercise of powers, under the regulations. That includes data holders, customers and those accessing customer and business data.

Clause 71 will enable the regulations to impose a levy on data holders or allow a specified public body to do so. That is to allow arrangements similar to those in section 38 of the Communications Act 2003, which enables the fixing of charges by Ofcom. Together with the provision on fees, the purpose of the levy is to meet all or part of the costs incurred by enforcers and accrediting bodies, or persons acting on their behalf. The intention is to ensure that expenses can be met without incurring a cost to the taxpayer. Levies may be imposed only in respect of data holders that appear to be capable of being directly affected by the exercise of the functions.

Clause 72 provides statutory authority for the Secretary of State or the Treasury to give financial assistance, including to accrediting bodies or enforcers. Subsection (2) provides that the assistance may be given on terms and conditions that are deemed appropriate by the regulation maker. Financial assistance is defined to include both actual or contingent assistance, such as a grant, loan, guarantee or indemnity. It does not include the purchase of shares. I commend clauses 66 to 72 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Clauses 66 to 72 provide for decision makers and enforcers to help with the operation and regulation of new smart data regimes. As was the case with the digital verification services, where I agreed that there was a need for the Secretary of State to have limited powers to ensure compliance with the trust framework, powers will be needed to ensure that any regulations made under this part of the Bill are followed. The introduction in clause 67 of enforcers—public bodies that will, by creating fines, penalties and notices of compliance, ensure that organisations follow regulations made under part 3—is therefore welcome.

As ever, it is pleasing to see that the relevant restrictions on the powers of enforcers are laid out in clause 68, to ensure that they cannot infringe upon other, more fundamental rights. It is also right, as is ensured by clause 69, that there are safeguards on the financial penalties that an enforcer is able to issue. Guidance on the amount of any penalties, as well as a formalised process for issuing notices and allowing for appeal, will provide uniformity across the board so that every enforcer acts proportionately and consistently.

Decision makers allowed for by clause 66 will be important, too, in conjunction with enforcers. They will ensure there is sufficient oversight of the organisations that are enabled to have access to customer or business data through any particular smart data regimes. Clauses 70, 71 and 72, which finance the activities of decision makers and enforcers, follow the trend of sensible provisions that will be required if we are to have confidence that regulations made under this part of the Bill will be adhered to. In short, the measures under this grouping are largely practical, and they are necessary to support clauses 62 to 65.

Question put and agreed to.

Clause 66 accordingly ordered to stand part of the Bill.

Clauses 67 to 72 ordered to stand part of the Bill.

Clause 73

Confidentiality and data protection

Question proposed, That the clause stand part of the Bill

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 74 to 77 stand part.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clauses 73 to 77 relate to confidentiality and data protection; various provisions connected with making the regulations, including consultation, parliamentary scrutiny and a duty to conduct periodic reviews of regulations; and the repeal of the existing regulation-making powers that these clauses replace.

Clause 73(1) allows the regulations to provide that there are no contravening obligations of confidence or other restrictions on the processing of information. Subsection (2) ensures that the regulations do not require or authorise processing that would contravene the data protection legislation. The provisions are in line with the approach taken towards pension dashboards, which are electronic communications services that allow individuals to access information about their pensions.

Clause 74(1) allows the regulation-making powers to be used flexibly. Subsection (1)(f) allows regulations to make provision by reference to specifications or technical requirements. That is essential to allow for effective and safe access to customer data, for instance the rapid updating of IT and security requirements, and it mirrors the powers enacted in relation to pensions dashboards, which I have mentioned. Clause 74(2) provides for limited circumstances in which it may be necessary for regulations to modify primary legislation to allow the regulations to function effectively. For instance, it may be necessary to extend a statutory alternative dispute resolution scheme in a specific sector to cover the activities of a smart data scheme.

Clause 74(3) states that affirmative parliamentary scrutiny will apply to the first regulations made under clauses 62 or 64; that is, affirmative scrutiny will apply to regulations that introduce a scheme. Affirmative parliamentary scrutiny will also be required where primary legislation is modified, where regulations make requirements more onerous for data holders and where the regulations confer monitoring or enforcement functions or make provisions for fees or a levy. Under clause 74(5), prior to making regulations that will be subject to affirmative scrutiny, the Secretary of State or the Treasury must consult persons who are likely to be affected by the regulations, and relevant sectoral regulators, as they consider appropriate.

The Government recognise the importance of enabling the ongoing scrutiny of future regulations, so clause 75 requires the regulation maker to review the regulations at least at five-yearly intervals. Clause 76 repeals the regulation-making powers in sections 89 to 91 of the Enterprise and Regulatory Reform Act 2013, which are no longer adequate to enable the introduction of effective smart data schemes. Those sections are replaced by the clauses in part 3 of the Bill. Clause 77 defines, or refers to definitions of, terms used in part 3 and is essential to the functioning and clarity of part 3. I commend the clauses to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Many of the clauses in this grouping are supplementary to the provisions that we have already discussed, or they provide clarification as to which regulations under part 3 are subject to parliamentary scrutiny. I have no further comments to add on the clauses, other than to welcome them as fundamental to the wider part. However, I specifically welcome clause 75, which requires that the regulations made under this part be periodically reviewed at least every five years.

I hope that such regulations will be under constant review on an informal basis to assess how well they are working, but it is good to see a formal mechanism to ensure that that is the case over the long term. It would have been good, in fact, to see more such provisions throughout the Bill, to ensure that regulations that are made under it work as intended. Overall, I hope it is clear that I am very supportive of this part’s enabling of smart data regimes. I look forward to it coming into force and unlocking the innovation and consumer benefits that such schemes will provide.

Question put and agreed to.

Clause 73 accordingly ordered to stand part of the Bill.

Clause 74 to 77 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Data Protection and Digital Information (No. 2) Bill (Seventh sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (Seventh sitting)

Stephanie Peacock Excerpts
Committee stage
Tuesday 23rd May 2023

(11 months, 2 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 23 May 2023 - (23 May 2023)
Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

On a point of order, Mr Paisley. I would like to correct the record regarding my comments on clause 13, which appear in column 148 of the Committee proceedings in Hansard for Tuesday 16 May. I referred to the views of Lexology and included a quote, which I attributed to that organisation, when in fact the views and quote in question were those of an organisation named Prighter, which were simply published by Lexology.

None Portrait The Chair
- Hansard -

Thank you for that clarification.

Clause 78

The PEC Regulations

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Paisley. Welcome to the Committee.

The Privacy and Electronic Communications (EC Directive) Regulations 2003 place specific requirements on organisations in relation to use of personal data in electronic communications. They include, for example, rules on the use of emails, texts and phone calls for direct marketing purposes and the use of cookies and similar technologies.

Trade associations have told us that sometimes their members need guidance on complying with the legislation that is more bespoke than the general regulatory guidance from the Information Commissioner’s Office. New clause 2 will allow representative bodies to design codes of conduct on complying with the PEC regulations that reflect their specific processing operations. There are already similar provisions in articles 40 and 41 of the UK General Data Protection Regulation to help organisations in particular sectors to comply.

Importantly, codes of conduct prepared under these provisions can be contained in the same document as codes of conduct under the UK GDPR. That will be particularly beneficial to representative bodies that are developing codes for processing activities that are subject to the requirements of both the UK GDPR and the PEC regulations. New clause 2 envisages that representative bodies will draw up voluntary codes of conduct and then seek formal approval of them from the Information Commissioner. The Information Commissioner will approve a code only if it contains a mechanism for the representative body to monitor their members’ compliance with the code.

New clause 1 makes a related amendment to article 41 of the UK GDPR to clarify that bodies accredited to monitor compliance with codes of conduct under the GDPR are required to notify the Information Commissioner only if they suspend or exclude a person from a code. Government amendment 5 is a minor and technical amendment necessary as a consequence of new clause 2.

These provisions are being put into the Bill at the suggestion of business organisations. We hope that they will allow organisations to comply more easily with the requirements.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

It is a pleasure to serve under your chairship, Mr Paisley, and I too welcome you to the Committee.

As I have said more than once in our discussions, in many cases the burden of following regulations can be eased just as much by providing clarification, guidance and support as by removing regulation altogether. I advocated for codes of practice in more detail in the discussion of such codes in the public sector, under clause 19, and during our debates on clauses 29 and 30, when we were discussing ICO codes more generally. New clauses 1 and 2 seem to recognise the value of codes of practice too, and both seek to provide either clarification or the sharing of best practice in terms of following the PEC regulations. I have no problem with proceeding with the Bill with these inclusions.

Amendment 5 agreed to.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I beg to move amendment 48, in clause 78, page 100, line 30, after “86” insert “and [Pre-commencement consultation]”.

This amendment is consequential on NC7.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

New clause 7 clarifies that the consultation requirements imposed by the Bill in connection with or under the PEC regulations can be satisfied by consultation that takes place before the relevant provision of the Bill comes into force. That ensures that the consultation work that supports development of policy before the Bill is passed can continue and is not paused unnecessarily. A similar provision was included in section 182 of the Data Protection Act 2018. Government amendment 48 is a minor and technical amendment which is necessary as a consequence of new clause 7. I commend the new clause and amendment to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The new clause and accompanying amendment seek to expedite work on consultation in relation to the measures in this part. It makes sense that consultation can begin before the Bill comes into force, to ensure that regulations can be acted on promptly after its passing. I have concerns about various clauses in this part, but no specific concerns about the overarching new clause, and am happy to move on to discussing the substance of the clauses to which it relates.

Amendment 48 agreed to.

Question proposed, That the clause, as amended, stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 78 introduces part 4 of the Bill, which amends the Privacy and Electronic Communications (EC Directive) Regulations 2003. Clauses 79 to 86 refer to them as “the PEC Regulations” for short. They sit alongside the Data Protection Act and the UK GDPR. We will debate some of the more detailed provisions in the next few clauses.

Question put and agreed to.

Clause 78, as amended, accordingly ordered to stand part of the Bill.

Clause 79

Storing information in the terminal equipment of a subscriber or user

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 116, in clause 79, page 101, line 15, leave out

“making improvements to the service”

and insert

“making changes to the service which are intended to improve the user’s experience”.

Cookies are small text files that are downloaded on to somebody’s computer or smartphone when they access a website; they allow the website to recognise the person’s device, and to store information about the user’s preferences or past actions. The current rules around using cookies, set out in regulation 6 of the PEC regulations, dictate that organisations must tell people that the cookies are there, explain what the cookies are doing and why, and finally get the person’s freely given, specific and informed consent to store cookies on their device. However, at the moment there is almost universal agreement that the system is not working as intended.

To comply with the legislation, most website have adopted what is known as a cookie banner—a notice that pops up when a user first visits the site, prompting them to indicate which cookies they are happy with. However, due to the sheer volume of those banners, in many cases people no longer feel they are giving consent because they are informed or because they freely wish to give it, but are doing so simply because the banners stop them using the website as they wish.

In their communications regarding the Bill, the Government have focused on reducing cookie fatigue, branding it one of the headline achievements of the legislation. Unfortunately, as I will argue throughout our debates on clause 79, I do not believe that the Bill will fix the problem in the way that users hope. The new exemptions to the consent requirement for purposes that present a low risk to privacy may reduce the number of circumstances in which permission might be required, but there will still be a wide-ranging list of circumstances where consent is still required.

If the aim is to reduce cookie fatigue for users, as the Government have framed the clause, the exemptions must centre on the experience of users. If they do not, the clause is not about reducing consent fatigue, but rather about legitimising large networks of online surveillance of internet users. With that in mind, amendment 116 would narrow the exemption for collecting statistical information with a view to improving a service so that it is clear that any such improvements are exclusively considered to be those from the user’s perspective. That would ensure that the term “improvements” cannot be interpreted as including sweeping changes for commercial benefit, but is instead focused only on benefits to users.

I will speak to proposed new regulation 6B when we debate later amendments, but I reiterate that I have absolute sympathy for the intention behind the clause and want as much as anyone to see an end to constant cookie banners where possible. However, we must place the consumer and user experience at the heart of any such changes. That is what we hope to ensure through the amendment, with respect to the list of exemptions.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am grateful to the hon. Lady for making it clear that the Opposition share our general objective in the clause. As she points out, the intention of cookies has been undermined by their ubiquity when they are placed as banners right at the start. Clause 79 removes the requirement to seek consent for the placement of audience measurement cookies. That means, for example, that a business could place cookies to count the number of visitors to its website without seeking the consent of web users via a cookie pop-up notice. The intention is that the organisation could use the statistical information collected to understand how its service is being used, with a view to improving it. Amendment 116 would mean that “improvements to the service” would be narrowed in scope to mean improvements to the user’s experience of the service, but while that is certainly one desirable outcome of the new exception, we want it to enable organisations to make improvements for their own purposes, and these may not necessarily directly improve the user’s experience of the service.

Organisations have repeatedly told us how important the responsible use of data is for their growth. For example, a business may want to use information collected to improve navigation of its service to improve sales. It could use the information collected to make improvements to the back-end IT functionality of its website, which the user may not be aware of. Or it could even decide to withdraw parts of its service that had low numbers of users; those users could then find that their experience was impaired rather than improved, but the business could invest the savings gained to improve other parts of the service. We do not think that businesses should be prevented from improving services in this way, but the new exception provides safeguards to prevent them from sharing the collected data with anyone else, except for the same purpose of making improvements to the service. On that basis, I hope the hon. Lady will consider withdrawing her amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful for the Minister’s answer. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 79 reforms regulation 6 of the Privacy and Electronic Communications (EC Directive) Regulations 2003, which sets the rules on when an organisation can store information or gain access to information stored on a person’s device—for example, their computer, phone or tablet. This is commonly described as the cookies rule, but it includes similar technologies such as tracking pixels and device fingerprinting. Currently, organisations do not have to seek a user’s consent to place cookies that are strictly necessary to provide a service requested by the user—for example, to detect fraud or remember items in a user’s online shopping basket.

To reduce the number of cookie pop-up notices that can spoil web users’ enjoyment of the internet, clause 79 will remove the requirement for organisations to seek consent for several low privacy risk purposes, including the installation of software updates necessary for the security of the device. Government amendments 49 and 51 remove the user’s right to opt out of the software security update and the right to remove an update after it has taken effect. Government amendment 50 removes the right to disable an update before it takes effect.

Although these measures were initially included in the Bill to give web users a choice about whether security updates were installed, stakeholders have subsequently advised us that the failure to install certain updates could result in a high level of risk to the security of users’ devices and personal information. We have been reflecting on the provisions since the Bill was introduced, and have concluded that removing them is the right thing to do, in the interests of security of web users. Even if these provisions are omitted, organisations will still need to provide users with clear and comprehensive information about the purpose of software security updates. Web users will also still have the right to postpone an update for a limited time before it takes effect.

Government amendment 54 concerns the regulation-making powers under the new PEC regulations. One of the main aims is to ensure that web users are empowered to use automated technology such as browsers and apps to select their choices regarding which cookies they are willing to accept. The Secretary of State could use powers under these provisions to require consent management tools to meet certain standards or specifications. so that web users can make clear, meaningful choices once and have those choices respected throughout their use of the internet.

The Committee will note that new regulation 6B already requires the Secretary of State to consult the Information Commissioner and other interested parties before making any new regulations on consent management tools. Government amendment 54 adds the Competition and Markets Authority as a required consultee. That will help ensure that any competition impacts are properly considered when developing new regulations that set standards of design.

Finally, Government amendments 52 and 53 make minor and technical changes that will ensure that future regulations made under the reformed PEC regulations can include transitional, transitory or savings provisions. These will simply ensure there is a smooth transition to the new regime if the Secretary of State decides to make use of these new powers. I commend the amendments to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I understand that amendments 49 to 51 primarily remove the option for subscribers or users to object to or disable an update or software for security reasons. As techUK has highlighted, the PEC regulations already contain an exemption on cookie consent for things that are strictly necessary, and it was widely accepted that security purposes met this exemption. This is reflected by its inclusion in the list of things that meet the criteria in new paragraph (5).

However, in the Bill the Government also include security updates in the stand-alone exemption list. This section introduces additional conditions that are not present in the existing law, including the requirement to offer users an opt-out from the security update and the ability to disable or postpone it. The fact that this overlap has been clarified by removing the additional conditions seems sensible. Although user choice has value, it is important that we do not leave people vulnerable to known security flaws.

In principle, Government amendment 54 is a move in the right direction. I will speak to regulation 6B in more detail when we discuss amendment 117 and explain why we want to remove it. If the regulation is to remain, it is vital that the Competition and Markets Authority be consulted before regulations are made due to the impact they will likely have in entrenching power in the hands of browser owners. That the Government have recognised that it was an oversight not to involve the CMA in any consultations is really pleasing. I offer my full support to the amendment in that context, though I do not believe it goes far enough and will advocate the removal of regulation 6B entirely in due course.

Amendment 49 agreed to.

Amendments made: 50, in clause 79, page 102, line 25, leave out “disable or”.

Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement for subscribers and users to be able to disable, not just postpone, the update.

Amendment 51, in clause 79, page 102, leave out lines 27 to 29.

Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement that, where the update takes effect, the subscriber or user can remove or disable the software.

Amendment 52, in clause 79, page 104, line 20, leave out “or supplementary provision” and insert

“, supplementary, transitional, transitory or saving provision, including provision”.—(Sir John Whittingdale.)

This amendment provides that regulations under the new regulation 6A of the PEC Regulations, inserted by clause 79, can include transitional, transitory or saving provision.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 117, in clause 79, page 104, line 32, leave out from the beginning to end of line 38 on page 105.

I begin by re-emphasising my overarching support for exploring ways to reduce consent fatigue and cookie banners. However, because of the direction that new regulation 6B takes us in, it requires far more consultation before entering the statute book. My amendment seeks to remove it. Regulation 6B aims, at some point in the future, to enable users to express any consent they wish to give or objections they wish to make regarding cookies to an operator of a website—commonly a browser—so that this can be done automatically on visiting the website. The three main concerns I have with this must be addressed and consulted on before such a regulation becomes law.

I am concerned that it will pose concerns for competition if browsers, often owned by powerful global tech companies, are given centralised control and access to data surrounding cookies across the entire internet. That concern was echoed by the Advertising Association and the CEO of the Data and Marketing Association during an oral evidence session. When asked whether there was any concern that centralising cookies by browser will entrench power in the hands of the larger tech companies that own the browsers, Chris Combemale answered:

“It certainly would give even greater market control to those companies.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 21, Q43.]

He said:

“If anything, we need more control in the hands of the people who invest in creating the content”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 21, Q42.]

online.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I entirely agree with my hon. Friend. He accurately sums up the reason that the Government decided it was important that the Competition and Markets Authority would have an input into the development of any facility to allow browser users to set their preferences at the browser level. We will see whether, with the advent of other browsers, AI-generated search engines and so on, the dominance is maintained, but I think he is absolutely right that this will remain an issue that the Competition and Markets Authority needs to keep under review.

That is the purpose of Government amendment 54, which will ensure that any competition impacts are considered properly. For example, we want any review of regulations to be relevant and fair to both smaller publishers and big tech. On that basis, I hope that the hon. Member for Barnsley East will consider withdrawing her amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I appreciate the Minister’s comments and the Government change involving the CMA, but we simply do not believe that that is worth putting into law. We just do not know the full implications, as echoed by the hon. Member for Folkestone and Hythe. I will therefore press my amendment to a Division.

Question put, That the amendment be made.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I shall not repeat all that has been said about the purpose of the clause. To recap quickly, consent is required for any non-essential functions, such as audience measurement, design optimisation, presentation of adverts and tracking across websites but, clearly, the current system is not working well. Researchers found that people often click yes to cookies to make the banner go away and because they want to access the service quickly.

The clause will remove the requirement for organisations to seek consent to cookies placed for several low privacy risk purposes. As a result of the new exceptions we are introducing, web users should know that if they continue to see cookie pop-up messages it is because they relate to more intrusive uses of cookies. It is possible that we may identify additional types of non-intrusive cookies in the future, so the clause permits the Secretary of State to make regulations amending the exceptions to the consent requirement or introducing new exceptions.

The changes will not completely remove the existence of cookie pop-ups. However, we are committed to working with tech companies and consumer groups to promote technologies that help people to set their online preferences at browser level or by using apps. Such technology has the potential to reduce further the number of pop-ups that appear on websites. Alongside the Bill, we will take forward work to discuss what can be done further to develop and raise awareness of possible technological solutions. On that basis, I commend the clause to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I spoke in detail about my issues with the clause during our debates on amendments 116 and 117, but overall I commend the Government’s intention to explore ways to end cookie fatigue. Although I unfortunately do not believe that these changes will solve the issues, it is pleasing that the Government are looking at ways to reduce the need for consent where the risk for privacy is low. I will therefore not stand in the way of the clause, beyond voicing my opposition to regulation 6B.

Question put and agreed to.

Clause 79, as amended, accordingly ordered to stand part of the Bill.

Clause 80

Unreceived communications

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 81 and 82 stand part.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 80 provides an additional power for the Information Commissioner when investigating unsolicited direct marketing through telephone calls, texts and emails—more commonly known as nuisance calls or nuisance communications.

Some unscrupulous direct marketing companies generate hundreds of thousands of calls to consumers who have not consented to be contacted. That can affect the most vulnerable in our society, some of whom may agree to buy products or services that they did not want or cannot afford. Successive Governments have taken a range of actions over the years—for example, by banning unsolicited calls from claims management firms and pensions providers—but the problem persists and further action is needed.

Under the Privacy and Electronic Communications (EC Directive) Regulations 2003, the Information Commissioner can investigate and take enforcement action against rogue companies where there is evidence that unsolicited marketing communications have been received by the recipient. The changes we are making in clause 80 will enable the Information Commissioner to take action in relation to unsolicited marketing communications that have been generated, as well as those received or connected.

Not every call that is generated reaches its intended target. For example, an individual may be out or may simply not pick up the phone. However, the potential for harm should be a relevant factor in any enforcement action by the Information Commissioner’s Office. The application of the regulations, through the changes in clause 80, to communications generated will more accurately reflect the level of intent to cause disturbance.

Clause 81 is a minor and technical clause that should improve the readability of the PEC regulations. The definition of “direct marketing”, which the PEC regulations rely on, is currently found in the Data Protection Act 1998. To help the reader quickly locate the definition, the clause adds the definition to the PEC regulations themselves.

Under the current PEC regulations, businesses can already send direct marketing to existing customers, subject to certain safeguards. That is sometimes known as the soft opt-in rule. Clause 82 applies the same rule to non-commercial organisations, such as charities. The changes will mean that charitable, political and non-commercial organisations will be able to send direct marketing communications to persons who have previously expressed an interest in the organisation’s aims and ideals.

The current soft opt-in rules for business are subject to certain safeguards. We have applied the same safeguards to these new provisions for non-commercial organisations. We think these changes will help non-commercial organisations, including charities and political parties, to build ongoing relationships with their supporters. There is no good reason why the soft opt-in rule should apply to businesses but not to non-commercial organisations. I hope Members will see the benefit of these measures in ensuring the balance between protecting the most vulnerable in society and supporting organisations. I commend clauses 80 to 82 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

As I have said many times during our discussion of the Bill, I believe that the Information Commissioner should be given proportionate powers to investigate and take action where that is needed to uphold our regulations. That is no less the case with clause 80, which introduces measures that allow the Information Commissioner to investigate organisations responsible for generating unsolicited direct marketing communications, even if they are not received by anyone.

Clause 81 simply lifts the definition of “direct marketing” from the Data Protection Act 1998 and places it into the PEC regulations to increase the readability of that legislation. I have no issues with that.

Clause 82 extends the soft opt-in rules to charities and non-commercial organisations. It is only right that the legislation is consistent in offering non-profits the opportunity to send electronic marketing communications in the same way as for-profit organisations. It might, however, be worth raising the public’s awareness of the rule and of the ability to opt out at any point. If they suddenly find themselves on the end of such communications, they will have a clear understanding of why that is the case and that consent may be withdrawn if they so wish.

Question put and agreed to.

Clause 80 accordingly ordered to stand part of the Bill.

Clauses 81 and 82 ordered to stand part of the Bill.

Clause 83

Direct marketing for the purposes of democratic engagement

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Before I speak to the amendment, I will set out the provisions of clause 83, which gives the Secretary of State the power to make exceptions to the PEC regulations’ direct marketing provisions for communications sent for the purposes of democratic engagement. We do not intend to use the powers immediately because the Bill contains a range of other measures that will facilitate a responsible use of personal data for the purposes of political campaigning, including the extension of the soft opt-in rule that we have just debated. However, it is important we keep the changes we are making in the Bill under review to make sure that elected representatives and parties can continue to engage transparently with the electorate and are not unnecessarily constrained by data protection and privacy rules.

The Committee will note that if the Secretary of State decided to exercise the powers, there are a number of safeguards in the clause that will maintain a sensible balance between the need for healthy interaction with the electorate and any expectations that an individual might have with regard to privacy rights. Any new exceptions would be limited to communications sent by the individuals and organisations listed in clause 83, including elected representatives, registered political parties and permitted participants in referendum campaigns.

Before laying any regulations under the clause, the Secretary of State will need to consult the Information Commissioner and other interested parties, and have specific regard for the effect that further exceptions could have on the privacy of individuals. Regulations will require parliamentary approval via the affirmative resolution procedure. Committee members should also bear in mind that the powers will not affect an individual’s right under the UK GDPR to opt out of receiving communications.

We have also tabled two technical amendments to the clause to improve the way it is drafted. Government amendment 55 will make it clear that regulations made under this power can include transitory or savings provisions in addition to transitional provisions. Such provisions might be necessary if, for example, new exceptions were only to apply for a time-limited period. Clause 84 is also technical in nature and simply sets out the meaning of terms such as “candidate”, “elected representative” and “permitted participant” for the purposes of clause 83.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The clauses mirror somewhat the involvement of democratic engagement purposes on the recognised legitimate interests list. However, here, rather than giving elected representatives and the like an exemption from completing a balancing test when processing under this purpose, the Bill paves the way for them to be exempt from certain direct marketing provisions in future.

The specific content of any future changes, however, should be properly scrutinised. As such, it is disappointing that the Government have not indicated how they intend to use such regulations in future. I appreciate that the Minister has just said that they do not intend to use them right now. Does he have in mind any examples of any exemptions that he might like to make from the direct marketing provisions for democratic engagement purposes? That is not to say that such exemptions will not be justified; just that their substance should be openly discussed and democratically scrutinised.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As I have set out, the existing data protection provisions remain under the GDPR. In terms of specific exemptions, I have said that the list will be subject to future regulation making, which will be also subject to parliamentary scrutiny. We will be happy to supply a letter to the hon. Lady to set out specific examples of where that might be the case.

Amendment 55 agreed to.

Clause 83, as amended, ordered to stand part of the Bill.

Clause 84

Meaning of expressions in section 83

Amendment made: 31, in clause 84, page 110, line 31, leave out “fourth day after” and insert

“period of 30 days beginning with the day after”.—(Sir John Whittingdale.)

Clauses 83 and 84 enable regulations to make exceptions from direct marketing rules in the PEC Regulations, including for certain processing by elected representatives. This amendment increases the period for which former members of the Westminster Parliament and the devolved legislatures continue to be treated as "elected representatives" following an election. See also NC6 and Amendment 30.

Clause 84, as amended, ordered to stand part of the Bill.

Clause 85

Duty to notify the Commissioner of unlawful direct marketing

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The ambition of the clause is broadly welcome, and we agree that there is a need to tackle unwanted calls, but the communications sector, including Vodafone and BT, as well as techUK, has shared concerns that the clause, which will place a new duty on telecoms providers to report to the commissioner whenever they have “reasonable grounds” for suspecting a breach of direct marketing regulations, might not be the best way to solve the issue.

I will focus my remarks on highlighting those concerns, and how amendment 118 would address some of them. First, though, let me say that the Government have already made it clear in their explanatory notes that it is not the intention of the Bill to require providers to monitor communications. However, that has not been included in the Bill, which has caused some confusion in the communications sector.

Amendment 118 would put that confusion to rest by providing for the explicit inclusion of the clarification in the clause itself. That would provide assurances to customers who would be sure their calls and texts would not be monitored, and to telecoms companies, which would be certain that such monitoring of content was absolutely not required of them.

Secondly, the intent of the clause is indeed not to have companies monitoring communications, but many relevant companies have raised concerns around the technological feasibility of identifying instances of unlawful and unsolicited direct marketing. Indeed, the new duty will require telecommunications providers to be able to identify whether a person receiving a direct marketing call has or has not given consent to receive the call from the company making it. However, providers have said they cannot reliably know that, and have warned that there is no existing technology to conduct that kind of monitoring accurately and at scale. In the absence of communication monitoring and examples of how unsolicited direct marketing is to be identified, it is therefore unclear how companies will fulfil their duties under the clause.

That is not to say the industry is not prepared to commit significant resources to tackling unwanted calls. BT, for example, has set up a range of successful tools to help customers. That includes BT Call Protect, which is used by 4.4 million BT customers and now averages 2.35 million calls diverted per week. However, new measures must be feasible, and our amendment 118 would therefore require that guidance around the implementation of the clause include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening, or has contravened, any of the direct marketing regulations.

If the Minister does not intend to support the amendment, I would like to hear such examples from him today, so that the communications sector was absolutely clear about how to fulfil its new duties, given the technology available.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As the hon. Lady has said, amendment 118 would require the commissioner to state clearly in the guidance that the new duty does not oblige providers to intercept or monitor the content of electronic communications in order to determine whether there has been a contravention of the rules. It would also require the guidance to include illustrative examples of the types of activity that may cause a provider reasonably to suspect that there had been a contravention of the requirements.

I recognise that the amendment echoes concerns that have been raised by communications service providers, and that there has been some apprehension about exactly what companies will have to do to comply with the duty. In response, I would emphasise that “reasonable grounds” does mean reasonable in all circumstances.

The hon. Lady has asked for an example of the kind of activity that might give reasonable grounds for suspicion. I direct her to the remarks I made in moving the amendment and the example of a very large number of calls being generated in rapid succession in which, in each case, the telephone number is simply one digit away from the number before. The speed at which that takes place does provide reasonable grounds to suspect that the requirement to, for instance, check with the TPS is not being fulfilled.

There are simple examples of that kind, but I draw the attention of the hon. Lady and the Committee to the consultation requirements that will apply to the ICO’s guidance. In addition to consulting providers of public electronic communications networks and services on the development of the guidance, the ICO will be required to consult the Secretary of State, Ofcom and other relevant stakeholders to ensure that the guidance is as practical and useful to organisations as possible.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I completely agree; my hon. Friend is right to make that distinction. Companies should use their best endeavours, but it is worth repeating that the guidance does not expect service and network providers to monitor the content of individual calls and messages to comply with the duty. There is more interest in patterns of activity on networks, such as where a rogue direct marketing firm behaves in the manner that I set out. On that basis, I ask the hon. Lady not to press her amendment to a vote.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I appreciate the Minister’s comments and those of the hon. Member for Folkestone and Hythe. We have no issue with the monitoring of patterns; we wanted clarification on the content. I am not sure that the Minister addressed the concerns about the fact that, although the Government have provided a partial clarification in the explanatory notes, this is not in the Bill. For that reason, I will press my amendment to a vote.

Amendment 56 agreed to.

Amendment proposed: 118, in clause 85, page 113, line 3, at end insert—

“(1A) Guidance under this section must—

(a) make clear that a provider of a public electronic communications service is not obligated to monitor the content of individual electronic communications in order to determine whether those communications contravene the direct marketing regulations; and

(b) include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening or has contravened any of the direct marketing regulations.”—(Stephanie Peacock.)

Question put, That the amendment be made.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Before turning specifically to the provisions of the amendment, I will set out the provisions of clause 86 and schedule 10. Clause 86 updates the ICO’s powers in respect of enforcing the PEC regulations. Currently, the ICO has to rely mainly on outdated powers in the Data Protection Act 1998 to enforce breaches of the PEC regulations. The powers were not updated when the UK GDPR and the Data Protection Act came into force in 2018. That means that some relatively serious breaches of the PEC regulations, such as nuisance calls being generated on an industrial scale, cannot be investigated as effectively or punished as severely as breaches under the data protection legislation.

The clause will therefore give the ICO the same investigatory and enforcement powers in relation to breaches of the PEC regulations as currently apply to breaches of the UK GDPR and the 2018 Act. That will result in a legal framework that is more consistent and predictable for organisations, particularly for those with processing activities that engage both the PEC regulations and the UK GDPR.

Clause 86 and schedule 10 add a new schedule to the PEC regulations, which sets out how the investigatory and enforcement powers in the 2018 Act will be applied to the PEC regulations. Among other things, that includes the power for the Information Commissioner to impose information notices, assessment notices, interview notices and enforcement and penalty notices. The maximum penalty that the Information Commissioner can impose for the most serious breaches of the PEC regulations will be increased to the same levels that can be imposed under the UK GDPR and the Data Protection Act. That is up to 4% of a company’s annual turnover or £17.5 million, whichever is higher.

Relevant criminal offences under the Data Protection Act, such as the offence of deliberately frustrating an investigation by the Information Commissioner by destroying or falsifying information, are also applied to the PEC regulations. The updated enforcement provisions in new schedule 1 to the PEC regulations will retain some pre-existing powers that are unique to the previous regulations.

Clause 86 also updates regulation 5C of the PEC regulations, which sets out the fixed penalty amount for a failure to report a personal data breach under regulation 5. Currently, the fine level is set at £1,000. The clause introduces a regulation-making power, which will be subject to the affirmative procedure, for the Secretary of State to increase the fine level. We have tabled Government amendment 57 to provide an explicit requirement for the Secretary of State to consult the Information Commissioner and any other persons the Secretary of State considers appropriate before making new regulations. The amendment also confirms that regulations made under the power can include transitional provisions.

Finally, we have tabled two further minor amendments to schedule 10. Government amendment 58 makes a minor correction by inserting a missing schedule number. Government amendment 32 adjusts the provision that applies section 155(3)(c) of the Data Protection Act for the purposes of the PEC regulations. That is necessary as that section is being amended by schedule 4. Without making those corrective amendments, the provisions will not achieve the intended effect.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Clause 86 and schedule 10 insert and clarify the commissioner’s enforcement powers with regards to privacy and electronic communications regulation. Particularly of note within the proposals is the move to increase fines for nuisance calls and messages to a higher maximum penalty of £17.5 million or 4% of the undertaking’s total annual worldwide turnover, whichever is higher. That is one of the Government’s headline commitments in the Bill and should create tougher punishments for those who are unlawfully pestering people through their phones.

We are in complete agreement that more must be done to stop unwanted communications. However, to solve the problem as a whole, we must take stronger action on scam calling as well as on instances of unsolicited direct marketing. Labour has committed to going further than Ofcom’s new controls on overseas scam calls and has proposed the following to close loopholes: first, no phone call made from overseas using a UK telephone number should have that number displayed when it appears on a UK mobile phone or digital landline; and secondly, all mobile calls from overseas using a UK number should be blocked unless the network provider confirms that the known bill payer for the number is currently roaming. To mitigate the fact that some legitimate industries rely on overseas call centres that handle genuine customer service requests, we will also require Ofcom to register those legitimate companies and their numbers as exceptions to the blocking.

As the clause and schedule seek to take strong action against unwanted communications, I would be pleased to hear from the Minister whether the Government would consider going further and matching our commitments on overseas scam calling, too.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I say to the hon. Lady that the provisions deal specifically with nuisance calls, not necessarily scam calls. As she will know, the Government have a comprehensive set of policies designed to address fraud committed through malicious or scam calls, and those are being processed through the fraud prevention strategy. I accept that more needs to be done and say to her that it is already taking place.

Amendment 57 agreed to.

Clause 86, as amended, ordered to stand part of the Bill.

Schedule 10

Privacy and electronic communications: Commissioner’s enforcement powers

Amendments made: 32, in schedule 10, page 180, line 25, leave out “for “data subjects”” and insert

“for the words from “data subjects” to the end”.

This amendment adjusts provision applying section 155(3)(c) of the Data Protection Act 2018 (penalty notices) for the purposes of the PEC Regulations to take account of the amendment of section 155(3)(c) by Schedule 4 to the Bill.

Amendment 58, in schedule 10, page 183, line 5, at end insert “15”.—(John Whittingdale.)

This amendment inserts a missing Schedule number, so that the provision refers to Schedule 15 to the Data Protection Act 2018.

Schedule 10, as amended, agreed to.

Clause 87

The eIDAS Regulation

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
I hope that Members will recognise the merits of that approach. As the digital economy grows and the demand for UK-based qualified trust service providers is rising, these clauses will ensure that the UK’s trust services framework is future-proofed and able to support the growing demand for trusted digital transactions globally.
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

“Trust services” refers to services including those relating to electronic signatures, electronic seals, timestamps, electronic delivery services and website authentication. As has been mentioned, trust services are required to meet certain standards and technical specifications for operation across the UK economy, which are outlined under eIDAS regulations. These clauses seek to make logistical adjustments to that legal framework for trust service products and services within in the UK.

Although we understand that the changes are intended to enable flexibility in case EU regulations should no longer be adequate, and absolutely agree that we must future-proof regulations to ensure that standards are always kept high, we must also ensure that any changes made are necessary, to ensure that standards remain high, rather than being made simply for their own sake. It is vital that any alterations made are genuinely intended to improve current practices and have been thoroughly considered to ensure that they are making positive and meaningful change.

Question put and agreed to.

Clause 87 accordingly ordered to stand part of the Bill.

Clauses 88 to 91 ordered to stand part of the Bill.

Clause 92

Disclosure of information to improve public service delivery to undertakings

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The clause will amend the Digital Economy Act 2017 to extend the powers under section 35 to include businesses. Existing powers enable public authorities to share data to support better services to individuals and households. The Government believe that businesses too can benefit from responsive, joined-up public services across the digital economy. The clause introduces new data sharing powers allowing specified public authorities to share data with other specified public authorities for the purposes of fulfilling their functions.

The sharing of data will also provide benefits for the public in a number of ways. It will pave the way for businesses to access Government services more conveniently, efficiently and securely—by using digital verification services, accessing support when trying to start up new businesses, completing import and export processes or applying for Government grants such as rural grants, for example. Any data sharing will of course be carried out in accordance with the requirements of the Data Protection Act and the UK GDPR.

Being able to share data about businesses will bring many benefits. For example, by improving productivity while keeping employment high we can earn more, raising living standards, providing funds to support our public services and improving the quality of life for all citizens. Now that we have left the EU, businesses that take action to improve their productivity will increase their resilience to changing market conditions and be more globally competitive. The Minister will be able to make regulations to add new public authorities to those already listed in schedule 4 to the Digital Economy Act. However, any regulations would be made by the affirmative procedure, requiring the approval of both Houses. I commend the clause to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The clause amends section 35 of the Digital Economy Act to enable specified public authorities to share information to improve the delivery of public services to businesses with other specified persons. That echoes the existing legal gateway that allows for the sharing of information on improving the delivery of public services to individuals and households.

I believe that the clause is a sensible extension, but would have preferred the Minister and his Department to have considered public service delivery more broadly when drafting the Bill. While attention has rightly been paid throughout the Bill to making data protection regulation work in the interests of businesses, far less attention has gone towards how we can harness data for the public good and use it to the benefit of our public services. That is a real missed opportunity, which Labour would certainly have taken.

Question put and agreed to.

Clause 92 accordingly ordered to stand part of the Bill.

Clause 93

Implementation of law enforcement information-sharing agreements

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I beg to move amendment 8, in clause 93, page 119, line 18, leave out first “Secretary of State” and insert “appropriate national authority”.

This amendment, Amendment 10 and NC5 enable the regulation-making power conferred by clause 93 to be exercised concurrently by the Secretary of State and, in relation to devolved matters, by Scottish Ministers and Welsh Ministers.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 93 creates a delegated power for the Secretary of State, and a concurrent power for Welsh and Scottish Ministers, to make regulations to implement international agreements relating to the sharing of information for law enforcement purposes. The concurrent power for Welsh and Scottish Ministers has been included in an amendment to the clause. While international relations are a reserved matter, the domestic implementation of the provisions likely to be contained in future international agreements may be devolved, given that law enforcement is a devolved matter to various extents in each devolved Administration.

In the light of introducing a concurrent power for Welsh and Scottish Ministers, amendments to clauses 93 and 108 have been tabled, as has new clause 5. Together they specifically detail the appropriate national authority that will have the power to make regulations in respect of clause 93. The Government amendments make it clear that the appropriate national authority may make the regulations. New clause 5 then defines who is an appropriate national authority for those purposes. I therefore commend new clause 5 and the related Government amendments to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

It is right that the powers conferred by clause 93 can be exercised by devolved Ministers where appropriate. I therefore have no objections to the amendments or the new clause.

Amendment 8 agreed to.

Amendments made: 9, in clause 93, page 119, line 18, leave out second “Secretary of State” and insert “authority”.

This amendment is consequential on Amendment 8.

Amendment 10, in clause 93, page 119, line 36, at end insert—

‘“appropriate national authority” has the meaning given in section (Meaning of “appropriate national authority”);’.(Sir John Whittingdale.)

See the explanatory statement for Amendment 8.

Question proposed, That the clause, as amended, stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As I have already set out, clause 93 creates a delegated power for the Secretary of State, along with a concurrent power for Welsh and Scottish Ministers, to make regulations to implement international agreements relating to the sharing of information for law enforcement purposes. The legislation will provide powers to implement technical aspects of such international agreements via secondary legislation once the agreements have been negotiated.

Clause 93 stipulates that regulations can be made in connection with implementing an international agreement only in so far as it relates to the sharing of information for law enforcement purposes, and that any data sharing must comply with data protection legislation. These measures will enable the implementation of new international agreements designed to help keep the public safe from the threat posed by international criminality and cross-border crime, as well as helping to protect vulnerable people.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Hmm. Okay.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The clause allows the Secretary of State to make regulations to enact an international agreement for the sharing of information for law enforcement purposes. The substance of any such agreement will likely therefore come through secondary legislation and, as such, it will be appropriate at that point to scrutinise their contents. If the Minister and his Department have identified any targets for such agreements at this stage, I am sure that the Committee would be grateful to hear of them. If not, however, I expect that he would update the House of that through the usual channels.

Question put and agreed to.

Clause 93, as amended, accordingly ordered to stand part of the Bill.

Clause 94

Form in which registers of births and deaths are to be kept

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clauses 95 to 98 stand part.

That schedule 11 be the Eleventh schedule to the Bill.

--- Later in debate ---
Before sitting down, I pay tribute to my hon. Friend the Member for Solihull (Julian Knight), who attempted to introduce a number of these provisions via a private Member’s Bill, which unfortunately did not make it through. His intention is now to be put into law as a result of the measures in this Bill.
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Clauses 94 to 98 amend the Births and Deaths Registration Act, with the overall effect of removing the provision for birth and death records to be kept on paper, and allowing them to be held in an online database. This is a positive move, with the potential to bring many benefits. First, it will improve the functioning of the registration system—for example, it will allow the Registrar General and the superintendent registrar to have immediate access to all birth and death entries as soon as they have been entered into the system. The changes will undoubtedly be important to families who are experiencing joy or loss, because they make registrations easier and more likely to be correct in the first instance, minimising unnecessary clarifications at what can often be a very difficult time. Indeed, one of the recommendations of the 2022 UK Commission on Bereavement’s landmark report, which looked at the key challenges facing bereaved people in this country, was that it should be possible to register deaths online.

It is great that the Government have chosen to pursue this change. However, despite it being the recommendation listed right next to online death registration, the Government have not used this opportunity to explore the potential of extending the Tell Us Once service, which is disappointing. Indeed, the existing Tell Us Once service has proved very helpful to bereaved people in reducing the administrative burden they face, by enabling them to inform a large number of Government and public sector bodies in one process, rather than forcing them to go through the same process time and again. However, private organisations are not included, and loved ones are still tasked with contacting organisations such as employers, energy and electricity companies, banks, telephone and internet providers, and more. At a time of emotional struggle, this is a huge administrative burden to place on the bereaved and leaves them vulnerable to other unsettling variables, such as communication barriers and potentially insensitive customer service.

The commission found that 61% of adult respondents reported experiencing practical challenges when notifying the organisations that need to be made aware of the death of a loved one. We are therefore disappointed that the Government have not explored whether the Bill could extend the policy to the private sector in order to further reduce the burden on grieving friends and families, and make the inevitably difficult process a little easier. Overall, however, the clauses will mark a positive change for families up and down the country, and we are pleased to see them implemented.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I merely say to the hon. Lady that, having used the Tell Us Once service myself in relation to the death of my mother not that long ago, I absolutely hear what she says about the importance of making the process as easy as possible. We will certainly consider what she says.

Question put and agreed to.

Clause 94 accordingly ordered to stand part of the Bill.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Information standards govern how data can be shared and compared across a sector. They are important in every sector in which they operate, but particularly in health, where they are critical to enabling the information sharing and interoperability necessary for good patient outcomes across health and social care services. For many reasons, however, we do not have a standard national approach to health data; as such, patients receive a far from seamless experience between different healthcare services. The Bill’s technical amendments and clarifications of existing rules on information standards in health, and how they interact with IT and IT services, are small but good steps in the journey towards trying resolve that.

Tom Schumacher of Medtronic told us in oral evidence that one of the problems faced by his organisation and NHS trusts is

“variability in technical and IT security standards.”

He suggested that harmonising those standards would be a “real opportunity,” since it would mean that

“each trust does not have to decide for itself which international standard to use and which local standard to use.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 42, Q90.]

However, it is unclear how much headway these IT-related changes will make in providing that harmonisation, let alone the seamless service that patients so often call for.

I have one query that I hope the Minister can help with. MedConfidential has shared with us a concern that new section 251ZE of the Health and Social Care Act 2012 on accreditation of information technology, which is introduced by schedule 12, seems to imply that the Department of Health and Social Care and NHS England will have the power to set data standards in social care. MedConfidential says that would be a major policy shift, and that it seems unusual to implement such a shift through an otherwise unrelated Bill. Will the Minister write to me to clarify whether it is the Government’s intention to have DHSC and NHS England take over the information infrastructure of social care—and, if so, why they have come to that decision?

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am grateful to the hon. Lady for her support in general. I hear the concern that she expressed on behalf of the firm that has been in contact with her. We will certainly look into that, and I will be happy to let her have a written response in due course.

Mr Paisley, might I beg the Committee’s indulgence to correct the record? I incorrectly credited the hon. Member for Solihull for the private Member’s Bill, but it was in fact my hon. Friend the Member for Meriden (Saqib Bhatti). I apologise to him for getting his constituency wrong—

Data Protection and Digital Information (No. 2) Bill (Eighth sitting) Debate

Full Debate: Read Full Debate

Stephanie Peacock

Main Page: Stephanie Peacock (Labour - Barnsley East)

Data Protection and Digital Information (No. 2) Bill (Eighth sitting)

Stephanie Peacock Excerpts
Committee stage
Tuesday 23rd May 2023

(11 months, 2 weeks ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 23 May 2023 - (23 May 2023)
John Whittingdale Portrait The Minister for Data and Digital Infrastructure (Sir John Whittingdale)
- Hansard - - - Excerpts

We now turn to part 5 of the Bill. Clauses 100 to 103 and schedule 13 will establish a body corporate, the Information Commission, to replace the existing regulator, the Information Commissioner, which is currently structured as a corporation sole. I should make it clear that the clauses will make no changes to the regulator’s role and responsibilities; all the functions that rest with the Information Commissioner will continue to sit with the new Information Commission.

Clause 100 will establish a body corporate, the Information Commission, to replace the existing regulator, the Information Commissioner. The commission will be governed by an independent board, with chair and chief executive roles, thereby spreading the responsibilities of the Information Commissioner across a larger number of people.

Clause 101 will abolish the office of the Information Commissioner and amend the Data Protection Act 2018 accordingly. To ensure an orderly transfer of functions, the Information Commissioner’s Office will not be abolished until the new body corporate, the Information Commission, is established.

Clause 102 provides for all regulatory and other functions of the Information Commissioner to be transferred to the new body corporate, the Information Commission, once it is established. The clause also provides for references to the Information Commissioner in enactments or other documents to be treated as references to the Information Commission, where appropriate, as a result of the transfer of functions to the new Information Commission.

Clause 103 will allow the Secretary of State to make a scheme for the transfer of property, rights and liabilities, including rights and liabilities relating to employment contracts, from the commissioner to the new commission. The scheme may transfer property such as IT equipment or office furniture, or transfer staff currently employed by the commissioner to the commission. The transfer scheme will be designed to ensure continuity and facilitate a seamless transition to the new Information Commission.

Schedule 13 will insert a new schedule 12A to the Data Protection Act 2018, which describes the nature, form and governance structure of the new body corporate, the Information Commission. The commission will be governed by an independent statutory board, which will consist of a chair and other non-executive members, as well as executive members including a chief executive. The new structure formalises aspects of the existing governance arrangements of the Information Commissioner’s Office and brings the ICO in line with how other UK regulators, such as Ofcom and the Financial Conduct Authority, are governed. The chair of the new commission will be appointed by His Majesty by letters patent on the recommendation of the Secretary of State, as is currently the case for the commissioner.

Schedule 13 also provides for the current Information Commissioner to transfer to the role of chair of the Information Commission for the remainder of their term. I put on record the Government’s intention to preserve the title of Information Commissioner in respect of the chair, in acknowledgment of the fact that the commissioner’s brand is recognised and valued both domestically and internationally. Other non-executive members will be appointed by the Secretary of State, and the chief executive will be appointed by the non-executive members in consultation with the Secretary of State.

Government amendment 45 will allow the chair to appoint the first chief executive on an interim basis and for a term of up to a maximum of 24 months, which will minimise any delay in the transition from the commissioner to the new commission. As drafted, the Bill provides that the chief executive of the commission will be appointed by the non-executive members once they are in place, in consultation with the Secretary of State. The transition from the commissioner to the new Information Commission cannot take place until the board is properly constituted, with, as a minimum, a chair, another non-executive member and a chief executive in place. That requirement would be likely to cause delay to the transition, as the appointment of the non-executive members by the Secretary of State and the chief executive would need to take place consecutively.

Amendment 44 is a minor consequential amendment to paragraph 3(3)(a) of proposed new schedule 12A, making it clear that the interim chief executive is appointed as an executive member.

The amendments seek to minimise any delay in the transfer of functions to the new commission by enabling the appointment of the chief executive to take place in parallel with the appointments process for non-executive members. The appointment of the interim chief executive will be made on the basis of fair and open competition and in consultation with the Secretary of State. I commend clauses 100 to 103, schedule 13 and Government amendments 44 and 45 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- Hansard - -

It is a pleasure to serve under your chairship once again, Mr Hollobone. The clauses that restructure the Information Commissioner’s Office are among those that the Opposition are pleased to welcome in the Bill.

The Information Commissioner is the UK’s independent regulator for data protection and freedom of information under the Data Protection Act 2018 and the Freedom of Information Act 2000. Under the current system, as the Minister outlined, the Information Commissioner’s Office is a corporation sole, meaning that one person has overall responsibility for data protection and freedom of information, with a group of staff supporting them. However, as the use of data in our society has grown, so too has the ICO, from a team of 10 in 1984 to an organisation with more than 500 staff.

In that context, the corporation sole model is obviously not fit for purpose. Clauses 100 to 103 recognise that: they propose changes that will modernise the Information Commissioner’s Office, turning it into the Information Commission by abolishing the corporation sole and replacing it with a body corporate. It is absolutely right that those changes be made, transforming the regulator into a commission with a broader set-up structure and a board of executives, among other key changes. That will bring the ICO in line with other established UK regulators such as Ofcom and the Financial Conduct Authority, reflect the fact that the ICO is not just a small commissioner’s office, and ensure that it is equipped to deal with the volume of work for which it has responsibility.

It is essential that the ICO remains independent and fair. We agree that moving from an individual to a body will ensure greater integrity, although the concerns that I have raised about the impact of earlier clauses on the ICO’s independence certainly remain. Overall, however, we are pleased that the Government recognise that the ICO must be brought in line with other established regulators and are making much-needed changes, which we support.

Question put and agreed to.

Clause 100 accordingly ordered to stand part of the Bill.

Schedule 13

The Information Commission

Amendments made: 44, in schedule 13, page 195, line 21, after “members” insert

“or in accordance with paragraph 23A”.

This amendment is consequential on Amendment 45.

Amendment 45, in schedule 13, page 204, line 6, at end insert—

Transitional provision: interim chief executive

23A (1) The first chief executive of the Commission is to be appointed by the chair of the Commission.

(2) Before making the appointment the chair must consult the Secretary of State.

(3) The appointment must be for a term of not more than 2 years.

(4) The chair may extend the term of the appointment but not so the term as extended is more than 2 years.

(5) For the term of appointment, the person appointed under sub-paragraph (1) is ”the interim chief executive”.

(6) Until the expiry of the term of appointment, the powers conferred on the non-executive members by paragraph 11(2) and (3) are exercisable in respect of the interim chief executive by the chair (instead of by the non-executive members).

(7) In sub-paragraphs (5) and (6), the references to the term of appointment are to the term of appointment described in sub-paragraph (3), including any extension of the term under sub-paragraph (4).”—(Sir John Whittingdale.)

The Bill establishes the Information Commission. This new paragraph enables the chair of the new body, in consultation with the Secretary of State, to appoint the first chief executive (as opposed to the appointment being made by non-executive members). It also enables the chair to determine the terms and conditions, pay, pensions etc relating to the appointment.

Schedule 13, as amended, agreed to.

Clauses 101 to 103 ordered to stand part of the Bill.

Clause 104

Oversight of retention and use of biometric material

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 104 will repeal the role of the Biometrics Commissioner and transfer the casework functions to the Investigatory Powers Commissioner. There is an extensive legal framework to ensure that the police can make effective use of biometrics, for example as part of an investigation to quickly and reliably identify suspects, while maintaining public trust. That includes the Police and Criminal Evidence Act 1984, which sets out detailed rules on DNA and fingerprints, and the Data Protection Act 2018, which provides an overarching framework for the processing of all personal data.

The oversight framework is complicated, however, and there are overlapping responsibilities. The Bio -metrics Commissioner currently has specific oversight responsibilities just for police use of DNA and fingerprints, while the Information Commissioner’s Office regulates the use of all personal data, including biometrics, by any organisation, including the police. Clause 104 will simplify the framework by removing the overlap, leaving the ICO to provide independent oversight and transferring the casework functions to another existing body.

The casework involves extending retention periods in certain circumstances, particularly on national security grounds, and is quasi-judicial in nature. That is why clause 104 transfers those functions to the independent Investigatory Powers Commissioner, which has the necessary expertise, and avoids the conflict of interest that could occur if the functions were transferred to the ICO as regulator. Transparency in police use of biometrics is essential to retaining public trust and will continue through the annual reports of the Forensic Information Databases Service strategy board, the Investigatory Powers Commissioner and the ICO. I commend clause 104 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I will speak in more detail about my more general views on the oversight of biometrics, particularly their private use, when we come to new clauses 13, 14 and 15. However, as I look specifically at clauses 104 and 105, which seek to abolish the currently combined offices of Biometrics Commissioner and Surveillance Camera Commissioner, I would like to draw on the direct views of the Information Commissioner. In his initial response to “Data: a new direction”, which proposed absorbing the functions of the Biometrics Commissioner and Surveillance Camera Commissioner into the ICO, the commissioner said that there were some functions that,

“if absorbed by the ICO, would almost certainly result in their receiving less attention”.

Other functions, he said,

“simply do not fit with even a reformed data protection authority”

with there being

“far more intuitive places for them to go.”

That was particularly so, he said, with biometric casework.

It is therefore pleasing that as a result of the consultation responses the Government have chosen to transfer the commissioner’s biometric functions not to the ICO but to the Investigatory Powers Commissioner, acknowledging the relevant national security expertise that it can provide. However, in written evidence to this Committee, the commissioner reiterated his concern about the absorption of his office’s functions, saying that work is currently being undertaken within its remit that, under the Bill’s provisions, would be unaccounted for.

Given that the commissioner’s concerns clearly remain, I would be pleased if the Minister provided in due course a written response to that evidence and those concerns. If not, the Government should at the very least undertake their own gap analysis to identify areas that will not be absorbed under the current provisions. It is important that this Committee and the office of the Biometrics and Surveillance Camera Commissioner can be satisfied that all the functions will be properly delegated and given the same degree of attention wherever they are carried out. Equally, it is important that those who will be expected to take on these new responsibilities are appropriately prepared to do so.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am happy to provide the further detail that the hon. Lady has requested.

Question put and agreed to.

Clause 104 accordingly ordered to stand part of the Bill.

Clause 105

Oversight of biometrics databases

--- Later in debate ---
Clause 105 seeks to abolish the office of the Surveillance Camera Commissioner, while erasing its important functions. Considering the rapid advancement in surveillance technologies, including the concerning development and deployment of facial recognition technologies, it is more important than ever that we protect safeguards and build on them. My new clause 17 would preserve the important functions that I have outlined. The experts interviewed for Professor Fussey and Professor Webster’s report supported such a change, highlighting how most of the gaps left in the Bill could be addressed if responsibility for the surveillance camera code were also moved under the IPCO.
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Having outlined my broad concerns about clause 105 when I spoke to clause 104, I will focus briefly on the specific concern raised by the hon. Member for Glasgow North West, which is that the Surveillance Camera Commissioner’s functions will not be properly absorbed.

In evidence to the Committee, the commissioner outlined a number of non-data protection functions in relation to public space surveillance that their office currently carries out, but that, they believe, the Bill does not make provision to transfer. They cite the significant work that their office has undertaken to ensure that Government Departments are able

“to cease deploying visual surveillance systems onto sensitive sites where they are produced by companies subject to the National Intelligence Law of the People’s Republic of China”,

following a November 2022 instruction from the Chancellor of the Duchy of Lancaster. The commissioner says that such non-data protection work, which has received international acclaim, is not addressed in the Bill.

I am therefore hopeful that the explicit mention in amendment 123 that that the functions of the Surveillance Camera Commissioner will be transferred provides a backstop to ensure that all the commissioner’s duties, including the non-data protection work, are accounted for. If the amendment is not accepted, a full-depth analysis should be conducted, as argued previously, with a full response issued to the commissioner’s evidence to ensure that every one of the functions is properly and appropriately absorbed.

I understand the argument that the Surveillance Camera Commissioner’s powers would be better placed with the Investigatory Powers Commissioner, rather than the ICO. Indeed, the commissioner’s evidence to the Committee referenced the interim findings of an independent report it had commissioned, as the hon. Member for Glasgow North West just mentioned. The report found that most of the gaps left by the Bill could be addressed if responsibility for the surveillance camera code moved under the IPCO, harmonising the oversight of traditional and remote biometrics.

I end by pointing to a recent example that shows the value of proper oversight of the use of surveillance. Earlier this year, following a referral from my hon. Friend the Member for Bristol North West (Darren Jones), the ICO found a school in Bristol guilty of unlawfully installing covert CCTV cameras at the edge of their playing fields. Since then, the Surveillance Camera Commissioner has been responding to freedom of information requests on the matter, with more information about the incident thereby emerging as recently as yesterday. It is absolutely unacceptable that a school should be filming people without their knowledge. The Surveillance Camera Commissioner is a vital cog in the machinery of ensuring that incidents are dealt with appropriately. For such reasons, we must preserve its functions.

In short, I am in no way opposed to the simplification of oversight in surveillance or biometrics, but I hope to see it done in an entirely thorough way, so that none of the current commissioner’s duties get left behind or go unseen.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am grateful to the hon. Members for Glasgow North West and for Barnsley East for the points they have made. The hon. Member for Glasgow North West, in moving the amendment, was right to say that the clause as drafted abolishes the role of the Surveillance Camera Commissioner and the surveillance camera code that the commissioner promotes compliance with. The commissioner and the code, however, are concerned only with police and local authority use in England and Wales. Effective, independent oversight of the use of surveillance camera systems is critical to public trust. There is a comprehensive legal framework for the use of such systems, but the oversight framework is complex and confusing.

The ICO regulates the processing of all personal data by all UK organisations under the Data Protection Act; that includes surveillance camera systems operated by the police and local authorities, and the ICO has issued its own video surveillance guidance. That duplication is confusing for both the operators and the public and it has resulted in multiple and sometimes inconsistent guidance documents covering similar areas. The growing reliance on surveillance from different sectors in criminal investigations, such as footage from Ring doorbells, means that it is increasingly important for all users of surveillance systems to have clear and consistent guidance. Consolidating guidance and oversight will make it easier for the police, local authorities and the public to understand. The ICO will continue to provide independent regulation of the use of surveillance camera systems by all organisations. Indeed, the chair of the National Police Data Board, who gave evidence to the Committee, said that that will significantly simplify matters and will not reduce the level of oversight and scrutiny placed upon the police.

Amendment 123, proposed by the hon. Member for Glasgow North West, would retain the role of the Surveillance Camera Commissioner and the surveillance camera code. In our view, that would simply continue the complexity and duplication with the ICO’s responsibilities. Feedback that we received from our consultation showed broad support for simplifying the oversight framework, with consultees agreeing that the roles and responsibilities, in particular in relation to new technologies, were unclear.

The hon. Lady went on to talk about the oversight going beyond that of the Information Commissioner, but I point out that there is a comprehensive legal framework outside the surveillance camera code. That includes not only data protection, but equality and human rights law, to which the code cross-refers. The ICO and the Equality and Human Rights Commission will continue to regulate such activities. There are other oversight bodies for policing, including the Independent Office for Police Conduct and His Majesty’s inspectorate of constabulary, as well as the College of Policing, which provide national guidance and training.

The hon. Lady also specifically mentioned the remarks of the Surveillance Camera Commissioner about Chinese surveillance cameras. I will simply point out that the responsibility for oversight, which the ICO will continue to have, is not changed in any way by the Bill. The Information Commissioner’s Office continues to regulate all organisations’ use of surveillance cameras, and it has issued its own video surveillance guidance.

New clause 17 would transfer the functions of the commissioner to the Investigatory Powers Commissioner. As I have already said, we believe that that would simply continue to result in oversight resting in two different places, and that is an unnecessary duplication. The Investigatory Powers Commissioner’s Office oversees activities that are substantially more intrusive than those relating to overt surveillance cameras. IPCO’s existing work requires it to oversee over 600 public authorities, as well as several powers from different pieces of legislation. That requires a high level of expertise and specialisation to ensure effective oversight.

For those reasons, we believe that the proposals in the clause to bring the oversight functions under the responsibility of the Information Commissioner’s Office will not result in any reduction in oversight, but will result in the removal of duplication and greater clarity. On that basis, I am afraid that I am unable to accept the amendment, and I hope that the hon. Lady will consider withdrawing it.

Carol Monaghan Portrait Carol Monaghan
- Hansard - - - Excerpts

I thank the Minister for responding to my amendments. However, we have a situation where we are going from having a specialist oversight to a somewhat more generalist oversight. That cannot be good when we are talking about this fast-moving technology. I will withdraw my amendment for the moment, but I reserve the right to bring it back at a later stage. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 105 ordered to stand part of the Bill.

Clause 106

Oversight of biometrics databases

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move amendment 119, in clause 106, page 130, line 7, leave out

“which allows or confirms the unique identification of that individual”.

This amendment is intended to ensure that the definition of biometric data in the Bill includes cases where that data is used for the purposes of classification (and not just unique identification).

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss new clause 8—Processing of special categories of personal data: biometric data

“(1) Article 9 of UK GDPR is amended as follows.

(2) In paragraph (1), after “biometric data”, omit “for the purpose of uniquely identifying a natural person.”

This new clause would extend the same protections that are currently in place for the processing of biometric data for the purposes of identification to the processing of all biometric data, including if the processing is for the purpose of classification (i.e. identification as part of a group, rather than identification as an individual).

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Biometric data is uniquely personal. It captures our faces, fingerprints, walking style, tone of voice, expressions and all other data derived from measures of the human body. Under current UK law, biometric data is defined as

“personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person”.

Furthermore, biometric data counts as special category personal data only when it is used or collected for

“the purpose of uniquely identifying a natural person”.

However, as the use of biometrics grows, they are not only used for identification; indeed, there is a growing set of biometric technologies used to categorise or classify people on the basis of traits thought to be statistically related or correlated, however tenuously, with particular characteristics. For instance, biometric systems have been developed that attempt to infer people’s sexuality from their facial geometry, or judge criminality from pictures of people’s faces. Other biometric classification systems attempt to judge people’s internal emotional state or intentions from their biometrics, such as tone, voice, gait or facial expressions, known as emotion recognition. For example, employers have used facial expression and tone analysis to decide who should be selected for a job, using biometric technologies to score candidates on characteristics such as enthusiasm, willingness to learn, conscientiousness and responsibility, and personal stability.

Members of the Citizens’ Biometrics Council convened by the Ada Lovelace Institute in 2020 to build a deeper understanding of the British public’s views on biometric technologies have expressed concerns about these use cases. Members suggest that these technologies classify people according to reductive, ableist and stereotypical characteristics, harming people’s wellbeing and risking characterisation in a database or data-driven systems. Further, these cases often use pseudoscientific assumptions to draw links between external features and other traits, meaning that the underlying bases of these technologies are often not valid, reliable or accurate. For example, significant evidence suggests that it is not possible accurately to infer emotion from facial expressions. Despite that, existing data protection law would not consider biometric data collected for those purposes to be special category data, and would therefore not give data subjects the highest levels of safeguards in these contexts.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 106 makes changes to the national DNA database strategy board, which provides oversight of the operation of the national DNA database, including setting policies for access and use by the police. Amendment 119 would seem to extend the power to widen the board’s potential scope beyond biometrics databases for the purpose of identification, to include the purpose of classification.

The police can process data only for policing purposes. It is not clear what policing purpose there would be in being able to classify, for example, emotions or gender, even assuming it was proven to be scientifically robust, or what sort of data would be on such a database. Even if one were developed in the future, it is likely to need knowledge, skills and resources very different from what is needed to oversee a database that identifies and eliminates suspects based on biometric identification, so it would probably make sense for a different body to carry out any oversight.

New clause 8 aims to make changes in a similar way to amendment 119 in relation to the definition of biometric data for the purposes of article 9 of the GDPR. As the GDPR is not concerned with the police’s use of biometric data for law enforcement purposes, the new clause would apply to organisations that are processing biometric data for general purposes. The aim seems to be to ensure that enhanced protections afforded by GDPR to biometric data used for unique identification purposes also apply to biometric data that is used for classification or categorisation purposes.

The hon. Lady referred to the Ada Lovelace Institute’s comments on these provisions, and its 2022 “Countermeasures” report issued on biometric technologies, but we are not convinced that such a change is necessary. One example in the report was using algorithms to make judgments that prospective employees are bored or not paying attention, based on their facial expressions or tone of voice. Using biometric data to draw inferences about people, using algorithms or otherwise, is not as invasive as using biometric data uniquely to identify someone. For example, biometric identification could include matching facial images caught on closed circuit television to a centrally held database of known offenders.

Furthermore, using biometric data for classification or categorisation purposes is still subject to the general data protection principles in the UK GDPR. That includes ensuring that there is a lawful ground for the processing, that the processing is necessary and proportionate, and is fair and transparent to the individuals concerned. If algorithms are used to categorise and make significant decisions about people based on their biometric characteristics, including in an employment context, they will have the right to be given information about the decision, and to obtain human intervention, as a result of the measures we previously debated in clause 11.

Therefore, we do see a distinction between the use of biometric information for identification purposes and the more general classification which the hon. Lady sought to draw. Though we believe that there is sufficient safeguard already in place regarding possible use of classification by biometric data, given what I have said, I hope that she will consider withdrawing the amendment.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful to the Minister for his comments. We will be speaking about the private uses of biometric data later, so I beg to ask leave to withdraw my amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

DNA and fingerprints are key tools in helping the police to identify and eliminate suspects quickly and accurately by comparing evidence left at crime scenes with the appropriate files on the national databases. As I previously set out, clause 106 makes changes to the National DNA Database Strategy Board. The board provides oversight of the operation of the database, including setting policies for access and use by the police.

These reforms change the scope of the board to make it clear that they should provide similar oversight of the police fingerprint database, which operates under similar rules. The change brings the legislation up to date with the board’s recently published governance rules. Clause 106 also updates the name of the board to the Forensic Information Databases Strategy Board, to better reflect the broadened scope of its work. We are also taking this opportunity to simplify and future-proof oversight of national police biometric databases. While DNA and fingerprints are well established, biometrics is an area of rapid technological development, including for example the growing use of iris, face and voice recognition. Given the pace of technological change in this area and the benefits of consistent oversight, Clause 106 also includes a power for the Secretary of State to make regulations which make changes to the board’s scope, for example by adding new biometric databases into the board’s remit or to remove them, where a database is no longer used. Such regulations would be subject to the affirmative procedure.

For these reasons, I commend the clause to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Clause 106 will primarily increase the scope of the Forensic Information Databases Strategy Board to provide oversight of the national fingerprint database. However, there are also provisions enabling the Secretary of State to add or remove a biometric database that the board oversees, using the affirmative procedure. I would therefore like to ask the Minister whether they have any plans to use these powers regarding any particular databases—or whether this is intended as a measure for future-proofing the Bill in the case of changed circumstances?

I would also like to refer hon. Members to the remarks that I have made throughout the Bill that emphasise a need for caution when transferring the ability to change regulation further into the hands of the Secretary of State alone.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I would add only that this is an area where technology is moving very fast, as I referred to earlier. We think it is right to put in place this provision, to allow an extension if it becomes necessary—though I do not think we have any current plans. It is future-proofing of the Bill.

Question put and agreed to.

Clause 106 accordingly ordered to stand part of the Bill.

Clause 107

Regulations

Question proposed, That the clause stand part of the Bill.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Clause 107 will give the Secretary of State a regulation-making power to make consequential amendments to other legislation. The power enables amendments to this Bill itself where such amendments are consequential to the abolition of the Information Commissioner and his replacement by the new Information Commission. Such provision is needed because there are a number of areas where data protection legislation will need to be updated as a consequence of the Bill. This is a standard power, commonly included in Bills to ensure that wider legislation is updated where necessary as a result of new legislation. For example, references to “the Commissioner” in the Data Protection Act 2018 will no longer be accurate, given changes to the governance structure of the Information Commissioner’s Office within the Bill, so consequential amendments will be required to that Act.

Clause 108 outlines the form and procedure for making regulations under the powers in the Bill: they are to be made by statutory instrument. Where regulations in the Bill are subject to the affirmative resolution procedure, they may not be made unless a draft of the statutory instrument has been laid before Parliament and approved by a resolution of each House. That provision is needed because the Bill introduces new regulation-making powers, which are necessary to support the Bill’s policy objectives. For example, powers in part 3 of the Bill replace an existing statutory framework with a new, enhanced one.

Clause 109 explains the meaning of references to “the 2018 Act” and “the UK GDPR” in the Bill. Such provision is needed to explain the meaning of those two references. Clause 110 authorises expenditure arising from the Bill. That provision is needed to confirm that Parliament will fund any expenditure incurred under the Bill by the Secretary of State, the Treasury or a Government Department. It requires a money resolution and a Ways and Means resolution, both of which were passed in the House of Commons on 17 April.

Clause 111 outlines the territorial extent of the Bill. Specifically, the clause states that the Bill extends to England and Wales, Scotland and Northern Ireland, with some exceptions. Much of the Bill, including everything on data protection, is reserved policy. In areas where the Bill legislates on devolved matters, we are working with the devolved Administrations to secure legislative consent motions. Clause 112 gives the Secretary of State a regulation-making power to bring the Bill’s provisions into force. Some provisions, listed in subsection (2), come into force on the date of Royal Assent. Other provisions, listed in subsection (3), come into force two months after Royal Assent. Such provision is needed to outline when the Bill’s provisions will come into force.

Clause 113 gives the Secretary of State a regulation-making power to make transitional, transitory or saving provisions that may be needed in connection with any of the Bill’s provisions coming into force. For example, provision might be required to clarify that the Information Commissioner’s new power to refuse to act on complaints will not apply where such complaints have already been made prior to commencement of the relevant provision. Clause 114 outlines the short title of the Bill. That provision is needed to confirm the title once the Bill has been enacted. I commend clauses 107 to 114 to the Committee.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

The clauses set out the final technical provisions necessary in order for the Bill to be passed and enacted effectively, and for the most part are standard. I will focus briefly on clause 107, however, as a number of stakeholders including the Public Law Project have expressed concern that, as a wide Henry VIII power, it may give the Secretary of State the power to make further sweeping changes to data protection law. Can the Minister provide some assurance that the clause will allow for the creation only of further provisions that are genuinely consequential to the Bill and necessary for its proper enactment?

It is my belief that this would not have been such a concern to civil society groups had there not been multiple occasions throughout the Bill when the Secretary of State made grabs for power, concentrating the ability to make further changes to data protection legislation in their own hands. I am disappointed, though of course not surprised, that the Government have not accepted any of my amendments to help to mitigate those powers with checks and balances involving the commissioner. However, keeping the clause alone in mind, I look forward to hearing from the Minister how the powers in clause 107 will be restricted and used.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

We have previously debated the efficacy of the affirmative resolution procedure. I recognise that the hon. Lady is not convinced about how effective it is in terms of parliamentary scrutiny; we will beg to differ on that point. Although the power in clause 107 allows the Secretary of State to amend Acts of Parliament, I can confirm that that is just to ensure the legal clarity of the text. Without that power, data protection legislation would be harder to interpret, thereby reducing people’s understanding of the legislation and their ability to rely on the law.

Question put and agreed to.

Clause 107 accordingly ordered to stand part of the Bill.

Clause 108

Regulations

--- Later in debate ---
Brought up, and read the First time.
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move, That the clause be read a Second time.

In order for the public to have trust in algorithmic decision making, particularly where used by the Government, they must be able to understand how and when it is being used as a basic minimum. That is something that the Government themselves previously recognised by including a proposal to make transparency reporting on the use of algorithms in decision making for public sector bodies compulsory in their “Data: a new direction” consultation. Indeed, the Government have already made good progress on bringing together a framework that will make that reporting possible. The algorithmic transparency recording standard they have built provides a decent, standardised way of recording and sharing information about how the public sector uses algorithmic tools. There is also full guidance to accompany the standard, giving public sector bodies a clear understanding of how to complete transparency reports, as well as a compilation of pilot reports that have already been published, providing a bank of examples.

However, despite that and the majority of consultation respondents agreeing with the proposed compulsory reporting for public sector bodies—citing benefits of increased trust, accountability and accessibility for the public—the Government chose not to go ahead with the legislative change. Relying on self-regulation in the early stages of the scheme is understandable, but having conducted successful pilots, from the Cabinet Office to West Midlands police, it is unclear why the Government now choose not to commit to the very standard they created. This is a clear missed opportunity, with the standard running the risk of failing altogether if there is no legislative requirement to use it.

As the use of such algorithms grows, particularly considering further changes contained in clause 11, transparency around Government use of big data and automated decision-making tools will only increase in importance and value—people have a right to know how they are being governed. As the Public Law Project argues, transparency also has a consequential value; it facilitates democratic consensus building about the appropriate use of new technologies, and it allows for full accountability when things go wrong.

Currently, in place of that accountability, the Public Law Project has put together its own register called “Tracking Automated Government”, or TAG. Using mostly freedom of information requests, the register tracks the use of 42 algorithmic tools and rates their transparency. Of the 42, just one ranked as having high transparency. Among those with low transparency are asylum estates analysis, used to help the Home Office decide where asylum interviews should take place, given the geographical distribution of asylum seekers across the asylum estate; the general matching service and fraud referral and intervention management system, used as part of the efforts of the Department for Work and Pensions to combat benefit fraud and error—for example, by identifying claimants who may potentially have undisclosed capital or other income; and housing management systems, such as that in Wigan Metropolitan Borough Council, which uses a points-based system to prioritise social housing waiting lists.

We all want to see Government modernising and using new technology to increase efficiency and outcomes, but if an algorithmic tool impacts our asylum applications, our benefits system and the ability of people to gain housing, the people affected by those decisions deserve at the very least to know how they are being made. If the public sector sets the right example, private companies may choose to follow in the future, helping to improve transparency even further. The framework is ready to go and the benefits are clear; the amendment would simply make progress certain by bringing it forward as part of the legislative agenda. It is time that we gave people the confidence in public use of algorithms that they deserve.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I thank the hon. Member for Barnsley East for moving new clause 9. We completely share her wish to ensure that Government and public authorities provide transparency in the way they use algorithmic tools that process personal data, especially when they are used to make decisions affecting members of the public.

The Government have made it our priority to ensure that transparency is being provided through the publication of the algorithmic transparency recording standard. That has been developed to assist public sector organisations in documenting and communicating their use of algorithms in decision making that impacts members of the public. The focus of the standard is to provide explanations of the decisions taken using automated processing of data by an algorithmic system, rather than all data processing.

The standard has been endorsed by the Government’s Data Standards Authority, which recommends the standards, guidance and other resources that Government Departments should follow when working on data projects. Publishing the standard fulfils commitments made in both the national data strategy 2020 and the national artificial intelligence strategy. Since its publication, the standard has been piloted with a variety of public sector organisations across the UK, and the published records can be openly accessed via gov.uk. It is currently being rolled out more widely across the public sector.

Although the Government have made it a priority to advance work on algorithmic transparency, the algorithmic transparency recording standard is still a maturing standard that is being progressively promoted and adopted. It is evolving alongside policy thinking and Government understanding of the complexities, scope and risks around its use. We believe that enshrining the standard into law at this point of maturity could hinder the ability to ensure that it remains relevant in a rapidly developing technology field.

Therefore, although the Government sympathise with the intention behind the new clause, we believe it is best to continue with the current roll-out across the public sector. We remain committed to advancing algorithmic transparency, but we do not intend to take forward legislative change at this time. For that reason, I am unable to accept the new clause as proposed by the Opposition.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful to the Minister, but I am still confused about why, having developed the standard, the Government are not keen to put it into practice and into law. He just said that he wants to keep it relevant; he could use some of the secondary legislation that he is particularly keen on if he accepted the new clause. As I outlined, this issue has real-life consequences, whether for housing, asylum or benefits. In my constituency, many young people were affected by the exam algorithm scandal. For those reasons, I would like to push the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move, That the clause be read a Second time.

Overall, the aim of the GDPR is to ensure the effective and complete protection of data subjects. That protection cannot be considered effective or complete if people cannot seek justice, remedy and repair if an organisation processes personal data unlawfully. Therefore, there must be suitable methods of redress for all data and decision subjects in any suitable data protection regime. Bringing any kind of legal case is not something people take lightly. Cases can be lengthy, costly and, in many lower-level cases, seem disproportionate to the loss suffered or remedy available. That is no different in cases surrounding the misuse of personal data.

As the law stands, article 80(1) of the EU GDPR has been implemented in the UK, meaning a data subject has the right to mandate a not-for-profit body or organisation to lodge a complaint on their behalf. That means, for example, a charity can help an individual to bring forward a case where they have been materially impacted by a data breach. Such provisions help to ensure that those who have suffered an infringement can be supported in lodging a claim, and are not disincentivised by a lack of understanding, resources or cost. However, the UK has not yet adopted article 80(2), which goes one step further, allowing those same organisations to lodge a complaint independently of a data subject’s mandate.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am grateful to the hon. Lady for setting out the purposes of the new clause. As she has described, it aims to require the Secretary of State to use regulation-making powers under section 190 of the Data Protection Act to implement article 80(2) of the UK GDPR. It would enable non-profit organisations with an expertise in data protection law to make complaints to the Information Commissioner and/or take legal action against data controllers without the specific authorisation of the individuals who have been affected by data breaches. Relevant non-profit organisations can already take such actions on behalf of individuals who have specifically authorised them to do so under provisions in article 80(1) of the UK GDPR.

In effect, the amendment would replace the current discretionary powers in section 190 of the Data Protection Act with a duty for the Secretary of State to legislate to bring those provisions into force soon after the Bill has received Royal Assent. Such an amendment would be undesirable for a number of reasons. First, as required under section 189 of the Data Protection Act, we have already consulted and reported to Parliament on proposals of that nature, and we concluded that there was not a strong enough case for introducing new legislation.

Although the Government’s report acknowledged that some groups in society might find it difficult to complain to the ICO or bring legal proceedings of their own accord, it pointed out that the regulator can and does investigate complaints raised by civil society groups even when they are not made on behalf of named individuals. Big Brother Watch’s recent complaints about the use of live facial recognition technology in certain shops in the south of England is an example of that.

Secondly, the response concluded that giving non-profit organisations the right to bring compensation claims against data controllers on behalf of individuals who had not authorised them to do so could prompt the growth of US-style lawsuits on behalf of thousands or even millions of customers at a time. In the event of a successful claim, each individual affected by the alleged breach could be eligible for a very small payout, but the consequences for the businesses could be hugely damaging, particularly in cases that involved little tangible harm to individuals.

Some organisations could be forced out of business or prompted to increase prices to recoup costs. The increase in litigation costs could also increase insurance premiums. A hardening in the insurance market could affect all data controllers, including those with a good record of compliance. For those reasons, we do not believe that it is right to extend the requirement on the Secretary of State to allow individuals to bring actions without the consent of those affected. On that basis, I ask the hon. Lady to withdraw the motion.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Data is increasingly used to make decisions about us as a collective, so it is important that GDPR gives us collective rights to reflect that, rather than the system being designed only for individuals to seek redress. For those reasons, I will press my new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Brought up, and read the First time.
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move, That the clause be read a Second time.

Privacy enhancing technologies are technologies and techniques that can help organisations to share and use people’s data responsibly, lawfully and securely. They work most often by minimising the amount of data used, maximising data security—for example by encrypting or anonymising personal information—or empowering individuals. One of the best-known examples of a PET is synthetic data: data that is modelled to reproduce the statistical properties of a real dataset when taken as a whole. That type of data could allow third-party researchers or processors to analyse the statistical outcomes of the data without having access to the original set of personal data, or any information about identifiable living individuals.

Another example of PETs are those that minimise the amount of personal data that is shared without affecting the data’s utility. Federated learning, for example, allows for the training of an algorithm across multiple devices or datasets held on servers, so if an organisation wants to train a machine-learning model but has limited training data available, they can send the model to a remote dataset for training. The model will then return having benefited from those datasets, while the sensitive data itself is not exchanged or ever put in the hands of those in ownership of the algorithm. The use of PETs therefore does not necessarily exclude data from being defined as personal or falling within the remit of GDPR. They can, however, help to minimise the risk that arises from personal data breaches and provide an increased level of security.

The Government have positioned the Bill as one that seeks to strengthen the data rights of citizens while catalysing innovation. PETs could and should have been a natural area for the Bill to explore, because not only can such devices help controllers demonstrate an approach based on data protection by design and default, but they can open the door for new ways of collaborating, innovating and researching with data. The Royal Society has researched the role that PETs can play in data governance and collaboration in immense detail, with its findings contained in its 2023 report, which is more than 100 pages long. One of the report’s key recommendations was that the Government should develop a national PET strategy to promote their responsible use as tools for advancing scientific research, increasing security and offering new partnership possibilities, both domestically and across borders.

It is vital to acknowledge that working with PETs involves risks that must be considered. Some may not be robust enough against attacks because they are in the early stages of development, while others might require a significant amount of expertise to operate, without which their use may be counterproductive. It is therefore important to be clear that the amendment would not jump ahead and endorse any particular technology or device before it was ready. Instead, it would enshrine the European Union Agency for Cybersecurity definition of PETs in UK law and prompt the Government to issue a report on how that growing area of technology might play a role in data processing and data regulation in future.

That could include identifying the opportunities that PETs could provide while also looking at the threats and potential harms involved in using the technologies without significant expertise or technological readiness. Indeed, in their consultation response, the Government even mentioned they were keen to explore opportunities around smart data, while promoting understanding that they should not be seen as a substitute for reducing privacy risks on an organisational level. The report, and the advancing of the amendment, would allow the Government that exploration, indicating a positive acknowledgment of the potentially growing role that PETs might play in data processing and opening the door for further research in the area.

Even by their name, privacy enhancing technologies reflect exactly what the Bill should be doing: looking to the future to encourage innovation in tech and then using such innovation to protect citizens in return. I hope hon. Members will see those technologies’ potential value and the importance of analysing any harms, and look to place the requirement to analyse PETs on the statute book.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

We absolutely agree with the Opposition about the importance of privacy enhancing technologies, which I will call PETs, since I spoke on them recently and was told that was the best abbreviation—it is certainly easier. We wish to see their use by organisations to help ensure compliance with data protection principles and we seek to encourage that. As part of our work under the national data strategy, we are already exploring the macro-impacts of PETs and how they can unlock data across the economy.

The ICO has recently published its draft guidance on anonymisation, pseudonymisation and PETs, which explains the benefits and different types of PETs currently available, as well as how they can help organisations comply with data protection law. In addition, the Centre for Data Ethics and Innovation has published an adoption guide to aid decision making around the use of PETs in data-driven projects. It has also successfully completed delivery of UK-US prize challenges to drive innovation in PETs that reinforce democratic values. Indeed, I was delighted to meet some of the participants in those prize challenges at the Royal Society yesterday and hear a little more about some of their remarkable innovations.

As the hon. Lady mentioned, the Royal Society has published reports on how PETs can maximise the benefit and reduce the harms associated with data use. Adding a definition of PETs to the legislation and requiring the Government to publish a report six months after Royal Assent is unlikely to have many advantages over the approach that the ICO, the CDEI and others are taking to develop a better understanding in the area. Furthermore, many PETs are still in the very early stages of their deployment and use, and have not been widely adopted across the UK or globally. A statutory definition could quickly become outdated. Publishing a comprehensive report on the potential impacts of PETs, which advocated the use of one technology or another, could even distort a developing market, and lead to unintended negative impacts on the development of what are promising technologies. For that reason, I ask the hon. Lady to withdraw the new clause.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I am grateful to the Minister for his clarification on the pronunciation of the acronym. I acknowledge the points he made. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 13

Oversight of biometric technology use by the Information Commission

‘(1) The Information Commission must establish a Biometrics Office.

(2) The Biometrics Office is to consist of a committee of three commissioners with relevant expertise, appointed by the Commission.

(3) The functions of the Biometrics Office are—

(a) to establish and maintain a public register of relevant entities engaged in processing biometric data;

(b) to oversee and review the biometrics use of relevant entities;

(c) to produce a Code of Practice for the use of biometric technology by registered parties, which must include—

(i) compulsory standards of accuracy and reliability for biometric technologies,

(ii) a requirement for the proportionality of biometrics use to be assessed prior to use and annually thereafter, and a procedure for such assessment, and

(iii) a procedure for individual complaints about the use of biometrics by registered parties;

(d) to receive and publish annual reports from all relevant entities, which must include the relevant entity’s proportionality assessment of their biometrics use;

(e) to enforce registration and reporting by the issuing of enforcement notices and, where necessary, the imposition of fines for non-compliance with the registration and reporting requirements;

(f) to ensure lawfulness of biometrics use by relevant entities, including issuing compliance and abatement notices where necessary.

(4) The Secretary of State may by regulations add to the responsibilities of the Biometrics Office.

(5) Regulations made under subsection (4) are subject to the affirmative resolution procedure.

(6) For the purposes of this Part—

“biometric data” has the meaning given by section 106 of this Act (see subsection 13);

“relevant entity” means any organisation or body corporate (whether public or private) which processes biometric data, other than where the biometric processing undertaken by the organisation or body corporate is otherwise overseen by the Investigatory Powers Commissioner, because it is—

(a) for the purposes of making or renewing a national security determination as defined by s.20(2) Protection of Freedoms Act 2012; or

(b) for the purposes set out in s.20(6) Protection of Freedoms Act 2012.’.(Stephanie Peacock.)

This new clause, together with NC14 and NC15, are intended to form a new Part of the Bill which creates a mechanism for the Information Commission to oversee biometric technology use by private parties.

Brought up, and read the First time.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

I beg to move, That the clause be read a Second time.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

New clause 14—Requirement to register with the Information Commission

‘(1) Any relevant entity intending to process biometric data for purposes other than those contained in section 20(2) and section 20(6) of the Protection of Freedoms Act 2012 must register with the Information Commission prior to the deployment of the biometric technology.

(2) An application for registration must include an explanation of the intended biometrics use, including an assessment of its proportionality and its extent.

(3) All relevant entities must provide an annual report to the Biometrics Office addressing their processing of biometric data in the preceding year and their intended processing of biometrics in the following year .

(4) Each annual report must contain a proportionality assessment of the relevant entity’s processing of biometric data in the preceding year and intended processing of biometric data in the following year.

(5) Any relevant entity which processes biometric data without having registered with the Information Commission, or without providing annual reports to the Biometrics Office, is liable to an unlimited fine imposed by the Information Commission.’

See explanatory statement to NC13.

New clause 15—Private biometrics use prior to entry into force of the Act

‘Any relevant entity engaged in processing biometric data other than for the purposes contained in section 20(2) and section 20(6) of the Protection of Freedoms Act 2012 prior to the entry into force of this Part must register with the Information Commission in accordance with section [Requirement to register with the Information Commission] within six months of the date of entry into force of this Part; and subsection (5) of that section does not apply to such an entity during that period.’

See explanatory statement to NC13. This new clause would provide a transitional period of six months for entities which were already engaged in the processing of biometric data to register with the Commission.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

A wider range of biometric data is now being collected than ever before. From data on the way we walk and talk to the facial expressions we make, biometric data is now being collected and used in a wide range of situations for many distinct purposes. Great attention has rightly been paid to police use of facial recognition technology to identify individuals, for example at football matches or protests. Indeed, to date, much of the regulatory attention has focused on those use cases, which are overseen by the Investigatory Powers Commissioner. However, the use of biometric technologies extends far beyond those examples, and there has been a proliferation of biometrics designed by private organisations to be used across day-to-day life—not just in policing.

We unlock smartphones with our faces or fingerprints, and companies have proposed using facial expression analysis to detect whether students are paying attention in online classes. Employers have used facial expression and tone analysis to decide who should be selected for a job—as was already mentioned in reference to new clause 8. As the proliferation of biometric technologies occurs, a number of issues have been raised about their impact on people and society. Indeed, if people’s identities can be detected by both public and private actors at any given point, there is potential for it to significantly infringe on someone’s privacy to move through the world with freedom of expression, association and assembly. Similarly, if people’s traits, characteristics or abilities can be automatically assessed on the basis of biometrics, often without a scientific basis, it may affect free expression and the development of personality.

Public attitudes research carried out by the Ada Lovelace Institute shows that the British public recognise the potential benefits of tools such as facial recognition in certain circumstances—for example, smartphone locking systems and in airports—but often reject their use in others. Large majorities are opposed to the use of facial recognition in shops, schools and on public transport, as well as by human resources departments in recruitment. In all cases, the public expect the use of biometrics to be accompanied by safeguards and limitations, such as appropriate transparency and accountability measures.

Members of the citizens’ biometrics council, convened by the Ada Lovelace Institute in 2020 and made up of 50 members of the public, expressed the view that biometric technologies as currently used are lacking in transparency and accountability. In particular, safeguards are uneven across sectors. Private use of biometrics are not currently subject to the same level of regulatory oversight or due process as is afforded within the criminal justice system, despite also having the potential to create changes of life-affecting significance. As a result, one member of the council memorably asked:

“If the technology companies break their promises…what will the implications be? Who’s going to hold them to account?”

It is with those issues in mind that experts and legal opinion seem all to come to the same consistent conclusion that, at the moment, there is not a sufficient legal framework in place to manage the unique issues that the private proliferation of biometrics use raises. An independent legal review, commissioned by the Ada Lovelace Institute and led by Matthew Ryder KC, found that current governance structures and accountability mechanisms for biometrics are fragmented, unclear and ineffective. Similar findings have been made by the Biometrics and Surveillance Camera Commissioner, and Select Committees in this House and in the other place.

The Government, however, have not yet acted on delivering a legal framework to govern the use of biometric technology by private corporations, meaning that the Bill is a missed opportunity. New clause 13 therefore seeks to move towards the creation of that framework, providing for the Information Commission to oversee the use of biometric technology by private parties, and ensure accountability around it. I hope that the Committee see the value of this oversight and what it could provide and will support the new clause.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

New clause 13 would require the Information Commission to establish a new separate statutory biometrics office with responsibility for the oversight and regulation of biometric data and technology. However, the Information Commissioner already has responsibility for monitoring and enforcing the processing of biometric data, as it falls within the definition of personal data. Under the Bill, the new body corporate—the Information Commission—will continue to monitor and enforce the processing of all personal data under the data protection legislation, including biometric data. Indeed, with its new independent board and governance structure, the commission will enjoy greater diversity in skills and decision making, ensuring that the regulator has the right blend of skills and expertise at the very top of the organisation.

Furthermore, the Bill allows the new Information Commission to establish committees, which may include specialists from outside the organisation with key skills and expertise in specialist areas. As such, the Government are of the firm view that the Information Commission is best placed to provide regulatory oversight of biometric data, rather than delegating responsibility and functions to a separate office. The creation of a new body would likely cause confusion for those seeking redress, by creating novel complaints processes for biometric-related complaints, as set out in new clause 13(3)(c)(iii). It would also complicate regulatory oversight and decision making by providing the new office with powers to impose fines, as per subsection (2)(e). For those reasons, I encourage the hon. Lady to withdraw her new clause.

New clauses 14 and 15 would require non-law enforcement bodies that process biometric data about individuals to register with the Information Commissioner before the processing begins. Where the processing started prior to passage of the Bill, the organisation would need to register within six months of commencement. As part of the registration process, the organisation would have to explain the intended effect of the processing and provide annual updates to the Information Commissioner’s Office on current and future processing activities. Organisations that fail to comply with these requirements would be subject to an unlimited fine.

I appreciate that the new clauses aim to make sure that organisations will give careful thought to the necessity and proportionality of their processing activities, and to improve regulatory oversight, but they could have significant unintended consequences. As the hon. Lady will be aware, there are many everyday uses of biometrics data, such as using a thumbprint to access a phone, laptop or other connected device. Such services would always ask for the user’s explicit consent and make alternatives such as passwords available to customers who would prefer not to part with their biometric data.

If every organisation that launched a new product had to register with the Information Commissioner to explain its intentions and complete annual reports, that could place significant and unnecessary new burdens on businesses and undermine the aims of the Bill. Where the use of biometric data is more intrusive, perhaps involving surveillance technology to identify specific individuals, the processing will already be subject to the heightened safeguards in article 9 of the UK GDPR. The processing would need to be necessary and proportionate on the grounds of substantial public interest.

The Bill will also require organisations to designate a senior responsible individual to manage privacy risks, act as a contact point for the regulator, undertake risk assessments and keep records in relation to high-risk processing activities. It would be open to the regulator to request to see these documents if members of the public expressed concern about the use of the technology.

I hope my response has helped to address the issues the hon. Lady was concerned about, and I would respectfully ask her to not to press these new clauses.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

It does indeed provide reassurance. On that basis, I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

None Portrait The Chair
- Hansard -

We now come to the big moment for the hon. Member for Loughborough. Weeks of anticipation are now at an end. I call her to move new clause 16.

New Clause 16

Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision

‘(1) The 2018 Act is amended in accordance with subsection (2).

(2) In the 2018 Act, after section 40 insert—

40A Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision

(1) This section applies to a set of processing operations consisting of the preparation of a case-file by the police service for submission to the Crown Prosecution Service for a charging decision, the making of a charging decision by the Crown Prosecution Service, and the return of the case-file by the Crown Prosecution Service to the police service after a charging decision has been made.

(2) The police service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in preparing a case-file for submission to the Crown Prosecution Service for a charging decision.

(3) The Crown Prosecution Service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in making a charging decision on a case-file submitted for that purpose by the police service.

(4) If the Crown Prosecution Service decides that a charge will not be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must take all steps reasonably required to destroy and delete all copies of the case-file in its possession.

(5) If the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must return the case-file to the police service and take all steps reasonably required to destroy and delete all copies of the case-file in its possession.

(6) Where the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service and returns the case-file to the police service under subsection (5), the police service must comply with the first data protection principle and the third data protection principle in relation to any subsequent processing of the data contained in the case-file.

(7) For the purposes of this section—

(a) The police service means—

(i) constabulary maintained by virtue of an enactment, or

(ii) subject to section 126 of the Criminal Justice and Public Order Act 1994 (prison staff not to be regarded as in police service), any other service whose members have the powers or privileges of a constable.

(b) The preparation of, or preparing, a case-file by the police service for submission to the Crown Prosecution Service for a charging decision includes the submission of the file.

(c) A case-file includes all information obtained by the police service for the purpose of preparing a case-file for submission to the Crown Prosecution Service for a charging decision.”’ —(Jane Hunt.)

This new clause adjusts Section 40 of the Data Protection Act 2018 to exempt the police service and the Crown Prosecution Service from the first and third data protection principles contained within the 2018 Act so that they can share unredacted data with one another when making a charging decision.

Brought up, and read the First time.

--- Later in debate ---
I hope not to have to press the new clause to a vote, and that the Minister will provide some encouragement that the issue will be resolved during progress of the Bill.
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

New clause 16 would amend section 40 of the Data Protection Act 2018, allowing police services to share unredacted data with the Crown Prosecution Service when it is making a charging decision. I am incredibly sympathetic to the aim that the hon. Member for Loughborough has set out, which is to get the police fighting crime on the frontline as much as possible. In oral evidence, Aimee Reed, director of data at the Metropolitan police, said that if the police could share information redacted before charging decisions were made, it would be “of considerable benefit”. She said that that would

“enable better and easier charging decisions”

and

“reduce the current burden on officers”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 58, Q126.]

That would allow them to focus their time on other things. It is therefore good to see that concept being explored in a new clause.

To determine the value of the change, we would like to see a full impact assessment of the potential risks and harms associated with it. I hope that that could be conducted with the intention of weighing the change against the actual cost of the current burden that police face in redacting data. Without such an assessment, it is hard to determine whether the benefit to the police would be proportionate to the impact or harms that might occur as a result of the change, particularly for the subjects of data involved. That is not to say that any change would not be beneficial, but perhaps more detail could be explored with regard to the proposal.

As I believe that this is the final time that I will speak in this Committee, may I say a few words of thanks?

None Portrait The Chair
- Hansard -

I think that you should wait for the next Question.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

Okay, I will wait for the next Question. Thank you for your guidance, Mr Hollobone.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I thank my hon. Friend the Member for Loughborough, who has been assiduous in pursuing her point and has set out very clearly the purpose of her new clause. We share her wish to reduce unnecessary burdens on the police as much as possible. The new clause seeks to achieve that in relation to the preparation by police officers of pre-charge files, which is an issue that the National Police Chiefs’ Council has raised with the Home Office, as I think she knows.

This is a serious matter for our police forces, which estimate that about four hours is spent redacting a typical case file. They argue that reducing that burden would enable officers to spend more time on frontline policing. We completely understand the frustration that many officers feel about having to spend a huge amount of time on what they see as unnecessary redaction. I can assure my hon. Friend that the Home Office is working with partners in the criminal justice system to find ways of safely reducing the redaction burden while maintaining public trust. It is important that we give them the time to do so.

We need to resolve the issue through an evidence-based solution that will ensure that the right amount of redaction is done at the right point in the process, so as to reduce any delays while maintaining victim and witness confidence in the process. I assure my hon. Friend that her point is very well taken on board and the Government are looking at how we can achieve her objective as quickly as possible, but I hope she will accept that, at this point, it would be sensible to withdraw her new clause.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - -

It has been a real pleasure to represent His Majesty’s loyal Opposition in the scrutiny of the Bill. I thank the Minister for his courteous manner, all members of the Committee for their time, the Clerks for their work and the many stakeholders who have contributed their time, input and views. I conclude by thanking Anna Clingan, my senior researcher, who has done a remarkable amount of work to prepare for our scrutiny of this incredibly complex Bill. Finally, I thank you, Mr Hollobone, for the way in which you have chaired the Committee.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

May I join the hon. Lady in expressing thanks to you, Mr Hollobone, and to Mr Paisley for chairing the Bill Committee so efficiently and getting us to this point ahead of schedule? I thank all members of the Committee for their participation: we have been involved in what will be seen to be a very important piece of legislation.

I am very grateful to the Opposition for their support in principle for many of the objectives of the Bill. It is absolutely right that the Opposition scrutinise the detail, and the hon. Member for Barnsley East and her colleagues have done so very effectively. I am pleased that we have reached this point with the Bill so far unamended, but obviously we will be considering it further on Report.

I thank all my hon. Friends for attending the Committee and for their contributions, particularly saying “Aye” at the appropriate moments, which has allowed us to get to this point. I also thank the officials in the Department for Science, Innovation and Technology. I picked up this baton on day two of my new role covering the maternity leave of my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez); I did so with some trepidation, but the officials have made my task considerably easier and I am hugely indebted to them.

I thank everybody for allowing us to get this point. I look forward to further debate on Report, in due course.