Data Protection and Digital Information (No. 2) Bill Debate
John Whittingdale (Conservative - Maldon)
Commons Chamber

I welcome the Bill. I am delighted that it finally takes advantage of one of the freedoms that resulted from our leaving the European Union, which I supported at the time and continue to support. As has been indicated, the Bill has had a long gestation. I was the Minister when the consultation paper was issued in September 2021, and the Bill first appeared a year later. As the Opposition spokesman pointed out, a small hiccup delayed it a bit further.
Our current data protection laws originate almost entirely from the EU and are based on GDPR. Before the adoption of GDPR in 2016, the UK Government opposed parts of it. I recall that the assessment at the time was that, although there were benefits for larger companies, there would be substantial costs for smaller firms, and that has indeed been borne out. There was a debate in government about whether we should oppose the GDPR regulation as it made its way through the Commission's legislative process. As so often was the case in the EU, we were advised that, if we opposed it, we would lose vital leverage and our ability to influence its development. Whether we were then able to influence its development is arguable, but it was decided that we should not oppose it outright. However, it has always been clear that the one-size-fits-all GDPR currently in place imposes significant costs on smaller firms. When we held the consultation in 2021, smaller firms in particular complained about the complexity of GDPR and the uncertainty and cost that it imposed. Clearly, there was an opportunity to streamline it—not to remove it, but to make it simpler and more understandable, and to reduce some of the burdens it imposes. We now have that opportunity to diverge.
The other thing that came back from the consultation—I agree with the Opposition Members who have raised this point—was that there is an advantage in the UK’s retaining data adequacy with the EU. It was not taken for granted that we would get data adequacy. A lengthy negotiation with the EU took place before a data adequacy agreement was reached. As part of that process, officials rightly looked at what alternative there would be, should we not be granted data adequacy. It became clear that there are ways around it. Standard contractual clauses and alternative transfer mechanisms would allow companies to continue to exchange data. It would be a little more complicated. They would need to write the clauses into contracts. For that reason, there was clearly a value in having a general data adequacy agreement, but one should not think that the loss of data adequacy would be a complete disaster because, as I say, there are ways around it.
The Government are right to look at additional adequacy agreements with countries outside the EU, because therein lies a great opportunity. The EU has managed to conclude some, but not that many, and the Government have rightly identified a number of target countries where we see benefits from achieving data adequacy agreements. It is perfectly possible for us to diverge to a limited extent from GDPR and still retain adequacy. Notably, the EU recognises New Zealand's regime as adequate, even though New Zealand's data protection laws differ from those of the EU. Our decision to appoint the former New Zealand Privacy Commissioner as our own Information Commissioner means that he brings a particular degree of knowledge about that, which will be very useful.
In considering data protection law, it is sometimes said that there is a conflict between privacy—the right of consumers to have their data protected—and the innovation and growth opportunities of technology companies. I do not believe that is true; the two things have to be integral parts of our data protection laws. If people believe that their privacy is at risk, they will not trust the exchange of data. One problem is that, in general, people read only about the problems that arise, particularly from things such as identity theft, hacks and the loss of data when people mislay memory sticks or phones, or when cyber-criminals hack into large databases and take all their financial information. All those things are a genuine risk, but they present only one side of the picture and, in general, people reach their view about the importance of data protection according to the risks alone, without necessarily seeing the real benefits that come from the free exchange of data. That was perhaps the lesson that covid taught us more than any other: the exchange of data allowed us to research and develop vaccines. We were able to research what worked in terms of prevention and the various measures that could be taken to protect consumers from getting covid. Covid was therefore the big demonstration that data exchange can bring real benefits to all consumers. We are just on the threshold—
Further to my right hon. Friend’s point about facilitating a trusted mechanism for sharing data, does he agree that the huge global success of open banking in this country has demonstrated that a trust framework not only makes people much more willing to exchange their data but frees up the economy and creates a world-leading sector at the same time?
I agree with my hon. Friend on that. The use of smart data in open banking demonstrates the benefits that can flow from its use, and that example could be replicated in a large number of other sectors to similar benefit. I hope that that will be one benefit that will eventually flow from the changes we are making.
As I say, we are on the threshold of an incredibly exciting time. The use of artificial intelligence and automated decision making will bring real consumer benefits, although, of course, safeguards must be built in. The question of algorithmic bias was examined by the Centre for Data Ethics and Innovation, which found evidence of it. Obviously, we need to take account of that and build in protections against it, but, in general, the opportunities that can flow from making data more easily available are enormous.
I wish to flag up a couple of things. People have long found pop-up cookie banners deeply irritating. They have become self-defeating, because they are so ubiquitous that everybody just presses “yes”. The whole point of them was to acquire informed consent, but that is undermined if everybody is confronted by these things every time they log on to the internet and automatically presses “yes” without properly reading what they are consenting to. Restricting consent requests to cookies that involve the intrusive acquisition of data, and explaining that to people, is clearly an improvement. That will not only make data exchange easier but increase consumer protection, as people will know that they are being asked to give consent because they may choose not to allow their data to be used.
I understand the concerns that have been expressed about the Bill in some areas, particularly about the powers that will be given to the Secretary of State, but this is a complicated area. It is also one where technology is moving very fast. We need flexible legislation to keep up to date with the development of technology, so, to some extent, secondary legislation is probably the right way forward. We will debate these matters in Committee, but, generally, the Bill will help to deliver the Government’s declared intention, which is to make the UK the most successful data-driven technology economy in the world.
Data Protection and Digital Information (No. 2) Bill (First sitting) Debate
John Whittingdale (Conservative - Maldon)
Public Bill Committees

It is a really tight timetable this morning and we have nine minutes left. The Minister wants to ask some questions and there are three Members from the Opposition. I will call the Minister now. Perhaps you would be kind enough, Minister, to leave time for one question each from our three Members of the Opposition.
Q
John Edwards: The obligation to investigate every complaint does consume quite a lot of our resources. Can I ask my colleague to make a contribution on this point?
Paul Arnold: As the commissioner says, that duty to investigate all complaints can challenge us in terms of where we need to dedicate the majority of our resources.
On the previous question and answer: our role in trying to provide or maximise regulatory certainty means being able to invest as much resource as we can in that upstream advice, particularly in those novel, complex, finely balanced, context-specific areas. We add far more value if we can provide that support upstream.
Overall, the additional statutory objectives being added through the Bill will be a real asset to our accountability. Any regulator that welcomes independence also needs to welcome accountability. It is the means through which we describe how we think, how we act and the outcomes that we achieve. Those extra statutory objectives will be a real aid to us, and also an aid to Parliament and our stakeholders. They really do crystallise and clarify why we are here and how we will prioritise our efforts and resources.
Q
John Edwards: I do not believe there is anything in the Bill that would put at risk the adequacy determination with the European Union. The test the Commission applies is whether the law is essentially equivalent. New Zealand lacks many of the features of the GDPR, as do Israel and Canada, each of which has maintained adequacy status. The importance of an independent regulator is preserved in this legislation. All the essential features of the UK GDPR, and the rights that citizens of the European Union enjoy, are present in the Bill, so I do not believe that there is a realistic prospect of the Commission negatively reviewing the adequacy determination.
It is a brutal cut-off, I am afraid, at 9.55 am. I have no discretion in this matter. It is a quick-fire round now, gentlemen. We need quick questions and quick answers, with one each from Carol Monaghan, Chi Onwurah and Mike Amesbury.
We still have not heard definitively whether our other guests can hear us or speak to us, so we are waiting for confirmation from the tech people. In the meantime, I invite the Minister to question Vivienne Artz.
Q
Vivienne Artz: The Bill provides the opportunity for the Government to look at a range of issues and to move away from an equivalence approach to one in which we can consider more factors and features. The reality is that if you compare two pieces of legislation, you will always find differences, because they come from different cultural backgrounds and different legal regimes. There will always be differences. The approach the UK is taking in the Bill is helpful because it looks at outcomes and broader issues, such as the rule of law in different jurisdictions.
What is said on paper is not necessarily what always happens in practice; we need to look at it far more holistically. The legislation gives the Government the opportunity to take that broader and more common-sense view with regard to adequacy and not just do a word-by-word comparison of legislative provisions without actually looking at how the legislation is implemented in that jurisdiction and what other rights can support the outcomes. We can recognise that there is a different legal process and application but ask whether it still achieves the same end. That is what is really important. There is an opportunity not only to move more quickly in this space but to consider jurisdictions that might not be immediately obvious but none the less still offer appropriate safeguards for data.
Q
Vivienne Artz: The current process is incredibly cumbersome for businesses and, if I am honest, it provides zero transparency for individuals as well. It tends to be mostly a paperwork exercise—forgive me if that sounds provocative, but putting in place the model clauses is very often an expensive paperwork exercise. At the moment, it is difficult, time-consuming and costly.
The thing with adequacy is that it is achieved at a Government-to-Government level. It is across all sectors and provides certainty for organisations to move forward to share information, sell their goods and services elsewhere and receive those goods and services, and for consumers to access those opportunities as well. Adequacy is certainly the ideal. Whether it is achievable in all jurisdictions I do not know, but I think it is achievable for many jurisdictions to provide confidence for both consumers and businesses on how they can operate.
We can see Mr Ustaran and Ms Bellamy and they can hear us, but we cannot hear them, so we will carry on with questioning Vivienne Artz.
We have 12 minutes left and two Members are indicating that they wish to ask questions after you, Minister.
Q
Eduardo Ustaran: That is a very important question to address, because one of the ways in which we should look at this legislative reform is as a way of seeing how the existing GDPR framework in both the EU and the UK could be made more effective, relevant and modern in dealing with the issues we are facing right now. You refer to artificial intelligence as one of those issues.
GDPR, in the EU and the UK, is about five years old. It is not a very old piece of legislation, but a number of technological developments have happened in the past five years. More importantly, we have learned how GDPR operates in practice. This exercise in the UK is in fact very useful, not just for the UK but for the EU and the world at large, because it looks at how to reform elements of an existing law already in operation in order to make it more effective. That does not mean that the law needs to be more onerous or more strict; it can be more effective at the same time as being more pragmatic. This is an important lens through which to view legislative reform, and not only from the UK's point of view. The UK can make an effort to make the changes more visible outside the United Kingdom, and possibly influence the way in which EU GDPR evolves in the years to come.
Bojana Bellamy: I agree that we need a more flexible legal regime to enable the responsible use of AI and machine learning technologies. To be very frank with you, I was hoping the Bill would go a little further. I was hoping that there would be, for example, a recognition of the use of data to train algorithms so as to ensure that they are not discriminatory or biased and that they function properly. I would have hoped that would be considered an example of legitimate interests. That is certainly a way in which the Government can go further, because there are possibilities for the Secretary of State to augment those provisions.
We have seen that in the European AI Act, where they are now allowing greater use of data for algorithmic AI training, precisely in order to ensure that algorithms work properly. Dubai's data protection law does this, and some others are starting to do the same. I hope that we have good foundations to ensure further progression of the rules on AI. The rules on automated decision making are certainly better in this Bill than they are in GDPR. They are more realistic; they recognise that we are going to be faced with AI and machine learning taking more and more decisions, with, of course, the possibility of human intervention.
Again, to those who criticise the rules, I would say it is more important to have these express rights for individuals. We should emphasise, in the way we have done in the Bill, the right to be informed that AI is involved, the right to make a representation, the right to contest a decision, and the right to demand human review or human intervention. To me, that is really what empowers individuals and gives them trust that the decisions will be made in a better way. There is no point in prohibiting AI in the way GDPR sort of does. With GDPR, we are going to have something of a clash between the fact that the world is moving toward greater use of AI and the prohibition in article 22 on automated decision making, which makes such processing subject to consent or contract. That is really unrealistic. Again, we have chosen a better way.
As a third, smaller detail, I find the rules on research purposes to be smarter. They are rather complicated to read, to be frank, but I look forward to the consolidated, clean version. The fact that technological development and commercial research are included within the definition of research will enable the organisations that are developing AI to do so in a responsible way that creates the right outcomes for people and does not create harms or risks. To me, that is what matters. That is more important, and that is what is going to be delivered here. We have the exemptions from notices for research and so on, so I feel we will have better conditions for the development of AI in a responsible and trusted way. However, we must not take our eyes off it. We really need to link GDPR with our AI strategy, and ensure that we incentivise organisations to be accountable and responsible when they are developing and deploying AI. That will be a part of the ICO's role as well.
Five minutes left. This will be the quick-fire round. I have two Members indicating that they wish to ask questions—Chi Onwurah.
Q
Neil Ross: Smart data is potentially a very powerful tool for increasing consumer choice, lowering prices and giving people access to a much broader range of services. The smart data provisions that the Government have introduced, as well as the Smart Data Council that they are leading, are really welcome. However, we need to go one step further and start to give people and industries clarity about where the Government will look first, in terms of what kind of smart data provisions they might pursue and what kind of sectors they might go into. Ultimately, we need to make sure that businesses are well consulted and that there is a strong cost-benefit analysis. We then need to move ahead with the key sectors that we want to push forward on. As with nuisance calls, we will send some suggested text to the Committee to add those bits in, but it is a really welcome step forward.
Q
Neil Ross: I do not want to name specific sectors at this point. We are having a lot of engagement with our members about where we would like to see it first. The transport sector is one area where it has been used in the past and could have a large use in the future, but it is something that we are exploring. We are working directly with the Government through the Smart Data Council to try to identify the initial sectors that we could look at.
Q
Chris Combemale: I think the single biggest issue that has troubled our members since the implementation of GDPR is legitimate interest, which was raised by the hon. Member for Folkestone and Hythe. The main issue is that GDPR contains six bases of data processing, which in law are equal. For the data and marketing industry, the primary bases are legitimate interest and consent. For some reason it has become widely accepted, through the implementation of GDPR, that it requires consent for marketing and for community activities. I am sure that you hear in your constituencies of many community groups that feel that they cannot go about organising local events because they must have consent to communicate. That was never the intention behind the legislation; in fact, the European Court of Justice has always ruled that any legal interest could be a legitimate interest, including advertising and marketing.
If you look at what we do, which is effectively finding and retaining customers, GDPR says in recital 4 that privacy is a fundamental right, not an absolute right, and must be balanced against other rights, such as the right to conduct a business. You cannot conduct a business without the right to find and retain customers, just as you cannot run a charity without the right to find the donors and volunteers who provide the money and the labour for your good cause. The clarification is really important across a wide range of use cases in the economy, but particularly ours. It was recognised in GDPR in recital 47. What the legislation does is give illustrative examples drawn from recitals 47, 48 and 49. They are not new examples; they are simply elevated to the main text. It is an illustrative list. Really, any legal interest could be a legitimate interest for the purpose of data processing, subject to necessity and proportionality, which we discussed earlier with the Information Commissioner.
Q
Chris Combemale: In the sector that I represent, we have a fairly clear understanding of the gradients of risk. As I was saying earlier, many companies do not share data with other companies. They are interested solely in the relationships that they have with their existing customers or prospects. In that sense, all the research we do on customer attitudes to privacy indicates that people are generally comfortable sharing data with companies they trust and do business with regularly.
Data Protection and Digital Information (No. 2) Bill (Second sitting) Debate
John Whittingdale (Conservative - Maldon)
Public Bill Committees

Q
Jonathan Sellors: I think it is a thoroughly useful clarification of what constitutes research. It is welcome, because it was not entirely clear under the provisions of the General Data Protection Regulation what the parameters of research were.
Tom Schumacher: I completely concur: it is very useful. I would say that a couple of things really stand out. One is that it makes it clear that private industry and other companies can participate in research. That is really important, particularly for a company like Medtronic because, in order to bring our products through to help patients, we need to conduct research, have real-world data and be able to present that to regulators for approval. It will be extremely helpful to have that broader definition.
The other component of the definition that is quite helpful is that it makes it explicit that technology development and other applied research constitutes research. I know there is a lot of administrative churn trying to figure out what constitutes research and what does not, and I think this is a really helpful piece of clarification.
Q
Tom Schumacher: Maybe I can give an example. One of the businesses we purchased is a business based in the UK called Digital Surgery. It uses surgical videos to try to improve the surgery process and create technologies to aid surgeons in prevention and care. One of the challenges has been: to what extent does the use of surgery videos to create artificial intelligence and better outcomes for patients count as research? Ultimately, it was often the case that a particular site or hospital would agree, but it created a lot of churn, activity and work back and forth to explain exactly what was to be done. I think this will make it much clearer and easier for a hospital to say, “We understand this is an appropriate research use”, and to be in a position to share that data according to all the protections that the GDPR provides around securing and de-identifying the data and so on.
Jonathan Sellors: I think our access test, which we apply to all our 35,000 users, is to ensure they are bona fide researchers conducting health-related research in the public interest. We quite often get asked whether the research they are planning to conduct is legitimate research. For example, a lot of genetic research, rather than being based on a particular hypothesis, is hypothesis-generating—they look at the data first and then decide what they want to investigate. This definition definitely helps clear up quite a few—not major, but minor—confusions that we have. They arise quite regularly, so I think it is a thoroughly helpful development to be able to point to something with this sort of clarity.
Q
Jonathan Sellors: The short answer would be yes. I was contacted by NHS England about the wording of some of the consent aspects, some of the research aspects and particularly some of the pseudonymisation aspects, because that is an important safeguard. Most research is essentially conducted on pseudonymised rather than identifiable data. The way it has been worded and clarified is very useful, because it makes an incremental improvement on what is already there in the GDPR. I think it is a good job.
Tom Schumacher: Yes, I would say the same. NHS Transformation and the Department for Culture, Media and Sport, particularly Owen Rowland and Elisabeth Stafford, have been very willing to hear points of view from industry and very proactive in reaching out for our feedback. I feel like the result reflects that good co-ordination.
Q
Jonathan Sellors: Yes, I think it is reasonably clear.
Q
Phillip Mind: Clauses 62 and 64 make provision for the Secretary of State and Treasury to consult on smart data schemes. We think that those provisions could be strengthened. We see a need for impact assessments, cost-benefit analysis and full consultation. The Bill already allows for a post-implementation review, and we would advise that too.
Harry Weber-Brown: I think the other one to call out is the pensions dashboard, which has been driven out of the Money and Pensions Service. Although it has not actually launched yet, it has brought the life assurance industry onside to develop free access to information. The consumer will be able to see all their pensions holdings in a single place, which will then help them to make better financial decisions.
I think my former employer, the Investing and Saving Alliance, was working on an open savings, investments and pensions scheme. Obviously, that is not mandatory, but this is where the provision for secondary legislation is absolutely imperative, to ensure that you get a wide range of firms utilising this. At the moment, it is optional, but firms are still lining up and wanting to use it. There is a commitment within the financial services industry to do this, but having the legislation in place—secondary legislation, in particular—will ensure that they all do it to the same technical and data standards, and have a trust framework that wraps around it. That is why it is so imperative to have smart data.
Q
Harry Weber-Brown: In part 2 or part 3 of the Bill? The digital verification services or smart data?
I will come on to digital verification. Let us focus on smart data, to begin with.
Harry Weber-Brown: On that, Australia is certainly one of the leaders. The consumer has a data right under legislation that enables them to recall information from across a variety of sectors, not just financial services, and to have their information shared in a structured format with a data consumer—a third-party provider, as in open banking. Things are afoot. A lot of work is going on in the States, but less in Europe, interestingly. Legislation is coming through, but I think the big country to watch from our perspective is Australia and what has happened there. Theirs is a more far-reaching approach than ours. That is for the smart data side.
There is a risk that if we do not extend that data right to other financial services, the consumer has a very limited view of what they can actually share. They can share their bank account details and possibly their pensions data as well, but what about their savings and investments, certainly in non-pension type wrappers? Give the consumer a full, holistic view of all their holdings and their debt as well, so that they can see their balance, as it were, and make better financial decisions. That is why we think it is so important to have part 3 of the Bill go through and for secondary legislation to follow behind it.
There is a risk that if we do not do that, the consumer has a very fragmented view. Does that mean that overseas, where it is legislated for, the consumer would have a more holistic view of everything? Would that drive investment overseas, rather than into the UK? As Phillip said, open banking has really heralded a range of fintech providers being able to consume data and provide value-added services on top of that banking data. I think it rebalances the marketplace as well.
Phillip Mind: To build on Harry’s remarks, I think that the real opportunity is for the UK to build a flourishing fintech industry. We have that already; open banking is actually one of our exports. Our way of doing open banking—the standards and the trust framework—has been a successful export, and it has been deployed in other jurisdictions. The opportunity around open data is to maintain that competitiveness for UK fintech when it is trading abroad.
Most of the consequences of extending beyond open banking into other smart data schemes impact UK businesses and consumers. I do not necessarily see that there is a competitiveness issue; it is bounded within the domestic economy.
Q
Harry Weber-Brown: That is a very good question. I did quite a lot of consumer research in my previous capacity, and consumers are initially quite sceptical, asking “Why are you asking me for identity details and things?” You have to explain fully why you are doing that. Certainly having Government support and things like the trust framework and a certification regime to make sure that the consumer knows whom they are dealing with when they are passing over sensitive data will help to build the trust to ensure that consumers will utilise this.
The second part to that is what types of services are built on top of the identity system. If I have the identity verified to an AML—anti-money laundering—standard for financial services, I could use it for a whole suite of other types of activity. That could be the purchase of age-restricted products, or sharing data with my independent financial adviser; it could reduce fraud in push payments, and so on. There is a whole suite of different types of services; you would not be using it just for onboarding. I think the Government support of this under digital verification services, part 2 of the Bill, is critical to make sure it happens.
It is opt-in. We are not saying to people that they have to get an identity card, which obviously is not hugely popular; but if we can demonstrate the value of having a digital identity, with support and trust—with the trust framework and certification with Government—we will not necessarily need to run a full marketing campaign to make sure that consumers use this.
Look at other territories—for example, Norway with Vipps, or Sweden’s BankID. I think about 98% of the population now use ID in a digital format; it is very commonplace. It is really a question of looking at the use cases—examples of how the consumer could utilise this—and making sure they receive utility and value from the setting up and the utilisation of the ID. The ID by itself is not necessarily compelling enough; the point is what you can use it for.
Phillip Mind: Trust and acceptance are key issues, and the Bill lays the legislative foundations for that. We already assert our identity digitally when we open accounts, but we do so on a one-off basis. The challenge is to go from doing so on a one-off basis to creating a digital token that is safe and secure and that allows us to reuse that digital identity. For that to work, that token has to be widely accepted, and that is a really complex strategic challenge, but the Bill lays the foundations.
We will transact digitally more and more; that is for sure. At the moment, we have a consultation, from the Treasury and the Bank of England, on a central bank digital currency. Arguably, that would benefit hugely from a reusable digital identity, but we need to be able to create the token in the right way. It could be enabling for people who have access to a smartphone but do not have a passport or driving licence; it could also build inclusion, in terms of identity. So we are very supportive of a reusable digital identity, but it is a big challenge, and the challenge is gaining trust and acceptance.
Q
Harry Weber-Brown: Financial services obviously rely heavily on data to be able to fashion their products accordingly and make them personal, so I think it is critical to have a smart data regime where everything is collected in a single, structured format and shared through what is known as an API, an application programming interface, which is a common way of securely sharing data.
Some of the other use cases from smart data that would benefit business would be things like sharing data around fact find. For example, if someone wants to instruct an independent financial adviser, could they not use this as a way of speeding up the process, rather than having to wait on letters of authority, which are written and take time? Similarly, with pension providers, if I wanted to move from one pension to another or to consolidate things, could we use the smart data to get an illustration of what impact that might have, so that before I ported it over I could see that?
For big financial services firms—well, for all of them—efficiencies are delivered because, as my colleague said, we are using digital as opposed to having to rely on manual processing. As long as the safeguards are put in place, that spawns a whole array of different types of use case, such as with regulatory reporting. If I need to report things to the regulator, could I use smart data provision to do that? That would benefit businesses. A lot of the financial services industry still relies on reporting on Excel spreadsheets and CSV files, so if we can digitise that, it would certainly make it a much more efficient economy.
Q
Keith Rosser: From that 70,000 example, we have not seen evidence yet that public trust has been negatively impacted. There are some very important provisions in the Bill that go a long way towards assuring that. One is the creation of a governance body, which we think is hugely important; there has to be monitoring of standards within the market. The Bill also introduces the idea of certifying companies in the market. That is key, because in this market right now 30% of DVSs—nearly one in three companies—are not certified. The provision to introduce certification is another big, important move forward.
We also found, through a survey, that we had about 25% fewer objections when a user, company or employer was working with a certified company. Those are two really important points. In terms of the provision on improving the fraud response, we think there is a real opportunity to improve what DVSs do to tackle fraud, which I will probably talk about later.
Q
Keith Rosser: I have every reason to believe that organisations not certified will not be meeting anywhere near the standards that they should be meeting under a certified scheme. That appears really clear. They certainly will not be doing as much as they need to do to tackle fraud.
My caveat here is that across the entire market, even the certified market, I think that there is a real need for us to do more to make sure that those companies are doing far more to tackle fraud, share data and work with Government. I would say that uncertified is a greater risk, certainly, but even with certified companies we must do more to make sure that they are pushed to meet the highest possible standards.
Q
Keith Rosser: Yes. The requirement on DVSs to tackle fraud should be higher than it currently is.
Q
Keith Rosser: Absolutely. I will give a quick example relating to the Online Safety Bill and hiring, which I am talking about. If you look at people getting work online by applying through job boards or platforms, that is an uncertified, unregulated space. Ofcom recently did research, ahead of the Online Safety Bill, that found that 30% of UK adults have experienced employment scams when applying for work online, which has a major impact on access to and participation in the labour market, for many reasons.
Turning the question the other way around, we can also use that example to show that where we do have uncertified spaces, the risks are huge, and we are seeing the evidence of that. Specifically, yes, I would expect the governance body or the certification regime, or both, to really put a requirement on DVSs to do all the things you said—to have better upstream processes and better technology.
Also, given that we have been live with this in hiring for eight months, there is a big gap in providing better information to the public. At the moment, if I am a member of the public applying for a job and I need to use my digital identity, there is no information for me to look at, unless the employer—the end user—provides me with something up front. Many do not, so I go through this process without any information about what I am doing. It is a real missed opportunity so far, but now we can put that right to make sure that DVSs provide at least basic information to the public about what to do, what not to do, what questions to ask and where to get help.
Q
Helen Hitching: Yes, it will aid it. Again, it brings in the ability to put the data protection framework on the same level, so we can share data in an easier fashion and make it less complex.
Q
Helen Hitching: The agency does not believe that those safeguards will be lowered. We will still not be able to share data internationally with countries that do not have the same standards that are met by the UK. It will provide greater clarity about which regimes should be used and at which point. The standards will not reduce.
Q
Helen Hitching: The agency has had to undertake a test to make sure that there is adequate or, essentially, equivalent protection. That standard is now changing to “not materially lower”, so it will be a lot easier to understand where those protection levels are the same as or not materially lower than the UK’s. It will be simplified a lot.
Q
Aimee Reed: Policing thinks that that will significantly simplify things. It will not reduce the level of oversight and scrutiny that will be placed upon us, which is the right thing to do. In terms of the simplicity of that and the regimes that we are under, we are very supportive of that change.
Helen Hitching: Likewise, we are supportive and welcome the simplification. We do note, however, that the Biometrics Commissioner currently has a keen focus on developing technology in a legal manner and consults with the public. We would ask that there remains a focus on that oversight of biometrics, to assure the public that that work remains a priority once the regulation of biometrics transfers to the Information Commissioner’s Office and to make sure that that focus is retained.
Q
Aimee Reed: On balance, it will make things easier. We are retaining the very different sections of the Act under which different organisations operate, and the sections that look to improve joint working across part 3 and part 4 agencies are very welcome. At the moment that is not about simplifying the relationships between those in, say, part 2 and part 3, albeit data sharing is entirely possible. In essence, it is going to get simpler and easier to share data, but without losing any of the safeguards.
Q
Mary Towers: The right to make a data subject access request—again, like the DPIAs—is an absolutely crucial tool for trade unions in establishing transparency over how workers’ data is being used. Really, it provides a route for workers and unions to get information about what is going on in the workplace, how technologies operate and how they are operating in relation to individuals. It is a vital tool for trade unions.
What we are concerned about is that the new test specified in the Bill will provide employers with very broad discretion to decide when they do not have to comply with a data subject access request. The use of the term “vexatious or excessive” is a potential barrier to providing the right to an access request and provides employers with a lot of scope to say, for example, “Well, look, you have made a request several times. Now, we are going to say no.” However, there may be perfectly valid reasons why a worker might make several data subject access requests in a row. One set of information that is revealed may then lead a worker to conclude that they need to make a different type of access request.
We say that it is really vital to preserve and protect the right for workers to access information. Transparency as a principle is something that, again, goes to really important issues. For example, if there is discriminatory operation of a technology at work, how does a worker get information about that technology and about how the algorithm is operating? Data subject access requests are a key way of doing that.
Q
Andrew Pakes: “If we get this right” is doing a lot of heavy lifting there; I will leave it to Members to decide the balance. That should be the goal. There is a wonderful phrase from the Swedish trade union movement that I have cited before: “Workers should not be scared of the new machines; they should be scared of the old ones.” There are no jobs, there is no prosperity and there is no future for the kind of society that our members want Britain to be that does not involve innovation and the use of new technology.
The speed at which technology is now changing, and the power of this technology compared with previous periods of economic change, make us believe that there has to be a good, robust discussion about the checks and balances in the process. We have seen in wider society—whether through A-level results, the Post Office or other things—that the detriment to the individuals affected is significant if legislators get that balance wrong. I agree with the big principle and I will leave you to debate that, but we would certainly urge that checks and balances need to be balanced, not one-sided.
Mary Towers: Why does respect for fundamental rights have to be in direct conflict with growth and innovation? There is not necessarily any conflict there. Indeed, in a workplace where people are respected, have dignity at work and are working in a healthy way, that can only be beneficial for productivity and growth.
Q
Andrew Pakes: That is the first base. The power of technology is changing so quickly, and the informal conversations we have every day with employers suggest that many of them are wrestling with the same questions that we are. If we get this legislation right, it is a win-win when it comes to the question of how we introduce technology in workplaces.
You are right to identify the changing nature of work. We would also identify people analytics, or the use of digital technology to manage people. How we get that right is about the balance: how do you do it without micromanaging, without invading privacy, without using technology to make decisions without—this is a horrible phrase, but it is essentially about accountability—humans in the loop? Good legislation in this area should promote innovation, but it should also have due regard to balancing how you manage risks and reduce harms. That is the element that we want to make sure comes through in the legislation in its final form.
Data Protection and Digital Information (No. 2) Bill (Third sitting) Debate
John Whittingdale (Conservative - Maldon)
Public Bill Committees

I have a few preliminary announcements that Mr Speaker would like me to make. Hansard colleagues would be grateful if Members emailed their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent mode. Tea and coffee are not allowed during sittings.
The selection list for today’s sitting, which is available in the room, shows how the selected amendments have been grouped for debate. Grouped amendments are generally on the same or a similar issue. Please note that decisions on amendments will take place not in the order in which they are debated, but in the order in which they appear on the amendment paper. The selection and grouping list shows the order of debates. Decisions on each amendment will be taken when we come to the clause to which the amendment relates.
The Member who has put their name to the lead amendment in a group will be called first. Other Members will then be free to catch my eye to speak on all or any of the amendments within that group. A Member may speak more than once in a single debate. At the end of a debate on a group of amendments, I shall again call the Member who moved the lead amendment. Before they sit down, they will need to indicate to me whether they wish to withdraw the amendment or to seek a decision. If any Member wishes to press any other amendment in a group to a vote, they will need to let me know.
Clause 1
Information relating to an identifiable living individual
Question proposed, That the clause stand part of the Bill.
It is a pleasure to serve under your chairmanship, Mr Hollobone. May I thank all hon. Members for volunteering to serve on the Committee? When I spoke on Second Reading, I expressed my enthusiastic support for the Bill—just as well, really. I did not necessarily expect to be leading on it in Committee, but I believe it is a very important Bill. It is complex and will require quite a lot of scrutiny, but it will create a framework of real benefit to the UK, by facilitating the exchange of data and allowing us to take the maximum advantage of emerging technologies. I look forward to our debates over the next few days.
Clause 1 will create a test in legislation to help organisations to understand whether the data that they are processing is personal or anonymous. This is important, because personal data is subject to data protection rules but anonymous data is not. If organisations can be confident that the data they are processing is anonymous, they will be able to use it for important activities such as research and product development without concern about the potential impact on individuals’ personal data.
The new test will require data controllers considering whether data is personal or anonymous to consider two scenarios. The first is where a living individual can be identified by somebody within the data controller or processor’s own organisation using reasonable means at any point at which the data is being processed, from the initial point of collection, through its use and storage, to its eventual deletion or onward transmission. The second scenario is where the data controller or processor knows, or should reasonably know, that somebody outside the organisation is likely to obtain the information and to be able to re-identify individuals from it using reasonable means. That could be a research partner or a business client with whom the data controller intends to share the data, or an outside organisation that obtains the data as a result of the data controller not putting adequate security measures in place.
What would be considered “reasonable means” in any given case takes into account, among other things, the time, effort and cost of identifying the individual, as well as the technology available during the time the processing occurs. We hope that the clarity the test provides will give organisations greater confidence about using anonymous data for a range of purposes, from marketing to medical research. I commend the clause to the Committee.
It is a pleasure to serve under your chairship, Mr Hollobone. I echo the Minister’s thanks to everyone serving on the Bill Committee; it is indeed a privilege to be here representing His Majesty’s loyal Opposition. I look forward to doing our constitutional duty as we scrutinise the Bill today and in the coming sittings.
The definition of personal data is critical, not only to this entire piece of legislation, but to the data protection regime more widely. That is because the definition of what counts as personal data sets the parameters on who will benefit from protections and safeguards set out by the legislation, and, looking at it from the other side, the various protections will not apply when data is not classed as personal. It is therefore important that the definition should be clear for both controllers and data subjects, so that everyone understands where regulations and, by extension, rights do and do not apply.
The Bill defines personal data as that where a data subject can be identified by a controller or processor, or anyone likely to obtain the information,
“by reasonable means at the time of processing”.
According to the Bill, “reasonable means” take into account the time, effort, costs, technology and resources available to the person. The addition of “reasonable” to the definition has caused major concern among civil society groups, which are worried that it will introduce an element of subjectivity from the perspective of the controller when determining whether data is personal or not. Indeed, although recital 26 of the General Data Protection Regulation also refers to reasonable means—making this, in some ways, more of a formal change than a practical one—there must still be clear parameters on how controllers or processors are to make that judgment. Without those, there may be a danger of controllers and processors avoiding the requirement to comply with rules around personal data by simply claiming they do not have the means to identify living individuals within their resources.
Has the Department undertaken an impact assessment to determine whether the definition could, first, increase subjectivity in what counts as personal data, or secondly, reduce the amount of data classified as personal data? If an assessment identifies such a risk, what steps will the Department take to mitigate that and ensure that citizens are able to exercise their rights as they can under the current definition?
Other stakeholders have raised concerns that the phrase
“at the time of the processing”
in the definition might imply that there is no continuous obligation to consider whether data is personal. Indeed, under the current definition, where personal data is
“any information that relates to an identified or identifiable living individual”,
there is an implied obligation to consider whether an individual is identifiable on an ongoing basis. Rather than assessing the identifiability of a dataset at a fixed point, the controller or processor must keep the categorisation of data that it holds under careful review, taking into account technological developments, such as sophisticated new artificial intelligence or cross-referencing tools. Inserting the phrase
“at the time of the processing”
into this definition has prompted the likes of Which? to express concern that some processors may feel that they are no longer bound by this continuous obligation. That would be particularly worrying given the potential subjectivity of the new definition. If whether an individual is identifiable is based on “reasonable means”, including one’s resources and technology, it is perfectly feasible that, with a change of resources or technology, it could become reasonable to identify a person when once it was not.
My hon. Friend makes an important point, which I will come to later.
In these circumstances, it is crucial that if a person is identifiable through data at any time in the future, the data is legally treated as personal so that the relevant safeguards and rights that GDPR was designed to ensure still apply.
When arguing for increased Secretary of State powers across the Bill, Ministers have frequently cited the need to future-proof the legislation. Given that, we must also consider the need to future-proof the definition of data so that technological advances do not render it useless. Does the new definition involve a continuous obligation to assess whether data is personal? Will guidance be offered to inform both controllers and data subjects on the application of this definition, so that both sides can be clear on how it will work in practice? As 5Rights has pointed out, that could avoid clogging up the regulator’s time with claims about what counts as personal data in many individual cases.
Finally, when determining whether data is personal, it is also vital that controllers take into account how a determined stalker or malicious actor might find and use their data. It is therefore good to see the change made since the first iteration of the Data Protection and Digital Information Bill, to clarify that
“obtaining the information as a result of the processing”
also includes information obtained as a result of inaction by a controller or processor—for example, as the result of a failure to put in place appropriate measures to prevent or reduce the risk of hacking.
Overall, it is important that we give both controllers and data subjects clarity about which data is covered by which protections, and when. I look forward to hearing from the Minister about the concerns that have been raised, which could affect the definition’s ability to allow for that clarity.
I agree absolutely with the hon. Lady that the definition of personal data is central to the regime that we are putting in place. She is absolutely right that we need to be very clear and to provide organisations with clarity about what is within the definition of personal data and what is rightly considered to be anonymous. She asks whether the provision will lead to a reduction in the current level of protection. We do not believe that it will.
Clause 1 builds on the strong foundations used in GDPR recital 26 to clarify when data can be categorised as truly anonymous without creating undue risks. The aim of the provision in the Bill is to clarify when information should be considered to be personal data by including a test for identifiability in the legislation. That improved clarity will help organisations to determine when data can be considered truly anonymous and therefore pose almost no risk to the data subject.
The hon. Lady asked whether
“at the time of the processing”
extends into the future, and the answer is yes. The definition of data processing in the legislation is very broad and includes a lot of processing activities other than just the collection of data, such as alteration, retrieval, storage and disclosure by transmission, to name just a few. The phrase
“at the time of the processing”
could therefore cover a long period, depending on the nature and purpose of the processing. The test would need to be applied afresh for each new act of processing. That means that if, at any point in the life cycle of processing, the data could be re-identified by someone using reasonable means, it could no longer legally be considered anonymous. That includes data transferred abroad to other regimes.
The clause makes it clear that a controller will have to consider the likelihood of re-identification at all stages of the processing activity. If a data controller held a dataset for several years, they would need to be mindful of the technologies available during that time that might be used to re-identify it. As the hon. Lady said, technology is advancing very fast and could well change over time from the point at which the data is first collected.
I appreciate the Minister’s clarification. He has just said that the test of identification would apply when sharing the data with another authority. However, once that has been done, the test no longer applies. Does he accept that it is possible for data to be shared that could not by this test reasonably be identified but that, over time, in a different authority, could reasonably be identified, without the data subject having any redress?
If data is shared and then held by a new controller, it will still be subject to the same protections even though it has been transferred from the original controller. It is important that the protection should continue to apply no matter how technology evolves over the course of time; the data will still be subject to the same protection and, of course, still be enforceable through the Information Commissioner.
Would it be subject to the same protection if it was transferred abroad?
Again, yes, it will. It will be transferred abroad only if we are satisfied that the recipient will impose the same level of protection that we regard as necessary in this country.
Question put and agreed to.
Clause 1 accordingly ordered to stand part of the Bill.
Clause 2
Meaning of research and statistical purposes
I beg to move amendment 66, in clause 2, page 4, line 8, at end insert—
“(c) do not include processing of personal data relating to children for research carried out as a commercial activity.”
This amendment would exempt children’s data from being used for commercial purposes under the definition of scientific purposes in this clause.
I wish to pose a couple of questions, after two thoughtful and well-presented amendments from those on the Opposition Front Bench. With regard to children and the use of apps such as TikTok, what assurance will the Government seek to ensure that companies that process and store data abroad are abiding by the principles of our domestic legislation? I mention TikTok directly because it stores data from UK users, including children, in Singapore, and it has made clear in evidence to the Joint Committee on the Online Safety Bill that that data is accessed by engineers in China who are working on it.
We all know that when data is taken from a store and used for product development, it can be returned in its original state but a huge amount of information is gathered and inferred from it that is then in the hands of engineers and product developers working in countries such as China and under very different jurisdictions. I am interested to know what approach we would take to companies that store data in a country where we feel we have a data equivalence regime but then process the data from a third location where we do not have such a data agreement.
I welcome the recognition of the importance of allowing genuine research and the benefits that can flow from it. Such research may well be dependent on using data and the clause is intended to provide clarity as to exactly how that can be done and in what circumstances.
I will address the amendments immediately. I am grateful to the hon. Member for Barnsley East for setting out her arguments and we understand her concerns. However, I think that the amendments go beyond what the clause proposes and, in addition, I do not think that there is a foundation for those concerns. As we have set out, clause 2 inserts in legislation a definition for processing for scientific research, historical research and statistical purposes. The definition of scientific research purposes is set out as
“any research that can be reasonably described as scientific”
and I am not sure that some of the examples that the hon. Lady gave would meet that definition.
The definitions inserted by the clause are based on the wording in the recitals to the UK GDPR. We are not changing the scope of these definitions, only their status in the legislation. They will already be very familiar to people using them, but setting them out in the Bill will provide more clarity and legal certainty. We have maintained a broad scope as to what is allowed to be included in scientific research, with the view that the regulator can add more nuance and context through guidance, as is currently the case. The power to require codes of practice provides a route for the Secretary of State to require the Information Commissioner to prepare any code of practice that gives guidance on good practice in processing personal data.
There will be situations where non-statutory guidance, which can be produced without being requested under regulations made by the Secretary of State, may be more appropriate than a statutory code of practice. Examples of the types of activity that are considered scientific research and the indicative criteria that a researcher should demonstrate are best placed in non-statutory guidance produced by the Information Commissioner’s Office. That will give flexibility to amend and change the examples when necessary, so I believe that the process does not change the provision. However, putting it in the legislation, rather than in the recitals, will impose stronger safeguards and make things clearer. Once the Bill has come into effect, the Government will continue to work with the ICO to update its already detailed and helpful guidance on the definition of scientific research as necessary.
Amendment 66 would prohibit the use of children’s data for commercial purposes under the definition of scientific research. The definition inserted by clause 2 includes the clarification that processing for scientific research carried out as a commercial activity can be considered processing for scientific research purposes. Parts of the research community asked for that clarification in response to our consultation. It reflects the existing scope, as is already clear from the ICO’s guidance, and we have seen that research by commercial bodies can have immense societal value. For instance, research into vaccines and life-saving treatments is clearly in the public interest. I entirely understand the hon. Lady’s concern for children’s privacy, but we think that her amendment could obstruct important research by commercial organisations, such as research into children’s diseases. I think that the Information Commissioner would make clear whether the kind of example that the hon. Lady gave falls within the definition of research for scientific purposes.
I also entirely understand the concern expressed by my hon. Friend the Member for Folkestone and Hythe. I suspect that the question about the sharing of data internationally, particularly, perhaps, by TikTok, may recur during the course of our debates. As he knows, we would share data internationally only if we were confident that it would still be protected in the same way that it is here, which would include considering the possibility of whether or not it could then be passed on to a third country, such as China.
I hope that I can reassure the hon. Lady by emphasising that the safeguards in clause 22, with which researchers must comply to protect individuals, apply to all data used for these purposes, including children’s data, alongside the protections afforded to children under the UK GDPR. For those reasons, I hope that she will be willing to withdraw her amendment.
I am disappointed that the Minister does not accept amendment 66. Let me make a couple of brief points about amendment 65. The Minister said that he was not sure whether some of the examples I gave fitted under the definition, and that is what the amendment speaks to. I asked what specific purposes would be ruled out under the letter of the current definition, and that is still not clear, so I will press the amendment to a vote.
Question put, That the amendment be made.
The clause clarifies how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. It clarifies an existing concept of “broad consent” that is currently found in the recitals. The measure will enable consent to be obtained for an area of scientific research when the researcher cannot fully identify the purposes for which they are collecting the data.
Consent under the UK GDPR must be for a specific purpose, but in scientific research the precise purpose may not be fully known when the data is collected. For example, the initial aim may be the study of cancer in general, which later becomes the study of a particular cancer type. Currently, the UK GDPR recitals clarify that consent may be given for an area of scientific research, but as the recitals are only an interpretative aid, that may not give scientists the certainty that they need. The clause will therefore add the ability to give broad consent for scientific research into the operative text of the UK GDPR, giving scientists greater certainty and confidence. The clause contains a number of safeguards to protect against misuse, including the requirement that seeking consent is consistent with ethical standards that are generally recognised and relevant to that area of research.
With regard to clause 3, I refer Members to my remarks on clause 2. It is sensible to clarify how controllers and processors conducting scientific research can gain consent where it is not possible to fully identify the full set of uses for that data when it is collected. However, what counts as scientific, and therefore what is covered by the clause, must be properly understood by both data subjects and controllers through proper guidance issued by the ICO.
Clause 4 is largely technical and inserts the recognised definition of consent into part 3 of the Data Protection Act 2018, for use when it is inappropriate to use one of the law enforcement purposes. I will talk about law enforcement processing in more detail when we consider clauses 16, 24 and 26, but I have no problem with the definition in clause 4 and am happy to accept it.
I am grateful to the hon. Lady for her support. I agree with her on the importance of ensuring that the definition of scientific research is clear. That is something on which I have no doubt the ICO will also issue guidance.
Question put and agreed to.
Clause 3 accordingly ordered to stand part of the Bill.
Clause 4 ordered to stand part of the Bill.
Clause 5
Lawfulness of processing
I beg to move amendment 68, in clause 5, page 6, line 37, at end insert—
“7A. The Secretary of State may not make regulations under paragraph 6 unless—
(a) following consultation with such persons as the Secretary of State considers appropriate, the Secretary of State has published an assessment of the impact of the change to be made by the regulations on the rights and freedoms of data and decision subjects (with particular reference to children),
(b) the Commissioner has reviewed the Secretary of State’s statement and published a statement of the Commissioner’s views on whether the change should be made, with reasons, and
(c) the Secretary of State has considered whether to proceed with the change in the light of the Commissioner’s statement.”
This amendment would make the Secretary of State’s ability to amend the conditions in Annex 1 which define “legitimate interests” subject to a requirement for consultation with interested parties and with the Information Commissioner, who would be required to publish their views on any proposed change.
At present, the lawful bases for processing are set out in article 6 of the UK GDPR. At least one of them must apply whenever someone processes personal data. They are consent, contract, legal obligation, vital interests, public task and legitimate interests—the last covering cases where data is being used in ways that we would reasonably expect, where there is minimal privacy impact, or where there is a compelling justification for the processing. Of the existing lawful bases, consent is by far the most relied upon, as it is the clearest. There have therefore been calls for the other lawful bases to be made clearer and easier to use. It is welcome to see some examples of how organisations might rely on the legitimate interests lawful ground brought on to the statute book.
At the moment, in order to qualify for using legitimate interests as grounds for lawful processing, a controller must also complete a balancing test. The balancing test is an important safeguard. As per the ICO, it requires controllers to consider the interests and fundamental rights and freedoms of the individual, and whether they override the legitimate interests that the controller has identified. That means at a minimum considering the nature of the personal data being processed, the reasonable expectations of the individual, the likely impact of processing on the individual, and whether any safeguards can be put in place to mitigate any negative impacts.
As techUK mentioned, the introduction of a list of legitimate interests no longer requiring that test is something many have long called for. When conducting processing relating to an emergency, for example, the outcome of a balancing test often very obviously weighs in one direction, making the decision straightforward and the test itself an administrative task that may slow processing down. It makes sense in such instances that a considered exemption might apply.
However, given the reduction in protection and control for consumers when removing a balancing test, it is vital that a list of exemptions is limited and exhaustive, and that every item on such a list is well consulted on. It is also vital that the new lawful basis cannot be relied upon in bad faith or exploited by those who simply want to process without the burden, for reasons outside of those listed in annex 1. The Bill as it currently stands does not do enough to ensure either of those things, particularly given the Secretary of State’s ability to add to the list on a whim.
I turn to amendment 67. Although it is likely not the intention for the clause to be open to exploitation, Reset.tech, among many others, has shared concerns that controllers may be able to abuse the new lawful basis of “recognised legitimate interests”, stretching the listed items in annex 1 to cover some or all of their processing, and giving themselves flexibility over a wide range of processing without an explicit requirement to consider how that processing affects the rights of data and decision subjects. That is particularly concerning where controllers may be able to conflate different elements of their processing.
Reset.tech and AWO provide a theoretical case study to demonstrate that point. Let us say that there is a gig economy food delivery company that processes a range of data on workers, including minute-by-minute location data. That location data would be used primarily for performance management, but could occasionally be used in more extreme circumstances to detect crime—for example, detecting fraud by workers who are making false claims about how long they waited for an order to be ready for delivery. By exploiting the new recognised legitimate interests basis, the company could conflate its purposes of performance management and detecting crime, and justify the tracking of location data as a whole as being exempt from the balancing test, without having to record or specify exactly which processing is for the detection of crime.
Under the current regime, there remain two tests other than the balancing test that form a complete assessment of legitimate interests and help to prevent conflation of that kind. First, there is the purpose test, which requires the controller to identify which legitimate interest the company is relying upon. Secondly, there is the necessity test, which requires the controller to consider whether the processing that the company intends to conduct is necessary and proportionate to meet its purposes.
In having to conduct those tests, the food delivery company would find it much more difficult to conflate its performance management and crime prevention purposes, as it would have to identify and publicly state exactly which elements of its processing are covered by the legitimate interest purpose of crime prevention. That would make it explicit that any processing the company conducts for the purposes of performance management is not permitted under a recognised legitimate interest, meaning that a lawful basis for that processing would be required separately.
Amendment 67 therefore seeks to ensure that the benefits of the purpose and necessity tests are retained, safeguarding the recognised legitimate interests list from being used to cynically conflate purposes and being exploited more generally. In practice, that would mean that controllers relying on a purpose listed in annex 1 for processing would be required to document and publish a notice that explains exactly which processing the company is conducting under which purpose, and why it is necessary.
It is foundational to the GDPR regime that each act of processing has a purpose, so this requirement would simply formalise and publish what controllers are already required to consider. The measure that the amendment seeks to introduce should therefore impose no extra burden on those already complying in good faith, but should still act as a barrier to those attempting to abuse the new basis.
I turn to amendment 68. As the likes of Which? have argued, any instance of removing the balancing test will inevitably enable controllers to prioritise their interests in processing over the impact on data subjects, resulting in weaker protections for data subjects and weaker consumer control. Which? research, such as that outlined in its report “Control, Alt or Delete? The future of consumer data”, also shows that consumers value control over how their data is collected and used, and that they desire more transparency, rather than less, on how their data is used.
With those two things in mind—the value people place on control of their data and the degradation of that control as a result of removing the balancing test—it is vital that the power to remove the balancing test is used extremely sparingly on carefully considered, limited purposes only. Even for those purposes already included in annex 1, it is unclear exactly what impact assessment took place to ensure that the dangers of removing the test on the rights of citizens did not outweigh the positives of that removal.
It would therefore be helpful if the Minister could outline the assessment and analysis that took place before deciding the items on the list. Although it is sensible to future-proof the list and amend it as needs require, this does not necessarily mean vesting the power to do so in the Secretary of State’s hands, especially when such a power is open to potential abuse. Indeed, to say that the Secretary of State must have regard to the interests and fundamental rights and freedoms of data subjects and children when making amendments to the list is simply not a robust enough protection for citizens. Our laws should not rely on the good nature of the Secretary of State; they must be comprehensive enough to protect us if Ministers begin to act in bad faith.
Further, secondary legislation simply does not offer the scrutiny that the Government claim it does, because it is rarely voted on. Even when it is, if the Government of the day have a majority, defeating such a vote is incredibly rare. For the method of changing the list to be protected from the whims of a bad faith Secretary of State who simply claims to have had regard to people’s rights, proper consultation should be undertaken by the regulator on any amendments before they are considered for parliamentary approval.
This amendment would move the responsibility for judging the impact of changes away from the Secretary of State and place it with the regulator, ensuring that amendments proceed only if they are deemed, after consultation, to be in the collective societal interest. That means there will be independent assurance that any amendments are not politically or maliciously motivated. This safeguard should not be of concern to anyone prepared to act in good faith, particularly the current Secretary of State, as it would not prevent the progression through Parliament of any amendments that serve the common good. The amendment represents genuine future-proofing that retains appropriate safeguards, as opposed to what otherwise ends up looking like little more than an excuse for a sweeping power grab.
I welcome the hon. Lady’s recognition of the value of setting out a list of legitimate interests to provide clarity, but I think she twice referred to the possibility of the Secretary of State adding to it on a whim. I do not think we would recognise that as a possibility. There is an established procedure, which I would like to go through in responding to the hon. Lady’s concerns. As she knows, one of the key principles of our data protection legislation is that any processing of personal data must be lawful. Processing will be lawful where an individual has given his or her consent, or where another specified lawful ground in article 6 of the UK GDPR applies. This includes where the processing is necessary for legitimate interests pursued by the data controller, providing that those interests are not outweighed by an individual’s privacy rights.
Clause 5 addresses the concerns that have been raised by some organisations about the difficulties in relying on the “legitimate interests” lawful ground, which is used mainly by commercial organisations and other non-public bodies. In order to rely on it, the data controller must identify what their interest is, show that the processing is necessary for their purposes and balance their interests against the privacy right of the data subject. If the rights of the data subject outweigh the interests of the organisation, the processing would not be lawful and the controller would need to identify a different lawful ground. Regulatory guidance strongly recommends that controllers document the outcome of their legitimate interests assessments.
As we have heard, and as the hon. Lady recognises, some organisations have struggled with the part of the legitimate interests assessment that requires them to balance their interests against the rights of individuals, and concern about getting the balancing test wrong—and about regulatory action that might follow as a result—can cause risk aversion. In the worst-case scenario, that could lead to crucial information in the interests of an individual or the public—for example, about safeguarding concerns—not being shared by third-sector and private-sector organisations. That is why we are taking steps in clause 5 and schedule 1 to remove the need to do the balancing test in relation to a narrow range of recognised legitimate activities carried out by non-public bodies. Those activities include processing that is necessary for the purposes of safeguarding national security or defence; responding to emergencies; preventing crimes such as fraud or money laundering; safeguarding vulnerable individuals; and engaging with the public for the purposes of democratic engagement. Before making regulations to add any new activities to the list, the Secretary of State would be required to consult the Information Commissioner and any other persons the Secretary of State considers appropriate.
Will my right hon. Friend confirm whether the Information Commissioner’s advice will be published, either by the commissioner, the Minister or Parliament—perhaps through the relevant Select Committee?
I am not sure it would necessarily be published. I want to confirm that, but I am happy to give a clear response to the Committee in due course if my hon. Friend will allow me.
As well as the advice that the Information Commissioner supplies, the proposal is also subject to the affirmative procedure, as the hon. Member for Barnsley East recognised, so Parliament could refuse to approve any additions to the list that do not respect the rights of data subjects. She suggested that it is rare for an affirmative resolution to be rejected by Parliament; nevertheless, it is part of our democratic proceedings, and every member of the Committee considering it will have the opportunity to reach their own view and vote accordingly. I hope that reassures the hon. Lady that there are already adequate safeguards in place in relation to the exercise of powers to add new activities to the list of recognised legitimate interests.
Amendment 67, which the hon. Lady also tabled, would require data controllers to publish a statement if they are relying on the new recognised legitimate interests lawful ground. The statement would have to explain what processing would be carried out in reliance on the new lawful ground and why the processing is proportionate and necessary for the intended purpose. In our view, the amendment would significantly weaken the clause. It would reintroduce something similar to the legitimate interests assessment, which, as we have heard, can unnecessarily delay some very important processing activities. In scenarios involving national security or child protection, for example, the whole point of the clause is to make sure that relevant and necessary personal data can be shared without hesitation to protect vulnerable individuals or society more generally.
I hope the hon. Lady is reassured by my response and agrees to withdraw her amendments. I commend clause 5 to the Committee.
I beg to move amendment 30, in schedule 1, page 137, line 28, leave out “fourth day after” and insert
“period of 30 days beginning with the day after”.
Annex 1 to the UK GDPR makes provision about processing for democratic engagement purposes, including certain processing by elected representatives. This amendment increases the period for which former members of the Westminster Parliament and the devolved legislatures continue to be treated as “elected representatives” following an election. See also NC6 and Amendment 31.
With this it will be convenient to discuss the following:
Government amendment 31.
Government new clause 6—Special categories of personal data: elected representatives responding to requests.
That schedule 1 be the First schedule to the Bill.
As the Committee will be aware, data protection legislation prohibits the use of “special category” data—namely, information about a person that is sensitive in nature—unless certain conditions or exemptions apply. One such exemption is where processing is necessary on grounds of substantial public interest.
Schedule 1 to the Data Protection Act 2018 sets out a number of situations where processing would be permitted on grounds of substantial public interest, subject to certain conditions and safeguards. That includes processing by elected representatives who are acting with the authority of their constituents for the purposes of progressing their casework. The current exemption applies to former Members of the Westminster and devolved Parliaments for four days after a general election—for example, if the MP has been defeated or decides to stand down. That permits them to continue to rely on the exemption for a short time after the election to conclude their parliamentary casework or hand it over to the incoming MP. In practice, however, it can take much longer than that to conclude these matters.
New clause 6 will therefore extend what is sometimes known as the four-day rule to 30 days, which will give outgoing MPs and their colleagues in the devolved Parliaments more time to conclude casework. That could include handing over live cases to the new representative, or considering what records should be retained, stored and deleted. When MPs leave office, there is an onus on them to conclude their casework in a timely manner. However, the sheer volume of their caseload, on top of the other work that needs to be done when leaving office, means that four days is just not enough to conclude all relevant business. The new clause will therefore avoid the unwelcome situation where an outgoing MP who is doing his or her best to conclude constituency casework could be acting unlawfully if they continue to process their constituents’ sensitive data after the four-day time limit has elapsed. Extending the time limit to 30 days will provide a pragmatic solution to help outgoing MPs while ensuring the exemptions cannot be relied on for an indefinite period.
Government amendments 30 and 31 will make identical changes to other parts of the Bill that rely on the same definition of “elected representative”. Government amendment 30 will change the definition of “elected representative” when the term appears in schedule 1. As I mentioned when we debated the previous group of amendments, clause 5 and schedule 1 to the Bill create a new lawful ground for processing non-sensitive personal data, where the processing is necessary for a “recognised legitimate interest”. The processing of personal data by elected representatives for the purposes of democratic engagement is listed as such an interest, along with other processing activities of high public importance, such as crime prevention, safeguarding children, protecting national security and responding to emergencies.
Government amendment 31 will make a similar change to the definition of “elected representative” when the term is used in clause 84. Clauses 83 and 84 give the Secretary of State the power to make regulations to exempt elected representatives from some or all of the direct marketing rules in the Privacy and Electronic Communications (EC Directive) Regulations 2003. I have no doubt that we will debate the merits of those clauses in more detail later in Committee, but for now it makes sense to ensure that there is a single definition of “elected representative” wherever it appears in the Bill. I hope the hon. Member for Barnsley East and other colleagues will agree that those are sensible suggestions and will support the amendments.
This set of Government provisions will increase the period for which former MPs and elected representatives in the devolved regions can use the democratic engagement purpose for processing. On the face of it, that seems like a sensible provision that allows for a transition period so that data can be deleted, processed, or moved on legally and safely after an election, and the Opposition have a huge amount of sympathy for it.
I will briefly put on record a couple of questions and concerns. The likes of the Ada Lovelace Institute have raised concerns about the inclusion of democratic engagement purposes in schedule 1. They are worried, particularly with the Cambridge Analytica scandal still fresh in people’s minds, that allowing politicians and elected parties to process data for fundraising and marketing without a proper balancing test could result in personal data being abused for political gain. The decision to make processing for the purposes of democratic engagement less transparent and to remove the balancing test that measures the impact of that processing on individual rights may indicate that the Government do not share the concern about political processing. Did the Minister’s Department consider the Cambridge Analytica scandal when drawing up the provisions? Further, what safeguards will be in place to ensure that all data processing done under the new democratic engagement purpose is necessary and is not abused to spread misinformation?
I would only say to the hon. Lady that I have no doubt that we will consider those aspects in great detail when we get to the specific proposals in the Bill, and I shall listen with great interest to my hon. Friend the Member for Folkestone and Hythe, who played an extremely important role in uncovering what went on with Cambridge Analytica.
The principle that underpinned what happened in the Cambridge Analytica scandal was the connection of Facebook profiles to the electoral register. If I understand my right hon. Friend the Minister correctly, what he is talking about would not necessarily change that situation. This could be information that the political campaign has gained anyway from a voter profile or from information that already exists in accounts it has access to on platforms such as Facebook; it would simply be attaching that, for the purposes of targeting, to people who voted in an election. The sort of personal data that Members of Parliament hold for the purposes of completing casework would not have been processed in that way. These proposals would not change in any way the ability to safeguard people’s data, and companies such as Cambridge Analytica will still seek other sources of open public data to complete their work.
I think my hon. Friend is right. I have no doubt that we will go into these matters in more detail when we get to those provisions. As the hon. Member for Barnsley East knows, this measure makes a very narrow change to simply extend the existing time limit within which there is protection for elected representatives to conclude casework following a general election. As we will have opportunity in due course to look at the democratic engagement exemption, I hope she will be willing to support these narrow provisions.
I am grateful for the Minister’s reassurance, and we are happy to support them.
One of the key principles in article 5 of the UK GDPR is purpose limitation. The principle aims to ensure that personal data is collected by controllers only for specified, explicit and legitimate purposes. Generally speaking, it ensures that the data is not further processed in a manner that is incompatible with those purposes. If a controller’s purposes change over time, or they want to use data for a new purpose that they did not originally anticipate, they can go ahead only if the new purpose is compatible with the original purpose, if they get the individual’s specific consent for the new purpose, or if they can point to a clear legal provision requiring or allowing the new processing in the public interest.
Specifying the reasons for obtaining data from the outset helps controllers to be accountable for their processing and helps individuals understand how their data is being used and whether they are happy with that, particularly where they are deciding whether to provide consent. Purpose limitation exists so that it is clear why personal data is being collected and what the intention behind using it is.
In any circumstance where we water down this principle, we reduce transparency, we reduce individuals’ ability to understand how their data will be used and, in doing so, we weaken assurances that people’s data will be used in ways that are fair and lawful. We must therefore think clearly about what is included in clause 6 and the associated annex. Indeed, many stakeholders, from Which? to Defend Digital Me, have expressed concern that what is contained in annex 2 could seriously undermine the principle of purpose limitation.
As Reset.tech illustrates, under the current regime, if data collected for a relatively everyday purpose, such as running a small business, is requested by a second controller for the purpose of investigating crime, the small business would need to assess whether this further processing—thereby making a disclosure of the data—was compatible with its original purpose. In many cases, there will be no link between the original and secondary purposes, and there are potential negative consequences for the data subjects. As such, the further processing would be unlawful, as it would breach the principle of purpose limitation.
However, under the new regime, all it would take for the disclosure to be deemed compatible with the original purpose is the second controller stating that it requires the data for processing in the public interest. In essence, this means that, for every item listed in annex 2, there are an increased number of circumstances in which data subjects’ personal information could be used for purposes outside their reasonable expectations. It seems logical, therefore, that whatever is contained in the list is absolutely necessary for the public good and is subject to the highest level of public scrutiny possible.
Instead, the clause gives the Secretary of State new Henry VIII powers to add to the new list of compatible purposes by secondary legislation whenever they wish, with no provisions made for consulting on, scrutinising or assessing the impact of such changes. It is important to remember here that secondary legislation is absolutely not a substitute for parliamentary scrutiny of primary legislation. Delegated legislation, as we have discussed, is rarely voted on, and even when it is, the Government of the day will win such a vote if they have a majority.
If there are other circumstances in which the Government think it should be lawful to carry out further processing beyond the original purpose, those should be in the Bill, rather than being left to Ministers to determine at a later date, avoiding the same level of scrutiny.
The Government’s impact assessment says that clarity on the reuse of data could help to fix the market failure caused by information gaps on how purpose limitation works. Providing such clarity is something we could all get behind. However, by giving the Secretary of State sweeping powers fundamentally to change how purpose limitation operates, the clause goes far beyond increasing clarity.
Improved and updated guidance on how the new rules surrounding reusing data work would be far more fruitful in providing clarity than further deregulation in this instance. If Ministers believe there are things missing from the clause and annex, they should discuss them here and now, rather than opening the back door to making further additions afterwards, and that is what the amendment seeks to ensure.
The clause sets out the conditions under which the reuse of personal data for a new purpose is permitted. As the hon. Lady has said, the clause expands on the purpose limitation principle. That key principle of data protection ensures that an individual’s personal data is reused only in ways they might reasonably expect.
The current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. That has led to uncertainty about when controllers can reuse personal data. The clause addresses the existing uncertainty around reusing personal data by setting out clearly when it is permitted. That includes when personal data is being reused for a very different purpose from that for which it was originally collected—for example, when a company might wish to disclose personal data for crime prevention.
The clause permits reuse of personal data by a controller where the new purpose is “compatible” with the original purpose; where they obtain fresh consent; where there is a research purpose; where the UK GDPR is being complied with, such as for anonymisation or pseudonymisation purposes; where there is an objective in the public interest authorised by law; or where certain specified objectives in the public interest, set out in a limited list in schedule 2, are met. I will speak more about that when we come to the amendment and the debate on schedule 2.
The clause contains a power to add or amend conditions or remove conditions added by regulations from that list to ensure it can be kept up to date with any future developments in how personal data should be reused in the public interest. It also sets out restrictions on reusing personal data that the controller originally collected on the basis of consent.
The Government want to ensure that consent is respected to uphold transparency and maintain high data protection standards. If a person gives consent for their data to be processed for a specific purpose, that purpose should be changed without their consent only in limited situations, such as for certain public interest purposes, if it would be unreasonable to seek fresh consent. That acts as a safeguard to ensure that organisations address the possibility of seeking fresh consent before relying on any exemptions.
The restrictions around consent relate to personal data collected under paragraph 1(a) of article 6 of the UK GDPR, which came into force in May 2018. Therefore, they do not apply to personal data processed on the basis of consent prior to May 2018, when different requirements applied. By simplifying the rules on further processing, the clause will give controllers legal certainty on when they can reuse personal data and give individuals greater transparency. I support the clause standing part of the Bill.
Let me turn to amendment 69, which proposes to remove the power set out in the clause to amend the annex in schedule 2. As I have already said, schedule 2 will insert a new annex in the UK GDPR, which sets out certain specific public interest circumstances where personal data reuse is permitted. The list is strictly limited and exhaustive, so a power is needed to ensure that it is kept up to date with any future developments in how personal data is reused for important public interest purposes. That builds on an existing power in schedule 2 to the Data Protection Act 2018, where there is already the ability to make exceptions to the purpose limitation principle via secondary legislation.
The power in the clause also provides the possibility of narrowing a listed objective if there is evidence of any of the routes not being used appropriately. That includes limiting it by reference to the lawful ground of the original processing—for example, prohibiting the reuse of data that was collected on the basis of an individual’s consent.
I would like to reassure the hon. Lady that this power will be used only when necessary and in the public interest. That is why the clause contains a restriction on its use; it may be used only to safeguard an objective listed in article 23 of the UK GDPR. Clause 44 of the Bill also requires that the Secretary of State must consult the commissioner, and any other persons as the Secretary of State considers appropriate, before making any regulations.
On that basis, I hope the hon. Lady will accept that the amendment is unnecessary.
The purpose behind our amendment—this speaks to a number of our amendments—is that we disagree with the amount of power being given to the Secretary of State. For that reason, I would like to continue with my amendment.
Question put, That the amendment be made.
As we have already discussed with clause 6, schedule 2 inserts a new annex into the UK GDPR. It sets out certain specific public interest circumstances in which personal data reuse is permitted regardless of the purpose for which the data was originally collected—for example, when the disclosure of personal data is necessary to safeguard vulnerable individuals. Taken together, clause 6 and schedule 2 will give controllers legal certainty on when they can reuse personal data and give individuals greater transparency.
Amendment 70 concerns taxation purposes, which are included in the list in schedule 2. I reassure the hon. Member for Barnsley East that the exemption for taxation is not new: it has been moved from schedule 2 to the Data Protection Act 2018. Indeed, the specific language in question goes back as far as 1998. We are not aware of any problems caused by that language.
The inclusion in the schedule of
“levied by a public authority”
would likely cause problems, since taxes and duties can be imposed only by law. Some must be assessed or charged by public authorities, but many become payable as a result of a person’s transactions or circumstances, without any intervention needed except to enforce collection if unpaid. They are not technically levied by a public authority. That would therefore lead to uncertainty and confusion about whether processing for certain important taxation purposes would be permitted under the provision.
I hope to reassure the hon. Lady by emphasising that taxation is not included in the annex 1 list of legitimate interests. That means that anyone seeking to use the legitimate interest lawful ground for that purpose would need to carry out a balancing-of-interests test, unless they were responding to a request for information from a public authority or other body with public tasks set out in law. For those reasons, I am afraid I am unable to accept the amendment, and I hope the hon. Lady will withdraw it.
Amendment 71 relates to the first paragraph in new annex 2 to the UK GDPR, as inserted by schedule 2. The purpose of that provision is to clarify that non-public bodies can disclose personal data to other bodies in certain situations to help those bodies to deliver public interest tasks in circumstances in which personal data might have been collected for a different purpose. For example, it might be necessary for a commercial organisation to disclose personal data to a regulator in connection with an inquiry so that that body can carry out its public functions. The provision is tightly formulated and will permit disclosure from one body to another only if the requesting organisation states that it has a public interest task, that it has an appropriate legal basis for processing the data set out in law, and that the use of the data is necessary to safeguard important public policy or other objectives listed in article 23.
I recognise that the amendment is aimed at ensuring that the requesting organisation has a genuine basis for asking for the data, but suggest that changing one verb in the clause from “state” to “confirm” will not make a significant difference. The key point is that non-public bodies will not be expected to hand over personal data on entirely spurious grounds, because of the safeguards that I described. On that basis, I hope the hon. Lady will withdraw her amendment.
I am reassured by what the Minister said about amendment 70 and am happy not to move it, but I am afraid he has not addressed all my concerns in respect of amendment 71, so I will press it to a vote.
Question put, That the amendment be made.
I will speak first to clause 7 and amendment 72. Currently, everyone has the right to ask an organisation whether or not it is using or storing their personal data and to ask for copies of that data. That is called the right of access, and exercising that right is known as making a subject access request. Stakeholders from across the spectrum, including tech companies and civil society organisations, all recognise the value of SARs in helping individuals to understand how and why their data is being used and enabling them to hold controllers to account in processing their data lawfully.
The right of access is key to transparency and often underpins people’s ability to exercise their other rights as data subjects. After all, how is someone to know that their data is being used in an unlawful way, or in a way they would object to, if they are not able to ascertain whether their personal data is being held or processed by any particular organisation? For example, as the TUC highlighted in oral evidence to the Committee, the right of data subjects to make an information access request is a particularly important process for workers and their representatives, as it enables workers to gain access to personal data on them that is held by their employer and aids transparency over how algorithmic management systems operate.
It has pleased many across the board to see the Government roll back on their suggestion of introducing a nominal fee for subject access requests. However, the Bill introduces a new threshold for when controllers are able to charge a reasonable fee, or refuse a subject access request, moving from “manifestly unfounded or excessive” to “vexatious or excessive”. When deciding whether a request is vexatious or excessive, the Bill requires the controller to have regard to the circumstances of the subject access request. That includes, but is not limited to, the nature of the request; the relationship between subject and controller; the resources available to the controller; the extent to which the request repeats a previous request made by the subject; how long ago any previous request was made; and whether the request overlaps with other requests made by the data subject to the controller.
Stakeholders such as the TUC, the Public Law Project and Which? have expressed concerns that, as currently drafted, the terms that make up the new threshold are too subjective and could be open to abuse by controllers who may define any request they do not want to answer as vexatious or excessive. Currently, all there is in the Bill to guide controllers on how to apply the threshold is a non-exhaustive list of considerations; as I raised on Second Reading, if that list is non-exhaustive, what explicit protections will be in place to stop the application of terms such as “vexatious” and “excessive” being stretched and manipulated by controllers who simply do not want to fulfil the requests they do not like?
There are concerns that without further guidance even the considerations listed could be interpreted selfishly by controllers who lack a desire to complete a request. For example, given that many subject access requests come from applicants who are suspicious of how their data is being used, or have cause to believe their data is being misused, there is a high likelihood that the relationship any given applicant has with the controller has previously involved some level of friction and, perhaps, anger. The Bill prompts controllers to consider their relationship with a data subject when determining whether their request is vexatious; what is to stop a controller simply marking any data subject who has shared suspicions as “angry and vexatious”, thereby giving them grounds to refuse a genuine request?
Without clarity on how both the new threshold and the considerations apply, the ability of data subjects to raise a legal complaint about why their request was categorised as vexatious or excessive will be severely impeded. As AWO pointed out in oral evidence, that kind of legal dispute over a subject access request may be only the first stage of court proceedings for an individual, with a further legal case on the contents of the subject access request potentially coming afterwards. There simply should not be such a long timescale and set of legal proceedings for a person to exercise their fundamental data rights. Even the Information Commissioner himself, despite saying that he was clear on how the phrases “vexatious” and “excessive” should be applied, acknowledged to the Committee that it was right to point out that such phrases are open to numerous interpretations.
The ICO is in a great position to provide clear statutory guidance on the application of the terms, with specific examples of when they do and do not apply, so that only truly bad-natured requests that are designed to exploit the system can be rejected or charged for. Such guidance would provide clarity on the ways in which a request might be considered troublesome but neither vexatious nor excessive. That way, controllers can be sure that they have dismissed, or charged for, only requests that genuinely pass the threshold, and data subjects can be assured that they will still be able to freely access information on how their data is being used, should they genuinely need or want it.
On amendment 73, one consideration that the Bill suggests controllers rely on when deciding whether a request is vexatious or excessive is the “resources available” to them. I assume that consideration is designed to operate in relation to the “excessive” threshold and the ability to charge. For example, when a subject access request would require work far beyond the means of the controller in question, the controller would be able to charge for providing the information needed, to ensure that they do not experience a genuine crisis of resources as a result of the request. However, the Bill does not explicitly express that, meaning the consideration in its vague form could be applied in circumstances beyond that design.
Indeed, if a controller neglected to appoint an appropriate number of staff to the responsibility of responding to subject access requests, despite having the finances and resources to do so, they could manipulate the consideration to say that any request they did not like was excessive, as a result of the limited resources available to respond. As is the case across many parts of the Bill, we cannot have legislation that simply assumes that people will act in good faith; we must instead have legislation that explicitly protects against bad-faith interpretations. The amendment would ensure just that by clarifying that a controller cannot claim that a request is excessive simply because they have neglected to arrange their resources in such a way that makes responding to the request possible.
On amendment 74, as is the case with the definition of personal data in clause 1, where the onus is placed on controllers to decide whether a living individual could reasonably be identified in any dataset, clause 7 again places the power—this time to decide whether a request is vexatious or excessive—in the hands of the controller.
As the ICO notes, transparency around the use of data is fundamentally linked to fairness, and is about being
“clear, open and honest with people from the start about who you are, and how and why you use their personal data”.
If a controller decides, then, that due to a request being vexatious or excessive they cannot provide transparency on how they are processing an individual’s data at that time, the very least they could do, in the interests of upholding fairness, is to provide transparency on their justification for classifying a request in that way. The amendment would allow for just that, by requiring controllers to issue a notice to the data subject explaining the grounds on which their request has been deemed vexatious or excessive and informing them of their rights to make a complaint or seek legal redress.
In oral evidence, the Public Law Project described the Bill’s lack of a requirement for controllers to notify subjects as to why their request has been rejected as a decision that creates an “information asymmetry”. That is particularly concerning given that it is often exactly that kind of information that is needed to access the other rights and safeguards outlined in the Bill and across the GDPR. A commitment to transparency, as the amendment would ensure, would not only give data subjects clarity on why their request had been rejected or required payment, but provide accountability for controllers who rely on the clause, and thereby a deterrent against misusing it to reject any requests that they dislike. For controllers, the workload of issuing such notices should surely be less than that of processing a request that is genuinely vexatious or excessive, so the requirement would not cancel out the benefits brought to controllers through the clause.
Let me start by recognising the importance of subject access requests. I am aware that some have interpreted the change in the wording of the grounds for refusal as a weakening. We do not believe that is the case.
On amendment 72, in our view the new “vexatious or excessive” language in the Bill gives greater clarity than there has previously been. The Government have set out parameters and examples in the Bill that outline how the term “vexatious” should be interpreted within a personal data protection context, to ensure that controllers understand.
Does my right hon. Friend agree that the provisions will be helpful and important for organisations that gather data about public figures, and particularly oligarchs, who are very adept at using subject access requests to bombard and overwhelm a journalist or a small investigatory team that is doing important work looking into their business activities?
I completely agree with my hon. Friend. That is an issue that both he and I regard as very serious, and is perhaps another example of the kind of legal tactic that SLAPPs—strategic lawsuits against public participation—represent, whereby oligarchs can frustrate genuine journalism or investigation. He is absolutely right to emphasise that.
It is important to highlight that controllers can already consider resource when refusing or charging a reasonable fee for a request. The Government do not wish to change that situation. Current ICO guidance sets out that controllers can consider resources as a factor when determining if a request is excessive.
The new parameters are not intended to be reasons for refusal. The Government expect that the new parameters will be considered individually as well as in relation to one another, and a controller should consider which parameters may be relevant when deciding how to respond to a request. For example, when the resource impact of responding would be minimal even if a large amount of information was requested—such as for a large organisation—that should be taken into account. Additionally, the current rights of appeal allow a data subject to contest a refusal and ultimately raise a complaint with the ICO. Those rights will not change with regard to individual rights requests.
Amendment 74 proposes adding more detail on the obligations of a controller who refuses or charges for a request from a data subject. The current legislation sets out that any request from a data subject, including subject access requests, is to be responded to. The Government are retaining that approach and controllers will be expected to demonstrate why the provision applies each time it is relied on. The current ICO guidance sets out those obligations on controllers and the Government do not plan to suggest a move away from that approach.
The clause also states that it is for the controller to show that a request is vexatious or excessive in circumstances where that might be in doubt. Thus, the Government believe that the existing legislation provides the necessary protections. Following the passage of the Bill, the Government will work with the ICO to update guidance on subject access requests, which we believe plays an important role and is the best way to achieve the intended effect of the amendments. For those reasons, I will not accept this group of amendments; I hope that the hon. Member for Barnsley East will be willing to withdraw them.
I turn to clause 7 itself. As I said, the UK’s data protection framework sets out key data subject rights, including the right of access—the right for a person to obtain a copy of their personal data. A subject access request is used when an individual requests their personal data from an organisation. The Government absolutely recognise the importance of the right of access and do not want to restrict that right for reasonable requests.
The existing legislation enables organisations to refuse or charge a reasonable fee for a request when they deem it to be “manifestly unfounded or excessive”. Some organisations, however, struggle to rely on that provision in cases where it may be appropriate to do so, which in turn affects their ability to respond to reasonable requests.
The clause changes the legislation to allow controllers to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. The clause adds parameters for controllers to consider when relying on the “vexatious or excessive” exemption, such as the nature of the request and the relationship between the data subject and the controller. The clause also includes examples of the types of request that may be vexatious, such as those intended to cause distress, those not made in good faith or those that are an abuse of process.
We believe that the changes will give organisations much-needed clarity over when they can refuse or charge a reasonable fee for a request. That will ensure that controllers can focus on responding to reasonable requests, as well as other important data and organisational needs. I commend the clause to the Committee.
I appreciate that, as the Minister said, the Government do not intend the new terms to be grounds for refusal, but his remarks do not reassure me that that will not be the case. Furthermore, as I said on moving the amendment, stakeholders such as the TUC, the Public Law Project and Which? have all expressed concern that, as drafted, those terms are too subjective. I will press the amendment to a vote.
Question put, That the amendment be made.
Clause 8 makes changes to the time requirements to which an organisation must adhere when responding to a subject access request. Currently, organisations must respond to a subject access request within a set period; in the majority of cases, that is one month from receipt of the request. This clause enables organisations to “stop the clock” on the response time when an organisation is unable to respond without further information or clarification from an individual. For example, when the controller has information on multiple data subjects with the same name, they may require further information to help to differentiate the data subject’s information from others’. Organisations must have a legitimate reason to pause the response time; once confirmation is received from the data subject, the original time obligations resume.
The clause will also enable organisations to extend the period permitted for law enforcement and the intelligence services to respond to complex requests by two further months in certain circumstances. This replicates the existing provisions applicable to processing requests under the UK GDPR. Currently, all subject access requests received under the law enforcement and intelligence services regimes must be actioned within one month, irrespective of the complexity or number of requests received from an individual. Consequently, complex or confusing requests can disproportionately burden public bodies operating under those regimes, creating resource pressures.
Clause 8 will rectify the disparity currently existing between processing regimes and put law enforcement and intelligence services organisations on an equal footing to UK GDPR organisations. That will also provide a consistent framework for organisations operating under more than one regime at the same time. The clause also brings clarity on how best to respond to a confusing or complex request, ensuring that organisations do not lose time while seeking this clarification and can instead focus on responding to a request. On that basis, I urge that clause 8 stand part of the Bill.
I expressed my thoughts on the value and importance of subject access requests when we debated clause 7, and most of the same views remain pertinent here. Clause 8 allows the time for responding to a subject access request to be extended where the request is complex or high in volume. Some civil society groups, including Reset.tech, have expressed concern that that could mean that requests are unduly delayed for months, reflecting the concern, discussed when we debated clause 7, that they could be disregarded altogether. With that in mind, can the Minister tell us what protections will be in place to ensure that data controllers do not abuse the new ability to extend the response time—particularly by citing a large amount of data as an excuse—in order to delay requests that they simply do not wish to respond to?
The clause provides some clarity on clause 7 by demonstrating that just because a request is lengthy or comes in combination with many others, it is not necessarily excessive, as the clause gives controllers the option to extend the timeframe for dealing with requests that are high in volume. Of course, we do not want to delay requests unnecessarily, but allowing controllers to manage their load within a reasonable extended timeframe can act as a safeguard against their automatically relying on the “excessive” threshold. With that in mind, I am happy for the clause to stand part, although my comments on clause 7 still apply.
May I briefly respond to the hon. Lady’s comments? I assure her that controllers will not be able to stop the clock for all subject access requests—only for those where they reasonably require further information to be able to proceed with responding. Once that information has been received from a data subject, the clock resumes and the controller must proceed with responding to the request within the applicable time period, which is usually one month from when the controller receives the requested information. A data subject who has provided the requested information would also be able to complain to a controller, and ultimately to the Information Commissioner’s Office, if they feel that their request has not been processed within the appropriate time. I hope the hon. Lady will be assured that there are safeguards to ensure that this power is not abused.
Question put and agreed to.
Clause 8 accordingly ordered to stand part of the Bill.
Clause 9
Information to be provided to data subjects
Question proposed, That the clause stand part of the Bill.
Clause 9 provides researchers, archivists and those processing personal data for statistical purposes with a new exemption from providing certain information to individuals when they are reusing datasets for a different purpose, which will help to ensure that important research can continue unimpeded. The new exemption will apply when the data was collected directly from the individual, and can be used only when providing the additional information would involve a disproportionate effort. There is already an exemption from this requirement where the personal data was collected from a different source.
The clause also adds a non-exhaustive list of examples of factors that may constitute a disproportionate effort. This list is added to both the new exemption in article 13 and the existing exemption found in article 14. Articles 13 and 14 of the UK GDPR set out the information that must be provided to data subjects at the point of data collection: article 13 covers circumstances where data is directly collected from data subjects, and article 14 covers circumstances where personal data is collected indirectly—for example, via another organisation. The information that controllers must provide to individuals includes details such as the identity and contact details of the controller, the purposes of the processing and the lawful basis for processing the data.
Given the long-term nature of research, it is not always possible to meaningfully recontact individuals. Therefore, applying a disproportionate effort exemption addresses the specific problem of researchers wishing to reuse data collected directly from an individual. The exemption will help ensure that important research can continue unimpeded. The clause also makes some minor changes to article 14. Those do not amend the scope of the exemption or affect its operation, but make it easier to understand.
I now turn to clause 10, which introduces into the law enforcement regime an exemption for data covered by legal professional privilege, mirroring the existing exemptions under the UK GDPR and the intelligence services regime. As a fundamental principle of our legal system, legal professional privilege protects confidential communications between professional legal advisers and their clients. The existing exemption in the UK GDPR restricts an individual’s right to access personal data covered by that privilege that is being processed or held by an organisation, and to receive certain information about that processing.
However, in the absence of an explicit exemption, organisations processing data under the law enforcement regime—for a law enforcement purpose rather than under the UK GDPR—must rely on ad hoc restrictions in the Data Protection Act, which require them to evaluate and justify their use on a case-by-case basis, even where legal professional privilege clearly applies. The new exemption will make it simpler for organisations that process data for a law enforcement purpose to exempt legally privileged information, avoiding the need to justify the use of alternative exemptions. It will also clarify when such information can be withheld from the individual.
Hon. Members might wonder why an exemption for legal professional privilege was not included under the law enforcement regime of the Data Protection Act in the first place. The reason is that we faithfully transposed the EU law enforcement directive, which did not contain such an exemption. Following our exit from the EU, we are taking this opportunity to align better the UK GDPR and the law enforcement regime, thereby simplifying the obligations for organisations and clarifying the rules for individuals.
The impact of clause 9 and the concerns around it should primarily be understood in relation to the definition contained in clause 2, so I refer hon. Members to my remarks in the debate on clause 2. I also refer them to my remarks on purpose limitation in clause 6. To reiterate both in combination, purpose limitation exists so that it is clear why personal data is being collected, and what the intention is behind its use. That means that, by and large, people’s data should not be reused for purposes other than those for which it was initially collected, unless a new legal basis is obtained.
It is understandable that, where genuine scientific, historical or statistical research is taking place, and providing the required information to data subjects would involve disproportionate effort, there may be a need for an exemption allowing data to be reused without informing the subject. However, that must be done only where strictly necessary. We must be clear that, unless there are proper boundaries to the definition of scientific research, this could be interpreted far too loosely.
I am concerned that, without amendment to clause 2, clause 9 could extend the problem of scientific research being used as a guise for using people’s personal data in malicious or pseudoscientific ways. Will the Minister tell us what protections will be in place to ensure that people’s data is not reused on scientific grounds for something that they would otherwise have objected to?
On clause 10, I will speak more broadly on law enforcement processing later in the Bill, but it is good to have clarity on the legal professional privilege exemptions. I have no further comments at this stage.
What we are basically doing is changing the rights of individuals, who would previously have known when their data was used for a purpose other than that for which it was collected. The terms
“scientific or historical research, the purposes of archiving in the public interest or statistical purposes”
are very vague, and, according to the Public Law Project, open to wide interpretation. Scientific research is defined as
“any research that can reasonably be described as scientific, whether publicly or privately funded”.
I ask the Minister: what protections are in place to ensure that private companies are not given, through this clause, a carte blanche to use personal data for the purpose of developing new products, without the need to inform the data subject?
These clauses relate to one of the fundamental purposes of the Bill, which is to facilitate genuine scientific research—obviously, that carries with it huge potential benefits in the areas of tackling disease or other scientific advances. We debated the definition of scientific research earlier in relation to clause 2. We believe that the definition is clear. In this particular case, the use of historical data can be very valuable. It is simply impractical for some organisations to reobtain consent when they may not even know where original data subjects are now located.
Order. I apologise to the Minister. He can resume his remarks at 2 o’clock, when we meet again in this room but, it being 11.25 am, the Committee is now adjourned.
Data Protection and Digital Information (No. 2) Bill (Fourth sitting) Debate
John Whittingdale
Main Page: John Whittingdale (Conservative - Maldon) (1 year, 5 months ago)
Public Bill Committees
When the Committee adjourned this morning, I was nearly at my conclusion; I was responding to points made by the hon. Member for Barnsley East and by the hon. Member for Glasgow North West, who has not yet rejoined us. I was saying that the exemption applies where the data originally collected is historic, where re-contacting individuals to obtain consent would require a disproportionate effort, and where that data could be of real value in scientific research. We think that there is a benefit to research and we are satisfied that the protection is there. There was some debate about the definition of scientific research, which we covered earlier; decisions on that point can be appealed to the Information Commissioner’s Office. On the basis of what I said earlier, and that assurance, I hope that the Committee will agree to the clause.
Question put and agreed to.
Clause 9 accordingly ordered to stand part of the Bill.
Clause 10 ordered to stand part of the Bill.
Clause 11
Automated decision-making
I beg to move amendment 78, in clause 11, page 18, line 13, after “subject” insert “or decision subject”.
This amendment, together with Amendments 79 to 101, would apply the rights given to data subjects by this clause to decision subjects (see NC12).
I am pleased to speak to new clause 12, which would insert a definition of decision subjects, and to amendments 79 to 101, 106 and 108 to 110, which seek to insert rights and considerations for decision subjects that mirror those of data subjects at various points throughout the Bill.
Most of our data protection legislation operates under the assumption that the only people affected by data-based and automated decision making are data subjects. The vast majority of protections available for citizens are therefore tied to being a data subject: an identifiable living person whose data has been used or processed. However, as Dr Jeni Tennison described repeatedly in evidence to the Committee, that assumption is unfortunately flawed. Although data subjects form the majority of those affected by data-based decision making, they are not the only group of people impacted. It is becoming increasingly common across healthcare, employment, education and digital platforms for algorithms created and trained on one set of people to be used to reach conclusions about another, wider set of people. That means that an algorithm can make an automated decision that affects an individual to a legal or similarly significant degree without having used their personal data specifically.
For example, as Connected by Data points out, an automated decision could be made about a neighbourhood area, such as a decision on gritting or a police patrol route, based on personal data about some of the people who live in that neighbourhood, with the outcome impacting even those residents and visitors whose data was not directly used. For those who are affected by the automated decision but are not data subjects, there is currently no protection, recognition or method of redress.
The new clause would therefore define the decision subjects who are impacted by the likes of AI without their data having been used, in the hope that we can give them protections throughout the Bill that are equal to those for data subjects, where appropriate. That is especially important because special category data is subject to stricter safeguards for data subjects but not for decision subjects.
Connected by Data illustrates that point using the following example. Imagine a profiling company that uses special category data about the mental health of some volunteers to construct a model that predicts mental health conditions based on social media feeds, which would not be special category data. From that information, the company could give an estimate of how much time people are likely to take off work. A recruitment agency could then use that model to assess candidates and reject those who are likely to have extended absences. The model would never use any special category data about the candidates directly, but those candidates would have been subject to an automated decision that made assumptions about their own special category data, based on their social media feeds. In that scenario, by virtue of being a decision subject, the individual would not have the right to the same safeguards as those who were data subjects.
Furthermore, there might be scenarios in which someone was subject to an automated decision despite having consciously prevented their personal data from being shared. Connected by Data illustrates that point by suggesting that we consider a person who has set their preferences on their web browser so that it does not retain tracking cookies or share information such as their location when they visit an online service. If the online service has collected data about the purchasing patterns of similarly anonymous users and knows that such a customer is willing to pay more for the service, it may automatically provide a personalised price on that basis. Again, no personal data about the purchaser will have been used in determining the price that they are offered, but they will still be subject to an automated decision based on the data of other people like them.
Those scenarios illustrate that what should be central to an individual’s rights is whether an automated decision affects them in a legal or similarly significant way, not whether any personal data is held about them. If the Bill is to unlock innovation around AI, automated decisions and the creative use of data, it is only fair that that is balanced by ensuring that all those affected by such uses are properly protected should they need to seek redress.
This group of amendments would help our legislative framework to address the impact of AI, rather than just its inputs. The various amendments to clause 11 would extend to decision subjects rights that mirror those given to data subjects regarding automated decision making, such as the right to be informed, the right to safeguards such as contesting a decision and the right to seek human intervention. Likewise, the amendments to clauses 27 and 29 would ensure that the ICO is obliged to have regard to decision subjects both generally and when producing codes of conduct.
Finally, to enact the safeguards to which decision subjects would hopefully be entitled via the amendments to clause 11, the amendment to clause 39 would allow decision subjects to make complaints to data controllers, mirroring the rights available to data subjects. Without defining decision subjects in law, that would not be possible, and members of the general public could be left without the rights that they deserve.
I am very much aware of the concern about automated decision making. The Government share the wish of the hon. Member for Barnsley East for all those who may be affected to be given protection. Where I think we differ is that we do not recognise the distinction that she tries to make between data subjects and decision subjects, which forms the basis of her amendments.
The hon. Lady’s amendments would introduce to the UK GDPR a definition of the term “decision subject”, which would refer to an identifiable individual subject to data-based and automated decision making, to be distinguished from the existing term “data subject”. The intended effect is to extend the requirements associated with provisions related to decisions taken about an individual using personal data to those about whom decisions are taken, even though personal information about them is not held or used to take a decision. It would hence apply to the safeguards available to individuals where significant decisions are taken about them solely through automated means, as amendments 78 to 101 call for, and to the duties of the Information Commissioner to have due regard to decision subjects in addition to data subjects, as part of the obligations imposed under amendment 106.
I suggest to the hon. Lady, however, that the existing reference to data subjects already covers decision subjects, which are, if you like, a sub-group of data subjects. That is because even if an individual’s personal data is not used to inform the decision taken about them, the fact that they are identifiable through the personal data that is held makes them data subjects. The term “data subject” is broad and already captures the decision subjects described in the hon. Lady’s amendment, as the identification of a decision subject would make them a data subject.
I will not, at this point, go on to set out the Government’s wider approach to the use of artificial intelligence, because that is somewhat outside the scope of the Bill and has already been set out in the White Paper, which is currently under consultation. Nevertheless, it is within that framework that we need to address all these issues.
I have been closely following the speeches of the Minister and the hon. Member for Barnsley East. The closest example that I can think of for this scenario is the use of advertising tools such as lookalike audiences on Facebook and customer match on YouTube, where a company holding data about users looks to identify other customers who are the closest possible match. It does not hold any personal data about those people, but the platform forms the intermediary to connect them. Is the Minister saying that in that situation, as far as the Bill is concerned, someone contacted through a lookalike audience has the same rights as someone who is contacted directly by an advertiser that holds their data?
Essentially, if anybody is affected by automated decision making on the basis of the characteristics of another person whose data is held—in other words, if the same data is used to take a decision that affects them, even if it does not personally apply to them—they are indeed within the broader definition of a data subject. With that reassurance, I hope that the hon. Member for Barnsley East will consider withdrawing her amendment.
I appreciate the Minister’s comments, but the point is that the data could still be used—I gave the example of a group of residents who were not identifiable from the data but were still affected by decisions based on it—so I am not sure that I agree with the Minister’s analysis. As the use of automated decision making evolves and expands, it is crucial that people are afforded protections and rights if they are subject to the outcome, even if their data is not being used directly. I would like to press my amendment to a vote.
Question put, That the amendment be made.
I rise to speak to my amendment 120. The explanatory notes to the Bill clarify that newly permitted automated decisions will not require the existing legal safeguard of notification, stating only:
“Where appropriate, this may include notifying data subjects after such a decision has been taken”.
Clause 11 would replace article 22 of the UK GDPR, which regulates automated decision making, with new articles 22A to 22D. According to Connected by Data, it is built on the faulty assumption that the only people affected by automated decision making are data subjects—identifiable individuals within the data used to make the automated decision. However, AI decisions can now be based on information about other people: it is becoming increasingly common for algorithms created through training on one set of people to be used to reach conclusions about another set.
A decision can be based on seemingly innocuous information such as someone’s postcode or whether they liked a particular tweet. Where such a decision has an impact on viewing recommendations for an online player, we would probably not be that concerned, but personal data is being used more and more to make decisions that affect whole groups of people rather than identified individuals. We need no reminding of the controversy that ensued when Ofqual used past exam results to grade students during the pandemic.
Another example might be an electricity company getting data from its customers about home energy consumption. Based on that data, it could automatically adjust the time of day at which it offered cheaper tariffs. Everyone who used the electricity company would be affected, whether data about their energy consumption patterns were used to make the decision or not. It is whether an automated decision has a legal or similarly significant effect on an individual that should be relevant to their rights around automated decision making.
Many of the rights and interests of decision subjects are protected through the Equality Act 2010, as the Committee heard in oral evidence last week. What is not covered by other legislation, however, is how data can be used in automated decisions, or the rights of decision subjects to be informed about, to control and to seek redress in relation to automated decisions that have a significant effect on them. According to Big Brother Watch:
“This is an unacceptable dilution of a critical safeguard that will not only create uncertainty for organisations seeking to comply, but could lead to vastly expanded ADM operating with unprecedented opacity.”
Amendment 120 would require a data controller to inform a data subject whenever a significant decision about that subject was based solely on automated processing. I am pleased that the hon. Member for Barnsley East has tabled a similar amendment, which I support.
The Government absolutely share hon. Members’ view of the importance of transparency. We agree that individuals who are subject to automated decision making should be made aware of it and should have information about the available safeguards. However, we feel that those requirements are already built into the Bill via article 22C, which will ensure that individuals are provided with information as soon as is practicable after such decisions have been taken. This will need to include relevant information that an individual would require to contest such decisions and seek human review of them.
The reforms that we propose take an outcome-focused approach to ensure that data subjects receive the right information at the right time. The Information Commissioner’s Office will play an important role in elaborating guidance on what that will entail in different circumstances.
If I understood the Minister correctly, he said that decision subjects are a subset of data subjects. Can he envisage any circumstances in which a decision subject is not included within the group “data subjects”?
It is certainly our view that anybody who is affected by an automated decision made on the basis of data held about individuals themselves becomes a data subject, so I think the answer to the hon. Lady’s question is no. As I said, the Information Commissioner’s Office will provide guidance in this area. If such a situation does arise, obviously it will need to be considered. The hon. Members for Barnsley East and for Glasgow North West asked about making information available to all those affected, and about safeguards, which we think are contained within the requirements under article 22C.
Further to the point that was made earlier, let us say that a Facebook user was targeted with an advert that was based on their protected characteristics data—data relevant to their sexual orientation, for example—but that user said that they had never shared that information with the platform. Would they have the right to make a complaint, either to the advertiser or to the platform, for inferring that data about them and making it available to a commercial organisation without their informed consent?
They would obviously have that right, and indeed they would ultimately have the right to appeal to the Information Commissioner if they felt that they had been subjected unfairly to a decision where they had not been properly informed of the fact. On the basis of what I have said, I hope the hon. Member for Barnsley East might withdraw her amendment.
I appreciate the Minister’s comment, but the protection the Government are offering does not go as far as we would like. Our amendment speaks to the potential imbalance of power in the use of data, and it would not require any extra administrative effort on the part of controllers. For that reason, I will press it to a vote.
Question put, That the amendment be made.
The hon. Lady began her remarks on the broader question of the ambition to ensure that the UK benefits to the maximum extent from the use of artificial intelligence. We absolutely share that ambition, but also agree that it needs to be regulated. That is why we have published the AI regulation White Paper, which suggests that it is most appropriate that each individual regulator should develop its own rules on how that should apply. In the case that she quoted of those who had lost their jobs, perhaps through an automated process, I think the appropriate regulator—in that case, presumably, the employment tribunal—would need to develop its own mechanism for adjudicating decisions.
I will concentrate on the amendment. On amendment 76, we feel that clause 44 already provides for an overarching requirement on the Secretary of State to consult the Information Commissioner and other persons that she or he considers appropriate before making regulations under UK GDPR, including the measures in article 22. When the new clause 44 powers are used in reference to article 22 provisions, they will be subject to the affirmative procedure in Parliament. I know that the hon. Lady is not wholly persuaded of the merits of using the affirmative procedure, but it does mean that parliamentary approval will be required. Given the level of that scrutiny, we do not think it is necessary for the Secretary of State to have to publish an assessment, as the hon. Lady would require through her amendment.
On amendment 75, as we have already debated in relation to previous amendments, there are situations where non-statutory guidance, which can be produced without being requested under regulations made by the Secretary of State, may be more appropriate than a statutory code of practice. We believe that examples of the kinds of processing that do and do not fall within the definitions of the terms “meaningful human involvement” and “similarly significant” are best placed in non-statutory guidance produced by the ICO, as this will give the flexibility to amend and change the examples where necessary. What constitutes a significant decision or meaningful human involvement is often highly context-specific, and the current wording allows for some interpretability to enable the appropriate application of this provision in different contexts, rather than introducing an absolute definition that risks excluding decisions that ought to fall within this provision and vice versa. For that reason, we are not minded to accept the amendments.
I appreciate the Minister’s remarks about consultation and consulting relevant experts. He is right to observe that I am not a big fan of the affirmative procedure as a method of parliamentary scrutiny, but I appreciate that it is included in the Bill for that purpose.
I think the problem is that we fundamentally disagree on the power to change these definitions being concentrated in the hands of the Secretary of State. It is one thing to future-proof the Bill but another to allow the Secretary of State alone to amend things as fundamental as the safeguards offered here. I would therefore like to proceed to a vote.
Question put, That the amendment be made.
I rise to speak briefly in support of the amendment tabled by my hon. Friend the Member for Barnsley East and to emphasise the points that she made regarding the importance of putting forward a vision for the protection of workers as the nature of working environments changes. That is part of what the amendment’s “digital information principles at work” seek to do. I declare an interest: I worked for Ofcom as head of technology before coming to this House. That work highlighted to me the importance of forward-looking regulation. As my hon. Friend set out, artificial intelligence is not some far-off prospect; it is here with us now, and in the workplace.
Many technological changes have made work more accessible to more people: covid showed us that we could work from many different locations—indeed, Parliament successfully worked from many locations across the country. Technological changes have also made work more productive, and companies and public sector organisations are taking advantage of that increase in productivity. But some technologies have accelerated bad employment practices, driven down standards and damaged the wellbeing of workers—for example, workplace surveillance technologies such as GPS tracking, webcam monitoring and click monitoring, which encroach on workers’ privacy and autonomy. My constituents often say that they feel that technology is something that is done to them, rather than something that has their consent and empowers them.
It is important, as I am sure the Minister will agree, that working people welcome and embrace the opportunities that technology can bring, both for them and for the companies and organisations they work for, but that cannot happen without trust in those technologies. For that, there need to be appropriate regulation and safeguards. Surely the Minister must therefore agree that it is time to bring forward a suite of appropriate principles that follows the amendment’s principle of
“a fair, inclusive and trustworthy digital environment at work.”
I hope that he cannot disagree with any of that.
If we are to get ourselves out of the economic stagnation and lack of growth of the last 10 or 13 years, we need to build on new technologies and productivity, but we cannot do that without the support and trust of people in the workforce. People must feel that their rights—new rights that reflect the new environment in the workplace—are safeguarded. I hope that the Minister will agree that the principles set out in the amendment are essential to building that trust, and to ensuring a working environment in which workers feel protected and able to benefit from advances in technology.
I am grateful to the hon. Members for Barnsley East and for Newcastle upon Tyne Central for setting out the thinking behind the amendment. We share the view, as the hon. Member for Newcastle upon Tyne Central has just said, that those who are subject to artificial intelligence and automated decision making need to have trust in the process, and there need to be principles underlying the way in which those decisions are taken. In each case, however, those contributions go beyond the provisions of the Bill. On data protection specifically, the changes proposed in clause 11 will, as I have said, reinforce and further clarify the important safeguards for automated decision making, which may be used in some workplace technologies. Those safeguards ensure that individuals are made aware of, and can seek human intervention on, significant decisions that are taken about them through solely automated means. The reforms to article 22 will make clear employer obligations and employee rights in such scenarios, as we debated on the earlier amendments.
On the wider question, we absolutely recognise that the kind of deployment of technology in the workplace shown in the examples that have already been given needs to be considered across a wide range of different regulatory frameworks in terms of not just data protection law, but human rights law, legal frameworks regarding health and safety and, of course, employment law.
I thank the Minister for his comments. I note that he castigates us, albeit gently, for tabling an amendment to this data protection Bill, while he argues that there is a need for wider legislation to enshrine the rights he apparently agrees with. When and where will that legislation come forward? Does he recognise that we waited a long time and listened to similar arguments about addressing online harms, but have ended up in a situation where—in 2023—we still do not have legislation on online harms? My question is: if not now, when?
As I was Chair of the Culture, Media and Sport Committee in 2008 when we published a report calling for legislation on online safety, I recognise the hon. Lady’s point that these things take a long time—indeed, far too long—to come about. She calls for action now on governance and regulation of the use of artificial intelligence. She will know that last month the Government published the AI regulation White Paper, which set out the proposals for a proportionate outcomes-focused approach with a set of principles that she would recognise and welcome. They include fairness, transparency and explainability, and we feel that this has the potential to address the risks of possible bias and discrimination that concern us all. As she knows, the White Paper is currently out to consultation, and I hope that she and others will take advantage of that to respond. They will have until 21 June to do so.
I assure the hon. Lady and the hon. Member for Barnsley East that the Government are keenly aware of the need to move swiftly, but we want to do so in consultation with all those affected. The Bill looks at one relatively narrow aspect of the use of AI, but certainly the Government’s general approach is one that we are developing at pace, and we will obviously respond once the consultation has been completed.
The power imbalance between employer and worker has no doubt grown wider as technology has developed. Our amendment speaks to the real-life consequences of that, and to what happens when employment and data law lags behind technology. For the reasons that have been outlined by my hon. Friend the Member for Newcastle upon Tyne Central and myself, I would like to continue with my amendment.
Question put, That the amendment be made.
We have, I think, covered a lot of ground already in the debates on the amendments. To recap, clause 11 reforms the rules relating to automated decision making in article 22 of the UK GDPR and the relevant sections of the Data Protection Act 2018. It expands the lawful grounds on which solely automated decision making that produces a legal or similarly significant effect on an individual may be carried out.
Currently, article 22 of the UK GDPR restricts such activity to a narrow set of circumstances. By expanding the available lawful grounds and ensuring we are clear about the required safeguards, these reforms will boost confidence that the responsible use of this technology is lawful, and will reduce barriers to responsible data use.
The clause makes it clear that solely automated decisions are those taken without any meaningful human involvement. It ensures that there are appropriate constraints on the use of sensitive personal data for solely automated decisions, and that such activities are carried out in a fair and transparent manner, providing individuals with key safeguards.
The clause provides three powers to the Secretary of State. The first enables the Secretary of State to describe cases where there is or is not meaningful human involvement in the taking of a decision. The second enables the Secretary of State to further describe what is and is not to be taken as having a significant effect on an individual. The third enables the introduction of further safeguards, and allows those already set out in the reforms to be amended but not removed.
The reformed section 50 of the Data Protection Act mirrors the changes in subsection (1) for solely automated decision making by law enforcement agencies for a law enforcement purpose, with a few differences. First, in contrast to article 22, the rules on automated decision making apply only where such decisions have an adverse legal or similarly significant effect on the individual. Secondly, the processing of sensitive personal data cannot be carried out for the purposes of entering into a contract with the data subject for law enforcement purposes.
The final difference relates to the safeguards for processing. This clause replicates the UK GDPR safeguards for law enforcement processing but also allows a controller to apply an exemption to them where it is necessary for a particular reason, such as to avoid obstructing an inquiry. This exemption is available only where the decision taken by automated means is reconsidered by a human as soon as reasonably practicable.
The subsections amending relevant sections of the Data Protection Act 2018, which apply to processing by or on behalf of the intelligence services, clarify that requirements apply to decisions that are entirely automated, rather than solely automated. They also define what constitutes a decision based on this processing. I have explained the provisions of the clause, and hope the Committee will feel able to accept it.
I talked at length about my views about the changes to automated decision making when we debated amendments 77, 120, 76, 75, 121 and 122. I have nothing further to add at this stage, but those concerns still stand. As such, I cannot support this clause.
Question put, That the clause stand part of the Bill.
I beg to move amendment 17, in schedule 3, page 140, line 9, leave out sub-paragraph (3) and insert—
“(3) In paragraph 2—
(a) for “under Articles 15 to 22”, in the first place, substitute “arising under or by virtue of Articles 15 to 22D”, and
(b) for “his or her rights under Articles 15 to 22” substitute “those rights”.”.
This amendment adjusts consequential amendments of Article 12(2) of the UK GDPR for consistency with other amendments of the UK GDPR consequential on the insertion of new Articles 22A to 22D.
With this it will be convenient to discuss the following:
Government amendments 18 to 23.
That schedule 3 be the Third schedule to the Bill.
I can be reasonably brief on these amendments. Schedule 3 sets out the consequential changes needed to reflect references to the rules on automated decision making in reformed article 22 and section 50 and other provisions in the UK GDPR and the Data Protection Act 2018. Schedule 3 also sets out that section 14 of the Data Protection Act is repealed. Instead, reformed article 22 sets out the safeguards that must apply, regardless of the lawful ground on which such activity is carried out.
Government amendments 17 to 23 are minor technical amendments ensuring that references elsewhere in the UK GDPR and the Data Protection Act to the provisions on automated decision making are comprehensively updated to reflect the reforms related to such activity in this Bill. That means that references to article 22 UK GDPR are updated to the reformed article 22A to 22D provisions, and references to sections 49 and 50 in the Data Protection Act are updated to the appropriate new sections 50A to 50D.
I thank the Minister for outlining these technical changes. I have nothing further to add on these consequential amendments beyond what has already been discussed on clause 11 and the rules around automated decision making. Consistency across the statute book is important, but all the concerns I raised when discussing the substance of those changes remain.
Amendment 17 agreed to.
Amendments made: 18, in schedule 3, page 140, line 30, before second “in” insert “provided for”.
This amendment and Amendment 19 adjust consequential amendments of Article 23(1) of the UK GDPR for consistency with other amendments of the UK GDPR consequential on the insertion of new Articles 22A to 22D.
Amendment 19, in schedule 3, page 140, line 31, leave out “in or under” and insert
“arising under or by virtue of”.
See the explanatory statement for Amendment 18.
Amendment 20, in schedule 3, page 140, line 33, leave out from “protection” to end of line 35 and insert
“in accordance with, and with regulations made under, Articles 22A to 22D in connection with decisions based solely on automated processing (including decisions reached by means of profiling)”.
This amendment adjusts the consequential amendment of Article 47(2)(e) of the UK GDPR to reflect the way in which profiling is required to be taken into account for the purposes of provisions about automated decision-making (see Article 22A(2) inserted by clause 11).
Amendment 21, in schedule 3, page 140, line 36, leave out paragraph 10 and insert—
“10 In Article 83(5) (general conditions for imposing administrative fines)—
(a) in point (b), for “22” substitute “21”, and
(b) after that point insert—
“(ba) Article 22B or 22C (restrictions on, and safeguards for, automated decision-making);””.
This amendment adjusts the consequential amendment of Art 83(5) of the UK GDPR (maximum amount of penalty) for consistency with the consequential amendment of equivalent provision in section 157(2) of the Data Protection Act 2018.
Amendment 22, in schedule 3, page 141, line 8, leave out sub-paragraph (2) and insert—
“(2) In subsection (3), for “by the data subject under section 45, 46, 47 or 50” substitute “made by the data subject under or by virtue of any of sections 45, 46, 47, 50C or 50D”.”.
This amendment adjusts the consequential amendment of section 52(3) of the Data Protection Act 2018 for consistency with other amendments of that Act consequential on the insertion of new sections 50A to 50D.
Amendment 23, in schedule 3, page 141, line 9, leave out sub-paragraph (3) and insert—
“(3) In subsection (6), for “under sections 45 to 50” substitute “arising under or by virtue of sections 45 to 50D””.—(Sir John Whittingdale.)
This amendment adjusts the consequential amendment of section 52(6) of the Data Protection Act 2018 for consistency with other amendments of that Act consequential on the insertion of new sections 50A to 50D.
Schedule 3, as amended, agreed to.
Clause 12
General obligations
Question proposed, That the clause stand part of the Bill.
One of the main criticisms that the Government have received of the current legislative framework is that it sets out a number of prescriptive requirements that organisations must satisfy to demonstrate compliance. They include appointing independent data protection officers, keeping records of processing, appointing UK representatives, carrying out impact assessments and consulting the ICO about intended processing activities in specified circumstances.
Those rules can sometimes generate a significant and disproportionate administrative burden, particularly for small and medium-sized enterprises and for some third sector organisations. The current framework provides some limited exemptions for small businesses and organisations that are carrying out low-risk processing activities, but they are not always as clear or as useful as they should be.
We are therefore taking the opportunity to improve chapter 4 of the UK GDPR, and the equivalent provisions in part 3 of the Data Protection Act, in respect of law enforcement processing. Those provisions deal with the policies and procedures that organisations and law enforcement organisations must put in place to monitor and ensure compliance. Clauses 12 to 20 will give organisations greater flexibility to implement data protection management programmes that work for their organisations, while maintaining high standards of data protection for individuals.
Clause 12 is technical in nature. It will improve the terminology in the relevant articles of the UK GDPR by replacing the requirement to implement
“appropriate technical and organisational measures”.
In its place, data protection risks must be managed with
“appropriate measures, including technical and organisational measures,”.
That will give organisations greater flexibility to implement any measures that they consider appropriate to help them manage risks. A similar clarification is made to equivalent parts of the Data Protection Act.
Clause 13 will remove article 27 of the UK GDPR, ending the requirement for overseas controllers or processors to appoint a representative in the UK where they offer goods or services to, or monitor the behaviour of, UK citizens—
Order. I am sorry, Minister, but we are talking about clause 12 at the moment; we will come on to clause 13 later. Have you concluded your remarks on clause 12?
I think I have covered the points that I would like to make on clause 12.
Clause 12 is a set of largely technical amendments to terminology that I hope will provide clarity to data controllers and processors. I have no further comments to make at this stage.
Question put and agreed to.
Clause 12 accordingly ordered to stand part of the Bill.
Clause 13
Removal of requirement for representatives for controllers etc outside the UK
Question proposed, That the clause stand part of the Bill.
As I was saying, clause 13 will remove article 27 of the UK GDPR, ending the requirement for overseas controllers or processors to appoint a representative in the UK where they offer goods or services to, or monitor the behaviour of, UK citizens. By no longer mandating organisations to appoint a representative, we will be allowing organisations to decide for themselves the best way to comply with the requirements for effective communication. That may still include the appointment of a UK-based representative. The removal of this requirement is therefore in line with the Bill’s wider strategic aim of removing unnecessary prescriptive regulation.
The rules set out in the UK GDPR apply to all those who are active in the UK market, regardless of whether their organisation is based or located in the UK. Article 27 of the UK GDPR currently requires controllers and processors based outside the UK to designate a UK-based representative, unless their processing is only occasional and does not involve special categories of data—providing an element of proportionality—or they are a public authority or body. The idea is that the representative will act on behalf of the controller or processor regarding their UK GDPR compliance and will deal with the ICO and data subjects in that respect, acting as a primary contact for all things data within the country.
The removal of the requirement for a UK representative was not included in the Government’s consultation, “Data: a new direction”, nor was it even mentioned in their response. As a result, stakeholders have not been given an opportunity to put forward their opinions on this change. I wish to represent some of those opinions so that they are on the record for the Minister and his Department to consider.
Concern among the likes of Lexology, DataRep and Which? relates primarily to the fact that the current requirements for UK-based representatives ensure that UK data subjects can conveniently reach the companies that process their personal data, so that they can exercise their rights under the GDPR. Overseas data handlers may have a different first language, operate in a different time zone or have local methods of contact that are not easily accessible from the UK. Having a UK-based point of contact therefore ensures that data subjects do not struggle to apply the rights to which they are entitled because of the inevitable differences that occur across international borders.
As Lexology has pointed out, the Government’s own impact assessment says:
“There is limited information and data on the benefits of having an Article 27 representative as it is a relatively new and untested requirement and also one that applies exclusively to businesses and organisations outside of the UK which makes gathering evidence very challenging.”
By their own admission, then, the Government seem to recognise the challenges in gathering information from organisations outside the UK. If the Government find it difficult to get the information that they require, surely average citizens and data subjects may also face difficulties.
Not only is having a point of contact a direct benefit for data subjects, but a good UK representative indirectly helps data subjects by facilitating a culture of good data protection practice in the organisation that they represent. For example, they may be able to translate complex legal concepts into practical business terms or train fellow employees in a general understanding of the UK GDPR. Such functions may make it less likely that a data subject will need to exercise their rights in the first place.
As well as things being harder for data subjects in the ways I have outlined, stakeholders are not clear about the benefits of removing representatives for UK businesses. For example, the Government impact assessment estimates that the change could save a large organisation £50,000 per year, but stakeholders have said that that figure is an overestimation. Even if the figure is accurate, the saving will apply only to organisations outside the UK and will be made through a loss of employment for those who are actually based in the UK and performing the job.
The question therefore remains: if the clause is not in the interests of data subjects, of UK businesses or of UK-based employees who act as representatives, how will this country actually benefit from the change? I am keen to hear from the Minister on that point.
If there are concerns that were not fed in during the consultation period, obviously we will consider them. However, it remains the case that even without the article 27 representative requirement, controllers will have to maintain contact with UK citizens and co-operate with the ICO under other provisions of the UK GDPR. For example, overseas controllers and processors must still co-operate with the ICO as a result of the specific requirements to do so under article 31 of the UK GDPR. To answer the hon. Lady’s question about where the benefit lies, the clause is part of a streamlining process to remove what we see as unnecessary administrative requirements and bureaucracy.
Question put and agreed to.
Clause 13 accordingly ordered to stand part of the Bill.
Clause 14
Senior responsible individual
Question proposed, That the clause stand part of the Bill.
As I mentioned in our debate on clause 12, clauses 12 to 18 will give organisations greater flexibility about the policies, procedures or programmes that they put in place to ensure compliance with the legislation. As we have discussed, a criticism of the current legal framework is that many of the existing requirements are so prescriptive that they impose unnecessary burdens on businesses. Many organisations could manage data protection risks effectively without appointing an independent data protection officer, but they are forced to do so by the prescriptive rules that we inherited from the European Union.
Clause 14 will therefore abolish the existing requirements on data protection officers and replace them with new requirements for organisations to designate a senior responsible individual where appropriate. That individual would be part of the organisation’s senior management and would be responsible for overseeing data protection matters within the organisation. In particular, the individual would be responsible for monitoring compliance with the legislation, ensuring the implementation of appropriate risk management procedures, responding to data protection breaches and co-operating with the Information Commissioner, or for ensuring that those tasks are performed by another suitably skilled person where appropriate. Senior responsible individuals may perform the tasks specified in clause 14 themselves, delegate them to suitably skilled members of staff or, if it is right for the company and its clients, seek advice from independent data protection experts.
We recognise that some people have raised concerns that giving organisations more flexibility in how they monitor and ensure compliance with the legislation could reduce standards of protection for individuals. We are confident that that will not be the effect of the clause. On the contrary, the clause provides an opportunity to elevate discussions about data protection risks to senior levels within organisations by requiring a senior responsible individual to take ownership of data protection risks and embed a culture of data protection. On that basis, I commend the clause to the Committee.
In a number of places in the Bill, the Government have focused on trying to ensure a more proportionate approach to data protection. That often takes the form of reducing regulatory requirements on controllers and processors where low-risk processing, which presents less of a threat of harm to data subjects, is taking place. Clause 14 is one place in which Ministers have applied that principle, replacing data protection officers with a requirement to appoint a senior responsible individual, but only where high-risk processing is being carried out.
Such a proportionate approach makes sense in theory. Where the stakes are lower, less formalised oversight of GDPR compliance will be required, which will be particularly helpful in small business settings where margins and resources are tight. Where the stakes are higher, however, a senior responsible individual will have a similar duty to that of a data protection officer, but with the added benefit of being part of the senior leadership team, ensuring that data protection is considered at the highest level of organisations conducting high-risk processing.
However, the Government have admitted that the majority of respondents to their consultation disagreed with the proposal to remove the requirement to designate a data protection officer. In particular, respondents were concerned that removing DPOs would result in
“a loss of data protection expertise”
and
“a potential fall in trust and reassurance to data subjects.”
Indeed, data protection officers perform a vital role in upholding GDPR, taking on responsibility for informing people of their obligations; monitoring compliance, including raising awareness and training staff; providing advice, where requested, on data protection impact assessments; co-operating with the regulator; and acting as a contact point. That provides not only guaranteed expertise to organisations, but reassurance to data subjects that they will have someone to approach should they feel the need to exercise any of their rights under the GDPR.
The contradiction between the theory of the benefits of proportionality and the reality of the concerns expressed by respondents to the consultation emphasises a point that the Government have repeatedly forgotten throughout the Bill: although removing truly unnecessary burdens can sometimes be positive, organisations often want clear regulation more than they want less regulation. They believe in the principles of the GDPR, understand the value of rights to data subjects and often over-comply with regulation out of fear of breaking the rules.
In this context, it makes sense that organisations recognise the value of having a data protection officer. They actually want in-house expertise on data—someone they can ask questions and someone they can rely on to ensure their compliance. Indeed, according to the DPO Centre, in September 2022, the UK data protection index panel of 523 DPOs unequivocally disagreed with the idea that the changes made by the clause would be in the best interests of data subjects. Furthermore, when asked whether the proposal to remove the requirement for a DPO and replace it with a requirement for a senior responsible individual would simplify the management of privacy in their organisation, 42% of DPOs surveyed gave the lowest score of 1.
Did the Department consider offering clarification, support and guidance to DPOs, rather than simply removing them? Has it attempted to assess the impact of their removal on data subjects? In practice, it is likely that many data protection officers will be rebranded as senior responsible individuals. However, many will be relieved of their duties, particularly since the requirement to be part of the organisation’s senior management team could be problematic for external DPO appointments and those in more junior positions. Has the Department assessed how many data protection officers may lose their job as a result of these changes? Is the number expected to be substantial? Will there be any protections to support those people in transitioning to skilled employment in data protection and to prevent an overall reduction of data protection expertise in organisations?
The clause does not in any way represent a lessening of the requirement on organisations to comply with data protection law. It simply introduces a degree of flexibility. An organisation could not get rid of data protection officers without ensuring that processing activities likely to pose high risks to individuals are still managed properly. The senior responsible individual will be required to ensure that that is the case.
At the moment, even small firms whose core activities do not involve the processing of sensitive data must have a data protection officer. We feel that that is an unnecessary burden on those small firms, and that allowing them to designate an individual will give them more flexibility without reducing the overall level of data protection that is required of them.
Question put and agreed to.
Clause 14 accordingly ordered to stand part of the Bill.
Clause 15
Duty to keep records
Question proposed, That the clause stand part of the Bill.
Clauses 15 and 16 will improve the record-keeping requirements under article 30 of the UK GDPR and the logging requirements under part 3 of the Data Protection Act, which is concerned with records kept for law enforcement purposes. Article 30 of the UK GDPR requires most organisations to keep records of their processing activities and lists the information that must be included in the record. Those requirements can add to the paperwork that organisations have to keep to demonstrate compliance. Although there is an exemption from those requirements in the UK GDPR for some small organisations, it has a limited impact because it applies only where their processing of personal data is “occasional”.
Clause 15 will replace the record-keeping requirements under article 30. It will make it easier for data controllers to understand exactly what needs to be included in the record. Most importantly, organisations of any size will no longer have to keep records of processing, unless their activities are
“likely to result in a high risk”
to individuals. That should help small businesses in particular, which have found the current small business exemption difficult to understand and apply in practice.
Clause 16 will make an important change to the logging requirements for law enforcement purposes in part 3 of the Data Protection Act. It will remove the ineffective requirement to record a justification when an officer consults or discloses personal data for the purposes of an investigation. The logging requirements are unique to the law enforcement regime and aim to assist in monitoring and auditing data use. Recording a justification for accessing data was intended to help protect against unlawful access, but the reality is that someone is unlikely to record an honest reason if their access is unlawful. That undermines the purpose of the requirement, because the logs for appropriate and inappropriate uses would be essentially indistinguishable.
As officers often need to access large amounts of data quickly, especially in time-critical scenarios, the clause will facilitate the police’s ability to investigate and prevent crime more swiftly. We estimate that the change could save approximately 1.5 million policing hours. Other elements of the logs, such as the date and time of the consultation or disclosure and the identity of the person accessing the data, are likely to be far more effective in protecting personal data against misuse; those elements remain in place. On that basis, I commend the clauses to the Committee.
Record keeping is a valuable part of data processing. It requires controllers, and to a lesser extent processors, to stay on top of all the processing that they are conducting by ensuring that they record the purposes for processing, the time limits within which they envisage holding data and the categories of recipients to whom the data has been or will be disclosed.
Many respondents to the Government’s consultation “Data: a new direction” said that they did not think the current requirements were burdensome. In fact, they said that the records allow them easily to understand the personal data that they are processing and how sensitive it is. It is likely that that was helped by the fact that the requirements were proportionate, meaning that organisations that employed fewer than 250 people and were not conducting high-risk processing were exempt from the obligations.
It is therefore pleasing to see the Government rolling back on the idea of removing record-keeping requirements entirely, as was suggested in their consultation. As was noted, the majority of respondents disagreed with that proposal, and it is right that it has been changed. However, some respondents indicated a preference for more flexibility in the record-keeping regime, which is what I understand the clause is trying to achieve. Replacing the current requirements with a requirement to keep an appropriate record of processing, tied to high-risk activities, will give controllers the flexibility that they require.
As with many areas of the Bill, it is important that we be clear on the definition of “appropriate” so that it cannot be used by those who simply do not want to keep records. I therefore ask the Minister whether further guidance will be available to assist controllers in deciding what counts as appropriate.
I also wish to highlight the point that although in isolation the clause does not seem to change requirements much, other than by adding an element of proportionality, it cannot be viewed in isolation. In combination with other provisions, such as the reduced requirements on DPIAs and the higher threshold for subject access requests, it seems that there will be fewer records overall on which a data subject might be able to rely to understand how their personal information is being used or to prove how it has been used when they seek redress. With that in mind, I ask the Minister whether the Government have assessed the potential impact of the combination of the Bill’s clauses on the ability of data subjects to exercise their rights. Do the Government have any plans to work with the commissioner to monitor any such impacts on data subjects after the Bill is passed?
I turn to clause 16. Section 62 of the Data Protection Act 2018 requires competent authorities to keep logs that show who has accessed certain datasets, and at what time. It also requires that that access be justified: the reason for consulting the data must be given. Justification logs exist to assist in disciplinary proceedings, for example if there is reason to believe that a dataset has been improperly accessed or that personal data has been disclosed in an unauthorised way. However, as Aimee Reed, director of data at the Met police and chair of the national police data board, told the Committee:
“It is a big requirement across all 43 forces, largely because…we are operating on various aged systems. Many of the technology systems…do not have the capacity to log section 62 requirements, so police officers are having to record extra justification in spreadsheets alongside the searches”.––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 56, Q118.]
That creates what she described as a “considerable burden”.
Understandably, therefore, the Bill removes the justification requirement. There are some—the Public Law Project, for example—who have expressed concern that this change would pose a threat to individual rights by allowing the police to provide a retrospective justification for accessing records. However, as the explanatory notes indicate, in an investigation concerning improper or unauthorised access, a justification recorded by the individual under investigation is highly unlikely to be relied on in any case. Clause 16 would therefore not stop anyone from being investigated for improper access; it would simply reduce the burden of recording a self-identified justification of little evidential value. I welcome the intent of the clause and the positive impact that it could have on our law enforcement processing.
The intention behind clause 15 is to reduce the burden on organisations by tying the record-keeping requirements to high-risk processing activities. If there is uncertainty about the nature of the risk, organisations will be able to refer to ICO guidance. The ICO has already published examples on its website of processing that is likely to be high-risk for the purposes of completing impact assessments; clause 17 will require it to apply the guidance to the new record-keeping requirements as well. It will continue to provide guidance on the matter, and we are happy to work with it on that.
With respect to clause 16, I am most grateful for the Opposition’s welcome recognition of the benefits for crime prevention and law enforcement.
Question put and agreed to.
Clause 15 accordingly ordered to stand part of the Bill.
Clause 16 ordered to stand part of the Bill.
Clause 17
Assessment of high risk processing
I beg to move amendment 102, in clause 17, page 32, line 12, leave out from “with” to the end of line 28 on page 33 and insert
“subsection (2)
(2) In Article 57(1) (Information Commissioner’s tasks), for paragraph (k) substitute—
‘(k) produce and publish a document containing examples of types of processing which the Commissioner considers are likely to result in a high risk to the rights and freedoms of individuals (for the purposes of Articles 27A, 30A and 35);’.”
This amendment would remove the provisions of clause 17 which replace the existing data protection impact assessment requirements with new requirements about “high risk processing”, leaving only the requirement for the ICO to produce a document containing examples of types of processing likely to result in a high risk to the rights and freedoms of individuals.
As was the intention, the Bill loosens restrictions on processing personal data in many areas: it adds a new lawful basis, creates new exceptions to purpose limitation, removes blocks to automated decision-making and allows for much thinner record keeping. Each change in isolation may make only a relatively small adjustment to the regime. Collectively, however, they result in a large-scale shift towards controllers being able to conduct more processing, with less transparency and communication, and having fewer records to keep, all of which reduces opportunities for accountability.
As mentioned, loosening restrictions is an entirely deliberate consequence of a Bill that seeks to unlock innovation through data—an aim that Members across the House, including me, are strongly behind, given the power of data to influence growth for the public good. However, given the cumulative impact of this deregulation, where increasingly opaque processing is likely to result in a large risk to people’s rights, a processor should at the very least record how they will ensure that any high-risk activities that they undertake do not lead to unlawful or discriminatory outcomes for the general public. That is exactly what the current system of DPIAs, as outlined in article 35 of GDPR, allows for. These assessments, which require processors to measure their activities against the risk to the rights and freedoms of data subjects, are not just a tick-box exercise, unnecessary paperwork or an administrative burden; they are an essential tool for ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to a fundamental breach of their rights.
Assessments of that kind are not a concept unique to data processing. The Government routinely publish impact assessments on the legislation that they want to introduce; any researcher or scientist is likely to conduct an assessment of the safety and morality of their methodology; and a teacher will routinely and formally measure the risks involved when taking pupils on a school trip. Where activities pose a high risk to others, it is simply common practice to keep a record of where the risks lie, and to make plans to ensure that they are mitigated where possible.
In the case of data, not only are DPIAs an important mechanism to ensure that risks are managed, but they act as a key tool for data subjects. That is first because the process of conducting a DPIA encourages processors to consult data subjects, either directly or through a representative, on how the type of processing might impact them. Secondly, where things go wrong for data subjects, DPIAs act as a legal record of the processing, its purpose and the risks involved. Indeed, the Public Law Project, a registered charity that employs a specialist lawyer to conduct research, provide training and take on legal casework, identified DPIAs as a key tool in litigating against the unlawful use of data processing. They show a public law record of the type of processing that has been conducted, and its impact.
The TUC and the Institute for the Future of Work echo that, citing DPIAs as a crucial process and consultation tool for workers and trade unions in relation to the use of technology at work. The clause, however, seeks to water down DPIAs, which will become “assessments of high-risk processing”. That guts both the fundamental benefit of risk management that they offer in a data protection system that is about to become increasingly opaque, and the extra benefits that they give to data subjects.
Instead of requiring a systematic description of the processing operations and purposes, under the new assessments the controller would be required only to summarise the purpose of the processing. Furthermore, instead of conducting a proportionality assessment, controllers will be required only to consider whether the processing is necessary for the stated purpose. The Public Law Project describes the proportionality assessment as a crucial legal test that weighs up whether an infringement of human rights, including the right not to be discriminated against, is justified in relation to the processing being conducted.
When it comes to consultation, controllers were previously encouraged to seek the views of those likely to be impacted by the processing; that encouragement will now be omitted entirely, despite the important benefit to data subjects, workers and communities. The new tests therefore simply do not carry the same weight or benefit as DPIAs, which in truth could themselves be strengthened. It is simply not appropriate to remove the need to properly assess the risk of processing, while simultaneously removing restrictions that help to mitigate those risks. For that reason, the clause must be opposed; we would keep only the requirement for the ICO to produce that much-needed guidance on what constitutes high-risk processing.
Moving on to amendment 103, given the inherent importance of conducting risk assessments for high-risk processing, and their potential for use by data subjects when things go wrong, it seems only right that transparency be built into the system where it comes to Government use of public data. The amendment would do just that, and only that. It would not adjust any of the requirements on Government Departments or public authorities to complete high-risk assessments; it would simply require an assessment to be published in any case where one is completed. Indeed, the ICO guidance on DPIAs says:
“Although publishing a DPIA is not a requirement of UK GDPR, you should actively consider the benefits of publication. As well as demonstrating compliance, publication can help engender trust and confidence. We would therefore recommend that you publish your DPIAs, where possible, removing sensitive details if necessary.”
However, very few organisations choose to publish their assessments. This is a chance for the Government to lead by example, and foster an environment of trust and confidence in data protection.
Alongside the amendment I tabled on compulsory reporting on the use of algorithms, this amendment is designed to afford the general public honesty and openness on how their data is used, especially where the process has been identified as having a high risk of causing harm. Again, a published impact assessment would provide citizens with an official record of high-risk uses of their data, should they need that when seeking redress. However, a published impact assessment would also encourage responsible use of data, so that redress does not need to be sought in the first place.
The Government need not worry about the consequences of the amendment if they already meet the requirement to conduct the correct impact assessments and process data in such a way that the benefits are not heavily outweighed by a risk to data rights. If rules are being followed, the amendment will only provide proof of that. However, if anyone using public data in a public authority’s name did so without completing the appropriate assessments, or processed that data in a reckless or malicious way, there would be proof of that. Where there is transparency, there is accountability, and where the Government are involved, accountability is always crucial in a democracy. The amendment would ensure that accountability shone through in data protection law.
Finally, I turn to clause 18. The majority of respondents to the “Data: a new direction” consultation agreed that organisations are likely to approach the ICO voluntarily before commencing high-risk processing activities if that is taken into account as a mitigating factor in any future investigation or enforcement action. The loosening of requirements in the clause is therefore not a major concern. However, when that is combined with the watering down of the impact assessments, there remains an overarching concern about the oversight of high-risk processing. I refer to my remarks on clause 17, in which I set out the broader problems that the Bill poses to protection against harms from high-risk processing.
As we have discussed, one of the principal objectives of this part of the Bill is to remove some of the prescriptive, unnecessary requirements on organisations to demonstrate compliance. Clauses 17 and 18 reduce the unnecessary burdens placed on organisations by articles 35 and 36 of the UK GDPR in respect of data protection impact assessments and prior consultation with the ICO respectively.
Clause 17 will replace the EU-derived notion of a data protection impact assessment with more streamlined requirements for organisations to document how they intend to assess and mitigate risks associated with high-risk processing operations. The changes will apply both to the impact assessment provisions under the UK GDPR and to the section of the Data Protection Act 2018 that deals with impact assessments for processing relating to law enforcement. Amendment 102 would reverse those changes to maintain the current data protection impact assessment requirements, but we feel that this would miss an important opportunity for reform.
There are significant differences between the new provisions in the Bill and current provisions on data protection impact assessments. First, the new provisions are less prescriptive about the precise processing activities for which a risk assessment will be required. We think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation, taking account of any relevant guidance from the regulator.
Secondly, we have also removed the mandatory requirement to consult individuals about the intended processing activity as part of a risk-assessment process, as that imposes unnecessary burdens. There are already requirements in the legislation to ensure that any new processing is fair, transparent and designed with the data protection principles in mind. It should be open to businesses to consult their clients about intended new processing operations if they wish, but that should not be dictated to them by the data protection legislation.
Clause 18 will make optional the previous requirement for data controllers to consult the commissioner when a risk assessment indicates a potential high risk to individuals. The Information Commissioner will be able to consider any voluntary actions that organisations have taken to consult the ICO as a factor when imposing administrative fines on a data controller. Currently, compliance with the prior consultation requirement is low, likely owing to a lack of clarity in the legislation and a reluctance among organisations to engage directly with the regulator on potential high-risk processing. The clause will encourage a more proactive, open and collaborative dialogue between the ICO and organisations, so that they can work together to better mitigate the risks.
The Opposition’s amendment 103 would mandate the publication of risk assessments by all public sector bodies. That requirement would, in our view, place a disproportionate burden on public authorities of all sizes. It would apply not just to Departments but to smaller public authorities such as schools, hospitals, independent pharmacies and so on. The amendment acknowledges that each public authority would have to spend time redacting sensitive details from risk assessments prior to publication. As those assessments can already be requested by the ICO as part of its investigations, or by members of the public via freedom of information requests, we do not think it is necessary to impose that significant new burden on all public bodies. I therefore invite the hon. Member for Barnsley East to withdraw her two amendments, and I commend clauses 17 and 18 to the Committee.
I am happy not to press amendment 103 to a vote, but on amendment 102, I simply do not think it is appropriate to remove the need to properly assess the risk of processing while removing the restrictions that help to mitigate it. For those reasons, I will press it to a vote.
Question put, That the amendment be made.
I beg to move amendment 1, in clause 19, page 35, leave out lines 23 to 25 and insert—
“(5) The Commissioner must encourage expert public bodies to submit codes of conduct described in subsection (1) to the Commissioner in draft.”.
This amendment replaces a duty on expert public bodies to submit draft codes of conduct relating to compliance with Part 3 of the Data Protection Act 2018 to the Information Commissioner with a duty on the Information Commissioner to encourage such bodies to do so.
With this it will be convenient to discuss the following:
Government amendments 2 to 4.
Clause stand part.
Clause 19 introduces an ability for public bodies with the appropriate knowledge and expertise to produce codes of conduct applicable to the law enforcement regime. The clause mirrors the equivalent provision in the UK GDPR.
As with regular guidance, these codes of conduct will be drafted by law enforcement data protection experts and tailored to the specific data protection issues that affect law enforcement agencies, to help improve compliance with the legislation and encourage best practice. However, they are intended to carry more weight, because they will additionally have the formal approval of the Information Commissioner.
When a code of conduct is produced, there is currently a requirement to submit a draft of it to the Information Commissioner. While that is good practice, we think it is unnecessary to mandate it. Government amendment 1 therefore replaces the requirement with a duty on the commissioner to encourage public bodies to submit their drafts. Government amendments 2 and 3 are consequential on that.
Where a public body has submitted a code of conduct to the commissioner for review, Government amendment 4 removes the requirement for the commissioner to review any subsequent amendments made by the public body until the initial draft has been considered. This change will promote transparency, greater clarity and confidence in how police process personal data under the law enforcement regime. Codes of conduct are not a new concept. The clause mirrors what is already available under the UK GDPR.
The Bill fails to fully recognise that the burdens that organisations face in complying with data protection legislation are not always best dealt with by simply removing the protections in place. In many cases, clarification and proper guidance can be just as fruitful in allowing data protection to work more seamlessly. Clauses such as clause 19, which seeks to create an environment in which best practice is shared on how to comply with data protection laws and deal with key data protection challenges, are therefore very welcome. It is absolutely right that we should capitalise on pockets of experience and expertise, especially in the public sector, where resources have often been stretched, particularly over the last 13 years. We should ensure that lessons learned are shared with those who are less familiar with how to resolve challenges around data.
It is also pleasing to see that codes that give sector-specific guidance will be approved by the commissioner before being published. That will ensure absolute coherence between guidance and the enforcement of data protection law more widely. I look forward to seeing what positive impact the codes of conduct will have on how personal data is handled by public bodies, to the benefit of the general public as well as the public bodies themselves; the burden on them will likely be lifted as a result of the clarity provided by the guidance.
I welcome the Opposition’s support.
Amendment 1 agreed to.
Amendments made: 2, in clause 19, page 35, line 26, leave out from ‘body’ to ‘, the’ in line 27 and insert ‘does so’.
This amendment is consequential on Amendment 1.
Amendment 3, in clause 19, page 35, line 28, leave out ‘draft’.
This amendment is consequential on Amendment 2.
Amendment 4, in clause 19, page 35, line 33, leave out from ‘conduct’ to the end of line 34 and insert—
‘that is for the time being approved under this section as they apply in relation to a code’.—(Sir John Whittingdale.)
This amendment makes clear that the Commissioner’s duty under new section 68A of the Data Protection Act 2018 to consider whether to approve amendments of codes of conduct relates only to amendments of codes that are for the time being approved under that section.
Clause 19, as amended, ordered to stand part of the Bill.
Clause 20
Obligations of controllers and processors: consequential amendments
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to consider the following:
Government amendments 42 and 43.
That schedule 4 be the Fourth schedule to the Bill.
Government amendments 40 and 41.
As clauses 12 to 18 remove terms such as data protection officers and data protection impact assessments from the legislation, some consequential changes are required to other parts of the legislation where the same terms are used. Clause 20 therefore introduces schedule 4, which sets out the details of the consequential changes required. An example of that is in article 13 of the UK GDPR, which currently requires controllers to provide individuals with the contact details of the data protection officer, where appropriate. In future, that provision will refer to the organisation’s senior responsible individual instead. Removal of the term data protection officer from the UK GDPR will have knock-on effects in other areas, including in relation to the types of people from whom the ICO receives requests and queries.
Government amendment 40 will provide that the commissioner may refuse to deal with vexatious or excessive requests made by any person, not just those made by data protection officers or data subjects. Government amendments 41 to 43 make further minor and technical changes to the provisions in schedule 4 to reflect the changes we have made to the terminology.
With this it will be convenient to discuss the following:
Amendment 104, in schedule 5, page 144, line 28, at end insert—
‘4 All provisions in this Chapter must be applied in such a way as to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.’
This amendment would reinsert into the new Article on general principles for international data transfers the principle that all provisions of this Chapter of the UK GDPR should be applied in such a way as to ensure that the level of protection of natural persons guaranteed by the Regulation is not undermined.
Government amendments 24 to 26.
That schedule 5 be the Fifth schedule to the Bill.
Government amendments 27 to 29.
That schedule 6 be the Sixth schedule to the Bill.
That schedule 7 be the Seventh schedule to the Bill.
Clause 21 refers to schedules 5 to 7, which introduce reforms to the provisions of the UK GDPR and the Data Protection Act 2018 that regulate international transfers of personal data. Schedule 5 introduces changes to the UK’s general processing regime for transferring personal data internationally. In order to provide a clearer structure than the current UK regime, schedule 5 will consolidate the existing provisions on international transfers. It replaces article 44 with article 44A, setting out in clearer terms the general principles for international transfers and listing the same bases under which personal data can be lawfully transferred overseas.
Schedule 5 also introduces article 45A, which sets out the Secretary of State’s power to make regulations approving transfers of personal data to a third country or international organisation. The Government now use the term “data bridges” to refer to those regulations, which allow the free flow of personal data. Article 45A outlines that the Secretary of State may make such regulations only if they are satisfied that the data protection test is met. In addition to the requirement that the Secretary of State be satisfied that the data protection test is met, article 45A specifies that the Secretary of State may have regard to other matters that he or she considers relevant when making those regulations, including the desirability of facilitating transfers of personal data to and from the UK.
Article 45B sets out the data protection test that the Secretary of State must consider is met in order to establish new data bridges. In order for a country or international organisation to meet the data protection test, the standard of protection for personal data in that country or international organisation must be “not materially lower” than the standard of protection under the UK’s data protection framework. The reformed law recognises that the Secretary of State must exercise their judgment when making a determination. Their assessment will be made with respect to the outcomes of data protection in a third country, instead of being prescriptive about the form and means of protection, recognising that no two data protection regimes are identical.
The article also sets out a more concise and streamlined list of key factors that the Secretary of State must consider as part of their assessment. However, article 45B(2) is a non-exhaustive list, and the Secretary of State may also need to consider other matters in order to determine whether the required standard of protection exists.
Article 45C amends the system for formally reviewing data bridge regulations, removing the requirement for them to be reviewed periodically. The Secretary of State will still be subject to the requirement to monitor developments in other countries on an ongoing basis. Schedule 5 also amends article 46, which sets out the rules for controllers and processors to make international transfers of personal data using alternative transfer mechanisms.
The new article 46 requirements are tailored for data exporters to transfer defined types of data in specific circumstances. They stipulate that the data exporter, acting reasonably and proportionately, must consider that the standard of protection provided for the data subject would be “not materially lower” than the standard of protection in the UK in the specific circumstances of the transfer. The new requirements accommodate disparities between data exporters, where what is right for a multinational organisation transferring large volumes of sensitive data may not be right for a small charity making ad hoc transfers.
Schedule 5 also introduces article 47A, which provides a power for the Secretary of State to create or recognise new UK and non-UK alternative transfer mechanisms. The new power will help to future-proof the UK’s international transfers regime by allowing the Government to shape international developments and react quickly to global trends, helping UK businesses connect and trade with their partners around the world.
Schedule 6 amends relevant parts of the Data Protection Act 2018 governing international transfers of personal data, which are governed by the law enforcement processing regime. Paragraph 4 omits the section governing transfers based on adequacy assessments and inserts a new provision to mirror the approach being adopted in schedule 5. As with the changes described in schedule 5, schedule 6 amends the power in new section 74AA for the Secretary of State to make regulations approving transfers of personal data to another jurisdiction. It replaces the current list of considerations with a broader, non-exhaustive one. The schedule also clarifies the test found in new section 74AB that must be applied when regulations are made, giving greater clarity to the UK regulations decision-making process.
The Minister is being very courteous and generous, and he makes a very sensible suggestion. Will he respond to amendment 104 after the Opposition have spoken to it?
It would make sense to explain the reasons why we are not convinced after we have heard the arguments in favour.
I am grateful to the Minister, and I will focus my remarks particularly on the contents of schedule 5 before explaining the thought process behind amendment 104.
In the globalised world in which we live, we have an obligation to be outward looking and to consider not just the activities that take place in the UK, but those that occur worldwide. When it comes to data protection, that means accepting that data will likely need to travel across borders, and inserting appropriate safeguards so that UK citizens do not lose the protection of data protection laws if their personal data is transferred away from this country. The standard of those safeguards is absolutely crucial to the integrity of our entire data protection regime. After all, if a controller can simply send the personal data of UK citizens to a country that has limited data protection laws for processing that would be unlawful here, and if they can transfer that data back afterwards, in reality our laws are only as strong as the country with the weakest protections in the world.
As things stand, there is only a limited set of circumstances under which personal data can be transferred to a third party outside the UK. One such circumstance is where there is an adequacy agreement, similar to that which we have with the EU. For such an agreement to be reached, the Secretary of State must have considered many things, including the receiver’s respect for human rights and data rules; the presence, or lack thereof, of a regulator, and its independence; and any international commitments they have made in relation to data protection. Such agreements ensure that data can flow freely between the UK and another country as long as the level of protection received by citizens is not undermined by the regulatory structure in that country.
The Bill amends the adequacy-based framework and replaces it with a new outcomes-based approach through the data protection test. The test is met if the standard of the protection provided for data subjects, with regard to the general processing of personal data in the country or by the organisation, is not materially lower than the standard of protection under the UK GDPR and relevant parts of the DPA 2018.
When deciding whether the test is met, the Secretary of State must still consider many of the same things: their respect for human rights, the existence of a regulator, and international obligations. However, stakeholders such as Reset.tech and the TUC have expressed concern that the new test could mean that UK data is transferred to countries with lower standards of protection than previously. That is significant not just for data subjects in the UK, who may be faced with weaker rights, but for business, which fears that this may signify a divergence from the EU GDPR that could threaten the UK’s own adequacy status. Losing this agreement would have real-world consequences for UK consumers and businesses to the tune of hundreds of millions of pounds. What conversations has the Minister had with representatives of the European Commission to ensure that the new data protection test does not threaten adequacy? Does he expect the new data protection test to result in the data of UK citizens being passed to countries with weaker standards than are allowed under the current regime?
Moving on to amendment 104, one reason why some stakeholders are expressing concern about the new rules is that they appear to omit article 44. As it stands, for those who are concerned about the level of data protection available to them as a result of international transfers, article 44 of the UK GDPR provides a guarantee that the integrity of the UK’s data protection laws will be protected. Indeed, it sets out that all provisions relating to the international transfer of UK personal data
“shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.”
If UK data will not be transferred to countries with weaker protections, it is not clear why this simple guarantee would be removed. The amendment would clear up any confusion around that and reinsert the article so that data subjects can be reassured of the strength of this new data protection test and of their rights.
Again, it is important to emphasise that getting the clause right is absolutely essential, as it underpins the entire data protection regime in the country. Getting it wrong could cost a huge amount, rendering the Bill, the UK GDPR and the Data Protection Act 2018 essentially useless. It is likely that the Government do not intend to undermine their own regulatory framework. Reinserting the article would confirm that in the Bill, offering complete clarity that the new data protection test will not result in lower levels of protection for UK data subjects.
We completely agree with the hon. Lady that we would not wish to see data transferred to countries that have an inferior data protection regime. However, we do not think amendment 104 is required to achieve that, because the reforms in chapter 5 already provide for a clear and high standard of protection when transferring personal data overseas. The new test states that the standard of protection in the third country must not be “materially lower” than the standard under the UK GDPR. That ensures that high standards of data protection are maintained. In addition, we feel that the amendment would return us to the confusion of the existing regime. At present, the legislative framework makes it difficult for organisations and others to understand what standard needs to be applied when transferring personal data internationally, with several terms used in the chapter and in case law. Our reforms ensure that a clear standard applies, which maintains protection for personal data.
The hon. Lady raised the EU’s data adequacy assessment. That is something that featured earlier in our debates on the Bill, and, as we heard from a number of our witnesses, including the Information Commissioner, there is no reason to believe that this in any way jeopardises the EU’s assessment of the UK’s data adequacy.
Government amendment 24 revises new article 45B(3)(c) of the UK GDPR, which is inserted by schedule 5 and which makes provision about the data protection test that must be satisfied for data bridge regulations to be made. An amendment to the Bill is required for the Secretary of State to retain the flexibility to make data bridge regulations covering transfers from the UK or elsewhere. The amendment will preserve the status quo under the current regime, in which the Secretary of State’s power is not limited to covering only transfers from the UK. In addition to these amendments, four other minor and technical Government amendments—25, 26, 28 and 29—were tabled on 10 May.
Question put and agreed to.
Clause 21 accordingly ordered to stand part of the Bill.
Schedule 5
Transfers of personal data to third countries etc: general processing
Amendments made: 24, in schedule 5, page 147, line 3, leave out “from the United Kingdom” and insert
“to the country or organisation by means of processing to which this Regulation applies as described in Article 3”.
New Article 45B(3)(c) of the UK GDPR explains how references to processing of personal data in a third country should be read (in the data protection test for regulations approving international transfers of personal data). This amendment changes a reference to data transferred from the United Kingdom to include certain data transferred from outside the United Kingdom.
Amendment 25, in schedule 5, page 147, line 12, leave out
“the transfer of personal data”
and insert “transfer”.
This amendment and Amendment 26 simplify the wording in new Article 45B(4)(b) of the UK GDPR.
Amendment 26, in schedule 5, page 147, line 14, leave out
“the transfer of personal data”
and insert “transfer”.—(Sir John Whittingdale.)
See the explanatory statement for Amendment 25.
Schedule 5, as amended, agreed to.
Schedule 6
Transfers of personal data to third countries etc: law enforcement processing
Amendments made: 27, in schedule 6, page 155, line 39, leave out “from the United Kingdom” and insert—
“to the country or organisation by means of processing to which this Act applies as described in section 207(2)”.
New section 74AB(3)(c) of the Data Protection Act 2018 explains how references to processing of personal data in a third country should be read (in the data protection test for regulations approving international transfers of personal data). This amendment changes a reference to data transferred from the United Kingdom to include certain data transferred from outside the United Kingdom.
Amendment 28, in schedule 6, page 156, line 6, leave out
“the transfer of personal data”
and insert “transfer”.
This amendment and Amendment 29 simplify the wording in new section 74AB(4)(b) of the Data Protection Act 2018.
Amendment 29, in schedule 6, page 156, line 8, leave out
“the transfer of personal data”
and insert “transfer”.—(Sir John Whittingdale.)
See the explanatory statement for Amendment 28.
Schedule 6, as amended, agreed to.
Schedule 7 agreed to.
Clause 22
Safeguards for processing for research etc purposes
I beg to move amendment 34, in clause 22, page 36, leave out lines 20 to 22.
This amendment and Amendment 37 transpose the requirement for processing of personal data for research, archiving and statistical purposes to be carried out subject to appropriate safeguards from the beginning to the end of new Article 84B of the UK GDPR.
With this it will be convenient to discuss the following:
Government amendments 35 to 39.
Clause stand part.
Clause 23 stand part.
Clause 22 creates a new chapter in the UK GDPR that provides safeguards for the processing of personal data for the purposes of scientific research or historical research, archiving in the public interest, and for statistical purposes. Currently, the provisions that provide safeguards for those purposes are spread across the UK GDPR and the Data Protection Act 2018.
Clause 22 consolidates those safeguards in a new chapter 8A of the UK GDPR. Those safeguards ensure that the processing of personal data for research, archiving and statistical purposes does not cause substantial damage or substantial distress and that appropriate technical and organisational measures are in place to respect data minimisation. Clause 23 sets out consequential changes to the UK GDPR and Data Protection Act 2018 required as a result of the changes being made in clause 22 to consolidate safeguards for research.
Government amendments 34 to 39 are minor, technical amendments clarifying that, as part of the pre-existing additional requirement when processing for research, archiving and statistical purposes, a controller is to use anonymous—rather than personal—data, unless that means that those purposes cannot be fulfilled. They make clear that processing to anonymise the personal data is permitted. On that basis, I commend the clauses, and indeed the Government amendments, to the Committee.
With regard to clause 22, it is pleasing to see a clause confirming the safeguards that apply when processing for the new research and scientific purposes. For example, it is welcome that it is set out that such processing must not cause substantial damage or distress to a data subject, must respect the principle of data minimisation and must not make decisions related to a particular data subject unless it is for approved medical research.
Those safeguards are especially important given the concerns that I laid out over the definition of scientific research in clause 2, which could lead to the abuse of data under the guise of legitimate research. I have no further comments on the clause or the Government’s amendments to it at this stage, other than to reiterate that the definition of scientific research must have clear boundaries if any of the clauses that concern research are to be used as intended.
Clause 23 makes changes consequential on those in clause 22, so I refer to the substance of my remarks during the discussion of the previous clause.
Amendment 34 agreed to.
With this it will be convenient to discuss the following:
Amendment 105, in clause 25, page 44, line 6, leave out “must consult the Commissioner” and insert
“must apply to the Commissioner for authorisation of the designation notice on the grounds that it satisfies subsection (1)(b).”
This amendment seeks to increase independent oversight of designation notices by replacing the requirement to consult the Commissioner with a requirement to seek the approval of the Commissioner.
Clauses 25 and 26 stand part.
Clause 24 introduces an exemption that can be applied to the processing of personal data for law enforcement purposes under the law enforcement regime for the purposes of safeguarding national security. It will replace the current, more limited national security exemptions that exist in the law enforcement regime and mirror the existing exemptions in the UK GDPR and intelligence services regime.
The clause will allow organisations to exempt themselves from specified provisions in the law enforcement regime of the Data Protection Act 2018, such as some of the data protection principles and the rights of the individual, but only where it is necessary to do so for the purposes of safeguarding national security. Like the other exemptions in the Act, it must be applied on a case-by-case basis. There are limits to what the exemption applies to. The processing of data by law enforcement authorities must always be lawful, and the protections surrounding sensitive processing remain.
Subsection (2) amends the general processing regime of the Data Protection Act, regarding processing under UK GDPR, to remove the ability of organisations to exempt themselves, on the grounds of safeguarding national security, from article 77 of the UK GDPR, which provides the right for individuals to lodge a complaint with the Information Commissioner. That is because we do not consider exemption from that provision necessary. The change will align the national security exemption applicable to UK GDPR processing with the other national security exemptions in the Data Protection Act 2018, which do not permit the exemption to be applied in relation to an individual’s right to complain to the Commissioner.
The ability of a Minister of the Crown to issue a certificate certifying the application of the exemption for the purposes of safeguarding national security, which previously existed, is retained; clause 24(8) simply updates that provision to reflect the new exemption. That change will assist closer working between organisations operating under the three distinct data protection regimes by providing greater confidence that data that, for example, may be of importance to a police investigation but also pertinent to a separate national security operation can be properly safeguarded by both organisations. I will allow the hon. Member for Barnsley East to speak to amendment 105 before I respond to her.
I am grateful to the Minister. I want to speak today about a concern that has been raised about clauses 24, 25 and 26, so I will address them before speaking to amendment 105.
In essence, the clauses increase the opportunities for competent authorities to operate in darkness when it comes to personal data, through both national security certificates and designation notices. Though it may of course be important in some cases to adjust data protection regulation in a minimal way to protect national security or facilitate working with the intelligence services, the right to understand how any competent authority is processing our personal data is important too, particularly given the growing mistrust around police culture.
To cite one stark example of why data transparency in law enforcement is important, after Sarah Everard was murdered, more than 30 police officers were reportedly investigated for unnecessarily looking up her personal data. First, that demonstrates that there is a temptation for officers to access personal data without due reason, perhaps particularly when it is related to a high-profile case. Secondly, however, it shows that transparency does hold people accountable. Indeed, thankfully, the individuals who were accused of accessing the data were swiftly investigated. That would not have been possible if that transparency had been restricted—for example, had there been a national security certificate or a designation notice in place.
The powers to apply for the certificates and notices that allow the police and law enforcement authorities exemptions from data protection, although sometimes needed, must be used extremely sparingly and must be proportionate to the need to protect national security. However, that proportionate approach does not appear to be guaranteed in the Bill, despite it being a requirement in human rights law.
In their oral and written evidence, representatives from Rights and Security International warned that clauses 24 to 26 could actually violate the UK’s obligations under the Human Rights Act 1998 and the European convention on human rights. Everything that the UK does, including in the name of national security or the intelligence services, must comply with human rights law and the ECHR. That means that any time there is interference with the privacy of people in the UK—which is considered a fundamental right—for the interference to be lawful, the law in question must do only what is truly necessary for national security. That necessity standard is a high one, and it does not take into account whether a change might be more convenient for a competent authority.
Will the Minister clearly explain in what way the potential powers given to law enforcement under clauses 24 to 26, in both national security certificates and designation notices, would be strictly proportionate and necessary for national security, rather than simply making the operations of law enforcement easier and more convenient?
Primarily, the concern is for those whose data could be used in a way that fundamentally infringes on their privacy, but there are practical concerns too. Any clauses that contain suspected violations of human rights could set up the Government for lengthy legal battles, both in the UK and at the European Court of Human Rights, about their data protection and surveillance regimes. Furthermore, any harm to the UK’s important relationships with the EU around data could threaten the adequacy agreement, which, as we have all repeatedly heard, is vital to our economy.
It is vital, then, that the Minister confirms that both national security certificates and designation notices will be used, and exemptions allowed, only where necessary. If that cannot be satisfied, we must oppose the clauses.
I will now focus on amendment 105. Where powers are available to provide exemptions to privacy protections on grounds of national security, it is important that they are protected from exploitation, and not unduly concentrated in any individual’s hands without appropriate checks and balances. However, Rights and Security International warned that that was not taken into appropriate consideration in clause 25. Instead, the power to issue designation notices has been concentrated almost entirely in the hands of the Secretary of State, with no accountability measures built in.
Designation notices allow for joint processing between a qualifying competent authority and the intelligence services, which could have greatly beneficial consequences for tackling crime and threats to our national security, but they will also allow for both those parties to be exempt from what are usually crucial data protections. They must therefore be used sparingly, and only when necessary and proportionate.
As we have seen—and as I will argue countless times—we cannot rely on the Secretary of State’s acting in good faith. Our legislation must instead protect against a Secretary of State who acts in bad faith. Neither can we rely on the Secretary of State having the level of expertise needed to make complex and technical decisions, especially those that impact on national security and data rights at the same time.
Despite that, under clause 25(2), the Secretary of State alone can specify which competent authorities qualify as able to apply for a designation notice. Under subsection (3), it is the Secretary of State alone to whom qualifying competent authorities will jointly apply. It is the Secretary of State who reviews a notice and has the power to withdraw it, and it is the Secretary of State who makes transition arrangements.
Although there is a requirement in the Bill to consult the commissioner, the amendment seeks to formalise some independent oversight of the designation process by ensuring that the commissioner has an actual say in approving the notices and adjusting the concentration of power so that it does not lie solely in the Secretary of State’s hands. That would mean that should the Secretary of State act in bad faith, or lack the expertise needed to make such a decision—whether aware or unaware of this fact—the commissioner would be able to help to ensure that an informed and proportionate decision was made with regard to each notice applied for. This would not prevent any designation notices from being issued when they were genuinely necessary; it would simply safeguard their approval.
I assure the hon. Lady that clauses 25 and 26 are necessary for the improvement of national security. The reports on events such as the Manchester and Fishmongers’ Hall terrorist incidents have demonstrated that better joined-up working between the intelligence services and law enforcement is in the public interest to safeguard national security. A current barrier to such effective joint working is that only the intelligence services can operate under part 4 of the Data Protection Act, which is drafted to reflect the unique operational nature of their processing.
Of course, the reports on incidents such as those at Fishmongers’ Hall and the Manchester Arena pointed to a general lack of effective collaboration between security forces and the police. It was not data that was the issue; it was collaboration.
I certainly accept that greater collaboration would have been beneficial as well, but there was a problem with data sharing and that is what the clause is designed to address.
As the hon. Member for Barnsley East will know, law enforcement currently operates under part 3 of the Data Protection Act when processing data for law enforcement purposes. That means that even when they work together, law enforcement and the intelligence services must each undertake separate assessments regarding the same joint-working processing.
Order. I am making a habit of interrupting the Minister—I do apologise—but we have some news from the Whip.
Ordered, That the debate be now adjourned.—(Steve Double.)
Data Protection and Digital Information (No. 2) Bill (Fifth sitting) Debate
I remind the Committee that with this we are discussing the following:
Amendment 105, in clause 25, page 44, line 6, leave out “must consult the Commissioner” and insert
“must apply to the Commissioner for authorisation of the designation notice on the grounds that it satisfies subsection (1)(b).”
This amendment seeks to increase independent oversight of designation notices by replacing the requirement to consult the Commissioner with a requirement to seek the approval of the Commissioner.
Clauses 25 and 26 stand part.
When the Committee last adjourned, I had already spoken to clauses 24 to 26 and was responding to amendment 105, which was tabled by the hon. Member for Barnsley East. However, let me give a quick recap.
Clauses 24 to 26 are essentially designed to enable better joined-up working between the intelligence services and law enforcement. To that end, they will allow qualifying authorities to use part 4 of the data protection regime, but the Secretary of State will be required to issue a designation notice. We believe that enabling qualifying competent authorities to jointly process data under one regime in authorised, specific circumstances will allow better control over data in a way that is not possible under two different data protection regimes.
Amendment 105 seeks to increase the role of the Information Commissioner’s Office by requiring it to judge whether the designation notice is required for the purposes of safeguarding national security. The Bill requires the Secretary of State to consult the ICO as part of the Secretary of State’s decision whether to grant a notice, but it is not the function of the ICO in its capacity as a regulator to assess national security requirements. The ICO’s expertise is in data protection, not in national security, and it would be inappropriate for it to decide on the latter; that decision should be reserved to the Secretary of State. We believe that clause 25 provides significant safeguards through proposed new sections 82B and 82E, which provide respectively for legal challenge and annual review of a notice. In addition, should the notice no longer be required, the Secretary of State can withdraw it. For that reason, we cannot accept the amendment.
We now come to the provisions in the Bill relating to the powers of the Information Commissioner. Clause 27 will introduce a new strategic framework for the Information Commissioner when carrying out his functions under data protection legislation. The framework contains a principal data protection objective and a number of general duties.
The legislation does not currently provide the commissioner with a framework of strategic objectives to help to prioritise activities and resources, evaluate performance and be held accountable by stakeholders. Instead, the commissioner is obliged to fulfil a long list of tasks and functions without a clear strategic framework to guide his work.
The clause introduces a principal objective for the commissioner, first to secure an appropriate level of protection for personal data, taking into account the interests of data subjects, controllers and others along with matters of general public interest, and secondly to promote public trust and confidence in the processing of personal data. This principal objective will replace section 2(2) of the Data Protection Act 2018.
How does the Minister think the words
“an appropriate level of protection for personal data”
should be understood by the Information Commissioner? Is it in the light of the duties that follow, or what?
Obviously that is a matter for the Information Commissioner, but that is the overriding principal objective. I am about to set out some of the other objectives that the clause will introduce, but it is made very clear that the principal objective is to ensure the appropriate level of protection. Precisely how the Information Commissioner interprets “appropriate level of protection” is a matter for him, but I think it is fairly clear what that should entail, as he himself set out in his evidence.
As I have said, clause 27 introduces new duties that the commissioner must consider where they are relevant to his work in carrying out data protection functions: the desirability of promoting innovation and competition; the importance of the prevention, investigation, detection and prosecution of criminal offences; the need to safeguard public security and national security; and, where necessary, the need to consult other regulators when considering how the ICO’s work may affect economic growth, innovation and competition. There is also the statement of strategic priorities, which is introduced by clause 28. However, as I have indicated to the hon. Member for Newcastle upon Tyne Central, the commissioner will be clear that his primary focus should be to achieve the principal objective.
Clause 27 also introduces new reporting requirements for the commissioner in relation to the strategic framework. The commissioner will be required to publish a forward-looking strategy outlining how he intends to meet the new principal objective and duties, as well as pre-existing duties in the Deregulation Act 2015 and the Legislative and Regulatory Reform Act 2006.
Finally, the commissioner will be required to publish a review of what he has done to comply with the principal objective, and with the new and existing duties, in his annual report.
I wonder whether part of the strategy might include a list of fees that could potentially be charged for accessing data. This idea of fees seems to be quite vague in terms of amounts and levels, so it would be useful to have some more information on that.
I think we will come on to some of the questions around the fees that are potentially payable, particularly by those organisations that may be required to provide more evidence, and the costs that that could entail. I will return to that subject shortly.
The new strategic framework acknowledges the breadth of the ICO’s remit and its impact on other areas. We believe that it will provide clarity for the commissioner, businesses and the general public on the commissioner’s objectives and duties. I therefore commend clause 27 to the Committee.
The importance to any data protection regime of an independent, well-functioning regulator cannot be overstated. The ICO, which is soon to be the Information Commission as a result of this Bill, is no exception to that rule. It is a crucial piece of the puzzle in our regime to uphold the information rights set out in regulation. Importantly, it works in the interests of the general public. The significance of an independent regulator is also recognised by the European Commission, which deems it essential to any adequacy agreement. The general duties of our regulator, such as those set out in this clause, are therefore vital because they form the foundations on which it operates and the principles to which it must be accountable.
Although the duties are more an indicator of overarching direction than a prescriptive list of duties, they should still aim to reflect the wide range of tasks that the regulator carries out and the values with which they do so. On the whole, the clause does this well. Indeed, the principal objective for the commissioner set out in this clause, which is
“to secure an appropriate level of protection for personal data, having regard to the interests of data subjects, controllers and others and matters of general public interest, and…to promote public trust and confidence in the processing of personal data”
is a good overarching starting point. It simply outlines the basic functions of the regulator that we should all be able to get behind, even if the Bill itself does disappointingly little to encourage the promotion of public trust in data processing.
It is particularly welcome that the principal objective includes specific regard to
“matters of general public interest.”
This should cover things like the need to consider sustainability and societal impact. However, it is a shame that that is not made explicit among the sub-objectives, which require the commissioner to have regard to the likes of promoting innovation and safeguarding national security. That would have ingrained in our culture a desire to unlock data for the wider good, not just for the benefit of big tech. Overall, however, the responsibilities set out in the clause, and the need to report on fulfilling them, seem to reflect the task and value of the regulator fairly and accurately.
I think that was slightly qualified support for the clause. Nevertheless, we welcome the support of the Opposition.
Question put and agreed to.
Clause 27 accordingly ordered to stand part of the Bill.
Clause 28
Strategic priorities
Clause 28 provides a power for the Secretary of State to prepare a statement of strategic priorities relating to data protection as part of the new strategic framework for the Information Commissioner. The statement will contain only the Government’s data protection priorities, and the Secretary of State may choose to include both domestic and international priorities. That will enable the Government to provide a transparent statement of how their data protection priorities fit in with their wider agenda, giving the commissioner, we hope, helpful context.
Although the commissioner must take the statement into account when carrying out his functions, he is not required to act in accordance with it. That means that the statement cannot be used to direct what the commissioner may and may not do. Once the statement is drafted, the Secretary of State will be required to lay it before Parliament, where it will be subject to the negative resolution procedure before it can be designated. The commissioner will need to consider the statement when carrying out functions under the data protection legislation, except functions relating to a particular person, case or investigation.
Once designated, the commissioner will be required to respond to the statement, outlining how he intends to consider it in future data protection work. The commissioner will also be required to report on how he has considered the statement in his annual report. I commend the clause to the Committee.
Clause 28 requires that every three years the Secretary of State publish a statement of strategic priorities for the commissioner to consider, respond to, and have regard to. The statement would be subject to the negative resolution procedure in Parliament, and the commissioner would be obliged to report on what they have done to comply with it annually. Taken in good faith, I see what the clause was intended to achieve. It is, of course, important that the Government’s data priorities are understood by the commissioner. It is also vital that we ensure that the regulator functions in line with the most relevant issues of the day, given the rapidly evolving landscape of technology.
A statement of strategic priorities could, in theory, allow the Government to set out their priorities on data policy in a transparent way, allowing both Ministers and the ICO to be held accountable for their relationship. However, there is and must be a line drawn between the ICO understanding the modern regulatory regime that it will be expected to uphold and political interference in the activities and priorities of the ICO. The Open Rights Group, among others, has expressed concern that the introduction of a statement of strategic priorities could cross that line, exposing the ICO to political direction, making it subject to culture wars and leaving it vulnerable to corporate capture or even corruption.
Although the degree to which those consequences would become a reality given the current strength of our regulator might be up for debate, the very concept of the Government setting out a statement of strategic priorities that must be adhered to by the commissioner at the very least requires the ICO to follow some sort of politically led direction, which seems at odds with its independence. As I have already argued, an independent ICO is vital not only directly, for data subjects to be sure that their rights will be implemented and for controllers to be sure of their obligations, but indirectly, as a crucial component of our EU adequacy agreement.
Even though the clause may not be intended to threaten independence, we must be extremely careful not to unintentionally embark on a slippery slope, particularly as there are other mechanisms for ensuring that the ICO keeps up with the times and has a transparent relationship with Government. In 2022, the ICO published its new strategic plan, ICO25, which sets out why its work is important, what it wants to be known for and by whom, and how it intends to achieve that by 2025. It describes the ICO’s purpose, objectives and values and the shift in approach that it aims to achieve through the life of the plan, acknowledging that its work is
“complex, fast moving and ever changing.”
The plan was informed by extensive stakeholder consultation and by the responsibilities that the ICO has been given by Parliament. There are therefore ways for the ICO to communicate openly with Government, Parliament and other relevant stakeholders to ensure that its direction is in keeping with the most relevant challenges and with updates to legislation and Government activity. Ministers might have been better off encouraging transparent reviews, consultations and strategies of that kind, rather than prompting any sort of interference from politicians with the ICO’s priorities.
We agree about the importance of the independence of the Information Commissioner, but I do not think that the statement, as we have set out, is an attempt to interfere with that. I remind the hon. Lady that in relation to the statement of strategic priorities, she asked the Information Commissioner himself:
“Do you perceive that having any impact on your organisation’s ability to act independently of political direction?”,
and he replied:
“No, I do not believe it will undermine our independence at all.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 6, Q3.]
The Minister is right to quote the evidence session, but he will perhaps also remember that in a later session Ms Irvine from the Law Society of Scotland said that she was surprised by the answer given by the Information Commissioner.
Ms Irvine may have been surprised. I have to say that we were not. What the Information Commissioner said absolutely chimed with our view of the statement, so I am afraid on this occasion I will disagree with the Law Society of Scotland.
Question put, That the clause stand part of the Bill.
With this it will be convenient to discuss:
Clause 30 stand part.
Amendment 111, in clause 31, page 56, line 30, leave out lines 30 and 31 and insert—
“(6) If the Commissioner submits a revised code under subsection (5)(b), the Secretary of State must approve the code.”
This amendment seeks to limit the ability of the Secretary of State to require the Commissioner to provide a revised code to only one occasion, after which the Secretary of State must approve the revised code.
Clause 31 stand part.
Given the significant number of ways in which personal data can be used, we believe that it is important that the regulator provides guidance for data controllers, particularly on complex and technical areas of the law, and that the guidance should be accessible and should enable organisations to comply with the legislation efficiently and easily. We are therefore making a number of reforms to the process by which the Information Commissioner produces statutory codes of practice.
Clause 29 is a technical measure that ensures that all statutory codes of practice issued under the Data Protection Act 2018 follow the same parliamentary procedures, have the same legal effect, and are published and kept under review by the Information Commissioner. Under sections 121 to 124 of the Data Protection Act, the commissioner is obliged to publish four statutory codes of practice: the data sharing code, the direct marketing code, the age-appropriate design code, and the data protection and journalism code. The DPA includes provisions concerning the parliamentary approval process, requirements for publication and review by the commissioner, and details of the legal effect of each of the codes. So far, the commissioner has completed the data sharing code and the age-appropriate design code.
Section 128 of the Act permits the Secretary of State to make regulations requiring the Information Commissioner to prepare other codes that give guidance as to good practice in the processing of personal data. Those powers have not yet been used, but may be useful in the future. However, due to the current drafting of the provisions, any codes required by regulations made by the Secretary of State and issued by the commissioner would not be subject to the same formal parliamentary approval process or review requirements as the codes issued under sections 121 to 124. In addition, they do not have the same legal effect, and courts and tribunals would not be required to take a relevant provision of the code into account when determining a relevant question. Clearly, it is not appropriate to have two different standards of statutory codes of practice. To address that, clause 29 replaces the original section 128 with new section 124A, so that codes required in regulations made by the Secretary of State follow a similar procedure to codes issued under sections 121 to 124.
New section 124A provides the Secretary of State with the power to make regulations requiring the commissioner to produce codes of practice giving guidance as to good practice in the processing of personal data. Before preparing any code, the commissioner must consult the Secretary of State and other interested parties such as trade associations, data subjects and groups representing data subjects. That is similar to the consultation requirements for the existing codes. The parliamentary approval processes and requirements for the ICO to keep existing codes under review are also extended to any new codes required by the Secretary of State. The amendment also ensures that those codes requested by the Secretary of State have the same legal effect as those set out on the face of the DPA.
Clauses 30 and 31 introduce reforms to the process by which the commissioner develops statutory codes of practice for data protection. They require the commissioner to undertake and publish impact assessments, consult a panel of experts during the development of a code, and submit the final version of a code to the Secretary of State for approval. Those processes will apply to the four statutory codes that the commissioner is already required to produce and to any new statutory codes on the processing of personal data that the commissioner is required to prepare under regulations made by the Secretary of State.
The commissioner will be required to set up and consult a panel of experts when drafting a statutory code. That panel will be made up of relevant stakeholders and, although the commissioner will have discretion over its membership, he or she will be required to explain how the panel was chosen. The panel will consider a draft of a statutory code and submit a report of its recommendations to the commissioner. The commissioner will be required to publish the panel’s response to the code and, if he chooses not to follow a recommendation, the reasons must also be published.
Clause 30 also requires the commissioner to publish impact assessments setting out who will be affected by the new or amended code and the impact it will have on them. While the commissioner currently carries out impact assessments when developing codes of practice, we believe that there are advantages to formalising an approach on the face of the legislation to ensure consistency.
Given the importance of the statutory codes, we believe it is important that there is a further degree of democratic accountability within the process. Therefore, clause 31 requires the commissioner to submit the final version of a statutory code to the Secretary of State for approval.
On that basis, I commend the relevant clauses to the Committee, but I am aware that the hon. Member for Barnsley East wishes to propose an amendment.
I turn first to clauses 29 and 30. Codes of practice will become increasingly important as the remit of the ICO expands and modernises. As such, it is important that the codes are developed in a way that is conducive to the product being as effective and useful as possible.
Although the ICO already carries out impact assessments for new codes of practice, it does so only as best practice, without any statutory underpinning. It is therefore pleasing to see clauses that will require consistency and high standards when developing new codes, ensuring that the resulting products are as comprehensive and helpful as possible. It is welcome, for example, to see that experts will be consulted in the process of developing these codes, including Government officials, trade associations and data subjects. It is also good to see that the commissioner will be required to publish a statement relating to the establishment of the expert panel, including how and why members were selected.
I welcome the support of the Opposition for many of the principles contained in the clauses. I turn to amendment 111, tabled by the hon. Lady. As the clause sets out, once the commissioner submits the final version of the code, the Secretary of State decides whether to approve it. If they do approve the code, it will be laid before Parliament for final approval. If they do not, they are required to publish their reasons.
The amendment would place a limit on that, so that the Secretary of State would be able to reject the final version of the code only once. If the code is revised by the commissioner in the light of the comments of the Secretary of State and resubmitted, under the amendment the Secretary of State would have to lay the code in Parliament for final approval. Although I understand the concern behind the amendment, we do not believe it to be justified. I understand that the hon. Lady does not want a code to be rejected multiple times, but we regard this as a final safeguard and it will be fully transparent. We are absolutely committed to maintaining the commissioner’s independence, but we think it also important that the Government have the opportunity to give a view before the code is laid before Parliament and for Parliament to give final approval. The amendment would unduly limit the Government’s ability to provide as necessary that further degree of democratic accountability.
The hon. Lady referred to the importance of maintaining adequacy, which we have already touched on. I fully share her view on its importance to the wider functioning of the economy, but when she raised the matter with the Information Commissioner he did not believe that it posed any risk. Indeed, he went on to point out:
“A failure of the Secretary of State to table and issue a proposed code would not affect the way in which the commissioner discharges his or her enforcement functions. We would still be able to investigate matters and find them in breach, regardless of whether that finding was consistent with the Secretary of State’s view of the law.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 6-7, Q4.]
On that basis, we think that there should be the ongoing ability for the Secretary of State—and, through the Secretary of State, Parliament—to approve the final version of the code, but we do not feel that this interferes with the Information Commissioner’s ability to carry out his functions, nor does it represent any view as to our adequacy agreement.
Taking advantage of your invitation, Mr Hollobone, I shall speak only briefly. The UK’s data protection framework allows a data subject or data protection officer to make a request to the Information Commissioner for information concerning the exercise of their data protection rights. The commissioner is expected to respond to a data subject or data protection officer and make no charge in the majority of cases, but the commissioner can refuse to respond or charge a reasonable fee for a response to a request when it is “manifestly unfounded or excessive”. Clause 7 changes the “manifestly unfounded or excessive” threshold for all requests from data subjects across the UK data protection framework to “vexatious or excessive”. Clause 32 replicates that language, inserting the same new threshold into section 135 of the Data Protection Act 2018, to ensure that the Information Commissioner’s exemption is consistent across the legislation. I urge the Committee to agree to the clause.
The new threshold contained in the clause has been discussed in debates under clause 7, and I refer hon. Members to my remarks in those debates, as many of the same concerns apply. The guidance that will be needed to interpret the terms “vexatious” and “excessive” should be no less applicable to the Information Commissioner, whose co-operation with data subjects and transparency should be exemplary, not least because the functioning of the regulator inherently sets an example for other organisations on how the rules should be followed.
Question put and agreed to.
Clause 32, as amended, accordingly ordered to stand part of the Bill.
Clause 33
Analysis of performance
Question proposed, That the clause stand part of the Bill.
Clause 33 introduces the requirement for the Information Commissioner to prepare and publish an analysis of their performance, using key performance indicators. The regulator will be required to publish that analysis at least annually. The commissioner will have the discretion to decide which factors effectively measure their performance.
Improving the commissioner’s monitoring and reporting mechanisms will strengthen their accountability to Parliament, organisations and the public, who have an interest in the commissioner’s effectiveness. Performance measurement will also have benefits for the commissioner, including by supporting their work of measuring progress towards their objectives and ensuring that resources are prioritised in the right areas. I urge that clause 33 stand part of the Bill.
I welcome the clause, as did the majority of respondents who supported the proposal in the “Data: a new direction” consultation. As recognised by the Government’s response to their consultation, respondents felt the proposal would allow for the performance of the ICO to be assessed publicly and provide evidence of how the ICO is meeting its statutory obligations. We should do all we can to promote accountability, transparency and public awareness of the obligations and performance of the ICO. The clause allows for just that.
Question put and agreed to.
Clause 33 accordingly ordered to stand part of the Bill.
Clause 34
Power of the Commissioner to require documents
Question proposed, That the clause stand part of the Bill.
This is a slightly chunkier set of clauses and amendments, so I will not be as brief as in the last two debates.
Clause 34 is a clarificatory amendment to the Information Commissioner’s powers in section 142 of the Data Protection Act to require information. Its purpose is to clarify the commissioner’s existing powers to put it beyond doubt that the commissioner can require specific documents as well as information when using the information notice power. Subsections (3) to (7) of the clause make consequential amendments to references to information notices elsewhere in the Data Protection Act.
Clause 35 makes provision for the Information Commissioner to require a data controller or processor to commission a report from an approved person on a specified matter when exercising the power under section 146 of the Data Protection Act to issue an assessment notice. The aim of the power is to ensure that the regulator can access information necessary to its investigations.
In the event of a data breach, the commissioner is heavily dependent on the information that the organisation provides. If it fails to share information—for example, because it lacks the capability to provide it—that can limit the commissioner’s ability to conduct a thorough investigation. Of course, if the organisation is able to provide the necessary information, it is not expected that the power would be used. The commissioner is required to act proportionately, so we expect that the power would be used only in a small minority of investigations, likely to be those that are particularly complex and technical in nature.
Clause 36 grants the Information Commissioner the power to require a person to attend an interview and answer questions when investigating a suspected failure to comply with data protection legislation. At the moment, the Information Commissioner can only interview people who attend voluntarily, which means there is a heavy reliance on documentary evidence. Sometimes that is ambiguous or incomplete and can lead to uncertainty. The ability to require a person to attend an interview will help to explain an organisation’s practices or evidence submitted, and circumvent a protracted and potentially fruitless series of back-and-forth communication via information notices. The power is based on existing comparable powers for the Financial Conduct Authority and the Competition and Markets Authority.
Clause 37 amends the provisions for the Information Commissioner to impose penalties set out in the Data Protection Act. It will allow the commissioner more time, where needed, to issue a final penalty notice after issuing a notice of intent. At the moment the Act requires the commissioner to issue a notice of intent to issue a penalty notice; the commissioner then has up to six months to issue the penalty notice unless an extension is agreed. That can prove difficult in some cases—for instance, if the organisation under investigation submits new evidence that affects the case at a late stage, or when the legal representations are particularly complex. The clause allows the regulator more time to issue a final penalty notice after issuing a notice of intent, where that is needed. That will benefit business, as it means the commissioner can give organisations more time to prepare their representations, and will result in better outcomes by ensuring that the commissioner has sufficient time to assess representations and draw his conclusions.
Clause 38 introduces the requirement for the Information Commissioner to produce and publish an annual report on regulatory activity. The report will include the commissioner’s investigatory activity and how the regulator has exercised its enforcement powers. That will lead to greater transparency of the commissioner’s regulatory activity.
Clauses 34 to 37, as I said, make changes to the Data Protection Act 2018 in respect of the Information Commissioner’s enforcement powers. Consequential on clauses 35 and 36, clause 42 makes changes to the Electronic Identification and Trust Services for Electronic Transactions Regulations 2016, known as the EITSET regulations. The EITSET regulations extend and modify the Information Commissioner’s enforcement powers to apply to its role as the supervisory body for trust service providers under the UK regulations on electronic identification and trust services for electronic transactions, known as the UK eIDAS. Clause 42 amends the EITSET regulations to ensure that the new enforcement powers introduced by clauses 34 to 37 are available to the Information Commissioner for the purposes of regulating trust service providers.
The new powers will help to ensure that the Information Commissioner is able to access the evidence needed to inform investigations. The powers will result in more informed investigations and, we believe, better outcomes. Clause 42 ensures that the Information Commissioner will continue to be able to act as an effective supervisory body for trust service providers established in the UK.
Government amendment 47 amends schedule 2 to the EITSET regulations. The amendment is consequential on the amendment of section 155(3)(c) of the Data Protection Act made by schedule 4 to the Bill. The amendment to schedule 2 will remove the reference to consultation under section 65 of the Data Protection Act when section 155 is applied. It is necessary to remove the reference to section 65 when section 155 is applied with modification under schedule 2, as the consultation requirements under that section are not relevant to the regulation of trust service providers under the UK eIDAS.
I hope that that is helpful to Members in explaining the merits of our approach to ensuring that the Information Commissioner has the right enforcement tools at its disposal and continues to be an effective and transparent regulator. I commend the clauses and Government amendment 47 to the Committee.
I will speak to each of the relevant clauses in turn. On clause 34, I am satisfied that the clarification that the Information Commissioner can require documents as well as information is necessary and will be of use to the regulator. I am therefore pleased to accept the clause as drafted and to move on to the other clauses in this part.
Clause 35 provides for the commissioner to require an approved person to prepare a report on a specified matter, as well as for the commissioner to provide statutory guidance on, first, the factors it considers when deciding to require such a report and, secondly, the factors it considers when determining who the approved person might be. That power to commission technical reports is one that the vast majority of respondents to the “Data: a new direction” consultation supported, as they felt it would lead to better-informed ICO investigations. Any measures that help the ICO to carry out its duties rigorously and to better effect, while ensuring that relevant safeguards apply, are measures that I believe Members across the Committee will want to support.
In the consultation, however, the power was originally framed as a power to commission a “technical report”, implying that it would be limited to particularly complex and technical investigations where there is significant risk of harm or detriment to data subjects. Although the commissioner is required to produce guidance on the circumstances in which a report might be required, I would still like clarification from the Minister as to why such a limit was not included in the Bill as drafted. Does he expect it to be covered by the guidance produced by the ICO? Such a clarification is necessary not because we are against clause 35 in principle, but in acknowledgement that the ICO’s powers—indeed, enforcement powers generally—must always be proportionate to the task at hand.
Furthermore, some stakeholders have said that it is unclear whether privilege will attach to reports required by the ICO and whether they may be disclosable to third parties who request copies of them. Greater clarity about how the power will operate in practice would therefore be appreciated.
Turning to clause 36, it is a core function of the ICO to monitor and enforce the UK’s data protection legislation and rules, providing accountability for the activities of all controllers, processors and individuals. To fulfil that function, the ICO may have to conduct an investigation to establish a body of evidence and determine whether someone has failed to comply with the legislation. The Government’s consultation document said that the ICO sometimes faces problems engaging organisations in those investigations, despite their having a duty to co-operate fully, especially in relation to interviews, as many people are nervous of negative consequences in their life or career if they participate in one. However, interviews are a crucial tool for investigations, as not all the relevant evidence will be available in written form. Indeed, that may become even more the case after the passing of this Bill, because of the reduced requirements to keep records, conduct data protection impact assessments and assign data protection officers—all of which currently contribute to the pool of documentation tracking data processing.
Clause 36, which will explicitly allow the ICO to compel witnesses to comply with interviews as part of an investigation, will, where necessary, ensure that as much relevant evidence as possible is obtained to inform the ICO’s judgment. That is something that we absolutely welcome. It is also welcome to see the safeguards that will be put in place under this clause, including the right not to self-incriminate and exemptions from giving answers that would infringe legal professional privilege or parliamentary privilege. That will ensure that the investigatory powers of the ICO stay proportionate to the issues at hand. In short, clause 36 is one that I am happy to support. After all, what is the purpose of us ensuring that data protection legislation is fit for purpose here today if the ICO is unable to actually determine whether anyone is complying?
On clause 37, it seems entirely reasonable that the ICO may require more than the standard six months to issue a penalty notice in particularly complex investigations. Of course, it remains important that the operations of the ICO are not allowed to slow unduly in cases where a penalty can be issued in the usual timeframe, but where the subject matter is particularly complicated, it makes sense to allow the ICO an extension to enable the investigation to be concluded in the proper, typically comprehensive manner. Indeed, complex investigations may be more common as we adjust to the new data legislation and a rapidly evolving technological landscape. By conducting the investigations properly and paying due attention to particularly technical issues, new precedents can be set that will speed up the regulator’s processes on the whole. Clause 37 is therefore welcomed by us, as it was by the majority of respondents to the Government’s consultation.
Turning to clause 38, as we have said multiple times throughout the progress of this Bill and in Committee, transparency and data protection should go hand in hand. Requiring the ICO to publish information each year on the investigations it has undertaken and the powers it has used will embed a further level of transparency into the regulatory system. Transparency breeds accountability, and requiring the regulator to publish information on the powers it is using will encourage such powers to be used proportionately and appropriately. Publishing an annual report with that information should also give us a better idea of how effectively the new regulatory regime is working. For example, a high volume of cases on a recurring issue could indicate a problem within the framework that needs addressing. Overall, it is welcome that Parliament and the public should be privy to information about how the ICO is discharging its regulatory functions. As a result, I am pleased to support clause 38.
Finally, the amendments to clause 42 are of a consequential nature, and I am happy to proceed without asking any further questions about them.
I am most grateful to the hon. Lady for welcoming the vast majority of the provisions within these clauses. She did express some concern about the breadth of the powers available to the Information Commissioner, but I point out that they are subject to a number of safeguards defining how they can be used. The commissioner is required to publish how he will exercise his powers, and that will provide organisations with clarity on the circumstances in which they are to be used.
As the hon. Lady will be aware, like other regulators, the Information Commissioner is subject to the duty under the Legislative and Regulatory Reform Act to exercise their functions
“in a way which is transparent, accountable, proportionate and consistent”,
and,
“targeted only at cases in which action is needed.”
There will also be a right of appeal, which is consistent with the commissioner’s existing powers. On that basis, I hope that the hon. Lady is reassured.
Question put and agreed to.
Clause 34 accordingly ordered to stand part of the Bill.
Clauses 35 to 38 ordered to stand part of the Bill.
Clause 39
Complaints to controllers
With this it will be convenient to discuss the following:
Clauses 40 and 41 stand part.
That schedule 8 be the Eighth schedule to the Bill.
These three clauses, together with schedule 8, streamline and clarify complaint routes for data subjects by making the respective rights and responsibilities of data controllers and data subjects clear in legislation. The measures will reduce the volume of premature complaints to the Information Commissioner, and give an opportunity to controllers to resolve complaints before they are escalated to the regulator.
Clause 39 enables data subjects to complain to a data controller if they believe that there has been an infringement of their data protection rights, and creates a duty for data controllers to facilitate the making of complaints by taking appropriate steps, such as providing a complaints form. The requirement will encourage better conversations and more dialogue between data subjects and data controllers. It will formalise best practice, and align with the standard procedures of other ombudsman services, which require complainants to seek to resolve an issue with the relevant organisation before escalation. The clause also introduces a regulation-making power for the Secretary of State to require controllers to notify the Information Commissioner of the number of complaints made to them in circumstances specified in the regulations.
Clause 40 provides the Information Commissioner with a new power to refuse to act on certain data protection complaints if certain conditions are met, specifically if the complaint has not been made to the relevant controller; the controller has not finished handling the complaint and less than 45 days have elapsed since it was made; or the complaint is considered vexatious or excessive, as defined in the Bill. For example, that could be the case with a complaint that repeats a previous complaint made by the data subject to the commissioner. The power is in addition to the discretion that the commissioner can already exercise to “take appropriate steps” to respond to a complaint and investigate it “to the extent appropriate.” The clause requires the Information Commissioner to publish guidance about how it will respond to complaints and exercise its power to refuse to act on complaints. Finally, the clause also outlines the process for appeals if the commissioner refuses to act on a data protection complaint.
Clause 41 introduces schedule 8, which contains miscellaneous minor and consequential amendments to the UK General Data Protection Regulation and the Data Protection Act relating to complaints by data subjects.
Schedule 8 makes consequential amendments to the UK GDPR and the DPA relating to complaints by data subjects, which will ensure consistency across data protection legislation in relation to the changes to the complaints framework under clauses 39 and 40.
I am looking for some clarification from the Minister. Clause 39 says:
“A controller must facilitate the making of complaints…such as providing a complaint form which can be completed electronically and by other means.”
Can the Minister clarify whether every data controller will have to provide an electronic means of making a complaint? For many small data controllers, which would include many of us in the room, providing an electronic means of complaint might require additional expertise and cost that they may not have. If it said, “and/or by other means”, which would allow a data controller to provide a paper copy, that might provide a little more reassurance to data controllers.
Let me address the point of the hon. Member for Glasgow North West first. The intention of the clause is to ensure that complainants go first to the data controller, and the data controller makes available a process whereby complaints can be considered. I certainly fully understand the concern of the hon. Lady that it should not prove burdensome, particularly for small firms, and I do not believe that it would necessarily require an electronic means to do so. If that is not the case, I will tell her, but it seems to me that the sensible approach would be for data controllers to have a process that the Information Commissioner will accept is available to complainants first, before a complaint is possibly escalated to the next stage.
With regard to the point of the hon. Member for Barnsley East, we have debated previously the change in the threshold to “vexatious” and “excessive”, and we may continue to disagree on that matter.
Question put and agreed to.
Clause 39 accordingly ordered to stand part of the Bill.
Clauses 40 and 41 ordered to stand part of the Bill.
Schedule 8 agreed to.
Clause 42
Consequential amendments to the EITSET Regulations
Amendment made: 47, Clause 42, page 72, line 12, at end insert—
“(7A) In paragraph 13 (modification of section 155 (penalty notices)), in sub-paragraph (3)(c), for “for “data subjects”” there were substituted “for the words from “data subjects” to the end”.”.—(Sir John Whittingdale.)
This amendment inserts an amendment of Schedule 2 to the EITSET Regulations which is consequential on the amendment of section 155(3)(c) of the Data Protection Act 2018 by Schedule 4 to the Bill.
Clause 42, as amended, ordered to stand part of the Bill.
Clause 43
Protection of prohibitions, restrictions and data subject’s rights
Question proposed, That the clause stand part of the Bill.
Clause 43 is a technical measure that creates a presumption that our data protection laws should not be overridden by future laws that relate to the processing of personal data, but it respects parliamentary sovereignty by ensuring that Parliament can depart from this presumption in particular cases if it deems it appropriate to do so. For example, if new legislation permitted or required an organisation to share personal data with another for a particular purpose, the default position in the absence of any specific indication to the contrary would be that the data protection legislation would apply to the new arrangement.
Will my right hon. Friend confirm that the provision will also apply to trade agreements? Certainly in the early stages of the negotiations for a UK-US trade agreement, the United States Government sought to include various provisions relating to tech policy. In such a scenario, would this legislation take precedence over anything written into a trade agreement?
That would certainly be my interpretation. I do not see that a trade agreement could possibly overturn an Act of Parliament unless Parliament specifically sets out that it intends that that should be the case. This is a general protection, essentially saying that in all future cases data protection legislation applies unless Parliament specifically indicates that that should not be the case.
Until now, ensuring that any new data protection measures are read consistently with the data protection legislation has relied either on the inclusion of express provision to that effect in new data processing measures, or on general rules of interpretation. There are risks to that situation. Including relevant provisions in each and every new data processing measure is onerous, and such provisions could be inadvertently omitted. General rules of interpretation can be open to different interpretations by courts, particularly in the light of legal challenges following our exit from the European Union. That creates the potential for legal uncertainty and, as a result, could lead to a less effective and comprehensive data protection legislative framework.
Clause 43 creates a presumption that any future legislation permitting the processing of personal data will be subject to the key requirements of the UK’s data protection legislation unless clear provisions are made to the contrary. This is a technical but necessary measure and I commend it to the Committee.
I understand that the clause contains legal clarifications relating to the interaction of data protection laws with other laws. On that basis, I am happy to proceed.
Question put and agreed to.
Clause 43 accordingly ordered to stand part of the Bill.
Clause 44
Regulations under the UK GDPR
Question proposed, That the clause stand part of the Bill.
The clause outlines the process and procedure for making regulations under powers in the UK GDPR. Such provision is needed because the Bill introduces regulation-making powers into the GDPR. There is an equivalent provision in section 182 of the Data Protection Act. Among other things, the clause makes it clear that, before making regulations, the Secretary of State must consult the Information Commissioner and such other persons as they consider appropriate, other than when the made affirmative procedure applies. In such cases, the regulations can be made before Parliament has considered them, but cannot remain as law unless approved by Parliament within a 120-day period.
I am sure that the Committee will be pleased to learn that we have now completed part 1 of the Bill. [Hon. Members: “Hear, hear!”]
Clause 46 provides an overview of the provisions in part 2 that are aimed at securing the reliability of digital verification services through a trust framework, a public register, an information gateway and a trust mark.
Clause 47 will require the Secretary of State to prepare and publish the digital verification services trust framework, a set of rules, principles, policies, procedures and standards that an organisation that wishes to become a certified and registered digital verification service provider must follow. The Secretary of State must consult the Information Commissioner and other appropriate persons when preparing the trust framework; that consultation requirement can be satisfied ahead of the clause coming into force. The Secretary of State must review the trust framework every 12 months and must consult the Information Commissioner and other appropriate persons when carrying out the review. I commend both clauses to the Committee.
Clause 46 defines digital verification services. Central to the definition, and to the framing of the debate on part 2, is the clarification that they are
“services that are provided at the request of an individual”.
That is a crucial distinction: digital verification services and the kinds of digital identity that they enable are not the same as any kind of Government-backed digital ID card, let alone a compulsory one. As we will discuss, it is important that any such services are properly regulated and can be relied on. However, the clause seems to set out a sensible definition that clarifies that all such services operate at individual request and are entirely separate from universal or compulsory digital identities.
I will speak in more depth about clause 47. As we move towards an increasingly digitally focused society, it makes absolute sense that someone should be able, at their own choice, to prove their identity online as well as in the physical world. Providing for a trusted set of digital verification services would facilitate just that, allowing people to prove with security and ease who they are for purposes including opening a bank account or moving house, akin to using physical equivalents like a passport or a proof of address such as a utility bill. It is therefore understandable that the Government, building on their existing UK digital identity and attributes trust framework, want to legislate so that the full framework can be brought into law when it is ready.
In evidence to the Committee, Keith Rosser highlighted the benefits that a digital verification service could bring, using his industry of work and employment as a live case study. He said:
“The biggest impact so far has been on the speed at which employers are able to hire staff”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 52, Q112.]
In a study of 70,000 hires, the digital identity route took an average time of three minutes and 30 seconds, saving about a week compared with having to meet with an employer in person to provide physical documents. That has benefits not only to the individuals, who can start work a week earlier, but to the wider economy, since the same people will start contributing to taxation and their local economy a week earlier too.
Secondly, Keith identified that digital verification could open up remote jobs to people living in areas where employment opportunities are harder to come by. In theory, someone living in my constituency of Barnsley East could be hired in a role that would previously have been available only in London, thanks to their ability to prove who they are without ever having to meet their employer in person.
In the light of those benefits, as well as the potential reduction in fraud from cutting down on the usability of fake documents, in principle it seems only logical to support a framework that would allow trusted digital verification services to flourish. However, the key is to ensure that the framework breeds the trust necessary to make it work. In response to the digital identity call for evidence in 2019, the Government identified that a proportion of respondents were concerned about their privacy when it came to digital verification, saying that without assurances on privacy protections it would be hard to build trust in those systems. It is therefore curious that the Government have not accompanied their framework with any principles to ensure that services are designed and implemented around user needs and that they reflect important privacy and data protection principles.
Can the Minister say why the Government have not considered placing the nine identity assurance principles on the statute book, for example, to be considered when legislating for any framework? Those principles were developed by the Government’s own privacy and consumer advisory group back in 2014; they include ensuring that identity assurance can take place only where consent, transparency, multiplicity of choice, data minimisation and dispute resolution procedures are in place. That would give people the reassurance to trust that the framework is in keeping with their needs and rights, as well as those of industry.
Furthermore, can the Minister explain how the Government intend to ensure that digital verification never becomes the only option in any circumstance, making it mandatory in effect? As Big Brother Watch points out, digital identity is not a practical or desired option for everyone, particularly vulnerable or marginalised groups. Elderly people may not be familiar with such technology, while others might be priced out of it, especially given the recent inflation-driven rises in the cost of broadband and mobile bills. Although we must embrace the opportunities that technology can provide in identity verification, there must also be the ability to opt out and use offline methods of identification where needed, or we will risk excluding people from key activities such as jobseeking.
Finally, I look forward to hearing more about the governance of digital verification services and the framework. The Bill does not provide a statutory basis for the new office for digital identities and attributes, and there is therefore no established body for the functions related to the framework. It is important that when the new office is established, there is good communication from Government about its powers, duties, functions and funding model. After all, the framework and the principles it supports are only as strong as their enforcement.
Overall, I do not wish to stand in the way of this part of the Bill, with the caveat that I am keen to hear from the Minister on privacy protections, on the creation of the new office and on ensuring that digital verification is the beginning of a new way of verifying one’s identity, not the end of any physical verification options.
It is a pleasure to follow my hon. Friend the Member for Barnsley East. I have some general comments, which I intend to make now, on the digital verification services framework introduced and set out in clause 46. I also have some specific comments on subsequent clauses; I will follow your guidance, Mr Hollobone, if it is your view that my comments relate to other clauses and should be made at a later point.
Like my hon. Friend, I recognise the importance of digital verification services and the many steps that the Government are taking to support them, but I am concerned about the lack of coherence between the steps set out in the Bill and other initiatives, consultations and activities elsewhere in Government.
As my hon. Friend said, the Government propose to establish an office for digital identities and attributes, which I understand is not a regulator as such. It would be good to have clarity on the position, as there is no discussion in the Bill of the duties of the new office or any kind of mechanisms for oversight or appeal. What is the relationship between the office for digital identities and attributes and this legislation? The industry has repeatedly called for clarity on the issue. I think we can all agree that a robust and effective regulatory framework is important, particularly as the Bill confers broad information-gathering powers on the Secretary of State. Will the Minister set out his vision and tell us how he sees the services being regulated, what the governance model will be, how the office—which will sit, as I understand it, in the Department for Science, Innovation and Technology—will relate to this legislation, and whether it will be independent of Government?
Will the Minister also help us to understand the relationship between the digital verification services set out in the Bill and other initiatives across Government on digital identity, such as the Government Digital Service’s One Login service, which we understand will be operated across Government services, and the initiatives of the Home Office’s fraud strategy? Is there a relationship between them, or are they separate initiatives? If they are separate, might that be confusing for the sector? I am sure the Minister will agree that we in the UK are fortunate to have world leaders in digital verification, including iProov, Yoti and Onfido. I hope the Minister agrees that for those organisations to continue their world-leading role, they need clarification and understanding of the direction of Government and how this legislation relates to that direction.
Finally, I hope the Minister will agree that digital identity is a global business. Will he say a few words about how he has worked with, or is working with, other countries to ensure that the digital verification services model set out in this legislation is complementary to other services and interoperable as appropriate, and that it builds on the lessons of other digital verification services?
I am grateful to the hon. Member for Barnsley East for setting out the Opposition’s general support for the principle of moving towards the facilitation of digital verification services. She set out some of the benefits that such services can provide, and I completely echo her points on that score. I reiterate the central point that none of this is mandatory: people can choose to use digital verification services, but there is no intention to make them compulsory.
The trust framework has been set out with a wide range of principles and standards, to which privacy is central. The hon. Member for Barnsley East is right that that will be necessary to obtain trust from people seeking to use the services. She and the hon. Member for Newcastle upon Tyne Central have both set out detailed questions about the operation of the new office and the work alongside other Government Departments. I would like to respond to their points but, given that we are about to break, we could accept the general principle of this clause and then discuss them, no doubt in greater detail, in the debate on subsequent clauses. Will the Committee accept this clause with the assurance that we will address a lot of the issues just raised as we come to subsequent clauses in this part of the Bill?
Question put and agreed to.
Clause 46 accordingly ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
Data Protection and Digital Information (No. 2) Bill (Sixth sitting) Debate
John Whittingdale
Main Page: John Whittingdale (Conservative - Maldon)
(1 year, 5 months ago)
Public Bill Committees
Clauses 48 to 52 provide the Secretary of State with powers and duties relating to the governance and oversight of digital identities in the UK. Those functions will be carried out by the office for digital identities and attributes. I can tell the hon. Member for Newcastle upon Tyne Central that the office is a team of civil servants in the Department for Science, Innovation and Technology. The office will oversee certified organisations that provide trusted digital verification services, to ensure that the purpose of the legislation is being upheld as the market develops.
I appreciate the Minister’s clarification that the office will be a group of civil servants, but I do not see that set out in the Bill, in the clause that we are currently debating. Am I wrong?
As the office is an internal body, within the Department, I do not think that it would necessarily be specifically identified in the legislation in that way. If there is any more information on that, I will be happy to provide it to the hon. Lady in a letter, but the office is not a separate body to the Department.
I thank the Minister for providing greater clarification, but if the office is not a separate body, it cannot be claimed to be independent of Government, which means that the governance of digital verification services is not independent. Will he confirm that?
This is a function that will operate within Government. I do not think that it is one where there is any specific need for particular independence, but as I said, I am happy to supply further details about precisely how it will operate if that is helpful to the hon. Lady.
Let me move on from the precise operation of the body. Clause 53 sets out requirements for certified digital verification service providers in relation to obtaining top-up certificates where the Secretary of State revises and republishes the DVS trust framework.
Clause 48 provides that the Secretary of State must establish and maintain a register of digital verification service providers. The register must be made publicly available. The Secretary of State is required to add a digital verification service provider to the register, provided that it has met certain requirements. To gain a place on the register, the provider must first be certified against the trust framework by an accredited conformity assessment body. Secondly, the provider must have applied to be registered in line with the Secretary of State’s application requirements under clause 49. Thirdly, the provider must pay any fee set by the Secretary of State under the power in clause 50.
The United Kingdom Accreditation Service accredits conformity assessment bodies as competent to assess whether a digital verification service meets the requirements set out in the trust framework. That, of course, is an arm’s length body. Assessment is by independent audits, and successful DVS providers are issued with a certificate.
The Secretary of State is prohibited from registering a provider if it has not complied with the registration requirements. An application must be rejected if it is based on a certificate that has expired, has been withdrawn by the issuing body, or is required to be ignored under clause 53 because the trust framework rules have been amended and the provider has not obtained a top-up certificate in time. The Secretary of State must also refuse to register a DVS provider if the provider was removed from the register through enforcement powers under clause 52 and reapplies for registration while still within the specified removal period.
Clause 48(7) provides definitions for “accredited conformity assessment body”, “the Accreditation Regulation”, “conformity assessment body” and “the UK national accreditation body”.
Clause 49 makes provision for the Secretary of State to determine the form of an application for registration in the digital verification services register, the information that an application needs to contain, the documents to be provided with an application and the manner in which an application is to be submitted.
Clause 50 allows the Secretary of State to charge providers a fee on application to be registered in the DVS register. The fee amount is to be determined by the Secretary of State. The clause also allows the Secretary of State to charge already registered providers ongoing fees. The amount and timing of those fees are to be determined by the Secretary of State.
Clauses 51 and 52 confer powers and duties on the Secretary of State in relation to the removal of persons from the register. Clause 51 places a duty on the Secretary of State to remove a provider from the register if certain conditions are met. That will keep the register up to date and ensure that only providers that hold a certificate to prove that they adhere to the standards set in the framework are included in the register. Clause 52 provides a power to the Secretary of State to remove a provider from the register if the Secretary of State is satisfied that the provider is failing to provide services in accordance with the trust framework, or if it has failed to provide the Secretary of State with information as required by a notice issued under clause 58. Clause 52 also contains safeguards in respect of the use of that power.
Clause 53 applies where the Secretary of State revises and republishes the DVS trust framework to include a new rule or to change an existing rule and specifies in the trust framework that a top-up certificate will be required to show compliance with the new rule from a specified date.
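In outline, the registration conditions in clauses 48 to 53 amount to a short series of checks. The following Python sketch is purely illustrative: the class and field names are invented, and treating a top-up certificate as recertification against the latest framework version is a simplification of the statutory scheme, not how the Bill itself expresses it.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Certificate:
    framework_version: str   # trust framework version certified against (hypothetical field)
    expiry: date
    withdrawn: bool = False

@dataclass
class Provider:
    certificate: Certificate
    application_complete: bool                   # clause 49 application requirements met
    fee_paid: bool                               # clause 50 registration fee paid
    removal_period_ends: Optional[date] = None   # set if removed under clause 52

def may_register(p: Provider, today: date, current_version: str,
                 top_up_deadline: Optional[date]) -> bool:
    # Clause 52: a provider removed through enforcement may not reapply
    # while still within the specified removal period.
    if p.removal_period_ends is not None and today < p.removal_period_ends:
        return False
    # Clause 48: an application based on an expired or withdrawn certificate
    # must be rejected.
    if p.certificate.withdrawn or p.certificate.expiry <= today:
        return False
    # Clause 53: once the framework is revised, an old certificate must be
    # ignored from the specified date unless a top-up certificate was obtained
    # (modelled here, simplistically, as holding the current framework version).
    if (top_up_deadline is not None and today >= top_up_deadline
            and p.certificate.framework_version != current_version):
        return False
    # Remaining clause 48 conditions: a compliant application and the fee.
    return p.application_complete and p.fee_paid
```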
I hope that what I have set out is reasonably clear, and on that basis I ask that clauses 48 to 53 stand part of the Bill.
As has been mentioned, a publicly available register of trusted digital verification services is welcome; as a result, so is this set of clauses. A register of this kind will improve transparency for anyone wanting to use a digital verification service, as they will be able to confirm easily and freely whether the organisation that they hope to use complies with the trust framework.
However, the worth of the register relies on the worth of the trust framework, because only by getting the trust framework right will we be able to trust those that have been accredited as following it. That will mean including enough in the framework to assure the general public that their rights are protected by it. I am thinking of things such as data minimisation and dispute resolution procedures. I hope that the Department will consider embedding principles of data rights in the framework, as has been mentioned.
As with the framework, the detail of these clauses will come via secondary legislation, and careful attention must be paid to the detail of those measures when they are laid before Parliament. In principle, however, I have no problem with the provisions of the clauses. It seems sensible to enable the Secretary of State to determine a fee for registration, to remove a person from the register upon a change in circumstances, or to remove an organisation if it is failing to comply with the trust framework. Those are all functions that are essential to the register functioning well, although any fees should of course be proportionate to keep market barriers low and ensure that smaller players continue to have access. That facilitates competition and innovation.
Similarly, the idea of top-up certificates seems sensible. Members on both sides of the House have agreed at various points on the importance of future-proofing a Bill such as this, and the digital verification services framework should have space for modernisation and adaptation where necessary. Top-up certificates will allow for the removal of any organisation that is already registered but fails to comply with new rules added to the framework.
The detail of these provisions will be analysed as and when the regulations are introduced, but I will not object to the principle of an accessible and transparent register of accredited digital verification services.
I thank the Minister for clarifying the role of the office for digital identities and attributes. Some of the comments I made on clause 46 are probably more applicable here, but I will not repeat them, as I am sure the Committee does not want to hear them a second time. However, I ask the Minister to clarify the process. If a company objects to not being approved for registration or says that it has followed the process set out by the Secretary of State but the Secretary of State does not agree, or if a dispute arises for whatever reason, what appeal process is there, if any, and who is responsible for resolving disputes? That is just one example of the clarity that is necessary for an office of this kind.
Will the Minister clarify the dispute resolution process and whether the office for digital identities and attributes will have a regulatory function? Given the lack of detail on the office, I am concerned about whether it will have the necessary powers and resources. How many people does the Minister envisage working for it? Will they be full-time employees of the office, or will they be job sharing with other duties in his Department?
My other questions are about something I raised earlier, to which the Minister did not refer: international co-operation and regulation. I imagine there will be instances where companies headquartered elsewhere want to offer digital verification services. Will there be compatibility issues with digital verification that is undertaken in other jurisdictions? Is there an international element to the office for digital identities and attributes?
Everyone on the Committee agrees that this is a very important area, and it will only get more important as digital verification becomes even more essential to our everyday working lives. What discussions is the Minister having with the Department for Business and Trade about the kind of market that we might expect to see in digital verification services, and about ensuring that it is competitive, diverse and spread across the whole country?
I look forward to debating the detail of the framework with the hon. Member for Barnsley East when it comes forward, but the hon. Member for Newcastle upon Tyne Central raised a couple of specific points. As I said, the new office for digital identities and attributes will be in the Department for Science, Innovation and Technology, and it will work on a similar basis to that of the office for product safety and standards, which operates within the Department for Business and Trade.
However, I should make it clear that the office for digital identities and attributes is not a regulator, because the use of digital identities is not mandatory, so it does not have investigatory or enforcement powers. It is not our intention for it to be able to levy fines or resolve individual complaints. Further down the line, as the market develops, it may be decided that it should be housed permanently in an independent body or as an arm’s length body, but that is for consideration in due course. It will start off within the Department.
I will come back to the hon. Member for Newcastle upon Tyne Central with more detail about dispute resolution. I take her point; I am not sure how often what she describes is likely to happen, but clearly it is sensible at least to take account of it.
With this it will be convenient to discuss the following:
Clauses 55 and 56 stand part.
Government amendments 6 and 7.
Government new clause 3—Information disclosed by the Welsh Revenue Authority.
Government new clause 4—Information disclosed by Revenue Scotland.
Clause 54 creates a permissive power to enable public authorities to share information relating to an individual with registered digital verification service providers. Because the power is permissive, public authorities are under no obligation to disclose information. The power applies only where a digital verification service provider is registered in the DVS register and the individual has requested the digital verification service from that provider. Information disclosed using the power does not breach any duty of confidentiality or other restrictions relating to the disclosure of information, but the power does not enable the disclosure of information if disclosure would breach data protection legislation. The clause also gives public authorities the power to charge fees for disclosing information.
All information held by His Majesty’s Revenue and Customs is subject to particular statutory safeguards relating to confidentiality. Clause 55 establishes corresponding safeguards for information disclosed to registered digital verification service providers by His Majesty’s Revenue and Customs under clause 54. The Government will not commence measures to enable the disclosure of information held by HMRC until the commissioners for HMRC are satisfied that the technology and processes for information sharing uphold those safeguards relating to taxpayer confidentiality and therefore allow information sharing by HMRC to occur without adverse effect on the tax system or any other functions of HMRC.
Clause 56 obliges the Secretary of State to produce and publish a code of practice about the disclosure of information under clause 54. Public authorities must have regard to the code when disclosing information under this power. Publication of the first version of the code is subject to the affirmative resolution procedure. Publication of subsequent versions of the code is subject to the negative resolution procedure. We will work with the commissioners for HMRC to ensure that the code meets the needs of the tax system.
New clauses 3 and 4 and Government amendments 6 and 7 establish safeguards for information that reflect those already in the Bill under clause 55 for HMRC. Information held by tax authorities in Scotland and Wales—Revenue Scotland and the Welsh Revenue Authority—is subject to similar statutory safeguards relating to confidentiality. These safeguards ensure that confidence and trust in the tax system are maintained. Under these provisions, registered DVS providers may not further disclose information provided by Revenue Scotland or the Welsh Revenue Authority unless they have the consent of that revenue authority to do so. The addition of these provisions will provide an equivalent level of protection for information shared by all three tax authorities in the context of part 2 of the Bill, avoiding any disparity in the treatment of information held by different tax authorities in this context. A similar provision is not required for Northern Irish tax data, as HMRC is responsible for the collection of devolved taxes in Northern Ireland.
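The clause 54 gateway can be summarised as a handful of cumulative conditions. This Python sketch is illustrative only; the field names are invented and the statutory tests are paraphrased rather than quoted.

```python
from dataclasses import dataclass

@dataclass
class DisclosureRequest:
    provider_registered: bool       # the provider appears in the DVS register
    requested_by_individual: bool   # the individual asked this provider for a DVS
    breaches_data_protection: bool  # disclosure would contravene data protection legislation
    authority_willing: bool         # the power is permissive, not a duty

def disclosure_permitted(req: DisclosureRequest) -> bool:
    # The gateway opens only for registered providers acting at the
    # individual's request, never overrides data protection legislation,
    # and even then leaves the final decision with the public authority.
    return (req.provider_registered
            and req.requested_by_individual
            and not req.breaches_data_protection
            and req.authority_willing)
```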
Many digital verification services will, to some extent, rely on public authorities being able to share information relating to an individual with an organisation on the DVS register. To create a permissive gateway that allows this to happen, as clause 54 does, is therefore important for the functioning of the entire DVS system, but there must be proper legal limits placed on these disclosures of information, and as ever, any disclosures involving personal data must abide by the minimisation principle, with only the information necessary to verify the person’s identity or the fact about them being passed on. As such, it is pleasing to see in clause 54 the clarification of some of those legal limits, as contained in the likes of data protection legislation and the Investigatory Powers Act 2016. Similarly, clause 55 and the Government new clauses apply the necessary limits on sharing of personal data from HMRC and devolved revenue authorities under clause 54.
Finally, clause 56, which seeks to ensure that a code of practice is published regarding the disclosure of information under clause 54, will be a useful addition to the previous clauses and will ensure that the safety of such disclosures is properly considered in comprehensive detail. The Information Commissioner, with their expertise, will be well placed to help with this, so it is pleasing to see that they will be consulted during the process of designing this code. It is also good to see that this consultation will be able to occur swiftly—before the clause even comes into force—and that the resulting code will be laid before both Houses.
In short, although some disclosures of personal data from public authorities to organisations providing DVS are inevitable, as they are necessary for the very functioning of a verification service, careful attention should be paid to how this is done safely and legally. These clauses, alongside a well-designed framework—as already discussed—will ensure that that is the case.
Question put and agreed to.
Clause 54 accordingly ordered to stand part of the Bill.
Clauses 55 and 56 ordered to stand part of the Bill.
Clause 57
Trust mark for use by registered persons
Question proposed, That the clause stand part of the Bill.
Clause 57 makes provision for the Secretary of State to designate a trust mark to a DVS provider. The trust mark is essentially a kitemark that shows that the provider complies with the rules and standards set out in the trust framework, and has been certified by an approved conformity assessment body. The trust mark must be published by the Secretary of State and can only be used by registered digital verification service providers. The clause gives the Secretary of State powers to enforce that restriction in civil proceedings.
Trust marks are useful tools that allow organisations and the general public alike to recognise immediately whether or not a product or service has met a certain standard or criterion. That is especially the case online, where, owing to misinformation and the prevalence of scams such as phishing, trust in online services can be lower than in the physical world.
The TrustedSite certification, for example, offers online businesses an earned certification programme that helps them to demonstrate that they are compliant with good business practices and maintain high safety standards. That benefits not only the business itself, which is able to convert more visitors into sales, but the users, who do not have to spend time researching each individual business and can browse and shop with immediate confidence. A trust mark for digital verification services would serve a similar purpose, enabling certified organisations that meet the trust framework criteria to be immediately recognisable, offering them the opportunity to be used by more people and offering the public assurance that their personal data is being handled by a verified source.
Of course, as is the case with this entire section of the Bill, the trust mark is only worth as much as the framework around it. Ministers should again think carefully about how to ensure that the framework supports the rights of the individual. Furthermore, the trust mark is useful only if people recognise it; otherwise, it cannot provide the immediate reassurance that it is supposed to. When the trust mark is established, what measures will the Department take to raise public awareness of it? In the same vein, to know the mark’s value, the public must also be aware of the trust framework that the mark is measured against, so what further steps will the Department take to increase knowledge and understanding of digital verification services and frameworks? Finally, will the Department publish the details of any identified unlawful use of the trust mark, so that public faith in the reliability of the trust mark remains high?
Overall, the clause is helpful in showing that we take seriously the need to ensure that people do not use digital verification services that may mishandle their data.
I am grateful to the hon. Lady for her support. I entirely take her point that a trust mark only really works if people know what it is and can look for it when seeking a DVS provider.
Regarding potential abuse, obviously that is something we will monitor and potentially publicise in due course. All I would say at this stage is that she raises valid points that I am sure we will consider as the new system is implemented.
Question put and agreed to.
Clause 57 accordingly ordered to stand part of the Bill.
Clause 58
Power of Secretary of State to require information
Amendments made: amendment 6, in clause 58, page 84, line 5, after “55” insert
“or (Information disclosed by the Welsh Revenue Authority)”
This amendment prevents the Secretary of State requesting a disclosure of information which would contravene the new clause inserted by NC3.
Amendment 7, in clause 58, page 84, line 5, after “55” insert
“or (Information disclosed by Revenue Scotland)”—(Sir John Whittingdale.)
This amendment prevents the Secretary of State requesting a disclosure of information which would contravene the new clause inserted by NC4.
Question proposed, That the clause, as amended, stand part of the Bill.
Clauses 58 to 60 set out powers and duties conferred upon the Secretary of State in relation to the exercise of her governance and oversight functions under part 2.
Clause 58 enables the Secretary of State to issue a written notice that requires accredited conformity assessment bodies or registered DVS providers to provide information reasonably required by the Secretary of State to exercise functions under part 2. The notice must state why the information is required. It may also state what information is required, the form in which it should be provided, when it should be provided and the place to which it should be provided. Any notice given to a provider must also inform the provider that they may be removed from the DVS register if they fail to comply with the notice.
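Reduced to its elements, a clause 58 notice has one mandatory component, several optional ones and, when addressed to a registered provider, a mandatory warning. A minimal sketch follows, with invented field names; it is not the statutory drafting.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InformationNotice:
    reason: str                    # the notice must state why the information is required
    what: Optional[str] = None     # the remaining particulars are optional under the clause
    form: Optional[str] = None
    deadline: Optional[str] = None
    place: Optional[str] = None
    removal_warning: bool = False  # required when the recipient is a registered provider

def valid_notice_to_provider(notice: InformationNotice) -> bool:
    # A notice to a provider must give a reason and warn that failure to
    # comply can lead to removal from the DVS register.
    return bool(notice.reason) and notice.removal_warning
```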
The power is subject to certain safeguards. Information does not have to be disclosed if to do so would breach clause 55 in relation to HMRC data or data protection legislation, or if disclosure is prohibited by the relevant parts of the Investigatory Powers Act 2016. Information also does not need to be disclosed if doing so would reveal an offence that would expose a person to criminal proceedings; that protection does not apply to the offences relating to false statements mentioned in the clause.
Clause 59 gives the Secretary of State the power to make regulations specifying that another person is able to exercise her functions under part 2. This clause enables us to move the governance and oversight functions of the Secretary of State to a third party if appropriate.
I thank the Minister for giving way. Before he moves on to clause 60, can he set out, perhaps giving an example, where it might be appropriate to use the power in clause 59 to make arrangements for another person to take on these functions, or in what circumstances he envisages it being used?
We are obviously at a very early stage in the development of this market. At the moment, it is felt right that oversight should rest with the Secretary of State, but it may be that, as the market grows and develops, there will need to be oversight by a separate body. The clause keeps the power available to the Secretary of State to delegate the function if he or she chooses to do so.
Clause 60 requires the Secretary of State to publish an annual report on the functioning of this part. The first report must be published within 12 months of clause 47, the DVS trust framework clause, coming into force. The reports will help to ensure that the market continues to meet the needs of DVS providers, public authorities, regulators, civil society and individuals. I commend the clauses to the Committee.
To oversee the DVS register, it is understandable that the Secretary of State may in some cases need to require information from registered bodies to ensure that they are complying with their duties under the framework. It is good that clause 58 provides for that power, and places reasonable legal limits on it, so that disclosures of information do not disrupt legal professional privilege or other important limitations. Likewise, it is sensible that the Secretary of State be given the statutory power to delegate some oversight of the measures in this part in a paid capacity, as is ensured by clause 59.
As I have mentioned many times throughout our scrutiny of the Bill, the Secretary of State may not always have the level of expertise needed to act alone in exercising the powers given to them by such regulations. The input of those with experience and time to commit to ensuring the quality of the regulations will therefore be vital to the success of these clauses. Again, however, we will need more information about the establishment of the office for digital identities and attributes and the governance of digital identities overall to be able to interpret fully both the delegated powers and the power to require information, and how they will be used. Once again, therefore, I urge transparency from the Government as those governance structures emerge.
That leads nicely to clause 60, which requires the Secretary of State to prepare and publish yearly reports on the operation of this part. A report of that nature will offer the chance to periodically review the functioning of the trust framework, register, trust mark and all other provisions contained in this part, thereby providing an opportunity to identify and rectify any recurring issues that the system may face. That is sensible for any new project, particularly one that, through its transparency, will offer accountability of the Government to the general public, who will be able to read the published reports. In short, there are no major concerns regarding any of the three clauses, though further detail on the governance of digital identities services will need proper scrutiny.
Question put and agreed to.
Clause 58 accordingly ordered to stand part of the Bill.
Clauses 59 and 60 ordered to stand part of the Bill.
Clause 61
Customer data and business data
I beg to move amendment 46, in clause 61, page 85, line 24, after “supplied” insert “or provided”.
The definition of “business data” in clause 61 refers to the supply or provision of goods, services and digital content. For consistency with that, this amendment amends an example given in the definition so that it refers to what is provided, as well as what is supplied.
We move on to part 3 of the Bill, concerning smart data usage, which I know is of interest to a number of Members. Before I discuss the detail of clause 61 and amendment 46, I will give a brief overview of this part and the policy intention behind it. The provisions in part 3 allow the Secretary of State or the Treasury to make regulations that introduce what we term “schemes” that compel businesses to share data that they hold on customers with the customer or authorised third parties upon the customer’s request, and to share or publish data that they hold about the services or products that they provide. Regulations under this part will specify what data is in scope within the parameters set out by the clauses, and how it should be shared.
The rest of the clauses in this part permit the Secretary of State or the Treasury to include in the regulations the measures that will underpin these data sharing schemes and ensure that they are subject to proper safeguards—for example, relating to the enforcement of regulations; the accreditation of third party businesses wanting to facilitate data sharing; and how these schemes can be funded through levies and charging. Regulations that introduce schemes, or significantly amend existing schemes, will be subject to prior consultation and parliamentary approval through the affirmative procedure.
The policy intention behind the clauses is to allow for the creation of new smart data schemes, building on the success of open banking in the UK. Smart data schemes establish the secure sharing of customer data and contextual information with authorised third parties on the customer’s request. The third parties can then be authorised by the customer to act on their behalf. The authorised third parties can therefore provide innovative services for the customer, such as analysing spending to identify cost savings or displaying data from multiple accounts in a single portal. The clauses replace existing regulation-making powers relating to the supply of customer data in sections 89 to 91 of the Enterprise and Regulatory Reform Act 2013; those powers are not sufficient for new smart data schemes to be effective.
Clause 61 defines the key terms and concepts for the powers in part 3. We have tabled a minor Government amendment to the clause, which I will explain. The definitions of data holder and trader in subsection (2) explain who may be required to provide data under the regulations. The definitions of customer data and business data deal with the two kinds of data that suppliers may be required to provide. Customer data is information relating to the transactions between the customer and supplier, such as a customer’s consumption of the relevant good or service and how much the customer has paid. Business data is wider contextual data relating to the goods or services supplied or provided by the relevant supplier. Business data may include standard prices, charges or tariffs and information relating to service performance. That information may allow customers to understand their customer data. Government amendment 46 clarifies that a specific example of business data—information about location—refers to the supply or provision of goods or services. It corrects a minor inconsistency in the list of examples of business data in subsection (2)(b).
Subsection (3) concerns who is a customer of the supplying trader, and who can therefore benefit from smart data. Customers may include both consumers and businesses. Subsection (4) enables customers to exercise smart data rights in relation to contracts they have already entered into, and subsection (5) allows the schemes to function through provision of access to data, as opposed to sending data as a one-off transfer.
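The two categories of data defined in clause 61 can be pictured as two simple record types. The fields below are hypothetical examples chosen for illustration, not definitions drawn from the Bill.

```python
from dataclasses import dataclass

@dataclass
class CustomerData:
    # Information on transactions between the customer and the trader,
    # such as consumption of the good or service and amounts paid.
    customer_id: str
    units_consumed: float
    amount_paid_pence: int

@dataclass
class BusinessData:
    # Wider contextual data about what the trader supplies or provides:
    # standard prices or tariffs, service performance and, per amendment 46,
    # the location at which goods or services are supplied or provided.
    standard_tariffs: dict[str, int]        # pence per unit, by tariff name
    service_performance: dict[str, float]   # e.g. average response times
    supply_locations: list[str]
```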
The clause defines key terms in this part of the Bill, such as business data, customer data and data holder, as well as data regulations, customer and trader. These are key to the regulation-making powers on smart data in part 3, and I have no specific concerns to raise about them at this point.
I note the clarification made by the Minister in his amendment to the example given. As he outlined, that will ensure there is consistency in the definition and understanding of business data. It is good to see areas such as that being tidied up so that the Bill can be interpreted as easily as possible, given how complex many will find it. I am therefore happy to proceed with the Bill.
I rise to ask the Minister a specific question about the use of smart data in this way. A lot of users will be giving away data at a device level, rather than just accessing individual accounts. People are just going to a particular account they are signed into and making transactions, or doing whatever they are doing in that application, on a particular device, but there will be much more gathering of data at the device level. We know that many companies—certainly some of the bigger tech companies—use their apps to gather data not just about what their users do on that app, but across the whole device. One of the complaints of Facebook customers is that if they seek to remove their data from Facebook and get it back, the company’s policy is to give them back data only for things they have done while using its applications—Instagram, Facebook or whatever. It retains any device-level data that it has gathered, which could be quite significant, on the basis of privacy—it says that it does not know whether someone else was using the device, so it is not right to hand that data back. Companies are exploiting this anomaly to retain as much data as possible about things that people are doing across a whole range of apps, even when the customer has made a clear request for deletion.
I will be grateful if the Minister can say something about that. If he cannot do so now, will he write to me or say something in the future? When considering the way that these regulations work, particularly in the era of smart data when it will be far more likely that data is gathered across multiple applications, it should be clear what rights customers have to have all that data deleted if they request it.
I share my hon. Friend’s general view. Customers can authorise that their data be shared through devices with other providers, so they should equally have the right to take back that data if they so wish. He invites me to come back to him with greater detail on that point, and we would be very happy to do so.
Amendment 46 agreed to.
Clause 61, as amended, ordered to stand part of the Bill.
Clause 62
Power to make provision in connection with customer data
I beg to move amendment 112, in clause 62, page 87, line 2, at end insert—
“(3A) The Secretary of State or the Treasury may only make regulations under this section if—
(a) the Secretary of State or the Treasury has conducted an assessment of the impact the regulations may have on customers, businesses, or industry,
(b) the assessment mentioned in paragraph (a) has been published, and
(c) the assessment concludes that the regulations achieve their objective without imposing disproportionate, untargeted or unnecessary cost on customers or businesses.”
I assure the hon. Lady that I and, no doubt, the whole Committee share her excitement about the potential offered by smart data, and I have sympathy for the intention behind her amendments. However, taking each one in turn, we feel amendment 112 is unnecessary because the requirements are already set by the better regulation framework, the Small Business, Enterprise and Employment Act 2015 and, indeed, these clauses. Departments will conduct an impact assessment in line with the better regulation framework and Green Book guidance when setting up a new smart data scheme, and must demonstrate consideration of their requirements under the Equality Act 2010. That will address the proportionality, targeting and necessity of the scheme.
Moreover, the clauses require the Government to consider the effect of the regulations on matters including customers, businesses and competition. An impact assessment would be an effective approach to meeting those requirements. However, there is a risk that prescribing exactly how a Department should approach the requirements could unnecessarily constrain the policymaking process.
I turn to amendment 113. Clause 74(5) already requires the Secretary of State or the Treasury to consult with relevant sector regulators as they consider appropriate. As part of the process, sector regulators may be asked to contribute to the development of regulatory impact assessments, so we do not believe the amendment is necessary.
On amendment 114, we absolutely share the view of the importance of Government consulting businesses before making regulations. That is why, under clause 74(6), the Secretary of State or the Treasury must, when introducing a smart data scheme, consult such persons as are likely to be affected by the regulations and such sectoral regulators as they consider appropriate. Those persons will include businesses relevant to the envisaged scheme.
On amendment 115, we absolutely share the ambition to grab whatever opportunities smart data offers. In particular, I draw the hon. Lady’s attention to the commitments made last month by the Economic Secretary to the Treasury, who set out the Treasury’s plans to use the smart data powers to provide open banking with a sustainable regulatory framework, while the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), chaired the inaugural meeting of the Smart Data Council last month. That council has been established to support and co-ordinate the development of smart data schemes in a timely manner.
With respect to having a deadline for schemes, we should recognise that implementation of the regulations requires careful consideration. The hon. Member for Barnsley East clearly recognises the importance of consultation and of properly considering the impacts of any new scheme. We are committed to that, and there is a risk that a statutory deadline for making the regulations would jeopardise our due diligence. I assure her that all her concerns are ones that we share, so I hope that she will accept that the amendments are unnecessary.
I am grateful to the Minister for those assurances. I am reassured by his comments, and I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
Clause 62 provides the principal regulation-making power to establish smart data schemes in relation to customer data. The clause enables the Secretary of State or the Treasury to make regulations that require data holders to provide customer data either directly to a customer, or to a person whom the customer has authorised, at the customer’s request. Subsection (3) also allows an authorised person who receives the customer data to exercise the customer’s rights in relation to that data on the customer’s behalf. We call that “action initiation”.
An illustrative example could be in open banking, where customers can give authorised third parties access to their data to compare their current bank account with similar offers, or to group the contracts within a household together so that parents or guardians can better manage children’s accounts. Subsection (3) could allow the authorised third party to update the customer’s contact details across the associated accounts, for example if an email address changes.
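As a rough illustration of action initiation under subsection (3), the following Python sketch shows an accredited third party making a change that the customer could make themselves. The class and method names are invented; the consent check is a simplification of the authorisation a scheme would actually require.

```python
class AuthorisedThirdParty:
    # Sketch of "action initiation": an accredited third party exercising
    # a customer's own rights at the customer's request.
    def __init__(self, accredited: bool):
        self.accredited = accredited

    def update_contact_details(self, accounts: list[dict], new_email: str,
                               customer_has_authorised: bool) -> None:
        # The third party may act only if accredited under the scheme and
        # authorised by the customer for this action.
        if not (self.accredited and customer_has_authorised):
            raise PermissionError("no authority to act for this customer")
        for account in accounts:
            account["email"] = new_email  # the change the customer could make themselves
```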
Clause 63 outlines the provisions that smart data scheme regulations may contain when relating to customer data. The clause establishes much of the critical framework that smart data schemes will be built on. On that basis, I commend clauses 62 and 63 to the Committee.
As previously mentioned, and with the caveats that I expressed when I was discussing my amendments, I am extremely pleased to be able to welcome this part of the Bill. In essence, clauses 62 and 63 enable regulations that will allow for customer data to be provided to a third party on request. I will take the opportunity to highlight why that is welcome by looking at some of the benefits that smart data can provide.
Since 2018, open banking—by far the best known and most advanced version of smart data in operation—has demonstrated time and again what smart data can deliver. For the wider economy, the benefits have been remarkable, with the total value to the UK economy now amounting to more than £4.1 billion, according to Coadec, the Coalition for a Digital Economy. For consumers who have consented, of their own accord, to let third-party applications access their financial data, the experience of banking has been revolutionised.
Indeed, a whole host of money management tools and apps can now harness people’s financial data to create personalised recommendations based on their spending habits, including how to budget or save. During a cost of living crisis, some of those tools have been extremely valuable in helping people to manage new bills and outgoings. Furthermore, online retailers can now connect directly to someone’s bank so that, rather than spending the time filling in their card details each time they make a purchase, an individual can approve the transaction via their online banking system.
It is important to reiterate that open banking is based on consent, so consumers participate only if they feel it is right for them. As it happens, millions of people have capitalised on the benefits. More than seven million consumers and 50% of small and medium-sized enterprises have used open banking services to gain a holistic view of their finances, to support applications for credit and to pay securely, quickly and cheaply.
Though open banking has brought great success for both consumers and the wider economy, it is also important that the Government learn lessons from its implementation. We must pay close attention to how the introduction of open banking has impacted both the industry and consumers and ensure that any takeaways are factored in when considering an expansion of smart data into new industries.
Further, given that the Government clearly recognise the value of open data, as shown by this section of the Bill, it is a shame that the Bill does not go further in exploring the possibilities of opening datasets in other settings. Labour has explicitly set out to do that in its industrial strategy. For example, we have identified that better, more open datasets on jobs could help us to understand where skills shortages are, allowing jobseekers, training providers and Government to better fill those gaps.
The provisions in clauses 62 and 63 to create new regimes of smart data are therefore welcome, but the Bill unfortunately remains a missed opportunity to fully capitalise on the opportunities of open, secure data flows.
Question put and agreed to.
Clause 62 accordingly ordered to stand part of the Bill.
Clause 63 ordered to stand part of the Bill.
Clause 64
Power to make provision in connection with business data
Question proposed, That the clause stand part of the Bill.
Clause 64 provides the principal regulation-making power for the creation of smart data schemes relating to business data. Regulations created through this clause allow for business data to be provided to the customer of a trader or a third-party recipient. Business data may also be published to be more widely available.
These regulations relating to business data will increase transparency around the pricing of goods and services, which will strengthen competition and benefit both consumers and smaller businesses. To give just one example, the Competition and Markets Authority recently highlighted the potential of an open data scheme comparing the prices of fuel at roadside stations, increasing competition and better informing consumers. It is exactly that kind of market intervention that the powers provide for.
Clause 65 outlines provisions that regulations relating to business data may contain. Those provisions are non-exhaustive. The clause largely mirrors clause 63, extending the same protections and benefits to schemes that make use of business data exclusively or in tandem with customer data. The clause differs from clause 63 in subsection (2), where an additional consideration is made as to who may make a request for business data. As action initiation relates only to an authorised person exercising a customer’s rights relating to their data, clause 65 does not include the references to action initiation that are made in subsections (7) and (8) of clause 63.
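By way of illustration, published business data of the kind the CMA described could feed a very simple comparison tool. The station names and prices below are fictional, and the snippet is a sketch of a consumer-facing use of published data, not anything specified by the Bill.

```python
# Published business data: price per litre at nearby roadside stations
# (fictional names and figures).
published_prices = {
    "Station A": 1.43,
    "Station B": 1.39,
    "Station C": 1.47,
}

# A comparison tool simply reads the published data and picks the cheapest.
cheapest = min(published_prices, key=published_prices.get)
print(f"Cheapest fuel nearby: {cheapest} at £{published_prices[cheapest]:.2f}/litre")
```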
The measures in these clauses largely mirror 62 and 63, but they refer to business data rather than customer data. I therefore refer back to my comments on clause 62 and 63 and the benefits that new regulations such as these might be able to provide. Those remarks provide context as to why I am pleased to support these measures, which will allow the making of regulations that require data holders to share business data with third parties.
However, I would like clarification from the Minister on one point. The explanatory notes explain that the powers will likely be used together with those in clauses 62 and 63, but it would be good to hear confirmation from the Minister on whether there may be circumstances in which the Department envisages using the powers regarding business data distinctly. If there are, will he share examples of those circumstances? It would be good for both industry and Members of this House to have insight into how these clauses, and the regulatory powers they provide, will actually be used.
I think it is probably sensible if I come back to the hon. Lady on that point. I am sure we would be happy to provide examples if there are ones that we can identify.
Question put and agreed to.
Clause 64 accordingly ordered to stand part of the Bill.
Clause 65 ordered to stand part of the Bill.
Clause 66
Decision-makers
Clauses 66 to 72 contain a number of provisions that will allow smart data regulations to function effectively. They are provisions on decision makers who approve and monitor third parties that can access the data, provisions on enforcement of the regulations and provisions on the funding of smart data schemes. It is probably sensible that I go through each one in more detail.
Clause 66 relates to the appointment of persons or accrediting bodies referred to as decision makers. The decision makers may approve the third parties that can access customer and business data and act on behalf of customers. The decision makers may also revoke or suspend a third party’s accreditation if that is necessary. An accreditation regime provides certainty about the expected governance, security and conduct requirements for businesses that can access data. Customers can be confident that their chosen third party meets an appropriate standard. Clause 66 allows the decision maker to monitor compliance with authorisation conditions, subject to the safeguards in clause 68.
Clause 67 enables regulations to confer powers of enforcement on a public body. The public body will be the enforcer, responsible for acting upon any breaches of the regulations. We envisage that the enforcer for a smart data scheme is likely to be an existing sectoral regulator, such as the Financial Conduct Authority in open banking. While the clause envisages civil enforcement of the regulations, subsection (6) allows for criminal offences in the case of falsification of information or evidence. Under subsections (3) and (10), the regulations may confer powers of investigation on the enforcer. That may include powers to require the provision of information and powers of entry, search and seizure. Those powers are subject to statutory restrictions in clause 68.
Clause 68 contains provisions limiting the investigatory powers given to enforcers. The primary restriction is that regulations may not require a person to give an enforcer information that would infringe the privileges of Parliament or undermine confidentiality, legal privilege and, subject to the exceptions in subsection (7), privilege against self-incrimination. Subsection (8) prevents any written or oral statement given in response to a request for information in the course of an investigation from being used as evidence against the person being prosecuted for an offence, other than that created by the data regulations.
Clause 69 contains provisions relating to financial penalties and the relevant safeguards. It sets out what regulations must provide for if enabling the use of financial penalties. Subsection (2) requires that the amount of a financial penalty is specified in, or determined in accordance with, the regulations. For example, the regulations may set a maximum financial penalty that an enforcer can impose and they may specify the methodology to be used to determine a specific financial penalty.
Clause 70 enables actors in smart data schemes to require the payment of fees. The circumstances and conditions of the fee charging process will be specified in the regulations. The purpose of the clause, along with clause 71, is to seek to ensure that the costs of smart data schemes, and of bodies exercising functions under them, can be met by the relevant sector.
It is intended that fees may be charged by accrediting bodies and enforcers. For example, regulations could specify that an accrediting body may charge third parties to cover the cost of an accreditation process and ongoing monitoring. Enforcers may also be able to charge to cover or contribute to the cost of any relevant enforcement activities. The regulations may provide for payment of fees only by persons who are directly affected by the performance of duties, or exercise of powers, under the regulations. That includes data holders, customers and those accessing customer and business data.
Clause 71 will enable the regulations to impose a levy on data holders or allow a specified public body to do so. That is to allow arrangements similar to those in section 38 of the Communications Act 2003, which enables the fixing of charges by Ofcom. Together with the provision on fees, the purpose of the levy is to meet all or part of the costs incurred by enforcers and accrediting bodies, or persons acting on their behalf. The intention is to ensure that expenses can be met without incurring a cost to the taxpayer. Levies may be imposed only in respect of data holders that appear to be capable of being directly affected by the exercise of the functions.
Clause 72 provides statutory authority for the Secretary of State or the Treasury to give financial assistance, including to accrediting bodies or enforcers. Subsection (2) provides that the assistance may be given on terms and conditions that are deemed appropriate by the regulation maker. Financial assistance is defined to include both actual and contingent assistance, such as a grant, loan, guarantee or indemnity. It does not include the purchase of shares. I commend clauses 66 to 72 to the Committee.
Clauses 66 to 72 provide for decision makers and enforcers to help with the operation and regulation of new smart data regimes. As was the case with the digital verification services, where I agreed that there was a need for the Secretary of State to have limited powers to ensure compliance with the trust framework, powers will be needed to ensure that any regulations made under this part of the Bill are followed. The introduction in clause 67 of enforcers—public bodies that will ensure that organisations follow regulations made under part 3 by imposing fines and penalties and issuing compliance notices—is therefore welcome.
As ever, it is pleasing to see that the relevant restrictions on the powers of enforcers are laid out in clause 68, to ensure that they cannot infringe upon other, more fundamental rights. It is also right, as is ensured by clause 69, that there are safeguards on the financial penalties that an enforcer is able to issue. Guidance on the amount of any penalties, as well as a formalised process for issuing notices and allowing for appeal, will provide uniformity across the board so that every enforcer acts proportionately and consistently.
Decision makers, allowed for by clause 66, will be important too, in conjunction with enforcers. They will ensure that there is sufficient oversight of the organisations that are enabled to have access to customer or business data through any particular smart data regime. Clauses 70, 71 and 72, which provide for the financing of the activities of decision makers and enforcers, follow the trend of sensible provisions that will be required if we are to have confidence that regulations made under this part of the Bill will be adhered to. In short, the measures under this grouping are largely practical, and they are necessary to support clauses 62 to 65.
Question put and agreed to.
Clause 66 accordingly ordered to stand part of the Bill.
Clauses 67 to 72 ordered to stand part of the Bill.
Clause 73
Confidentiality and data protection
Question proposed, That the clause stand part of the Bill.
Clauses 73 to 77 relate to confidentiality and data protection; various provisions connected with making the regulations, including consultation, parliamentary scrutiny and a duty to conduct periodic reviews of regulations; and the repeal of the existing regulation-making powers that these clauses replace.
Clause 73(1) allows the regulations to provide that there are no contravening obligations of confidence or other restrictions on the processing of information. Subsection (2) ensures that the regulations do not require or authorise processing that would contravene the data protection legislation. The provisions are in line with the approach taken towards pension dashboards, which are electronic communications services that allow individuals to access information about their pensions.
Clause 74(1) allows the regulation-making powers to be used flexibly. Subsection (1)(f) allows regulations to make provision by reference to specifications or technical requirements. That is essential to allow for effective and safe access to customer data, for instance the rapid updating of IT and security requirements, and it mirrors the powers enacted in relation to pensions dashboards, which I have mentioned. Clause 74(2) provides for limited circumstances in which it may be necessary for regulations to modify primary legislation to allow the regulations to function effectively. For instance, it may be necessary to extend a statutory alternative dispute resolution scheme in a specific sector to cover the activities of a smart data scheme.
Clause 74(3) states that affirmative parliamentary scrutiny will apply to the first regulations made under clause 62 or 64; that is, affirmative scrutiny will apply to regulations that introduce a scheme. Affirmative parliamentary scrutiny will also be required where primary legislation is modified, where regulations make requirements more onerous for data holders and where the regulations confer monitoring or enforcement functions or make provisions for fees or a levy. Under clause 74(5), prior to making regulations that will be subject to affirmative scrutiny, the Secretary of State or the Treasury must consult persons who are likely to be affected by the regulations, and relevant sectoral regulators, as they consider appropriate.
The Government recognise the importance of enabling the ongoing scrutiny of future regulations, so clause 75 requires the regulation maker to review the regulations at least at five-yearly intervals. Clause 76 repeals the regulation-making powers in sections 89 to 91 of the Enterprise and Regulatory Reform Act 2013, which are no longer adequate to enable the introduction of effective smart data schemes. Those sections are replaced by the clauses in part 3 of the Bill. Clause 77 defines, or refers to definitions of, terms used in part 3 and is essential to the functioning and clarity of part 3. I commend the clauses to the Committee.
Many of the clauses in this grouping are supplementary to the provisions that we have already discussed, or they provide clarification as to which regulations under part 3 are subject to parliamentary scrutiny. I have no further comments to add on the clauses, other than to welcome them as fundamental to the wider part. However, I specifically welcome clause 75, which requires that the regulations made under this part be periodically reviewed at least every five years.
I hope that such regulations will be under constant review on an informal basis to assess how well they are working, but it is good to see a formal mechanism to ensure that that is the case over the long term. It would have been good, in fact, to see more such provisions throughout the Bill, to ensure that regulations that are made under it work as intended. Overall, I hope it is clear that I am very supportive of this part’s enabling of smart data regimes. I look forward to it coming into force and unlocking the innovation and consumer benefits that such schemes will provide.
Question put and agreed to.
Clause 73 accordingly ordered to stand part of the Bill.
Clauses 74 to 77 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
Data Protection and Digital Information (No. 2) Bill (Seventh sitting) Debate
John Whittingdale (Conservative - Maldon) (1 year, 5 months ago)
I beg to move amendment 5, in clause 78, page 100, line 30, after “86” insert “and [Codes of conduct]”.
This amendment is consequential on NC2.
With this it will be convenient to discuss Government new clause 1 and Government new clause 2.
It is a pleasure to serve under your chairmanship, Mr Paisley. Welcome to the Committee.
The Privacy and Electronic Communications (EC Directive) Regulations 2003 place specific requirements on organisations in relation to use of personal data in electronic communications. They include, for example, rules on the use of emails, texts and phone calls for direct marketing purposes and the use of cookies and similar technologies.
Trade associations have told us that sometimes their members need guidance on complying with the legislation that is more bespoke than the general regulatory guidance from the Information Commissioner’s Office. New clause 2 will allow representative bodies to design codes of conduct on complying with the PEC regulations that reflect their specific processing operations. There are already similar provisions in articles 40 and 41 of the UK General Data Protection Regulation to help organisations in particular sectors to comply.
Importantly, codes of conduct prepared under these provisions can be contained in the same document as codes of conduct under the UK GDPR. That will be particularly beneficial to representative bodies that are developing codes for processing activities that are subject to the requirements of both the UK GDPR and the PEC regulations. New clause 2 envisages that representative bodies will draw up voluntary codes of conduct and then seek formal approval of them from the Information Commissioner. The Information Commissioner will approve a code only if it contains a mechanism for the representative body to monitor its members’ compliance with the code.
New clause 1 makes a related amendment to article 41 of the UK GDPR to clarify that bodies accredited to monitor compliance with codes of conduct under the GDPR are required to notify the Information Commissioner only if they suspend or exclude a person from a code. Government amendment 5 is a minor and technical amendment necessary as a consequence of new clause 2.
These provisions are being put into the Bill at the suggestion of business organisations. We hope that they will allow organisations to comply more easily with the requirements.
It is a pleasure to serve under your chairship, Mr Paisley, and I too welcome you to the Committee.
As I have said more than once in our discussions, in many cases the burden of following regulations can be eased just as much by providing clarification, guidance and support as by removing regulation altogether. I advocated for codes of practice in more detail during the discussion of such codes in the public sector, under clause 19, and during our debates on clauses 29 and 30, when we were discussing ICO codes more generally. New clauses 1 and 2 seem to recognise the value of codes of practice too, and both seek to provide either clarification or the sharing of best practice on complying with the PEC regulations. I have no problem with proceeding with the Bill with these inclusions.
Amendment 5 agreed to.
I beg to move amendment 48, in clause 78, page 100, line 30, after “86” insert “and [Pre-commencement consultation]”.
This amendment is consequential on NC7.
New clause 7 clarifies that the consultation requirements imposed by the Bill in connection with or under the PEC regulations can be satisfied by consultation that takes place before the relevant provision of the Bill comes into force. That ensures that the consultation work that supports development of policy before the Bill is passed can continue and is not paused unnecessarily. A similar provision was included in section 182 of the Data Protection Act 2018. Government amendment 48 is a minor and technical amendment which is necessary as a consequence of new clause 7. I commend the new clause and amendment to the Committee.
The new clause and accompanying amendment seek to expedite work on consultation in relation to the measures in this part. It makes sense that consultation can begin before the Bill comes into force, to ensure that regulations can be acted on promptly after its passing. I have concerns about various clauses in this part, but no specific concerns about the overarching new clause, and am happy to move on to discussing the substance of the clauses to which it relates.
Amendment 48 agreed to.
Question proposed, That the clause, as amended, stand part of the Bill.
Clause 78 introduces part 4 of the Bill, which amends the Privacy and Electronic Communications (EC Directive) Regulations 2003. Clauses 79 to 86 refer to them as “the PEC Regulations” for short. They sit alongside the Data Protection Act and the UK GDPR. We will debate some of the more detailed provisions in the next few clauses.
Question put and agreed to.
Clause 78, as amended, accordingly ordered to stand part of the Bill.
Clause 79
Storing information in the terminal equipment of a subscriber or user
I beg to move amendment 116, in clause 79, page 101, line 15, leave out
“making improvements to the service”
and insert
“making changes to the service which are intended to improve the user’s experience”.
Cookies are small text files that are downloaded on to somebody’s computer or smartphone when they access a website; they allow the website to recognise the person’s device, and to store information about the user’s preferences or past actions. The current rules around using cookies, set out in regulation 6 of the PEC regulations, dictate that organisations must tell people that the cookies are there, explain what the cookies are doing and why, and finally get the person’s freely given, specific and informed consent to store cookies on their device. However, at the moment there is almost universal agreement that the system is not working as intended.
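To make the mechanics concrete, here is a minimal sketch, using only Python’s standard library, of how a site sets a cookie on a first visit and reads it back on a later one; the cookie name and value are invented for illustration and are not drawn from the Bill or the regulations.

    # Server side: build a Set-Cookie header that stores a preference on
    # the user's device, with an expiry so it persists between visits.
    from http.cookies import SimpleCookie

    outgoing = SimpleCookie()
    outgoing["site_theme"] = "dark"                         # a stored preference
    outgoing["site_theme"]["max-age"] = 60 * 60 * 24 * 30   # keep for 30 days
    print(outgoing.output())  # Set-Cookie: site_theme=dark; Max-Age=2592000

    # On a later visit, the browser sends the cookie back in the request's
    # Cookie header, letting the site recognise the device and restore the
    # user's preferences or past actions.
    incoming = SimpleCookie()
    incoming.load("site_theme=dark")
    print(incoming["site_theme"].value)  # -> dark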
To comply with the legislation, most websites have adopted what is known as a cookie banner—a notice that pops up when a user first visits the site, prompting them to indicate which cookies they are happy with. However, due to the sheer volume of those banners, in many cases people no longer give consent because they are informed or because they freely wish to give it, but simply because clicking through the banner is the quickest way to use the website as they wish.
In their communications regarding the Bill, the Government have focused on reducing cookie fatigue, branding it one of the headline achievements of the legislation. Unfortunately, as I will argue throughout our debates on clause 79, I do not believe that the Bill will fix the problem in the way that users hope. The new exemptions to the consent requirement for purposes that present a low risk to privacy may reduce the number of circumstances in which permission might be required, but there will still be a wide-ranging list of circumstances where consent is still required.
If the aim is to reduce cookie fatigue for users, as the Government have framed the clause, the exemptions must centre on the experience of users. If they do not, the clause is not about reducing consent fatigue, but rather about legitimising large networks of online surveillance of internet users. With that in mind, amendment 116 would narrow the exemption for collecting statistical information with a view to improving a service so that it is clear that any such improvements are exclusively considered to be those from the user’s perspective. That would ensure that the term “improvements” cannot be interpreted as including sweeping changes for commercial benefit, but is instead focused only on benefits to users.
I will speak to proposed new regulation 6B when we debate later amendments, but I reiterate that I have absolute sympathy for the intention behind the clause and want as much as anyone to see an end to constant cookie banners where possible. However, we must place the consumer and user experience at the heart of any such changes. That is what we hope to ensure through the amendment, with respect to the list of exemptions.
I am grateful to the hon. Lady for making it clear that the Opposition share our general objective in the clause. As she points out, the intention of cookies has been undermined by their ubiquity when they are placed as banners right at the start. Clause 79 removes the requirement to seek consent for the placement of audience measurement cookies. That means, for example, that a business could place cookies to count the number of visitors to its website without seeking the consent of web users via a cookie pop-up notice. The intention is that the organisation could use the statistical information collected to understand how its service is being used, with a view to improving it. Amendment 116 would mean that “improvements to the service” would be narrowed in scope to mean improvements to the user’s experience of the service, but while that is certainly one desirable outcome of the new exception, we want it to enable organisations to make improvements for their own purposes, and these may not necessarily directly improve the user’s experience of the service.
Organisations have repeatedly told us how important the responsible use of data is for their growth. For example, a business may want to use information collected to improve navigation of its service to improve sales. It could use the information collected to make improvements to the back-end IT functionality of its website, which the user may not be aware of. Or it could even decide to withdraw parts of its service that had low numbers of users; those users could then find that their experience was impaired rather than improved, but the business could invest the savings gained to improve other parts of the service. We do not think that businesses should be prevented from improving services in this way, but the new exception provides safeguards to prevent them from sharing the collected data with anyone else, except for the same purpose of making improvements to the service. On that basis, I hope the hon. Lady will consider withdrawing her amendment.
I am grateful for the Minister’s answer. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 49, in clause 79, page 102, leave out lines 21 to 23.
Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement that the subscriber or user can object to the update and does not object.
Clause 79 reforms regulation 6 of the Privacy and Electronic Communications (EC Directive) Regulations 2003, which sets the rules on when an organisation can store information or gain access to information stored on a person’s device—for example, their computer, phone or tablet. This is commonly described as the cookies rule, but it includes similar technologies such as tracking pixels and device fingerprinting. Currently, organisations do not have to seek a user’s consent to place cookies that are strictly necessary to provide a service requested by the user—for example, to detect fraud or remember items in a user’s online shopping basket.
To reduce the number of cookie pop-up notices that can spoil web users’ enjoyment of the internet, clause 79 will remove the requirement for organisations to seek consent for several low privacy risk purposes, including the installation of software updates necessary for the security of the device. Government amendments 49 and 51 remove the user’s right to opt out of the software security update and the right to remove an update after it has taken effect. Government amendment 50 removes the right to disable an update before it takes effect.
Although these measures were initially included in the Bill to give web users a choice about whether security updates were installed, stakeholders have subsequently advised us that the failure to install certain updates could result in a high level of risk to the security of users’ devices and personal information. We have been reflecting on the provisions since the Bill was introduced, and have concluded that removing them is the right thing to do, in the interests of security of web users. Even if these provisions are omitted, organisations will still need to provide users with clear and comprehensive information about the purpose of software security updates. Web users will also still have the right to postpone an update for a limited time before it takes effect.
Government amendment 54 concerns the regulation-making powers under the new PEC regulations. One of the main aims is to ensure that web users are empowered to use automated technology such as browsers and apps to select their choices regarding which cookies they are willing to accept. The Secretary of State could use powers under these provisions to require consent management tools to meet certain standards or specifications, so that web users can make clear, meaningful choices once and have those choices respected throughout their use of the internet.
The Committee will note that new regulation 6B already requires the Secretary of State to consult the Information Commissioner and other interested parties before making any new regulations on consent management tools. Government amendment 54 adds the Competition and Markets Authority as a required consultee. That will help ensure that any competition impacts are properly considered when developing new regulations that set standards of design.
Finally, Government amendments 52 and 53 make minor and technical changes that will ensure that future regulations made under the reformed PEC regulations can include transitional, transitory or savings provisions. These will simply ensure there is a smooth transition to the new regime if the Secretary of State decides to make use of these new powers. I commend the amendments to the Committee.
I understand that amendments 49 to 51 primarily remove the option for subscribers or users to object to or disable an update or software for security reasons. As techUK has highlighted, the PEC regulations already contain an exemption on cookie consent for things that are strictly necessary, and it was widely accepted that security purposes met this exemption. This is reflected by its inclusion in the list of things that meet the criteria in new paragraph (5).
However, in the Bill the Government also include security updates in the stand-alone exemption list. This section introduces additional conditions that are not present in the existing law, including the requirement to offer users an opt-out from the security update and the ability to disable or postpone it. The fact that this overlap has been clarified by removing the additional conditions seems sensible. Although user choice has value, it is important that we do not leave people vulnerable to known security flaws.
In principle, Government amendment 54 is a move in the right direction. I will speak to regulation 6B in more detail when we discuss amendment 117 and explain why we want to remove it. If the regulation is to remain, it is vital that the Competition and Markets Authority be consulted before regulations are made, given the impact they are likely to have in entrenching power in the hands of browser owners. It is really pleasing that the Government have recognised that it was an oversight not to involve the CMA in any consultations. I offer my full support to the amendment in that context, though I do not believe it goes far enough, and I will advocate the removal of regulation 6B entirely in due course.
Amendment 49 agreed to.
Amendments made: 50, in clause 79, page 102, line 25, leave out “disable or”.
Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement for subscribers and users to be able to disable, not just postpone, the update.
Amendment 51, in clause 79, page 102, leave out lines 27 to 29.
Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement that, where the update takes effect, the subscriber or user can remove or disable the software.
Amendment 52, in clause 79, page 104, line 20, leave out “or supplementary provision” and insert
“, supplementary, transitional, transitory or saving provision, including provision”.—(Sir John Whittingdale.)
This amendment provides that regulations under the new regulation 6A of the PEC Regulations, inserted by clause 79, can include transitional, transitory or saving provision.
As the hon. Lady sets out, amendment 117 would remove new regulation 6B from the Bill, but we see this as an important tool for reducing frequent cookie consent banners and pop-ups that can, as we have debated already, interfere with people’s use of the internet. Members will be aware, as has already been set out, that clause 79 removes the need for organisations to seek consent to place cookies for certain non-intrusive purposes. One way of further reducing the need for repeated cookie pop-up notices is by blocking them at source—in other words, allowing web users to select which cookies they are willing to accept and which they are not comfortable with by using browser-level settings or similar technologies. These technologies should allow users to set their online preferences once and be confident that those choices will be respected throughout their use of the internet.
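As an illustration of blocking at source, the sketch below shows how a site might honour a browser-level signal so that a choice made once is respected everywhere. It borrows the header name from the existing Global Privacy Control convention (“Sec-GPC: 1”); the cookie categories and helper function are invented, and nothing in regulation 6B prescribes this design.

    from typing import Mapping

    # Non-essential cookie categories that would otherwise need consent.
    NON_ESSENTIAL = {"audience_measurement", "advertising", "cross_site_tracking"}

    def allowed_cookie_categories(headers: Mapping[str, str]) -> set:
        """Return the cookie categories the site may use for this request."""
        categories = {"strictly_necessary"} | NON_ESSENTIAL
        # If the browser carries the user's objection, drop every
        # non-essential category at source instead of showing a pop-up.
        if headers.get("Sec-GPC") == "1":
            categories -= NON_ESSENTIAL
        return categories

    # A request from a browser in which the user opted out once.
    print(allowed_cookie_categories({"Sec-GPC": "1"}))  # {'strictly_necessary'}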
We will continue to work with the industry and the Information Commissioner to improve take-up and effectiveness of browser-based and similar solutions. Retaining the regulation-making powers at 6B is important to this work because it will allow the Secretary of State to require relevant technologies to meet certain standards or specifications.
Without regulations, there could be an increased risk of companies developing technologies that did not give web users sufficient choice and control about the types of cookies they are willing to accept. We will consult widely before making any new regulations under 6B, and new regulations will be subject to the affirmative resolution procedure. We have listened to stakeholders and intend to amend 6B to provide an explicit requirement for the Secretary of State to consult the Competition and Markets Authority before making new regulations.
Is this something the Department has considered? For example, Google Chrome has a 77% share of the web browser market on desktop computers, and over 60% for all devices including mobile devices. Although we want to improve the use of the internet for users and get rid of unwanted cookies, the consequence would be the consolidation of power in the hands of one or two companies with all that data.
I entirely agree with my hon. Friend. He accurately sums up the reason that the Government decided it was important that the Competition and Markets Authority would have an input into the development of any facility to allow browser users to set their preferences at the browser level. We will see whether, with the advent of other browsers, AI-generated search engines and so on, the dominance is maintained, but I think he is absolutely right that this will remain an issue that the Competition and Markets Authority needs to keep under review.
That is the purpose of Government amendment 54, which will ensure that any competition impacts are considered properly. For example, we want any review of regulations to be relevant and fair to both smaller publishers and big tech. On that basis, I hope that the hon. Member for Barnsley East will consider withdrawing her amendment.
I appreciate the Minister’s comments and the Government change involving the CMA, but we simply do not believe that that is worth putting into law. We just do not know the full implications, as echoed by the hon. Member for Folkestone and Hythe. I will therefore press my amendment to a Division.
Question put, That the amendment be made.
I shall not repeat all that has been said about the purpose of the clause. To recap quickly, consent is required for any non-essential functions, such as audience measurement, design optimisation, presentation of adverts and tracking across websites, but clearly the current system is not working well. Researchers have found that people often click yes to cookies to make the banner go away and because they want to access the service quickly.
The clause will remove the requirement for organisations to seek consent to cookies placed for several low privacy risk purposes. As a result of the new exceptions we are introducing, web users should know that if they continue to see cookie pop-up messages it is because they relate to more intrusive uses of cookies. It is possible that we may identify additional types of non-intrusive cookies in the future, so the clause permits the Secretary of State to make regulations amending the exceptions to the consent requirement or introducing new exceptions.
The changes will not completely remove the existence of cookie pop-ups. However, we are committed to working with tech companies and consumer groups to promote technologies that help people to set their online preferences at browser level or by using apps. Such technology has the potential to reduce further the number of pop-ups that appear on websites. Alongside the Bill, we will take forward work to discuss what can be done further to develop and raise awareness of possible technological solutions. On that basis, I commend the clause to the Committee.
I spoke in detail about my issues with the clause during our debates on amendments 116 and 117, but overall I commend the Government’s intention to explore ways to end cookie fatigue. Although I unfortunately do not believe that these changes will solve the issues, it is pleasing that the Government are looking at ways to reduce the need for consent where the risk for privacy is low. I will therefore not stand in the way of the clause, beyond voicing my opposition to regulation 6B.
Question put and agreed to.
Clause 79, as amended, accordingly ordered to stand part of the Bill.
Clause 80
Unreceived communications
Question proposed, That the clause stand part of the Bill.
Clause 80 provides an additional power for the Information Commissioner when investigating unsolicited direct marketing through telephone calls, texts and emails—more commonly known as nuisance calls or nuisance communications.
Some unscrupulous direct marketing companies generate hundreds of thousands of calls to consumers who have not consented to be contacted. That can affect the most vulnerable in our society, some of whom may agree to buy products or services that they did not want or cannot afford. Successive Governments have taken a range of actions over the years—for example, by banning unsolicited calls from claims management firms and pensions providers—but the problem persists and further action is needed.
Under the Privacy and Electronic Communications (EC Directive) Regulations 2003, the Information Commissioner can investigate and take enforcement action against rogue companies where there is evidence that unsolicited marketing communications have been received by the recipient. The changes we are making in clause 80 will enable the Information Commissioner to take action in relation to unsolicited marketing communications that have been generated, as well as those received or connected.
Not every call that is generated reaches its intended target. For example, an individual may be out or may simply not pick up the phone. However, the potential for harm should be a relevant factor in any enforcement action by the Information Commissioner’s Office. Applying the regulations, through the changes in clause 80, to communications that are generated will more accurately reflect the level of intent to cause disturbance.
Clause 81 is a minor and technical clause that should improve the readability of the PEC regulations. The definition of “direct marketing”, which the PEC regulations rely on, is currently found in the Data Protection Act 1998. To help the reader quickly locate the definition, the clause adds the definition to the PEC regulations themselves.
Under the current PEC regulations, businesses can already send direct marketing to existing customers, subject to certain safeguards. That is sometimes known as the soft opt-in rule. Clause 82 applies the same rule to non-commercial organisations, such as charities. The changes will mean that charitable, political and non-commercial organisations will be able to send direct marketing communications to persons who have previously expressed an interest in the organisation’s aims and ideals.
The current soft opt-in rules for business are subject to certain safeguards. We have applied the same safeguards to these new provisions for non-commercial organisations. We think these changes will help non-commercial organisations, including charities and political parties, to build ongoing relationships with their supporters. There is no good reason why the soft opt-in rule should apply to businesses but not to non-commercial organisations. I hope Members will see the benefit of these measures in ensuring the balance between protecting the most vulnerable in society and supporting organisations. I commend clauses 80 to 82 to the Committee.
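A rough sketch of the conditions being described may help. Broadly, under the existing soft opt-in for business, the contact details must have come from an existing customer relationship, the marketing must concern similar products or services, and an opt-out must have been offered at collection and in every message; clause 82 extends the same logic to supporters who have expressed an interest in a non-commercial organisation’s aims. The field names below are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Contact:
        # Details came from a sale, negotiation or expression of interest.
        obtained_from_existing_relationship: bool
        # The message concerns similar products, services or aims.
        message_concerns_similar_aims: bool
        opt_out_offered_at_collection: bool
        opt_out_offered_in_every_message: bool
        has_opted_out: bool

    def soft_opt_in_applies(c: Contact) -> bool:
        """True if direct marketing may be sent without fresh consent."""
        return (c.obtained_from_existing_relationship
                and c.message_concerns_similar_aims
                and c.opt_out_offered_at_collection
                and c.opt_out_offered_in_every_message
                and not c.has_opted_out)

    supporter = Contact(True, True, True, True, has_opted_out=False)
    print(soft_opt_in_applies(supporter))  # -> True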
As I have said many times during our discussion of the Bill, I believe that the Information Commissioner should be given proportionate powers to investigate and take action where that is needed to uphold our regulations. That is no less the case with clause 80, which introduces measures that allow the Information Commissioner to investigate organisations responsible for generating unsolicited direct marketing communications, even if they are not received by anyone.
Clause 81 simply lifts the definition of “direct marketing” from the Data Protection Act 1998 and places it into the PEC regulations to increase the readability of that legislation. I have no issues with that.
Clause 82 extends the soft opt-in rules to charities and non-commercial organisations. It is only right that the legislation is consistent in offering non-profits the opportunity to send electronic marketing communications in the same way as for-profit organisations. It might, however, be worth raising the public’s awareness of the rule and of the ability to opt out at any point, so that, if people suddenly find themselves on the end of such communications, they have a clear understanding of why that is the case and know that consent may be withdrawn if they so wish.
Question put and agreed to.
Clause 80 accordingly ordered to stand part of the Bill.
Clauses 81 and 82 ordered to stand part of the Bill.
Clause 83
Direct marketing for the purposes of democratic engagement
I beg to move amendment 55, in clause 83, page 107, line 41, leave out “or transitional” and insert “, transitional, transitory or saving”.
This amendment provides that regulations under clause 83 can make transitory or saving provision.
With this it will be convenient to discuss the following:
Clauses 83 and 84 stand part.
Before I speak to the amendment, I will set out the provisions of clause 83, which gives the Secretary of State the power to make exceptions to the PEC regulations’ direct marketing provisions for communications sent for the purposes of democratic engagement. We do not intend to use the powers immediately because the Bill contains a range of other measures that will facilitate a responsible use of personal data for the purposes of political campaigning, including the extension of the soft opt-in rule that we have just debated. However, it is important we keep the changes we are making in the Bill under review to make sure that elected representatives and parties can continue to engage transparently with the electorate and are not unnecessarily constrained by data protection and privacy rules.
The Committee will note that if the Secretary of State decided to exercise the powers, there are a number of safeguards in the clause that will maintain a sensible balance between the need for healthy interaction with the electorate and any expectations that an individual might have with regard to privacy rights. Any new exceptions would be limited to communications sent by the individuals and organisations listed in clause 83, including elected representatives, registered political parties and permitted participants in referendum campaigns.
Before laying any regulations under the clause, the Secretary of State will need to consult the Information Commissioner and other interested parties, and have specific regard for the effect that further exceptions could have on the privacy of individuals. Regulations will require parliamentary approval via the affirmative resolution procedure. Committee members should also bear in mind that the powers will not affect an individual’s right under the UK GDPR to opt out of receiving communications.
We have also tabled two technical amendments to the clause to improve the way it is drafted. Government amendment 55 will make it clear that regulations made under this power can include transitory or savings provisions in addition to transitional provisions. Such provisions might be necessary if, for example, new exceptions were only to apply for a time-limited period. Clause 84 is also technical in nature and simply sets out the meaning of terms such as “candidate”, “elected representative” and “permitted participant” for the purposes of clause 83.
The clauses somewhat mirror the inclusion of democratic engagement purposes in the recognised legitimate interests list. However, here, rather than giving elected representatives and the like an exemption from completing a balancing test when processing for this purpose, the Bill paves the way for them to be exempt from certain direct marketing provisions in future.
The specific content of any future changes, however, should be properly scrutinised. As such, it is disappointing that the Government have not indicated how they intend to use such regulations in future. I appreciate that the Minister has just said that they do not intend to use them right now, but does he have in mind examples of exemptions that he might like to make from the direct marketing provisions for democratic engagement purposes? That is not to say that such exemptions will not be justified; just that their substance should be openly discussed and democratically scrutinised.
As I have set out, the existing data protection provisions remain under the GDPR. In terms of specific exemptions, I have said that the list will be subject to future regulation making, which will also be subject to parliamentary scrutiny. We will be happy to supply a letter to the hon. Lady to set out specific examples of where that might be the case.
Amendment 55 agreed to.
Clause 83, as amended, ordered to stand part of the Bill.
Clause 84
Meaning of expressions in section 83
Amendment made: 31, in clause 84, page 110, line 31, leave out “fourth day after” and insert
“period of 30 days beginning with the day after”.—(Sir John Whittingdale.)
Clauses 83 and 84 enable regulations to make exceptions from direct marketing rules in the PEC Regulations, including for certain processing by elected representatives. This amendment increases the period for which former members of the Westminster Parliament and the devolved legislatures continue to be treated as "elected representatives" following an election. See also NC6 and Amendment 30.
Clause 84, as amended, ordered to stand part of the Bill.
Clause 85
Duty to notify the Commissioner of unlawful direct marketing
I beg to move amendment 56, in clause 85, page 112, line 35, at end insert—
“(13A) Regulations under paragraph (13) may make transitional provision.
(13B) Before making regulations under paragraph (13), the Secretary of State must consult—
(a) the Commissioner, and
(b) such other persons as the Secretary of State considers appropriate.”
This amendment enables regulations changing the amount of a fixed penalty under regulation 26B of the PEC Regulations to include transitional provision. It also requires the Secretary of State to consult the Information Commissioner and such other persons as the Secretary of State considers appropriate before making such regulations.
With this it will be convenient to discuss the following:
Amendment 118, in clause 85, page 113, line 3, at end insert—
“(1A) Guidance under this section must—
(a) make clear that a provider of a public electronic communications service is not obligated to monitor the content of individual electronic communications in order to determine whether those communications contravene the direct marketing regulations; and
(b) include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening or has contravened any of the direct marketing regulations.”
Government amendment 33.
Clause stand part.
Before I speak to Government amendment 56, it might be helpful to set out the provisions of clause 85. The clause will help to ensure that there is better co-operation between the industry and the regulator in tackling the problem of nuisance communications. It places a duty on public electronic communications service and network providers to notify the Information Commissioner within 28 days if they have “reasonable grounds” for suspecting that unlawful direct marketing communications are transiting their services or networks. Once notified, the ICO will investigate whether a breach of the PEC regulations has occurred and take appropriate action where necessary.
We cannot expect network and service providers to know for certain whether a customer has agreed to receive a marketing call, which is why the new requirement is predicated on the organisation having reasonable grounds for suspecting that something unlawful is occurring. For example, there might be cases where a communications network or service provider notices a large volume of calls being generated in quick succession, with only one digit in the telephone number changing each time. That might suggest that calls are being made indiscriminately, without regard to whether the customer has registered with the telephone preference service or previously advised the caller that they did not want to be contacted.
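The kind of pattern described can be made concrete with a short sketch; the run-length threshold and function names are invented, and nothing in the clause prescribes this logic.

    def differs_by_one_digit(a: str, b: str) -> bool:
        """True if two equal-length numbers differ in exactly one position."""
        return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

    def looks_like_sequential_dialling(numbers: list, run_length: int = 5) -> bool:
        """Flag a time-ordered list of dialled numbers containing a long run
        in which each number is one digit away from its predecessor."""
        run = 1
        for prev, cur in zip(numbers, numbers[1:]):
            run = run + 1 if differs_by_one_digit(prev, cur) else 1
            if run >= run_length:
                return True
        return False

    calls = ["02079460001", "02079460002", "02079460003",
             "02079460004", "02079460005"]
    print(looks_like_sequential_dialling(calls))  # -> True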
We do not envisage that the provision will place significant new burdens on the network and service providers. It does not require them to put new systems in place to monitor for suspicious activities. However, where they have that capability already and have reasonable grounds to believe that unlawful activity is going on, we would like them to share that information with the ICO. The clause also requires the ICO to produce and publish guidance for network and service providers to help them to understand what intelligence information could reasonably be shared.
I shall respond to amendment 118 after the hon. Member for Barnsley East has spoken to it, but it might be helpful for me briefly to explain Government amendment 56. The fixed penalty for failure to comply with the duty, which is currently set at £1,000, is being kept under review. Where appropriate, the Secretary of State can use regulations to change the fine amount. The amendment will ensure that those regulation-making powers are consistent with similar powers elsewhere in the Bill. The regulations could include transitional provisions, and the amendment will also require the Secretary of State to consult the Information Commissioner and other persons they consider appropriate before making such regulations.
Government amendment 33 is a minor and technical change designed to improve the readability of the legislation.
The amount is fixed in the Bill at £1,000, Minister. That is stated at clause 85 in proposed new regulation 26B. The Bill states:
“The amount of a fixed monetary penalty under this regulation shall be £1,000.”
That does not indicate any flexibility. I draw that to the attention of the Committee.
The ambition of the clause is broadly welcome, and we agree that there is a need to tackle unwanted calls, but the communications sector, including Vodafone and BT, as well as techUK, has shared concerns that the clause, which will place a new duty on telecoms providers to report to the commissioner whenever they have “reasonable grounds” for suspecting a breach of direct marketing regulations, might not be the best way to solve the issue.
I will focus my remarks on highlighting those concerns, and how amendment 118 would address some of them. First, though, let me say that the Government have already made it clear in their explanatory notes that it is not the intention of the Bill to require providers to monitor communications. However, that has not been included in the Bill, which has caused some confusion in the communications sector.
Amendment 118 would put that confusion to rest by providing for the explicit inclusion of the clarification in the clause itself. That would provide assurances to customers who would be sure their calls and texts would not be monitored, and to telecoms companies, which would be certain that such monitoring of content was absolutely not required of them.
Secondly, the intent of the clause is indeed not to have companies monitoring communications, but many relevant companies have raised concerns about the technological feasibility of identifying instances of unlawful and unsolicited direct marketing. Indeed, the new duty will require telecommunications providers to be able to identify whether a person receiving a direct marketing call has or has not given consent to receive the call from the company making it. However, providers have said they cannot reliably know that, and have warned that there is no existing technology to conduct that kind of monitoring accurately and at scale. In the absence of communication monitoring and of examples of how unsolicited direct marketing is to be identified, it is therefore unclear how companies will fulfil their duties under the clause.
That is not to say the industry is not prepared to commit significant resources to tackling unwanted calls. BT, for example, has set up a range of successful tools to help customers. That includes BT Call Protect, which is used by 4.4 million BT customers and now averages 2.35 million calls diverted per week. However, new measures must be feasible, and our amendment 118 would therefore require that guidance around the implementation of the clause include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening, or has contravened, any of the direct marketing regulations.
If the Minister does not intend to support the amendment, I would like to hear such examples from him today, so that the communications sector was absolutely clear about how to fulfil its new duties, given the technology available.
As the hon. Lady has said, amendment 118 would require the commissioner to state clearly in the guidance that the new duty does not oblige providers to intercept or monitor the content of electronic communications in order to determine whether there has been a contravention of the rules. It would also require the guidance to include illustrative examples of the types of activity that may cause a provider reasonably to suspect that there had been a contravention of the requirements.
I recognise that the amendment echoes concerns that have been raised by communications service providers, and that there has been some apprehension about exactly what companies will have to do to comply with the duty. In response, I would emphasise that “reasonable grounds” does mean reasonable in all circumstances.
The hon. Lady has asked for an example of the kind of activity that might give reasonable grounds for suspicion. I direct her to the remarks I made in moving the amendment and the example of a very large number of calls being generated in rapid succession in which, in each case, the telephone number is simply one digit away from the number before. The speed at which that takes place does provide reasonable grounds to suspect that the requirement to, for instance, check with the TPS is not being fulfilled.
There are simple examples of that kind, but I draw the attention of the hon. Lady and the Committee to the consultation requirements that will apply to the ICO’s guidance. In addition to consulting providers of public electronic communications networks and services on the development of the guidance, the ICO will be required to consult the Secretary of State, Ofcom and other relevant stakeholders to ensure that the guidance is as practical and useful to organisations as possible.
Does my right hon. Friend agree that, if amendment 118 were made, it could be used as a general get-out-of-jail-free card by companies? Let us consider, for example, a situation where a company could easily and obviously have spotted a likely breach of the regulations and should have intervened. When the commissioner discovered that the company had failed in its duty to do so, the company could turn around and say, “Well, yes, we missed that, but we were not under any obligation to monitor.” It is therefore important that there is a requirement for companies to use their best endeavours to monitor where possible.
I completely agree; my hon. Friend is right to make that distinction. Companies should use their best endeavours, but it is worth repeating that the guidance does not expect service and network providers to monitor the content of individual calls and messages to comply with the duty. There is more interest in patterns of activity on networks, such as where a rogue direct marketing firm behaves in the manner that I set out. On that basis, I ask the hon. Lady not to press her amendment to a vote.
I appreciate the Minister’s comments and those of the hon. Member for Folkestone and Hythe. We have no issue with the monitoring of patterns; we wanted clarification on the content. I am not sure that the Minister addressed the concerns about the fact that, although the Government have provided a partial clarification in the explanatory notes, this is not in the Bill. For that reason, I will press my amendment to a vote.
Amendment 56 agreed to.
Amendment proposed: 118, in clause 85, page 113, line 3, at end insert—
“(1A) Guidance under this section must—
(a) make clear that a provider of a public electronic communications service is not obligated to monitor the content of individual electronic communications in order to determine whether those communications contravene the direct marketing regulations; and
(b) include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening or has contravened any of the direct marketing regulations.”—(Stephanie Peacock.)
Question put, That the amendment be made.
I beg to move amendment 57, in clause 86, page 113, line 38, at end insert—
“(13A) Regulations under paragraph (13) may make transitional provision.
(13B) Before making regulations under paragraph (13), the Secretary of State must consult—
(a) the Information Commissioner, and
(b) such other persons as the Secretary of State considers appropriate.”
This amendment enables regulations changing the amount of a fixed penalty under regulation 5C of the PEC Regulations to include transitional provision. It also requires the Secretary of State to consult the Information Commissioner and such other persons as the Secretary of State considers appropriate before making such regulations.
With this it will be convenient to discuss the following:
Clause stand part.
Government amendments 32 and 58.
That schedule 10 be the Tenth schedule to the Bill.
Before turning specifically to the provisions of the amendment, I will set out the provisions of clause 86 and schedule 10. Clause 86 updates the ICO’s powers in respect of enforcing the PEC regulations. Currently, the ICO has to rely mainly on outdated powers in the Data Protection Act 1998 to enforce breaches of the PEC regulations. The powers were not updated when the UK GDPR and the Data Protection Act came into force in 2018. That means that some relatively serious breaches of the PEC regulations, such as nuisance calls being generated on an industrial scale, cannot be investigated as effectively or punished as severely as breaches under the data protection legislation.
The clause will therefore give the ICO the same investigatory and enforcement powers in relation to breaches of the PEC regulations as currently apply to breaches of the UK GDPR and the 2018 Act. That will result in a legal framework that is more consistent and predictable for organisations, particularly for those with processing activities that engage both the PEC regulations and the UK GDPR.
Clause 86 and schedule 10 add a new schedule to the PEC regulations, which sets out how the investigatory and enforcement powers in the 2018 Act will be applied to the PEC regulations. Among other things, that includes the power for the Information Commissioner to impose information notices, assessment notices, interview notices and enforcement and penalty notices. The maximum penalty that the Information Commissioner can impose for the most serious breaches of the PEC regulations will be increased to the same levels that can be imposed under the UK GDPR and the Data Protection Act. That is up to 4% of a company’s annual turnover or £17.5 million, whichever is higher.
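The “whichever is higher” ceiling can be worked through in a couple of lines; the turnover figures below are invented examples, while the £17.5 million and 4% figures are those stated above.

    STATUTORY_CAP = 17_500_000  # £17.5 million
    TURNOVER_RATE = 0.04        # 4% of annual turnover

    def maximum_penalty(annual_turnover: float) -> float:
        """Return the highest penalty available for the most serious breaches."""
        return max(STATUTORY_CAP, TURNOVER_RATE * annual_turnover)

    print(maximum_penalty(100_000_000))    # £17,500,000 - the cap is higher
    print(maximum_penalty(1_000_000_000))  # £40,000,000 - 4% of turnover is higher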
Relevant criminal offences under the Data Protection Act, such as the offence of deliberately frustrating an investigation by the Information Commissioner by destroying or falsifying information, are also applied to the PEC regulations. The updated enforcement provisions in new schedule 1 to the PEC regulations will retain some pre-existing powers that are unique to the previous regulations.
Clause 86 also updates regulation 5C of the PEC regulations, which sets out the fixed penalty amount for a failure to report a personal data breach under regulation 5. Currently, the fine level is set at £1,000. The clause introduces a regulation-making power, which will be subject to the affirmative procedure, for the Secretary of State to increase the fine level. We have tabled Government amendment 57 to provide an explicit requirement for the Secretary of State to consult the Information Commissioner and any other persons the Secretary of State considers appropriate before making new regulations. The amendment also confirms that regulations made under the power can include transitional provisions.
Finally, we have tabled two further minor amendments to schedule 10. Government amendment 58 makes a minor correction by inserting a missing schedule number. Government amendment 32 adjusts the provision that applies section 155(3)(c) of the Data Protection Act for the purposes of the PEC regulations. That is necessary as that section is being amended by schedule 4. Without making those corrective amendments, the provisions will not achieve the intended effect.
Clause 86 and schedule 10 insert and clarify the commissioner’s enforcement powers with regards to privacy and electronic communications regulation. Particularly of note within the proposals is the move to increase fines for nuisance calls and messages to a higher maximum penalty of £17.5 million or 4% of the undertaking’s total annual worldwide turnover, whichever is higher. That is one of the Government’s headline commitments in the Bill and should create tougher punishments for those who are unlawfully pestering people through their phones.
We are in complete agreement that more must be done to stop unwanted communications. However, to solve the problem as a whole, we must take stronger action on scam calling as well as on instances of unsolicited direct marketing. Labour has committed to going further than Ofcom’s new controls on overseas scam calls and has proposed the following to close loopholes: first, no phone call made from overseas using a UK telephone number should have that number displayed when it appears on a UK mobile phone or digital landline; and secondly, all mobile calls from overseas using a UK number should be blocked unless the network provider confirms that the known bill payer for the number is currently roaming. To mitigate the fact that some legitimate industries rely on overseas call centres that handle genuine customer service requests, we will also require Ofcom to register those legitimate companies and their numbers as exceptions to the blocking.
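The blocking rule in the second of those proposals can be sketched as a simple decision function; every name and number here is hypothetical, and the sketch records the proposal’s logic rather than any implementation.

    # Hypothetical Ofcom register of legitimate overseas call centres.
    REGISTERED_EXCEPTIONS = {"+443031234000"}

    def should_block(presented_number: str, arrives_from_overseas: bool,
                     bill_payer_roaming: bool) -> bool:
        """Block overseas calls presenting a UK number, unless the bill
        payer is roaming or the number is a registered exception."""
        if not arrives_from_overseas or not presented_number.startswith("+44"):
            return False  # the rule targets UK numbers presented from abroad
        if presented_number in REGISTERED_EXCEPTIONS:
            return False  # e.g. a genuine overseas customer service centre
        return not bill_payer_roaming

    print(should_block("+447700900123", True, False))  # True - likely spoofed
    print(should_block("+447700900123", True, True))   # False - customer roaming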
As the clause and schedule seek to take strong action against unwanted communications, I would be pleased to hear from the Minister whether the Government would consider going further and matching our commitments on overseas scam calling, too.
I say to the hon. Lady that the provisions deal specifically with nuisance calls, not necessarily scam calls. As she will know, the Government have a comprehensive set of policies designed to address fraud committed through malicious or scam calls, and those are being taken forward through the fraud prevention strategy. I accept that more needs to be done, and I say to her that work is already taking place.
Amendment 57 agreed to.
Clause 86, as amended, ordered to stand part of the Bill.
Schedule 10
Privacy and electronic communications: Commissioner’s enforcement powers
Amendments made: 32, in schedule 10, page 180, line 25, leave out “for “data subjects”” and insert
“for the words from “data subjects” to the end”.
This amendment adjusts provision applying section 155(3)(c) of the Data Protection Act 2018 (penalty notices) for the purposes of the PEC Regulations to take account of the amendment of section 155(3)(c) by Schedule 4 to the Bill.
Amendment 58, in schedule 10, page 183, line 5, at end insert “15”.—(John Whittingdale.)
This amendment inserts a missing Schedule number, so that the provision refers to Schedule 15 to the Data Protection Act 2018.
Schedule 10, as amended, agreed to.
Clause 87
The eIDAS Regulation
Question proposed, That the clause stand part of the Bill.
Clauses 87 to 91 make changes to the UK’s eIDAS regulation to support the effective functioning of the UK’s trust services market into the future. Clause 87 states that when clauses 88 to 91 refer to the eIDAS regulation, this means regulation 910/2014, on electronic identification and trust services for electronic transactions in the internal market, which was adopted by the European Parliament and the Council of the European Union on 23 July 2014.
There is potential for confusion between the UK eIDAS regulation and the EU eIDAS regulation from which it stems and which shares the same title. I can confirm that all references to the eIDAS regulation in clauses 88 to 91 refer to the regulation as it was retained and modified on EU exit to apply within the UK.
Clause 88 amends the UK eIDAS regulation so that conformity assessment reports issued by an accredited EU conformity assessment body can be recognised and used to grant a trust service provider qualified status under the regulation. UK-qualified trust services are no longer legally recognised within the EU, which has meant that qualified trust service providers who wish to operate within both the UK and the EU need to meet two sets of auditing requirements. That is not cost effective and creates regulatory barriers in the nascent UK trust services market. Unilateral recognition of EU conformity assessment bodies will remove an unnecessary regulatory barrier for qualified trust service providers wishing to operate within both the UK and EU markets.
Clause 89 provides the Secretary of State with a power to revoke articles 24A and 24B of the UK eIDAS regulation in the future, should the continued unilateral recognition of EU-qualified trust services, and the recognition of conformity assessment reports issued by EU conformity assessment bodies, no longer meet the needs of the UK market. Clause 89 also provides a power to amend article 24A in order to wind down the recognition of EU-qualified trust services, by removing the recognition of certain elements of EU-qualified trust service standards only.
For example, it will be possible to continue to recognise EU-qualified electronic time stamps and delivery services while ending the recognition of EU-qualified electronic signatures and seals, which will give the UK eIDAS regulation flexibility to adapt to future changes. The clause provides that any regulations made under this power will be subject to the negative resolution procedure.
“Trust services” refers to services including those relating to electronic signatures, electronic seals, timestamps, electronic delivery services and website authentication. As has been mentioned, trust services are required to meet certain standards and technical specifications for operation across the UK economy, which are set out in the eIDAS regulation. These clauses seek to make logistical adjustments to that legal framework for trust service products and services within the UK.
Although we understand that the changes are intended to provide flexibility in case EU regulations should no longer be adequate, and we absolutely agree that we must future-proof regulations to ensure that standards are always kept high, we must also ensure that any changes made are genuinely necessary, rather than being made simply for their own sake. It is vital that any alterations are intended to improve current practices and have been thoroughly considered to ensure that they make positive and meaningful change.
Question put and agreed to.
Clause 87 accordingly ordered to stand part of the Bill.
Clauses 88 to 91 ordered to stand part of the Bill.
Clause 92
Disclosure of information to improve public service delivery to undertakings
Question proposed, That the clause stand part of the Bill.
The clause will amend the Digital Economy Act 2017 to extend the powers under section 35 to include businesses. Existing powers enable public authorities to share data to support better services to individuals and households. The Government believe that businesses too can benefit from responsive, joined-up public services across the digital economy. The clause introduces new data sharing powers allowing specified public authorities to share data with other specified public authorities for the purposes of fulfilling their functions.
The sharing of data will also provide benefits for the public in a number of ways. It will pave the way for businesses to access Government services more conveniently, efficiently and securely—by using digital verification services, accessing support when starting up new businesses, completing import and export processes, or applying for Government grants such as rural grants. Any data sharing will of course be carried out in accordance with the requirements of the Data Protection Act and the UK GDPR.
Being able to share data about businesses will bring many benefits. For example, by improving productivity while keeping employment high we can earn more, raising living standards, providing funds to support our public services and improving the quality of life for all citizens. Now that we have left the EU, businesses that take action to improve their productivity will increase their resilience to changing market conditions and be more globally competitive. The Minister will be able to make regulations to add new public authorities to those already listed in schedule 4 to the Digital Economy Act. However, any regulations would be made by the affirmative procedure, requiring the approval of both Houses. I commend the clause to the Committee.
The clause amends section 35 of the Digital Economy Act to enable specified public authorities to share information to improve the delivery of public services to businesses with other specified persons. That echoes the existing legal gateway that allows for the sharing of information on improving the delivery of public services to individuals and households.
I believe that the clause is a sensible extension, but would have preferred the Minister and his Department to have considered public service delivery more broadly when drafting the Bill. While attention has rightly been paid throughout the Bill to making data protection regulation work in the interests of businesses, far less attention has gone towards how we can harness data for the public good and use it to the benefit of our public services. That is a real missed opportunity, which Labour would certainly have taken.
Question put and agreed to.
Clause 92 accordingly ordered to stand part of the Bill.
Clause 93
Implementation of law enforcement information-sharing agreements
I beg to move amendment 8, in clause 93, page 119, line 18, leave out first “Secretary of State” and insert “appropriate national authority”.
This amendment, Amendment 10 and NC5 enable the regulation-making power conferred by clause 93 to be exercised concurrently by the Secretary of State and, in relation to devolved matters, by Scottish Ministers and Welsh Ministers.
With this it will be convenient to discuss the following:
Government amendments 9 to 16.
Government new clause 5—Meaning of “appropriate national authority”.
Clause 93 creates a delegated power for the Secretary of State, and a concurrent power for Welsh and Scottish Ministers, to make regulations to implement international agreements relating to the sharing of information for law enforcement purposes. The concurrent power for Welsh and Scottish Ministers has been included in an amendment to the clause. While international relations are a reserved matter, the domestic implementation of the provisions likely to be contained in future international agreements may be devolved, given that law enforcement is a devolved matter to various extents in each devolved Administration.
In the light of introducing a concurrent power for Welsh and Scottish Ministers, amendments to clauses 93 and 108 have been tabled, as has new clause 5. Together they specifically detail the appropriate national authority that will have the power to make regulations in respect of clause 93. The Government amendments make it clear that the appropriate national authority may make the regulations. New clause 5 then defines who is an appropriate national authority for those purposes. I therefore commend new clause 5 and the related Government amendments to the Committee.
It is right that the powers conferred by clause 93 can be exercised by devolved Ministers where appropriate. I therefore have no objections to the amendments or the new clause.
Amendment 8 agreed to.
Amendments made: 9, in clause 93, page 119, line 18, leave out second “Secretary of State” and insert “authority”.
This amendment is consequential on Amendment 8.
Amendment 10, in clause 93, page 119, line 36, at end insert—
‘“appropriate national authority” has the meaning given in section (Meaning of “appropriate national authority”);’.—(Sir John Whittingdale.)
See the explanatory statement for Amendment 8.
Question proposed, That the clause, as amended, stand part of the Bill.
As I have already set out, clause 93 creates a delegated power for the Secretary of State, along with a concurrent power for Welsh and Scottish Ministers, to make regulations to implement international agreements relating to the sharing of information for law enforcement purposes. The legislation will provide powers to implement technical aspects of such international agreements via secondary legislation once the agreements have been negotiated.
Clause 93 stipulates that regulations can be made in connection with implementing an international agreement only in so far as it relates to the sharing of information for law enforcement purposes, and that any data sharing must comply with data protection legislation. These measures will enable the implementation of new international agreements designed to help keep the public safe from the threat posed by international criminality and cross-border crime, as well as helping to protect vulnerable people.
I believe the position is that at the present time, Northern Ireland does not have a functioning Assembly, so it is not possible, but that may change in due course.
With this it will be convenient to discuss the following:
Clauses 95 to 98 stand part.
That schedule 11 be the Eleventh schedule to the Bill.
Clauses 94 to 98 amend the Registration Service Act 1953 and the Births and Deaths Registration Act 1953—which I will refer to as the Act—and introduce schedule 11, which contains minor and consequential amendments. Currently, under the Act, the Registrar General for England and Wales provides the local registration service with paper live birth, stillbirth and death registers and with paper forms for making certified copies of the register entries—for example, birth and death certificates. Since 2009, registrars in England and Wales also record birth and death registration information electronically, in parallel with the paper-based systems. That is a duplication of effort for registrars.
Clause 94(2) amends the Act and substitutes section 25 with a new section 25. The new section will allow the Registrar General to determine in which form registers of live births, stillbirths and deaths are to be kept, and contains additional provision appropriate for the keeping of registers in an electronic form only. New section 25(2) of the Act allows the Registrar General to require that registrars keep information in a form that will allow the Registrar General and the superintendent registrar to have immediate access to all live birth and death entries as soon as the registrar has entered the details in the register. In the case of stillbirths, new section 25(2)(b) allows the Registrar General to have immediate access to the entries in the register.
New section 25(3) provides that where a register is kept in such form as determined under new section 25(2)—for example, an electronic form—any information in that register made available to the Registrar General or superintendent registrar is deemed to be held by that person, as well as the registrar, when carrying out that person’s functions—for example, the issue of certified copies.
Clause 94(3)(a) and (b) omit sections 26 and 27 of the Act, which set out the requirements for the quarterly returns made by a registrar and superintendent registrar. These returns will no longer be needed, as the superintendent registrar and the Registrar General will have immediate access to the records as provided for by new section 25 of the Act.
Clause 94(3)(c) omits section 28 of the Act, which sets out how paper registers must be stored by registrars, superintendent registrars and the Registrar General. With the introduction of new section 25, that provision is no longer necessary as it would not be relevant to an electronic register.
Proposed new section 25(4) of the Act provides that anything that is required for the purposes of creating and maintaining the registers—for example, providing registrars with the electronic system—is the responsibility of the Registrar General. Proposed new section 25(5) of the Act places a responsibility on the Registrar General to provide the required forms that the local registration service will need to produce certified copies of entries—for example, birth and death certificates.
Clauses 94 to 98 amend the Births and Deaths Registration Act, with the overall effect of removing the provision for birth and death records to be kept on paper, and allowing them to be held in an online database. This is a positive move, with the potential to bring many benefits. First, it will improve the functioning of the registration system—for example, it will allow the Registrar General and the superintendent registrar to have immediate access to all birth and death entries as soon as they have been entered into the system. The changes will undoubtedly be important to families who are experiencing joy or loss, because they make registrations easier and more likely to be correct in the first instance, minimising unnecessary clarifications at what can often be a very difficult time. Indeed, one of the recommendations of the 2022 UK Commission on Bereavement’s landmark report, which looked at the key challenges facing bereaved people in this country, was that it should be possible to register deaths online.
It is great that the Government have chosen to pursue this change. However, despite it being the recommendation listed right next to online death registration, the Government have not used this opportunity to explore the potential of extending the Tell Us Once service, which is disappointing. Indeed, the existing Tell Us Once service has proved very helpful to bereaved people in reducing the administrative burden they face, by enabling them to inform a large number of Government and public sector bodies in one process, rather than forcing them to go through the same process time and again. However, private organisations are not included, and loved ones are still tasked with contacting organisations such as employers, energy and electricity companies, banks, telephone and internet providers, and more. At a time of emotional struggle, this is a huge administrative burden to place on the bereaved and leaves them vulnerable to other unsettling variables, such as communication barriers and potentially insensitive customer service.
The commission found that 61% of adult respondents reported experiencing practical challenges when notifying the organisations that need to be made aware of the death of a loved one. We are therefore disappointed that the Government have not explored whether the Bill could extend the policy to the private sector in order to further reduce the burden on grieving friends and families, and make the inevitably difficult process a little easier. Overall, however, the clauses will mark a positive change for families up and down the country, and we are pleased to see them implemented.
I merely say to the hon. Lady that, having used the Tell Us Once service myself in relation to the death of my mother not that long ago, I absolutely hear what she says about the importance of making the process as easy as possible. We will certainly consider what she says.
Question put and agreed to.
Clause 94 accordingly ordered to stand part of the Bill.
Congratulations to the hon. Member for Solihull.
Clauses 95 to 98 ordered to stand part of the Bill.
Schedule 11 agreed to.
Clause 99
Information standards for health and adult social care in England
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
That schedule 12 be the Twelfth schedule to the Bill.
Schedule 12 makes it clear that information standards published under section 250 of the Health and Social Care Act 2012, as amended by the Health and Care Act 2022, can include standards relating to information technology or IT services that are used or intended to be used in connection with the processing of information. The schedule extends the potential application of information standards to the providers of IT products and services to the health and adult social care sector for England. It also introduces mechanisms for monitoring and enforcing compliance by IT providers with information standards, and allows for the establishment of an accreditation scheme for IT products and services.
It is absolutely right that health and care information can flow in a standardised way between different IT systems and across organisational boundaries in the health and adult social care system in England, for the benefit of individuals and their healthcare outcomes. Information standards are vital to enabling that, alongside joint working between everyone involved in the processing of health and care information.
These changes will support the efficient and effective operation of the health and adult social care system by making it easier for people delivering care to access accurate and complete information when they need it, improve clinical decision making and, ultimately, improve clinical outcomes for patients. The clause is a crucial enabler for the creation of a modern health and care service with systems that are integrated and responsive to the needs of patients and users. I therefore commend it to the Committee.
Information standards govern how data can be shared and compared across a sector. They are important in every sector in which they operate, but particularly in health, where they are critical to enabling the information sharing and interoperability necessary for good patient outcomes across health and social care services. For many reasons, however, we do not have a standard national approach to health data; as such, patients receive a far from seamless experience between different healthcare services. The Bill’s technical amendments and clarifications of existing rules on information standards in health, and how they interact with IT and IT services, are small but good steps in the journey towards trying to resolve that.
Tom Schumacher of Medtronic told us in oral evidence that one of the problems faced by his organisation and NHS trusts is
“variability in technical and IT security standards.”
He suggested that harmonising those standards would be a “real opportunity,” since it would mean that
“each trust does not have to decide for itself which international standard to use and which local standard to use.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 42, Q90.]
However, it is unclear how much headway these IT-related changes will make in providing that harmonisation, let alone the seamless service that patients so often call for.
I have one query that I hope the Minister can help with. MedConfidential has shared with us a concern that new section 251ZE of the Health and Social Care Act 2012 on accreditation of information technology, which is introduced by schedule 12, seems to imply that the Department of Health and Social Care and NHS England will have the power to set data standards in social care. MedConfidential says that would be a major policy shift, and that it seems unusual to implement such a shift through an otherwise unrelated Bill. Will the Minister write to me to clarify whether it is the Government’s intention to have DHSC and NHS England take over the information infrastructure of social care—and, if so, why they have come to that decision?
I am grateful to the hon. Lady for her support in general. I hear the concern that she expressed on behalf of the firm that has been in contact with her. We will certainly look into that, and I will be happy to let her have a written response in due course.
Mr Paisley, might I beg the Committee’s indulgence to correct the record? I incorrectly credited the hon. Member for Solihull for the private Member’s Bill, but it was in fact my hon. Friend the Member for Meriden (Saqib Bhatti). I apologise to him for getting his constituency wrong—
So we will take the congratulations away from Solihull and pass them elsewhere.
I am afraid that congratulations have been removed from Solihull and transferred to Meriden.
Better luck next time, Solihull! Thank you, Minister, for the correction.
Question put and agreed to.
Clause 99 accordingly ordered to stand part of the Bill.
Schedule 12 agreed to.
Ordered, That further consideration be now adjourned. —(Steve Double.)
Data Protection and Digital Information (No. 2) Bill (Eighth sitting) Debate
With this it will be convenient to discuss the following:
Government amendments 44 and 45.
That schedule 13 be the Thirteenth schedule to the Bill.
Clauses 101 to 103 stand part.
We now turn to part 5 of the Bill. Clauses 100 to 103 and schedule 13 will establish a body corporate, the Information Commission, to replace the existing regulator, the Information Commissioner, which is currently structured as a corporation sole. I should make it clear that the clauses will make no changes to the regulator’s role and responsibilities; all the functions that rest with the Information Commissioner will continue to sit with the new Information Commission.
Clause 100 will establish a body corporate, the Information Commission, to replace the existing regulator, the Information Commissioner. The commission will be governed by an independent board, with chair and chief executive roles, thereby spreading the responsibilities of the Information Commissioner across a larger number of people.
Clause 101 will abolish the office of the Information Commissioner and amend the Data Protection Act 2018 accordingly. To ensure an orderly transfer of functions, the Information Commissioner’s Office will not be abolished until the new body corporate, the Information Commission, is established.
Clause 102 provides for all regulatory and other functions of the Information Commissioner to be transferred to the new body corporate, the Information Commission, once it is established. The clause also provides for references to the Information Commissioner in enactments or other documents to be treated as references to the Information Commission, where appropriate, as a result of the transfer of functions to the new Information Commission.
Clause 103 will allow the Secretary of State to make a scheme for the transfer of property, rights and liabilities, including rights and liabilities relating to employment contracts, from the commissioner to the new commission. The scheme may transfer property such as IT equipment or office furniture, or transfer staff currently employed by the commissioner to the commission. The transfer scheme will be designed to ensure continuity and facilitate a seamless transition to the new Information Commission.
Schedule 13 will insert a new schedule 12A to the Data Protection Act 2018, which describes the nature, form and governance structure of the new body corporate, the Information Commission. The commission will be governed by an independent statutory board, which will consist of a chair and other non-executive members, as well as executive members including a chief executive. The new structure formalises aspects of the existing governance arrangements of the Information Commissioner’s Office and brings the ICO in line with how other UK regulators, such as Ofcom and the Financial Conduct Authority, are governed. The chair of the new commission will be appointed by His Majesty by letters patent on the recommendation of the Secretary of State, as is currently the case for the commissioner.
Schedule 13 also provides for the current Information Commissioner to transfer to the role of chair of the Information Commission for the remainder of their term. I put on record the Government’s intention to preserve the title of Information Commissioner in respect of the chair, in acknowledgment of the fact that the commissioner’s brand is recognised and valued both domestically and internationally. Other non-executive members will be appointed by the Secretary of State, and the chief executive will be appointed by the non-executive members in consultation with the Secretary of State.
Government amendment 45 will allow the chair to appoint the first chief executive on an interim basis and for a term of up to a maximum of 24 months, which will minimise any delay in the transition from the commissioner to the new commission. As drafted, the Bill provides that the chief executive of the commission will be appointed by the non-executive members once they are in place, in consultation with the Secretary of State. The transition from the commissioner to the new Information Commission cannot take place until the board is properly constituted, with, as a minimum, a chair, another non-executive member and a chief executive in place. That requirement would be likely to cause delay to the transition, as the appointment of the non-executive members by the Secretary of State and the chief executive would need to take place consecutively.
Amendment 44 is a minor consequential amendment to paragraph 3(3)(a) of proposed new schedule 12A, making it clear that the interim chief executive is appointed as an executive member.
The amendments seek to minimise any delay in the transfer of functions to the new commission by enabling the appointment of the chief executive to take place in parallel with the appointments process for non-executive members. The appointment of the interim chief executive will be made on the basis of fair and open competition and in consultation with the Secretary of State. I commend clauses 100 to 103, schedule 13 and Government amendments 44 and 45 to the Committee.
It is a pleasure to serve under your chairship once again, Mr Hollobone. The clauses that restructure the Information Commissioner’s Office are among those that the Opposition are pleased to welcome in the Bill.
The Information Commissioner is the UK’s independent regulator for data protection and freedom of information under the Data Protection Act 2018 and the Freedom of Information Act 2000. Under the current system, as the Minister outlined, the Information Commissioner’s Office is a corporation sole, meaning that one person has overall responsibility for data protection and freedom of information, with a group of staff supporting them. However, as the use of data in our society has grown, so too has the ICO, from a team of 10 in 1984 to an organisation with more than 500 staff.
In that context, the corporation sole model is obviously not fit for purpose. Clauses 100 to 103 recognise that: they propose changes that will modernise the Information Commissioner’s Office, turning it into the Information Commission by abolishing the corporation sole and replacing it with a body corporate. It is absolutely right that those changes be made, transforming the regulator into a commission with a broader set-up structure and a board of executives, among other key changes. That will bring the ICO in line with other established UK regulators such as Ofcom and the Financial Conduct Authority, reflect the fact that the ICO is not just a small commissioner’s office, and ensure that it is equipped to deal with the volume of work for which it has responsibility.
It is essential that the ICO remains independent and fair. We agree that moving from an individual to a body will ensure greater integrity, although the concerns that I have raised about the impact of earlier clauses on the ICO’s independence certainly remain. Overall, however, we are pleased that the Government recognise that the ICO must be brought in line with other established regulators and are making much-needed changes, which we support.
Question put and agreed to.
Clause 100 accordingly ordered to stand part of the Bill.
Schedule 13
The Information Commission
Amendments made: 44, in schedule 13, page 195, line 21, after “members” insert
“or in accordance with paragraph 23A”.
This amendment is consequential on Amendment 45.
Amendment 45, in schedule 13, page 204, line 6, at end insert—
“Transitional provision: interim chief executive
23A (1) The first chief executive of the Commission is to be appointed by the chair of the Commission.
(2) Before making the appointment the chair must consult the Secretary of State.
(3) The appointment must be for a term of not more than 2 years.
(4) The chair may extend the term of the appointment but not so that the term as extended is more than 2 years.
(5) For the term of appointment, the person appointed under sub-paragraph (1) is “the interim chief executive”.
(6) Until the expiry of the term of appointment, the powers conferred on the non-executive members by paragraph 11(2) and (3) are exercisable in respect of the interim chief executive by the chair (instead of by the non-executive members).
(7) In sub-paragraphs (5) and (6), the references to the term of appointment are to the term of appointment described in sub-paragraph (3), including any extension of the term under sub-paragraph (4).”—(Sir John Whittingdale.)
The Bill establishes the Information Commission. This new paragraph enables the chair of the new body, in consultation with the Secretary of State, to appoint the first chief executive (as opposed to the appointment being made by non-executive members). It also enables the chair to determine the terms and conditions, pay, pensions etc relating to the appointment.
Schedule 13, as amended, agreed to.
Clauses 101 to 103 ordered to stand part of the Bill.
Clause 104
Oversight of retention and use of biometric material
Question proposed, That the clause stand part of the Bill.
Clause 104 will repeal the role of the Biometrics Commissioner and transfer the casework functions to the Investigatory Powers Commissioner. There is an extensive legal framework to ensure that the police can make effective use of biometrics, for example as part of an investigation to quickly and reliably identify suspects, while maintaining public trust. That includes the Police and Criminal Evidence Act 1984, which sets out detailed rules on DNA and fingerprints, and the Data Protection Act 2018, which provides an overarching framework for the processing of all personal data.
The oversight framework is complicated, however, and there are overlapping responsibilities. The Biometrics Commissioner currently has specific oversight responsibilities just for police use of DNA and fingerprints, while the Information Commissioner’s Office regulates the use of all personal data, including biometrics, by any organisation, including the police. Clause 104 will simplify the framework by removing the overlap, leaving the ICO to provide independent oversight and transferring the casework functions to another existing body.
The casework involves extending retention periods in certain circumstances, particularly on national security grounds, and is quasi-judicial in nature. That is why clause 104 transfers those functions to the independent Investigatory Powers Commissioner, which has the necessary expertise, and avoids the conflict of interest that could occur if the functions were transferred to the ICO as regulator. Transparency in police use of biometrics is essential to retaining public trust and will continue through the annual reports of the Forensic Information Databases Service strategy board, the Investigatory Powers Commissioner and the ICO. I commend clause 104 to the Committee.
I will speak in more detail about my more general views on the oversight of biometrics, particularly their private use, when we come to new clauses 13, 14 and 15. However, as I look specifically at clauses 104 and 105, which seek to abolish the currently combined offices of Biometrics Commissioner and Surveillance Camera Commissioner, I would like to draw on the direct views of the Information Commissioner. In his initial response to “Data: a new direction”, which proposed absorbing the functions of the Biometrics Commissioner and Surveillance Camera Commissioner into the ICO, the commissioner said that there were some functions that,
“if absorbed by the ICO, would almost certainly result in their receiving less attention”.
Other functions, he said,
“simply do not fit with even a reformed data protection authority”
with there being
“far more intuitive places for them to go.”
That was particularly so, he said, with biometric casework.
It is therefore pleasing that as a result of the consultation responses the Government have chosen to transfer the commissioner’s biometric functions not to the ICO but to the Investigatory Powers Commissioner, acknowledging the relevant national security expertise that it can provide. However, in written evidence to this Committee, the commissioner reiterated his concern about the absorption of his office’s functions, saying that work is currently being undertaken within its remit that, under the Bill’s provisions, would be unaccounted for.
Given that the commissioner’s concerns clearly remain, I would be pleased if the Minister provided in due course a written response to that evidence and those concerns. If not, the Government should at the very least undertake their own gap analysis to identify areas that will not be absorbed under the current provisions. It is important that this Committee and the office of the Biometrics and Surveillance Camera Commissioner can be satisfied that all the functions will be properly delegated and given the same degree of attention wherever they are carried out. Equally, it is important that those who will be expected to take on these new responsibilities are appropriately prepared to do so.
I am happy to provide the further detail that the hon. Lady has requested.
Question put and agreed to.
Clause 104 accordingly ordered to stand part of the Bill.
Clause 105
Removal of provision for regulation of CCTV etc
I beg to move amendment 123, in clause 105, page 128, line 22, leave out subsections (2) and (3).
Having outlined my broad concerns about clause 105 when I spoke to clause 104, I will focus briefly on the specific concern raised by the hon. Member for Glasgow North West, which is that the Surveillance Camera Commissioner’s functions will not be properly absorbed.
In evidence to the Committee, the commissioner outlined a number of non-data protection functions in relation to public space surveillance that their office currently carries out, but that, they believe, the Bill does not make provision to transfer. They cite the significant work that their office has undertaken to ensure that Government Departments are able
“to cease deploying visual surveillance systems onto sensitive sites where they are produced by companies subject to the National Intelligence Law of the People’s Republic of China”,
following a November 2022 instruction from the Chancellor of the Duchy of Lancaster. The commissioner says that such non-data protection work, which has received international acclaim, is not addressed in the Bill.
I am therefore hopeful that the explicit mention in amendment 123 that the functions of the Surveillance Camera Commissioner will be transferred provides a backstop to ensure that all the commissioner’s duties, including the non-data protection work, are accounted for. If the amendment is not accepted, a full gap analysis should be conducted, as argued previously, with a full response issued to the commissioner’s evidence to ensure that every one of the functions is properly and appropriately absorbed.
I understand the argument that the Surveillance Camera Commissioner’s powers would be better placed with the Investigatory Powers Commissioner, rather than the ICO. Indeed, the commissioner’s evidence to the Committee referenced the interim findings of an independent report it had commissioned, as the hon. Member for Glasgow North West just mentioned. The report found that most of the gaps left by the Bill could be addressed if responsibility for the surveillance camera code moved under the IPCO, harmonising the oversight of traditional and remote biometrics.
I end by pointing to a recent example that shows the value of proper oversight of the use of surveillance. Earlier this year, following a referral from my hon. Friend the Member for Bristol North West (Darren Jones), the ICO found that a school in Bristol had unlawfully installed covert CCTV cameras at the edge of its playing fields. Since then, the Surveillance Camera Commissioner has been responding to freedom of information requests on the matter, with more information about the incident thereby emerging as recently as yesterday. It is absolutely unacceptable that a school should be filming people without their knowledge. The Surveillance Camera Commissioner is a vital cog in the machinery of ensuring that such incidents are dealt with appropriately. For such reasons, we must preserve its functions.
In short, I am in no way opposed to the simplification of oversight in surveillance or biometrics, but I hope to see it done in an entirely thorough way, so that none of the current commissioner’s duties get left behind or go unseen.
I am grateful to the hon. Members for Glasgow North West and for Barnsley East for the points they have made. The hon. Member for Glasgow North West, in moving the amendment, was right to say that the clause as drafted abolishes the role of the Surveillance Camera Commissioner and the surveillance camera code that the commissioner promotes compliance with. The commissioner and the code, however, are concerned only with police and local authority use in England and Wales. Effective, independent oversight of the use of surveillance camera systems is critical to public trust. There is a comprehensive legal framework for the use of such systems, but the oversight framework is complex and confusing.
The ICO regulates the processing of all personal data by all UK organisations under the Data Protection Act; that includes surveillance camera systems operated by the police and local authorities, and the ICO has issued its own video surveillance guidance. That duplication is confusing for both the operators and the public and it has resulted in multiple and sometimes inconsistent guidance documents covering similar areas. The growing reliance on surveillance from different sectors in criminal investigations, such as footage from Ring doorbells, means that it is increasingly important for all users of surveillance systems to have clear and consistent guidance. Consolidating guidance and oversight will make it easier for the police, local authorities and the public to understand. The ICO will continue to provide independent regulation of the use of surveillance camera systems by all organisations. Indeed, the chair of the National Police Data Board, who gave evidence to the Committee, said that that will significantly simplify matters and will not reduce the level of oversight and scrutiny placed upon the police.
Amendment 123, proposed by the hon. Member for Glasgow North West, would retain the role of the Surveillance Camera Commissioner and the surveillance camera code. In our view, that would simply continue the complexity and duplication with the ICO’s responsibilities. Feedback that we received from our consultation showed broad support for simplifying the oversight framework, with consultees agreeing that the roles and responsibilities, in particular in relation to new technologies, were unclear.
The hon. Lady went on to talk about the oversight going beyond that of the Information Commissioner, but I point out that there is a comprehensive legal framework outside the surveillance camera code. That includes not only data protection, but equality and human rights law, to which the code cross-refers. The ICO and the Equality and Human Rights Commission will continue to regulate such activities. There are other oversight bodies for policing, including the Independent Office for Police Conduct and His Majesty’s inspectorate of constabulary, as well as the College of Policing, which provide national guidance and training.
The hon. Lady also specifically mentioned the remarks of the Surveillance Camera Commissioner about Chinese surveillance cameras. I will simply point out that the responsibility for oversight, which the ICO will continue to have, is not changed in any way by the Bill. The Information Commissioner’s Office continues to regulate all organisations’ use of surveillance cameras, and it has issued its own video surveillance guidance.
New clause 17 would transfer the functions of the commissioner to the Investigatory Powers Commissioner. As I have already said, we believe that that would simply continue to result in oversight resting in two different places, and that is an unnecessary duplication. The Investigatory Powers Commissioner’s Office oversees activities that are substantially more intrusive than those relating to overt surveillance cameras. IPCO’s existing work requires it to oversee over 600 public authorities, as well as several powers from different pieces of legislation. That requires a high level of expertise and specialisation to ensure effective oversight.
For those reasons, we believe that the proposals in the clause to bring the oversight functions under the responsibility of the Information Commissioner’s Office will not result in any reduction in oversight, but will result in the removal of duplication and greater clarity. On that basis, I am afraid that I am unable to accept the amendment, and I hope that the hon. Lady will consider withdrawing it.
I thank the Minister for responding to my amendments. However, we have a situation where we are going from having a specialist oversight to a somewhat more generalist oversight. That cannot be good when we are talking about this fast-moving technology. I will withdraw my amendment for the moment, but I reserve the right to bring it back at a later stage. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 105 ordered to stand part of the Bill.
Clause 106
Oversight of biometrics databases
Clause 106 makes changes to the National DNA Database Strategy Board, which provides oversight of the operation of the national DNA database, including setting policies for access and use by the police. Amendment 119 would seem to extend the power to widen the board’s potential scope beyond biometrics databases used for the purpose of identification, to include those used for the purpose of classification.
The police can process data only for policing purposes. It is not clear what policing purpose there would be in being able to classify, for example, emotions or gender, even assuming it was proven to be scientifically robust, or what sort of data would be on such a database. Even if one were developed in the future, it is likely to need knowledge, skills and resources very different from what is needed to oversee a database that identifies and eliminates suspects based on biometric identification, so it would probably make sense for a different body to carry out any oversight.
New clause 8 aims to make changes in a similar way to amendment 119 in relation to the definition of biometric data for the purposes of article 9 of the GDPR. As the GDPR is not concerned with the police’s use of biometric data for law enforcement purposes, the new clause would apply to organisations that are processing biometric data for general purposes. The aim seems to be to ensure that enhanced protections afforded by GDPR to biometric data used for unique identification purposes also apply to biometric data that is used for classification or categorisation purposes.
The hon. Lady referred to the Ada Lovelace Institute’s comments on these provisions, and to its 2022 “Countermeasures” report on biometric technologies, but we are not convinced that such a change is necessary. One example in the report was using algorithms to make judgments that prospective employees are bored or not paying attention, based on their facial expressions or tone of voice. Using biometric data to draw inferences about people, using algorithms or otherwise, is not as invasive as using biometric data uniquely to identify someone. For example, biometric identification could include matching facial images caught on closed circuit television to a centrally held database of known offenders.
Furthermore, using biometric data for classification or categorisation purposes is still subject to the general data protection principles in the UK GDPR. That includes ensuring that there is a lawful ground for the processing, that the processing is necessary and proportionate, and is fair and transparent to the individuals concerned. If algorithms are used to categorise and make significant decisions about people based on their biometric characteristics, including in an employment context, they will have the right to be given information about the decision, and to obtain human intervention, as a result of the measures we previously debated in clause 11.
We therefore see a distinction between the use of biometric information for identification purposes and the more general classification that the hon. Lady described. Given that we believe sufficient safeguards are already in place regarding the possible use of classification by biometric data, I hope that she will consider withdrawing the amendment.
I am grateful to the Minister for his comments. We will be speaking about the private uses of biometric data later, so I beg to ask leave to withdraw my amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
DNA and fingerprints are key tools in helping the police to identify and eliminate suspects quickly and accurately by comparing evidence left at crime scenes with the appropriate files on the national databases. As I previously set out, clause 106 makes changes to the National DNA Database Strategy Board. The board provides oversight of the operation of the database, including setting policies for access and use by the police.
These reforms change the scope of the board to make it clear that it should provide similar oversight of the police fingerprint database, which operates under similar rules. The change brings the legislation up to date with the board’s recently published governance rules. Clause 106 also updates the name of the board to the Forensic Information Databases Strategy Board, to better reflect the broadened scope of its work. We are also taking this opportunity to simplify and future-proof oversight of national police biometric databases. While DNA and fingerprints are well established, biometrics is an area of rapid technological development, including, for example, the growing use of iris, face and voice recognition. Given the pace of technological change in this area and the benefits of consistent oversight, clause 106 also includes a power for the Secretary of State to make regulations changing the board’s scope, for example by adding new biometric databases to the board’s remit or removing them where a database is no longer used. Such regulations would be subject to the affirmative procedure.
For these reasons, I commend the clause to the Committee.
Clause 106 will primarily increase the scope of the Forensic Information Databases Strategy Board to provide oversight of the national fingerprint database. However, there are also provisions enabling the Secretary of State to add or remove a biometric database that the board oversees, using the affirmative procedure. I would therefore like to ask the Minister whether the Government have any plans to use these powers in respect of any particular databases, or whether this is intended purely as a measure to future-proof the Bill should circumstances change.
I would also like to refer hon. Members to the remarks that I have made throughout the Bill that emphasise a need for caution when transferring the ability to change regulation further into the hands of the Secretary of State alone.
I would add only that this is an area where technology is moving very fast, as I referred to earlier. We think it is right to put in place this provision, to allow an extension if it becomes necessary—though I do not think we have any current plans. It is future-proofing of the Bill.
Question put and agreed to.
Clause 106 accordingly ordered to stand part of the Bill.
Clause 107
Regulations
Question proposed, That the clause stand part of the Bill.
Clause 107 will give the Secretary of State a regulation-making power to make consequential amendments to other legislation. The power enables amendments to this Bill itself where such amendments are consequential to the abolition of the Information Commissioner and his replacement by the new Information Commission. Such provision is needed because there are a number of areas where data protection legislation will need to be updated as a consequence of the Bill. This is a standard power, commonly included in Bills to ensure that wider legislation is updated where necessary as a result of new legislation. For example, references to “the Commissioner” in the Data Protection Act 2018 will no longer be accurate, given changes to the governance structure of the Information Commissioner’s Office within the Bill, so consequential amendments will be required to that Act.
Clause 108 outlines the form and procedure for making regulations under the powers in the Bill: they are to be made by statutory instrument. Where regulations in the Bill are subject to the affirmative resolution procedure, they may not be made unless a draft of the statutory instrument has been laid before Parliament and approved by a resolution of each House. That provision is needed because the Bill introduces new regulation-making powers, which are necessary to support the Bill’s policy objectives. For example, powers in part 3 of the Bill replace an existing statutory framework with a new, enhanced one.
Clause 109 explains the meaning of references to “the 2018 Act” and “the UK GDPR” in the Bill. Such provision is needed to explain the meaning of those two references. Clause 110 authorises expenditure arising from the Bill. That provision is needed to confirm that Parliament will fund any expenditure incurred under the Bill by the Secretary of State, the Treasury or a Government Department. It requires a money resolution and a Ways and Means resolution, both of which were passed in the House of Commons on 17 April.
Clause 111 outlines the territorial extent of the Bill. Specifically, the clause states that the Bill extends to England and Wales, Scotland and Northern Ireland, with some exceptions. Much of the Bill, including everything on data protection, is reserved policy. In areas where the Bill legislates on devolved matters, we are working with the devolved Administrations to secure legislative consent motions. Clause 112 gives the Secretary of State a regulation-making power to bring the Bill’s provisions into force. Some provisions, listed in subsection (2), come into force on the date of Royal Assent. Other provisions, listed in subsection (3), come into force two months after Royal Assent. Such provision is needed to outline when the Bill’s provisions will come into force.
Clause 113 gives the Secretary of State a regulation-making power to make transitional, transitory or saving provisions that may be needed in connection with any of the Bill’s provisions coming into force. For example, provision might be required to clarify that the Information Commissioner’s new power to refuse to act on complaints will not apply where such complaints have already been made prior to commencement of the relevant provision. Clause 114 outlines the short title of the Bill. That provision is needed to confirm the title once the Bill has been enacted. I commend clauses 107 to 114 to the Committee.
The clauses set out the final technical provisions necessary in order for the Bill to be passed and enacted effectively, and for the most part are standard. I will focus briefly on clause 107, however, as a number of stakeholders including the Public Law Project have expressed concern that, as a wide Henry VIII power, it may give the Secretary of State the power to make further sweeping changes to data protection law. Can the Minister provide some assurance that the clause will allow for the creation only of further provisions that are genuinely consequential to the Bill and necessary for its proper enactment?
It is my belief that this would not have been such a concern to civil society groups had there not been multiple occasions throughout the Bill when the Secretary of State made grabs for power, concentrating the ability to make further changes to data protection legislation in their own hands. I am disappointed, though of course not surprised, that the Government have not accepted any of my amendments to help to mitigate those powers with checks and balances involving the commissioner. However, keeping the clause alone in mind, I look forward to hearing from the Minister how the powers in clause 107 will be restricted and used.
We have previously debated the efficacy of the affirmative resolution procedure. I recognise that the hon. Lady is not convinced about how effective it is in terms of parliamentary scrutiny; we will beg to differ on that point. Although the power in clause 107 allows the Secretary of State to amend Acts of Parliament, I can confirm that that is just to ensure the legal clarity of the text. Without that power, data protection legislation would be harder to interpret, thereby reducing people’s understanding of the legislation and their ability to rely on the law.
Question put and agreed to.
Clause 107 accordingly ordered to stand part of the Bill.
Clause 108
Regulations
I beg to move, That the clause be read a Second time.
In order for the public to have trust in algorithmic decision making, particularly where it is used by the Government, they must as a basic minimum be able to understand how and when it is being used. That is something that the Government themselves previously recognised: their “Data: a new direction” consultation included a proposal to make transparency reporting compulsory for public sector bodies using algorithms in decision making. Indeed, the Government have already made good progress on bringing together a framework that will make that reporting possible. The algorithmic transparency recording standard they have built provides a decent, standardised way of recording and sharing information about how the public sector uses algorithmic tools. There is also full guidance to accompany the standard, giving public sector bodies a clear understanding of how to complete transparency reports, as well as a compilation of pilot reports that have already been published, providing a bank of examples.
However, despite that, and despite the majority of consultation respondents agreeing with the proposed compulsory reporting for public sector bodies—citing benefits of increased trust, accountability and accessibility for the public—the Government chose not to go ahead with the legislative change. Relying on self-regulation in the early stages of the scheme is understandable, but with successful pilots having been conducted everywhere from the Cabinet Office to West Midlands police, it is unclear why the Government now choose not to commit to the very standard they created. This is a clear missed opportunity, with the standard running the risk of failing altogether if there is no legislative requirement to use it.
As the use of such algorithms grows, particularly considering further changes contained in clause 11, transparency around Government use of big data and automated decision-making tools will only increase in importance and value—people have a right to know how they are being governed. As the Public Law Project argues, transparency also has a consequential value; it facilitates democratic consensus building about the appropriate use of new technologies, and it allows for full accountability when things go wrong.
Currently, in place of that accountability, the Public Law Project has put together its own register called “Tracking Automated Government”, or TAG. Using mostly freedom of information requests, the register tracks the use of 42 algorithmic tools and rates their transparency. Of the 42, just one ranked as having high transparency. Among those with low transparency are asylum estates analysis, used to help the Home Office decide where asylum interviews should take place, given the geographical distribution of asylum seekers across the asylum estate; the general matching service and fraud referral and intervention management system, used as part of the efforts of the Department for Work and Pensions to combat benefit fraud and error—for example, by identifying claimants who may potentially have undisclosed capital or other income; and housing management systems, such as that in Wigan Metropolitan Borough Council, which uses a points-based system to prioritise social housing waiting lists.
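A points-based allocation tool of the kind just described can be sketched in a few lines; the criteria and weights below are invented purely for illustration and bear no relation to any council's actual scheme. The point of the sketch is that the ranking a person receives depends entirely on weights that, without transparency reporting, they may never see.

```python
# Hypothetical points-based social housing prioritisation. The
# criteria and weights are invented for illustration only.
CRITERIA_POINTS = {
    "overcrowded": 100,
    "medical_need": 150,
    "homeless": 200,
    "years_on_list": 10,  # points per year spent waiting
}

def score(applicant: dict) -> int:
    """Total the points an applicant accrues under each criterion."""
    total = CRITERIA_POINTS["years_on_list"] * applicant.get("years_on_list", 0)
    for criterion in ("overcrowded", "medical_need", "homeless"):
        if applicant.get(criterion):
            total += CRITERIA_POINTS[criterion]
    return total

applicants = [
    {"id": "A", "overcrowded": True, "years_on_list": 3},   # 130 points
    {"id": "B", "medical_need": True, "homeless": True},    # 350 points
]

# The waiting list is simply the applicants ordered by descending score.
waiting_list = sorted(applicants, key=score, reverse=True)
print([a["id"] for a in waiting_list])  # -> ['B', 'A']
```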
We all want to see Government modernising and using new technology to improve efficiency and outcomes, but if an algorithmic tool impacts our asylum applications, our benefits system and the ability of people to gain housing, the people affected by those decisions deserve at the very least to know how they are being made. If the public sector sets the right example, private companies may choose to follow in the future, helping to improve transparency even further. The framework is ready to go and the benefits are clear; the new clause would simply make progress certain by bringing it forward as part of the legislative agenda. It is time that we gave people the confidence in public use of algorithms that they deserve.
I thank the hon. Member for Barnsley East for moving new clause 9. We completely share her wish to ensure that Government and public authorities provide transparency in the way they use algorithmic tools that process personal data, especially when they are used to make decisions affecting members of the public.
The Government have made it a priority to ensure that transparency is provided through the publication of the algorithmic transparency recording standard. That has been developed to assist public sector organisations in documenting and communicating their use of algorithms in decision making that impacts members of the public. The focus of the standard is to provide explanations of the decisions taken using automated processing of data by an algorithmic system, rather than of all data processing.
The standard has been endorsed by the Government’s Data Standards Authority, which recommends the standards, guidance and other resources that Government Departments should follow when working on data projects. Publishing the standard fulfils commitments made in both the national data strategy 2020 and the national artificial intelligence strategy. Since its publication, the standard has been piloted with a variety of public sector organisations across the UK, and the published records can be openly accessed via gov.uk. It is currently being rolled out more widely across the public sector.
Although the Government have made it a priority to advance work on algorithmic transparency, the algorithmic transparency recording standard is still a maturing standard that is being progressively promoted and adopted. It is evolving alongside policy thinking and Government understanding of the complexities, scope and risks around its use. We believe that enshrining the standard into law at this point of maturity could hinder the ability to ensure that it remains relevant in a rapidly developing technology field.
Therefore, although the Government sympathise with the intention behind the new clause, we believe it is best to continue with the current roll-out across the public sector. We remain committed to advancing algorithmic transparency, but we do not intend to take forward legislative change at this time. For that reason, I am unable to accept the new clause as proposed by the Opposition.
I am grateful to the Minister, but I am still confused about why, having developed the standard, the Government are not keen to put it into practice and into law. He just said that he wants to keep it relevant; he could use some of the secondary legislation that he is particularly keen on if he accepted the new clause. As I outlined, this issue has real-life consequences, whether for housing, asylum or benefits. In my constituency, many young people were affected by the exam algorithm scandal. For those reasons, I would like to push the new clause to a vote.
Question put, That the clause be read a Second time.
I am grateful to the hon. Lady for setting out the purposes of the new clause. As she has described, it aims to require the Secretary of State to use regulation-making powers under section 190 of the Data Protection Act to implement article 80(2) of the UK GDPR. It would enable non-profit organisations with expertise in data protection law to make complaints to the Information Commissioner and/or take legal action against data controllers without the specific authorisation of the individuals who have been affected by data breaches. Relevant non-profit organisations can already take such actions on behalf of individuals who have specifically authorised them to do so under provisions in article 80(1) of the UK GDPR.
In effect, the amendment would replace the current discretionary powers in section 190 of the Data Protection Act with a duty for the Secretary of State to legislate to bring those provisions into force soon after the Bill has received Royal Assent. Such an amendment would be undesirable for a number of reasons. First, as required under section 189 of the Data Protection Act, we have already consulted and reported to Parliament on proposals of that nature, and we concluded that there was not a strong enough case for introducing new legislation.
Although the Government’s report acknowledged that some groups in society might find it difficult to complain to the ICO or bring legal proceedings of their own accord, it pointed out that the regulator can and does investigate complaints raised by civil society groups even when they are not made on behalf of named individuals. Big Brother Watch’s recent complaints about the use of live facial recognition technology in certain shops in the south of England are an example of that.
Secondly, the response concluded that giving non-profit organisations the right to bring compensation claims against data controllers on behalf of individuals who had not authorised them to do so could prompt the growth of US-style lawsuits on behalf of thousands or even millions of customers at a time. In the event of a successful claim, each individual affected by the alleged breach could be eligible for a very small payout, but the consequences for the businesses could be hugely damaging, particularly in cases that involved little tangible harm to individuals.
Some organisations could be forced out of business or prompted to increase prices to recoup costs. The increase in litigation costs could also increase insurance premiums. A hardening in the insurance market could affect all data controllers, including those with a good record of compliance. For those reasons, we do not believe that it is right to place a duty on the Secretary of State to allow organisations to bring actions without the consent of the individuals affected. On that basis, I ask the hon. Lady to withdraw the motion.
Data is increasingly used to make decisions about us as a collective, so it is important that GDPR gives us collective rights to reflect that, rather than the system being designed only for individuals to seek redress. For those reasons, I will press my new clause to a vote.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
Privacy enhancing technologies are technologies and techniques that can help organisations to share and use people’s data responsibly, lawfully and securely. They work most often by minimising the amount of data used, maximising data security—for example by encrypting or anonymising personal information—or empowering individuals. One of the best-known examples of a PET is synthetic data: data that is modelled to reproduce the statistical properties of a real dataset when taken as a whole. That type of data could allow third-party researchers or processors to analyse the statistical outcomes of the data without having access to the original set of personal data, or any information about identifiable living individuals.
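As a toy illustration of that principle, the sketch below fits a simple statistical model to a stand-in “sensitive” dataset and then samples entirely new records from it. Real synthetic data generators are far more sophisticated; this merely shows that aggregate properties can survive while no synthetic row corresponds to a real individual.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Stand-in for a sensitive dataset: each row is a person, with two
# attributes (say, age and income).
real_data = rng.multivariate_normal(
    mean=[45.0, 32_000.0],
    cov=[[120.0, 9_000.0], [9_000.0, 2.5e7]],
    size=1_000,
)

# Fit the statistical properties of the dataset taken as a whole...
mean = real_data.mean(axis=0)
cov = np.cov(real_data, rowvar=False)

# ...then sample fresh synthetic records from the fitted model. A
# researcher can study means and correlations without ever touching
# data about identifiable living individuals.
synthetic_data = rng.multivariate_normal(mean, cov, size=1_000)

print("real mean:     ", real_data.mean(axis=0).round(1))
print("synthetic mean:", synthetic_data.mean(axis=0).round(1))
```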
Other PETs minimise the amount of personal data that is shared without affecting the data’s utility. Federated learning, for example, allows for the training of an algorithm across multiple devices or datasets held on servers, so if an organisation wants to train a machine-learning model but has limited training data available, it can send the model to a remote dataset for training. The model then returns having benefited from those datasets, while the sensitive data itself is never exchanged or put in the hands of those who own the algorithm. The use of PETs therefore does not necessarily exclude data from being defined as personal or from falling within the remit of GDPR. They can, however, help to minimise the risk that arises from personal data breaches and provide an increased level of security.
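The federated pattern itself can be shown in miniature: the model travels, the data does not. The sketch below is a minimal illustration of federated averaging on a linear model, under assumptions invented for the example; production systems layer on secure aggregation, client sampling and much else.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.1, steps: int = 50) -> np.ndarray:
    """Gradient descent on least squares, run where the data lives.
    The raw data (X, y) never leaves the client that holds it."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Two clients, each holding a private dataset drawn from the same
# underlying relationship y = 2*x1 - 1*x2 (plus noise).
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(2):
    X = rng.normal(size=(200, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=200)
    clients.append((X, y))

# Federated averaging: the server sends the current model out, each
# client returns a locally improved copy, and the server averages them.
global_w = np.zeros(2)
for _ in range(10):
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(updates, axis=0)

print("learned weights:", global_w.round(2))  # close to [ 2. -1.]
```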
The Government have positioned the Bill as one that seeks to strengthen the data rights of citizens while catalysing innovation. PETs could and should have been a natural area for the Bill to explore, because not only can such technologies help controllers demonstrate an approach based on data protection by design and by default, but they can open the door to new ways of collaborating, innovating and researching with data. The Royal Society has researched in immense detail the role that PETs can play in data governance and collaboration; its findings are contained in its 2023 report, which runs to more than 100 pages. One of the report’s key recommendations was that the Government should develop a national PETs strategy to promote the responsible use of these technologies as tools for advancing scientific research, increasing security and offering new partnership possibilities, both domestically and across borders.
It is vital to acknowledge that working with PETs involves risks that must be considered. Some may not be robust enough against attacks because they are in the early stages of development, while others might require a significant amount of expertise to operate, without which their use may be counterproductive. It is therefore important to be clear that the amendment would not jump ahead and endorse any particular technology or device before it was ready. Instead, it would enshrine the European Union Agency for Cybersecurity definition of PETs in UK law and prompt the Government to issue a report on how that growing area of technology might play a role in data processing and data regulation in future.
That could include identifying the opportunities that PETs could provide while also looking at the threats and potential harms involved in using the technologies without significant expertise or technological readiness. Indeed, in their consultation response, the Government even mentioned that they were keen to explore the opportunities around PETs, while promoting the understanding that such technologies should not be seen as a substitute for reducing privacy risks at an organisational level. The report required by the new clause would allow the Government that exploration, indicating a positive acknowledgment of the potentially growing role that PETs might play in data processing and opening the door to further research in the area.
Even by their name, privacy enhancing technologies reflect exactly what the Bill should be doing: looking to the future to encourage innovation in tech and then using such innovation to protect citizens in return. I hope hon. Members will see those technologies’ potential value and the importance of analysing any harms, and look to place the requirement to analyse PETs on the statute book.
We absolutely agree with the Opposition about the importance of privacy enhancing technologies, which I will call PETs, since I spoke on them recently and was told that was the best abbreviation—it is certainly easier. We wish to see their use by organisations to help ensure compliance with data protection principles and we seek to encourage that. As part of our work under the national data strategy, we are already exploring the macro-impacts of PETs and how they can unlock data across the economy.
The ICO has recently published its draft guidance on anonymisation, pseudonymisation and PETs, which explains the benefits and different types of PETs currently available, as well as how they can help organisations comply with data protection law. In addition, the Centre for Data Ethics and Innovation has published an adoption guide to aid decision making around the use of PETs in data-driven projects. It has also successfully completed delivery of UK-US prize challenges to drive innovation in PETs that reinforce democratic values. Indeed, I was delighted to meet some of the participants in those prize challenges at the Royal Society yesterday and hear a little more about some of their remarkable innovations.
As the hon. Lady mentioned, the Royal Society has published reports on how PETs can maximise the benefit and reduce the harms associated with data use. Adding a definition of PETs to the legislation and requiring the Government to publish a report six months after Royal Assent is unlikely to have many advantages over the approach that the ICO, the CDEI and others are taking to develop a better understanding in the area. Furthermore, many PETs are still in the very early stages of their deployment and use, and have not been widely adopted across the UK or globally. A statutory definition could quickly become outdated. Publishing a comprehensive report on the potential impacts of PETs, which advocated the use of one technology or another, could even distort a developing market, and lead to unintended negative impacts on the development of what are promising technologies. For that reason, I ask the hon. Lady to withdraw the new clause.
A wider range of biometric data is now being collected than ever before. From data on the way we walk and talk to the facial expressions we make, biometric data is now being collected and used in a wide range of situations for many distinct purposes. Great attention has rightly been paid to police use of facial recognition technology to identify individuals, for example at football matches or protests. Indeed, to date, much of the regulatory attention has focused on those use cases, which are overseen by the Investigatory Powers Commissioner. However, the use of biometric technologies extends far beyond those examples, and there has been a proliferation of biometrics designed by private organisations to be used across day-to-day life—not just in policing.
We unlock smartphones with our faces or fingerprints, and companies have proposed using facial expression analysis to detect whether students are paying attention in online classes. Employers have used facial expression and tone analysis to decide who should be selected for a job—as was already mentioned in reference to new clause 8. As the proliferation of biometric technologies continues, a number of issues have been raised about their impact on people and society. Indeed, if people’s identities can be detected by both public and private actors at any given point, there is potential for significant infringement of their privacy and of their freedom of expression, association and assembly as they move through the world. Similarly, if people’s traits, characteristics or abilities can be automatically assessed on the basis of biometrics, often without a scientific basis, that may affect free expression and the development of personality.
Public attitudes research carried out by the Ada Lovelace Institute shows that the British public recognise the potential benefits of tools such as facial recognition in certain circumstances—for example, in smartphone locking systems and at airports—but often reject their use in others. Large majorities are opposed to the use of facial recognition in shops, in schools and on public transport, as well as by human resources departments in recruitment. In all cases, the public expect the use of biometrics to be accompanied by safeguards and limitations, such as appropriate transparency and accountability measures.
Members of the citizens’ biometrics council, convened by the Ada Lovelace Institute in 2020 and made up of 50 members of the public, expressed the view that biometric technologies as currently used are lacking in transparency and accountability. In particular, safeguards are uneven across sectors. Private uses of biometrics are not currently subject to the same level of regulatory oversight or due process as is afforded within the criminal justice system, despite also having the potential to create changes of life-affecting significance. As a result, one member of the council memorably asked:
“If the technology companies break their promises…what will the implications be? Who’s going to hold them to account?”
It is with those issues in mind that expert and legal opinion alike have come to the same consistent conclusion: at present, there is not a sufficient legal framework in place to manage the unique issues that the proliferation of private biometrics use raises. An independent legal review, commissioned by the Ada Lovelace Institute and led by Matthew Ryder KC, found that current governance structures and accountability mechanisms for biometrics are fragmented, unclear and ineffective. Similar findings have been made by the Biometrics and Surveillance Camera Commissioner, and by Select Committees in this House and in the other place.
The Government, however, have not yet acted to deliver a legal framework governing the use of biometric technology by private corporations, meaning that the Bill is a missed opportunity. New clause 13 therefore seeks to move towards the creation of that framework, providing for the Information Commission to oversee the use of biometric technology by private parties and to ensure accountability around it. I hope that the Committee sees the value of the oversight the new clause would provide and will support it.
New clause 13 would require the Information Commission to establish a new separate statutory biometrics office with responsibility for the oversight and regulation of biometric data and technology. However, the Information Commissioner already has responsibility for monitoring and enforcing the processing of biometric data, as it falls within the definition of personal data. Under the Bill, the new body corporate—the Information Commission—will continue to monitor and enforce the processing of all personal data under the data protection legislation, including biometric data. Indeed, with its new independent board and governance structure, the commission will enjoy greater diversity in skills and decision making, ensuring that the regulator has the right blend of skills and expertise at the very top of the organisation.
Furthermore, the Bill allows the new Information Commission to establish committees, which may include specialists from outside the organisation with key skills and expertise in specialist areas. As such, the Government are of the firm view that the Information Commission is best placed to provide regulatory oversight of biometric data, rather than delegating responsibility and functions to a separate office. The creation of a new body would likely cause confusion for those seeking redress, by creating novel complaints processes for biometric-related complaints, as set out in new clause 13(3)(c)(iii). It would also complicate regulatory oversight and decision making by providing the new office with powers to impose fines, as per subsection (2)(e). For those reasons, I encourage the hon. Lady to withdraw her new clause.
New clauses 14 and 15 would require non-law enforcement bodies that process biometric data about individuals to register with the Information Commissioner before the processing begins. Where the processing started prior to passage of the Bill, the organisation would need to register within six months of commencement. As part of the registration process, the organisation would have to explain the intended effect of the processing and provide annual updates to the Information Commissioner’s Office on current and future processing activities. Organisations that fail to comply with these requirements would be subject to an unlimited fine.
I appreciate that the new clauses aim to make sure that organisations will give careful thought to the necessity and proportionality of their processing activities, and to improve regulatory oversight, but they could have significant unintended consequences. As the hon. Lady will be aware, there are many everyday uses of biometrics data, such as using a thumbprint to access a phone, laptop or other connected device. Such services would always ask for the user’s explicit consent and make alternatives such as passwords available to customers who would prefer not to part with their biometric data.
If every organisation that launched a new product had to register with the Information Commissioner to explain its intentions and complete annual reports, that could place significant and unnecessary new burdens on businesses and undermine the aims of the Bill. Where the use of biometric data is more intrusive, perhaps involving surveillance technology to identify specific individuals, the processing will already be subject to the heightened safeguards in article 9 of the UK GDPR. The processing would need to be necessary and proportionate on the grounds of substantial public interest.
The Bill will also require organisations to designate a senior responsible individual to manage privacy risks, act as a contact point for the regulator, undertake risk assessments and keep records in relation to high-risk processing activities. It would be open to the regulator to request to see these documents if members of the public expressed concern about the use of the technology.
I hope my response has helped to address the issues the hon. Lady was concerned about, and I would respectfully ask her not to press these new clauses.
It does indeed provide reassurance. On that basis, I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
Okay, I will wait for the next Question. Thank you for your guidance, Mr Hollobone.
I thank my hon. Friend the Member for Loughborough, who has been assiduous in pursuing her point and has set out very clearly the purpose of her new clause. We share her wish to reduce unnecessary burdens on the police as much as possible. The new clause seeks to achieve that in relation to the preparation by police officers of pre-charge files, which is an issue that the National Police Chiefs’ Council has raised with the Home Office, as I think she knows.
This is a serious matter for our police forces, which estimate that about four hours is spent redacting a typical case file. They argue that reducing that burden would enable officers to spend more time on frontline policing. We completely understand the frustration that many officers feel about having to spend a huge amount of time on what they see as unnecessary redaction. I can assure my hon. Friend that the Home Office is working with partners in the criminal justice system to find ways of safely reducing the redaction burden while maintaining public trust. It is important that we give them the time to do so.
We need to resolve the issue through an evidence-based solution that will ensure that the right amount of redaction is done at the right point in the process, so as to reduce any delays while maintaining victim and witness confidence in the process. I assure my hon. Friend that her point is very well taken on board and the Government are looking at how we can achieve her objective as quickly as possible, but I hope she will accept that, at this point, it would be sensible to withdraw her new clause.
I thank the Minister greatly for what he has said, and for the time and effort that is being put in by several Departments to draw attention to the issue and bring it to a conclusion. I am happy that some progress has been made and, although I reserve my right to bring back the new clause at a later date, I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
It has been a real pleasure to represent His Majesty’s loyal Opposition in the scrutiny of the Bill. I thank the Minister for his courteous manner, all members of the Committee for their time, the Clerks for their work and the many stakeholders who have contributed their time, input and views. I conclude by thanking Anna Clingan, my senior researcher, who has done a remarkable amount of work to prepare for our scrutiny of this incredibly complex Bill. Finally, I thank you, Mr Hollobone, for the way in which you have chaired the Committee.
May I join the hon. Lady in expressing thanks to you, Mr Hollobone, and to Mr Paisley for chairing the Bill Committee so efficiently and getting us to this point ahead of schedule? I thank all members of the Committee for their participation: we have been involved in what will be seen to be a very important piece of legislation.
I am very grateful to the Opposition for their support in principle for many of the objectives of the Bill. It is absolutely right that the Opposition scrutinise the detail, and the hon. Member for Barnsley East and her colleagues have done so very effectively. I am pleased that we have reached this point with the Bill so far unamended, but obviously we will be considering it further on Report.
I thank all my hon. Friends for attending the Committee and for their contributions, particularly saying “Aye” at the appropriate moments, which has allowed us to get to this point. I also thank the officials in the Department for Science, Innovation and Technology. I picked up this baton on day two of my new role covering the maternity leave of my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez); I did so with some trepidation, but the officials have made my task considerably easier and I am hugely indebted to them.
I thank everybody for allowing us to get to this point. I look forward to further debate on Report in due course.
May I thank all hon. Members for their forbearance during the passage of the Bill and thank all the officers of the House for their diligence and attention to duty? My one remaining humble observation is that if the day ever comes when a facial recognition algorithm is attached to the cameras in the main Chamber to assess whether Members are bored or not paying attention, we will all be in very big trouble.
Question put and agreed to.
Bill, as amended, accordingly to be reported.
Data Protection and Digital Information Bill Debate
Full Debate: Read Full Debate
John Whittingdale
Main Page: John Whittingdale (Conservative - Maldon)
Department Debates - View all John Whittingdale's debates with the Department for Digital, Culture, Media & Sport
(11 months, 1 week ago)
Commons Chamber
I begin by joining the hon. Member for Rhondda (Sir Chris Bryant) in expressing the condolences of the House on the death of his predecessor, Allan Rogers. He served as a Member of Parliament during my first nine years in this place. I remember him as an assiduous constituency Member of Parliament, and I am sure we all share the sentiments expressed by the hon. Gentleman.
It is a pleasure to return to the Dispatch Box to lead the House through Report stage of the Bill. We spent considerable time discussing it in Committee, but the hon. Gentleman was not in his post at that time. I welcome him to his position. He may regret that he missed out on Committee stage, which makes him keen to return to it today.
The Bill is an essential piece of legislation that will update the UK’s data laws, making them among the most effective in the world. We scrutinised it in depth in Committee. The hon. Gentleman is right that the Government have tabled a number of amendments for the House to consider today, and he has done the same. The vast majority are technical, and the number sounds large because a lot are consequential on original amendments. One or two address new aspects, and I will be happy to speak to those as we go through them during this afternoon’s debate. Nevertheless, they represent important additions to the Bill.
The Minister for Disabled People, Health and Work, my hon. Friend the Member for Corby (Tom Pursglove), who is sitting next to me, has drawn the House’s attention to the fact that amending the Bill to allow the Department for Work and Pensions access to financial data will make a significant contribution to identifying fraud. I would have thought that the Opposition would welcome that. It is not a new measure; it was contained in the fraud plan that the Government published back in May 2022. The Government have been examining that measure, and we have always made it clear that we would bring it forward at an appropriate parliamentary time when a vehicle was available. This is a data Bill, and the measure is specific to it. We estimate that it will result in a saving to the taxpayer of around £500 million by the end of 2028-29. I am surprised that the Opposition should question that.
As I said, the Bill has been considered at length in Committee. It is important that we consider it on Report, so that it can achieve the next stage of its progress through Parliament. On that basis, I reject the motion.
Question put.
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
Government new clause 48—Processing of personal data revealing political opinions.
Government new clause 7—Searches in response to data subjects’ requests.
Government new clause 8—Notices from the Information Commissioner.
Government new clause 9—Court procedure in connection with subject access requests.
Government new clause 10—Approval of a supplementary code.
Government new clause 11—Designation of a supplementary code.
Government new clause 12—List of recognised supplementary codes.
Government new clause 13—Change to conditions for approval or designation.
Government new clause 14—Revision of a recognised supplementary code.
Government new clause 15—Applications for approval and re-approval.
Government new clause 16—Fees for approval, re-approval and continued approval.
Government new clause 17—Request for withdrawal of approval.
Government new clause 18—Removal of designation.
Government new clause 19—Registration of additional services.
Government new clause 20—Supplementary notes.
Government new clause 21—Addition of services to supplementary notes.
Government new clause 22—Duty to remove services from the DVS register.
Government new clause 23—Duty to remove supplementary notes from the DVS register.
Government new clause 24—Duty to remove services from supplementary notes.
Government new clause 25—Index of defined terms for Part 2.
Government new clause 26—Powers relating to verification of identity or status.
Government new clause 27—Interface bodies.
Government new clause 28—The FCA and financial services interfaces.
Government new clause 29—The FCA and financial services interfaces: supplementary.
Government new clause 30—The FCA and financial services interfaces: penalties and levies.
Government new clause 31—Liability and damages.
Government new clause 32—Other data provision.
Government new clause 33—Duty to notify the Commissioner of personal data breach: time periods.
Government new clause 34—Power to require information for social security purposes.
Government new clause 35—Retention of information by providers of internet services in connection with death of child.
Government new clause 36—Retention of biometric data and recordable offences.
Government new clause 37—Retention of pseudonymised biometric data.
Government new clause 38—Retention of biometric data from INTERPOL.
Government new clause 39—National Underground Asset Register.
Government new clause 40—Information in relation to apparatus.
Government new clause 41—Pre-commencement consultation.
Government new clause 42—Transfer of certain functions of Secretary of State.
New clause 1—Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision—
“(1) The 2018 Act is amended in accordance with subsection (2).
(2) In the 2018 Act, after section 40 insert—
“40A Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision
(1) This section applies to a set of processing operations consisting of the preparation of a case-file by the police service for submission to the Crown Prosecution Service for a charging decision, the making of a charging decision by the Crown Prosecution Service, and the return of the case-file by the Crown Prosecution Service to the police service after a charging decision has been made.
(2) The police service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in preparing a case-file for submission to the Crown Prosecution Service for a charging decision.
(3) The Crown Prosecution Service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in making a charging decision on a case-file submitted for that purpose by the police service.
(4) If the Crown Prosecution Service decides that a charge will not be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must take all steps reasonably required to destroy and delete all copies of the case-file in its possession.
(5) If the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must return the case-file to the police service and take all steps reasonably required to destroy and delete all copies of the case-file in its possession.
(6) Where the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service and returns the case-file to the police service under subsection (5), the police service must comply with the first data protection principle and the third data protection principle in relation to any subsequent processing of the data contained in the case-file.
(7) For the purposes of this section—
(a) The police service means—
(i) constabulary maintained by virtue of an enactment, or
(ii) subject to section 126 of the Criminal Justice and Public Order Act 1994 (prison staff not to be regarded as in police service), any other service whose members have the powers or privileges of a constable.
(b) The preparation of, or preparing, a case-file by the police service for submission to the Crown Prosecution Service for a charging decision includes the submission of the file.
(c) A case-file includes all information obtained by the police service for the purpose of preparing a case-file for submission to the Crown Prosecution Service for a charging decision.””
This new clause adjusts Section 40 of the Data Protection Act 2018 to exempt the police service and the Crown Prosecution Service from the first and third data protection principles contained within the 2018 Act so that they can share unredacted data with one another when making a charging decision.
New clause 2—Common standards and timeline for implementation—
“(1) Within one month of the passage of this Act, the Secretary of State must by regulations require those appointed as decision-makers to create, publish and update as required open and common standards for access to customer data and business data.
(2) Standards created by virtue of subsection (1) must be interoperable with those created as a consequence of Part 2 of the Retail Banking Market Investigation Order 2017, made by the Competition and Markets Authority.
(3) Regulations under section 66 and 68 must ensure interoperability of customer data and business data with standards created by virtue of subsection (1).
(4) Within one month of the passage of this Act, the Secretary of State must publish a list of the sectors to which regulations under section 66 and section 68 will apply within three years of the passage of the Act, and the date by which those regulations will take effect in each case.”
This new clause, which is intended to be placed in Part 3 (Customer data and business data) of the Bill, would require interoperability across all sectors of the economy in smart data standards, including the Open Banking standards already in effect, and the publication of a timeline for implementation.
New clause 3—Provision about representation of data subjects—
“(1) Section 190 of the Data Protection Act 2018 is amended as follows.
(2) In subsection (1), leave out “After the report under section 189(1) is laid before Parliament, the Secretary of State may” and insert “The Secretary of State must, within three months of the passage of the Data Protection and Digital Information Act 2024,”.”
This new clause would require the Secretary of State to exercise powers under s190 DPA2018 to allow organisations to raise data breach complaints on behalf of data subjects generally, in the absence of a particular subject who wishes to bring forward a claim about misuse of their own personal data.
New clause 4—Review of notification of changes of circumstances legislation—
“(1) The Secretary of State must commission a review of the operation of the Social Security (Notification of Changes of Circumstances) Regulations 2010.
(2) In conducting the review, the designated reviewer must—
(a) consider the current operation and effectiveness of the legislation;
(b) identify any gaps in its operation and provisions;
(c) consider and publish recommendations as to how the scope of the legislation could be expanded to include non-public sector, voluntary and private sector holders of personal data.
(3) In undertaking the review, the reviewer must consult—
(a) specialists in data sharing;
(b) people and organisations who campaign for the interests of people affected by the legislation;
(c) people and organisations who use the legislation;
(d) any other persons and organisations the review considers appropriate.
(4) The Secretary of State must lay a report of the review before each House of Parliament within six months of this Act coming into force.”
This new clause requires a review of the operation of the “Tell Us Once” programme, which seeks to provide simpler mechanisms for citizens to pass information regarding births and deaths to government, and consideration of whether the progress of “Tell Us Once” could be extended to non-public sector holders of data.
New clause 5—Definition of “biometric data”—
“Article 9 of the UK GDPR is amended by the omission, in paragraph 1, of the words “for the purpose of uniquely identifying a natural person”.”
This new clause would amend the UK General Data Protection Regulation to extend the protections currently in place for biometric data for identification to include biometric data for the purpose of classification.
New clause 43—Right to use non-digital verification services—
“(1) This section applies when an organisation—
(a) requires an individual to use a verification service, and
(b) uses a digital verification service for that purpose.
(2) The organisation—
(a) must make a non-digital alternative method of verification available to any individual required to use a verification service, and
(b) must provide information about digital and non-digital methods of verification to those individuals before verification is required.”
This new clause, which is intended for insertion into Part 2 of the Bill (Digital verification services), creates the right for data subjects to use non-digital identity verification services as an alternative to digital verification services, thereby preventing digital verification from becoming mandatory in certain settings.
New clause 44—Transfer of functions to the Investigatory Powers Commissioner’s Office—
“The functions of the Surveillance Camera Commissioner are transferred to the Investigatory Powers Commissioner.”
New clause 45—Interoperability of data and collection of comparable healthcare statistics across the UK—
“(1) The Health and Social Care Act 2012 is amended as follows.
(2) After section 250, insert the following section—
“250A Interoperability of data and collection of comparable healthcare statistics across the UK
(1) The Secretary of State must prepare and publish an information standard specifying binding data interoperability requirements which apply across the whole of the United Kingdom.
(2) An information standard prepared and published under this section—
(a) must include guidance about the implementation of the standard;
(b) may apply to any public body which exercises functions in connection with the provision of health services anywhere in the United Kingdom.
(3) A public body to which an information standard prepared and published under this section applies must have regard to the standard.
(4) The Secretary of State must report to Parliament each year on progress on the implementation of an information standard prepared in accordance with this section.
(5) For the purposes of this section—
“health services” has the same meaning as in section 250 of this Act, except that for “in England” there is substituted “anywhere in the United Kingdom”, and “the health service” in parts of the United Kingdom other than England has the meaning given by the relevant statute of that part of the United Kingdom;
“public body” has the same meaning as in section 250 of this Act.”
(3) In section 254 (Powers to direct NHS England to establish information systems), after subsection (2), insert—
“(2A) The Secretary of State must give a direction under subsection (1) directing NHS England to collect and publish information about healthcare performance and outcomes in all parts of the United Kingdom in a way which enables comparison between different parts of the United Kingdom.
(2B) Before giving a direction by virtue of subsection (2A), the Secretary of State must consult—
(a) the bodies responsible for the collection and publication of official statistics in each part of the United Kingdom,
(b) Scottish Ministers,
(c) Welsh Ministers, and
(d) Northern Ireland departments.
(2C) The Secretary of State may not give a direction by virtue of subsection (2A) unless a copy of the direction has been laid before, and approved by resolution of, both Houses of Parliament.
(2D) Scottish Ministers, Welsh Ministers and Northern Ireland departments must arrange for the information relating to the health services for which they have responsibility described in the direction given by virtue of subsection (2A) to be made available to NHS England in accordance with the direction.
(2E) For the purposes of a direction given by virtue of subsection (2A), the definition of “health and social care body” given in section 259(11) applies as if for “England” there were substituted “the United Kingdom”.””
New clause 46—Assessment of impact of Act on EU adequacy—
“(1) Within six months of the passage of this Act, the Secretary of State must carry out an assessment of the impact of the Act on EU adequacy, and lay a report of that assessment before both Houses of Parliament.
(2) The report must assess the impact on—
(a) data risk, and
(b) small and medium-sized businesses.
(3) The report must quantify the impact of the Act in financial terms.”
New clause 47—Review of the impact of the Act on anonymisation and the identifiability of data subjects—
“(1) Within six months of the passage of this Act, the Secretary of State must lay before Parliament the report of an assessment of the impact of the measures in the Act on anonymisation and the identifiability of data subjects.
(2) The report must include a comparison between the rights afforded to data subjects under this Act with those afforded to data subjects by the EU General Data Protection Regulation.”
Amendment 278, in clause 5, page 6, line 15, leave out paragraphs (b) and (c).
This amendment and Amendment 279 would remove the power for the Secretary of State to create pre-defined and pre-authorised “recognised legitimate interests”, for data processing. Instead, the current test would continue to apply in which personal data can only be processed in pursuit of a legitimate interest, as balanced with individual rights and freedoms.
Amendment 279, page 6, line 23, leave out subsections (4), (5) and (6).
See explanatory statement to Amendment 278.
Amendment 230, page 7, leave out lines 1 and 2 and insert—
“8. The Secretary of State may not make regulations under paragraph 6 unless a draft of the regulations has been laid before both Houses of Parliament for the 60-day period.
8A. The Secretary of State must consider any representations made during the 60-day period in respect of anything in the draft regulations laid under paragraph 8.
8B. If, after the end of the 60-day period, the Secretary of State wishes to proceed to make the regulations, the Secretary of State must lay before Parliament a draft of the regulations (incorporating any changes the Secretary of State considers appropriate pursuant to paragraph 8A).
8C. Draft regulations laid under paragraph 8B must, before the end of the 40-day period, have been approved by a resolution of each House of Parliament.
8D. In this Article—
“the 40-day period” means the period of 40 days beginning on the day on which the draft regulations mentioned in paragraph 8B are laid before Parliament (or, if it is not laid before each House of Parliament on the same day, the later of the days on which it is laid);
“the 60-day period” means the period of 60 days beginning on the day on which the draft regulations mentioned in paragraph 8 are laid before Parliament (or, if it is not laid before each House of Parliament on the same day, the later of the days on which it is laid).
8E. When calculating the 40-day period or the 60-day period for the purposes of paragraph 8D, ignore any period during which Parliament is dissolved or prorogued or during which both Houses are adjourned for more than 4 days.”
This amendment would make regulations made in respect of recognised legitimate interest subject to a super-affirmative Parliamentary procedure.
Amendment 11, page 7, line 12, at end insert—
““internal administrative purposes” , in relation to special category data, means the conditions set out for lawful processing in paragraph 1 of Schedule 1 of the Data Protection Act 2018.”
This amendment clarifies that the processing of special category data in employment must follow established principles for reasonable processing, as defined by paragraph 1 of Schedule 1 of the Data Protection Act 2018.
Government amendment 252.
Amendment 222, page 10, line 8, leave out clause 8.
Amendment 3, in clause 8, page 10, leave out line 31.
This amendment would mean that the resources available to the controller could not be taken into account when determining whether a request is vexatious or excessive.
Amendment 2, page 11, line 34, at end insert—
“(6A) When informing the data subject of the reasons for not taking action on the request in accordance with subsection (6), the controller must provide evidence of why the request has been treated as vexatious or excessive.”
This amendment would require the data controller to provide evidence of why a request has been considered vexatious or excessive if the controller is refusing to take action on the request.
Government amendment 17.
Amendment 223, page 15, line 22, leave out clause 10.
Amendment 224, page 18, line 7, leave out clause 12.
Amendment 236, in clause 12, page 18, line 21, at end insert—
“(c) a data subject is an identified or identifiable individual who is affected by a significant decision, irrespective of the direct presence of their personal data in the decision-making process.”
This amendment would clarify that a “data subject” includes identifiable individuals who are subject to data-based and automated decision-making, whether or not their personal data is directly present in the decision-making process.
Amendment 232, page 19, line 12, leave out “solely” and insert “predominantly”.
This amendment would mean safeguards for data subjects’ rights, freedoms and legitimate interests would have to be in place in cases where a significant decision in relation to a data subject was taken based predominantly, rather than solely, on automated processing.
Amendment 5, page 19, line 12, after “solely” insert “or partly”.
This amendment would mean that the protections provided for by the new Article 22C would apply where a decision is based either solely or partly on automated processing, not only where it is based solely on such processing.
Amendment 233, page 19, line 18, at end insert
“including the reasons for the processing.”
This amendment would require data controllers to provide the data subject with the reasons for the processing of their data in cases where a significant decision in relation to a data subject was taken based on automated processing.
Amendment 225, page 19, line 18, at end insert—
“(aa) require the controller to inform the data subject when a decision described in paragraph 1 has been taken in relation to the data subject;”.
Amendment 221, page 20, line 3, at end insert—
“7. When exercising the power to make regulations under this Article, the Secretary of State must have regard to the following statement of principles:
Digital information principles at work
1. People should have access to a fair, inclusive and trustworthy digital environment at work.
2. Algorithmic systems should be designed and used to achieve better outcomes: to make work better, not worse, and not for surveillance. Workers and their representatives should be involved in this process.
3. People should be protected from unsafe, unaccountable and ineffective algorithmic systems at work. Impacts on individuals and groups must be assessed in advance and monitored, with reasonable and proportionate steps taken.
4. Algorithmic systems should not harm workers’ mental or physical health, or integrity.
5. Workers and their representatives should always know when an algorithmic system is being used, how and why it is being used, and what impacts it may have on them or their work.
6. Workers and their representatives should be involved in meaningful consultation before and during use of an algorithmic system that may significantly impact work or people.
7. Workers should have control over their own data and digital information collected about them at work.
8. Workers and their representatives should always have an opportunity for human contact, review and redress when an algorithmic system is used at work where it may significantly impact work or people. This includes a right to a written explanation when a decision is made.
9. Workers and their representatives should be able to use their data and digital technologies for contact and association to improve work quality and conditions.
10. Workers should be supported to build the information, literacy and skills needed to fulfil their capabilities through work transitions.”
This amendment would insert into new Article 22D of the UK GDPR a requirement for the Secretary of State to have regard to the statement of digital information principles at work when making regulations about automated decision-making.
Amendment 4, in clause 15, page 25, line 4, at end insert
“(including in the cases specified in sub-paragraphs (a) to (c) of paragraph 3 of Article 35)”.
This amendment, together with Amendment 1, would provide a definition of what constitutes “high risk processing” for the purposes of applying Articles 27A, 27B and 27C, which require data controllers to designate, and specify the duties of, a “senior responsible individual” with responsibility for such processing.
Government amendments 18 to 44.
Amendment 12, in page 32, line 7, leave out clause 17.
This amendment keeps the current requirement on police in the Data Protection Act 2018 to justify why they have accessed an individual’s personal data.
Amendment 1, in clause 18, page 32, line 18, leave out paragraph (c) and insert—
“(c) omit paragraph 2,
(ca) in paragraph 3—
(i) for “data protection” substitute “high risk processing”,
(ii) in sub-paragraph (a), for “natural persons” substitute “individuals”,
(iii) in sub-paragraph (a) for “natural person” substitute “individual” in both places where it occurs,
(cb) omit paragraphs 4 and 5,”.
This amendment would leave paragraph 3 of Article 35 of the UK GDPR in place (with amendments reflecting amendments made by the Bill elsewhere in the Article), thereby ensuring that there is a definition of “high risk processing” on the face of the Regulation.
Amendment 226, page 39, line 38, leave out clause 26.
Amendment 227, page 43, line 2, leave out clause 27.
Amendment 228, page 46, line 32, leave out clause 28.
Government amendment 45.
Amendment 235, page 57, line 29, leave out clause 34.
This amendment would leave in place the existing regime, which refers to “manifestly unfounded” or excessive requests to the Information Commissioner, rather than the proposed change to “vexatious” or excessive requests.
Government amendments 46 and 47.
Amendment 237, in clause 48, page 77, line 4, leave out “individual” and insert “person”.
This amendment and Amendments 238 to 240 are intended to enable the digital verification services covered by the Bill to include verification of organisations as well as individuals.
Amendment 238, page 77, line 5, leave out “individual” and insert “person”.
See explanatory statement to Amendment 237.
Amendment 239, page 77, line 6, leave out “individual” and insert “person”.
See explanatory statement to Amendment 237.
Amendment 240, page 77, line 7, leave out “individual” and insert “person”.
See explanatory statement to Amendment 237.
Amendment 241, page 77, line 8, at end insert (on new line)—
“and the facts which may be so ascertained, verified or confirmed may include the fact that an individual has a claimed connection with a legal person.”
This amendment would ensure that the verification services covered by the Bill will include verification that an individual has a claimed connection with a legal person.
Government amendments 48 to 50.
Amendment 280, in clause 49, page 77, line 13, at end insert—
“(2A) The DVS trust framework must include a description of how the provision of digital verification services is expected to uphold the Identity Assurance Principles.
(2B) Schedule (Identity Assurance Principles) describes each Identity Assurance Principle and its effect.”
Amendment 281, page 77, line 13, at end insert—
“(2A) The DVS trust framework must allow valid attributes to be protected by zero-knowledge proof and other decentralised technologies, without restriction upon how and by whom those proofs may be held or processed.”
Government amendments 51 to 66.
Amendment 248, in clause 52, page 79, line 7, at end insert—
“(1A) A determination under subsection (1) may specify an amount which is tiered to the size of the person and its role as specified in the DVS trust framework.”
This amendment would enable fees for application for registration in the DVS register to be determined on the basis of the size and role of the organisation applying to be registered.
Amendment 243, page 79, line 8, after “may”, insert “not”.
This amendment would provide that the fee for application for registration in the DVS register could not exceed the administrative costs of determining the application.
Government amendment 67.
Amendment 244, page 79, line 13, after “may”, insert “not”.
This amendment would provide that the fee for continued registration in the DVS register could not exceed the administrative costs of that registration.
Government amendment 68.
Amendment 245, page 79, line 21, at end insert—
“(10) The fees payable under this section must be reviewed every two years by the National Audit Office.”
This amendment would provide that the fees payable for DVS registration must be reviewed every two years by the NAO.
Government amendments 69 to 77.
Amendment 247, in clause 54, page 80, line 38, after “person”, insert “or by other parties”.
This amendment would enable others, for example independent experts, to make representations about a decision to remove a person from the DVS register, as well as the person themselves.
Amendment 246, page 81, line 7, at end insert—
“(11) The Secretary of State may not exercise the power granted by subsection (1) until the Secretary of State has consulted on proposals for how a decision to remove a person from the DVS register will be reached, including—
(a) how information will be collected from persons impacted by a decision to remove the person from the register, and from others;
(b) how complaints will be managed;
(c) how evidence will be reviewed;
(d) what the burden of proof will be on which a decision will be based.”
This amendment would provide that the power to remove a person from the DVS register could not be exercised until the Secretary of State had consulted on the detail of how a decision to remove would be reached.
Government amendments 78 to 80.
Amendment 249, in clause 62, page 86, line 17, at end insert—
“(3A) A notice under this section must give the recipient of the notice an opportunity to consult the Secretary of State on the content of the notice before providing the information required by the notice.”
This amendment would provide an option for consultation between the Secretary of State and the recipient of an information notice before the information required by the notice has to be provided.
Government amendment 81.
Amendment 242, in clause 63, page 87, line 21, leave out “may” and insert “must”.
This amendment would require the Secretary of State to make arrangements for a person to exercise the Secretary of State’s functions under this Part of the Bill, so that an independent regulator would perform the relevant functions and not the Secretary of State.
Amendment 250, in clause 64, page 87, line 34, at end insert—
“(1A) A report under subsection (1) must include a report on any arrangements made under section 63 for a third party to exercise functions under this Part.”
This amendment would require information about arrangements for a third party to exercise functions under this Part of the Bill to be included in the annual reports on the operation of the Part.
Government amendments 82 to 196.
Amendment 6, in clause 83, page 107, leave out from line 26 to the end of line 34 on page 108.
This amendment would leave out the proposed new regulation 6B of the PEC Regulations, which would enable consent to be given, or an objection to be made, to cookies automatically.
Amendment 217, page 109, line 20, leave out clause 86.
This amendment would leave out the clause which would enable the sending of direct marketing electronic mail on a “soft opt-in” basis.
Amendment 218, page 110, line 1, leave out clause 87.
This amendment would remove the clause which would enable direct marketing for the purposes of democratic engagement. See also Amendment 220.
Government amendments 253 to 255.
Amendment 219, page 111, line 6, leave out clause 88.
This amendment is consequential on Amendment 218.
Government amendments 256 to 265.
Amendment 7, in clause 89, page 114, line 12, at end insert—
“(2A) A provider of a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with their duty under this regulation.”
This amendment would clarify that a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with their duty to notify the Commissioner of unlawful direct marketing.
Amendment 8, page 117, line 3, at end insert—
“(5) In regulation 1—
(a) at the start, insert “(1)”;
(b) after “shall”, insert “save for regulation 26A”;
(c) at end, insert—
“(2) Regulation 26A comes into force six months after the Commissioner has published guidance under regulation 26C (Guidance in relation to regulation 26A).””
This amendment would provide for the new regulation 26A, Duty to notify Commissioner of unlawful direct marketing, not to come into force until six months after the Commissioner has published guidance in relation to that duty.
Government amendment 197.
Amendment 251, in clause 101, page 127, line 3, leave out “and deaths” and insert “, deaths and deed polls”.
This amendment would require deed poll information to be kept to the same standard as records of births and deaths.
Amendment 9, page 127, line 24, at end insert—
“(2A) After section 25, insert—
“25A Review of form in which registers are to be kept
(1) The Secretary of State must commission a review of the provisions of this Act and of related legislation, with a view to the creation of a single digital register of births and deaths.
(2) The review must consider and make recommendations on the effect of the creation of a single digital register on—
(a) fraud,
(b) data collection, and
(c) ease of registration.
(3) The Secretary of State must lay a report of the review before each House of Parliament within six months of this section coming into force.””
This amendment would insert a new section into the Births and Deaths Registration Act 1953 requiring a review of relevant legislation, with consideration of creating a single digital register for registered births and registered deaths and recommendations on the effects of such a change on reducing fraud, improving data collection and streamlining digital registration.
Government amendment 198.
Amendment 229, in clause 112, page 135, line 8, leave out subsections (2) and (3).
Amendment 10, in clause 113, page 136, line 35, leave out
“which allows or confirms the unique identification of that individual”.
This amendment would amend the definition of “biometric data” for the purpose of the oversight of law enforcement biometrics databases so as to extend the protections currently in place for biometric data for identification to include biometric data for the purpose of classification.
Government amendments 199 to 207.
Government new schedule 1—Power to require information for social security purposes.
Government new schedule 2—National Underground Asset Register: monetary penalties.
New schedule 3—Identity Assurance Principles—
“Part 1
Definitions
1 These Principles are limited to the processing of Identity Assurance Data (IdA Data) in an Identity Assurance Service (e.g. establishing and verifying the identity of a Service User; conducting a transaction that uses a user identity; maintaining audit requirements in relation to a transaction associated with the use of a service that needs identity verification, etc.). They do not cover, for example, any data used to deliver a service, or to measure its quality.
2 In the context of the application of the Identity Assurance Principles to an Identity Assurance Service, “Identity Assurance Data” (“IdA Data”) means any recorded information that is connected with a “Service User” including—
“Audit Data.” This includes any recorded information that is connected with any log or audit associated with an Identity Assurance Service.
“General Data.” This means any other recorded information which is not personal data, audit data or relationship data, but is still connected with a “Service User”.
“Personal Data.” This takes its meaning from the Data Protection Act 2018 or subsequent legislation (e.g. any recorded information that relates to a “Service User” who is also an identified or identifiable living individual).
“Relationship Data.” This means any recorded information that describes (or infers) a relationship between a “Service User”, “Identity Provider” or “Service Provider” with another “Service User”, “Identity Provider” or “Service Provider” and includes any cookie or program whose purpose is to supply a means through which relationship data are collected.
3 Other terms used in relation to the Principles are defined as follows—
“Identity Assurance Service.” This includes relevant applications of the technology (e.g. hardware, software, database, documentation) in the possession or control of any “Service User”, “Identity Provider” or “Service Provider” that is used to facilitate identity assurance activities; it also includes any IdA Data processed by that technology or by an Identity Provider or by a Service Provider in the context of the Service; and any IdA Data processed by the underlying infrastructure for the purpose of delivering the IdA service or associated billing, management, audit and fraud prevention.
“Identity Provider.” This means the certified individual or certified organisation that provides an Identity Assurance Service (e.g. establishing an identity, verification of identity); it includes any agent of a certified Identity Provider that processes IdA data in connection with that Identity Assurance Service.
“Participant.” This means any “Identity Provider”, “Service Provider” or “Service User” in an Identity Assurance Service. A “Participant” includes any agent by definition.
“Processing.” In the context of IdA data, this means “collecting, using, disclosing, retaining, transmitting, copying, comparing, corroborating, correlating, aggregating, accessing” the data and includes any other operation performed on IdA data.
“Provider.” Includes both “Identity Provider” and/or “Service Provider”.
“Service Provider.” This means the certified individual or certified organisation that provides a service that uses an Identity Provider in order to verify identity of the Service User; it includes any agent of the Service Provider that processes IdA data from an Identity Assurance Service.
“Service User.” This means the person (i.e. an organisation (incorporated or not) or an individual (dead or alive)) who has established (or is establishing) an identity with an Identity Provider; it includes an agent (e.g. a solicitor, family member) who acts on behalf of a Service User with proper authority (e.g. a public guardian, or a Director of a company, or someone who possesses power of attorney). The person may be living or deceased (the identity may still need to be used once its owner is dead, for example by an executor).
“Third Party.” This means any person (i.e. any organisation or individual) who is not a “Participant” (e.g. the police or a Regulator).
Part 2
The Nine Identity Assurance Principles
Any exemptions from these Principles must be specified via the “Exceptional Circumstances Principle”. (See Principle 9).
1 User Control Principle
Statement of Principle: “I can exercise control over identity assurance activities affecting me and these can only take place if I consent or approve them.”
1.1 An Identity Provider or Service Provider must ensure any collection, use or disclosure of IdA data in, or from, an Identity Assurance Service is approved by each particular Service User who is connected with the IdA data.
1.2 There should be no compulsion to use the Identity Assurance Service and Service Providers should offer alternative mechanisms to access their services. Failing to do so would undermine the consensual nature of the service.
2 Transparency Principle
Statement of Principle: “Identity assurance can only take place in ways I understand and when I am fully informed.”
2.1 Each Identity Provider or Service Provider must be able to justify to Service Users why their IdA data are processed. Ensuring transparency of activity and effective oversight through auditing and other activities inspires public trust and confidence in how their details are used.
2.2 Each Service User must be offered a clear description about the processing of IdA data in advance of any processing. Identity Providers must be transparent with users about their particular models for service provision.
2.3 The information provided includes a clear explanation of why any specific information has to be provided by the Service User (e.g. in order that a particular level of identity assurance can be obtained) and identifies any obligation on the part of the Service User (e.g. in relation to the User’s role in securing his/her own identity information).
2.4 The Service User will be able to identify which Service Provider they are using at any given time.
2.5 Any subsequent and significant change to the processing arrangements that have been previously described to a Service User requires the prior consent or approval of that Service User before it comes into effect.
2.6 All procedures, including those involved with security, should be made publicly available at the appropriate time, unless such transparency presents a security or privacy risk. For example, the standards of encryption can be identified without jeopardy to the encryption keys being used.
3 Multiplicity Principle
Statement of Principle: “I can use and choose as many different identifiers or identity providers as I want to.”
3.1 A Service User is free to use any number of identifiers that each uniquely identifies the individual or business concerned.
3.2 A Service User can use any of his identities established with an Identity Provider with any Service Provider.
3.3 A Service User shall not be obliged to use any Identity Provider or Service Provider not chosen by that Service User; however, a Service Provider can require the Service User to provide a specific level of Identity Assurance, appropriate to the Service User’s request to a Service Provider.
3.4 A Service User can choose any number of Identity Providers and where possible can choose between Service Providers in order to meet his or her diverse needs. Where a Service User chooses to register with more than one Identity Provider, Identity Providers and Service Providers must not link the Service User’s different accounts or gain information about their use of other Providers.
3.5 A Service User can terminate, suspend or change Identity Provider and where possible can choose between Service Providers at any time.
3.6 A Service Provider does not know the identity of the Identity Provider used by a Service User to verify an identity in relation to a specific service. The Service Provider knows that the Identity Provider can be trusted because the Identity Provider has been certified, as set out in GPG43 – Requirements for Secure Delivery of Online Public Services (RSDOPS).
4 Data Minimisation Principle
Statement of Principle: “My interactions only use the minimum data necessary to meet my needs.”
4.1 Identity Assurance should only be used where a need has been established and only to the appropriate minimum level of assurance.
4.2 Identity Assurance data processed by an Identity Provider or a Service Provider to facilitate a request of a Service User must be the minimum necessary in order to fulfil that request in a secure and auditable manner.
4.3 When a Service User stops using a particular Identity Provider, their data should be deleted. Data should be retained only where required for specific targeted fraud, security or other criminal investigation purposes.
5 Data Quality Principle
Statement of Principle: “I can update or correct my data when I want to.”
5.1 Service Providers should enable Service Users (or authorised persons, such as the holder of a Power of Attorney) to be able to update their own personal data, at a time of their choosing, free of charge and in a simple and easy manner.
5.2 Identity Providers and Service Providers must take account of the appropriate level of identity assurance required before allowing any updating of personal data.
6 Service User Access and Portability Principle
Statement of Principle: “I have to be provided with copies of all of my data on request; I can move/remove my data whenever I want.”
6.1 Each Identity Provider or Service Provider must allow, promptly, on request and free of charge, each Service User access to any IdA data that relates to that Service User.
6.2 It shall be unlawful to make it a condition of doing anything in relation to a Service User to request or require that the Service User request IdA data.
6.3 The Service User must be able to require an Identity Provider to transfer his personal data, to a second Identity Provider in a standard electronic format, free of charge and without impediment or delay.
7 Certification Principle
Statement of Principle: “I can have confidence in the Identity Assurance Service because all the participants have to be certified against common governance requirements.”
7.1 As a baseline control, all Identity Providers and Service Providers will be certified against a shared standard. This is one important way of building trust and confidence in the service.
7.2 As part of the certification process, Identity Providers and Service Providers are obliged to co-operate with the independent Third Party and accept their impartial determination and to ensure that contractual arrangements—
• reinforce the application of the Identity Assurance Principles
• contain a reference to the independent Third Party as a mechanism for dispute resolution.
7.3 In the context of personal data, certification procedures include the use of Privacy Impact Assessments, Security Risk Assessments, Privacy by Design concepts and, in the context of information security, a commitment to using appropriate technical measures (e.g. encryption) and ever improving security management. Wherever possible, such certification processes and security procedures reliant on technical devices should be made publicly available at the appropriate time.
7.4 All Identity Providers and Service Providers will take all reasonable steps to ensure that a Third Party cannot capture IdA data that confirms (or infers) the existence of a relationship between any Participants. No relationships between parties or records should be established without the consent of the Service User.
7.5 Certification can be revoked if there is significant non-compliance with any Identity Assurance Principle.
8 Dispute Resolution Principle
Statement of Principle: “If I have a dispute, I can go to an independent Third Party for a resolution.”
8.1 A Service User who, after a reasonable time, cannot resolve a complaint or problem directly with an Identity Provider or Service Provider can call upon an independent Third Party to seek resolution of the issue. This could happen, for example, where there is a disagreement between the Service User and the Identity Provider about the accuracy of data.
8.2 The independent Third Party can resolve the same or similar complaints affecting a group of Service Users.
8.3 The independent Third Party can co-operate with other regulators in order to resolve problems and can raise relevant issues of importance concerning the Identity Assurance Service.
8.4 An adjudication/recommendation of the independent Third Party should be published. The independent Third Party must operate transparently, but detailed case histories should only be published subject to appropriate review and consent.
8.5 There can be more than one independent Third Party.
8.6 The independent Third Party can recommend changes to standards or certification procedures or that an Identity Provider or Service Provider should lose their certification.
9 Exceptional Circumstances Principle
Statement of Principle: “Any exception has to be approved by Parliament and is subject to independent scrutiny.”
9.1 Any exemption from the application of any of the above Principles to IdA data shall only be lawful if it is linked to a statutory framework that legitimises all Identity Assurance Services, or an Identity Assurance Service in the context of a specific service. In the absence of such a legal framework, alternative measures must be taken to ensure transparency, scrutiny and accountability for any exceptions.
9.2 Any exemption from the application of any of the above Principles that relates to the processing of personal data must also be necessary and justifiable in terms of one of the criteria in Article 8(2) of the European Convention on Human Rights: namely in the interests of national security; public safety or the economic well-being of the country; for the prevention of disorder or crime; for the protection of health or morals; or for the protection of the rights and freedoms of others.
9.3 Any subsequent processing of personal data by any Third Party who has obtained such data in exceptional circumstances (as identified by Article 8(2) above) must be the minimum necessary to achieve that (or another) exceptional circumstance.
9.4 Any exceptional circumstance involving the processing of personal data must be subject to a Privacy Impact Assessment by all relevant “data controllers” (where “data controller” takes its meaning from the Data Protection Act).
9.5 Any exemption from the application of any of the above Principles in relation to IdA data shall remain subject to the Dispute Resolution Principle.”
Amendment 220, in schedule 1, page 141, leave out from line 21 to the end of line 36 on page 144.
This amendment would remove from the new Annex 1 of the UK GDPR provisions which would enable direct marketing for the purposes of democratic engagement. See also Amendment 218.
Government amendments 266 to 277.
Government amendments 208 to 211.
Amendment 15, in schedule 5, page 154, line 2, at end insert—
“(g) the views of the Information Commission on suitability of international transfer of data to the country or organisation.”
This amendment requires the Secretary of State to seek the views of the Information Commission on whether a country or organisation has met the data protection test for international data transfer.
Amendment 14, page 154, line 25, at end insert—
“5. In relation to special category data, the Information Commissioner must assess whether the data protection test is met for data transfer to a third country or international organisation.”
This amendment requires the Information Commission to assess suitability for international transfer of special category data to a third country or international organisation.
Amendment 13, page 154, line 30, leave out “ongoing” and insert “annual”.
This amendment mandates that a country’s suitability for international transfer of data is monitored on an annual basis.
Amendment 16, in schedule 6, page 162, line 36, at end insert—
“(g) the views of the Information Commission on suitability of international transfer of data to the country or organisation.”
This amendment requires the Secretary of State to seek the views of the Information Commission on whether a country or organisation has met the data protection test for international data transfer in relation to law enforcement processing.
Government amendment 212.
Amendment 231, in schedule 13, page 202, line 33, at end insert—
“(2A) A person may not be appointed under sub-paragraph (2) unless the Science, Innovation and Technology Committee of the House of Commons has endorsed the proposed appointment.”
This amendment would ensure that non-executive members of the Information Commission may not be appointed unless the Science, Innovation and Technology Committee has endorsed the Secretary of State’s proposed appointee.
Government amendments 213 to 216.
The current one-size-fits-all, top-down approach to data protection that we inherited from the European Union has led to public confusion, which has impeded the effective use of personal data to drive growth and competition, and to support key innovations. The Bill seizes on a post-Brexit opportunity to build on our existing foundations and create an innovative, flexible and risk-based data protection regime. This bespoke model will unlock the immense possibilities of data use to improve the lives of everyone in the UK, and help make the UK the most innovative society in the world through science and technology.
I want to make it absolutely clear that the Bill will continue to maintain the highest standards of data protection that the British people rightly expect, but it will also help those who use our data to make our lives healthier, safer and more prosperous. That is because we have convened industry leaders and experts to co-design the Bill every step of the way. We have held numerous roundtables with industry experts and campaigning groups. The outcome, I believe, is legislation that reflects the way real people live their lives and run their businesses.
I am grateful to the Minister for giving way so early. Oxford West and Abingdon has a huge number of spin-offs and scientific businesses that have expressed concern that any material deviation on standards, particularly European Union data adequacy, would entangle them in more red tape, rather than remove it. He says he has spoken to industry leaders. Have he and his Department assessed the risk of any deviation? Is there any associated cost to businesses from any potential deviation? Who is going to bear that cost?
I share the hon. Lady’s appreciation of the importance of data adequacy with the European Union. It is not the case that we have to replicate every aspect of GDPR to be assessed as adequate by the European Union for the purposes of data exchange. Indeed, a number of other countries have data adequacy, even though they do not have precisely the same framework of data protection legislation.
In drawing up the measures in the Bill, we have been very clear that we do not wish to put data adequacy at risk, and we are confident that nothing in the Bill does so. That is not only my view; it is the view of the expert witnesses who gave evidence in Committee. It is also the view of the Information Commissioner, who has been closely involved in all the measures before us today. I recognise the concern, but I do not believe it has any grounds.
The Minister says, “We do not wish”. Is that a guarantee from the Dispatch Box that there will be absolutely no deviation that causes a material difference for businesses on EU data adequacy? Can he give that guarantee?
I can guarantee that there is nothing in the Government’s proposals that we believe puts data adequacy at risk. That is not just our view; it is the view of all those we have consulted, including the Information Commissioner. He was previously the information commissioner in New Zealand, which has its own data protection laws but is, nevertheless, recognised as adequate by the EU. He is very familiar with the process required to achieve and keep data adequacy, and it is his view, as well as ours, that the Bill achieves that objective.
We believe the Government amendments will strengthen the fundamental elements of the Bill and reflect the Government’s commitment to unleashing the power of data across our economy and society. I have already thanked all the external stakeholders who have worked with us to ensure that the Bill functions at its best. Taken together, these amendments will, we believe, benefit the economy by £10.6 billion over the next 10 years. That is more than double the estimated impact of the Bill when it was introduced in the spring.
Will the Minister confirm that no services will rely on digital identity checks?
I will come on to that, because we have tabled a few amendments on digital verification and the accreditation of digital identity.
We are proposing a voluntary framework. We believe that using digital identity has many advantages, and those will become greater as the technology improves, but there is no compulsory or mandatory element to the use of digital identity. I understand why the hon. Lady raises that point, and I am happy to give her that assurance.
Before my right hon. Friend moves on to the specifics of the Government amendments, may I ask him about something they do not yet cover? The Bill does not address the availability of data to researchers so that they can assist in the process of, for example, identifying patterns in online safety. He will know that there was considerable discussion of this during the passage of the Online Safety Act 2023, when a succession of Ministers said that we might return to the subject in this Bill. Will he update the House on how that is going? When might we expect to see amendments to deal with this important area?
It is true that we do not have Government amendments to that effect, but it is a central part of the Bill that we have already debated in Committee. Making data more available to researchers is, indeed, an objective of the Bill, and I share my right hon. and learned Friend’s view that it will produce great value. If he thinks more needs to be done in specific areas, I would be very happy to talk to him further or to respond in writing.
Broadly speaking, we support this measure. What negotiations and discussions has the Minister had about red notices under Interpol and the abuse of them, for instance by the Russian state? We have concerns about decent people being maltreated by the Russian state through the use of red notices. Are those concerns affected by the measure that the Government are introducing?
As the hon. Gentleman knows, I strongly share his view about the need to act against abuse of legal procedures by the Russian state. As he will appreciate, this aspect of the Bill emanated from the Home Office. However, I have no doubt that my colleagues in the Home Office will have heard the perfectly valid point he makes. I hope that they will be able to provide him with further information about it, and I will draw the matter to their attention.
I wish to say just a few more words about biometric material received from our international partners, which is a valuable tool in protecting the public from harm. Sometimes, counter-terrorism police receive biometrics from international partners together with identifiable information. Under current laws, they are not allowed to retain those biometrics unless they were taken in the past three years. That can make it harder for our counter-terrorism police to carry out their job effectively. That is why we are making changes to allow the police to take proactive steps to pseudonymise biometric data received from international partners—that is, to hold the material without the information that identifies the person to whom it relates—and to hold it indefinitely under existing provisions in the Counter-Terrorism Act. Again, those changes have been requested by counter-terrorism police and will support them to better protect the British public.
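To make that mechanism concrete, here is a minimal sketch of what pseudonymising such material could look like. It is illustrative only—the field names and the random-token approach are assumptions for the example, not the police systems or procedures themselves:

```python
import secrets

def pseudonymise(record: dict) -> tuple[dict, dict]:
    """Split a biometric record into a retainable, pseudonymised part and a
    separately held identity mapping. The shared random pseudonym is the only
    link between the two."""
    pseudonym = secrets.token_hex(16)  # random label; reveals nothing about the person
    biometric_part = {
        "pseudonym": pseudonym,
        "biometric_template": record["biometric_template"],
        "source_partner": record["source_partner"],
    }
    identity_part = {  # held apart, subject to the usual retention rules
        "pseudonym": pseudonym,
        "name": record["name"],
        "date_of_birth": record["date_of_birth"],
    }
    return biometric_part, identity_part
```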
The national underground asset register, or NUAR, is a digital map that will improve both the efficiency and safety of underground works, by providing secure access to privately and publicly owned location data about the pipes and cables beneath our feet. This will underpin the Government’s priority to get the economy growing by expediting projects such as new roads, new houses and broadband roll-out—the hon. Gentleman and I also share a considerable interest in that.
The NUAR will bring together valuable data from more than 700 public and private sector organisations about the location of underground utilities assets. This will deliver £490 million per year of economic growth through increased efficiency, reduced asset strikes and reduced disruption for citizens and businesses. Once operational, the running of the register will be funded by those who benefit most. The Government’s amendments include powers to levy charges on apparatus owners and to request relevant information, both through regulations. The introduction of reasonable charges payable by those who benefit from the service, rather than the taxpayer, will ensure that the NUAR is a sustainable service for the future. Other amendments will ensure that the full potential of this data can be realised for other high-value uses, while respecting the rights of asset owners.
Is any consideration given to the fact that that information could be used by bad actors? If people are able to find out where particular cables or pipes are, they also have the ability to find weakness in the system, which could have implications for us all.
I understand the hon. Lady’s point. There would need to be a legitimate purpose for accessing such information and I am happy to supply her with further detail about precisely how that works.
The hon. Lady intervenes at an appropriate point, because I was about to say that the provision will allow the National Underground Asset Register service to operate in England and Wales. We intend to bring forward equivalent provisions as the Bill progresses in the other House, subject to the usual agreements, to allow the service to operate in Northern Ireland, but the Scottish Road Works Commissioner currently maintains its own register. It has helped us in the development of the NUAR, so the hon. Lady may like to talk to the Scottish Road Works Commissioner on that point.
I turn to the use of data for the purposes of democratic engagement, which is an issue of considerable interest to Members of the House. The Bill includes provisions to facilitate the responsible use of personal data by elected representatives, registered political parties and others for the purposes of “democratic engagement”. We have tabled further related amendments for consideration today, including adding a fuller definition of what constitutes “democratic engagement activities” to help the reader understand that term wherever it appears in the legislation.
The amendments provide for former MPs to continue to process personal data following a successful recall petition, to enable them to complete urgent casework or hand over casework to a successor, as they do following the Dissolution of Parliament. For consistency, related amendments are made to the definitions used in provisions relating to direct marketing for the purposes of democratic engagement.
Finally, hon. Members may be aware that the Data Protection Act 2018 currently permits registered political parties to process sensitive political opinions data without consent for the purposes of their political activities. The exemption does not however currently apply to elected representatives, candidates, recall petitioners and permitted participants in referendums. The amendment addresses that anomaly and allows those individuals to benefit from the same exemption as registered political parties.
Is the Minister prepared to look at how the proposals in the Bill and the amendments align with relevant legislation passed in the Scottish Government? A number of framework Bills to govern the operation of potential future referendums on a variety of subjects have been passed, particularly the Referendums (Scotland) Act 2020. It is important that there is alignment with the definitions used in the Bill, such as that for “a permitted participant”. Will he commit to looking at that and, if necessary, make changes to the Bill at a later stage in its progress, in discussion with the Scottish Government?
I am happy to look at that, as the hon. Gentleman suggests. I hope the changes we are making to the Bill will provide greater legal certainty for MPs and others who undertake the processing of personal data for the purposes of democratic engagement.
The Bill starts and ends with reducing burdens on businesses and, above all, on small businesses, which account for over 99% of UK firms. In the future, organisations will need to keep records of their processing activities only when those activities are likely to result in a high risk to individuals. Some organisations have queried whether that means they will have to keep records in relation to all their activities if only some of their processing activities are high risk. That is not the Government’s intention. To maximise the benefits to business and other organisations, the amendments make it absolutely clear that organisations have to keep records only in relation to their high-risk processing activities.
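As a rough sketch of what that clarification means in practice—assuming a simple per-activity risk flag standing in for whatever assessment an organisation actually carries out—the record-keeping duty filters on the activity, not the organisation:

```python
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    name: str
    high_risk: bool  # outcome of the organisation's own risk assessment

def activities_needing_records(activities: list[ProcessingActivity]) -> list[str]:
    """Only high-risk activities trigger the record-keeping duty; low-risk
    processing by the same organisation does not."""
    return [a.name for a in activities if a.high_risk]

# An organisation with mixed activities keeps records for the high-risk one only.
portfolio = [
    ProcessingActivity("staff payroll", high_risk=False),
    ProcessingActivity("large-scale health profiling", high_risk=True),
]
print(activities_needing_records(portfolio))  # ['large-scale health profiling']
```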
The Online Safety Act 2023 took crucial steps to shield our children, and it is also important that we support grieving families who are seeking answers after tragic events where a child has taken their own life, by removing obstacles to accessing social media information that could be relevant to the coroner’s investigations.
We welcome such measures, but is the Minister aware of the case of Breck Bednar, who was groomed and then murdered? His family is campaigning not just for new clause 35 but for measures that go further. In that case, the coroner would have wanted access to Breck’s online life but, as it currently stands, new clause 35 does not provide what the family needs without a change to widen the scope of the amendment to the Online Safety Act. Will the Minister look at that? I think it will just require a tweak in some of the wording.
I understand the hon. Lady’s concerns. We want to do all that we can to support the bereaved parents of children who have lost their lives. As it stands, the amendment will require Ofcom, following notification from a coroner, to issue information notices to specified providers of online services, requiring them to hold data they may have relating to a deceased child’s use of online services where the coroner suspects that the child has taken their own life. That data could later be required by the coroner as relevant to an inquest.
We will continue to work with bereaved families and Members of the other place who have raised concerns. During the passage of the Online Safety Act, my noble colleague Lord Parkinson of Whitley Bay made it clear that we are aware of the importance of data preservation to bereaved parents, coroners and others involved in investigations. It is very important that we get this right. I hear what the hon. Lady says and give her an assurance that we will continue to work across Government, with the Ministry of Justice and others, in ensuring that we do so.
The hon. Member for Rhondda made reference to proposed new schedule 1, relating to improving our ability to identify and tackle fraud in the welfare system. I am grateful for the support of the Minister for Disabled People, Health and Work, my hon. Friend the Member for Corby (Tom Pursglove). In 2022-23, the Department for Work and Pensions overpaid £8.3 billion in fraud and error. A major area of loss is the under-declaration of financial assets, which we cannot currently tackle through existing powers. Given the need to address the scale of fraud and error in the welfare system, we need to modernise and strengthen the legal framework, to allow the Department for Work and Pensions to keep pace with change and stand up to future fraud challenges.
As I indicated earlier, the fraud plan, published in 2022, contains a provision outlining the DWP’s intention to bring forward new powers that would boost access to data held by third parties. The amendment will enable the DWP to access data held by third parties at scale where the information signals potential fraud or error. That will allow the DWP to detect fraud and error more proactively and protect taxpayers’ money from falling into the hands of fraudsters.
My reading of the proposed new schedule is that it gives the Department the power to look into the bank accounts of people claiming the state pension. Am I right about that?
The purpose of the proposed new schedule is narrowly focused. It will ensure that where benefit claimants may also have considerable financial assets, that is flagged with the DWP for further examination, but it does not allow anyone to go through the contents of claimants’ bank accounts. It is an alarm system: financial institutions that hold the accounts of benefit claimants can match those accounts against financial assets so that, where it appears fraud might be taking place, they can refer the matter to the Department.
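The logic described here is match-and-flag rather than inspection. A hedged sketch of that logic, using the £16,000 universal credit capital limit purely as an illustrative threshold and with invented field names:

```python
CAPITAL_LIMIT_GBP = 16_000  # illustrative threshold: the universal credit capital limit

def flag_for_review(claimant_ids: set[str], accounts: list[dict]) -> list[str]:
    """Return account holders who are benefit claimants and whose balance
    signals possible ineligibility. No transaction contents are examined;
    only the flag is referred to the Department for further examination."""
    return [
        account["holder_id"]
        for account in accounts
        if account["holder_id"] in claimant_ids
        and account["balance"] > CAPITAL_LIMIT_GBP
    ]
```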
I am surprised that the Opposition regard this as something to question. Obviously, they are entitled to seek further information, but I would hope that they share the wish to identify where fraud is taking place and take action against it. This is about claimants of benefits, including universal credit—
The state pension will not currently be an area of focus for the use of these powers.
The House of Commons Library makes it absolutely clear that the Bill, if taken forward in the way that the Government are proposing at the moment, does allow the Government to look at people in receipt of state pensions. That is the case, is it not?
I can tell the hon. Gentleman that it is not the case that the DWP intends to focus on the state pension—and that is confirmed by my hon. Friend the Member for Corby. This is specifically about ensuring that means-related benefit claimants are eligible for the benefits for which they are currently claiming. In doing that, the identification and the avoidance of fraud will save the taxpayer a considerable amount of money.
I think everybody in the House understands the importance of getting this right. We all want to stop fraud in the state system. That being said, this is the only time that I am aware of where the state seeks the right to put people under surveillance without prior suspicion, and therefore such a power has to be restricted very carefully indeed. As we are not going to have time to debate this properly today, is my right hon. Friend open to having further discussion on this issue when the Bill goes to the Lords, so that we can seek further restrictions? I do not mean to undermine the effectiveness of the action; I just want to make it more targeted.
I am very grateful to my right hon. Friend for his contribution, and I share his principled concern that the powers of the state should be limited to those that are absolutely necessary. Those who are in receipt of benefits funded by the taxpayer have an obligation to meet the terms of those benefits, and this provision is one way of ensuring that they do so. My hon. Friend the Member for Corby has already said that he would be very happy to discuss this matter with my right hon. Friend further, and I am happy to do the same if that is helpful to him.
Can the Minister give us an example of the circumstances in which the Department would need to look into the bank accounts of people claiming state pensions in order to tackle the fraud problem? Why is the state pension within the scope of this amendment?
All I can say to the right hon. Gentleman is that the Government have made it clear that there is no intention to focus on claimants of the state pension. That is an undertaking that has been given. I am sure that Ministers from the DWP would be happy to give further evidence to the right hon. Gentleman, who may well wish to look at this further in his Committee.
Finally, I wish to touch on the framework around smart data, which is contained in part 3 of the Bill. The smart data powers will extend the Government’s ability to introduce smart data schemes, building on the success of open banking, which is the UK’s most developed data sharing scheme, with more than 7 million active users. The amendments will support the Government’s ability to meet their commitment, first, to provide open banking with a long-term regulatory framework and, secondly, to establish an open data scheme for road fuel prices. They will also more generally strengthen the toolkit available to the Government to deliver future smart data schemes.
The amendments ensure that the range of data and activities essential to smart data schemes is better captured and more accurately defined. That includes the types of financial data and payment activities that are integral to open banking. The amendments, as I say, are complicated and technical, and therefore I will not go into further detail.
I will give way to my hon. Friend as I know that he has taken a particular interest, and is very knowledgeable, in this area.
The Minister is very kind. I just wanted to pick up on his last point about smart data. He is right to say that the provisions are incredibly important and potentially extremely valuable to the economy. Can he clarify a couple of points? On Government new clause 27 about interface bodies, does that apply to the kinds of new data standards that will be required under smart data? If it does, can he please clarify how he will make sure that we do not end up with multiple different standards for each sector of our economy? It is absolutely in everybody’s interests that the standards are interoperable and, to the greatest possible extent, common between sectors so that they can talk to each other.
I do have a note on interface bodies, which I am happy to share for the benefit of my hon. Friend. However, he will be aware that this is a technical and complicated area. If he wants to pursue a further discussion, I would of course be happy to oblige. I can tell him that the amendments will ensure that smart data schemes can replicate and build on the open banking model by allowing the Government to require interface bodies to be set up by members of the scheme. Interface bodies will play a similar role to that of the open banking implementation entity, developing common standards on arrangements for data sharing. Learning from the lessons and successes of the open banking regime, regulations will be able to specify the responsibilities and requirements for interface bodies and ensure appropriate accountability to regulators. I hope that that goes some way to addressing the point that he makes, but I would be happy to discuss it further with him in due course.
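The interoperability point is, at bottom, about shared schemas: if each sector’s interface body publishes an incompatible format, cross-sector data sharing breaks. A sketch of the idea, with every field name invented for illustration rather than drawn from any published standard:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SmartDataRecord:
    """A common envelope an interface body might standardise so that banking,
    energy and fuel-price schemes remain mutually readable."""
    scheme: str               # e.g. "open_banking", "road_fuel_prices"
    provider: str             # the data holder publishing the record
    consent_reference: str    # the customer's authorisation, where one is needed
    payload: dict             # sector-specific content
    issued_at: datetime

record = SmartDataRecord(
    scheme="road_fuel_prices",
    provider="ExampleFuelCo",
    consent_reference="n/a",  # open data: no individual consent involved
    payload={"site": "A1 services", "unleaded_ppl": 144.9},
    issued_at=datetime.now(timezone.utc),
)
```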
I believe these amendments will generally improve the functioning of the Bill and address some specific concerns that I have identified. On that basis, I commend them to the House.
With the leave of the House, I call the Minister to wind up the debate.
I thank all hon. Members who have contributed to the debate. I believe that these matters are important, if sometimes very complicated and technical. My hon. Friend the Member for Yeovil (Mr Fysh) was absolutely right to stress how fundamentally important they are, and they will become more so.
I also thank the shadow Minister for identifying the areas where we are in agreement. We had a good Committee stage with his colleague, the hon. Member for Barnsley East (Stephanie Peacock), where we agreed on the overall objectives of the Bill. It is welcome that the shadow Minister has supported us, particularly on the amendment that we moved this afternoon on the powers of the Information Commissioner’s Office, the provisions relating to digital verification services, and smart data. There were, however, some areas on which we will not agree.
Let me begin by addressing the main amendments that the hon. Gentleman has moved. Amendment 1 relates to high-risk processing. One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, carry out risk assessments and keep records of processing only when their activities pose high risks to individuals. The amendments that the hon. Gentleman is proposing would reintroduce a prescriptive list of high-risk processing activities drawn from article 35 of the UK GDPR. We find that some of the language in article 35 is unclear and confusing, which is partly why we removed it in the first place. We think organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing in the legislation, because any list could quickly become out of date. Instead, to help data controllers, clause 18 of the Bill requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing.
But the Minister has already indicated that, basically, he will come forward with exactly the same list as is in the legislation that the Government are amending. All that is happening is that, in the Bill, the Information Commissioner will be doing what the Government or the House could be doing, and this is the one area where the Government disagree with the Information Commissioner.
As I say, the Government do not believe that it is necessary to have a prescriptive list in the Bill. We feel that it is better that individuals make a judgment based on their assessment of the risk, with the guidance of the Information Commissioner.
Moving to the shadow Minister’s second amendment, the Government agree that controllers should not be able to refuse a request without proper thought or consideration. That is why the existing responsibility of controllers to facilitate requests from data subjects by default has not changed, and why the new article 12A also ensures that the burden of proof for a request meeting the vexatious or excessive threshold remains with the controller. The Government believe that is sufficient; stipulating that evidence must be provided each time a request is refused may not be appropriate in all circumstances and would likely bring further burdens for controllers. On that basis, we oppose the amendment.
On amendment 5, the safeguards set out in reformed article 22 of the UK GDPR ensure that individuals are able to seek human intervention when significant decisions about them are taken solely through automated means with no meaningful human involvement.
Partly automated decisions already involve meaningful human involvement, so there is no need to extend the safeguards in article 22 to all forms of automated decision making. In such instances, other data protection requirements continue to apply and offer relevant protections to data subjects, as set out in the broader UK data protection regime. Those protections include lawfulness, fairness, transparency and accountability.
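The dividing line here is whether a significant decision involved meaningful human involvement. A minimal sketch of how a controller might operationalise the article 22 safeguards—the review flag and the approval threshold are assumptions for the example, not anything prescribed by the Bill:

```python
def decide(automated_score: float, human_reviewed: bool) -> dict:
    """Approve or refuse on an automated score; if no human was meaningfully
    involved, attach the safeguards the data subject can exercise."""
    decision = {"approved": automated_score >= 0.5}
    if not human_reviewed:
        # solely automated: the reformed article 22 safeguards apply
        decision["subject_rights"] = [
            "obtain human intervention",
            "express their point of view",
            "contest the decision",
        ]
    return decision
```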
My understanding was that the level of fraud among state pension claims was indeed extremely small. The Minister said earlier that the Government should take powers only where they are absolutely necessary; I think he is now saying that they are not necessary in the case of people claiming a state pension. Is he confident that that bit of this power—to look into the bank account of anybody claiming a state pension—is absolutely necessary?
What I am saying is that the Government’s intention is to use the power only when there is clear evidence or suggestion that fraud is taking place on a significant scale. The Government simply want to retain the option to amend that should future evidence emerge; that is why the issue has been left open.
The trouble is that this is not about amending. The Government describe the relevant benefits in part 5 of proposed new schedule 3B, within new schedule 1, which is clear that pensions are included. The Minister has effectively said at the Dispatch Box that the Government do not need to tackle fraud in relation to pensions; perhaps it would be a good idea for us to all sit down and have a meeting to work out a more sensible set of measures to tackle fraud where it is necessary, rather than giving unending powers to the Government.
I agree that, with levels of fraud in the state pension currently close to zero, the power is not needed in that case. However, the Government wish to retain an option should the position change in the future. But I am happy to take the hon. Gentleman up on his request on behalf of my hon. Friend the Minister for Disabled People, Health and Work, with whom he has already engaged. I am sure that the right hon. Member for East Ham will want to examine the issue further in the Work and Pensions Committee, which he chairs. It will undoubtedly also be subject to further discussions in the other place. We are certainly open to further discussion.
The right hon. Member for East Ham also raised the question of commencement. I can tell him that the test and learn phase will begin in 2025, with a steady roll-out to full-scale delivery by 2030. I am sure that he will want to examine these matters further.
The amendment tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) focuses on digital exclusion. The Bill provides for the use of secure and inclusive digital identities across the economy. It does not force businesses or individuals to use them. Individual choice is integral to our approach. As the Bill makes clear, digital verification services can be provided only at the request of the individual. Where people want to use a digital verification service, the Government are committed to ensuring that available products and services are secure and privacy-focused. That is to be achieved through the high standards set out in the trust framework.
The trust framework also outlines how services can improve inclusion, and requires services to publish an annual inclusion monitoring report. There are businesses that operate only in the digital sphere, such as some online banks and energy companies, as I think has been acknowledged. We feel that to oblige them to offer manual document checking would place obligations on businesses that go beyond the Government’s commitment to do only what is necessary to enable the digital market to grow.
On amendment 224 from the Scottish National party, solely automated decision making that produces legal or similarly significant effects on individuals was not entirely prohibited previously under the UK’s data protection legal framework. The rules governing article 22 are confusing and complex, so clause 12 clarifies and simplifies the rules related to solely automated decision making, and will reduce barriers to responsible data use, help to drive innovation, and maintain high standards of data protection. The reforms do not water down any of the protections to data subjects offered under the broader UK data protection regime—that is, UK GDPR and the Data Protection Act 2018.
On the other amendment tabled by the SNP, amendment 229, effective independent oversight of surveillance camera systems is crucial to public trust. The oversight framework is complex and confusing for the police and the public because of substantial duplication between the surveillance camera commissioner’s functions and code—which cover police and local authorities in England and Wales only—and the ICO and data protection legislation. The Bill addresses that, following public consultation, by abolishing the surveillance camera commissioner and code.
The amendment tabled by the hon. Member for Glasgow North would negate that by retaining the code and transferring the surveillance camera commissioner functions to the investigatory powers commissioner. It would also blur the lines between overt and covert surveillance, which the investigatory powers commissioner oversees. Those two types of surveillance have distinct legislation and oversight, mainly because covert surveillance is generally considered to be significantly more intrusive.
On amendment 222, it is important to be clear that the ability to refuse or charge a reasonable fee for a request already exists, and clause 8 does not place new restrictions on reasonable requests from data subjects. The Government believe that it is proportionate to allow controllers to refuse or charge a reasonable fee for vexatious or excessive requests, and a clearer provision enables controllers to focus time and resources on responding to reasonable requests instead.
Amendments 278 and 279, tabled by my hon. Friend the Member for Yeovil, would remove the new lawful ground of recognised legitimate interests, which the Bill will add to article 6 of the UK GDPR. Amendment 230 accepts that there is merit in retaining the recognised legitimate interests list, but would make any additions to it subject to a super-affirmative parliamentary procedure. It is true that the Bill removes the need for non-public-sector organisations to do a detailed legitimate interests assessment in relation to a small number of processing activities. Those include activities relating, for example, to the safeguarding of children, crime prevention and responding to emergencies. We heard from stakeholders that the need to do an assessment and the fear of getting it wrong could sometimes delay or deter those important processing activities from taking place. Future Governments would not be able to add new activities to the list lightly; clause 5 of the Bill already makes it clear that the Secretary of State must carefully consider the rights and interests of people, and in particular the special protection needed for children, before adding anything new to the list. Any new regulations would also need to be approved via the affirmative resolution procedure.
My hon. Friend the Member for Yeovil has tabled a large number of other amendments, which are complicated in nature. I have written to him in some detail setting out the Government’s response to each of them, and if he wishes to pursue any of the points contained therein, I would be very happy to discuss them further with him.
I would like to comment on the amendments tabled by several of my colleagues, which I wish I were in a position to support. In particular, my hon. Friend the Member for Loughborough (Jane Hunt) has been assiduous in pursuing her point both in the Bill Committee and in this debate. The problem she identifies is without question a very real one, and she set out in some detail how it is massively increasing the burden on the police, which clearly we would wish to reduce wherever possible.
I have had meetings with Home Office Ministers, as my hon. Friend has, and they absolutely recognise the problem and share her wish to resolve it. While we welcome her intent, we do not think that her amendment as drafted would achieve her aim of removing the burden of redaction. Doing so would require the amendment of, and exceptions to, more principles than those identified in the amendment; indeed, it would require the amendment of more laws than just the Data Protection Act 2018.
The Government are absolutely committed to reducing the burden on the police, but it is obviously important that, if we do so, we do it right and that the solution works comprehensively. We are therefore actively working on ways to better address the issue, including through improved process, new technology, guidance and legislation. I am very happy to continue to work with her on achieving the aim that we all share, and so too, I know, are colleagues in the Home Office.
With respect to the amendments tabled by my hon. Friend the Member for Weston-super-Mare (John Penrose), as I indicated, we absolutely share his enthusiasm for smart data and ensuring that the powers within the Bill are implemented in a timely manner, with interoperability at their core. While I agree that we can only fully realise the benefits of smart data schemes if they enable interoperability, different sectors will have different levels of existing digital infrastructure and capability. Thus, we could inadvertently hinder the success of future schemes if we mandated the use of one universal set of standards based, for instance, on those used in open banking.
The Government will ensure that interoperability is central to the development of smart data schemes. To support our thinking, we are working with industry and regulators in the Smart Data Council to identify the technical infrastructure that needs to be replicated. With regard to the timeline—or even the timeline for a timeline—that my hon. Friend asked for, I recognise that it is important to build investor, industry and consumer confidence by outlining the Government’s planned timeline.
My hon. Friend is right to highlight the Chancellor’s comments in the autumn statement, where we set out plans to kick-start the smart data big bang, and our ambition for using those powers across seven sectors. At this stage I am afraid I am not able to accept his amendment, but it is our intention to set out those plans in more detail in the coming months. I know that the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), and I will be happy to work with him to do so.
The aim of the amendment tabled by the hon. Member for Jarrow (Kate Osborne) was to clarify that, when special category data of employees, such as health data, is transferred between members of a group of undertakings for internal administrative purposes on grounds of legitimate interests, the conditions and safeguards outlined in schedule 1 to the Data Protection Act 2018 should apply to that processing. The Government agree with the sentiment of her amendment but consider it unnecessary. The current legal framework already requires controllers to identify an exemption under article 9 of the UK GDPR if they are processing special category data. Those exemptions are supplemented by the conditions and safeguards outlined in schedule 1. Under those provisions, employers can process special category data where processing is necessary to comply with obligations under employment law. We therefore do not consider the amendment necessary.
Finally, I turn to new clause 45, tabled by my hon. Friend the Member for Aberconwy (Robin Millar). The Government are absolutely committed to improving the availability of comparable UK-wide data. He, too, has been assiduous in promoting that cause, and we are very happy to work with him. We are extremely supportive of the principle underlying his amendment. He is right to point out that people have the right to know the extent of Labour’s failings with the NHS in Wales, and his new clause sends an important message about our commitment to better data. I can commit to working at pace with him and the UK Statistics Authority to look at ways in which we may be able to implement the intentions of his amendment, and to bring forward legislative changes following those discussions.
On that basis, I commend the Government amendments to the House.
Question put and agreed to.
New clause 6 accordingly read a Second time, and added to the Bill.
For the benefit of all Members, we are before the knife, so we will have to go through a sequence of procedures. It would help me, the Clerk and the Minister if we had a degree of silence. This will take a little time, and we need to be able to concentrate.
New Clause 48
Processing of personal data revealing political opinions
“(1) Schedule 1 to the Data Protection Act 2018 (special categories of personal data) is amended in accordance with subsections (2) to (5).
(2) After paragraph 21 insert—
‘Democratic engagement
21A (1) This condition is met where—
(a) the personal data processed is personal data revealing political opinions,
(b) the data subject is aged 14 or over, and
(c) the processing falls within sub-paragraph (2),
subject to the exceptions in sub-paragraphs (3) and (4).
(2) Processing falls within this sub-paragraph if—
(a) the processing—
(i) is carried out by an elected representative or a person acting with the authority of such a representative, and
(ii) is necessary for the purposes of discharging the elected representative’s functions or for the purposes of the elected representative’s democratic engagement activities,
(b) the processing—
(i) is carried out by a registered political party, and
(ii) is necessary for the purposes of the party’s election activities or democratic engagement activities,
(c) the processing—
(i) is carried out by a candidate for election as an elected representative or a person acting with the authority of such a candidate, and
(ii) is necessary for the purposes of the candidate’s campaign for election,
(d) the processing—
(i) is carried out by a permitted participant in relation to a referendum or a person acting with the authority of such a person, and
(ii) is necessary for the purposes of the permitted participant’s campaigning in connection with the referendum, or
(e) the processing—
(i) is carried out by an accredited campaigner in relation to a recall petition or a person acting with the authority of such a person, and
(ii) is necessary for the purposes of the accredited campaigner’s campaigning in connection with the recall petition.
(3) Processing does not meet the condition in sub-paragraph (1) if it is likely to cause substantial damage or substantial distress to an individual.
(4) Processing does not meet the condition in sub-paragraph (1) if—
(a) an individual who is the data subject (or one of the data subjects) has given notice in writing to the controller requiring the controller not to process personal data in respect of which the individual is the data subject (and has not given notice in writing withdrawing that requirement),
(b) the notice gave the controller a reasonable period in which to stop processing such data, and
(c) that period has ended.
(5) For the purposes of sub-paragraph (2)(a) and (b)—
(a) “democratic engagement activities” means activities whose purpose is to support or promote democratic engagement;
(b) “democratic engagement” means engagement by the public, a section of the public or a particular person with, or with an aspect of, an electoral system or other democratic process in the United Kingdom, either generally or in connection with a particular matter, whether by participating in the system or process or engaging with it in another way;
(c) examples of democratic engagement activities include activities whose purpose is—
(i) to promote the registration of individuals as electors;
(ii) to increase the number of electors participating in elections for elected representatives, referendums or processes for recall petitions in which they are entitled to participate;
(iii) to support an elected representative or registered political party in discharging functions, or carrying on other activities, described in sub-paragraph (2)(a) or (b);
(iv) to support a person to become a candidate for election as an elected representative;
(v) to support a campaign or campaigning referred to in sub-paragraph (2)(c), (d) or (e);
(vi) to raise funds to support activities whose purpose is described in sub-paragraphs (i) to (v);
(d) examples of activities that may be democratic engagement activities include—
(i) gathering opinions, whether by carrying out a survey or by other means;
(ii) communicating with electors.
(6) In this paragraph—
“accredited campaigner” has the meaning given in Part 5 of Schedule 3 to the Recall of MPs Act 2015;
“candidate” , in relation to election as an elected representative, has the meaning given by the provision listed in the relevant entry in the second column of the table in sub-paragraph (7);
“elected representative” means a person listed in the first column of the table in sub-paragraph (7) and see also sub-paragraphs (8) to (10);
“election activities” , in relation to a registered political party, means—
(a) campaigning in connection with an election for an elected representative, and
(b) activities whose purpose is to enhance the standing of the party, or of a candidate standing for election in its name, with electors;
“elector” means a person who is entitled to vote in an election for an elected representative or in a referendum;
“permitted participant” has the same meaning as in Part 7 of the Political Parties, Elections and Referendums Act 2000 (referendums) (see section 105 of that Act);
“recall petition” has the same meaning as in the Recall of MPs Act 2015 (see section 1(2) of that Act);
“referendum” means a referendum or other poll held on one or more questions specified in, or in accordance with, an enactment;
“registered political party” means a person or organisation included in a register maintained under section 23 of the Political Parties, Elections and Referendums Act 2000;
“successful” , in relation to a recall petition, has the same meaning as in the Recall of MPs Act 2015 (see section 14 of that Act).
(7) This is the table referred to in the definitions of “candidate” and “elected representative” in sub-paragraph (6)—
Elected representative | Candidate for election as an elected representative
(a) a member of the House of Commons | section 118A of the Representation of the People Act 1983
(b) a member of the Senedd | article 84(2) of the National Assembly for Wales (Representation of the People) Order 2007 (S.I. 2007/236)
(c) a member of the Scottish Parliament | article 80(1) of the Scottish Parliament (Elections etc) Order 2015 (S.S.I. 2015/425)
(d) a member of the Northern Ireland Assembly | section 118A of the Representation of the People Act 1983, as applied by the Northern Ireland Assembly (Elections) Order 2001 (S.I. 2001/2599)
(e) an elected member of a local authority within the meaning of section 270(1) of the Local Government Act 1972, namely— (i) in England, a county council, a district council, a London borough council or a parish council; (ii) in Wales, a county council, a county borough council or a community council | section 118A of the Representation of the People Act 1983
(f) an elected mayor of a local authority within the meaning of Part 1A or 2 of the Local Government Act 2000 | section 118A of the Representation of the People Act 1983, as applied by the Local Authorities (Mayoral Elections) (England and Wales) Regulations 2007 (S.I. 2007/1024)
(g) a mayor for the area of a combined authority established under section 103 of the Local Democracy, Economic Development and Construction Act 2009 | section 118A of the Representation of the People Act 1983, as applied by the Combined Authorities (Mayoral Elections) Order 2017 (S.I. 2017/67)
(h) a mayor for the area of a combined county authority established under section 9 of the Levelling-up and Regeneration Act 2023 | section 118A of the Representation of the People Act 1983, as applied by the Combined Authorities (Mayoral Elections) Order 2017 (S.I. 2017/67)
(i) the Mayor of London or an elected member of the London Assembly | section 118A of the Representation of the People Act 1983
(j) an elected member of the Common Council of the City of London | section 118A of the Representation of the People Act 1983
(k) an elected member of the Council of the Isles of Scilly | section 118A of the Representation of the People Act 1983
(l) an elected member of a council constituted under section 2 of the Local Government etc (Scotland) Act 1994 | section 118A of the Representation of the People Act 1983
(m) an elected member of a district council within the meaning of the Local Government Act (Northern Ireland) 1972 (c. 9 (N.I.)) | section 130(3A) of the Electoral Law Act (Northern Ireland) 1962 (c. 14 (N.I.))
(n) a police and crime commissioner | article 3 of the Police and Crime Commissioner Elections Order 2012 (S.I. 2012/1917)
(8) For the purposes of the definition of “elected representative” in sub-paragraph (6), a person who is—
(a) a member of the House of Commons immediately before Parliament is dissolved,
(b) a member of the Senedd immediately before Senedd Cymru is dissolved,
(c) a member of the Scottish Parliament immediately before that Parliament is dissolved, or
(d) a member of the Northern Ireland Assembly immediately before that Assembly is dissolved,
is to be treated as if the person were such a member until the end of the period of 30 days beginning with the day after the day on which the subsequent general election in relation to that Parliament or Assembly is held.
(9) For the purposes of the definition of “elected representative” in sub-paragraph (6), where a member of the House of Commons’s seat becomes vacant as a result of a successful recall petition, that person is to be treated as if they were a member of the House of Commons until the end of the period of 30 days beginning with the day after—
(a) the day on which the resulting by-election is held, or
(b) if earlier, the day on which the next general election in relation to Parliament is held.
(10) For the purposes of the definition of “elected representative” in sub-paragraph (6), a person who is an elected member of the Common Council of the City of London and whose term of office comes to an end at the end of the day preceding the annual Wardmotes is to be treated as if the person were such a member until the end of the fourth day after the day on which those Wardmotes are held.’
(3) Omit paragraph 22 and the italic heading before it.
(4) In paragraph 23 (elected representatives responding to requests)—
(a) leave out sub-paragraphs (3) to (5), and
(b) at the end insert—
‘(6) In this paragraph, “elected representative” has the same meaning as in paragraph 21A.’
(5) In paragraph 24(3) (definition of ‘elected representative’), for ‘23’ substitute ‘21A’.
(6) In section 205(2) of the 2018 Act (general interpretation: periods of time), in paragraph (i), for ‘paragraph 23(4) and (5)’ substitute ‘paragraph 21A(8) to (10)’.”—(Sir John Whittingdale.)
This new clause inserts into Schedule 1 to the Data Protection Act 2018 (conditions for processing of special categories of personal data) a condition relating to processing by elected representatives, registered political parties and others of information about an individual’s political opinions for the purposes of democratic engagement activities and campaigning.
Brought up, read the First and Second time, and added to the Bill.
New Clause 7
Searches in response to data subjects’ requests
“(1) In Article 15 of the UK GDPR (right of access by the data subject)—
(a) after paragraph 1 insert—
‘1A. Under paragraph 1, the data subject is only entitled to such confirmation, personal data and other information as the controller is able to provide based on a reasonable and proportionate search for the personal data and other information described in that paragraph.’, and
(b) in paragraph 3, after ‘processing’ insert ‘to which the data subject is entitled under paragraph 1’.
(2) The 2018 Act is amended in accordance with subsections (3) and (4).
(3) In section 45 (law enforcement processing: right of access by the data subject), after subsection (2) insert—
‘(2A) Under subsection (1), the data subject is only entitled to such confirmation, personal data and other information as the controller is able to provide based on a reasonable and proportionate search for the personal data and other information described in that subsection.’
(4) In section 94 (intelligence services processing: right of access by the data subject), after subsection (2) insert—
‘(2ZA) Under subsection (1), the data subject is only entitled to such confirmation, personal data and other information as the controller is able to provide based on a reasonable and proportionate search for the personal data and other information described in that subsection.’
(5) The amendments made by this section are to be treated as having come into force on 1 January 2024.”—(Sir John Whittingdale.)
This new clause confirms that, in responding to subject access requests, controllers are only required to undertake reasonable and proportionate searches for personal data and other information.
Brought up, read the First and Second time, and added to the Bill.
New Clause 8
Notices from the Information Commissioner
“(1) The 2018 Act is amended in accordance with subsections (2) and (3).
(2) Omit section 141 (notices from the Commissioner).
(3) After that section insert—
‘141A Notices from the Commissioner
(1) This section applies in relation to a notice authorised or required by this Act to be given to a person by the Commissioner.
(2) The notice may be given to the person by—
(a) delivering it by hand to a relevant individual,
(b) leaving it at the person’s proper address,
(c) sending it by post to the person at that address, or
(d) sending it by email to the person’s email address.
(3) A “relevant individual” means—
(a) in the case of a notice to an individual, that individual;
(b) in the case of a notice to a body corporate (other than a partnership), an officer of that body;
(c) in the case of a notice to a partnership, a partner in the partnership or a person who has the control or management of the partnership business;
(d) in the case of a notice to an unincorporated body (other than a partnership), a member of its governing body.
(4) For the purposes of subsection (2)(b) and (c), and section 7 of the Interpretation Act 1978 (services of documents by post) in its application to those provisions, a person’s proper address is—
(a) in a case where the person has specified an address as one at which the person, or someone acting on the person’s behalf, will accept service of notices or other documents, that address;
(b) in any other case, the address determined in accordance with subsection (5).
(5) The address is—
(a) in a case where the person is a body corporate with a registered office in the United Kingdom, that office;
(b) in a case where paragraph (a) does not apply and the person is a body corporate, partnership or unincorporated body with a principal office in the United Kingdom, that office;
(c) in any other case, an address in the United Kingdom at which the Commissioner believes, on reasonable grounds, that the notice will come to the attention of the person.
(6) A person’s email address is—
(a) an email address published for the time being by that person as an address for contacting that person, or
(b) if there is no such published address, an email address by means of which the Commissioner believes, on reasonable grounds, that the notice will come to the attention of that person.
(7) A notice sent by email is treated as given 48 hours after it was sent, unless the contrary is proved.
(8) In this section “officer”, in relation to a body corporate, means a director, manager, secretary or other similar officer of the body.
(9) This section does not limit other lawful means of giving a notice.’
(4) In Schedule 2 to the Electronic Identification and Trust Services for Electronic Transactions Regulations 2016 (S.I. 2016/696) (Commissioner’s enforcement powers), in paragraph 1(b), for ‘141’ substitute ‘141A’.”—(Sir John Whittingdale.)
This amendment adjusts the procedure by which notices can be given by the Information Commissioner under the Data Protection Act 2018. In particular, it enables the Information Commissioner to give notices by email without obtaining the consent of the recipient to use that mode of delivery.
Brought up, read the First and Second time, and added to the Bill.
New Clause 9
Court procedure in connection with subject access requests
“(1) The Data Protection Act 2018 is amended as follows.
(2) For the italic heading before section 180 substitute—
‘Jurisdiction and court procedure’.
(3) After section 180 insert—
‘180A Procedure in connection with subject access requests
(1) This section applies where a court is required to determine whether a data subject is entitled to information by virtue of a right under—
(a) Article 15 of the UK GDPR (right of access by the data subject);
(b) Article 20 of the UK GDPR (right to data portability);
(c) section 45 of this Act (law enforcement processing: right of access by the data subject);
(d) section 94 of this Act (intelligence services processing: right of access by the data subject).
(2) The court may require the controller to make available for inspection by the court so much of the information as is available to the controller.
(3) But, unless and until the question in subsection (1) has been determined in the data subject’s favour, the court may not require the information to be disclosed to the data subject or the data subject’s representatives, whether by discovery (or, in Scotland, recovery) or otherwise.
(4) Where the question in subsection (1) relates to a right under a provision listed in subsection (1)(a), (c) or (d), this section does not confer power on the court to require the controller to carry out a search for information that is more extensive than the reasonable and proportionate search required by that provision.’”—(Sir John Whittingdale.)
This new clause makes provision about courts’ powers to require information to be provided to them, and to a data subject, when determining whether a data subject is entitled to information under certain provisions of the data protection legislation.
Brought up, read the First and Second time, and added to the Bill.
New Clause 10
Approval of a supplementary code
“(1) This section applies to a supplementary code whose content is for the time being determined by a person other than the Secretary of State.
(2) The Secretary of State must approve the supplementary code if—
(a) the code meets the conditions set out in the DVS trust framework (so far as relevant),
(b) an application for approval of the code is made which complies with any requirements imposed by a determination under section (Applications for approval and re-approval), and
(c) the applicant pays any fee required to be paid by a determination under section (Fees for approval, re-approval and continued approval)(1).
(3) The Secretary of State must notify an applicant in writing of the outcome of an application for approval.
(4) The Secretary of State may not otherwise approve a supplementary code.
(5) In this Part, an “approved supplementary code” means a supplementary code for the time being approved under this section.
(6) For when a code ceases (or may cease) to be approved under this section, see sections (Change to conditions for approval or designation), (Revision of a recognised supplementary code) and (Request for withdrawal of approval).”—(Sir John Whittingdale.)
This amendment sets out when a supplementary code of someone other than the Secretary of State must be approved by the Secretary of State.
Brought up, read the First and Second time, and added to the Bill.
New Clause 11
Designation of a supplementary code
“(1) This section applies to a supplementary code whose content is for the time being determined by the Secretary of State.
(2) If the Secretary of State determines that the supplementary code meets the conditions set out in the DVS trust framework (so far as relevant), the Secretary of State may designate the code as one which complies with the conditions.
(3) In this Part, a ‘designated supplementary code’ means a supplementary code for the time being designated under this section.
(4) For when a code ceases (or may cease) to be designated under this section, see sections (Change to conditions for approval or designation), (Revision of a recognised supplementary code) and (Removal of designation).”—(Sir John Whittingdale.)
This enables the Secretary of State to designate a supplementary code of the Secretary of State as one which complies with the conditions set out in the DVS trust framework.
Brought up, read the First and Second time, and added to the Bill.
New Clause 12
List of recognised supplementary codes
“(1) The Secretary of State must—
(a) maintain a list of recognised supplementary codes, and
(b) make the list publicly available.
(2) For the purposes of this Part, each of the following is a ‘recognised supplementary code’—
(a) an approved supplementary code, and
(b) a designated supplementary code.”—(Sir John Whittingdale.)
This amendment places the Secretary of State under a duty to publish, and keep up to date, a list of supplementary codes that are designated or approved.
Brought up, read the First and Second time, and added to the Bill.
New Clause 13
Change to conditions for approval or designation
“(1) This section applies if the Secretary of State revises the DVS trust framework so as to change the conditions which must be met for the approval or designation of a supplementary code.
(2) An approved supplementary code which is affected by the change ceases to be an approved supplementary code at the end of the relevant period unless an application for re-approval of the code is made within that period.
(3) Pending determination of an application for re-approval the supplementary code remains an approved supplementary code.
(4) Before the end of the relevant period the Secretary of State must—
(a) review each designated supplementary code which is affected by the change (if any), and
(b) determine whether it meets the conditions as changed.
(5) If, on a review under subsection (4), the Secretary of State determines that a designated supplementary code does not meet the conditions as changed, the code ceases to be a designated supplementary code at the end of the relevant period.
(6) A supplementary code is affected by a change if the change alters, or adds, a condition which is or would be relevant to the supplementary code when deciding whether to approve it under section (Approval of a supplementary code) or designate it under section (Designation of a supplementary code).
(7) In this section “the relevant period” means the period of 21 days beginning with the day on which the DVS trust framework containing the change referred to in subsection (1) comes into force.
(8) Section (Approval of a supplementary code) applies to re-approval of a supplementary code as it applies to approval of such a code.”—(Sir John Whittingdale.)
This amendment provides that when conditions for approval or designation are changed this requires re-approval of an approved supplementary code and, in the case of a designated supplementary code, a re-assessment of whether the code meets the revised conditions.
Brought up, read the First and Second time, and added to the Bill.
New Clause 14
Revision of a recognised supplementary code
“(1) If an approved supplementary code is revised—
(a) the code before and after the revision are treated as the same code for the purposes of this Part, and
(b) the code ceases to be an approved supplementary code unless subsection (2) or (4) applies.
(2) This subsection applies if the supplementary code, in its revised form, has been approved under section (Approval of a supplementary code).
(3) If subsection (2) applies the approved supplementary code, in its revised form, remains an approved supplementary code.
(4) This subsection applies for so long as—
(a) a decision is pending under section (Approval of a supplementary code) on an application for approval of the supplementary code in its revised form, and
(b) the revisions to the code have not taken effect.
(5) If subsection (4) applies the supplementary code, in its unrevised form, remains an approved supplementary code.
(6) The Secretary of State may revise a designated supplementary code only if the Secretary of State is satisfied that the code, in its revised form, meets the conditions set out in the DVS trust framework (so far as relevant).
(7) If a designated supplementary code is revised, the code before and after the revision are treated as the same code for the purposes of this Part.”—(Sir John Whittingdale.)
This amendment sets out the consequences where there are changes to a recognised supplementary code and, in particular, what needs to be done for the code to remain a recognised supplementary code.
Brought up, read the First and Second time, and added to the Bill.
New Clause 15
Applications for approval and re-approval
“(1) The Secretary of State may determine—
(a) the form of an application for approval or re-approval under section (Approval of a supplementary code),
(b) the information to be contained in or provided with the application,
(c) the documents to be provided with the application,
(d) the manner in which the application is to be submitted, and
(e) who may make the application.
(2) A determination may make different provision for different purposes.
(3) The Secretary of State must publish a determination.
(4) The Secretary of State may revise a determination.
(5) If the Secretary of State revises a determination the Secretary of State must publish the determination as revised.”—(Sir John Whittingdale.)
This amendment enables the Secretary of State to determine the process for making a valid application for approval of a supplementary code.
Brought up, read the First and Second time, and added to the Bill.
New Clause 16
Fees for approval, re-approval and continued approval
“(1) The Secretary of State may determine that a person who applies for approval or re-approval of a supplementary code under section (Approval of a supplementary code) must pay a fee to the Secretary of State of an amount specified in the determination.
(2) A determination under subsection (1) may specify an amount which exceeds the administrative costs of determining the application for approval or re-approval.
(3) The Secretary of State may determine that a fee is payable to the Secretary of State, of an amount and at times specified in the determination, in connection with the continued approval of a supplementary code.
(4) A determination under subsection (3)—
(a) may specify an amount which exceeds the administrative costs associated with the continued approval of a supplementary code, and
(b) must specify, or describe, who must pay the fee.
(5) A fee payable under subsection (3) is recoverable summarily (or, in Scotland, recoverable) as a civil debt.
(6) A determination may make different provision for different purposes.
(7) The Secretary of State must publish a determination.
(8) The Secretary of State may revise a determination.
(9) If the Secretary of State revises a determination the Secretary of State must publish the determination as revised.”—(Sir John Whittingdale.)
This amendment enables the Secretary of State to determine that a fee is payable for approval/re-approval/continued approval of a supplementary code and the amount of such a fee.
Brought up, read the First and Second time, and added to the Bill.
New Clause 17
Request for withdrawal of approval
“(1) The Secretary of State must withdraw approval of a supplementary code if—
(a) the Secretary of State receives a notice requesting the withdrawal of approval of the supplementary code, and
(b) the notice complies with any requirements imposed by a determination under subsection (3).
(2) Before the day on which the approval is withdrawn, the Secretary of State must inform the person who gave the notice of when it will be withdrawn.
(3) The Secretary of State may determine—
(a) the form of a notice,
(b) the information to be contained in or provided with the notice,
(c) the documents to be provided with the notice,
(d) the manner in which the notice is to be submitted,
(e) who may give the notice.
(4) A determination may make different provision for different purposes.
(5) The Secretary of State must publish a determination.
(6) The Secretary of State may revise a determination.
(7) If the Secretary of State revises a determination the Secretary of State must publish the determination as revised.”—(Sir John Whittingdale.)
This amendment enables a supplementary code to be “de-approved”, on request.
Brought up, read the First and Second time, and added to the Bill.
New Clause 18
Removal of designation
“(1) The Secretary of State may determine to remove the designation of a supplementary code.
(2) A determination must—
(a) be published, and
(b) specify when the designation is to be removed, which must be a time after the end of the period of 21 days beginning with the day on which the determination is published.”—(Sir John Whittingdale.)
This amendment enables the Secretary of State to determine that a designated supplementary code should cease to be designated.
Brought up, read the First and Second time, and added to the Bill.
New Clause 19
Registration of additional services
“(1) Subsection (2) applies if—
(a) a person is registered in the DVS register,
(b) the person applies for their entry in the register to be amended to record additional digital verification services that the person provides in accordance with the main code,
(c) the person holds a certificate from an accredited conformity assessment body certifying that the person provides the additional services in accordance with the main code,
(d) the application complies with any requirements imposed by a determination under section 51, and
(e) the person pays any fee required to be paid by a determination under section 52(1).
(2) The Secretary of State must amend the DVS register to record that the person is also registered in respect of the additional services referred to in subsection (1).
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) it is required to be ignored by reason of provision included in the DVS trust framework under section 49(10).”—(Sir John Whittingdale.)
This amendment provides for a person to apply to add services to their entry in the DVS register and requires the Secretary of State to amend the register to record that a person is registered in respect of the additional services.
Brought up, read the First and Second time, and added to the Bill.
New Clause 20
Supplementary notes
“(1) Subsection (2) applies if—
(a) a person holds a certificate from an accredited conformity assessment body certifying that digital verification services provided by the person are provided in accordance with a recognised supplementary code,
(b) the person applies for a note about one or more of the services to which the certificate relates to be included in the entry relating to that person in the DVS register,
(c) the application complies with any requirements imposed by a determination under section 51, and
(d) the person pays any fee required to be paid by a determination under section 52(1).
(2) The Secretary of State must include a note in the entry relating to the person in the DVS register recording that the person provides, in accordance with the recognised supplementary code referred to in subsection (1), the services in respect of which the person made the application referred to in that subsection.
(3) The Secretary of State may not otherwise include a note described in subsection (2) in the DVS register.
(4) For the purposes of subsection (1)(a), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (5) applies.
(5) This subsection applies if—
(a) the recognised supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.
(6) In this Part, a note included in the DVS register in accordance with subsection (2) is referred to as a supplementary note.”—(Sir John Whittingdale.)
This amendment provides for a person to apply for a note to be included in the DVS register that they provide digital verification services in accordance with a recognised supplementary code.
Brought up, read the First and Second time, and added to the Bill.
New Clause 21
Addition of services to supplementary notes
“(1) Subsection (2) applies if—
(a) a person has a supplementary note included in the DVS register,
(b) the person applies for the note to be amended to record additional digital verification services that the person provides in accordance with a recognised supplementary code,
(c) the person holds a certificate from an accredited conformity assessment body certifying that the person provides the additional services in accordance with the recognised supplementary code referred to in paragraph (b),
(d) the application complies with any requirements imposed by a determination under section 51, and
(e) the person pays any fee required to be paid by a determination under section 52(1).
(2) The Secretary of State must amend the note to record that the person also provides the additional services referred to in subsection (1) in accordance with the recognised supplementary code referred to in that subsection.
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (4) applies.
(4) This subsection applies if—
(a) the recognised supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.”—(Sir John Whittingdale.)
This amendment provides for a person to add services to their supplementary note in the DVS register and requires the Secretary of State to amend the note to record that a person is registered in respect of the additional services.
Brought up, read the First and Second time, and added to the Bill.
New Clause 22
Duty to remove services from the DVS register
“(1) Where a person is registered in the DVS register in respect of digital verification services, subsection (2) applies if the person—
(a) asks for the register to be amended so that the person is no longer registered in respect of one or more of those services,
(b) ceases to provide one or more of those services, or
(c) no longer holds a certificate from an accredited conformity assessment body certifying that all of those services are provided in accordance with the main code.
(2) The Secretary of State must amend the register to record that the person is no longer registered in respect of (as the case may be)—
(a) the service or services mentioned in a request described in subsection (1)(a),
(b) the service or services which the person has ceased to provide, or
(c) the service or services for which there is no longer a certificate as described in subsection (1)(c).
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) it is required to be ignored by reason of provision included in the DVS trust framework under section 49(10).”—(Sir John Whittingdale.)
This amendment places the Secretary of State under a duty to amend the DVS register, in certain circumstances, to record that a person is no longer registered in respect of certain services.
Brought up, read the First and Second time, and added to the Bill.
New Clause 23
Duty to remove supplementary notes from the DVS register
“(1) The Secretary of State must remove a supplementary note included in the entry in the DVS register relating to a person if—
(a) the person asks for the note to be removed,
(b) the person ceases to provide all of the digital verification services to which the note relates,
(c) the person no longer holds a certificate from an accredited conformity assessment body certifying that at least one of those digital verification services is provided in accordance with the supplementary code, or
(d) the person continues to hold a certificate described in paragraph (c) but the supplementary code is not a recognised supplementary code.
(2) For the purposes of subsection (1)(c) and (d), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (3) applies.
(3) This subsection applies if—
(a) the supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.”—(Sir John Whittingdale.)
This amendment sets out the circumstances in which the Secretary of State must remove a supplementary note from the DVS register.
Brought up, read the First and Second time, and added to the Bill.
New Clause 24
Duty to remove services from supplementary notes
“(1) Where a person has a supplementary note included in their entry in the DVS register in respect of digital verification services, subsection (2) applies if the person—
(a) asks for the register to be amended so that the note no longer records one or more of those services,
(b) ceases to provide one or more of the services recorded in the note, or
(c) no longer holds a certificate from an accredited conformity assessment body certifying that all of the services included in the note are provided in accordance with a supplementary code.
(2) The Secretary of State must amend the supplementary note so it no longer records (as the case maA24y be)—
(a) the service or services mentioned in a request described in subsection (1)(a),
(b) the service or services which the person has ceased to provide, or
(c) the service or services for which there is no longer a certificate as described in subsection (1)(c).
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (4) applies.
(4) This subsection applies if—
(a) the supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.”—(Sir John Whittingdale.)
This amendment places the Secretary of State under a duty to amend a supplementary note on the DVS register relating to a person, in certain circumstances, to remove reference to certain services from the note.
Brought up, read the First and Second time, and added to the Bill.
New Clause 25
Index of defined terms for Part 2
“The Table below lists provisions that define or otherwise explain terms defined for the purposes of this Part of this Act.
Term | Provision
accredited conformity assessment body | section 50(7)
approved supplementary code | section (Approval of a supplementary code)(6)
designated supplementary code | section (Designation of a supplementary code)(3)
digital verification services | section 48(2)
the DVS register | section 50(2)
the DVS trust framework | section 49(2)(a)
the main code | section 49(2)(b)
recognised supplementary code | section (List of recognised supplementary codes)(2)
supplementary code | section 49(2)(c)
supplementary note | section (Supplementary notes)(6)
the data protection legislation | section 236”—(Sir John Whittingdale.)
This amendment provides an index of terms which are defined in Part 2.
Brought up, read the First and Second time, and added to the Bill.
New Clause 26
Powers relating to verification of identity or status
“(1) In section 15 of the Immigration, Asylum and Nationality Act 2006 (penalty for employing a person subject to immigration control), after subsection (7) insert—
“(8) An order under subsection (3) containing provision described in subsection (7)(a), (b) or (c) may, in particular—
(a) specify a document generated by a DVS-registered person or a DVS-registered person of a specified description;
(b) specify a document which was provided to such a person in order to generate such a document;
(c) specify steps involving the use of services provided by such a person.
(9) In subsection (8), “DVS-registered person” means a person who is registered in the DVS register maintained under Part 2 of the Data Protection and Digital Information Act 2024 (“the DVS register”).
(10) An order under subsection (3) which specifies a description of DVS-registered person may do so by, for example, describing a DVS-registered person whose entry in the DVS register includes a note relating to specified services (see section (Supplementary notes) of the Data Protection and Digital Information Act 2024).”
(2) In section 34 of the Immigration Act 2014 (requirements which may be prescribed for the purposes of provisions about occupying premises under a residential tenancy agreement)—
(a) in subsection (1)—
(i) in paragraph (a), after “occupiers” insert “, a DVS-registered person or a DVS-registered person of a prescribed description”,
(ii) in paragraph (b), after “occupiers” insert “, a DVS-registered person or a DVS-registered person of a prescribed description”, and
(iii) in paragraph (c), at the end insert “, including steps involving the use of services provided by a DVS-registered person or a DVS-registered person of a prescribed description”, and
(b) after that subsection insert—
“(1A) An order prescribing requirements for the purposes of this Chapter which contains provision described in subsection (1)(a) or (b) may, in particular—
(a) prescribe a document generated by a DVS-registered person or a DVS-registered person of a prescribed description;
(b) prescribe a document which was provided to such a person in order to generate such a document.
(1B) In subsections (1) and (1A), “DVS-registered person” means a person who is registered in the DVS register maintained under Part 2 of the Data Protection and Digital Information Act 2024 (“the DVS register”).
(1C) An order prescribing requirements for the purposes of this Chapter which prescribes a description of DVS-registered person may do so by, for example, describing a DVS-registered person whose entry in the DVS register includes a note relating to prescribed services (see section (Supplementary notes) of the Data Protection and Digital Information Act 2024).”
(3) In Schedule 6 to the Immigration Act 2016 (illegal working compliance orders etc), after paragraph 5 insert—
“Prescribed checks and documents
5A (1) Regulations under paragraph 5(6)(b) or (c) may, in particular—
(a) prescribe checks carried out using services provided by a DVS-registered person or a DVS-registered person of a prescribed description;
(b) prescribe documents generated by such a person;
(c) prescribe documents which were provided to such a person in order to generate such documents.
(2) In sub-paragraph (1), “DVS-registered person” means a person who is registered in the DVS register maintained under Part 2 of the Data Protection and Digital Information Act 2024 (“the DVS register”).
(3) Regulations under paragraph 5(6)(b) or (c) which prescribe a description of DVS-registered person may do so by, for example, describing a DVS-registered person whose entry in the DVS register includes a note relating to prescribed services (see section (Supplementary notes) of the Data Protection and Digital Information Act 2024).””—(Sir John Whittingdale.)
This amendment contains amendments of powers to make subordinate legislation so they can be exercised so as to make provision by reference to persons registered in the DVS register established under Part 2 of the Bill.
Brought up, read the First and Second time, and added to the Bill.
New Clause 27
Interface bodies
“(1) This section is about the provision that regulations under section 66 or 68 may (among other things) contain about bodies with one or more of the following tasks—
(a) establishing a facility or service used, or capable of being used, for providing, publishing or otherwise processing customer data or business data or for taking action described in section 66(3) (an “interface”);
(b) setting standards (“interface standards”), or making other arrangements (“interface arrangements”), for use by other persons when establishing, maintaining or managing an interface;
(c) maintaining or managing an interface, interface standards or interface arrangements.
(2) Such bodies are referred to in this Part as “interface bodies”.
(3) The regulations may—
(a) require a data holder, an authorised person or a third party recipient to set up an interface body;
(b) make provision about the type of body to be set up.
(4) In relation to an interface body (whether or not it is required to be set up by regulations under section 66 or 68), the regulations may—
(a) make provision about the body’s composition and governance;
(b) make provision requiring a data holder, an authorised person or a third party recipient to provide, or arrange for, assistance for the body;
(c) impose other requirements relating to the body on a person required to set it up or to provide, or arrange for, assistance for the body;
(d) make provision requiring the body to carry on all or part of a task described in subsection (1);
(e) make provision requiring the body to do other things in connection with its interface, interface standards or interface arrangements;
(f) make provision about how the body carries out its functions (such as, for example, provision about the body’s objectives or matters to be taken into account by the body);
(g) confer powers on the body for the purpose of monitoring use of its interface, interface standards or interface arrangements (“monitoring powers”) (and see section 71 for provision about enforcement of requirements imposed in exercise of those powers);
(h) make provision for the body to arrange for its monitoring powers to be exercised by another person;
(i) make provision about the rights of persons affected by the exercise of the body’s functions under the regulations, including (among other things)—
(i) provision about the review of decisions made in exercise of those functions;
(ii) provision about appeals to a court or tribunal;
(j) make provision about complaints, including provision requiring the body to implement procedures for the handling of complaints;
(k) make provision enabling or requiring the body to publish, or provide to a specified person, specified documents or information relating to its interface, interface standards or interface arrangements;
(l) make provision enabling or requiring the body to produce guidance about how it proposes to exercise its functions under the regulations, to publish the guidance and to provide copies to specified persons.
(5) The monitoring powers that may be conferred on an interface body include power to require the provision of documents or information (but such powers are subject to the restrictions in section 72 as well as any restrictions included in the regulations).
(6) Examples of facilities or services referred to in subsection (1) include dashboard services, other electronic communications services and application programming interfaces.
(7) In subsection (4)(b) and (c), the references to assistance include actual or contingent financial assistance (such as, for example, a grant, loan, guarantee or indemnity or buying a company’s share capital).”—(Sir John Whittingdale.)
This new clause enables regulations under Part 3 to make provision about bodies providing facilities or services used for providing, publishing or processing customer data or business data, or setting standards or making other arrangements in connection with such facilities or services.
Brought up, read the First and Second time, and added to the Bill.
New Clause 28
The FCA and financial services interfaces
“(1) The Treasury may by regulations make provision enabling or requiring the Financial Conduct Authority (“the FCA”) to make rules—
(a) requiring financial services providers described in the regulations to use a prescribed interface, or prescribed interface standards or interface arrangements, when providing or receiving customer data or business data which is required to be provided by or to the financial services provider by data regulations;
(b) requiring persons described in the regulations to use a prescribed interface, or prescribed interface standards or interface arrangements, when the person, in the course of a business, receives, from a financial services provider, customer data or business data which is required to be provided to the person by data regulations;
(c) imposing interface-related requirements on a description of person falling within subsection (2),
and such rules are referred to in this Part as “FCA interface rules”.
(2) The following persons fall within this subsection—
(a) an interface body linked to the financial services sector on which requirements are imposed by regulations made in reliance on section (Interface bodies);
(b) a person required by regulations made in reliance on section (Interface bodies) to set up an interface body linked to the financial services sector;
(c) a person who uses an interface, interface standards or interface arrangements linked to the financial services sector or who is required to do so by data regulations or rules made by virtue of regulations under subsection (1)(a) or (b).
(3) For the purposes of this section, requirements are interface-related if they relate to—
(a) the composition, governance or activities of an interface body linked to the financial services sector,
(b) an interface, interface standards or interface arrangements linked to the financial services sector, or
(c) the use of such an interface, such interface standards or such interface arrangements.
(4) For the purposes of this section—
(a) an interface body is linked to the financial services sector to the extent that its interface, interface standards or interface arrangements are linked to the financial services sector;
(b) interfaces, interface standards and interface arrangements are linked to the financial services sector to the extent that they are used, or intended to be used, by financial services providers (whether or not they are used, or intended to be used, by other persons).
(5) The Treasury may by regulations make provision enabling or requiring the FCA to impose requirements on a person to whom FCA interface rules apply (referred to in this Part as “FCA additional requirements”) where the FCA considers it appropriate to impose the requirement—
(a) in response to a failure, or likely failure, by the person to comply with an FCA interface rule or FCA additional requirement, or
(b) in order to advance a purpose which the FCA is required to advance when exercising functions conferred by regulations under this section (see section (The FCA and financial services interfaces: supplementary)(3)(a)).
(6) Regulations under subsection (5) may, for example, provide for the FCA to impose requirements by giving a notice or direction.
(7) The restrictions in section 72 apply in connection with FCA interface rules and FCA additional requirements as they apply in connection with regulations under this Part.
(8) In section 72 as so applied—
(a) the references in subsections (1)(b) and (8) to an enforcer include the FCA, and
(b) the references in subsections (3) and (4) to data regulations include FCA interface rules and FCA additional requirements.
(9) In this section—
“financial services provider” means a person providing financial services;
“prescribed” means prescribed in FCA interface rules.”—(Sir John Whittingdale.)
This new clause and new clause NC29 enable the Treasury, by regulations, to confer powers on the Financial Conduct Authority to impose requirements (by means of rules or otherwise) on interface bodies used by the financial services sector and on persons participating in, or using facilities and services provided by, such bodies.
Brought up, read the First and Second time, and added to the Bill.
New Clause 29
The FCA and financial services interfaces: supplementary
“(1) This section is about provision that regulations under section (The FCA and financial services interfaces) may or must (among other things) contain.
(2) The regulations—
(a) may enable or require the FCA to impose interface-related requirements that could be imposed by regulations made in reliance on section (Interface bodies)(4) or (5), but
(b) may not enable or require the FCA to require a person to set up an interface body.
(3) The regulations must—
(a) require the FCA, so far as is reasonably possible, to exercise functions conferred by the regulations in a manner which is compatible with, or which advances, one or more specified purposes;
(b) specify one or more matters to which the FCA must have regard when exercising functions conferred by the regulations;
(c) if they enable or require the FCA to make rules, make provision about the procedure for making rules, including provision requiring such consultation with persons likely to be affected by the rules or representatives of such persons as the FCA considers appropriate.
(4) The regulations may—
(a) require the FCA to carry out an analysis of the costs and benefits that will arise if proposed rules are made or proposed changes are made to rules and make provision about what the analysis must include;
(b) require the FCA to publish rules or changes to rules and to provide copies to specified persons;
(c) make provision about the effect of rules, including provision about circumstances in which rules are void and circumstances in which a person is not to be taken to have contravened a rule;
(d) make provision enabling or requiring the FCA to modify or waive rules as they apply to a particular case;
(e) make provision about the procedure for imposing FCA additional requirements;
(f) make provision enabling or requiring the FCA to produce guidance about how it proposes to exercise its functions under the regulations, to publish the guidance and to provide copies to specified persons.
(5) The regulations may enable or require the FCA to impose the following types of requirement on a person as FCA additional requirements—
(a) a requirement to review the person’s conduct;
(b) a requirement to take remedial action;
(c) a requirement to make redress for loss or damage suffered by others as a result of the person’s conduct.
(6) The regulations may enable or require the FCA to make rules requiring a person falling within section (The FCA and financial services interfaces)(2)(b) or (c) to pay fees to an interface body for the purpose of meeting expenses incurred, or to be incurred, by such a body in performing duties, or exercising powers, imposed or conferred by regulations under this Part or by rules made by virtue of regulations under section (The FCA and financial services interfaces).
(7) Regulations made in reliance on subsection (6)—
(a) may enable rules to provide for the amount of a fee to be an amount which is intended to exceed the cost of the things in respect of which the fee is charged;
(b) must require rules to provide for the amount of a fee to be—
(i) a prescribed amount or an amount determined in accordance with the rules, or
(ii) an amount not exceeding such an amount;
(c) may enable or require rules to provide for the amount, or maximum amount, of a fee to increase at specified times and by—
(i) a prescribed amount or an amount determined in accordance with the rules, or
(ii) an amount not exceeding such an amount;
(d) if they enable rules to enable a person to determine an amount, must require rules to require the person to publish information about the amount and how it is determined;
(e) may enable or require rules to make provision about—
(i) interest on any unpaid amounts;
(ii) the recovery of unpaid amounts.
(8) In this section—
“interface-related” has the meaning given in section (The FCA and financial services interfaces);
“prescribed” means prescribed in FCA interface rules.
(9) The reference in subsection (5)(c) to making redress includes—
(a) paying interest, and
(b) providing redress in the form of a remedy or relief which could not be awarded in legal proceedings.”—(Sir John Whittingdale.)
See the explanatory statement for new clause NC28.
Brought up, read the First and Second time, and added to the Bill.
New Clause 30
The FCA and financial services interfaces: penalties and levies
“(1) Subsections (2) and (3) are about the provision that regulations made by the Treasury under this Part providing for the FCA to enforce requirements under FCA interface rules may (among other things) contain in relation to financial penalties.
(2) The regulations may require or enable the FCA—
(a) to set the amount or maximum amount of, or of an increase in, a penalty imposed in respect of failure to comply with a requirement imposed by the FCA in exercise of a power conferred by regulations under section (The FCA and financial services interfaces) (whether imposed by means of FCA interface rules or an FCA additional requirement), or
(b) to set the method for determining such an amount.
(3) Regulations made in reliance on subsection (2)—
(a) must require the FCA to produce and publish a statement of its policy with respect to the amount of the penalties;
(b) may require the policy to include specified matters;
(c) may make provision about the procedure for producing the statement;
(d) may require copies of the statement to be provided to specified persons;
(e) may require the FCA to have regard to a statement published in accordance with the regulations.
(4) The Treasury may by regulations—
(a) impose, or provide for the FCA to impose, a levy on data holders, authorised persons or third party recipients for the purpose of meeting all or part of the expenses incurred, or to be incurred, during a period by the FCA, or by a person acting on the FCA’s behalf, in performing duties, or exercising powers, imposed or conferred on the FCA by regulations under section (The FCA and financial services interfaces), and
(b) make provision about how funds raised by means of the levy must or may be used.
(5) Regulations under subsection (4) may only provide for a levy in respect of expenses of the FCA to be imposed on persons that appear to the Treasury to be capable of being directly affected by the exercise of some or all of the functions conferred on the FCA by regulations under section (The FCA and financial services interfaces).
(6) Section 75(3) and (4) apply in relation to regulations under subsection (4) of this section as they apply in relation to regulations under section 75(1).”—(Sir John Whittingdale.)
This new clause enables the Treasury, by regulations, to confer power on the Financial Conduct Authority to set the amount of certain penalties. It also enables the Treasury to impose a levy in respect of expenses incurred by that Authority.
Brought up, read the First and Second time, and added to the Bill.
New Clause 31
Liability in damages
“(1) The Secretary of State or the Treasury may by regulations provide that a person listed in subsection (2) is not liable in damages for anything done or omitted to be done in the exercise of functions conferred by regulations under this Part.
(2) Those persons are—
(a) a public authority;
(b) a member, officer or member of staff of a public authority;
(c) a person who could be held vicariously liable for things done or omitted by a public authority.
(3) Regulations under this section may not—
(a) make provision removing liability for an act or omission which is shown to have been in bad faith, or
(b) make provision so as to prevent an award of damages made in respect of an act or omission on the ground that the act or omission was unlawful as a result of section 6(1) of the Human Rights Act 1998.”—(Sir John Whittingdale.)
This new clause enables regulations under Part 3 to provide that certain persons are not liable in damages when exercising functions under such regulations.
Brought up, read the First and Second time, and added to the Bill.
New Clause 32
Other data provision
“(1) This section is about cases in which subordinate legislation other than regulations under this Part contains provision described in section 66(1) to (3) or 68(1) to (2A) (“other data provision”).
(2) The regulation-making powers under this Part may be exercised so as to make, in connection with the other data provision, any provision that they could be exercised to make as part of, or in connection with, provision made under section 66(1) to (3) or 68(1) to (2A) that is equivalent to the other data provision.
(3) In this Part, references to “data regulations” include regulations made in reliance on subsection (2) to the extent that they make provision described in sections 66 to 70 or (Interface bodies).
(4) In this section, “subordinate legislation” has the same meaning as in the Interpretation Act 1978 (see section 21 of that Act).”—(Sir John Whittingdale.)
This new clause enables the regulation-making powers under Part 3 to be used to supplement existing subordinate legislation which requires customer data or business data to be provided to customers and others.
Brought up, read the First and Second time, and added to the Bill.
New Clause 33
Duty to notify the Commissioner of personal data breach: time periods
“(1) In regulation 5A of the PEC Regulations (personal data breach)—
(a) in paragraph (2), after “delay” insert “and, where feasible, not later than 72 hours after having become aware of it”, and
(b) after paragraph (3) insert—
“(3A) Where notification under paragraph (2) is not made within 72 hours, it must be accompanied by reasons for the delay.”
(2) In Article 2 of Commission Regulation (EU) No 611/2013 of 24 June 2013 on the measures applicable to the notification of personal data breaches under Directive 2002/58/EC of the European Parliament and of the Council on privacy and electronic communications (notification to the Information Commissioner)—
(a) in paragraph 2—
(i) in the first subparagraph, for the words from “no” to “feasible” substitute “without undue delay and, where feasible, not later than 72 hours after having become aware of it”, and
(ii) in the second subparagraph, after “shall” insert “, subject to paragraph 3,”, and
(b) for paragraph 3 substitute—
“3. To the extent that the information set out in Annex 1 is not available to be included in the notification, it may be provided in phases without undue further delay.””—(Sir John Whittingdale.)
This adjusts the period within which the Information Commissioner must be notified of a personal data breach. It also inserts a duty (into the PEC Regulations) to give reasons for not notifying within 72 hours and adjusts the duty (in Commission Regulation (EU) No 611/2013) to provide accompanying information.
Brought up, read the First and Second time, and added to the Bill.
New Clause 34
Power to require information for social security purposes
“In Schedule (Power to require information for social security purposes)—
(a) Part 1 amends the Social Security Administration Act 1992 to make provision about a power for the Secretary of State to obtain information for social security purposes;
(b) Part 2 amends the Social Security Administration (Northern Ireland) Act 1992 to make provision about a power for the Department for Communities to obtain information for such purposes;
(c) Part 3 makes related amendments of the Proceeds of Crime Act 2002.”—(Sir John Whittingdale.)
This new clause introduces a new Schedule NS1 which amends social security legislation to make provision about a new power for the Secretary of State or, in Northern Ireland, the Department for Communities, to obtain information for social security purposes.
Brought up, read the First and Second time, and added to the Bill.
New Clause 35
Retention of information by providers of internet services in connection with death of child
“(1) The Online Safety Act 2023 is amended as follows.
(2) In section 100 (power to require information)—
(a) omit subsection (7);
(b) after subsection (8) insert—
“(8A) The power to give a notice conferred by subsection (1) does not include power to require processing of personal data that would contravene the data protection legislation (but in determining whether processing of personal data would do so, the duty imposed by the notice is to be taken into account).”
(3) In section 101 (information in connection with investigation into death of child)—
(a) before subsection (1) insert—
“(A1) Subsection (C1) applies if a senior coroner (in England and Wales), a procurator fiscal (in Scotland) or a coroner (in Northern Ireland) (“the investigating authority”)—
(a) notifies OFCOM that—
(i) they are conducting an investigation, or are due to conduct an investigation, in connection with the death of a child, and
(ii) they suspect that the child may have taken their own life, and
(b) provides OFCOM with the details in subsection (B1).
(B1) The details are—
(a) the name of the child who has died,
(b) the child’s date of birth,
(c) any email addresses used by the child (so far as the investigating authority knows), and
(d) if any regulated service has been brought to the attention of the investigating authority as being of interest in connection with the child’s death, the name of the service.
(C1) Where this subsection applies, OFCOM—
(a) must give a notice to the provider of a service within subsection (E1) requiring the provider to ensure the retention of information relating to the use of the service by the child who has died, and
(b) may give a notice to any other relevant person requiring the person to ensure the retention of information relating to the use of a service within subsection (E1) by that child.
(D1) The references in subsection (C1) to ensuring the retention of information relating to the child’s use of a service include taking all reasonable steps, without delay, to prevent the deletion of such information by the routine operation of systems or processes.
(E1) A service is within this subsection if it is—
(a) a regulated service of a kind described in regulations made by the Secretary of State, or
(b) a regulated service notified to OFCOM by the investigating authority as described in subsection (B1)(d).
(F1) A notice under subsection (C1) may require information described in that subsection to be retained only if it is information—
(a) of a kind which OFCOM have power to require under a notice under subsection (1) (see, in particular, subsection (2)(a) to (d)), or
(b) which a person might need to retain to enable the person to provide information in response to a notice under subsection (1) (if such a notice were given).
(G1) OFCOM must share with the investigating authority any information they receive in response to requirements mentioned in section 102(5A)(d) that are included in a notice under subsection (C1).”
(b) in subsection (3), for “power conferred by subsection (1) includes” substitute “powers conferred by this section include”;
(c) after subsection (5) insert—
“(5A) The powers to give a notice conferred by this section do not include power to require processing of personal data that would contravene the data protection legislation (but in determining whether processing of personal data would do so, the duty imposed by the notice is to be taken into account).”
(4) In section 102 (information notices)—
(a) in subsection (1), for “101(1)” substitute “101(C1) or (1)”;
(b) in subsection (3)—
(i) after “information notice” insert “under section 100(1) or 101(1)”,
(ii) omit “and” at the end of paragraph (c), and
(iii) after paragraph (c) insert—
“(ca) specify when the information must be provided (which may be on or by a specified date, within a specified period, or at specified intervals), and”;
(c) omit subsection (4);
(d) after subsection (5) insert—
“(5A) An information notice under section 101(C1) must—
(a) specify or describe the information to be retained,
(b) specify why OFCOM require the information to be retained,
(c) require the information to be retained for the period of one year beginning with the date of the notice,
(d) require the person to whom the notice is given—
(i) if the child to whom the notice relates used the service in question, to notify OFCOM by a specified date of steps taken to ensure the retention of information;
(ii) if the child did not use the service, or the person does not hold any information of the kind required, to notify OFCOM of that fact by a specified date, and
(e) contain information about the consequences of not complying with the notice.
(5B) If OFCOM give an information notice to a person under section 101(C1), they may, in response to information received from the investigating authority, extend the period for which the person is required to retain information by a maximum period of six months.
(5C) The power conferred by subsection (5B) is exercisable—
(a) by giving the person a notice varying the notice under section 101(C1) and stating the further period for which information must be retained and the reason for the extension;
(b) any number of times.”;
(e) after subsection (9) insert—
“(9A) OFCOM must cancel an information notice under section 101(C1) by notice to the person to whom it was given if advised by the investigating authority that the information in question no longer needs to be retained.”
(f) in subsection (10), after the definition of “information” insert—
““the investigating authority” has the same meaning as in section 101;”.
(5) In section 109 (offences in connection with information notices)—
(a) in subsection (2)(b), for “all reasonable steps” substitute “all of the steps that it was reasonable, and reasonably practicable, to take”;
(b) after subsection (6) insert—
“(6A) A person who is given an information notice under section 101(C1) commits an offence if—
(a) the person deletes or alters, or causes or permits the deletion or alteration of, any information required by the notice to be retained, and
(b) the person’s intention was to prevent the information being available, or (as the case may be) to prevent it being available in unaltered form, for the purposes of any official investigation into the death of the child to whom the notice relates.
(6B) For the purposes of subsection (6A) information has been deleted if it is irrecoverable (however that occurred).”
(6) In section 110 (senior managers’ liability: information offences)—
(a) after subsection (6) insert—
“(6A) An individual named as a senior manager of an entity commits an offence if—
(a) the entity commits an offence under section 109(6A) (deletion etc of information), and
(b) the individual has failed to take all reasonable steps to prevent that offence being committed.”;
(b) in subsection (7), for “or (6)” substitute “, (6) or (6A)”.
(7) In section 113 (penalties for information offences), in subsection (2)—
(a) for “(4) or (5)” substitute “(4), (5) or (6A)”;
(b) for “(5) or (6)” substitute “(5), (6) or (6A)”.
(8) In section 114 (co-operation and disclosure of information: overseas regulators), in subsection (7), omit the definition of “the data protection legislation”.
(9) In section 225 (Parliamentary procedure for regulations), in subsection (10), after paragraph (c) insert—
“(ca) regulations under section 101(E1)(a),”
(10) In section 236(1) (interpretation)—
(a) after the definition of “country” insert—
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3(9) of that Act);”;
(b) in the definition of “information notice”, for “101(1)” substitute “101(C1) or (1)”.
(11) In section 237 (index of defined terms), after the entry for “CSEA content” insert—
““the data protection legislation” | section 236”—(Sir John Whittingdale.)
This new clause amends the Online Safety Act 2023 to enable OFCOM to give internet service providers a notice requiring them to retain information in connection with an investigation by a coroner (or, in Scotland, procurator fiscal) into the death of a child suspected to have taken their own life. The new clause also creates related offences.
Brought up, read the First and Second time, and added to the Bill.
New Clause 36
Retention of biometric data and recordable offences
“(1) Part 1 of the Counter-Terrorism Act 2008 (powers to gather and share information) is amended in accordance with subsections (2) to (10).
(2) In section 18A(3) (retention of material: general), after “recordable offence” insert “or recordable-equivalent offence”.
(3) Section 18E (supplementary provision) is amended in accordance with subsections (4) to (10).
(4) In subsection (1), after the definition of “recordable offence” insert—
““recordable-equivalent offence” means an offence under the law of a country or territory outside England and Wales and Northern Ireland where the act constituting the offence would constitute a recordable offence if done in England and Wales or Northern Ireland (whether or not the act constituted such an offence when the person was convicted);”.
(5) In subsection (3), in the words before paragraph (a), after “offence” insert “in England and Wales or Northern Ireland”.
(6) After subsection (5) insert—
“(5A) For the purposes of section 18A, a person is to be treated as having been convicted of an offence in a country or territory outside England and Wales and Northern Ireland if, in respect of such an offence, a court exercising jurisdiction under the law of that country or territory has made a finding equivalent to—
(a) a finding that the person is not guilty by reason of insanity, or
(b) a finding that the person is under a disability and did the act charged against the person in respect of the offence.”
(7) In subsection (6)(a)—
(a) after “convicted” insert “—
(i)”, and
(b) after “offence,” insert “or
(ii) in a country or territory outside England and Wales and Northern Ireland, of a recordable-equivalent offence,”.
(8) In subsection (6)(b)—
(a) omit “of a recordable offence”, and
(b) for “a recordable offence, other than a qualifying offence” substitute “an offence, other than a qualifying offence or qualifying-equivalent offence”.
(9) In subsection (7), for “subsection (6)” substitute “this section”.
(10) After subsection (7) insert—
“(7A) In subsection (6), “qualifying-equivalent offence” means an offence under the law of a country or territory outside England and Wales and Northern Ireland where the act constituting the offence would constitute a qualifying offence if done in England and Wales or Northern Ireland (whether or not the act constituted such an offence when the person was convicted).”
(11) The amendments made by this section apply only in connection with the retention of section 18 material that is or was obtained or acquired by a law enforcement authority—
(a) on or after the commencement day, or
(b) in the period of 3 years ending immediately before the commencement day.
(12) Subsection (13) of this section applies where—
(a) at the beginning of the commencement day, a law enforcement authority has section 18 material which it obtained or acquired in the period of 3 years ending immediately before the commencement day,
(b) at a time before the commencement day (a “pre-commencement time”), the law enforcement authority was required by section 18(4) of the Counter-Terrorism Act 2008 to destroy the material, and
(c) at the pre-commencement time, the law enforcement authority could have retained the material under section 18A of the Counter-Terrorism Act 2008, as it has effect taking account of the amendments made by subsections (2) to (10) of this section, if those amendments had been in force.
(13) Where this subsection applies—
(a) the law enforcement authority is to be treated as not having been required to destroy the material at the pre-commencement time, but
(b) the material may not be used in evidence against the person to whom the material relates—
(i) in criminal proceedings in England and Wales, Northern Ireland or Scotland in relation to an offence where those proceedings, or other criminal proceedings in relation to the person and the offence, were instituted before the commencement day, or
(ii) in criminal proceedings in any other country or territory.
(14) In this section—
“the commencement day” means the day on which this Act is passed;
“law enforcement authority” has the meaning given by section 18E(1) of the Counter-Terrorism Act 2008;
“section 18 material” has the meaning given by section 18(2) of that Act.
(15) For the purposes of this section, proceedings in relation to an offence are instituted—
(a) in England and Wales, when they are instituted for the purposes of Part 1 of the Prosecution of Offences Act 1985 (see section 15(2) of that Act);
(b) in Northern Ireland, when they are instituted for the purposes of Part 2 of the Justice (Northern Ireland) Act 2002 (see section 44(1) and (2) of that Act);
(c) in Scotland, when they are instituted for the purposes of Part 3 of the Proceeds of Crime Act 2002 (see section 151(1) and (2) of that Act).”—(Sir John Whittingdale.)
This new clause enables a law enforcement authority to retain fingerprints and DNA profiles where a person has been convicted of an offence equivalent to a recordable offence in a jurisdiction outside England and Wales and Northern Ireland.
Brought up, read the First and Second time, and added to the Bill.
New Clause 37
Retention of pseudonymised biometric data
“(1) Part 1 of the Counter-Terrorism Act 2008 (powers to gather and share information) is amended in accordance with subsections (2) to (6).
(2) Section 18A (retention of material: general) is amended in accordance with subsections (3) to (5).
(3) In subsection (1), for “subsection (5)” substitute “subsections (4) to (9)”.
(4) In subsection (4)(a), after “relates” insert “(a “pseudonymised form”)”.
(5) After subsection (6) insert—
“(7) Section 18 material which is not a DNA sample may be retained indefinitely by a law enforcement authority if—
(a) the authority obtains or acquires the material directly or indirectly from an overseas law enforcement authority,
(b) the authority obtains or acquires the material in a form which includes information which identifies the person to whom the material relates,
(c) as soon as reasonably practicable after obtaining or acquiring the material, the authority takes the steps necessary for it to hold the material in a pseudonymised form, and
(d) having taken those steps, the law enforcement authority continues to hold the material in a pseudonymised form.
(8) In a case where section 18 material is being retained by a law enforcement authority under subsection (7), if—
(a) the law enforcement authority ceases to hold the material in a pseudonymised form, and
(b) the material relates to a person who has no previous convictions or only one exempt conviction,
the material may be retained by the law enforcement authority until the end of the retention period specified in subsection (9).
(9) The retention period is the period of 3 years beginning with the date on which the law enforcement authority first ceases to hold the material in a pseudonymised form.”
(6) In section 18E(1) (supplementary provision)—
(a) in the definition of “law enforcement authority”, for paragraph (d) substitute—
“(d) an overseas law enforcement authority;”, and
(b) after that definition insert—
““overseas law enforcement authority” means a person formed or existing under the law of a country or territory outside the United Kingdom so far as exercising functions which—
(a) correspond to those of a police force, or
(b) otherwise involve the investigation or prosecution of offences;”.
(7) The amendments made by this section apply only in connection with the retention of section 18 material that is or was obtained or acquired by a law enforcement authority—
(a) on or after the commencement day, or
(b) in the period of 3 years ending immediately before the commencement day.
(8) Subsections (9) to (12) of this section apply where, at the beginning of the commencement day, a law enforcement authority has section 18 material which it obtained or acquired in the period of 3 years ending immediately before the commencement day.
(9) Where the law enforcement authority holds the material in a pseudonymised form at the beginning of the commencement day, the authority is to be treated for the purposes of section 18A(7)(c) and (d) of the Counter-Terrorism Act 2008 as having—
(a) taken the steps necessary for it to hold the material in a pseudonymised form as soon as reasonably practicable after obtaining or acquiring the material, and
(b) continued to hold the material in a pseudonymised form until the commencement day.
(10) Where the law enforcement authority does not hold the material in a pseudonymised form at the beginning of the commencement day, the authority is to be treated for the purposes of section 18A(7)(c) of the Counter-Terrorism Act 2008 as taking the steps necessary for it to hold the material in a pseudonymised form as soon as reasonably practicable after obtaining or acquiring the material if it takes those steps on, or as soon as reasonably practicable after, the commencement day.
(11) Subsection (12) of this section applies where, at a time before the commencement day (a “pre-commencement time”), the law enforcement authority was required by section 18(4) of the Counter-Terrorism Act 2008 to destroy the material but—
(a) at the pre-commencement time, the law enforcement authority could have retained the material under section 18A(7) to (9) of the Counter-Terrorism Act 2008 (as inserted by this section) if those provisions had been in force, or
(b) on or after the commencement day, the law enforcement authority may retain the material under those provisions by virtue of subsection (9) or (10) of this section.
(12) Where this subsection applies—
(a) the law enforcement authority is to be treated as not having been required to destroy the material at the pre-commencement time, but
(b) the material may not be used in evidence against the person to whom the material relates—
(i) in criminal proceedings in England and Wales, Northern Ireland or Scotland in relation to an offence where those proceedings, or other criminal proceedings in relation to the person and the offence, were instituted before the commencement day, or
(ii) in criminal proceedings in any other country or territory.
(13) In this section—
“the commencement day”, “law enforcement authority” and “section 18 material” have the meaning given in section (Retention of biometric data and recordable offences)(14);
“instituted”, in relation to proceedings, has the meaning given in section (Retention of biometric data and recordable offences)(15);
“in a pseudonymised form” has the meaning given by section 18A(4) and (10) of the Counter-Terrorism Act 2008 (as amended or inserted by this section).”—(Sir John Whittingdale.)
This new clause enables a law enforcement authority to retain fingerprints and DNA profiles where, as soon as reasonably practicable after acquiring or obtaining them, the authority takes the steps necessary for it to hold the material in a form which does not include information which identifies the person to whom the material relates.
Brought up, read the First and Second time, and added to the Bill.
New Clause 38
Retention of biometric data from INTERPOL
“(1) Part 1 of the Counter-Terrorism Act 2008 (powers to gather and share information) is amended in accordance with subsections (2) to (4).
(2) In section 18(4) (destruction of national security material not subject to existing statutory restrictions), after “18A” insert “, 18AA”.
(3) After section 18A insert—
“18AA Retention of material from INTERPOL
(1) This section applies to section 18 material which is not a DNA sample where the law enforcement authority obtained or acquired the material as part of a request for assistance, or a notification of a threat, sent to the United Kingdom via INTERPOL’s systems.
(2) The law enforcement authority may retain the material until the National Central Bureau informs the authority that the request or notification has been cancelled or withdrawn.
(3) If the law enforcement authority is the National Central Bureau, it may retain the material until it becomes aware that the request or notification has been cancelled or withdrawn.
(4) In this section—
“INTERPOL” means the organisation called the International Criminal Police Organization - INTERPOL;
“the National Central Bureau” means the body appointed for the time being in accordance with INTERPOL’s constitution to serve as the United Kingdom’s National Central Bureau.
(5) The reference in subsection (1) to material obtained or acquired as part of a request or notification includes material obtained or acquired as part of a communication, sent to the United Kingdom via INTERPOL’s systems, correcting, updating or otherwise supplementing the request or notification.
18AB Retention of material from INTERPOL: supplementary
(1) The Secretary of State may by regulations amend section 18AA to make such changes as the Secretary of State considers appropriate in consequence of—
(a) changes to the name of the organisation which, when section 18AA was enacted, was called the International Criminal Police Organization - INTERPOL (“the organisation”),
(b) changes to arrangements made by the organisation which involve fingerprints or DNA profiles being provided to members of the organisation (whether changes to existing arrangements or changes putting in place new arrangements), or
(c) changes to the organisation’s arrangements for liaison between the organisation and its members or between its members.
(2) Regulations under this section are subject to affirmative resolution procedure.”
(4) In section 18BA(5)(a) (retention of further fingerprints), after “18A” insert “, 18AA”.
(5) Section 18AA of the Counter-Terrorism Act 2008 applies in relation to section 18 material obtained or acquired by a law enforcement authority before the commencement day (as well as material obtained or acquired on or after that day), except where the law enforcement authority was informed, or became aware, as described in subsection (2) or (3) of that section before the commencement day.
(6) Subsection (7) of this section applies where—
(a) at the beginning of the commencement day, a law enforcement authority has section 18 material,
(b) at a time before the commencement day (a “pre-commencement time”), the law enforcement authority was required by section 18(4) of the Counter-Terrorism Act 2008 to destroy the material, but
(c) at the pre-commencement time, the law enforcement authority could have retained the material under section 18AA of that Act (as inserted by this section) if it had been in force.
(7) Where this subsection applies—
(a) the law enforcement authority is to be treated as not having been required to destroy the material at the pre-commencement time, but
(b) the material may not be used in evidence against the person to whom the material relates—
(i) in criminal proceedings in England and Wales, Northern Ireland or Scotland in relation to an offence where those proceedings, or other criminal proceedings in relation to the person and the offence, were instituted before the commencement day, or
(ii) in criminal proceedings in any other country or territory.
(8) In this section—
“the commencement day”, “law enforcement authority” and “section 18 material” have the meaning given in section (Retention of biometric data and recordable offences)(14);
“instituted”, in relation to proceedings, has the meaning given in section (Retention of biometric data and recordable offences)(15).”—(Sir John Whittingdale.)
This new clause enables fingerprints and DNA profiles obtained as part of a request for assistance, or notification of a threat, from INTERPOL and held for national security purposes by a law enforcement authority to be retained until the authority is informed that the request or notification has been withdrawn or cancelled.
Brought up, read the First and Second time, and added to the Bill.
New Clause 39
National Underground Asset Register
“(1) After section 106 of the New Roads and Street Works Act 1991 insert—
“Part 3A
National Underground Asset Register: England and Wales
The register
106A National Underground Asset Register
(1) The Secretary of State must keep a register of information relating to apparatus in streets in England and Wales.
(2) The register is to be known as the National Underground Asset Register (and is referred to in this Act as “NUAR”).
(3) NUAR must be kept in such form and manner as may be prescribed.
(4) The Secretary of State must make arrangements so as to enable any person who is required, by a provision of Part 3, to enter information into NUAR to have access to NUAR for that purpose.
(5) Regulations under subsection (3) are subject to the negative procedure.
106B Access to information kept in NUAR
(1) The Secretary of State may by regulations make provision in connection with making information kept in NUAR available—
(a) under a licence, or
(b) without a licence.
(2) The regulations may (among other things)—
(a) make provision about which information, or descriptions of information, may be made available;
(b) make provision about the descriptions of person to whom information may be made available;
(c) make provision for information to be made available subject to exceptions;
(d) make provision requiring or authorising the Secretary of State to adapt, modify or obscure information before making it available;
(e) make provision authorising all information kept in NUAR to be made available to prescribed descriptions of person under prescribed conditions;
(f) make provision about the purposes for which information may be made available;
(g) make provision about the form and manner in which information may be made available.
(3) The regulations may make provision about licences under which information kept in NUAR is made available, including—
(a) provision about the form of a licence;
(b) provision about the terms and conditions of a licence;
(c) provision for information to be made available under a licence for free or for a fee;
(d) provision about the amount of the fees, including provision for the amount of a fee to be an amount which is intended to exceed the cost of the things in respect of which the fee is charged;
(e) provision about how funds raised by means of fees must or may be used, including provision for funds to be paid to persons who are required, by a provision of Part 3, to enter information into NUAR.
(4) Except as otherwise prescribed and subject to section 106G, processing of information by the Secretary of State in exercise of functions conferred by or under section 106A or this section does not breach—
(a) any obligation of confidence owed by the Secretary of State, or
(b) any other restriction on the processing of information (however imposed).
(5) Regulations under this section are subject to the affirmative procedure.
Requirements for undertakers to pay fees and provide information
106C Fees payable by undertakers in relation to NUAR
(1) The Secretary of State may by regulations make provision requiring undertakers having apparatus in a street to pay fees to the Secretary of State for or in connection with the exercise by the Secretary of State of any function conferred by or under this Part.
(2) The regulations may—
(a) specify the amounts of the fees, or the maximum amounts of the fees, or
(b) provide for the amounts of the fees, or the maximum amounts of the fees, to be determined in accordance with the regulations.
(3) In making the regulations the Secretary of State must seek to secure that, so far as possible and taking one year with another, the income from fees matches the expenses incurred by the Secretary of State in, or in connection with, exercising functions conferred by or under this Part (including expenses not directly connected with the keeping of NUAR).
(4) Except where the regulations specify the amounts of the fees—
(a) the amounts of the fees must be specified by the Secretary of State in a statement, and
(b) the Secretary of State must—
(i) publish the statement, and
(ii) lay it before Parliament.
(5) Regulations under subsection (1) may make provision about—
(a) when a fee is to be paid;
(b) the manner in which a fee is to be paid;
(c) the payment of discounted fees;
(d) exceptions to requirements to pay fees;
(e) the refund of all or part of a fee which has been paid.
(6) Before making regulations under subsection (1) the Secretary of State must consult—
(a) such representatives of persons likely to be affected by the regulations as the Secretary of State considers appropriate, and
(b) such other persons as the Secretary of State considers appropriate.
(7) Subject to the following provisions of this section, regulations under subsection (1) are subject to the affirmative procedure.
(8) Regulations under subsection (1) that only make provision of a kind mentioned in subsection (2) are subject to the negative procedure.
(9) But the first regulations under subsection (1) that make provision of a kind mentioned in subsection (2) are subject to the affirmative procedure.
106D Providing information for purposes of regulations under section 106C
(1) The Secretary of State may by regulations make provision requiring undertakers having apparatus in a street to provide information to the Secretary of State for either or both of the following purposes—
(a) assisting the Secretary of State in determining the provision that it is appropriate for regulations under section 106C(1) or a statement under section 106C(4) to make;
(b) assisting the Secretary of State in determining whether it is appropriate to make changes to such provision.
(2) The Secretary of State may by regulations make provision requiring undertakers having apparatus in a street to provide information to the Secretary of State for either or both of the following purposes—
(a) ascertaining whether a fee is payable by a person under regulations under section 106C(1);
(b) working out the amount of a fee payable by a person.
(3) Regulations under subsection (1) or (2) may require an undertaker to notify the Secretary of State of any changes to information previously provided under the regulations.
(4) Regulations under subsection (1) or (2) may make provision about—
(a) when information is to be provided (which may be at prescribed intervals);
(b) the form and manner in which information is to be provided;
(c) exceptions to requirements to provide information.
(5) Regulations under subsection (1) or (2) are subject to the negative procedure.
Monetary penalties
106E Monetary penalties
Schedule 5A makes provision about the imposition of penalties in connection with requirements imposed by regulations under sections 106C(1) and 106D(1) and (2).
Exercise of functions by third party
106F Arrangements for third party to exercise functions
(1) The Secretary of State may make arrangements for a prescribed person to exercise a relevant function of the Secretary of State.
(2) More than one person may be prescribed.
(3) Arrangements under this section may—
(a) provide for the Secretary of State to make payments to the person, and
(b) make provision as to the circumstances in which any such payments are to be repaid to the Secretary of State.
(4) In the case of the exercise of a function by a person authorised by arrangements under this section to exercise that function, any reference in this Part or in regulations under this Part to the Secretary of State in connection with that function is to be read as a reference to that person.
(5) Arrangements under this section do not prevent the Secretary of State from exercising a function to which the arrangements relate.
(6) Except as otherwise prescribed and subject to section 106G, the disclosure of information between the Secretary of State and a person in connection with the person’s entering into arrangements under this section or exercise of functions to which such arrangements relate does not breach—
(a) any obligation of confidence owed by the person making the disclosure, or
(b) any other restriction on the disclosure of information (however imposed).
(7) Regulations under this section are subject to the affirmative procedure.
(8) In this section “relevant function” means any function of the Secretary of State conferred by or under this Part (including the function of charging or recovering fees under section 106C) other than—
(a) a power to make regulations, or
(b) a function under section 106C(4) (specifying of fees etc).
Data protection
106G Data protection
(1) A duty or power to process information that is imposed or conferred by or under this Part does not operate to require or authorise the processing of personal data that would contravene the data protection legislation (but in determining whether processing of personal data would do so, that duty or power is to be taken into account).
(2) In this section—
“the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3(9) of that Act);
“personal data” has the same meaning as in that Act (see section 3(2) of that Act).
Supplementary provisions
106H Regulations under this Part
(1) In this Part “prescribed” means prescribed by regulations made by the Secretary of State.
(2) Regulations under this Part may make—
(a) different provision for different purposes;
(b) supplementary and incidental provision.
(3) Regulations under this Part are to be made by statutory instrument.
(4) Before making regulations under this Part the Secretary of State must consult the Welsh Ministers.
(5) Where regulations under this Part are subject to “the affirmative procedure” the regulations may not be made unless a draft of the statutory instrument containing them has been laid before and approved by a resolution of each House of Parliament.
(6) Where regulations under this Part are subject to “the negative procedure” the statutory instrument containing the regulations is subject to annulment in pursuance of a resolution of either House of Parliament.
(7) Any provision that may be made in regulations under this Part subject to the negative procedure may be made in regulations subject to the affirmative procedure.
106I Interpretation
(1) In this Part the following terms have the same meaning as in Part 3—
“apparatus” (see sections 89(3) and 105(1));
“in” (in a context referring to apparatus in a street) (see section 105(1));
“street” (see section 48(1) and (2));
“undertaker” (in relation to apparatus or in a context referring to having apparatus in a street) (see sections 48(5) and 89(4)).
(2) In this Part “processing” has the same meaning as in the Data Protection Act 2018 (see section 3(4) of that Act) and “process” is to be read accordingly.”
(2) In section 167 of the New Roads and Street Works Act 1991 (Crown application)—
(a) after subsection (4) insert—
“(4A) The provisions of Part 3A of this Act (National Underground Asset Register: England and Wales) bind the Crown.”;
(b) in subsection (5), for “(4)” substitute “(4) or (4A)”.
(3) Schedule (National Underground Asset Register: monetary penalties) to this Act inserts Schedule 5A into the New Roads and Street Works Act 1991 (monetary penalties).”—(Sir John Whittingdale.)
This amendment inserts Part 3A into the New Roads and Street Works Act 1991 which requires, and makes provision in connection with, the keeping of a register of information relating to apparatus in streets (to be called the National Underground Asset Register).
Brought up, read the First and Second time, and added to the Bill.
New Clause 40
Information in relation to apparatus
“(1) The New Roads and Street Works Act 1991 is amended in accordance with subsections (2) to (6).
(2) For the italic heading before section 79 (records of location of apparatus) substitute “Duties in relation to recording and sharing of information about apparatus”.
(3) In section 79—
(a) for the heading substitute “Information in relation to apparatus”;
(b) in subsection (1), for paragraph (c) substitute—
“(c) being informed of its location under section 80(2),”;
(c) after subsection (1A) (as inserted by section 46(2) of the Traffic Management Act 2004) insert—
“(1B) An undertaker must, except in such cases as may be prescribed, record in relation to every item of apparatus belonging to the undertaker such other information as may be prescribed as soon as reasonably practicable after—
(a) placing the item in the street or altering its position,
(b) inspecting, maintaining, adjusting, repairing, altering or renewing the item,
(c) locating the item in the street in the course of executing any other works, or
(d) receiving any such information in relation to the item under section 80(2).”
(d) omit subsection (3);
(e) in subsection (3A) (as inserted by section 46(4) of the Traffic Management Act 2004)—
(i) for “to (3)” substitute “and (2A)”;
(ii) for “subsection (1)” substitute “this section”;
(f) after subsection (3A) insert—
“(3B) Before the end of the initial upload period an undertaker must enter into NUAR—
(a) all information that is included in the undertaker’s records under subsection (1) on the archive upload date, and
(b) any other information of a prescribed description that is held by the undertaker on that date.
(3C) Where an undertaker records information as required by subsection (1) or (1B), or updates such information, the undertaker must, within a prescribed period, enter the recorded or updated information into NUAR.
(3D) The duty under subsection (3C) does not apply in relation to information recorded or updated before the archive upload date.
(3E) A duty under subsection (3B) or (3C) does not apply in such cases as may be prescribed.
(3F) Information must be entered into NUAR under subsection (3B) or (3C) in such form and manner as may be prescribed.”
(g) in subsection (4)(a), omit “not exceeding level 5 on the standard scale”;
(h) after subsection (6) insert—
“(7) For the purposes of subsection (3B) the Secretary of State must by regulations—
(a) specify a date as “the archive upload date”, and
(b) specify a period beginning with that date as the “initial upload period”.
(8) For the meaning of “NUAR”, see section 106A.”
(4) For section 80 (duty to inform undertakers of location of apparatus) substitute—
“80 Duties to report missing or incorrect information in relation to apparatus
(1) Subsection (2) applies where a person executing works of any description in a street finds an item of apparatus belonging to an undertaker in relation to which prescribed information—
(a) is not entered in NUAR, or
(b) is entered in NUAR but is incorrect.
(2) The person must take such steps as are reasonably practicable to inform the undertaker to whom the item belongs of the missing or incorrect information.
(3) Where a person executing works of any description in a street finds an item of apparatus which does not belong to the person and is unable, after taking such steps as are reasonably practicable, to ascertain to whom the item belongs, the person must—
(a) if the person is an undertaker, enter into NUAR, in such form and manner as may be prescribed, prescribed information in relation to the item;
(b) in any other case, inform the street authority of that information.
(4) Subsections (2) and (3) have effect subject to such exceptions as may be prescribed.
(5) A person who fails to comply with subsection (2) or (3) commits an offence.
(6) A person who commits an offence under subsection (5) is liable on summary conviction to a fine not exceeding level 4 on the standard scale.
(7) Before making regulations under this section the Secretary of State must consult—
(a) such representatives of persons likely to be affected by the regulations as the Secretary of State considers appropriate, and
(b) such other persons as the Secretary of State considers appropriate.
(8) For the meaning of “NUAR”, see section 106A.”
(5) Before section 81 (duty to maintain apparatus) insert—
“Other duties and liabilities of undertakers in relation to apparatus”.
(6) In section 104 (regulations), after subsection (1) insert—
“(1A) Before making regulations under section 79 or 80 the Secretary of State must consult the Welsh Ministers.
(1B) Regulations under this Part may make supplementary or incidental provision.”
(7) In consequence of the provision made by subsection (4), omit section 47 of the Traffic Management Act 2004.”—(Sir John Whittingdale.)
This amendment amends the New Roads and Street Works Act 1991 so as to impose new duties on undertakers to keep records of, and share information relating to, apparatus in streets; and makes amendments consequential on those changes.
Brought up, read the First and Second time, and added to the Bill.
New Clause 41
Pre-commencement consultation
“A requirement to consult under a provision inserted into the New Roads and Street Works Act 1991 by section (National Underground Asset Register) or (Information in relation to apparatus) may be satisfied by consultation before, as well as consultation after, the provision inserting that provision comes into force.”—(Sir John Whittingdale.)
This amendment provides that a requirement that the Secretary of State consult under a provision inserted into the New Roads and Street Works Act 1991 by the new clauses inserted by Amendments NC39 and NC40 may be satisfied by consultation undertaken before or after the provision inserting that provision comes into force.
Brought up, read the First and Second time, and added to the Bill.
New Clause 42
Transfer of certain functions to Secretary of State
“(1) The powers to make regulations under section 79(1) and (2) of the New Roads and Street Works Act 1991, so far as exercisable in relation to Wales, are transferred to the Secretary of State.
(2) The power to make regulations under section 79(1A) of that Act (as inserted by section 46(2) of the Traffic Management Act 2004), so far as exercisable in relation to Wales, is transferred to the Secretary of State.
(3) The Street Works (Records) (England) Regulations 2002 (S.I. 2002/3217) have effect as if the reference to England in regulation 1(2) were a reference to England and Wales.
(4) The Street Works (Records) (Wales) Regulations 2005 (S.I. 2005/1812) are revoked.”—(Sir John Whittingdale.)
This amendment provides that certain powers to make regulations under section 79 of the New Roads and Street Works Act 1991, so far as exercisable in relation to Wales, are transferred from the Welsh Ministers to the Secretary of State; and makes provision in relation to regulations already made under those powers.
Brought up, read the First and Second time, and added to the Bill.
Clause 5
Lawfulness of processing
Amendment proposed: 11, page 7, line 12, at end insert—
““internal administrative purposes”, in relation to special category data, means the conditions set out for lawful processing in paragraph 1 of Schedule 1 of the Data Protection Act 2018.”—(Kate Osborne.)
This amendment clarifies that the processing of special category data in employment must follow established principles for reasonable processing, as defined by paragraph 1 of Schedule 1 of the Data Protection Act 2018.
Question put, That the amendment be made.
I beg to move, That the Bill be now read the Third time.
This Bill will deliver tangible benefits to British consumers and businesses alike, which would not have been possible if Britain had still been a member of the European Union. It delivers a more flexible and less burdensome data protection regime that maintains high standards of privacy protection while promoting growth and boosting innovation. It does so with the support of the Information Commissioner, and without jeopardising the UK’s European Union data adequacy.
I would like to thank all Members who contributed during the passage of the Bill, and all those who have helped get it right. I now commend it to the House on its onward passage to the other place.